Sep 27, 2020
Overall it was a great course. A little bit weak in theory, but for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.
Nov 11, 2021
This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.
By Ruiwen W•
Aug 1, 2020
some errors in the assignments
By V B•
Sep 24, 2020
By JJ Y•
Sep 26, 2020
Sequence models are heavy subjects, and it would be unrealistic to expect a 4-week course to go into all the depths of RNNs, GRUs, LSTMs, etc. This course does a great job covering important types of neural networks and showing their applications. However, the labs and assignments could have done more in (a) helping us look a little deeper into the implementations of different NN building components, and (b) aligning better with the lecture videos.
Really good examples: the Week 1 labs and assignment illustrate the implementations of some of the basic layer classes and outline the overall flow of NN training with Trax. The Week 4 labs and assignment illustrate the implementation of the loss layer based on the distinctive triplet loss function.
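For readers unfamiliar with the triplet loss this reviewer mentions: the idea is to make an anchor question more similar to its duplicate than to an unrelated question, by at least a margin. A minimal NumPy sketch of that idea (not the course's actual layer, which also does hard-negative mining over the batch; the margin value here is illustrative):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.25):
    """Margin triplet loss on cosine similarity: penalize the model
    unless the duplicate (positive) beats the non-duplicate (negative)
    by at least `margin` in similarity to the anchor."""
    cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(0.0, margin - cos(anchor, positive) + cos(anchor, negative))

a = np.array([1.0, 0.0])   # anchor embedding
p = np.array([0.9, 0.1])   # near-duplicate: high cosine similarity
n = np.array([0.0, 1.0])   # unrelated: orthogonal to the anchor
# Loss is 0 once the positive beats the negative by more than the margin.
```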
Not-so-good examples: Week 1 spends a whole video explaining gradient calculation in Trax, yet there is no illustration of how it's integrated into backpropagation in Trax. The Week 2 videos and labs/assignment are more disjoint: there is a video explaining the scan() function, but it does not show up in the assignment at all.
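For anyone who, like this reviewer, met scan() only in the video: its semantics can be sketched in a few lines of plain Python. This is an illustration of the general scan idiom that RNN loops build on, not Trax's or JAX's actual implementation:

```python
def scan(fn, elems, init):
    """Thread a carry (e.g. an RNN hidden state) through a sequence.

    fn takes (carry, x) and returns (new_carry, output); scan returns
    the final carry plus the list of per-step outputs.
    """
    carry, outputs = init, []
    for x in elems:
        carry, y = fn(carry, x)
        outputs.append(y)
    return carry, outputs

# Toy "RNN cell": the hidden state is a running sum, echoed as output.
step = lambda h, x: (h + x, h + x)
final, ys = scan(step, [1, 2, 3, 4], 0)
# final == 10, ys == [1, 3, 6, 10]
```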
By Yaron K•
Apr 29, 2022
The 4th week on Siamese networks was well done. The weeks on RNNs, GRUs, and LSTMs basically gave the equations and some intuition, but most of the emphasis was on building a model with them using Google's Trax deep-learning framework, which the lecturers believe to be better than TensorFlow 2. At least when it comes to debugging, it isn't: make the smallest error (say with shape parameters) and you get a mass of error messages that don't really help. For shape errors, at least, there is no excuse for this, since all that is needed is to run checks on the first batch of the first epoch that pinpoint exactly where there's a shape discrepancy.
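The fail-fast check this reviewer describes is easy to do by hand. A minimal sketch of such a first-batch shape check (the helper name and the dict-of-arrays batch format are assumptions for illustration, not Trax API):

```python
import numpy as np

def check_first_batch(batch, expected_shapes):
    """Raise a readable error if any array's shape mismatches.

    expected_shapes maps a name to a shape tuple; None means "any size"
    on that axis (e.g. a variable batch or sequence length).
    """
    for name, expected in expected_shapes.items():
        actual = np.asarray(batch[name]).shape
        ok = len(actual) == len(expected) and all(
            e is None or e == a for e, a in zip(expected, actual)
        )
        if not ok:
            raise ValueError(f"{name}: expected shape {expected}, got {actual}")

# Run once on the first batch before training starts.
batch = {"inputs": np.zeros((32, 50)), "targets": np.zeros((32, 50))}
check_first_batch(batch, {"inputs": (None, 50), "targets": (None, 50)})
```

Calling this once before the training loop turns a wall of framework stack traces into a single line naming the offending tensor.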
By Amlan C•
Oct 9, 2020
Despite the theoretical underpinnings, I do not feel this course lets you write an NER algorithm on your own. Most of these courses have been using data supplied by Coursera, and the same goes for the models. In real life we have to either create this data ourselves or use some open-source data, e.g. from Kaggle. I think it would be better to orient the course around publicly available data, with models trained by students being used for actual analysis.
By Maury S•
Mar 8, 2021
Like some of the other courses in this specialization, this one has promise but so far comes off as a somewhat careless effort compared to the usual quality of content from Andrew Ng. The lecturers are OK but not great, and it is unclear what the role of Lukasz Kaiser is beyond reading introductions to many of the lectures. There is a strange focus on simplifying with the Google Trax model at the cost of not really teaching the underlying maths.
By Petru R•
Apr 13, 2022
The course requires a solid background in deep learning; it does not explain LSTMs in detail, or how the programming keeps the weights of the two halves of the Siamese network identical.
Does Trax provide ways of generating training data for Siamese networks other than writing a custom function?
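On the weight-sharing question above: in most frameworks the two branches stay identical simply because they are the same layer object applied twice, so there is only one set of weights to update. A minimal NumPy sketch of that idea (not Trax code; the tiny `Dense` class is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

class Dense:
    """Tiny dense layer; calling it twice reuses the same weights."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(size=(n_in, n_out))
    def __call__(self, x):
        return x @ self.W

encoder = Dense(4, 3)               # ONE layer object...
q1, q2 = rng.normal(size=(2, 4))    # ...two questions to encode
v1, v2 = encoder(q1), encoder(q2)   # ...applied to both branches

# There is only one W, so any gradient update to encoder.W changes
# both branches at once -- the weights cannot drift apart.
```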
By Business D•
Dec 14, 2020
I regret the lack of proper guidance in the coding exercises, compounded by the incomplete documentation of the Trax library. I also feel we could build models with greater performance. An accuracy of 0.54 for identifying question duplicates doesn't seem to be state of the art...
You could do better!
By Rajaseharan R•
Mar 9, 2022
Too much focus on the data generator in the assignments. There should be a library function in Trax to do it; you might have to do some data preparation beforehand, but the generator should be a standard library function. Also, I had hoped to learn a bit more in depth about entity labelling.
By Huang J•
Dec 23, 2020
The course videos are too short to convey the ideas behind the methodology; you need to understand the methodology before following the course material. Also, the introduction to Trax is fine, but I would prefer a version of the assignments in TensorFlow.
By A V A•
Nov 13, 2020
Good course teaching the applications of LSTMs/GRUs in language generation, NER, and matching question duplicates using Siamese networks. It would have been more helpful if there were more depth in the topics.
By J N B P•
Mar 16, 2021
This course is good for practical knowledge, with really good projects, but it lags in the theoretical part; you must already be familiar with the concepts to get the most out of it.
By Nguyen B L•
Jul 5, 2021
I am now confused by too many deep-learning frameworks. Also, the content somewhat repeats the Deep Learning Specialization.
By shinichiro i•
Apr 24, 2021
I just want them to use Keras, since I have no inclination to study a shiny new framework such as Trax.
By martin k•
Apr 26, 2021
Lectures are quite good, but the assignments are really bad. Not helpful at all.
By Deleted A•
Jan 3, 2021
Assignments were easy and similar. I learned less than expected.
By Alberto S•
Nov 1, 2020
Content is interesting, but some details are under-explained.
By Ashim M•
Nov 22, 2020
Would have been better with a better-documented library.
By Mahsa S•
Mar 26, 2021
I would prefer to learn more about NLP in PyTorch.
By Leon V•
Sep 28, 2020
Grader output could be more useful.
By Kota M•
Aug 21, 2021
Sadly, the quality of the material is much lower than in the previous two courses. The assignments repeatedly ask us to implement data generators with a lot of for-loops. We should focus more on the network architecture rather than Python programming. That being said, the provided implementation is not good either; learners would have to learn to program anyway.
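For context on the generators several reviewers mention: the assignments hand-write infinite batch generators for question pairs. A compact sketch of what such a generator boils down to (the function name, the `(q1_ids, q2_ids)` pair format, and the padding scheme are assumptions for illustration, not the course's exact code):

```python
import random

def data_generator(pairs, batch_size, pad_id=0, shuffle=True):
    """Yield (batch1, batch2): two lists of padded token-id sequences.

    pairs is a list of (q1_ids, q2_ids) duplicate-question pairs; each
    batch is padded to the length of its longest sequence.
    """
    idx = list(range(len(pairs)))
    while True:                      # loop forever, as training loops expect
        if shuffle:
            random.shuffle(idx)
        for start in range(0, len(idx) - batch_size + 1, batch_size):
            chunk = [pairs[i] for i in idx[start:start + batch_size]]
            max_len = max(len(s) for q1, q2 in chunk for s in (q1, q2))
            pad = lambda s: s + [pad_id] * (max_len - len(s))
            yield ([pad(q1) for q1, _ in chunk],
                   [pad(q2) for _, q2 in chunk])

gen = data_generator([([1, 2], [3]), ([4], [5, 6, 7])], batch_size=2,
                     shuffle=False)
b1, b2 = next(gen)   # each sequence padded to the batch maximum (3 here)
```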
By Patrick C•
Dec 23, 2020
Assignments are very difficult to complete because of inaccurate information (off-by-one errors on indices and other sloppy mistakes). You also don't learn much from them, because almost all the code is already provided. It would be much better if they built up your understanding from first principles instead of rushing through fill-in-the-blank problems.
By Mostafa E•
Dec 13, 2020
The course did well in explaining the concepts of RNNs... but it may in fact have provided less knowledge than the NLP course in the Deep Learning Specialization.
I was looking forward to seeing more details on how translation works using LSTMs, going over some famous LSTM networks such as GNMT, and explaining some accuracy measures such as the BLEU score.
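For readers curious about the BLEU score this reviewer wanted covered: it combines clipped n-gram precision with a brevity penalty. A toy sentence-level sketch of the core arithmetic (real BLEU, per Papineni et al., is corpus-level, uses up to 4-grams, and applies smoothing; `max_n=2` and the epsilon floor here are simplifications):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    """Toy sentence BLEU: geometric mean of clipped n-gram precisions,
    times a brevity penalty that punishes too-short candidates."""
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())  # clipped
        total = max(sum(cand.values()), 1)
        log_prec += math.log(max(overlap, 1e-9) / total) / max_n
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_prec)

# A candidate identical to the reference scores 1.0; unrelated text
# scores near 0.
```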
By Greg D•
Dec 24, 2020
Spends a lot of time going over tedious implementation details rather than teaching interesting NLP topics and nuances, especially in the assignments. The introduction to Trax seems to be the only saving grace; one bonus star :)))).
Given that Andrew Ng's course is suggested background for this one, this is a big step (read: fall) down.
By Artem R•
Dec 1, 2020
The course could be completed without watching the videos, just by using the hints and comments in the assignments. The videos are short and shallow, and the choice of deep-learning framework (Trax) is questionable; I won't use it in production.
Although the course is 4 weeks long, it could be finished in 4 days. I don't feel it was worth the time.