JH
Oct 4, 2020
Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks
SB
Nov 20, 2020
The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.
By Woosung Y
•Nov 7, 2020
Great course for understanding the basic concepts of the attention module in NLP. What I learned in this course is mainly based on text data processing. (I feel that applying it to voice or sound data will be a little different.) I was able to build a solid understanding through practical examples.
One thing that I felt was lacking is the theoretical background on convergence. I don't understand why such an NLP model can converge to an optimal solution. It may work. But why? I need to search the literature further.
By Anna Z
•Jul 12, 2022
I loved the instructor's clear explanation! Thanks for extracting the essence of these cutting-edge models and teaching that to me.
Towards the end of the specialization (e.g. Course 4), I felt that the assignments were not designed to let me really understand what's going on inside the neural network. Perhaps that's not one of the course's goals -- in that case, it'd be nice to get an idea of the extent to which people working in different positions in industry master the mechanisms inside each network.
By Fan P
•Nov 18, 2020
The materials in Week 3 are not sufficiently clear to explain the BERT model. The instructors sometimes repeated themselves and only explained the surface without ever going deeper. The Week 3 assignment was also designed without much room for comprehension. One very interesting thing that wasn't mentioned is how the BERT model manages to be trained by the self-supervised method on almost any dataset. It would be good to show how BERT prepares its training dataset, and whether this can be generalized to other types of datasets.
By Luis F C d L
•Apr 3, 2022
The course is one of a kind in the sense that there are very few courses that try to tackle the transformers/attention subject.
I really enjoyed the content, but there were some times when I'd have liked more depth on the implementation side of things...
I know the subject is very complex, so in general I appreciate the effort of putting this kind of content out for us. Thanks to that, I hope I can evolve the models we use in my company into these state-of-the-art transformers.
By Bilgehan E
•Mar 31, 2024
The syllabus and lab exercise outlines are commendable. However, the quality of the lectures and lab descriptions falls short. For a reference to the expected standard, consider Andrew Ng's previous machine learning courses. At a minimum, please use ChatGPT to enhance the language quality of the Jupyter descriptive text. Despite the highly enjoyable topic, following this course was as frustrating as Andrew Ng’s courses were enjoyable.
By Jerry C
•Nov 3, 2020
Overall, the course is a nice introduction to new and cutting-edge NLP techniques using deep learning, with good explanations and diagrams. The course involves a bit too much hand-holding: a large part of the assignments can be completed easily from the hints without deeply understanding what is going on. Also, there are occasional typos and incoherent wording that detract from the overall experience.
By Lucas B P
•Mar 10, 2023
The course has a really rich content, and it is very well explained for the most part.
However, there are errors in some tasks that the team has not fixed for many months now. The labs run on low-end machines, which can really get in the way of finishing some of the tasks. The third week is a bit rushed as well.
Overall, it is a really good course and totally worth it.
By Stephen S
•Jun 21, 2021
Content-wise it's excellent as always. I am not giving 5 stars for two reasons: a) the audio, including the transcript, is sometimes not of the best quality (in English), as if it were generated by a machine; b) the readings are very brief and just quickly summarize what was taught in the video (they could go into more depth). I would give 4.5 stars if that were possible.
By Keith B
•Aug 4, 2022
I don't think I got much from the lectures or the assignments in the last two weeks of the course (weeks 3 and 4). However, the ungraded labs in week 4 (Reformer LSH and Revnet) were brilliant and really helped me to better understand much of the material from weeks 3 and 4. If I were doing it again, I would probably skip the lectures and just do those labs.
By Amey N
•Oct 4, 2020
The course gives an encompassing overview of the latest tools and technologies driving the NLP domain. Thus, the focus gradually shifts from implementation towards design.
Since the models require specialized equipment, they go beyond the capabilities of a personal computer and create a need for high-performance computing.
By Audrey B
•Jan 4, 2022
Great content, although the focus is definitely more on the attention mechanisms and the Transformer architecture than on the applications themselves. I still really enjoyed it, and I now feel like I have a better grasp of transfer learning and its associated methods. The content is very clear and well explained.
By Ankit K S
•Nov 30, 2020
This is really an interesting specialization with lots of things to learn in the domain of NLP, ranging from basic to advanced concepts. It covers the state-of-the-art Transformer architecture in great detail. The only thing I felt uncomfortable with is the use of the Trax library in the assignments.
By Vijay A
•Nov 20, 2020
Covers the state of the art in NLP! We get an overview and a basic understanding of designing and using attention models. Each week deserves to be a course in itself -- an entire specialization could have been designed around attention-based models so that we could learn and understand them better.
By Alexandre B
•May 20, 2023
This course is quite complete, as it presents the major hot NLP tasks with transformers, but unfortunately it presents only one framework: Trax, and not Hugging Face's, which is also really useful and widely used in the field. I would have liked a lesson about ChatGPT-like models.
By Naman B
•Apr 28, 2021
It would have been better if we used standard frameworks like PyTorch instead of Trax. Also, the course videos are a bit confusing at times. It would have been great if the math had been taught the way Andrew Ng taught it in the Deep Learning course.
By Cees R
•Nov 29, 2020
Not being new to NLP, I enjoyed this course and learned things I didn't know before. From an educational perspective, I didn't like that the two "optional" exercises were much harder than the too-easy "fill in x here" assignments.
By Zicong M
•Dec 14, 2020
Overall good quality, but it seems a bit short, and the content is squeezed in.
I don't like the push for Trax either; it has not yet become mainstream, and personally I don't find it helpful for my professional career.
By Jonas B
•Apr 11, 2023
A good course that I can recommend without a doubt. I would strongly recommend complementing it by reading the additional resources (see Week 4 -> "References") as well as the Hugging Face NLP tutorial.
By Gonzalo A M
•Jan 21, 2021
I think we could have gone deeper in the last course, because you taught a lot of complex concepts, but I did not feel confident enough to replicate them. It would have been better to explain transformers in more detail.
By Cornel M
•Jun 18, 2023
The lectures need more insights so that we understand not only the "how" but a reasonable amount of the "why", too. Andrew is very good at doing this in his lectures, providing his intuitions and insights.
By Vishwam G
•Mar 10, 2024
It could have been better if the Transformers library from Hugging Face had been explored more, and if topics like Vision Transformers and the use of Transformers for computer vision had been covered.
By CLAUDIA R R
•Sep 7, 2021
It's a great course, more difficult than I expected, but very well structured and explained, although more didactic free videos from other websites can complement the lessons.
By Anonymous T
•Oct 15, 2020
Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.
By Qiao D
•Nov 4, 2022
The content is great, but it would be even better if we gained a more in-depth understanding of the material rather than a very quick crash course.
By Moustafa S
•Oct 3, 2020
Good course; it covers everything, I guess. The only downside for me is the Trax portion. I would've preferred if it were on TF, maybe, but still, great job.