
Reviews and feedback from learners of Natural Language Processing with Probabilistic Models from the institution

1,240 ratings
219 reviews

About the course

In Course 2 of the Natural Language Processing Specialization, you will: a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming, b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics, c) Write a better auto-complete algorithm using an N-gram language model, and d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
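As a rough illustration of the first topic above, here is a minimal minimum-edit-distance sketch using dynamic programming. This is an illustrative example, not the course's actual assignment code; the function name and the insert=1 / delete=1 / replace=2 cost scheme are assumptions (a common convention for auto-correct exercises):

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    """Cost of converting `source` into `target` via insert/delete/replace."""
    m, n = len(source), len(target)
    # D[i][j] = minimum cost of converting source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):          # delete all of source[:i]
        D[i][0] = i * del_cost
    for j in range(1, n + 1):          # insert all of target[:j]
        D[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,      # delete
                          D[i][j - 1] + ins_cost,      # insert
                          D[i - 1][j - 1] + r)         # replace or match
    return D[m][n]

print(min_edit_distance("play", "stay"))  # two replacements -> 4
```

An auto-correct system would compute this distance between a misspelled word and candidate corrections, then rank candidates by distance (and, in the probabilistic version taught here, by word frequency).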

Top reviews

Dec 12, 2020

A truly great course: it focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).

Dec 2, 2020

A neatly organized course introducing students to the basics of processing text data, learning word embeddings and, most importantly, how to interpret them. Great job!!

Filter by:

1 — 25 of 218 reviews for Natural Language Processing with Probabilistic Models

by sukanya n

Jul 21, 2020

The Viterbi algorithm could be explained better, and Week 4 seemed very rushed, with lots of details just glossed over. The Week 4 assignment seemed pretty easy compared to previous weeks.

by Oleh S

Aug 3, 2020

This course is more mature than the first one. The materials are very interesting and provide nice intuition for the probabilistic models. One can study the basics of auto-correction, Markov and Hidden Markov Models, as well as N-gram models and a very important approach, Word2Vec, which is an essential part of modern deep learning algorithms. I really enjoyed this part.

However, there are some minor suggestions:

1. Lecture durations could be longer; that would help provide more depth in the materials on both the math and code sides. I know this is a simplified version of a real academic course, but in order to increase the quality you should consider increasing the duration;

2. Programming assignments are not balanced and there are still some minor ambiguities. For instance, the first and HMM assignments are tough, whereas the last one is a piece of cake.

3. The course could be enhanced with an additional part dedicated to probability theory, maybe a few more lectures.

I recommend this course to everyone interested in NLP. Note that you should read and study the additional resources to reinforce your knowledge; this is just the starting point for a good researcher. Keep going, guys!

by Dan C

Jul 8, 2020

Lots of quality control issues. Using paid customers as proofreaders is tacky.

by ES

Jul 7, 2020

Homework is too easy. The answers are pretty much given to us.

by Gabriel T P C

Aug 3, 2020

The lessons are shallow, the exercises too repetitive.

by Kabakov B

Sep 6, 2020

It is the worst course ever. It is too simple for those who already took the Deep Learning Specialization and too difficult for newcomers. The 'lectures' are too superficial, and you will barely understand a thing. But the tasks are huge: a lot of spaghetti code with several levels of nested IFs and constructions like A[i][j:k][l+1]. You will spend your time writing a bad implementation of things that have been implemented 100K times over, which will not give you any insight into how they work because of the lack of theory. And nobody will teach you to use standard tools on simple, understandable examples.

by Zhendong W

Jul 10, 2020

A great course indeed! However, it would be even nicer to have the lecture videos at a slower pace, and maybe go through the examples in more detail. Sometimes it felt too quick to jump directly from the theory to the examples.

by Mark M

Jul 19, 2020

This second course, like the first, feels like a first- or second-year university course. Sometimes the explanations are weak or missing. There was no explanation of why the Viterbi algorithm works, and no explanation of how to decide which embedding extraction method (W1 columns, W2 rows, or the average of the two) to use. There seemed to be little or no TA support; many people were posting questions and not receiving answers from TAs. I posted the mistakes I identified in the course content, but I don't think anyone is going to act on this. It would have been good if the last exercise were repeated in TensorFlow, and if the embeddings were actually used for something in the last exercise. From the PCA graph, the embeddings looked pretty poor.

by Manik S

Aug 13, 2020

Although the content is great, the teaching is lacking compared to how Andrew teaches in the Deep Learning Specialization. More interactive teaching with a pen tablet would be more engaging; the whole course feels like a recital of the slides. And the instructor's voice is a little irritating to listen to over longer durations. Otherwise, the course provides a lot of learning if you can bear it.

by Greg D

Dec 27, 2020

The lecture videos are slow and shallow, with little focus on building intuition. Similarly with the assignments: instead of relying on existing libraries (which are popular for a reason), they painfully go through implementing things in detail, which doesn't really help you in any way later on.

I 100% recommend saving your time and money (and the sanity wasted on meticulously hand-rolling things) and doing something else instead.

by John A J

Sep 25, 2020

It was a good introductory course on AutoCorrect, AutoComplete, and creating your own word embeddings. However, I feel the instructor focused too much on implementation details; the concepts of how the pioneers formulated these solutions, the train of thought behind the different algorithms, get lost. Although it taught me a bit of implementation, for me implementation is just the cherry on top, as these things can easily be googled. It would have a better impact if it also taught the concepts and thinking behind these algorithms so that I could reuse the underlying ideas. Overall, it is a good course to get started.

by Laurence G

Mar 16, 2021

The material covered provides a good tour of probabilistic language models; however, the course needs work. Some issues: excessive reading of mathematical formulas without providing the intuition behind them; the section on Viterbi was awful; and a large chunk of Week 4 could be replaced with a single block of PyTorch/TensorFlow code and a note saying: "For more detail, go take the deep learning course."

by Andreas B

Oct 4, 2020

Too many autograder issues. For instance, in Week 4, even if all your code is correct, you get misleading error messages claiming the results are expected to be of a completely different type. There are also some minor math errors, and deeper insights into the math and motivations are missing.

by François D

Jul 17, 2020

Great teacher, good pace in lectures and assignments. There is of course some redundancy with respect to the previous specializations, but it's nice to feel that you understand the content a bit better every time. I didn't find the forums (internal & Slack) very useful; they could be better structured. Can't wait for the next two courses.

by Manzoor A

Aug 20, 2020

Excellent! I know this course is just the beginning of my NLP journey, but I couldn't expect more than this. The ungraded labs are very useful for practicing and then applying what you learn to the assignments. I am giving 5 stars because there are only 5.

by Sohail Z

Aug 19, 2020

Brilliant course!!!! I love every aspect of it. I am really grateful to the team for such amazing courses; they are easy to digest and provide sufficient math knowledge to understand the models.

by Alan K F G

Aug 21, 2020

Professor Younes really makes it easier for me to follow the lectures and stay focused. The structure of the course helped me a lot by having me constantly review the same concepts as I went further, in order to learn new things.

by Saurabh K

Jul 14, 2020

I had a wonderful experience. Try not to look at the hints; solve things yourself. It is an excellent course for getting in-depth knowledge of how the black boxes work. Happy learning.

by Kritika M

Aug 10, 2020

This course is great. Actually the NLP specialization so far has been really good. The lectures are short and interesting and you get a good grasp on the concepts.

by Andrei N

Jul 11, 2020

A great course in the very spirit of Andrew Ng's original ML course, with lots of details and explanations of fundamental approaches and techniques.

by Minh T H L

Jul 31, 2020

Thanks for sharing your knowledge. I was happy throughout the course, and I also left a couple of pieces of feedback for minor improvements. All the best.

by Ajay D

Aug 17, 2020

The course was very insightful about the latest developments in the field of NLP. The exercises were very hands-on, and I loved that. However, it felt a bit incomplete, as I didn't see any large dataset in action; maybe my expectations were wrong. I was also wondering if I could see some more applications of these language models and word embeddings in the course.

by Kravchenko D

Aug 21, 2020

Nice course, but the assignments are less practical than in the first course of this specialization. The last assignment was implementing word-embedding generation using your own neural network. The whole process of writing your own neural network is nice, except that the resulting word embeddings look very bad and ungrouped on the plot, and the text in the notebook says: "You can see that woman and queen are next to each other. However, we have to be careful with the interpretation of this projected word vectors" without any explanation of what's wrong with the results. So I think the last assignment should be reworked to have illustrative results at the end, not just "The word embeddings at the end are bad. Bye-bye, see you in the next course."

by Vitalii S

Jan 9, 2021

Good information and presentation, but the function grading needs work.

The problem is when you do something wrong in function C7, but from the code-and-results point of view the grader thinks it is OK. Only at C11 do you figure out that something went wrong, and then you waste time searching for what is incorrect all the way back down to C7.

I think you should add some kind of unit testing of the functions to make sure each one is really correct.

by Kartik C

May 8, 2021

The content and lectures are very good, but the assignments are overly restrictive, forcing you to do everything in exactly one way and giving you no room to try (and maybe fail sometimes) while exploring different approaches. It feels like during the assignments you are not learning anything, just doing what you are told to do.