
Learner reviews and feedback for Mathematics for Machine Learning: Multivariate Calculus by Imperial College London

3,126 ratings
526 reviews

About the Course

This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the "rise over run" formulation of a slope, before converting this into the formal definition of the gradient of a function. We then build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces, and even put this into action using an interactive game. We look at how calculus can be used to build approximations to functions, as well as to quantify how accurate we should expect those approximations to be. We also spend some time discussing where calculus comes up in the training of neural networks, before finally showing how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look up concepts yourself when you get stuck. Hopefully, without going into too much detail, you'll still come away with the confidence to dive into more focused machine learning courses in the future.
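As an illustrative aside (not taken from the course materials), the "rise over run" formulation the description mentions can be sketched numerically: the slope of f at x is approximately the rise f(x + h) - f(x) divided by the run h, for a small step h.

```python
def slope(f, x, h=1e-6):
    """Forward-difference ("rise over run") approximation to f'(x)."""
    return (f(x + h) - f(x)) / h

# Example: f(x) = x**2 has exact derivative 2x, so the slope at x = 3
# should come out close to 6.
print(round(slope(lambda x: x * x, 3.0), 4))
```

Shrinking h drives this approximation toward the formal definition of the derivative as a limit, which is the step the course takes next.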

Top Reviews


Nov 13, 2018

Excellent course. I completed this course with no prior knowledge of multivariate calculus and was successful nonetheless. It was challenging and extremely interesting, informative, and well designed.


Aug 04, 2019

Very well explained. Good content and a great explanation of it. Complex topics are also covered in a very accessible way. Very helpful for learning much more complex machine learning topics in the future.

Filter by:

376 — 400 of 526 reviews for Mathematics for Machine Learning: Multivariate Calculus

by Mohamed H

Aug 05, 2019



Apr 02, 2019

very good

by Yash V P

Mar 25, 2019

very cool

by Nidal M G

Nov 11, 2018

very good

by Edward K

Sep 04, 2018

very nice

by Bielushkin M

Jun 08, 2018


by Kuo P

Mar 15, 2018


by Rodrigo F

Sep 18, 2019


by Мусаллямов Д Н

May 31, 2019


by James A

Jan 14, 2019


by AMIT K A

Jul 27, 2018



by Wong Y W M

Feb 21, 2020


by Bálint - H F

Mar 20, 2019

Great!

by Shanxue J

May 23, 2018


by Liang Y

Jun 21, 2019


by Shuvo D N

May 26, 2019


by Nitish K S

Jul 18, 2018

nice!

by Kailun C

Jan 25, 2020


by Nathan L

Mar 06, 2020


by Zhao J

Sep 11, 2019



Jun 26, 2018


by Omar D

May 05, 2020


by Rinat T

Aug 01, 2018

The part about neural networks needs improvement (some more examples of simple networks, and an explanation of the emergence of the sigmoid function). Exercises on partial derivatives should focus more on the various aspects of partial differentiation rather than on taking partial derivatives of complicated functions. I felt there was too much of the latter, which is not very efficient, because the idea of partial differentiation is easy to master, but its applications are not always. Just taking partial derivatives of sophisticated functions (be it for the sake of a Jacobian or Hessian calculation) turns into doing lots of algebra whose underlying idea has long been understood. So while some of the existing exercises on partial differentiation, the Jacobian, and the Hessian should be retained, about 50 percent of them should be replaced with exercises that are light on algebra and instead demonstrate the different ways and/or applications in which partial differentiation is used. Otherwise, all good.

by Yaroslav K

Apr 08, 2020

1) The course is delivered entirely in British English, with a number of words and phrases rarely used globally. 2) The pace of the course is just not suitable for me. If you don't have a strong maths or engineering background, you will need to search for explanations elsewhere (Khan Academy is a great resource, etc.). Closer to the end of the course I stopped having a full understanding of what was going on and why. I could calculate things, but I don't feel I will be able to do so in one or two weeks, because I didn't have the time or opportunity to consolidate the skills I had gained. 3) I also don't understand why the instructors (especially David) don't visualise what they say, as Sal or Grant do: they draw on the board, on the plots, and so on. Sometimes it feels like you are just listening to an audiobook about maths.

I will take the Stanford ML course after this one, and will also review what I've learned here using Khan Academy.

by Jack C

May 31, 2020

Great course! It was a pleasure to learn multivariate calculus, and Sam Cooper was great! I was even able to understand neural networks, which I had always found confusing! However, surprisingly, the final two weeks taught by David Dye, on optimisation and regression, were not taught as well. I did not understand how to use them in practice, mainly because gradient descent, an important algorithm, was not explained very well. This was so surprising because David Dye was amazing in Linear Algebra, where I understood everything very well. Thank you, Imperial College London, for this great course, and I hope you edit it to explain gradient descent better.
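For readers unfamiliar with the gradient descent algorithm this review mentions, a minimal one-dimensional sketch follows. This is an illustrative example only, assuming a known derivative, and is not the course's implementation.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient to minimise a function."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move downhill by lr times the local slope
    return x

# Minimise f(x) = (x - 2)**2, whose gradient is 2*(x - 2); the minimum
# is at x = 2, and the iterates converge there geometrically.
x_min = gradient_descent(lambda x: 2 * (x - 2), x0=0.0)
print(round(x_min, 4))
```

The learning rate lr controls the step size: too small and convergence is slow, too large and the iterates can overshoot or diverge.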