
Learner reviews and feedback for Machine Learning: Regression from the University of Washington

4.8 stars
5,287 ratings
987 reviews

About the Course

Case Study - Predicting Housing Prices

In our first case study, predicting house prices, you will create models that predict a continuous value (price) from input features (square footage, number of bedrooms and bathrooms, ...). This is just one of the many places where regression can be applied. Other applications range from predicting health outcomes in medicine, stock prices in finance, and power usage in high-performance computing, to analyzing which regulators are important for gene expression.

In this course, you will explore regularized linear regression models for the task of prediction and feature selection. You will be able to handle very large sets of features and select between models of various complexity. You will also analyze the impact of aspects of your data, such as outliers, on your selected models and predictions. To fit these models, you will implement optimization algorithms that scale to large datasets.

Learning Outcomes: By the end of this course, you will be able to:
- Describe the input and output of a regression model.
- Compare and contrast bias and variance when modeling data.
- Estimate model parameters using optimization algorithms.
- Tune parameters with cross validation.
- Analyze the performance of the model.
- Describe the notion of sparsity and how LASSO leads to sparse solutions.
- Deploy methods to select between models.
- Exploit the model to form predictions.
- Build a regression model to predict prices using a housing dataset.
- Implement these techniques in Python.
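
The description above names a concrete workflow: fit a regularized linear regression to housing features, tune the penalty with cross-validation, and use LASSO's sparsity for feature selection. The following is only a minimal sketch of that pipeline, assuming scikit-learn and its bundled California housing data rather than the course's own materials (which use GraphLab Create / Turi Create and a different housing dataset):

    # Sketch: LASSO regression on housing features with cross-validated penalty.
    # Assumes scikit-learn; not the course's GraphLab Create / Turi Create setup.
    from sklearn.datasets import fetch_california_housing
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Features (median income, house age, rooms, ...) and target (house value).
    X, y = fetch_california_housing(return_X_y=True, as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Standardize so the L1 penalty treats all features on a common scale,
    # then choose the regularization strength alpha by 5-fold cross-validation.
    model = make_pipeline(StandardScaler(), LassoCV(cv=5, random_state=0))
    model.fit(X_train, y_train)

    lasso = model.named_steps["lassocv"]
    print("chosen alpha:", lasso.alpha_)
    print("test R^2:", model.score(X_test, y_test))

    # Sparsity in action: features whose coefficients LASSO shrank to exactly zero.
    zeroed = [name for name, coef in zip(X.columns, lasso.coef_) if coef == 0.0]
    print("features dropped by LASSO:", zeroed)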

Top reviews

KM
May 4, 2020

Excellent professor. Fundamentals and math are provided as well. Very good notebooks for the assignments... it's just the turicreate library that caused some issues; still, the course deserves a 5/5.

PD
March 16, 2016

I really enjoyed all the concepts and implementations I did throughout this course... except during the Lasso module. I found this module harder than the others but very interesting as well. Great course!

751 - 775 of 954 reviews for Machine Learning: Regression

by FOTSING K H C

August 4, 2019

great

by 李真

February 19, 2016

Great

by Vaibhav K

September 20, 2020

good

by YASA S K R

August 31, 2020

good

by ANKAN M

August 16, 2020

nice

by Saurabh A

July 19, 2020

good

by Keyur M

June 9, 2020

good

by Vaibhav S

May 16, 2020

Good

by Vansh S

May 10, 2019

nice

by 王曾

September 25, 2017

good

by Birbal

October 13, 2016

good

by FW Y

August 16, 2017

Learning by doing

by Ganji R

November 8, 2018

E

by Anunathan G S

August 28, 2018

L

by IDOWU H A

May 20, 2018

I

by Ruchi S

November 8, 2017

e

by Alessandro B

September 27, 2017

e

by Navinkumar

February 17, 2017

g

by ngoduyvu

February 16, 2016

v

by Miguel P

December 2, 2015

I

by manuel S

August 13, 2017

Interesting course. However, I have some mixed feelings:

I have a BS in mathematics from Mexico (a "licenciatura", which sits roughly between a BS and an MS).

So, I'd say I have pretty good knowledge of statistics. Now it is "training" instead of "fitting", and "overfitting" instead of "multicollinearity". There are some algorithms to remove/add features (Ridge/Lasso), which, as noted, induce bias in the parameters. However, more "formal" methods such as stepwise regression and Bayesian sequences are completely ignored.

That would be fine, except that there is not even the slightest attempt to address statistical significance, either for the model or for the individual parameters.

Some other methods (moving averages, Henderson MA, Mahalanobis distances) should also be covered.

So, in summary, an interesting course in the sense that it gives an idea of where the state of the art lies, but a little bit disappointing in the sense that, except for some new labels for the same tricks and humongous computing power, there is still nothing new under the sun. Still, worth the time invested.

by Grant R V

February 29, 2016

An excellent and quite extensive foray into regression analyses, from single-variable linear regression to nearest-neighbor and kernel regression techniques, including how to use gradient vs. coordinate descent for optimization and proper L1 and L2 regularization methods. The lecture slides have some questionable pedagogical and aesthetic qualities, and they could use some more polish from someone who specializes in teaching presentation methods, but the meat of the course comes from its quizzes and programming assignments, which are well split between practical use (via Graphlab Create and SFrame) and nuts-and-bolts assignments that have you implement these methods from scratch. An extremely valuable course for someone who wants to use these for a data science application but also wants to understand the mathematics and statistics behind them to an appreciable degree.

by William K

August 23, 2017

The only complaint I have is that the programming exercises were not challenging enough. The lecture videos were great to build up an understanding from fundamentals, but the assignments did not fully test the concepts. There were too many exercises that were fill-in-the-blank with most of the code already written. I would appreciate more rigorous programming exercises to facilitate an in-depth understanding of the topics. Moreover, the programming exercises were not applicable to real-world applications because all the data was already neatly presented and the desired outcome was known ahead of time. In order to mimic real-world machine learning problems, we should be required to clean the data and answer open-ended questions that require exploring and understanding the data before developing machine learning models to extract usable information.

by Denys G

January 14, 2016

Courses like this are always difficult to judge because of the great variety of students Coursera reaches. That is, some class members finished this course in the first week it was open, while others struggled until the last minute. For some the math was too simple; for others the Python programming was too confusing. All in all, it strikes a reasonable balance between novice learners and more advanced students.

What the course could really benefit from is some kind of repository of code from students who successfully completed the assignments, for others to compare against their own. It seems pretty clear that there are some advanced Python users whose insights could help improve one's coding skills.

by Marvin J A

November 27, 2015

(Beta-Test review)

Status: Still on the first week.

The content is easy to follow, though it might be slightly difficult for those without a heavy background in calculus. So far, all the links (to the downloadable CSV and ipynb files) work well. All the videos have no apparent bugs and/or problems. I would also suggest making the slides available for download, as in the previous module.

I don't think writing over the animation is a bad thing as long as it's still understandable.

As an aside, I suggest editing out the swallowing sound you might occasionally hear whenever either instructor is speaking. To some, it seems a bit off-putting.

Great course, overall.

Thanks,

Marvin