Machine Learning Feature Selection in Python

4.1 stars (44 ratings)
Offered by Coursera Project Network
1,957 already enrolled
In this Guided Project, you will:

Demonstrate univariate filtering methods of feature selection such as SelectKBest

Demonstrate wrapper-based feature selection methods such as Recursive Feature Elimination

Demonstrate feature importance estimation, dimensionality reduction, and lasso regularization techniques

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

In this 1-hour long project-based course, you will learn basic principles of feature selection and extraction, and how they can be implemented in Python. Together, we will explore basic Python implementations of Pearson correlation filtering, SelectKBest (f_classif) filtering, backward sequential selection, recursive feature elimination (RFE), feature importance estimation using bagged decision trees, lasso regularization, and dimensionality reduction using Principal Component Analysis (PCA). We will focus on the simplest implementations, usually using Scikit-Learn functions. All of this will be done on Ubuntu Linux, but it can be accomplished in any Python IDE on any operating system. We will use the IDLE development environment to demonstrate several feature selection techniques on the publicly available Pima Diabetes dataset. Learners are encouraged to experiment with these techniques not only for feature selection, but for hyperparameter tuning as well. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
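As a taste of what the exercises look like, here is a minimal sketch of Pearson correlation filtering. It assumes the Pima Diabetes data is saved locally as pima-indians-diabetes.csv with an Outcome label column; both the filename and the 0.2 threshold are illustrative assumptions, not the course's exact setup.

```python
# Pearson correlation filtering: keep features most correlated with the label.
import pandas as pd

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']   # 8 predictors, binary diabetes label

# Absolute Pearson correlation of each feature with the target
correlations = X.corrwith(y).abs().sort_values(ascending=False)
print(correlations)

# Retain only features whose correlation clears an illustrative threshold
selected = correlations[correlations > 0.2].index.tolist()
print('Selected features:', selected)
```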

Skills you will develop

Data Science, Python Programming, Scikit-Learn

Learn step by step

In a video that plays in a split screen with your workspace, your instructor will guide you through these steps:

  1. Define terms relating to Feature Selection and Dimensionality Reduction

  2. Introduce algorithms with embedded Feature Selection

  3. Demonstrate two univariate selection methods: Pearson Correlation Filtering and SelectKBest f_classif (sketch below)

  4. Demonstrate two wrapper methods: Backward Sequential Selection and RFE (sketch below)

  5. Demonstrate Feature Importance Estimation using Bagged Decision Trees (sketch below)

  6. Demonstrate Dimensionality Reduction using Principal Component Analysis (sketch below)

  7. Demonstrate Lasso Regularization (sketch below)

  8. Expand these concepts to hyperparameter optimization and model selection
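For step 3, here is a minimal sketch of univariate filtering with SelectKBest and the ANOVA F-test (f_classif); the value k=4 and the local CSV path are assumptions.

```python
# Univariate filtering with SelectKBest + f_classif (ANOVA F-test).
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']

selector = SelectKBest(score_func=f_classif, k=4)    # keep the 4 highest-scoring features
selector.fit(X, y)

# Report each feature's F-score and whether it survived the filter
for name, score, kept in zip(X.columns, selector.scores_, selector.get_support()):
    print(f'{name}: F={score:.2f}, kept={kept}')
```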
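For step 4, a sketch of the two wrapper methods using scikit-learn's SequentialFeatureSelector (backward direction) and RFE around a logistic regression; the estimator choice and the number of features to keep are assumptions.

```python
# Wrapper methods: backward sequential selection and recursive feature elimination.
import pandas as pd
from sklearn.feature_selection import RFE, SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']
X_scaled = StandardScaler().fit_transform(X)         # scale so the solver converges cleanly

estimator = LogisticRegression(max_iter=1000)

# Backward sequential selection: start with all features, greedily drop the weakest
sfs = SequentialFeatureSelector(estimator, n_features_to_select=4, direction='backward')
sfs.fit(X_scaled, y)
print('Backward sequential keeps:', list(X.columns[sfs.get_support()]))

# RFE: fit, rank features by coefficient magnitude, prune the weakest, repeat
rfe = RFE(estimator, n_features_to_select=4)
rfe.fit(X_scaled, y)
print('RFE keeps:', list(X.columns[rfe.get_support()]))
```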
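For step 5, a sketch of feature importance estimation with an ensemble of bagged, randomized decision trees; ExtraTreesClassifier is used here as one reasonable choice, not necessarily the exact estimator from the course video.

```python
# Feature importance from an ensemble of bagged, randomized decision trees.
import pandas as pd
from sklearn.ensemble import ExtraTreesClassifier

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']

model = ExtraTreesClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Higher values mean the feature contributed more to impurity reduction across trees
for name, importance in sorted(zip(X.columns, model.feature_importances_),
                               key=lambda pair: pair[1], reverse=True):
    print(f'{name}: {importance:.3f}')
```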
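For step 6, a sketch of dimensionality reduction with PCA; standardizing first and keeping three components are assumptions.

```python
# Dimensionality reduction with Principal Component Analysis.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']

X_scaled = StandardScaler().fit_transform(X)         # PCA is sensitive to feature scale
pca = PCA(n_components=3)                            # project onto 3 principal components
X_reduced = pca.fit_transform(X_scaled)

print('Explained variance ratio:', pca.explained_variance_ratio_)
print('Reduced shape:', X_reduced.shape)             # (n_samples, 3)
```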
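For step 7, a sketch of lasso (L1) regularization used as an embedded feature selector; the alpha value is arbitrary, and coefficients driven exactly to zero mark features the penalty discards.

```python
# Lasso (L1) regularization as an embedded feature selector.
import pandas as pd
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

df = pd.read_csv('pima-indians-diabetes.csv')        # hypothetical local path
X, y = df.drop(columns=['Outcome']), df['Outcome']

X_scaled = StandardScaler().fit_transform(X)         # the L1 penalty assumes comparable scales
lasso = Lasso(alpha=0.05)                            # alpha chosen arbitrarily for illustration
lasso.fit(X_scaled, y)

# Coefficients shrunk exactly to zero are effectively dropped by the penalty
for name, coef in zip(X.columns, lasso.coef_):
    print(f'{name}: coef={coef:.3f} ({"kept" if coef != 0 else "dropped"})')
```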

How Guided Projects work

Your workspace is a cloud desktop accessed right from your browser; no download is required

In a split-screen video, your instructor guides you step by step

Frequently Asked Questions (FAQ)

More questions? Visit the Learner Help Center.