Deep-Dive into Tensorflow Activation Functions

Offered by
Coursera Project Network
In this Guided Project, you will:

Learn when, where, why, and how to use different activation functions, and which situations call for each

Code each activation function from scratch in Python

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

You've learned how to use TensorFlow. You've learned the important functions, how to design and implement sequential and functional models, and have completed several test projects. What's next? It's time to take a deep dive into activation functions: the essential function in every node and layer of a neural network, deciding whether to fire or not to fire and, in most cases, adding an element of non-linearity.

In this 2-hour guided project, you will join me in a deep dive into an exhaustive list of activation functions usable in TensorFlow and other frameworks. I will explain the working details of each activation function, describe the differences between them along with their pros and cons, and demonstrate each function in use, both from scratch and within TensorFlow. Join me and boost your AI & machine learning knowledge, while also earning a certificate to boost your resume in the process!

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
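To make the "from scratch and within TensorFlow" comparison concrete, here is a minimal sketch (illustrative, not the course's exact code) of ReLU written with NumPy and checked against TensorFlow's built-in tf.nn.relu:

    import numpy as np
    import tensorflow as tf

    def relu(x):
        # ReLU keeps positive inputs and zeroes out the rest.
        return np.maximum(0.0, x)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))                # [0.  0.  0.  1.5 3. ]
    print(tf.nn.relu(x).numpy())  # matches TensorFlow's built-in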

Skills you'll gain

  • Neural Network Activation Functions
  • Deep Learning
  • Artificial Neural Network
  • Python Programming
  • TensorFlow

Learn step-by-step

In a video that plays in a split screen with your work area, your instructor will walk you through these steps:

  1. Review the Activation Functions, Their Properties & the Principle of Nonlinearity

  2. Implementing Linear and Binary Step Activations (sketch below)

  3. Implementing Ridge-based Activation Functions (ReLU family; sketch below)

  4. Implementing Variations of ReLU & the Swish Family of Non-Monotonic Activations (sketch below)

  5. Implementing Radial-based Activation Functions (RBF family; sketch below)
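
A minimal sketch of step 2, assuming the standard textbook definitions (the function names and the NumPy approach are illustrative, not the course's exact code):

    import numpy as np

    def linear(x, a=1.0):
        # Linear activation: output stays proportional to the input.
        return a * x

    def binary_step(x):
        # Fires 1 once the input reaches the threshold at 0, else 0.
        return np.where(x >= 0.0, 1.0, 0.0)

    x = np.linspace(-2.0, 2.0, 5)   # [-2. -1.  0.  1.  2.]
    print(linear(x))                # [-2. -1.  0.  1.  2.]
    print(binary_step(x))           # [0. 0. 1. 1. 1.]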
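For step 3, a sketch of three common ridge-based activations; the alpha defaults are illustrative choices:

    import numpy as np

    def relu(x):
        # Zero for negative inputs, identity for positive ones.
        return np.maximum(0.0, x)

    def leaky_relu(x, alpha=0.1):
        # Keeps a small slope (alpha) on the negative side so gradients survive.
        return np.where(x >= 0.0, x, alpha * x)

    def elu(x, alpha=1.0):
        # Smooth exponential negative branch that saturates at -alpha.
        return np.where(x >= 0.0, x, alpha * (np.exp(x) - 1.0))

    x = np.array([-3.0, -1.0, 0.0, 2.0])
    print(relu(x), leaky_relu(x), elu(x), sep="\n")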
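For step 4, a sketch of Swish, the best-known non-monotonic activation; with beta = 1 it matches TensorFlow's built-in tf.nn.silu:

    import numpy as np
    import tensorflow as tf

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def swish(x, beta=1.0):
        # Non-monotonic: dips slightly below zero for small negative
        # inputs before flattening out, unlike ReLU.
        return x * sigmoid(beta * x)

    x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
    print(swish(x))
    print(tf.nn.silu(x).numpy())  # TensorFlow's Swish/SiLU (beta = 1)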
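And for step 5, a sketch of a Gaussian radial basis activation; center and gamma are illustrative parameter names, not the course's exact code:

    import numpy as np

    def gaussian_rbf(x, center=0.0, gamma=1.0):
        # Peaks at the center and decays with squared distance,
        # unlike ridge activations, which act on a one-sided projection.
        return np.exp(-gamma * (x - center) ** 2)

    x = np.array([-2.0, 0.0, 2.0])
    print(gaussian_rbf(x))  # [~0.018  1.  ~0.018]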

How Guided Projects work

Your workspace is a cloud desktop accessed right in your browser, with no download required

In a split-screen video, your instructor guides you step by step

Frequently Asked Questions (FAQ)

More questions? Visit the Learner Help Center.