Vanishing gradients with RNNs

deeplearning.ai
4.8 (26,151 ratings) | 280K students enrolled
Course 5 of 5 in the Deep Learning Specialization

Skills you will gain

Natural Language Processing, Long Short Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Network, Attention Models

Reviews

4.8 (26,151 ratings)
  • 5 stars
    83.64%
  • 4 stars
    13.13%
  • 3 stars
    2.50%
  • 2 stars
    0.46%
  • 1 star
    0.24%
JY
Oct 29, 2018

The lectures cover lots of SOTA deep learning algorithms, and they are well-designed and easy to understand. The programming assignments are really good for reinforcing the understanding of the lectures.

JS
Jul 12, 2020

Brilliant course, great quality instruction from Andrew Ng. The only faults are that some of the labs have not been supervised properly, being a bit buggy, and a couple of the later lectures were very dry.

From this lesson
Recurrent Neural Networks
Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants including LSTMs, GRUs and Bidirectional RNNs, which you are going to learn about in this section.
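The vanishing-gradient problem that gives this lecture its title can be illustrated with a few lines of NumPy. The sketch below is not from the course assignments; all variable names and constants are hypothetical. It rolls a vanilla RNN forward, then backpropagates a gradient through time, showing how its norm shrinks as it passes repeatedly through the tanh derivative and the recurrent weight matrix.

```python
# Minimal sketch (assumption: not the course's code) of vanishing gradients
# in a vanilla RNN. Backpropagation through time multiplies the gradient by
# diag(1 - h_t^2) and W_hh at every step, so its norm can shrink exponentially.
import numpy as np

rng = np.random.default_rng(0)
hidden_size, timesteps = 16, 50

# Hypothetical small recurrent weights, so each Jacobian is contractive.
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))

h = np.zeros(hidden_size)
hs = []
for t in range(timesteps):
    x_t = rng.normal(size=hidden_size)   # dummy input at step t
    h = np.tanh(W_hh @ h + x_t)          # vanilla RNN hidden-state update
    hs.append(h)

# Backpropagate a unit gradient from the last timestep toward the first.
grad = np.ones(hidden_size)
for t in reversed(range(timesteps)):
    grad = (1.0 - hs[t] ** 2) * grad     # through the tanh nonlinearity
    grad = W_hh.T @ grad                 # through the recurrent weights
    if t % 10 == 0:
        print(f"step {t:2d}: gradient norm = {np.linalg.norm(grad):.3e}")
```

Running this prints a gradient norm that drops by many orders of magnitude over the 50 steps, which is exactly the behavior LSTMs and GRUs are designed to mitigate.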

Instructors

  • Andrew Ng

    Instructor
  • Kian Katanforoosh

    Curriculum Developer
  • Younes Bensouda Mourri

    Teaching Assistant

    Mathematical & Computational Sciences, Stanford University, deeplearning.ai
