Chapter 5: Sequence Models
Course video: https://www.bilibili.com/video/BV1FT4y1E74V?p=151&vd_source=d0416378a50b5f05a80e1ed2ccc0792f
Covered content:
Chapter 5: Sequence Models
Week 1: Recurrent Neural Networks
1.1 Why Sequence Models?
1.2 Notation
1.3 Recurrent Neural Network Model
1.4 Backpropagation Through Time
1.5 Different Types of RNNs
1.6 Language Model and Sequence Generation
1.7 Sampling Novel Sequences
1.8 Vanishing Gradients with RNNs
1.9 Gated Recurrent Unit (GRU)
1.10 Long Short-Term Memory (LSTM) Unit
1.11 Bidirectional RNN
1.12 Deep RNNs
Week 2: Natural Language Processing and Word Embeddings
2.1 Word Representation
2.2 Using Word Embeddings
2.3 Properties of Word Embeddings
2.4 Embedding Matrix
2.5 Learning Word Embeddings
2.6 Word2Vec
2.7 Negative Sampling
2.8 GloVe Word Vectors
2.9 Sentiment Classification
2.10 Debiasing Word Embeddings
Week 3: Sequence Models & Attention Mechanism
3.1 Basic Models
3.2 Picking the Most Likely Sentence
3.3 Beam Search
3.4 Refinements to Beam Search
3.5 Error Analysis in Beam Search
3.6 BLEU Score (optional)
3.7 Attention Model Intuition
3.8 Attention Model
3.9 Speech Recognition
3.10 Trigger Word Detection
Notes:
