MK
Mar 14, 2024
Can't express how thankful I am to Andrew Ng. He literally taught me from start to finish when my school didn't touch on this at all. I learned a lot and decided to apply my knowledge to real-world projects.
JY
Oct 30, 2018
The lectures cover lots of SOTA deep learning algorithms and are well designed and easy to understand. The programming assignments are really good for reinforcing the lectures.
By T?t T V
Feb 11, 2018
good
By Han C
Feb 6, 2018
Good
By Nurtas K
Mar 10, 2025
мрм
By Shakirullah K
Feb 12, 2025
n/a
By 华卓隽
May 13, 2019
666
By 莫毅啸
Aug 3, 2018
Thanks
By 黄家鸿
Jun 12, 2018
Very good
By 雷后超
Apr 21, 2018
666
By Sylvain D
Feb 12, 2018
top
By 杨天奇
Apr 11, 2025
Very good
By DuongTHQE180049
Mar 6, 2025
ok
By Souleymane D
Sep 9, 2022
ok
By Mohamed M
Sep 28, 2020
<3
By Parth S
Jan 4, 2020
kk
By Ming G
Aug 26, 2019
gj
By Pham X V
Nov 6, 2018
:)
By Amira K
Jun 13, 2025
-
By Wassana K
Jun 7, 2021
By Srikanta P S
Apr 15, 2021
A
By Abdou L D
Jul 16, 2020
-
By Jainil K
Aug 12, 2019
-
By Musa A
Jul 9, 2019
A
By 郑毅腾
May 14, 2018
i
By wangdawei
Mar 30, 2018
Great
By Mathias S
Apr 23, 2018
The Sequence Models course was the one I sought out in the Deep Learning Specialization. The assignments are very interesting, e.g. neural machine translation, music composition, etc., and in my opinion much more interesting than the convolutional network ones. However, it is also much harder to follow; it is probably the most difficult of the five courses.
Prof. Ng did a wonderful job delivering the material, as always. However, I expected a lot more detail about sequence models and recurrent networks, on par with what was given in the previous courses. I was looking forward to learning about these models in more depth, but I didn't feel I got everything I wanted. For example, I wish there were a step-by-step walkthrough of the backpropagation through time (BPTT) algorithm, especially for the LSTM and GRU models.
The assignments were a little more difficult to follow, I think. To me, the instructions were not as clear as in the previous courses, especially when using Keras objects/layers: they say "use this *object/layer*", but it wasn't clear whether or not to fiddle with the arguments. Usually, when a specific argument value is required (e.g. axis=x), it is mentioned either in the text or in code comments. I guess it's a good challenge, but I found myself doing more trial and error to get the code to work instead of having guidance on how to use those Keras objects/layers. The discussion forums do help, however. Lastly, some of the assignments involved building a recurrent model using Keras layers, and I felt there was not enough explanation of why that architecture, those layers, or those hyperparameter values were chosen.
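To make the kind of Keras layer usage described above concrete, here is a minimal sketch; the layer choices, sizes, and argument values are assumptions for illustration only, not taken from the course assignments:

```python
# Hypothetical example of the "use this layer" pattern: most arguments stay at
# their defaults, and only a few (layer sizes, activation) are set explicitly.
import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

Tx, n_features, n_hidden, n_classes = 30, 10, 64, 5   # made-up sizes

x = Input(shape=(Tx, n_features))              # (time steps, features) per example
a = LSTM(n_hidden)(x)                          # default return_sequences=False: last hidden state only
y = Dense(n_classes, activation='softmax')(a)  # softmax over the last axis by default

model = Model(inputs=x, outputs=y)
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Quick shape check with random data: output is (batch, n_classes)
print(model.predict(np.random.rand(2, Tx, n_features)).shape)
```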
Overall, I liked the course, learned a lot from it, and enjoyed the models we got to play with in the assignments. I think I will still run into problems trying to devise my own sequence models and will fumble with Keras. I wish there were a more in-depth course on sequence models. Prof. Ng's delivery was excellent; I enjoyed listening to every one of his lectures (even at 2x speed) :)
Thank you to Prof. Ng, and all the people who worked hard to develop the course.