
I think Andrej is arguing over and above the reasons you cite. Not only should you learn backprop for the same reason you learn to do 2+2, you should ALSO learn it because it's a leaky abstraction.

This is a non-trivial statement, because other abstractions are not leaky. For example, he's not arguing that deep learning practitioners should also learn assembly programming or dig into how CUBLAS implements matrix multiplication. Those things are nice to know, but you won't need them 99.9% of the time. Backprop knowledge, however, is much more crucial for designing novel deep learning systems.
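
A quick sketch of the kind of leak this refers to (a NumPy toy with made-up numbers, not taken from Andrej's post verbatim): a saturated sigmoid has a local gradient a*(1-a) that is nearly zero, so whatever gradient flows in from above is mostly killed on the way back. Autograd computes this correctly and silently; you only notice why the layer stopped learning if you know the backward rule.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Forward: a single sigmoid unit pushed into saturation.
    z = 10.0                    # large pre-activation
    a = sigmoid(z)              # ~0.99995

    # Backward: d(sigmoid)/dz = a * (1 - a)
    local_grad = a * (1.0 - a)  # ~4.5e-5, so upstream gradients are nearly wiped out
    print(a, local_grad)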



> Backprop knowledge, however, is much more crucial for designing novel deep learning systems.

I would argue that it's not just for that. You need to understand what is happening inside a DNN if you want to construct it properly.
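
For example (another hypothetical NumPy toy, numbers made up): if one oversized update drives a ReLU unit's bias far negative, the unit never fires, backprop's (pre > 0) mask is all zeros, and the unit receives zero gradient forever, i.e. it is dead. You would only anticipate or debug that while constructing the network if you know what the backward pass actually does.

    import numpy as np

    np.random.seed(0)

    # A single ReLU unit on a toy batch; the large negative bias stands in
    # for what one overly large gradient update might leave behind.
    x = np.random.randn(100, 10)
    w = np.random.randn(10) * 0.1
    b = -20.0

    pre = x @ w + b
    print("fraction of inputs that activate:", (pre > 0).mean())  # 0.0

    # Backprop through ReLU multiplies incoming gradients by (pre > 0),
    # so this unit's weights get exactly zero gradient and never recover.
    grad_mask = (pre > 0).astype(float)
    print("total gradient mask:", grad_mask.sum())                # 0.0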



