Dropout: A Simple Way to Prevent Neural Networks from Overfitting

Released Saturday, 12th December 2020

In today’s episode I talk about one of the most important papers in deep learning. Dropout was introduced to prevent neural networks from overfitting: during training, units are randomly dropped from the network so that they cannot co-adapt too strongly. What’s fascinating is that the idea of dropping out units in a neural network was inspired by the Darwinian theory of evolution. It is absolutely amazing when people derive ideas from the natural world and translate them into different scientific disciplines. I had a really wonderful time reading the paper and understanding its core ideas. For this episode, I have changed the format a bit. You will understand what I mean by that if you have followed my episodes thus far.
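For listeners who want to see the idea in code, here is a minimal sketch of the "inverted" dropout variant commonly used in practice, assuming NumPy; the function name and rate are illustrative, not from the paper or the episode. Note that the paper itself keeps units with probability p and scales the weights by p at test time, whereas this variant rescales during training so the test-time network is unchanged.

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout sketch: zero each unit with probability p_drop
    during training and rescale the survivors by 1 / (1 - p_drop), so
    no change is needed at test time. (The original paper instead keeps
    units with probability p and scales the weights at test time.)"""
    if rng is None:
        rng = np.random.default_rng()
    if not training or p_drop == 0.0:
        return activations  # at test time the layer is left untouched
    mask = rng.random(activations.shape) >= p_drop  # keep with prob 1 - p_drop
    return activations * mask / (1.0 - p_drop)

# Example: drop roughly half of a hidden layer's activations.
h = np.ones((2, 8))
print(dropout(h, p_drop=0.5))
```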

Paper: https://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf

Please follow me on Spotify if you like this podcast. Suggest it to your friends and colleagues if you think I am worth their time. That would mean a lot to me.

Also, I would really like it if you gave me feedback or suggestions. Even if you have none, write me an email nonetheless! I would love to get to know you. Email: paperinanutshell@gmail.com

If you like my episodes, please follow me on Facebook, Instagram and Twitter. Follow me on Spotify and Apple Podcasts.

Facebook: https://www.facebook.com/Paper-In-A-Nutshell-101609351791726
Instagram: https://www.instagram.com/paperinanutshell/
Twitter: https://twitter.com/NutshellPaper
Streaming links: https://anchor.fm/debayan-bhattacharya
