How To Avoid Overfitting [For Beginners] [Deep Learning] [Keras]

Gökhan Gerdan
2 min read · Dec 9, 2019

The best way to avoid overfitting is to skip deep learning altogether and use a random forest instead. But why a random forest? Doesn't a random forest overfit? Of course it can overfit too, but increasing the number of trees tends to reduce overfitting, as the sketch below shows.
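Here is a minimal sketch of that idea, assuming scikit-learn and synthetic data (the dataset and parameters are just illustrative): as `n_estimators` grows, the gap between train and test accuracy usually shrinks.

```python
# Sketch: more trees usually lowers a random forest's variance,
# which shrinks the train/test accuracy gap (a sign of overfitting).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

for n_trees in (10, 100, 500):
    model = RandomForestClassifier(n_estimators=n_trees, random_state=42)
    model.fit(X_train, y_train)
    gap = model.score(X_train, y_train) - model.score(X_test, y_test)
    print(f"{n_trees} trees -> train/test accuracy gap: {gap:.3f}")
```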

Neural networks tend to overfit more than classical machine learning algorithms. So what are some simple techniques a beginner can use to prevent overfitting when training neural networks?

[UPDATE] Check this out: I created a Kaggle kernel for this post as an example!

Fewer layers, fewer neurons => less overfitting

If you simplify your model by decreasing the number of layers and the number of neurons per layer, you can reduce overfitting, but prediction accuracy may also decrease if the model becomes too small to capture the pattern.
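As a quick sketch (the input shape and layer sizes here are just placeholders for your own data), compare a big network to a stripped-down one:

```python
# Sketch: a large model vs. a smaller one. The smaller network has
# less capacity to memorize noise, so it tends to overfit less.
from tensorflow.keras import layers, models

# Larger model: more layers and more neurons per layer.
large = models.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(512, activation="relu"),
    layers.Dense(512, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Smaller model: fewer layers and fewer neurons per layer.
# Less overfitting, but possibly lower accuracy if made too small.
small = models.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
```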

Add dropout layers to generalize your model

A dropout layer randomly zeroes a fraction of the previous layer's outputs on each training step. Because a different random subset is dropped every time, the network can't rely on any single neuron, so it learns more robust features and overfits less.
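In Keras this is just a `Dropout` layer between your `Dense` layers. A minimal sketch (input shape and rates are assumptions, not a recipe):

```python
# Sketch: Dropout(0.5) zeroes 50% of the previous layer's outputs
# on each training step; it is automatically disabled at inference.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),   # active only during training
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```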

Stop training early (early stopping)

Stop the training process as soon as the validation loss starts rising, even if you planned to train for more epochs, because a rising validation loss means the model has started memorizing the training data.
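Keras ships an `EarlyStopping` callback that does exactly this. A minimal sketch, reusing the `model` from the dropout example above; `X_train` and `y_train` are hypothetical training arrays you'd supply yourself:

```python
# Sketch: stop when validation loss stops improving and roll back
# to the best weights seen so far.
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="val_loss",         # watch validation loss
    patience=5,                 # tolerate 5 epochs without improvement
    restore_best_weights=True,  # keep the best epoch's weights
)

# X_train, y_train: your own training data (assumed here).
model.fit(
    X_train, y_train,
    validation_split=0.2,  # hold out part of the data for validation
    epochs=200,            # upper bound; training usually stops earlier
    callbacks=[early_stop],
)
```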

Conclusion

These are just the basic techniques for keeping your first neural network from overfitting. You can try all of them easily in Keras.
