How useful is pre-training on ImageNet, actually?  


Andrea Lim
Active Member Admin
Joined: 2 years ago
Posts: 18
29/11/2018 12:03 pm  

Pre-training on ImageNet has been a common step for developing ML models. But how useful is it, actually?

Recent papers from FAIR and Google Brain explore the utility of transfer learning with other datasets. Do you agree with their findings?


Mo Rebaie
Estimable Member
Joined: 2 years ago
Posts: 108
06/04/2019 2:11 am  

Thanks to Sir Andrew for illustrating the concept of transfer learning; I understood this topic while taking the Deep Learning specialization.

Transfer learning saves a lot of time, especially when we start from an available pre-trained model. Models pre-trained on ImageNet have well-structured convolutional, pooling, and fully connected layers, and reusing those learned weights is tremendously helpful when training deep neural networks on different tasks.




