
How useful is pre-training on ImageNet, actually?  


Andrea Lim
(@andrea)
Active Member Admin
Joined: 10 months ago
Posts: 17
29/11/2018 12:03 pm  

Pre-training on ImageNet has been a common step in developing ML models. But how useful is it, actually?

Recent papers from FAIR (http://bit.ly/2DVoVAD) and Google Brain (http://bit.ly/2DS7qAR) explore the utility of transfer learning with other datasets. Do you agree with their findings?

Andrea


Mo Rebaie
(@mo-rebaie)
Eminent Member
Joined: 4 months ago
Posts: 46
06/04/2019 2:11 am  

Thanks to deeplearning.ai and Andrew Ng for illustrating the concept of transfer learning; I came to understand this topic while taking the Deep Learning Specialization.

Transfer learning saves a lot of time, especially when we start from an available pre-trained model. Networks pre-trained on ImageNet have well-structured stacks of convolutional, pooling, and fully connected layers, and the features they have already learned are tremendously helpful when training deep neural networks on different tasks.

M.R


