How useful is pre-training on ImageNet, actually?  

Andrea Lim
(@andrea)
Active Member Admin
Joined: 3 months ago
Posts: 16
29/11/2018 12:03 pm  

Pre-training on ImageNet has long been a standard step in developing ML models. But how useful is it, actually?

Recent papers from FAIR (http://bit.ly/2DVoVAD) and Google Brain (http://bit.ly/2DS7qAR) explore the utility of transfer learning with other datasets. Do you agree with their findings?

Andrea

