
User-defined backpropagation  


New Member
Joined: 1 year ago
Posts: 1
06/04/2019 2:21 am  

I want to implement an end-to-end neural network architecture, but at some layers I want to define my own rules for backpropagation. More specifically, for different mini-batches I want to use only a subset of the parameters at a given layer (say, the top-k largest parameters) and feed only those parameters into the subsequent layers of the network. During backpropagation, only weights that depend on the selected parameters should be updated. Clearly, one could implement everything from scratch, but of course I want to make use of existing deep learning frameworks (TensorFlow, Theano, Keras, etc.). Could you please advise me on which of these libraries would be flexible enough to accommodate such user-defined rules? The ultimate goal is to train models with a huge number of parameters in a scalable way, exploiting the fact that different mini-batches will work with only a subset of the parameters. Any suggestions/comments are welcome!
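To make the rule concrete, here is a rough NumPy sketch of the forward/backward behaviour I have in mind for one dense layer (the function names `forward`/`backward` and the magnitude-based top-k criterion are just my own illustration):

```python
import numpy as np

def forward(x, W, k):
    """Forward pass using only the k largest-magnitude weights of W.

    Returns the layer output and the binary mask of selected weights.
    """
    # k-th largest absolute value over all entries of W
    threshold = np.sort(np.abs(W), axis=None)[-k]
    mask = (np.abs(W) >= threshold).astype(W.dtype)
    return x @ (W * mask), mask

def backward(x, dy, mask):
    """Gradient w.r.t. W for y = x @ (W * mask).

    Only the selected (masked-in) weights receive non-zero gradient,
    so an optimizer step would leave all other weights untouched.
    """
    return (x.T @ dy) * mask
```

So for each mini-batch the mask could be recomputed, and only the surviving entries of `W` would ever be read or updated.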

Mo Rebaie
Estimable Member
Joined: 1 year ago
Posts: 106
13/04/2019 6:49 am  

Hello Koki, in this case I suggest you work with TensorFlow. I'm currently using TensorFlow to build DNNs, and it's extremely powerful for training models with a huge number of parameters. Also be careful about how you choose the mini-batch size.
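For the custom backpropagation rule you describe, TensorFlow's `tf.custom_gradient` lets you override the backward pass of an op. A minimal sketch (assuming TensorFlow 2.x; `make_top_k_select` is a made-up helper name, and magnitude-based top-k selection is my assumption about your criterion):

```python
import tensorflow as tf

def make_top_k_select(k):
    @tf.custom_gradient
    def top_k_select(w):
        # Forward: keep only the k largest-magnitude entries of w, zero the rest.
        flat = tf.reshape(w, [-1])
        values, _ = tf.math.top_k(tf.abs(flat), k=k)
        threshold = values[-1]
        mask = tf.cast(tf.abs(w) >= threshold, w.dtype)

        def grad(dy):
            # Backward: gradients flow only through the selected entries,
            # so only those weights are updated by the optimizer.
            return dy * mask

        return w * mask, grad
    return top_k_select
```

You could then wrap this around a layer's weight matrix inside a custom `tf.keras.layers.Layer.call`, e.g. `tf.matmul(x, make_top_k_select(k)(self.w))`, and the mask is recomputed for every mini-batch.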

The Intro to TensorFlow for AI, ML & DL course at Coursera helps a lot; the 2nd course in the specialization should launch on 18 April. I recommend you take the series.




