I want to implement an end-to-end neural network architecture, but at some layers I want to define my own rules for backpropagation. More specifically, for different mini-batches I want to use only a subset of the parameters at a given layer (say, the top-k largest parameters) and feed only these parameters into the subsequent layers of the network. During backpropagation, only weights that depend on the selected parameters should be updated. Clearly, one could implement everything from scratch, but I would rather build on an existing deep learning framework (TensorFlow, Theano, Keras, etc.). Could you please advise me which of these libraries would be flexible enough to accommodate such user-defined rules? The ultimate goal is to train models with a huge number of parameters in a scalable way, exploiting the fact that different mini-batches work only with a subset of the parameters. Any suggestions/comments are welcome!
Hello Koki, in this case I suggest working with TensorFlow. I am currently using TensorFlow to build DNNs, and it is extremely powerful for training models with a huge number of parameters. Also, be careful how you choose the mini-batch size.
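One common way to get the behaviour you describe in TensorFlow is to select the top-k weights (e.g. with `tf.math.top_k`) and multiply the weight matrix by a 0/1 mask: because the mask enters the forward computation multiplicatively, autodiff gives zero gradient to the masked-out weights, so only the selected ones are updated. Here is a minimal framework-free NumPy sketch of that principle; the toy linear layer, the `topk_mask` helper, and the choice of k are all illustrative, not part of any library API:

```python
import numpy as np

def topk_mask(w, k):
    # 0/1 mask keeping the k largest-magnitude entries of w (illustrative helper)
    thresh = np.partition(np.abs(w).ravel(), -k)[-k]  # k-th largest magnitude
    return (np.abs(w) >= thresh).astype(w.dtype)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))        # toy layer weights
x = rng.normal(size=(2, 4))        # toy mini-batch
mask = topk_mask(w, k=5)

y = x @ (w * mask)                 # forward pass uses only the top-k weights
grad_y = np.ones_like(y)           # stand-in for the upstream gradient
grad_w = (x.T @ grad_y) * mask     # masked weights receive zero gradient

assert np.all(grad_w[mask == 0] == 0)  # only selected weights would be updated
```

If you recompute the mask per mini-batch, each batch trains a different subset of the parameters, which matches your scalability goal. For rules that a multiplicative mask cannot express, TensorFlow also lets you override the backward pass entirely with `@tf.custom_gradient`.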
The Intro to TensorFlow for AI, ML & DL course by deeplearning.ai on Coursera helps a lot; the 2nd course in the specialization should launch on 18 April. I recommend taking the series.