Which of the following is true?
- The backpropagation update rule always updates the `i`th weight of the `j`th neuron of the `k`th layer
by the same amount as the `i`th weight of the `j`th neuron of the `(k+1)`st layer.
- Symbol-to-number differentiation is how Theano and TensorFlow evaluate partial derivatives.
- `L_2` parameter regularization is a kind of parameter norm penalty.
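
To see why the last option holds, here is a minimal sketch of the `L_2` penalty as a parameter norm penalty added to the loss. The names `l2_penalty` and `alpha` (the regularization strength) are illustrative, not from any particular library:

```python
import numpy as np

def l2_penalty(w, alpha=0.01):
    # The L_2 parameter norm penalty: Omega(w) = (alpha / 2) * ||w||_2^2,
    # added to the task loss to discourage large weights.
    return 0.5 * alpha * np.dot(w, w)

def l2_penalty_grad(w, alpha=0.01):
    # Its gradient is alpha * w, so each gradient step shrinks the
    # weights toward zero -- hence the alternative name "weight decay".
    return alpha * w

# Illustrative weight vector standing in for one layer's parameters.
w = np.array([3.0, 4.0])
print(l2_penalty(w))       # 0.5 * 0.01 * 25 = 0.125
print(l2_penalty_grad(w))  # [0.03 0.04]
```

Because the penalty depends only on the norm of the parameters, it is one instance of the general parameter norm penalty family, alongside e.g. the `L_1` penalty.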