Which of the following is true?
- The first of the three RNN architectures we considered last week has an output at each time step and recurrent connections between hidden units.
- Backpropagation through time is the same as teacher-forcing learning.
- Bidirectional RNNs are Turing Machine based and make use of an NN-controlled tape head.
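The architecture described in the first option (an output at every time step plus hidden-to-hidden recurrence) can be illustrated with a minimal forward pass. This is an illustrative sketch, not from the original material; all weight names and sizes are arbitrary assumptions.

```python
import numpy as np

# Minimal sketch (illustrative, not from the quiz): a vanilla RNN with an
# output at every time step and recurrent connections between hidden units:
#   h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h),   o_t = W_ho h_t + b_o
rng = np.random.default_rng(0)
n_in, n_hid, n_out, T = 3, 4, 2, 5   # arbitrary example dimensions

W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden recurrence
W_ho = rng.normal(scale=0.1, size=(n_out, n_hid))
b_h = np.zeros(n_hid)
b_o = np.zeros(n_out)

x = rng.normal(size=(T, n_in))  # input sequence of length T
h = np.zeros(n_hid)             # initial hidden state

outputs = []
for t in range(T):
    h = np.tanh(W_xh @ x[t] + W_hh @ h + b_h)  # recurrent hidden update
    outputs.append(W_ho @ h + b_o)             # an output at each time step

outputs = np.stack(outputs)  # shape (T, n_out): one output per step
```

Because the hidden state at step t depends on the state at step t-1, gradients for this architecture are computed by unrolling the loop over all T steps, which is what backpropagation through time does.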