User talk:Promawer

Your last edit to the LSTM article contains misleading and opinionated information: "The problem with the first generation of recurrent neural network(RNN) is that the function of the neuron is not perfect". In theory, "vanilla" RNNs can keep track of long-term dependencies. The problem with vanilla RNNs is purely computational: during back-propagation through time, gradients tend to "vanish" or "explode", because the gradient is a product of many per-step factors, so it shrinks or grows exponentially with sequence length (an effect compounded by finite-precision arithmetic).
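
As a minimal sketch (my own illustration, not from the article): if we caricature the per-step backward factor as a single scalar `w`, the gradient reaching the earliest time step is roughly `w**T`, which vanishes for |w| < 1 and explodes for |w| > 1.

```python
def backprop_factor(w: float, T: int) -> float:
    """Toy model: product of T identical per-step gradient factors,
    standing in for the chain of Jacobians in back-propagation through time."""
    grad = 1.0
    for _ in range(T):
        grad *= w
    return grad

# With 100 time steps, even mildly non-unit factors are extreme:
vanishing = backprop_factor(0.9, 100)   # about 2.7e-5 (gradient vanishes)
exploding = backprop_factor(1.1, 100)   # about 1.4e4  (gradient explodes)
```

Real RNNs multiply Jacobian matrices rather than scalars, but the same exponential behavior holds, governed by the spectral radius of the recurrent weight matrix; this is exactly the computational issue LSTMs were designed to mitigate.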