Recurrent neural networks
Unlike multi-layer perceptrons, recurrent networks can use their internal memory to process sequences of arbitrary length. RNNs are therefore well suited to tasks where the input naturally comes in segments, such as handwriting recognition or speech recognition. Many architectural variants of recurrent networks, from simple to complex, have been proposed; the most common today are long short-term memory (LSTM) networks and gated recurrent units (GRU).

In the diagram above, the neural network A receives some input X and outputs some value h. The cyclic connection in an RNN carries information from the current step of the network to the next. There are many varieties, designs, and building blocks of recurrent neural networks. A difficulty with recurrent networks is that when the computation is unrolled in time, every time step effectively needs its own layer of neurons, which incurs a serious computational cost.
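The cyclic connection described above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation: it uses a single scalar hidden unit, and the weights `w_x`, `w_h`, and `b` are made-up values chosen only to show the mechanics. The key point is that the same step function and the same weights are reused at every position, so one network handles sequences of any length.

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One RNN step: the new hidden state depends on the current
    input x_t AND the previous hidden state h_prev (the 'memory')."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

def rnn_run(xs, w_x=0.5, w_h=0.8, b=0.0):
    """Process a sequence of arbitrary length by applying the same
    step function, with the same weights, at every time step."""
    h = 0.0  # initial hidden state
    states = []
    for x_t in xs:
        h = rnn_step(x_t, h, w_x, w_h, b)  # the cyclic connection
        states.append(h)
    return states
```

Calling `rnn_run` on a 3-element sequence or a 30-element one uses exactly the same weights; unrolling it in time, as the text notes, conceptually creates one copy of this step per time step.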