Other attributes
The long short-term memory network (LSTM) is a variant of the recurrent neural network. It was proposed in 1997 by the German researchers Sepp Hochreiter and Jürgen Schmidhuber as a solution to the vanishing gradient problem.
LSTMs hold information outside the normal flow of the recurrent network in memory blocks, or cells. Information can be stored in, written to, or read from a cell, much like data in a computer's memory. The memory blocks are responsible for remembering, and the manipulation of this memory is regulated by structures called gates. The gating mechanism contains three non-linear gates: an input gate, an output gate, and a forget gate.
The gates are implemented with element-wise multiplication by the outputs of sigmoid layers, which lie between zero and one. The sigmoid has the advantage of being differentiable, which makes it well suited to backpropagation.
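To make the gating mechanism concrete, the following is a minimal sketch of a single LSTM cell step in NumPy. The weight and bias names (W_f, b_f, and so on) and the lstm_step function are hypothetical placeholders for illustration, not part of any particular library.

```python
import numpy as np

def sigmoid(z):
    # Squashes values into (0, 1), so gates act as soft on/off switches
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One time step of an LSTM cell.

    x      : input vector at the current time step
    h_prev : hidden state from the previous step
    c_prev : cell state (the "memory") from the previous step
    params : dict of weight matrices and bias vectors (hypothetical names)
    """
    z = np.concatenate([h_prev, x])  # combine previous hidden state and input

    f = sigmoid(params["W_f"] @ z + params["b_f"])  # forget gate: what to erase
    i = sigmoid(params["W_i"] @ z + params["b_i"])  # input gate: what to write
    o = sigmoid(params["W_o"] @ z + params["b_o"])  # output gate: what to read

    c_tilde = np.tanh(params["W_c"] @ z + params["b_c"])  # candidate memory content

    # Element-wise multiplication by gate outputs in (0, 1) regulates the cell
    c = f * c_prev + i * c_tilde  # update the cell state
    h = o * np.tanh(c)            # expose a gated read of the memory

    return h, c
```

A short usage example with random parameters (hidden size 4, input size 3):

```python
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
params = {f"W_{g}": rng.standard_normal((n_h, n_h + n_x)) for g in "fioc"}
params.update({f"b_{g}": np.zeros(n_h) for g in "fioc"})
h, c = lstm_step(rng.standard_normal(n_x), np.zeros(n_h), np.zeros(n_h), params)
```

Because every operation here is differentiable, gradients can flow through the gates during backpropagation, which is what the sigmoid's differentiability buys.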
LSTMs are used in text generation, handwriting recognition, handwriting generation, music generation, language translation and image captioning.