RNN: Data assimilation

Unlike convolutional neural networks, recurrent neural networks (RNNs) have loops: the output of one time step is fed back as input to the next. RNNs can therefore be combined with data assimilation for time-series prediction, such as climate projections.
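The feedback loop above can be sketched as a minimal vanilla RNN step in NumPy. All names and shapes here are illustrative assumptions, not taken from the article:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # The new hidden state combines the current input with the previous
    # hidden state -- this recurrence is the "loop" that feed-forward
    # (e.g. convolutional) networks lack.
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

# Toy dimensions and random weights (assumed for the sketch).
rng = np.random.default_rng(0)
n_in, n_hidden, n_steps = 3, 4, 5
W_x = rng.normal(size=(n_in, n_hidden))
W_h = rng.normal(size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

h = np.zeros(n_hidden)
for t in range(n_steps):  # unroll the loop over a short time series
    x_t = rng.normal(size=n_in)
    h = rnn_step(x_t, h, W_x, W_h, b)
```

In a data-assimilation setting, the hidden state h would be corrected against observations at each step; here the loop only propagates it forward.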

Long short-term memory networks (LSTMs), a special case of RNNs, can learn long-term dependencies.

The simplest LSTM architecture consists of four interacting layers.

The first layer, the forget gate layer, determines what information to throw away from the cell state by using a sigmoid layer that outputs a number between 0 and 1.

The second layer determines what new information to store by using a sigmoid layer (which values to update) and a tanh layer (the candidate values).

The third layer updates the old cell state by combining the outputs of the forget and input layers.

The fourth layer determines what to output by using a sigmoid layer, which selects parts of the cell state, and a tanh layer, which squashes the state to between -1 and 1.
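The four layers above can be sketched as a single LSTM cell step in NumPy. This is a minimal illustration, assuming the parameters of all four layers are stacked into W, U, and b; none of these names come from the article:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # W, U, b stack the parameters of the four layers along the first
    # axis, in the order [forget, input, candidate, output] (assumed).
    z = W @ x_t + U @ h_prev + b
    f, i, g, o = np.split(z, 4)
    f = sigmoid(f)           # 1) forget gate: what to throw away (0..1)
    i = sigmoid(i)           # 2a) input gate: which values to update
    g = np.tanh(g)           # 2b) candidate values for the new state
    c = f * c_prev + i * g   # 3) update the old cell state
    o = sigmoid(o)           # 4) output gate
    h = o * np.tanh(c)       #    output a filtered view of the state
    return h, c

# Toy dimensions and random weights (assumed for the sketch).
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
W = rng.normal(size=(4 * n_h, n_in))
U = rng.normal(size=(4 * n_h, n_h))
b = np.zeros(4 * n_h)
h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h), W, U, b)
```

The separate cell state c is what lets the LSTM carry information over many steps without it being overwritten at every update.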

References:

https://colah.github.io/posts/2015-08-Understanding-LSTMs/

Arcucci R, Zhu J, Hu S, Guo Y (2021): Deep data assimilation: Integrating deep learning with data assimilation. Applied Sciences 11(3), 1114. https://www.mdpi.com/2076-3417/11/3/1114