RNN Backward Approach

An approach that models a Recurrent Neural Network as a State Space Model, where the previous state is used to calculate the new current state.
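The recurrence can be sketched as follows. This is a minimal linear state-space step, s_t = A s_{t-1} + B x_t; the matrices `A`, `B`, the dimensions, and the random inputs are illustrative assumptions, not taken from the original notes.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) * 0.1   # state-transition matrix (assumed shape)
B = rng.normal(size=(4, 2)) * 0.1   # input matrix (assumed shape)

def step(s_prev, x):
    """Compute the new current state from the previous state and the current input."""
    return A @ s_prev + B @ x

s = np.zeros(4)                     # initial state
for x in rng.normal(size=(5, 2)):   # five time steps of 2-d inputs
    s = step(s, x)
```

Each iteration consumes only the previous state and the current observation, which is what makes early predictions possible before a full period of data is available.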

This approach might be more appropriate for monthly data, where predictions are needed early in the month rather than only after it ends.

We can translate this continuous model into a Neural Network structure via universal approximation:

Architecture

  • Allows regular time labeling (note that the index is always the same per layer)
  • Does not allow including as input
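A sketch of the translated architecture: the state transition is replaced by a small one-hidden-layer network, relying on universal approximation. The layer sizes, the `tanh` nonlinearity, and the weight initialization are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(8, 6)) * 0.1  # hidden layer: (state dim 4 + input dim 2) -> 8
W2 = rng.normal(size=(4, 8)) * 0.1  # output layer: hidden -> new state (dim 4)

def cell(s_prev, x):
    """One recurrent layer: approximate the state transition with a small MLP."""
    z = np.concatenate([s_prev, x])  # previous state plus current input
    h = np.tanh(W1 @ z)
    return np.tanh(W2 @ h)

s = np.zeros(4)
for x in rng.normal(size=(3, 2)):   # the same cell (index) is applied per layer/step
    s = cell(s, x)
```

Because the same cell is applied at every step, each layer corresponds to one fixed time index, which matches the regular time labeling noted above.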