Chaotic time series predictions with ReservoirComputing.jl

07/29/2021, 5:10 PM to 5:20 PM UTC


Are you interested in how machine learning can be used to predict the behavior of "unpredictable" chaotic systems? This talk will be a deep dive into ReservoirComputing.jl, a package in the SciML ecosystem focused on a class of stabilized machine learning models specialized for learning these difficult dynamical systems.


Chaotic systems are by definition hard to predict or to reproduce with forecasting models. With the advent of Deep Learning (DL) a lot of effort has been dedicated to this problem, with the default approaches being Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks. More recently a new family of models has proved more effective in tackling chaotic systems, namely Reservoir Computing (RC) models. Given the relative infancy of the RC paradigm it is not simple to find an implementation of such models, let alone a full library. With ReservoirComputing.jl we want to propose a Julia package that allows the user to quickly get started with a fast-growing range of RC models, ranging from the standard Echo State Networks (ESNs) to the more exotic Reservoir Computing with Cellular Automata (RECA). In this talk a brief introduction to the concept of RC will be given, and afterwards the capabilities of ReservoirComputing.jl will be illustrated using interactive examples.

Reservoir Computing models work by expanding the input data into a higher-dimensional space, called the reservoir. After this expansion the resulting states are collected and the model is trained against the desired output as a linear regression problem. This approach allows for fast training times and solves several problems of Neural Network training, like the vanishing gradient. Not only are the models in the RC family faster and safer to train, but, as mentioned before, it has been shown that they are also better at the prediction and reproduction of chaotic systems. RC models are mainly composed of three sections: an input-to-reservoir coupler, the reservoir, and a reservoir-to-output coupler. The last section is the result of the training process, and is dependent on the training method that one chooses to utilize. By using different constructions for these elements it is possible to obtain different results in the task at hand. To properly explore RC models, an implementation needs to provide quick access to these layers.
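The recipe above (expand the input through a fixed random reservoir, collect the states, then fit a linear readout by ridge regression) can be sketched in a few lines of plain Julia. This is an illustrative from-scratch sketch, not the ReservoirComputing.jl API; all names here (`build_reservoir`, `collect_states`, `train_readout`) are invented for the example.

```julia
using LinearAlgebra, Random

# Sparse random reservoir matrix, rescaled to a chosen spectral radius.
function build_reservoir(res_size; radius=0.9, sparsity=0.1, rng=MersenneTwister(42))
    W = [rand(rng) < sparsity ? randn(rng) : 0.0 for _ in 1:res_size, _ in 1:res_size]
    return W .* (radius / maximum(abs, eigvals(W)))
end

# Drive the reservoir with the input series and collect the expanded states.
function collect_states(W, W_in, input)           # input: in_dim × T
    res_size, T = size(W, 1), size(input, 2)
    X = zeros(res_size, T)
    x = zeros(res_size)
    for t in 1:T
        x = tanh.(W * x .+ W_in * input[:, t])    # nonlinear expansion step
        X[:, t] = x
    end
    return X
end

# Linear readout via ridge regression: W_out = Y X' (X X' + βI)⁻¹
train_readout(X, Y; beta=1e-6) = (Y * X') / (X * X' + beta * I)

# Usage: one-step-ahead prediction of a sine wave.
rng = MersenneTwister(17)
T = 300
u      = reshape(sin.(0.1 .* (1:T)),   1, :)      # input series
target = reshape(sin.(0.1 .* (2:T+1)), 1, :)      # next-step target

res_size = 100
W    = build_reservoir(res_size; rng=rng)
W_in = 0.1 .* randn(rng, res_size, 1)             # input-to-reservoir coupler

X     = collect_states(W, W_in, u)
W_out = train_readout(X, target)                  # reservoir-to-output coupler
pred  = W_out * X
```

Note that only `W_out` is learned; the reservoir `W` and input coupler `W_in` stay fixed, which is why training reduces to a single linear solve instead of backpropagation through time.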

At a high level, ReservoirComputing.jl gives the user the tools needed for a quick setup of the desired model, allowing an exploration of this family of models for the prediction of a given time series. If one chooses instead to delve more deeply into customization, the implementation follows a modular design that leaves users free to fully customize the system they intend to train and use for predictions. This not only helps with recombinations of layers already implemented in the library, but allows for extensions with the aid of external libraries or custom code. Leveraging the rich package ecosystem of Julia, the user could decide to train the RC system with regression approaches not yet implemented in the library by relying on an external package. Likewise it is possible to use a reservoir matrix construction not present in the library, either through custom construction or again through other packages, like LightGraphs.jl.
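As an example of a custom reservoir construction, one could derive the reservoir matrix from the adjacency structure of a random graph. The sketch below builds an Erdős–Rényi adjacency matrix by hand to stay self-contained; with LightGraphs.jl the same structure could come from its graph generators instead. The function names are invented for illustration, and this is not the package's own constructor.

```julia
using LinearAlgebra, Random

# Hand-rolled Erdős–Rényi adjacency matrix (stand-in for a graph library).
function erdos_renyi_adjacency(n, p; rng=MersenneTwister(1))
    A = zeros(n, n)
    for i in 1:n, j in i+1:n
        if rand(rng) < p
            A[i, j] = A[j, i] = 1.0          # undirected edge i <-> j
        end
    end
    return A
end

# Turn the graph's connectivity into a reservoir: put random signed weights
# on the edges, then rescale to the desired spectral radius.
function graph_reservoir(n, p; radius=0.95, rng=MersenneTwister(2))
    A = erdos_renyi_adjacency(n, p; rng=rng)
    W = A .* randn(rng, n, n)
    return W .* (radius / maximum(abs, eigvals(W)))
end

W = graph_reservoir(100, 0.05)
```

A matrix built this way can be dropped into the same training pipeline as any other reservoir, since the readout training only sees the collected states.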

After the brief introduction to the RC paradigm, the talk will illustrate the concepts defined above with concrete examples, showing both the ease of use of the package and some of the included variations that can be explored. Finally, possible customizations will be demonstrated, both through custom-defined layers and by leveraging other libraries of the Julia ecosystem.
