Theory of Condensed Matter: Hard Condensed Matter

June 4, 2019 at 1:30 p.m. in MAINZ Seminarraum (Staudingerweg 9, 03-122)

Prof. Dr. Jairo Sinova
Institut für Physik, SPICE
sinova@uni-mainz.de

An introduction to Reservoir Computing
Dr. Herbert Jaeger (Jacobs University, Bremen, Germany)


Recurrent neural networks (RNNs) are general approximators for nonlinear dynamical systems and have recently become widely used in the "deep learning" field of machine learning, especially for speech and language processing tasks. For instance, Google's speech recognition and language translation services are based on RNNs.
However, the deep learning set-ups for RNN training are computationally expensive, require very large volumes of training data, and need high-precision numerical processing. For these reasons, deep-learning variants of RNNs are problematic in fields where training data are scarce, where fast and cheap algorithms are desired, or where noisy or low-precision hardware is to be used. This is often the case in nonlinear signal processing, control, brain-machine interfacing, biomedical signal processing, and unconventional (non-digital) computing hardware.
Reservoir Computing (RC) is an alternative machine learning approach for RNNs that is in many ways complementary to deep learning. In RC, a large, random, possibly low-precision and noisy RNN is used as a nonlinear excitable medium, called the "reservoir", which is driven by an input signal. The reservoir itself is not adapted or trained.
Instead, only a "readout" mechanism is trained, which assembles the desired output signal from the large variety of random, excited signals within the reservoir. This readout training is cheap, typically just a linear regression. RC has become a popular approach in research that aims at useful computations on the basis of unconventional hardware (non-digital, noisy, low-precision).
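As a rough illustration of this scheme, the following is a minimal echo-state-network sketch in Python with NumPy. The reservoir size, weight scalings, and the toy sine-prediction task are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative dimensions (assumptions, not from the talk).
n_reservoir = 300   # number of reservoir neurons
n_inputs = 1
n_outputs = 1

# Fixed random reservoir: input and recurrent weights are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task (assumption): reproduce a phase-shifted copy of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]          # input signal
y = np.sin(t + 0.3)[:, None]    # desired output signal

X = run_reservoir(u)
washout = 100                   # discard the initial transient states

# Readout training: ridge regression from reservoir states to targets.
ridge = 1e-6
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_reservoir),
                        A.T @ y[washout:])

y_pred = X @ W_out
print("training MSE:", np.mean((y_pred[washout:] - y[washout:]) ** 2))
```

Keeping the spectral radius of the recurrent weights below one is a common heuristic for the "echo state property", which makes the reservoir's response depend on the recent input history rather than on its initial state.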
The talk gives an introduction to the basic principles and variants of RC. Illustrative examples will be selected according to the wishes of the audience.

All interested are cordially welcome!