
Research Engineer

The Impact of Noise on Recurrent Neural Networks III

We are finally set to analyze the impact of noise on our particular model of recurrent computation - reservoir computing - with echo state networks. In the previous post, we implemented a GPU simulator together with an extremely simple noise model: we simply add Gaussian noise to each element of the output signal from the reservoir. In principle, the dynamics of a system will have more complicated noise that depends on the details of the computation being done, but such a simple model will let us explore intuitively why we should expect a substantial degradation in performance in the first place. Based on our discussion in the first post, we have additionally considered all products of the output signals, and we are assessing the performance of the reservoir by using it for the NARMA10 task.
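As a concrete reference point, here is a minimal sketch of that setup in NumPy. This is an illustration under stated assumptions, not the GPU simulator from the previous post, and the function and argument names are invented for this example: each output signal is corrupted with i.i.d. Gaussian noise, and products of the noisy signals are collected as additional readout features.

```python
import numpy as np
from itertools import combinations

def noisy_product_features(states, noise_std, max_order, rng=None):
    """Add i.i.d. Gaussian noise to each reservoir output signal, then
    collect products of the noisy signals with up to max_order terms.

    states: (T, N) array of reservoir outputs over T timesteps.
    Returns a (T, num_features) array of noisy signals and their products.
    """
    rng = np.random.default_rng() if rng is None else rng
    noisy = states + rng.normal(0.0, noise_std, size=states.shape)
    columns = [noisy[:, i] for i in range(noisy.shape[1])]
    for order in range(2, max_order + 1):
        for idx in combinations(range(noisy.shape[1]), order):
            columns.append(np.prod(noisy[:, list(idx)], axis=1))
    return np.stack(columns, axis=1)
```

With N output signals and products of up to k terms, the feature matrix has C(N,1) + C(N,2) + ... + C(N,k) columns - exactly the counting that the argument below is about.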

In particular, we are going to analyze the impact of the Gaussian noise by considering a simple model of how the signals become corrupted. Naively, we expect an exponential number of product signals, because the possible products from a set of signals correspond to the elements of its power set: a set of N signals yields 2^N - 1 possible non-empty products. We will group these product signals by the number of terms in each product, and compute the probability that each product signal is uncorrupted by the noise. Doing this, we will be able to reason about the expected number of useful signals, and establish that with noise the expected number of signals should scale polynomially, rather than exponentially.
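To make the counting concrete, here is a toy calculation under an assumed survival model. The cutoff below is a hypothetical stand-in for the per-order survival probabilities computed in the post, chosen only to show the scaling: if products with more than some fixed number of terms are effectively always corrupted, the expected count is dominated by a single binomial coefficient, which grows like a monomial in N rather than like 2^N.

```python
from math import comb

def expected_useful(N, p_survive):
    """Expected number of uncorrupted product signals from N base signals,
    assuming a k-term product survives the noise with probability
    p_survive(k) and there are C(N, k) distinct k-term products."""
    return sum(comb(N, k) * p_survive(k) for k in range(1, N + 1))

# Hypothetical survival model (for illustration only): products of more
# than k_star terms are always lost in the noise.
k_star = 3
cutoff = lambda k: 1.0 if k <= k_star else 0.0

for N in (10, 20, 40, 80):
    print(N, expected_useful(N, cutoff))
# Dominated by C(N, k_star) ~ N**k_star / k_star!: a monomial of degree
# k_star, instead of the 2**N - 1 products available without noise.
```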

Once we have established that the expected number of signals in this simple model is approximately a monomial, rather than an exponential, we will plot how we expect the exponent of the monomial to vary with noise. To think about the limits of this plot: when the noise becomes very large, the echo state network will be unable to learn, and hence its performance will be independent of the number of signals, giving an exponent of zero. When the noise goes to zero, we can expect the reservoir to make use of all of the exponentially many signals, and the approximation of a monomial dependence becomes invalid. Let's give the code a look.
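Before the actual implementation, a small self-contained sketch - again an illustration, not the code from this post - shows how such an exponent can be read off, and what happens at the two limits: a log-log least-squares fit of signal count against N gives the exponent directly, and in the zero-noise limit the fitted slope keeps growing with N, which is the signature of the monomial approximation breaking down.

```python
import numpy as np
from math import comb

def fit_exponent(Ns, counts):
    """Estimate alpha in counts ~ c * N**alpha by least squares in log-log space."""
    alpha, _ = np.polyfit(np.log(Ns), np.log(counts), 1)
    return alpha

Ns = [10, 20, 40, 80]

# Toy cutoff model from above: only products of <= 3 terms survive the noise.
poly_counts = [sum(comb(N, k) for k in range(1, 4)) for N in Ns]
print(fit_exponent(Ns, poly_counts))  # close to 3: a genuine monomial

# Zero-noise limit: all 2**N - 1 products are usable. The fitted "exponent"
# keeps growing as N grows, so no fixed monomial describes the scaling.
exp_counts = [2.0 ** N - 1 for N in Ns]
print(fit_exponent(Ns, exp_counts))
```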
