Update files at 2023-12-04 10:43:50
Anthony Polloreno authored and Anthony Polloreno committed Dec 4, 2023
1 parent 7b25955 commit 289bb8b
Showing 2 changed files with 17 additions and 18 deletions.
35 changes: 17 additions & 18 deletions reservoirs3.html
@@ -292,29 +292,28 @@ <h2>Research Engineer</h2>
<h1>The Impact of Noise on Recurrent Neural Networks III</h1>
<div class="paragraph">
<p> We are finally set to analyze the impact of noise on our particular model of recurrent computation: reservoir computing
with echo state networks. In the <a href="reservoirs2.html">previous post</a>, we implemented a GPU simulator together with
an extremely simple noise model: adding Gaussian noise to each element of the output signal from the reservoir.
In principle, a system's dynamics will exhibit more complicated noise that depends on the details of the computation being done,
but using such a simple model lets us explore intuitively why we should expect a substantial degradation in
performance in the first place. Following the discussion in the first post, we additionally consider all products of the
output signals, and we assess the performance of the reservoir on the NARMA10 task.
</p>
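<p>
  As a concrete reference for this setup, the sketch below shows the two ingredients in isolation: the standard
  NARMA10 target recurrence, and a noisy product-feature expansion of the reservoir outputs. This is a minimal
  illustration rather than the simulator from the previous post; the function names, the max_order cutoff, and the
  (T, n) layout of the state array are assumptions made for the example.
</p>
<pre><code class="language-python">
import numpy as np
from itertools import combinations

# Illustrative sketch only; the actual experiments live in the embedded page below.

def narma10(u):
    """Standard NARMA10 target sequence for an input sequence u drawn from [0, 0.5]."""
    y = np.zeros_like(u)
    for t in range(9, len(u) - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * y[t - 9:t + 1].sum()
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return y

def noisy_product_features(states, sigma, rng, max_order=2):
    """Add i.i.d. Gaussian noise to each reservoir output (states has shape (T, n)),
    then form products of up to max_order distinct signals, as in the expansion from the first post."""
    noisy = states + rng.normal(0.0, sigma, size=states.shape)
    n = noisy.shape[1]
    feats = [noisy[:, [i]] for i in range(n)]
    for order in range(2, max_order + 1):
        for idx in combinations(range(n), order):
            feats.append(np.prod(noisy[:, list(idx)], axis=1, keepdims=True))
    return np.hstack(feats)
</code></pre>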
<p>
In particular, we are going to analyze the impact of the Gaussian noise by considering a simple model of how the signals
become corrupted. We naively expect an exponential number of product signals because the number of possible product signals
from a set of signals is given by the cardinality of its power set: a set of n signals has 2<sup>n</sup> subsets, and hence
exponentially many possible combinations. We will group these product signals by the number of terms in each product, and
compute the probability that a given product signal is left uncorrupted by the noise. Doing this, we will be able to reason
about the expected number of useful signals, and establish that with noise the expected number of signals should scale
polynomially, rather than exponentially.
</p>
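<p>
  One way to make this counting argument concrete is the toy model sketched below, in which each of the n base signals
  independently survives the noise with some probability p, and a product is useful only if every one of its factors
  survives. Under those assumptions the expected number of useful products is a binomial sum that collapses to
  (1 + p)<sup>n</sup>, which is a monomial N<sup>α</sup> in the total number N = 2<sup>n</sup> of products, with
  exponent α = log<sub>2</sub>(1 + p) between zero and one. The independence assumption and the value of p are
  illustrative choices, not quantities taken from the embedded analysis below.
</p>
<pre><code class="language-python">
from math import comb, log2

def expected_useful_signals(n, p):
    """Expected number of uncorrupted products when each of the n base signals independently
    survives the noise with probability p and a k-term product survives with probability p**k."""
    return sum(comb(n, k) * p**k for k in range(n + 1))  # equals (1 + p)**n

n, p = 20, 0.4
total = 2**n                                  # size of the power set
useful = expected_useful_signals(n, p)
alpha = log2(1 + p)                           # exponent of the monomial total**alpha
print(useful, (1 + p)**n)                     # the binomial sum matches the closed form
print(alpha, log2(useful) / log2(total))      # useful signals scale as total**alpha
</code></pre>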
<p>
Once we have established that the expected number of signals in this simple model is approximately a monomial,
rather than exponential, we will plot how we expect the exponent of the monomial to vary with noise. To think about
the limits of this plot: when the noise becomes very large, the echo state network will be unable to learn, and hence
its performance will be independent of the number of signals, giving an exponent of zero. When the noise goes to zero,
we can expect the reservoir to make use of all of the exponentially many signals, and the approximation
of a monomial dependence becomes invalid. Let's give the code a look.
</p>
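<p>
  Before turning to the embedded code, here is a small sketch of how the exponent behaves in these two limits under
  the same toy model as above. The tolerance eps, and the error-function mapping from the noise level to the survival
  probability, are illustrative assumptions rather than the fit carried out in the embedded code below.
</p>
<pre><code class="language-python">
import numpy as np
from scipy.special import erf

def exponent_vs_noise(sigma, eps=0.1):
    """Toy-model exponent: a signal 'survives' when its added Gaussian noise stays within a
    tolerance eps, so p = erf(eps / (sigma * sqrt(2))) and the exponent is log2(1 + p).
    Large sigma drives p, and hence the exponent, toward zero; as sigma goes to zero, p
    approaches one and the monomial approximation breaks down."""
    p = erf(eps / (sigma * np.sqrt(2.0)))
    return np.log2(1.0 + p)

for sigma in [1e-3, 1e-2, 1e-1, 1.0, 10.0]:
    print(sigma, exponent_vs_noise(sigma))
</code></pre>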
</div>
<iframe src="Webpost3.html" style="height:7500px; width:100%; border:none;" ></iframe>
Binary file modified resume.pdf
Binary file not shown.
