Update files at 2023-11-30 23:14:31
Anthony Polloreno authored and committed on Dec 1, 2023
1 parent 7b0bbce commit 7b15d53
Showing 2 changed files with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion Webpost2.html
@@ -7493,7 +7493,7 @@ <h1 id="Part-2:-Simulations">Part 2: Simulations<a class="anchor-link" href="#Pa
</div>
<div class="jp-InputArea jp-Cell-inputArea"><div class="jp-InputPrompt jp-InputArea-prompt">
</div><div class="jp-RenderedHTMLCommon jp-RenderedMarkdown jp-MarkdownOutput" data-mime-type="text/markdown">
- <p>From Chebyshev [1] we know that for a random variable with variance $\sigma^2$, $\Pr(|X-\mu|\geq k)\leq {\frac {\sigma^2}{k^{2}}}$. For an ensemble average over $N$ reservoirs, if the ensemble has variance $\sigma_0^2$ then the random variable $X_N = \frac{1}{N} X_i$ has variance $\sigma^2 = \sigma_0^2/N$, so that $\Pr(|X_N-\mu|\geq k\sigma_0 ) \leq {\frac {1}{Nk^{2}}}$. Whereas before we might expect $\Pr(|X-\mu|\geq \sigma_0) \approx 32\%$ if $X$ were Gaussian distributed, we now have, for $k \sim 1$ and $N\sim 500$, a bound of $0.2\%$. That's much tighter! This depends on the actual shape of the distribution and its actual moments, but we expect such an improvement to be sufficient... and we will empirically observe that it is!</p>
+ <p>From Chebyshev [1] we know that for a random variable with variance $\sigma^2$, $\Pr(|X-\mu|\geq k)\leq {\frac {\sigma^2}{k^{2}}}$. For an ensemble average over $N$ reservoirs, if the ensemble has variance $\sigma_0^2$ then the random variable $X_N = \frac{1}{N} \sum_{i=1}^N X_i$ has variance $\sigma^2 = \sigma_0^2/N$, so that $\Pr(|X_N-\mu|\geq k\sigma_0 ) \leq {\frac {1}{Nk^{2}}}$. Whereas before we might expect $\Pr(|X-\mu|\geq \sigma_0) \approx 32\%$ if $X$ were Gaussian distributed, we now have, for $k \sim 1$ and $N\sim 500$, a bound of $0.2\%$. That's much tighter! This depends on the actual shape of the distribution and its actual moments, but we expect such an improvement to be sufficient... and we will empirically observe that it is!</p>
<p>[1] <a href="https://en.wikipedia.org/wiki/Chebyshev%27s_inequality">https://en.wikipedia.org/wiki/Chebyshev%27s_inequality</a></p>
</div>
</div>
Binary file modified resume.pdf
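The Chebyshev bound discussed in the changed paragraph is easy to check numerically. The following is a minimal sketch, assuming standard-normal reservoir outputs ($\mu = 0$, $\sigma_0 = 1$); these distributional choices are illustrative and not taken from the post itself:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500          # ensemble size (number of reservoirs)
k = 1.0          # deviation threshold, in units of sigma_0
trials = 100_000 # number of ensemble averages to simulate

# Draw `trials` independent ensembles of N standard-normal variables
# (mu = 0, sigma_0 = 1), and average each ensemble.
samples = rng.standard_normal((trials, N))
X_N = samples.mean(axis=1)  # each X_N has variance sigma_0^2 / N

# Empirical tail probability Pr(|X_N - mu| >= k * sigma_0)
empirical = np.mean(np.abs(X_N) >= k)

# Chebyshev bound 1 / (N k^2) for the averaged variable
bound = 1.0 / (N * k**2)

print(f"empirical tail: {empirical:.6f}")
print(f"Chebyshev bound: {bound:.6f}")  # 0.002, i.e. the 0.2% from the text
```

For Gaussian reservoirs the empirical tail is essentially zero (a deviation of $k\sigma_0 = 1$ is about 22 standard deviations of $X_N$), far below the 0.2% bound; Chebyshev is conservative precisely because it holds for any distribution with finite variance, which is the paragraph's point.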

