Merge pull request #462 from reneeotten/documentation_docstrings

updates to documentation and docstring

newville authored Mar 20, 2018
2 parents cd02480 + f02bc9e commit b6f5789
Showing 8 changed files with 208 additions and 196 deletions.
Binary file modified doc/_images/emcee_corner.png
227 changes: 113 additions & 114 deletions doc/builtin_models.rst

Large diffs are not rendered by default.

26 changes: 13 additions & 13 deletions doc/confidence.rst
@@ -62,10 +62,10 @@ starting point::
>>> result = mini.minimize()
>>> print(lmfit.fit_report(result.params))
[[Variables]]
-    a:   0.09943895 +/- 0.000193 (0.19%) (init= 0.1)
-    b:   1.98476945 +/- 0.012226 (0.62%) (init= 1)
-    [[Correlations]] (unreported correlations are < 0.100)
-    C(a, b) = 0.601
+    a:  0.09943896 +/- 1.9322e-04 (0.19%) (init = 0.1)
+    b:  1.98476945 +/- 0.01222678 (0.62%) (init = 1)
+    [[Correlations]] (unreported correlations are < 0.100)
+    C(a, b) = 0.601

Now it is just a simple function call to calculate the confidence
intervals::
@@ -102,15 +102,15 @@ uncertainties and correlations.
which will report::

[[Variables]]
-    a1:   2.98622120 +/- 0.148671 (4.98%) (init= 2.986237)
-    a2:  -4.33526327 +/- 0.115275 (2.66%) (init=-4.335256)
-    t1:   1.30994233 +/- 0.131211 (10.02%) (init= 1.309932)
-    t2:   11.8240350 +/- 0.463164 (3.92%) (init= 11.82408)
-    [[Correlations]] (unreported correlations are < 0.500)
-    C(a2, t2) = 0.987
-    C(a2, t1) = -0.925
-    C(t1, t2) = -0.881
-    C(a1, t1) = -0.599
+    a1:  2.98622120 +/- 0.14867187 (4.98%) (init = 2.986237)
+    a2: -4.33526327 +/- 0.11527506 (2.66%) (init = -4.335256)
+    t1:  1.30994233 +/- 0.13121177 (10.02%) (init = 1.309932)
+    t2:  11.8240351 +/- 0.46316470 (3.92%) (init = 11.82408)
+    [[Correlations]] (unreported correlations are < 0.500)
+    C(a2, t2) =  0.987
+    C(a2, t1) = -0.925
+    C(t1, t2) = -0.881
+    C(a1, t1) = -0.599
95.45% 68.27% _BEST_ 68.27% 95.45%
a1: -0.27286 -0.14165 2.98622 +0.16353 +0.36343
a2: -0.30444 -0.13219 -4.33526 +0.10688 +0.19683
46 changes: 23 additions & 23 deletions doc/fitting.rst
@@ -404,10 +404,10 @@ Solving with :func:`minimize` gives the Maximum Likelihood solution::
>>> mi = lmfit.minimize(residual, p, method='Nelder', nan_policy='omit')
>>> lmfit.printfuncs.report_fit(mi.params, min_correl=0.5)
[[Variables]]
-    a1:  2.98623688 (init= 4)
-    a2: -4.33525596 (init= 4)
-    t1:  1.30993185 (init= 3)
-    t2:  11.8240752 (init= 3)
+    a1:  2.98623689 (init = 4)
+    a2: -4.33525597 (init = 4)
+    t1:  1.30993186 (init = 3)
+    t2:  11.8240752 (init = 3)

>>> plt.plot(x, y)
>>> plt.plot(x, residual(mi.params) + y, 'r')
@@ -461,18 +461,19 @@ You can see that we recovered the right uncertainty level on the data::
median of posterior probability distribution
--------------------------------------------
[[Variables]]
-    a1:     3.00395737 +/- 0.148140 (4.93%) (init= 2.986237)
-    a2:    -4.34880797 +/- 0.129770 (2.98%) (init=-4.335256)
-    t1:     1.32070726 +/- 0.145682 (11.03%) (init= 1.309932)
-    t2:     11.7701458 +/- 0.505031 (4.29%) (init= 11.82408)
-    noise:  0.09774012 +/- 0.004329 (4.43%) (init= 1)
-    [[Correlations]] (unreported correlations are < 0.100)
-    C(a2, t2) =  0.982
-    C(a2, t1) = -0.935
-    C(t1, t2) = -0.892
-    C(a1, t1) = -0.507
-    C(a1, a2) =  0.203
-    C(a1, t2) =  0.163
+    a1:     2.99342394 +/- 0.15851315 (5.30%) (init = 2.986237)
+    a2:    -4.34384999 +/- 0.12454831 (2.87%) (init = -4.335256)
+    t1:     1.32338403 +/- 0.14120290 (10.67%) (init = 1.309932)
+    t2:     11.7962437 +/- 0.48632272 (4.12%) (init = 11.82408)
+    noise:  0.09761521 +/- 0.00431795 (4.42%) (init = 1)
+    [[Correlations]] (unreported correlations are < 0.100)
+    C(a2, t1) = -0.965
+    C(a2, t2) =  0.959
+    C(t1, t2) = -0.927
+    C(a1, a2) = -0.241
+    C(a1, t2) = -0.168
+    C(a2, noise) = -0.116
+    C(t1, noise) =  0.107

>>> # find the maximum likelihood solution
>>> highest_prob = np.argmax(res.lnprob)
@@ -486,17 +487,16 @@
>>> print(p)
Maximum likelihood Estimation
-----------------------------
-    Parameters([('a1', <Parameter 'a1', 2.9838386218794306, bounds=[-inf:inf]>),
-                ('a2', <Parameter 'a2', -4.3360301800977243, bounds=[-inf:inf]>),
-                ('t1', <Parameter 't1', 1.3099319599456074, bounds=[-inf:inf]>),
-                ('t2', <Parameter 't2', 11.813711030433806, bounds=[-inf:inf]>)])
+    Parameters([('a1', <Parameter 'a1', 2.9684811738216754, bounds=[-inf:inf]>),
+                ('a2', <Parameter 'a2', -4.355238699173162, bounds=[-inf:inf]>),
+                ('t1', <Parameter 't1', 1.3337647386777762, bounds=[-inf:inf]>),
+                ('t2', <Parameter 't2', 11.758394302818514, bounds=[-inf:inf]>)])
>>> # Finally lets work out a 1 and 2-sigma error estimate for 't1'
>>> quantiles = np.percentile(res.flatchain['t1'], [2.28, 15.9, 50, 84.2, 97.7])
>>> print("1 sigma spread", 0.5 * (quantiles[3] - quantiles[1]))
>>> print("2 sigma spread", 0.5 * (quantiles[4] - quantiles[0]))
-    1 sigma spread 0.145719626384
-    2 sigma spread 0.292199907106
+    1 sigma spread 0.1414604069179637
+    2 sigma spread 0.453234685099423
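The percentile arithmetic in that snippet can be checked on a synthetic chain. This is a standalone sketch (plain NumPy, no lmfit; the chain here is a hypothetical normally distributed stand-in for `res.flatchain['t1']`), where the recovered spreads should match the known sigma — note that for a truly Gaussian posterior the 2-sigma spread is about twice the 1-sigma spread:

```python
import numpy as np

# Synthetic "flatchain" for one parameter: 50000 posterior samples drawn
# from a normal distribution with known sigma = 0.14 (hypothetical values).
rng = np.random.default_rng(42)
chain = rng.normal(loc=1.32, scale=0.14, size=50_000)

# Same percentile levels as the snippet above: 2-sigma low, 1-sigma low,
# median, 1-sigma high, 2-sigma high.
quantiles = np.percentile(chain, [2.28, 15.9, 50, 84.2, 97.7])

one_sigma = 0.5 * (quantiles[3] - quantiles[1])  # half the central 68% interval
two_sigma = 0.5 * (quantiles[4] - quantiles[0])  # half the central 95% interval

print("1 sigma spread", one_sigma)  # ~0.14 for this Gaussian chain
print("2 sigma spread", two_sigma)  # ~0.28, i.e. about 2x the 1-sigma spread
```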

Getting and Printing Fit Reports
===========================================
8 changes: 5 additions & 3 deletions doc/installation.rst
@@ -13,16 +13,18 @@ Downloading and Installation
.. _jupyter: https://jupyter.org/
.. _matplotlib: https://matplotlib.org/
.. _dill: https://github.com/uqfoundation/dill
+ .. _asteval: https://github.com/newville/asteval
+ .. _six: https://github.com/benjaminp/six

Prerequisites
~~~~~~~~~~~~~~~

- The lmfit package requires `Python`_, `NumPy`_, and `SciPy`_.
+ The lmfit package requires `Python`_, `NumPy`_, `SciPy`_, `asteval`_, and `six`_.

Lmfit works with Python versions 2.7, 3.4, 3.5, and 3.6. Support for Python 2.6
and 3.3 ended with lmfit versions 0.9.4 and 0.9.8, respectively. Scipy version
- 0.17 or higher, NumPy version 1.10 or higher, and six version 1.10 or higher are
- required.
+ 0.17 or higher, NumPy version 1.10 or higher, asteval version 0.9.12 or higher,
+ and six version 1.10 or higher are required.

In order to run the test suite, either the `nose`_ or `pytest`_ package is
required. Some functionality of lmfit requires the `emcee`_ package, some
84 changes: 42 additions & 42 deletions doc/model.rst
@@ -152,20 +152,20 @@ components, including a :meth:`fit_report` method, which will show::
[[Model]]
Model(gaussian)
[[Fit Statistics]]
# fitting method = leastsq
# function evals = 35
# data points = 101
# variables = 3
-    chi-square         = 3.40884
-    reduced chi-square = 0.03478
-    Akaike info crit   = -336.26371
-    Bayesian info crit = -328.41835
+    chi-square         = 3.40883599
+    reduced chi-square = 0.03478404
+    Akaike info crit   = -336.263713
+    Bayesian info crit = -328.418352
[[Variables]]
-    amp:   8.88021829 +/- 0.113594 (1.28%) (init= 5)
-    cen:   5.65866102 +/- 0.010304 (0.18%) (init= 5)
-    wid:   0.69765468 +/- 0.010304 (1.48%) (init= 1)
-    [[Correlations]] (unreported correlations are < 0.100)
-    C(amp, wid) = 0.577
+    amp:  8.88021830 +/- 0.11359492 (1.28%) (init = 5)
+    cen:  5.65866102 +/- 0.01030495 (0.18%) (init = 5)
+    wid:  0.69765468 +/- 0.01030495 (1.48%) (init = 1)
+    [[Correlations]] (unreported correlations are < 0.100)
+    C(amp, wid) = 0.577
As the script shows, the result will also have :attr:`init_fit` for the fit
with the initial parameter values and a :attr:`best_fit` for the fit with
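The statistics in the report above follow directly from the residuals. A quick arithmetic check, using the definitions that lmfit's reports appear to follow (reduced chi-square as chi-square over degrees of freedom, and AIC/BIC from the Gaussian log-likelihood; these formulas are inferred to match the printed numbers, not quoted from the source):

```python
import math

# Values taken from the fit report above: 101 data points, 3 variables.
ndata, nvarys = 101, 3
chisqr = 3.40883599  # sum of squared residuals

# Reduced chi-square: chi-square per degree of freedom.
redchi = chisqr / (ndata - nvarys)

# Information criteria from the Gaussian log-likelihood N*ln(chi2/N):
aic = ndata * math.log(chisqr / ndata) + 2 * nvarys
bic = ndata * math.log(chisqr / ndata) + math.log(ndata) * nvarys

print(f"reduced chi-square = {redchi:.8f}")  # 0.03478404, as reported
print(f"Akaike info crit   = {aic:.4f}")     # ~ -336.2637
print(f"Bayesian info crit = {bic:.4f}")     # ~ -328.4184
```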
@@ -871,28 +871,29 @@ which prints out the results::
[[Model]]
(Model(gaussian) + Model(line))
[[Fit Statistics]]
# fitting method = leastsq
# function evals = 46
# data points = 101
# variables = 5
-    chi-square         = 2.57856
-    reduced chi-square = 0.02686
-    Akaike info crit   = -360.45702
-    Bayesian info crit = -347.38142
+    chi-square         = 2.57855517
+    reduced chi-square = 0.02685995
+    Akaike info crit   = -360.457020
+    Bayesian info crit = -347.381417
[[Variables]]
-    amp:         8.45931061 +/- 0.124145 (1.47%) (init= 5)
-    cen:         5.65547872 +/- 0.009176 (0.16%) (init= 5)
-    wid:         0.67545523 +/- 0.009916 (1.47%) (init= 1)
-    slope:       0.26484403 +/- 0.005748 (2.17%) (init= 0)
-    intercept:  -0.96860201 +/- 0.033522 (3.46%) (init= 1)
-    [[Correlations]] (unreported correlations are < 0.100)
-    C(slope, intercept) = -0.795
-    C(amp, wid)         =  0.666
-    C(amp, intercept)   = -0.222
-    C(amp, slope)       = -0.169
-    C(cen, slope)       = -0.162
-    C(wid, intercept)   = -0.148
-    C(cen, intercept)   =  0.129
-    C(wid, slope)       = -0.113
+    amp:         8.45931062 +/- 0.12414515 (1.47%) (init = 5)
+    cen:         5.65547873 +/- 0.00917678 (0.16%) (init = 5)
+    wid:         0.67545524 +/- 0.00991686 (1.47%) (init = 1)
+    slope:       0.26484404 +/- 0.00574892 (2.17%) (init = 0)
+    intercept:  -0.96860202 +/- 0.03352202 (3.46%) (init = 1)
+    [[Correlations]] (unreported correlations are < 0.100)
+    C(slope, intercept) = -0.795
+    C(amp, wid)         =  0.666
+    C(amp, intercept)   = -0.222
+    C(amp, slope)       = -0.169
+    C(cen, slope)       = -0.162
+    C(wid, intercept)   = -0.148
+    C(cen, intercept)   =  0.129
+    C(wid, slope)       = -0.113

and shows the plot on the left.
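A composite model like `Model(gaussian) + Model(line)` evaluates as the pointwise sum of its components. This standalone sketch (plain NumPy; the component signatures and the area-normalized gaussian form are assumptions chosen to mirror this chapter, and the parameter values are the best-fit numbers from the report above) shows the sum directly:

```python
import numpy as np

def gaussian(x, amp, cen, wid):
    """1-d gaussian profile; area-normalized form assumed here."""
    return (amp / (np.sqrt(2 * np.pi) * wid)) * np.exp(-(x - cen)**2 / (2 * wid**2))

def line(x, slope, intercept):
    """Straight-line background."""
    return slope * x + intercept

# Evaluate the composite at the best-fit values from the report above.
x = np.linspace(0, 10, 101)
model = (gaussian(x, amp=8.45931062, cen=5.65547873, wid=0.67545524)
         + line(x, slope=0.26484404, intercept=-0.96860202))

# The composite peaks near cen; far from the peak only the line remains,
# so model[0] is essentially the intercept.
print(x[model.argmax()], model[0])
```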

@@ -979,25 +979,24 @@ binary operator. A full script using this technique is here:
which prints out the results::

[[Model]]
-    (Model(jump) <function convolve at 0x112ef1320> Model(gaussian))
+    (Model(jump) <function convolve at 0x10480e598> Model(gaussian))
[[Fit Statistics]]
# fitting method = leastsq
-    # function evals   = 27
+    # function evals   = 23
# data points = 201
# variables = 3
-    chi-square         = 21.48845
-    reduced chi-square = 0.10853
-    Akaike info crit   = -443.39364
-    Bayesian info crit = -433.48373
+    chi-square         = 21.6932855
+    reduced chi-square = 0.10956205
+    Akaike info crit   = -441.486726
+    Bayesian info crit = -431.576811
[[Variables]]
-    mid:         5 (fixed)
-    sigma:       0.62393255 +/- 0.012818 (2.05%) (init= 1.5)
-    center:      4.52795480 +/- 0.009261 (0.20%) (init= 3.5)
-    amplitude:   0.62852927 +/- 0.001783 (0.28%) (init= 1)
-    [[Correlations]] (unreported correlations are < 0.100)
-    C(center, amplitude) = 0.339
-    C(sigma, amplitude)  = 0.276
+    mid:        5 (fixed)
+    center:     4.52495463 +/- 0.00937255 (0.21%) (init = 3.5)
+    sigma:      0.62328669 +/- 0.01297258 (2.08%) (init = 1.5)
+    amplitude:  0.62362920 +/- 0.00179096 (0.29%) (init = 1)
+    [[Correlations]] (unreported correlations are < 0.100)
+    C(center, amplitude) = 0.338
+    C(sigma, amplitude)  = 0.276


and shows the plots:
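The custom binary operator in this example is essentially a length-preserving wrapper around `np.convolve`. A minimal sketch of such an operator (the exact edge padding in the full lmfit script may differ), applied to a step function so the smoothing is visible:

```python
import numpy as np

def convolve(arr, kernel):
    """Convolve arr with kernel, padding both edges with the end values
    so the output keeps the length of arr (sketch of the binary operator)."""
    npts = len(arr)
    pad = np.ones(npts)
    tmp = np.concatenate((pad * arr[0], arr, pad * arr[-1]))
    out = np.convolve(tmp, kernel, mode='valid')
    noff = (len(out) - npts) // 2
    # Normalize by the kernel sum so flat regions keep their level.
    return out[noff:noff + npts] / kernel.sum()

x = np.linspace(0, 10, 201)
jump = np.where(x < 5.0, 0.0, 1.0)              # step centered at mid = 5
kernel = np.exp(-(x - 5.0)**2 / (2 * 0.6**2))   # gaussian smoothing kernel

smoothed = convolve(jump, kernel)

# Length is preserved; the sharp edge becomes a smooth monotone rise
# from ~0 up to ~1.
print(len(smoothed), smoothed[0], smoothed[-1])
```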
11 changes: 11 additions & 0 deletions doc/whatsnew.rst
@@ -11,6 +11,17 @@ significant to the use and behavior of the library. This is not meant
to be a comprehensive list of changes. For such a complete record,
consult the `lmfit github repository`_.

+
+ .. _whatsnew_099_label:
+
+ Version 0.9.9 Release Notes
+ ==========================================
+ Lmfit now uses the asteval (https://github.com/newville/asteval) package
+ instead of distributing its own copy. The minimum required asteval version
+ is 0.9.12, which is available on PyPI. If you see import errors related to
+ asteval, please make sure that you actually have the latest version installed.
+
+
.. _whatsnew_096_label:

Version 0.9.6 Release Notes
2 changes: 1 addition & 1 deletion lmfit/model.py
@@ -1816,7 +1816,7 @@ def plot(self, datafmt='o', fitfmt='-', initfmt='--', xlabel=None,
Keyword arguments for the axes for the fit plot.
fig_kws : dict, optional
Keyword arguments for a new figure, if there is one being created.
-        sinitial conditions for the fithow_init : bool, optional
+        show_init : bool, optional
Whether to show the initial conditions for the fit (default is False).
Returns