Merge pull request #136 from ArnoStrouwen/LT
[skip ci] LanguageTool
ChrisRackauckas authored Jan 8, 2023
2 parents 00165b5 + 09ca37c commit fa7a14d
Showing 35 changed files with 271 additions and 268 deletions.
32 changes: 16 additions & 16 deletions docs/src/comparisons/cppfortran.md
@@ -9,7 +9,7 @@ to help the transition.

## Why SciML? High-Level Workflow Reasons

-If you're coming from "hardcore" C++/Fortran computing environments, some things to check
+If you're coming from hardcore C++/Fortran computing environments, some things to check
out with Julia's SciML are:

* **Interactivity** - use the interactive REPL to easily investigate numerical details.
@@ -21,15 +21,15 @@ out with Julia's SciML are:
* **Symbolic modeling languages** - writing models by hand can leave a lot of performance
on the table. Using high-level modeling tools like
[ModelingToolkit](https://github.com/SciML/ModelingToolkit.jl) can automate symbolic
-  simplifications which
+  simplifications, which
[improve the stability and performance of numerical solvers](https://www.youtube.com/watch?v=ZFoQihr3xLs).
On complex models, even the best handwritten C++/Fortran code is orders of magnitude behind
the code that symbolic tearing algorithms can achieve!
* **Composable Library Components** - In C++/Fortran environments, every package feels like
a silo. Arrays made for PETSc cannot easily be used in Trilinos, and converting Sundials
NVector outputs to DataFrames for post-simulation data processing is a process itself.
The Julia SciML environment embraces interoperability. Don't wait for SciML to do it: by
-  using generic coding with JIT compilation these connections create new optimized code on
+  using generic coding with JIT compilation, these connections create new optimized code on
the fly and allow for a more expansive feature set than can ever be documented. Take
[new high-precision number types from a package](https://github.com/JuliaArbTypes/ArbFloats.jl)
and stick them into a nonlinear solver. Take
@@ -43,16 +43,16 @@ out with Julia's SciML are:
one line of code. This gives you a way to incrementally adopt new features/methods
while retaining the older pieces you know and trust.
* **Don't Start from Scratch** - SciML builds on the extensive
-  [Base library of Julia](https://docs.julialang.org/en/v1/) and thus grows and improves
+  [Base library of Julia](https://docs.julialang.org/en/v1/), and thus grows and improves
with every update to the language. With hundreds of monthly contributors to SciML and
hundreds of monthly contributors to Julia, SciML is one of the most actively developed
-  open source scientific computing ecosystems out there!
+  open-source scientific computing ecosystems out there!
* **Easier High-Performance and Parallel Computing** - With Julia's ecosystem,
  [CUDA](https://github.com/JuliaGPU/CUDA.jl) will automatically install all of the required
binaries and `cu(A)*cu(B)` is then all that's required to GPU-accelerate large-scale
linear algebra. [MPI](https://github.com/JuliaParallel/MPI.jl) is easy to install and
use. [Distributed computing through password-less SSH](https://docs.julialang.org/en/v1/manual/distributed-computing/). [Multithreading](https://docs.julialang.org/en/v1/manual/multi-threading/)
-  is automatic and baked into a lot of libraries, with a specialized algorithm to ensure
+  is automatic and baked into many libraries, with a specialized algorithm to ensure
hierarchical usage does not oversubscribe threads. Basically, libraries give you a lot
of parallelism for free, and doing the rest is a piece of cake.
* **Mix Scientific Computing with Machine Learning** - Want to [automate the discovery
@@ -68,7 +68,7 @@ solvers:
## Why SciML? Some Technical Details

Let's face the facts: in the [open benchmarks](https://benchmarks.sciml.ai/stable/), the
-pure-Julia solvers tend to outperform the classic "best" C++ and Fortran solvers in almost
+pure-Julia solvers tend to outperform the classic best C++ and Fortran solvers in almost
every example (with a few notable exceptions). But why?

The answer is two-fold: Julia is as fast as C++/Fortran, and the algorithms are what matter.
@@ -94,8 +94,8 @@ There are many ways which Julia's algorithms achieve performance advantages. Som
highlight include:

* Julia is at the forefront of numerical methods research in many domains. This is highlighted
-  in [the differential equation solver comparisons](https://www.stochasticlifestyle.com/comparison-differential-equation-solver-suites-matlab-r-julia-python-c-fortran/)
-  where the Julia solvers were the first to incorporate "newer" optimized Runge-Kutta tableaus,
+  in [the differential equation solver comparisons](https://www.stochasticlifestyle.com/comparison-differential-equation-solver-suites-matlab-r-julia-python-c-fortran/),
+  where the Julia solvers were the first to incorporate newer optimized Runge-Kutta tableaus,
around half a decade before other software. Since then, the literature has only continued
  to evolve, and only Julia's SciML keeps up. At this point, many publications' first
  implementations are in [OrdinaryDiffEq.jl](https://github.com/SciML/OrdinaryDiffEq.jl) with
@@ -107,8 +107,8 @@ highlight include:
However, in modern Julia, every function from `log` to `^` has been reimplemented in the
Julia standard library to improve numerical correctness and performance. For example,
[Pumas, the nonlinear mixed effects estimation system](https://www.biorxiv.org/content/10.1101/2020.11.28.402297v2)
-  built on SciML and
-  [used by Moderna for the vaccine trials](https://www.youtube.com/watch?v=6wGSCD3cI9E)
+  built on SciML, and
+  [used by Moderna for the vaccine trials](https://www.youtube.com/watch?v=6wGSCD3cI9E),
  notes in its paper that approximations to such math libraries themselves gave a 2x performance
  improvement in even the simplest non-stiff ODE solvers over matching Fortran
implementations. Pure Julia linear algebra tooling, like
@@ -123,11 +123,11 @@ highlight include:
[sparsity patterns are automatically deduced from code and optimized on](https://openreview.net/pdf?id=rJlPdcY38B). [Nonlinear equations are symbolically-torn](https://www.youtube.com/watch?v=ZFoQihr3xLs), changing large nonlinear systems into sequential solving of much smaller
systems and benefiting from an O(n^3) cost reduction. These can be orders of magnitude
cost reductions which come for free, and unless you know every trick in the book it will
-  be hard to match SciML's performance!
+  be difficult to match SciML's performance!
* Pervasive automatic differentiation mixed with compiler tricks wins battles. Many
  high-performance libraries in C++ and Fortran cannot assume that all of their code is
compatible with automatic differentiation, and thus many internal performance tricks are
-  not applied. For example
+  not applied. For example,
[ForwardDiff.jl's chunk seeding](https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/docs/src/dev/how_it_works.md) allows for a single call to `f` to generate multiple columns of a Jacobian.
When mixed with [sparse coloring tools](https://github.com/JuliaDiff/SparseDiffTools.jl),
entire Jacobians can be constructed with just a few `f` calls. Studies in applications
@@ -138,8 +138,8 @@ highlight include:

To really highlight how JIT compilation and automatic differentiation integration can
change algorithms, let's look at the problem of differentiating an ODE solver. As is
-[derived an discussed in detail at a seminar with the American Statistical Association](https://www.youtube.com/watch?v=Xwh42RhB7O4),
-there are many ways to implement well-known "adjoint" methods which are required for
+[derived and discussed in detail at a seminar with the American Statistical Association](https://www.youtube.com/watch?v=Xwh42RhB7O4),
+there are many ways to implement well-known adjoint methods which are required for
performance. Each has different stability and performance trade-offs, and
[Julia's SciML is the only system to systematically offer all of the trade-off options](https://sensitivity.sciml.ai/stable/manual/differential_equation_sensitivities/). In many cases,
using analytical adjoints of a solver is not advised due to performance reasons, [with the
@@ -153,7 +153,7 @@ showcased as the seeding methods in this plot:

![](https://i0.wp.com/www.stochasticlifestyle.com/wp-content/uploads/2022/10/Capture7.png?w=2091&ssl=1)

-Unless one directly defines special "vjp" functions, this is how the Julia SciML methods
+Unless one directly defines special vjp functions, this is how the Julia SciML methods
achieve orders of magnitude performance advantages over CVODES's adjoints and PETSC's
TS-adjoint.
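The chunk-seeding trick mentioned in this file is concrete enough to sketch. The following is a minimal, hypothetical illustration in pure Python (not ForwardDiff.jl's actual implementation, and the `Dual` class and `jacobian` helper are invented names): a dual number carries one partial derivative per seed direction, so seeding all basis directions at once recovers every Jacobian column from a single call to `f`.

```python
class Dual:
    """A scalar value paired with a vector of partials, one per seed direction."""
    def __init__(self, value, partials):
        self.value = value
        self.partials = partials

    def __add__(self, other):
        if isinstance(other, Dual):
            return Dual(self.value + other.value,
                        [a + b for a, b in zip(self.partials, other.partials)])
        return Dual(self.value + other, list(self.partials))

    def __mul__(self, other):
        if isinstance(other, Dual):
            # product rule: (uv)' = u'v + uv'
            return Dual(self.value * other.value,
                        [a * other.value + self.value * b
                         for a, b in zip(self.partials, other.partials)])
        return Dual(self.value * other, [a * other for a in self.partials])

    __radd__ = __add__
    __rmul__ = __mul__

def jacobian(f, x):
    """Full Jacobian of f at x from ONE call to f, by seeding the identity
    matrix as the partials (i.e., chunk size = len(x))."""
    n = len(x)
    duals = [Dual(x[i], [1.0 if j == i else 0.0 for j in range(n)])
             for i in range(n)]
    return [out.partials for out in f(duals)]

# f : R^2 -> R^2, f(x, y) = (x*y, x + 3*y)
def f(v):
    x, y = v
    return [x * y, x + 3 * y]

print(jacobian(f, [2.0, 5.0]))  # [[5.0, 2.0], [1.0, 3.0]]
```

ForwardDiff.jl uses a tunable chunk size and considerably more machinery (tagging, nested duals), but the seeding principle is the same.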

14 changes: 7 additions & 7 deletions docs/src/comparisons/matlab.md
@@ -1,7 +1,7 @@
# [Getting Started with Julia's SciML for the MATLAB User](@id matlab)

If you're a MATLAB user who has looked into Julia for some performance improvements, you
-may have noticed that the standard library does not have all of the "batteries" included
+may have noticed that the standard library does not have all of the batteries included
with a base MATLAB installation. Where's the ODE solver? Where's `fmincon` and `fsolve`?
Those scientific computing functionalities are the pieces provided by the Julia SciML
ecosystem!
@@ -15,13 +15,13 @@ ecosystem!
grow as more complex algorithms are required.
* **Julia is quick to learn from MATLAB** - Most ODE codes can be translated in a few
minutes. If you need help, check out the
-  [QuantEcon MATLAB-Python-Julia Cheatsheet.](https://cheatsheets.quantecon.org/)
+  [QuantEcon MATLAB-Python-Julia Cheat Sheet.](https://cheatsheets.quantecon.org/)
* **Package Management and Versioning** - [Julia's package manager](https://github.com/JuliaLang/Pkg.jl)
takes care of dependency management, testing, and continuous delivery in order to make
the installation and maintenance process smoother. For package users, this means it's
easier to get packages with complex functionality in your hands.
* **Free and Open Source** - If you want to know how things are being computed, just look
-  [at our Github organization](https://github.com/SciML). Lots of individuals use Julia's
+  [at our GitHub organization](https://github.com/SciML). Lots of individuals use Julia's
SciML to research how the algorithms actually work because of how accessible and tweakable
the ecosystem is!
* **Composable Library Components** - In MATLAB environments, every package feels like
@@ -37,7 +37,7 @@ ecosystem!
binaries and `cu(A)*cu(B)` is then all that's required to GPU-accelerate large-scale
linear algebra. [MPI](https://github.com/JuliaParallel/MPI.jl) is easy to install and
use. [Distributed computing through password-less SSH](https://docs.julialang.org/en/v1/manual/distributed-computing/). [Multithreading](https://docs.julialang.org/en/v1/manual/multi-threading/)
-  is automatic and baked into a lot of libraries, with a specialized algorithm to ensure
+  is automatic and baked into many libraries, with a specialized algorithm to ensure
hierarchical usage does not oversubscribe threads. Basically, libraries give you a lot
of parallelism for free, and doing the rest is a piece of cake.
* **Mix Scientific Computing with Machine Learning** - Want to [automate the discovery
@@ -52,16 +52,16 @@ In this plot, `MATLAB` in orange represents MATLAB's most commonly used solvers:
## Need a case study?

Check out [this talk from NASA Scientists getting a 15,000x acceleration by switching from
-Simulink to Julia's ModelingToolkit!](https://www.youtube.com/watch?v=tQpqsmwlfY0).
+Simulink to Julia's ModelingToolkit!](https://www.youtube.com/watch?v=tQpqsmwlfY0)

## Need Help Translating from MATLAB to Julia?

The following resources can be particularly helpful when adopting Julia for SciML for the
first time:

-* [QuantEcon MATLAB-Python-Julia Cheatsheet](https://cheatsheets.quantecon.org/)
+* [QuantEcon MATLAB-Python-Julia Cheat Sheet](https://cheatsheets.quantecon.org/)
* [The Julia Manual's Noteworthy Differences from MATLAB page](https://docs.julialang.org/en/v1/manual/noteworthy-differences/#Noteworthy-differences-from-MATLAB)
-* Double check your results with [MATLABDiffEq.jl](https://github.com/SciML/MATLABDiffEq.jl)
+* Double-check your results with [MATLABDiffEq.jl](https://github.com/SciML/MATLABDiffEq.jl)
(automatically converts and runs ODE definitions with MATLAB's solvers)
* Use [MATLAB.jl](https://github.com/JuliaInterop/MATLAB.jl) to more incrementally move
code to Julia.
12 changes: 6 additions & 6 deletions docs/src/comparisons/python.md
@@ -1,9 +1,9 @@
# [Getting Started with Julia's SciML for the Python User](@id python)

-If you're an Python user who has looked into Julia, you're probably wondering what is the
+If you're a Python user who has looked into Julia, you're probably wondering what the
equivalent to SciPy is. And you found it: it's the SciML ecosystem! To a Python developer,
SciML is SciPy, but with the high-performance GPU capabilities of PyTorch and
-neural network capabilities, all baked right in. With SciML, there is no "separate world"
+neural network capabilities, all baked right in. With SciML, there is no separate world
of machine learning sublanguages: there is just one cohesive package ecosystem.

## Why SciML? High-Level Workflow Reasons
@@ -30,7 +30,7 @@ of machine learning sublanguages: there is just one cohesive package ecosystem.
binaries and `cu(A)*cu(B)` is then all that's required to GPU-accelerate large-scale
linear algebra. [MPI](https://github.com/JuliaParallel/MPI.jl) is easy to install and
use. [Distributed computing through password-less SSH](https://docs.julialang.org/en/v1/manual/distributed-computing/). [Multithreading](https://docs.julialang.org/en/v1/manual/multi-threading/)
-  is automatic and baked into a lot of libraries, with a specialized algorithm to ensure
+  is automatic and baked into many libraries, with a specialized algorithm to ensure
hierarchical usage does not oversubscribe threads. Basically, libraries give you a lot
of parallelism for free, and doing the rest is a piece of cake.
* **Mix Scientific Computing with Machine Learning** - Want to [automate the discovery
@@ -48,7 +48,7 @@ The following resources can be particularly helpful when adopting Julia for SciM
first time:

* [The Julia Manual's Noteworthy Differences from Python page](https://docs.julialang.org/en/v1/manual/noteworthy-differences/#Noteworthy-differences-from-Python)
-* Double check your results with [SciPyDiffEq.jl](https://github.com/SciML/SciPyDiffEq.jl)
+* Double-check your results with [SciPyDiffEq.jl](https://github.com/SciML/SciPyDiffEq.jl)
(automatically converts and runs ODE definitions with SciPy's solvers)
* Use [PyCall.jl](https://github.com/JuliaPy/PyCall.jl) to more incrementally move
code to Julia.
@@ -59,7 +59,7 @@ The following chart will help you get quickly acquainted with Julia's SciML Tool

|Workflow Element|SciML-Supported Julia packages|
| --- | --- |
-|matplotlib|[Plots](https://docs.juliaplots.org/stable/), [Makie](https://docs.makie.org/stable/)|
+|Matplotlib|[Plots](https://docs.juliaplots.org/stable/), [Makie](https://docs.makie.org/stable/)|
|`scipy.special`|[SpecialFunctions](https://github.com/JuliaMath/SpecialFunctions.jl)|
|`scipy.linalg.solve`|[LinearSolve](http://linearsolve.sciml.ai/dev/)|
|`scipy.integrate`|[Integrals](https://integrals.sciml.ai/)|
@@ -86,5 +86,5 @@ you use analytical adjoint definitions? You can, but there are tricks to mix aut
differentiation into the adjoint definitions for a few orders of magnitude improvement too,
as [explained in this blog post](https://www.stochasticlifestyle.com/direct-automatic-differentiation-of-solvers-vs-analytical-adjoints-which-is-better/).

-This facts, along with many others, compose to algorithmic improvements with the
+These facts, along with many others, compose to algorithmic improvements with the
implementation improvements, which leads to orders of magnitude improvements!
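The idea of differentiating through a solver directly, as discussed in the adjoint passage above, can be sketched in a few lines. This is a hypothetical pure-Python illustration on a toy model (not how SciMLSensitivity implements its methods; the function names are invented): propagating a sensitivity variable `s = du/dp` alongside an explicit Euler solve of `u' = -p*u` gives the exact derivative of the discrete solution with respect to the parameter `p`.

```python
import math

def euler_solve(p, u0=1.0, t_end=1.0, steps=1000):
    """Explicit Euler for u' = -p*u; returns u(t_end)."""
    dt = t_end / steps
    u = u0
    for _ in range(steps):
        u = u + dt * (-p * u)
    return u

def euler_solve_with_sensitivity(p, u0=1.0, t_end=1.0, steps=1000):
    """Same solve, but also propagate s = du/dp through every step
    (forward sensitivity), using ds/dt = -u - p*s with s(0) = 0."""
    dt = t_end / steps
    u, s = u0, 0.0
    for _ in range(steps):
        # both updates use the values from the *start* of the step
        u, s = u + dt * (-p * u), s + dt * (-u - p * s)
    return u, s

p = 2.0
u_end, dudp = euler_solve_with_sensitivity(p)

# the exact solution is u(1) = e^{-p}, so du/dp = -e^{-p}
print(u_end, math.exp(-p))   # close, up to O(dt) discretization error
print(dudp, -math.exp(-p))   # the sensitivity tracks the true derivative

# a central finite difference of the *discrete* solver matches the
# propagated sensitivity almost exactly, since the propagation is the
# exact derivative of the discrete map
h = 1e-6
fd = (euler_solve(p + h) - euler_solve(p - h)) / (2 * h)
print(abs(dudp - fd))
```

The trade-offs discussed in the file (forward vs. adjoint, stability vs. memory) are about scaling this idea to large stiff systems, where the naive forward propagation shown here becomes too expensive.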
8 changes: 4 additions & 4 deletions docs/src/comparisons/r.md
@@ -21,16 +21,16 @@ is the ecosystem for doing this with Julia.
the differential equation solver to use specialized hardware acceleration.
* **A Global Harmonious Documentation for Scientific Computing** - R's documentation for
scientific computing is scattered in a bunch of individual packages where the developers
-  do not talk to each other! This not only leads to documentation differences but also
-  "style" differences: one package uses `tol` while the other uses `atol`. With Julia's
+  do not talk to each other! This not only leads to documentation differences, but also
+  style differences: one package uses `tol` while the other uses `atol`. With Julia's
SciML, the whole ecosystem is considered together, and inconsistencies are handled at the
global level. The goal is to be working in one environment with one language.
* **Easier High-Performance and Parallel Computing** - With Julia's ecosystem,
  [CUDA](https://github.com/JuliaGPU/CUDA.jl) will automatically install all of the required
binaries and `cu(A)*cu(B)` is then all that's required to GPU-accelerate large-scale
linear algebra. [MPI](https://github.com/JuliaParallel/MPI.jl) is easy to install and
use. [Distributed computing through password-less SSH](https://docs.julialang.org/en/v1/manual/distributed-computing/). [Multithreading](https://docs.julialang.org/en/v1/manual/multi-threading/)
-  is automatic and baked into a lot of libraries, with a specialized algorithm to ensure
+  is automatic and baked into many libraries, with a specialized algorithm to ensure
hierarchical usage does not oversubscribe threads. Basically, libraries give you a lot
of parallelism for free, and doing the rest is a piece of cake.
* **Mix Scientific Computing with Machine Learning** - Want to [automate the discovery
@@ -50,7 +50,7 @@ first time:
* [The Julia Manual's Noteworthy Differences from R page](https://docs.julialang.org/en/v1/manual/noteworthy-differences/#Noteworthy-differences-from-R)
* [Tutorials on Data Wrangling and Plotting in Julia (Sections 4 and 5)](http://tutorials.pumas.ai/)
written for folks with a background in R.
-* Double check your results with [deSolveDiffEq.jl](https://github.com/SciML/deSolveDiffEq.jl)
+* Double-check your results with [deSolveDiffEq.jl](https://github.com/SciML/deSolveDiffEq.jl)
(automatically converts and runs ODE definitions with R's deSolve solvers)
* Use [RCall.jl](https://juliainterop.github.io/RCall.jl/stable/) to more incrementally move
code to Julia.
