
docs/readme: add leaf book to readme
MichaelHirn committed Apr 20, 2016
1 parent af0d7e6 commit ad03fa9
Showing 6 changed files with 127 additions and 180 deletions.
155 changes: 54 additions & 101 deletions README.md
@@ -2,44 +2,40 @@

## Introduction

Leaf is an open Machine Learning Framework for hackers to build classical, deep
or hybrid machine learning applications. It was inspired by the brilliant people
behind TensorFlow, Torch, Caffe, Rust and numerous research papers and brings
modularity, performance and portability to deep learning.

Leaf has one of the simplest APIs, is lean and tries to introduce minimal
technical debt to your stack.

Leaf is a few months old, but thanks to its architecture and Rust, it is already
one of the fastest Machine Intelligence Frameworks available.

<div align="center">
<img src="http://autumnai.com/images/autumn_leaf_benchmarks_alexnet.png"><br><br>
</div>

> See more Deep Neural Networks benchmarks on [Deep Learning Benchmarks][deep-learning-benchmarks-website].
Leaf is portable. Run it on CPUs, GPUs, and FPGAs, on machines with an OS, or on
machines without one. Run it with OpenCL or CUDA. Credit goes to
[Collenchyma][collenchyma] and Rust.

Leaf is part of the [Autumn][autumn] Machine Intelligence Platform, which is
working on making AI algorithms 100x more computationally efficient.

We see Leaf as the core for constructing high-performance machine intelligence
applications. Leaf's design makes it easy to publish independent modules that
make, for example, deep reinforcement learning, visualization and monitoring,
network distribution, [automated preprocessing][cuticula] or scalable production
deployment easily accessible for everyone.

For more info, refer to
* the [Leaf examples][leaf-examples],
* the [Leaf Documentation][documentation],
* the [Autumn Website][autumn] or
* the [Q&A](#qa)

[caffe]: https://github.com/BVLC/caffe
[rust]: https://www.rust-lang.org/
[autumn]: http://autumnai.com
[leaf-book]: http://autumnai.com/leaf/book
[tensorflow]: https://github.com/tensorflow/tensorflow
[benchmarks]: #benchmarks
[leaf-examples]: #examples
@@ -52,22 +48,42 @@ For more info, refer to
## Getting Started

### Documentation

To learn how to build classical, deep or hybrid machine learning applications with Leaf, check out the [Leaf - Machine Learning for Hackers][leaf-book] book.

For additional information see the [Rust API Documentation][documentation] or the [Autumn Website][autumn].

Or start by running the **Leaf examples**.

We are providing a [Leaf examples repository][leaf-examples], where we and
others publish executable machine learning models built with Leaf. It features
a CLI for easy usage and has a detailed guide in the [project
README.md][leaf-examples].

Leaf comes with an examples directory as well, which features popular neural
networks (e.g. AlexNet, Overfeat, VGG). To run them on your machine, just follow
the install guide, clone this repository and then run:

```bash
# The examples currently require CUDA support.
cargo run --release --no-default-features --features cuda --example benchmarks alexnet
```

[leaf-examples]: https://github.com/autumnai/leaf-examples

### Installation

> Leaf is built in [Rust][rust]. If you are new to Rust you can install Rust as detailed [here][rust_download].
> We also recommend taking a look at the [official Rust - Getting Started Guide][rust_getting_started].

To start building a machine learning application (Rust only for now; wrappers are welcome), add Leaf to your `Cargo.toml` if you are using Cargo:

```toml
[dependencies]
leaf = "0.2.0"
```
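To depend on Leaf with a specific backend only, Cargo also lets you select
feature flags in the dependency declaration itself. A sketch, assuming the
`cuda` feature flag used by the example command above (see
[FEATURE-FLAGS.md](./FEATURE-FLAGS.md) for the flags that actually exist):

```toml
[dependencies]
leaf = { version = "0.2.0", default-features = false, features = ["cuda"] }
```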

If you're using [Cargo Edit][cargo-edit], you can
call:

```bash
cargo add leaf
```
[rust_download]: https://www.rust-lang.org/downloads.html
[rust_getting_started]: https://doc.rust-lang.org/book/getting-started.html
[cargo-edit]: https://github.com/killercup/cargo-edit
@@ -88,24 +104,24 @@ opencl = ["leaf/opencl"]

> More information on the use of feature flags in Leaf can be found in [FEATURE-FLAGS.md](./FEATURE-FLAGS.md)
### Contributing

If you want to start hacking on Leaf (e.g.
[adding a new `Layer`](http://autumnai.com/leaf/book/create-new-layer.html)),
you should start by forking and cloning the repository.

We have more instructions to help you get started in the [CONTRIBUTING.md][contributing].

We also have a near real-time collaboration culture, which happens
here on GitHub and on the [Leaf Gitter Channel][gitter-leaf].

> Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as below, without any additional terms or conditions.
[contributing]: CONTRIBUTING.md
[gitter-leaf]: https://gitter.im/autumnai/leaf
[mj]: https://twitter.com/mjhirn
[hobofan]: https://twitter.com/hobofan
[irc]: https://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust-machine-learning

## Ecosystem / Extensions

@@ -120,84 +136,21 @@ and extensible as possible. More helpful crates you can use with Leaf:

## Support / Contact

- With a bit of luck, you can find us online on the #rust-machine-learning IRC at irc.mozilla.org,
- but we are always approachable on [Gitter/Leaf][gitter-leaf]
- For bugs and feature requests, you can create a [Github issue][leaf-issue]
- For more private matters, send us email straight to our inbox: [email protected]
- Refer to [Autumn][autumn] for more information

[leaf-issue]: https://github.com/autumnai/leaf/issues


## Changelog

You can find the release history in [CHANGELOG.md][changelog]. We are using [Clog][clog], the Rust tool for auto-generating CHANGELOG files.

[changelog]: CHANGELOG.md
[Clog]: https://github.com/clog-tool/clog-cli

## Q&A

#### _Why Rust?_

Hardware has only recently become powerful enough to support real-world
usage of machine intelligence, e.g. super-human image recognition and
self-driving cars. To take advantage of the computational power of the
underlying hardware, from GPUs to clusters, you need a low-level language that
allows for control of memory. But to make machine intelligence widely
accessible, you want a high-level, comfortable abstraction over the underlying
hardware.

Rust allows us to cross this chasm.
Rust promises performance like C/C++, but with safe memory control. For now we
can use Rust wrappers around performant C libraries. But in the future, libraries
rewritten in Rust will have the advantage of zero-cost safe memory control,
which will make large, parallel learning networks over CPUs and GPUs more
feasible and more reliable to develop. The development of these future libraries
is already under way, e.g. [Glium][glium].

On the usability side, Rust offers a trait-system that makes it easy for
researchers and hobbyists alike to extend and work with Leaf as if it were
written in a higher-level language such as Ruby, Python, or Java.

#### _Who can use Leaf?_

We develop Leaf under the MIT open source license, which, paired with the easy
access and performance, makes Leaf a first-choice option for researchers and
developers alike.

#### _Why did you open source Leaf?_

We believe strongly in machine intelligence and think that it will have a major
impact on future innovations, products and our society. At Autumn, we experienced
a lack of common, well-engineered tools for machine learning and therefore
started to create a modular toolbox for machine learning in Rust. We hope that,
by making our work open source, we will speed up research and development of
production-ready applications and make that work easier as well.

#### _Who is Autumn?_

Autumn is a startup working on automated decision making. Autumn was started by
two developers, MJ and Max. The startup is located in Berlin and recently
received a pre-seed investment from Axel Springer and Plug&Play.

[glium]: https://github.com/tomaka/glium

## License

Licensed under either of
2 changes: 1 addition & 1 deletion doc/book/index.html
@@ -108,7 +108,7 @@ <h1>Leaf - Machine Learning for Hackers</h1>
classical, stochastic or hybrids, and solvers for executing and optimizing the
model.</p>
<p>This is already the entire API for machine learning with Leaf. To learn how
this is possible and how to build machine learning applications, refer to chapters
<a href="./layers.html">2. Layers</a> and <a href="./solvers.html">3. Solvers</a>. Enjoy!</p>
<h2>Benefits+</h2>
<p>Leaf was built with three concepts in mind: accessibility/simplicity,
61 changes: 29 additions & 32 deletions doc/book/layer-lifecycle.html
@@ -68,26 +68,24 @@ <h1 class="menu-title"></h1>

<div id="content" class="content">
<h1>Layer Lifecycle</h1>
<p>In chapter <a href="./layers.html">2. Layers</a> we saw how to
construct a simple <code>Layer</code> from a <code>LayerConfig</code>. In this chapter, we take
a closer look at what happens inside Leaf when initializing a <code>Layer</code> and when running its
<code>.forward</code> and <code>.backward</code> methods. In the next chapter <a href="./building-networks.html">2.2 Create a Network</a> we
apply our knowledge to construct deep networks with the container layer.</p>
<p>The most important methods of a <code>Layer</code> are initialization (<code>::from_config</code>), <code>.forward</code> and <code>.backward</code>.
They basically describe the entire API, so let's take a closer look at what happens inside Leaf when these methods are called.</p>
<h3>Initialization</h3>
<p>A layer is constructed from a <code>LayerConfig</code> with the <code>Layer::from_config</code>
method, which returns a fully initialized <code>Layer</code>.</p>
<pre><code class="language-rust">let mut sigmoid: Layer = Layer::from_config(backend.clone(), &amp;LayerConfig::new(&quot;sigmoid&quot;, LayerType::Sigmoid));
let mut alexnet: Layer = Layer::from_config(backend.clone(), &amp;LayerConfig::new(&quot;alexnet&quot;, LayerType::Sequential(cfg)));
</code></pre>
<p>In the example above, the first layer has a Sigmoid worker
(<code>LayerType::Sigmoid</code>) and the second layer has a Sequential worker.
Although both <code>::from_config</code> methods return a <code>Layer</code>, the behavior of
that <code>Layer</code> depends on the <code>LayerConfig</code> it was constructed with. The
<code>Layer::from_config</code> internally calls the <code>worker_from_config</code> method, which
constructs the specific worker defined by the <code>LayerConfig</code>.</p>
<pre><code class="language-rust">fn worker_from_config(backend: Rc&lt;B&gt;, config: &amp;LayerConfig) -&gt; Box&lt;ILayer&lt;B&gt;&gt; {
match config.layer_type.clone() {
@@ -99,35 +97,34 @@ <h3>Initialization</h3>
}
}
</code></pre>
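<p>As a rough, self-contained sketch of this dispatch pattern (hypothetical,
simplified names; the real Leaf code returns boxed <code>ILayer</code> trait
objects), the config decides which worker value gets constructed:</p>
<pre><code class="language-rust">// Simplified stand-ins: LayerKind plays the role of LayerConfig,
// WorkerImpl plays the role of the worker structs.
enum LayerKind { Sigmoid, Sequential }

#[derive(Debug, PartialEq)]
enum WorkerImpl { SigmoidWorker, SequentialWorker }

// Analogue of worker_from_config: the config selects the worker.
fn worker_from_config(kind: LayerKind) -> WorkerImpl {
    match kind {
        LayerKind::Sigmoid => WorkerImpl::SigmoidWorker,
        LayerKind::Sequential => WorkerImpl::SequentialWorker,
    }
}
</code></pre>
<p>In Leaf itself, the same idea is what lets a single entry point construct
workers with very different behavior.</p>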
<p>The layer-specific <code>::from_config</code> (if available or needed) then takes care of
initializing the worker struct, allocating memory for weights and so on.</p>
<p>If the worker is a container layer, its <code>::from_config</code> takes
care of initializing all the <code>LayerConfig</code>s it contains (which were added via its
<code>.add_layer</code> method) and connecting them in the order they were provided.</p>
<p>Every <code>.forward</code> or <code>.backward</code> call that is made on the returned <code>Layer</code> is
run by the internal worker.</p>
<h3>Forward</h3>
<p>The <code>forward</code> method of a <code>Layer</code> threads the input through the constructed
network and returns the output of the network's final layer.</p>
<p>The <code>.forward</code> method does three things:</p>
<ol>
<li>Reshape the input data if necessary</li>
<li>Sync the input/weights to the device where the computation happens. This step
removes the need for the worker layer to care about memory synchronization.</li>
<li>Call the <code>forward</code> method of the internal worker layer.</li>
</ol>
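<p>A hedged, self-contained sketch of these three steps (illustrative names
only; the real implementation works on tensors and device memory): a reshape
step and a stubbed sync step, followed by the worker computation, here an
element-wise sigmoid:</p>
<pre><code class="language-rust">fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Illustrative forward pass on a tiny fixed-size "blob" instead of a tensor.
fn forward(input: [f64; 3]) -> [f64; 3] {
    // 1. Reshape the input data if necessary (a no-op for this fixed shape).
    // 2. Sync input/weights to the compute device (stubbed out here).
    // 3. Call the forward method of the worker, e.g. a sigmoid layer.
    let mut output = input;
    for v in output.iter_mut() {
        *v = sigmoid(*v);
    }
    output
}
</code></pre>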
<p>If the worker layer is a container layer, the <code>.forward</code> method
takes care of calling the <code>.forward</code> methods of its managed
layers in the right order.</p>
<h3>Backward</h3>
<p>The <code>.backward</code> method of a <code>Layer</code> works similarly to <code>.forward</code>, apart from
not needing to reshape the input. The <code>.backward</code> method computes
the gradient with respect to the input as well as the gradient w.r.t. the parameters. However,
the method only returns the input gradient, because that is all that is needed to compute the
gradient of the entire network via the chain rule.</p>
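<p>To see why the input gradient alone is enough to keep the chain rule going,
consider two chained scalar "layers" (a simplified sketch, not Leaf code):</p>
<pre><code class="language-rust">// y = f(g(x)) with g(x) = 2x and f(u) = u * u, so y = 4x^2 and dy/dx = 8x.
fn g_forward(x: f64) -> f64 { 2.0 * x }
fn f_forward(u: f64) -> f64 { u * u }

// Each backward step multiplies the incoming gradient by its local
// derivative and returns only the gradient w.r.t. its own input.
fn f_backward(u: f64, grad_out: f64) -> f64 { grad_out * 2.0 * u } // df/du = 2u
fn g_backward(grad_out: f64) -> f64 { grad_out * 2.0 }             // dg/dx = 2
</code></pre>
<p>Running the forward pass at x = 3 gives u = 6 and y = 36; seeding the
backward pass with dy/dy = 1 yields dy/du = 12 and dy/dx = 24, which matches
8x at x = 3, without parameter gradients ever crossing layer boundaries.</p>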
<p>If the worker layer is a container layer, the <code>.backward</code> method
takes care of calling the <code>.backward_input</code> and
<code>.backward_parameter</code> methods of its managed layers in the right order.</p>

</div>
12 changes: 6 additions & 6 deletions doc/book/layers.html
@@ -69,7 +69,7 @@ <h1 class="menu-title"></h1>
<div id="content" class="content">
<h1>Layers</h1>
<h3>What is a Layer?</h3>
<p><a href="./deep-learning-glossary.html#Layer">Layers</a> are the only building
blocks in Leaf. As we will see later on, everything is a layer. Even when
we construct <a href="./deep-learning-glossary.html#Network">networks</a>, we are still just
working with layers composed of smaller layers. This makes the API clean and expressive.</p>
@@ -157,15 +157,15 @@ <h4>Container Layers</h4>
can be found at
<a href="https://github.com/autumnai/leaf/tree/master/src/layers/container">src/layers/container</a>.</p>
<h3>Why Layers?</h3>
<p>The benefit of using a layer-based design approach is that it allows for a very expressive
setup that can represent, as far as we know, any machine learning algorithm.
That makes Leaf a framework that can be used to construct practical machine
learning applications that combine different paradigms.</p>
<p>Other machine learning frameworks take a symbolic instead of a layered approach.
For Leaf we decided against it, as we found it easier for developers to work with
layers than mathematical expressions. More complex algorithms like LSTMs are
also harder to replicate in a symbolic framework. We
believe that Leaf's layer approach strikes a great balance between
expressiveness, usability and performance.</p>

</div>
2 changes: 1 addition & 1 deletion doc/book/leaf.html
@@ -109,7 +109,7 @@ <h1>Leaf - Machine Learning for Hackers</h1>
classical, stochastic or hybrids, and solvers for executing and optimizing the
model.</p>
<p>This is already the entire API for machine learning with Leaf. To learn how
this is possible and how to build machine learning applications, refer to chapters
<a href="./layers.html">2. Layers</a> and <a href="./solvers.html">3. Solvers</a>. Enjoy!</p>
<h2>Benefits+</h2>
<p>Leaf was built with three concepts in mind: accessibility/simplicity,
