diff --git a/README.md b/README.md
index 6c8396b..4e9d712 100644
--- a/README.md
+++ b/README.md
@@ -34,9 +34,10 @@ And, therefore relieves the network designer of having to commit to a potentiall
 We support ***MxNet, Keras, and PyTorch***. Instructions for MxNet can be found [here](rational/mxnet). Instructions for Keras [here](rational/keras).
 The following README instructions **assume that you want to use rational activations in PyTorch.**
-    PyTorch>=1.4.0
-    CUDA>=10.2
-
+```
+PyTorch>=1.4.0
+CUDA>=10.2
+```
 
 ## 3. Installation
 
@@ -47,20 +48,23 @@ To install the rational_activations module, you can use pip, but:
 For non TensorFlow and MXNet users, or **if the command below doesn't work** or the package listed below doesn't work on your machine:
 
 #### TensorFlow or MXNet (and `torch==1.9.0`)
-    pip3 install -U pip wheel
-    pip3 install torch rational_activations
+```shell
+pip3 install -U pip wheel
+pip3 install torch rational_activations
+```
 
 #### Other CUDA/Pytorch
 For any other torch version, please install from source:
 Modify requirements.txt to your corresponding torch version
-    pip3 install airspeed # to compile the CUDA templates
-    git clone https://github.com/ml-research/rational_activations.git
-    cd rational_activations
-    pip3 install -r requirements.txt --user
-    python3 setup.py install --user
-
+```shell
+pip3 install airspeed # to compile the CUDA templates
+git clone https://github.com/ml-research/rational_activations.git
+cd rational_activations
+pip3 install -r requirements.txt --user
+python3 setup.py install --user
+```
 
 If you encounter any trouble installing rational, please contact [this person](quentin.delfosse@cs.tu-darmstadt.de).
@@ -68,7 +72,7 @@ If you encounter any trouble installing rational, please contact [this person](q
 
 Rational can be integrated in the same way as any other common activation function.
 
-~~~~
+```python
 import torch
 from rational.torch import Rational
 
@@ -77,12 +81,13 @@ model = torch.nn.Sequential(
     Rational(), # e.g. instead of torch.nn.ReLU()
     torch.nn.Linear(H, D_out),
 )
-~~~~
+```
 
 Please also check [the documentation :notebook_with_decorative_cover: ](https://rational-activations.readthedocs.io/en/latest/)
 
 ## 5. Cite Us in your paper
-```
+
+```bibtex
 @inproceedings{molina2019pade,
   title={Pad{\'e} Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks},
   author={Molina, Alejandro and Schramowski, Patrick and Kersting, Kristian},
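
For reference, a minimal runnable sketch of the PyTorch usage snippet touched by the diff above; the layer sizes `D_in`, `H`, `D_out` and the dummy input batch are illustrative values chosen here, not taken from the README:

```python
import torch
from rational.torch import Rational

# Illustrative dimensions (not from the README): 64-dim input, 128 hidden units, 10 outputs.
D_in, H, D_out = 64, 128, 10

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    Rational(),  # learnable rational activation, e.g. instead of torch.nn.ReLU()
    torch.nn.Linear(H, D_out),
)

x = torch.randn(8, D_in)  # dummy batch of 8 samples
y = model(x)
print(y.shape)  # expected: torch.Size([8, 10])
```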