Flower (`flwr`) is a framework for building federated learning systems. The
design of Flower is based on a few guiding principles:

- **Customizable**: Federated learning systems vary wildly from one use case to
another. Flower allows for a wide range of different configurations depending
on the needs of each individual use case.

- **Extendable**: Flower originated from a research project at the University of
Oxford, so it was built with AI research in mind. Many components can be
extended and overridden to build new state-of-the-art systems.

- **Framework-agnostic**: Different machine learning frameworks have different
strengths. Flower can be used with any machine learning framework, for
example, [PyTorch](https://pytorch.org),
  [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/)
  for users who enjoy computing gradients by hand (a minimal client sketch follows this list).

- **Understandable**: Flower is written with maintainability in mind. The
community is encouraged to both read and contribute to the codebase.
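
As a minimal, hedged illustration of the framework-agnostic point above, the sketch below wraps plain NumPy arrays in a Flower client. It assumes the `flwr` 1.x `NumPyClient` API; the three-element weight vector, the toy update rule, and the local server address are illustrative placeholders, not part of the original README.

```python
import flwr as fl
import numpy as np


class NumpyOnlyClient(fl.client.NumPyClient):
    """Toy client that "trains" a 3-element weight vector by hand."""

    def __init__(self):
        self.weights = np.zeros(3)

    def get_parameters(self, config):
        # Return the current model parameters as a list of NumPy arrays.
        return [self.weights]

    def fit(self, parameters, config):
        # Receive the global parameters, apply a hand-written update step,
        # and return the new parameters plus the number of local examples.
        self.weights = parameters[0] - 0.1 * np.ones_like(parameters[0])
        return [self.weights], 1, {}

    def evaluate(self, parameters, config):
        # Report a toy loss so the server can aggregate evaluation results.
        loss = float(np.linalg.norm(parameters[0]))
        return loss, 1, {}


if __name__ == "__main__":
    # Connect to a Flower server assumed to be running locally.
    fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=NumpyOnlyClient())
```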

Meet the Flower community on [flower.dev](https://flower.dev)!
## Flower Tutorials

Flower's goal is to make federated learning accessible to everyone. This series of tutorials introduces the fundamentals of federated learning and how to implement them in Flower.

2. **Using Strategies in Federated Learning**

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-use-a-federated-learning-strategy-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-use-a-federated-learning-strategy-pytorch.ipynb))

3. **Building Strategies for Federated Learning**

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb))

4. **Custom Clients for Federated Learning**

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb))
Stay tuned, more tutorials are coming soon. Topics include **Privacy and Security in Federated Learning** and **Scaling Federated Learning**. A hedged sketch of how a strategy is plugged into a Flower server follows the notebook link below.

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb))
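
The "Using Strategies" and "Building Strategies" tutorials above revolve around the server side of Flower. As a rough sketch, assuming the `flwr` 1.x server API (the sampling fraction, client counts, and number of rounds are illustrative values, not taken from the tutorials), a strategy such as FedAvg is configured and handed to the server like this:

```python
import flwr as fl

# Configure how clients are sampled and how their updates are aggregated.
strategy = fl.server.strategy.FedAvg(
    fraction_fit=0.5,         # sample 50% of connected clients each round
    min_fit_clients=2,        # never train with fewer than 2 clients
    min_available_clients=2,  # wait until at least 2 clients are connected
)

# Run three federated rounds with the chosen strategy.
fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=3),
    strategy=strategy,
)
```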


## Documentation

[Flower Docs](https://flower.dev/docs):

- [Installation](https://flower.dev/docs/framework/how-to-install-flower.html)
- [Quickstart (TensorFlow)](https://flower.dev/docs/framework/tutorial-quickstart-tensorflow.html)
- [Quickstart (PyTorch)](https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html)
- [Quickstart (Hugging Face)](https://flower.dev/docs/framework/tutorial-quickstart-huggingface.html)
- [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/framework/tutorial-quickstart-pytorch-lightning.html)
- [Quickstart (MXNet)](https://flower.dev/docs/framework/example-mxnet-walk-through.html)
- [Quickstart (Pandas)](https://flower.dev/docs/framework/tutorial-quickstart-pandas.html)
- [Quickstart (fastai)](https://flower.dev/docs/framework/tutorial-quickstart-fastai.html)
- [Quickstart (JAX)](https://flower.dev/docs/framework/tutorial-quickstart-jax.html)
- [Quickstart (scikit-learn)](https://flower.dev/docs/framework/tutorial-quickstart-scikitlearn.html)
- [Quickstart (Android [TFLite])](https://flower.dev/docs/framework/tutorial-quickstart-android.html)
- [Quickstart (iOS [CoreML])](https://flower.dev/docs/framework/tutorial-quickstart-ios.html)

## Flower Baselines

Flower Baselines is a collection of community-contributed implementations that reproduce the experiments from popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas:

- [FedAvg](https://arxiv.org/abs/1602.05629):
- [MNIST](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedavg_mnist)
- [FedProx](https://arxiv.org/abs/1812.06127):
- [MNIST](https://github.com/adap/flower/tree/main/baselines/fedprox/)
- [FedBN: Federated Learning on non-IID Features via Local Batch Normalization](https://arxiv.org/abs/2102.07623):
- [Convergence Rate](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedbn/convergence_rate)
- [Adaptive Federated Optimization](https://arxiv.org/abs/2003.00295):
- [CIFAR-10/100](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/adaptive_federated_optimization)

Check the Flower documentation to learn more: [Using Baselines](https://flower.dev/docs/baselines/how-to-use-baselines.html)

The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline: [Contributing Baselines](https://flower.dev/docs/baselines/how-to-contribute-baselines.html)

## Flower Usage Examples

Several code examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow).

Quickstart examples:

- [Quickstart (TensorFlow)](https://github.com/adap/flower/tree/main/examples/quickstart-tensorflow)
- [Quickstart (PyTorch)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch)
- [Quickstart (Hugging Face)](https://github.com/adap/flower/tree/main/examples/quickstart-huggingface)
- [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning)
- [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai)
- [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas)
- [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart-mxnet)
- [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax)
- [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
- [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android)
- [Quickstart (iOS [CoreML])](https://github.com/adap/flower/tree/main/examples/ios)

Other [examples](https://github.com/adap/flower/tree/main/examples):

- [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices)
- [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated)
- [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet-from-centralized-to-federated)
- [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow)
- [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch)
- Single-Machine Simulation of Federated Learning Systems ([PyTorch](https://github.com/adap/flower/tree/main/examples/simulation_pytorch), [TensorFlow](https://github.com/adap/flower/tree/main/examples/simulation_tensorflow)); a minimal simulation sketch follows this list
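
The simulation examples linked above run an entire federation on one machine. Below is a minimal, hedged sketch assuming the `flwr` 1.x simulation API; the toy client, client count, and round count are placeholders, not taken from the linked examples.

```python
import flwr as fl
import numpy as np


class ToyClient(fl.client.NumPyClient):
    """Minimal stand-in client so the sketch is self-contained."""

    def get_parameters(self, config):
        return [np.zeros(3)]

    def fit(self, parameters, config):
        return parameters, 1, {}

    def evaluate(self, parameters, config):
        return 0.0, 1, {}


def client_fn(cid: str):
    # Create a fresh virtual client for client id `cid`.
    return ToyClient()


# Simulate 10 virtual clients for 3 federated rounds on a single machine.
# Requires the simulation extra: pip install "flwr[simulation]"
fl.simulation.start_simulation(
    client_fn=client_fn,
    num_clients=10,
    config=fl.server.ServerConfig(num_rounds=3),
    strategy=fl.server.strategy.FedAvg(fraction_fit=0.2),
)
```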

## Community

Flower is built by a wonderful community of researchers and engineers. [Join Slack](https://flower.dev/join-slack) to get in touch, ask questions, and get help.

## Citation

If you publish work that uses Flower, please cite Flower as follows:

```bibtex
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Fernandez-Marques, Javier and Gao, Yan and Sani, Lorenzo and Kwing, Hei Li and Parcollet, Titouan and Gusmão, Pedro PB de and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
```
