Merge branch 'main' into add-datasets-ci-tests
tanertopal authored Sep 15, 2023
2 parents e11bf89 + 1cef0d4 commit 9cc92e1
Showing 34 changed files with 240 additions and 198 deletions.
1 change: 1 addition & 0 deletions baselines/doc/source/_templates/base.html
@@ -5,6 +5,7 @@
<meta charset="utf-8"/>
<meta name="viewport" content="width=device-width,initial-scale=1"/>
<meta name="color-scheme" content="light dark">
<link rel="canonical" href="https://flower.dev/docs/baselines/{{ pagename }}.html">

{%- if metatags %}{{ metatags }}{% endif -%}

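Editorial aside (not part of the commit): Sphinx exposes the current document name to HTML templates as the Jinja variable `pagename`, which is what makes this one-line canonical link work per page. A minimal sketch of how the added line renders, using a placeholder page name:

```python
# Illustration only: render the added template line for a sample page.
# The page name "how-to-use-baselines" is just a placeholder.
from jinja2 import Template

canonical = Template(
    '<link rel="canonical" '
    'href="https://flower.dev/docs/baselines/{{ pagename }}.html">'
)
print(canonical.render(pagename="how-to-use-baselines"))
# <link rel="canonical" href="https://flower.dev/docs/baselines/how-to-use-baselines.html">
```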
4 changes: 2 additions & 2 deletions datasets/flwr_datasets/partitioner/iid_partitioner_test.py
@@ -56,8 +56,8 @@ def test_load_partition_size(self, num_partitions: int, num_rows: int) -> None:
Only the correct data is tested in this method.
In case the dataset is divisible by `num_partitions`, the size of each
partition should be the same. This checks that the randomly chosen partition
has the expected size.
partition should be the same. This checks that the randomly chosen partition has
the expected size.
"""
_, partitioner = _dummy_setup(num_partitions, num_rows)
partition_size = num_rows // num_partitions
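Editorial note on the docstring above: the property being tested is that, when `num_rows` is a multiple of `num_partitions`, every partition has exactly `num_rows // num_partitions` rows. A minimal sketch of that check, assuming the `IidPartitioner` API this test exercises (a `dataset` setter and `load_partition`):

```python
from datasets import Dataset
from flwr_datasets.partitioner import IidPartitioner

num_partitions, num_rows = 10, 100
dataset = Dataset.from_dict({"features": list(range(num_rows))})

partitioner = IidPartitioner(num_partitions=num_partitions)
partitioner.dataset = dataset  # mirrors what the test's _dummy_setup helper does
expected_size = num_rows // num_partitions  # 10 rows per partition

for partition_id in range(num_partitions):
    assert len(partitioner.load_partition(partition_id)) == expected_size
```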
10 changes: 5 additions & 5 deletions doc/locales/fr/LC_MESSAGES/framework-docs.po
@@ -5050,15 +5050,15 @@ msgstr ""
msgid ""
"This can be achieved by customizing an existing strategy or by "
"`implementing a custom strategy from scratch <https://flower.dev/docs"
"/implementing-strategies.html>`_. Here's a nonsensical example that "
"/how-to-implement-strategies.html>`_. Here's a nonsensical example that "
"customizes :code:`FedAvg` by adding a custom ``\"hello\": \"world\"`` "
"configuration key/value pair to the config dict of a *single client* "
"(only the first client in the list, the other clients in this round to "
"not receive this \"special\" config value):"
msgstr ""
"Ceci peut être réalisé en personnalisant une stratégie existante ou en "
"`mettant en œuvre une stratégie personnalisée à partir de zéro "
"<https://flower.dev/docs/implementing-strategies.html>`_. Voici un "
"<https://flower.dev/docs/framework/how-to-implement-strategies.html>`_. Voici un "
"exemple absurde qui personnalise :code:`FedAvg` en ajoutant une paire "
"clé/valeur de configuration personnalisée ``\"hello\" : \"world\"`` au "
"config dict d'un *seul client* (uniquement le premier client de la liste,"
@@ -7087,7 +7087,7 @@ msgid ""
msgstr ""
"L'écriture d'une stratégie entièrement personnalisée est un peu plus "
"complexe, mais c'est celle qui offre le plus de souplesse. Lis le guide "
"`Implémentation des stratégies <implementing-strategies.html>`_ pour en "
"`Implémentation des stratégies <how-to-implement-strategies.html>`_ pour en "
"savoir plus."

#: ../../source/index.rst:31
@@ -10068,12 +10068,12 @@ msgstr ""
#: ../../source/ref-changelog.md:517
msgid ""
"New documentation for [implementing strategies](https://flower.dev/docs"
"/implementing-strategies.html) "
"/how-to-implement-strategies.html) "
"([#1097](https://github.com/adap/flower/pull/1097), "
"[#1175](https://github.com/adap/flower/pull/1175))"
msgstr ""
"Nouvelle documentation pour [mettre en œuvre des "
"stratégies](https://flower.dev/docs/implementing-strategies.html) "
"stratégies](https://flower.dev/docs/framework/how-to-implement-strategies.html) "
"([#1097](https://github.com/adap/flower/pull/1097), "
"[#1175](https://github.com/adap/flower/pull/1175))"

Binary file modified doc/source/_static/tutorial/flower-any.jpeg
20 changes: 10 additions & 10 deletions doc/source/conf.py
@@ -122,18 +122,18 @@
# Renamed pages
"installation": "how-to-install-flower.html",
"configuring-clients.html": "how-to-configure-clients.html",
"quickstart_mxnet": "quickstart-mxnet.html",
"quickstart_pytorch_lightning": "quickstart-pytorch-lightning.html",
"quickstart_huggingface": "quickstart-huggingface.html",
"quickstart_pytorch": "quickstart-pytorch.html",
"quickstart_tensorflow": "quickstart-tensorflow.html",
"quickstart_scikitlearn": "quickstart-scikitlearn.html",
"quickstart_xgboost": "quickstart-xgboost.html",
"quickstart_mxnet": "tutorial-quickstart-mxnet.html",
"quickstart_pytorch_lightning": "tutorial-quickstart-pytorch-lightning.html",
"quickstart_huggingface": "tutorial-quickstart-huggingface.html",
"quickstart_pytorch": "tutorial-quickstart-pytorch.html",
"quickstart_tensorflow": "tutorial-quickstart-tensorflow.html",
"quickstart_scikitlearn": "tutorial-quickstart-scikitlearn.html",
"quickstart_xgboost": "tutorial-quickstart-xgboost.html",
"example_walkthrough_pytorch_mnist": "example-walkthrough-pytorch-mnist.html",
"release_process": "release-process.html",
"release_process": "contributor-how-to-release-flower.html",
"saving-progress": "how-to-save-and-load-model-checkpoints.html",
"writing-documentation": "write-documentation.html",
"apiref-binaries": "apiref-cli.html",
"writing-documentation": "contributor-how-to-write-documentation.html",
"apiref-binaries": "ref-api-cli.html",
"fedbn-example-pytorch-from-centralized-to-federated": "example-fedbn-pytorch-from-centralized-to-federated.html",
# Restructuring: tutorials
"tutorial/Flower-0-What-is-FL": "tutorial-series-what-is-federated-learning.html",
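The dictionary above only maps old page names to new targets; the code that turns the mapping into actual redirects lies outside this hunk. A hedged sketch of one common way to consume such a mapping, not necessarily what Flower's `conf.py` does:

```python
# Hedged sketch: write a meta-refresh stub page for every renamed document
# at the end of the Sphinx build. RENAMED_PAGES stands in for the mapping
# shown above; the hook and its wiring are assumptions, not Flower's code.
import os

RENAMED_PAGES = {
    "installation": "how-to-install-flower.html",
    "quickstart_pytorch": "tutorial-quickstart-pytorch.html",
}

STUB = '<html><head><meta http-equiv="refresh" content="0; url={target}"></head></html>'


def emit_redirect_stubs(app, exception):
    if exception is not None:
        return
    for old_name, target in RENAMED_PAGES.items():
        path = os.path.join(app.outdir, f"{old_name}.html")
        with open(path, "w", encoding="utf-8") as fh:
            fh.write(STUB.format(target=target))


def setup(app):
    app.connect("build-finished", emit_redirect_stubs)
```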
@@ -3,11 +3,11 @@ Example: FedBN in PyTorch - From Centralized To Federated

This tutorial will show you how to use Flower to build a federated version of an existing machine learning workload with `FedBN <https://github.com/med-air/FedBN>`_, a federated training strategy designed for non-iid data.
We are using PyTorch to train a Convolutional Neural Network (with Batch Normalization layers) on the CIFAR-10 dataset.
When applying FedBN, only a few changes are needed compared to `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/example-pytorch-from-centralized-to-federated.html>`_.
When applying FedBN, only a few changes are needed compared to `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/examples/pytorch-from-centralized-to-federated.html>`_.

Centralized Training
--------------------
All files are revised based on `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/example-pytorch-from-centralized-to-federated.html>`_.
All files are revised based on `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/examples/pytorch-from-centralized-to-federated.html>`_.
The only thing to do is to modify the file called :code:`cifar.py`; the revised part is shown below:

The model architecture defined in class Net() is extended with Batch Normalization layers accordingly.
@@ -50,8 +50,8 @@ Let's take the next step and use what we've built to create a federated learning
Federated Training
------------------

If you have read `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/example-pytorch-from-centralized-to-federated.html>`_, the following parts are easy to follow; only the :code:`get_parameters` and :code:`set_parameters` functions in :code:`client.py` need to be revised.
If not, please read `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/example-pytorch-from-centralized-to-federated.html>`_ first.
If you have read `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/examples/pytorch-from-centralized-to-federated.html>`_, the following parts are easy to follow; only the :code:`get_parameters` and :code:`set_parameters` functions in :code:`client.py` need to be revised.
If not, please read `Example: PyTorch - From Centralized To Federated <https://flower.dev/docs/examples/pytorch-from-centralized-to-federated.html>`_ first.

Our example consists of one *server* and two *clients*. In FedBN, :code:`server.py` remains unchanged, so we can start the server directly.

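For context on the `get_parameters`/`set_parameters` revision mentioned above: in FedBN, batch-normalization parameters stay on each client and are simply excluded from what is exchanged with the server. A minimal sketch of that idea, assuming a PyTorch model whose BN parameter names contain `"bn"` (a paraphrase, not the example's exact code):

```python
from collections import OrderedDict

import torch


def get_parameters(model):
    # FedBN: send everything except batch-norm parameters to the server.
    return [
        val.cpu().numpy()
        for name, val in model.state_dict().items()
        if "bn" not in name
    ]


def set_parameters(model, parameters):
    # Load only the non-BN entries; BN layers keep their local state.
    keys = [k for k in model.state_dict().keys() if "bn" not in k]
    state_dict = OrderedDict(
        {k: torch.tensor(v) for k, v in zip(keys, parameters)}
    )
    model.load_state_dict(state_dict, strict=False)
```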
2 changes: 1 addition & 1 deletion doc/source/how-to-configure-clients.rst
@@ -86,7 +86,7 @@ Configuring individual clients

In some cases, it is necessary to send different configuration values to different clients.

This can be achieved by customizing an existing strategy or by `implementing a custom strategy from scratch <https://flower.dev/docs/implementing-strategies.html>`_. Here's a nonsensical example that customizes :code:`FedAvg` by adding a custom ``"hello": "world"`` configuration key/value pair to the config dict of a *single client* (only the first client in the list, the other clients in this round do not receive this "special" config value):
This can be achieved by customizing an existing strategy or by `implementing a custom strategy from scratch <https://flower.dev/docs/framework/how-to-implement-strategies.html>`_. Here's a nonsensical example that customizes :code:`FedAvg` by adding a custom ``"hello": "world"`` configuration key/value pair to the config dict of a *single client* (only the first client in the list, the other clients in this round do not receive this "special" config value):

.. code-block:: python
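The concrete snippet that follows this paragraph is truncated in this view, but the usual pattern is to subclass :code:`FedAvg` and adjust the instructions returned by :code:`configure_fit`. A minimal sketch under the Flower 1.x strategy interface (not necessarily identical to the documented example):

```python
from flwr.server.strategy import FedAvg


class SingleClientConfigFedAvg(FedAvg):
    def configure_fit(self, server_round, parameters, client_manager):
        # Reuse FedAvg's client selection and instruction building.
        client_instructions = super().configure_fit(
            server_round, parameters, client_manager
        )
        if client_instructions:
            # Only the first selected client receives the extra key/value pair.
            _, fit_ins = client_instructions[0]
            fit_ins.config["hello"] = "world"
        return client_instructions
```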
2 changes: 1 addition & 1 deletion doc/source/how-to-use-strategies.rst
@@ -86,4 +86,4 @@ Server-side evaluation can be enabled by passing an evaluation function to :code:
Implement a novel strategy
--------------------------

Writing a fully custom strategy is a bit more involved, but it provides the most flexibility. Read the `Implementing Strategies <implementing-strategies.html>`_ guide to learn more.
Writing a fully custom strategy is a bit more involved, but it provides the most flexibility. Read the `Implementing Strategies <how-to-implement-strategies.html>`_ guide to learn more.
3 changes: 3 additions & 0 deletions doc/source/index.rst
@@ -1,6 +1,9 @@
Flower Framework Documentation
==============================

.. meta::
:description: Check out the documentation of the main Flower Framework enabling easy Python development for Federated Learning.

Welcome to Flower's documentation. `Flower <https://flower.dev>`_ is a friendly federated learning framework.


4 changes: 3 additions & 1 deletion doc/source/ref-changelog.md
@@ -2,6 +2,8 @@

## Unreleased

- **Support custom** `ClientManager` **in** `start_driver()` ([#2292](https://github.com/adap/flower/pull/2292))

- **Update REST API to support create and delete nodes** ([#2283](https://github.com/adap/flower/pull/2283))

### What's new?
@@ -536,7 +538,7 @@ We would like to give our **special thanks** to all the contributors who made Fl

- New option to keep Ray running if Ray was already initialized in `start_simulation` ([#1177](https://github.com/adap/flower/pull/1177))
- Add support for custom `ClientManager` as a `start_simulation` parameter ([#1171](https://github.com/adap/flower/pull/1171))
- New documentation for [implementing strategies](https://flower.dev/docs/implementing-strategies.html) ([#1097](https://github.com/adap/flower/pull/1097), [#1175](https://github.com/adap/flower/pull/1175))
- New documentation for [implementing strategies](https://flower.dev/docs/framework/how-to-implement-strategies.html) ([#1097](https://github.com/adap/flower/pull/1097), [#1175](https://github.com/adap/flower/pull/1175))
- New mobile-friendly documentation theme ([#1174](https://github.com/adap/flower/pull/1174))
- Limit version range for (optional) `ray` dependency to include only compatible releases (`>=1.9.2,<1.12.0`) ([#1205](https://github.com/adap/flower/pull/1205))

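Relating to the two `ClientManager` entries above, both changes let users plug their own client-selection logic into Flower. A hedged sketch of the `start_simulation` variant; the `client_manager` keyword is inferred from the changelog entry and may not match the exact signature:

```python
import flwr as fl
from flwr.server.client_manager import SimpleClientManager


class LoggingClientManager(SimpleClientManager):
    def sample(self, num_clients, min_num_clients=None, criterion=None):
        print(f"Sampling {num_clients} client(s)")
        return super().sample(num_clients, min_num_clients, criterion)


class TrivialClient(fl.client.NumPyClient):
    # Placeholder client used only to make the sketch self-contained.
    def get_parameters(self, config):
        return []

    def fit(self, parameters, config):
        return [], 1, {}

    def evaluate(self, parameters, config):
        return 0.0, 1, {}


def client_fn(cid: str):
    return TrivialClient()


history = fl.simulation.start_simulation(
    client_fn=client_fn,
    num_clients=2,
    client_manager=LoggingClientManager(),  # assumed keyword argument
    config=fl.server.ServerConfig(num_rounds=1),
)
```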
6 changes: 3 additions & 3 deletions doc/source/ref-example-projects.rst
@@ -23,7 +23,7 @@ The TensorFlow/Keras quickstart example shows CIFAR-10 image classification
with MobileNetV2:

- `Quickstart TensorFlow (Code) <https://github.com/adap/flower/tree/main/examples/quickstart-tensorflow>`_
- `Quickstart TensorFlow (Tutorial) <https://flower.dev/docs/quickstart-tensorflow.html>`_
- `Quickstart TensorFlow (Tutorial) <https://flower.dev/docs/framework/tutorial-quickstart-tensorflow.html>`_
- `Quickstart TensorFlow (Blog Post) <https://flower.dev/blog/2020-12-11-federated-learning-in-less-than-20-lines-of-code>`_


@@ -34,7 +34,7 @@ The PyTorch quickstart example shows CIFAR-10 image classification
with a simple Convolutional Neural Network:

- `Quickstart PyTorch (Code) <https://github.com/adap/flower/tree/main/examples/quickstart-pytorch>`_
- `Quickstart PyTorch (Tutorial) <https://flower.dev/docs/quickstart-pytorch.html>`_
- `Quickstart PyTorch (Tutorial) <https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html>`_


PyTorch: From Centralized To Federated
@@ -43,7 +43,7 @@ PyTorch: From Centralized To Federated
This example shows how a regular PyTorch project can be federated using Flower:

- `PyTorch: From Centralized To Federated (Code) <https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated>`_
- `PyTorch: From Centralized To Federated (Tutorial) <https://flower.dev/docs/example-pytorch-from-centralized-to-federated.html>`_
- `PyTorch: From Centralized To Federated (Tutorial) <https://flower.dev/docs/framework/example-pytorch-from-centralized-to-federated.html>`_


Federated Learning on Raspberry Pi and Nvidia Jetson
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-android.rst
@@ -4,6 +4,9 @@
Quickstart Android
==================

.. meta::
:description: Read this Federated Learning quickstart tutorial for creating an Android app using Flower.

Let's build a federated learning system using TFLite and Flower on Android!

Please refer to the `full code example <https://github.com/adap/flower/tree/main/examples/android>`_ to learn more.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-fastai.rst
@@ -4,6 +4,9 @@
Quickstart fastai
=================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with FastAI to train a vision model on CIFAR-10.

Let's build a federated learning system using fastai and Flower!

Please refer to the `full code example <https://github.com/adap/flower/tree/main/examples/quickstart-fastai>`_ to learn more.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-huggingface.rst
@@ -4,6 +4,9 @@
Quickstart 🤗 Transformers
==========================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with HuggingFace Transformers in order to fine-tune an LLM.

Let's build a federated learning system using Hugging Face Transformers and Flower!

We will leverage Hugging Face to federate the training of language models over multiple clients using Flower.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-ios.rst
@@ -4,6 +4,9 @@
Quickstart iOS
==============

.. meta::
:description: Read this Federated Learning quickstart tutorial for creating an iOS app using Flower to train a neural network on MNIST.

In this tutorial we will learn how to train a Neural Network on MNIST using Flower and CoreML on iOS devices.

First of all, for running the Flower Python server, it is recommended to create a virtual environment and run everything within a `virtualenv <https://flower.dev/docs/recommended-env-setup.html>`_.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-jax.rst
@@ -4,6 +4,9 @@
Quickstart JAX
==============

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with JAX to train a linear regression model on a scikit-learn dataset.

This tutorial will show you how to use Flower to build a federated version of an existing JAX workload.
We are using JAX to train a linear regression model on a scikit-learn dataset.
We will structure the example similar to our `PyTorch - From Centralized To Federated <https://github.com/adap/flower/blob/main/examples/pytorch-from-centralized-to-federated>`_ walkthrough.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-mxnet.rst
@@ -4,6 +4,9 @@
Quickstart MXNet
================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with MXNet to train a Sequential model on MNIST.

In this tutorial, we will learn how to train a :code:`Sequential` model on MNIST using Flower and MXNet.

It is recommended to create a virtual environment and run everything within this `virtualenv <https://flower.dev/docs/recommended-env-setup.html>`_.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-pandas.rst
@@ -4,6 +4,9 @@
Quickstart Pandas
=================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with Pandas to perform Federated Analytics.

Let's build a federated analytics system using Pandas and Flower!

Please refer to the `full code example <https://github.com/adap/flower/tree/main/examples/quickstart-pandas>`_ to learn more.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-pytorch-lightning.rst
@@ -4,6 +4,9 @@
Quickstart PyTorch Lightning
============================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with PyTorch Lightning to train an Auto Encoder model on MNIST.

Let's build a federated learning system using PyTorch Lightning and Flower!

Please refer to the `full code example <https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning>`_ to learn more.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-pytorch.rst
@@ -4,6 +4,9 @@
Quickstart PyTorch
==================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with PyTorch to train a CNN model on MNIST.

.. youtube:: jOmmuzMIQ4c
:width: 100%

3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-scikitlearn.rst
@@ -4,6 +4,9 @@
Quickstart scikit-learn
=======================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with scikit-learn to train a logistic regression model.

In this tutorial, we will learn how to train a :code:`Logistic Regression` model on MNIST using Flower and scikit-learn.

It is recommended to create a virtual environment and run everything within this `virtualenv <https://flower.dev/docs/recommended-env-setup.html>`_.
3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-tensorflow.rst
@@ -4,6 +4,9 @@
Quickstart TensorFlow
=====================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with TensorFlow to train a MobileNetV2 model on CIFAR-10.

.. youtube:: FGTc2TQq7VM
:width: 100%

3 changes: 3 additions & 0 deletions doc/source/tutorial-quickstart-xgboost.rst
@@ -4,6 +4,9 @@
Quickstart XGBoost
==================

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with XGBoost to train tree-based classification models.

Let's build a horizontal federated learning system using XGBoost and Flower!

Please refer to the `full code example <https://github.com/adap/flower/tree/main/examples/quickstart-xgboost-horizontal>`_ to learn more.
1 change: 1 addition & 0 deletions examples/doc/source/_templates/base.html
@@ -5,6 +5,7 @@
<meta charset="utf-8"/>
<meta name="viewport" content="width=device-width,initial-scale=1"/>
<meta name="color-scheme" content="light dark">
<link rel="canonical" href="https://flower.dev/docs/examples/{{ pagename }}.html">

{%- if metatags %}{{ metatags }}{% endif -%}

Binary file modified examples/embedded-devices/_static/tmux_jtop_view.gif
Binary file removed examples/embedded-devices/media/diagram.png
Binary file removed examples/embedded-devices/media/tmux_jtop_view.gif
2 changes: 1 addition & 1 deletion examples/quickstart-fastai/README.md
@@ -71,4 +71,4 @@ Start client 2 in the second terminal:
python3 client.py
```

You will see that fastai is starting a federated training. Have a look to the [Flower Quickstarter documentation](https://flower.dev/docs/quickstart-fastai.html) for a detailed explanation.
You will see that fastai is starting a federated training. For a more in-depth look, be sure to check out the code on our [repo](https://github.com/adap/flower/tree/main/examples/quickstart-fastai).
2 changes: 1 addition & 1 deletion examples/quickstart-huggingface/README.md
@@ -1,6 +1,6 @@
# Federated HuggingFace Transformers using Flower and PyTorch

This is an introductory example of using [HuggingFace](https://huggingface.co) Transformers with Flower and PyTorch. This example has been extended from the [quickstart-pytorch](https://flower.dev/docs/quickstart-pytorch.html) example. The training script closely follows the [HuggingFace course](https://huggingface.co/course/chapter3?fw=pt), so you are encouraged to check that out for a detailed explanation of the transformer pipeline.
This is an introductory example of using [HuggingFace](https://huggingface.co) Transformers with Flower and PyTorch. This example has been extended from the [quickstart-pytorch](https://flower.dev/docs/examples/quickstart-pytorch.html) example. The training script closely follows the [HuggingFace course](https://huggingface.co/course/chapter3?fw=pt), so you are encouraged to check that out for a detailed explanation of the transformer pipeline.

Like `quickstart-pytorch`, running this example in itself is also meant to be quite easy.

2 changes: 1 addition & 1 deletion examples/quickstart-pytorch/README.md
@@ -70,4 +70,4 @@ Start client 2 in the second terminal:
python3 client.py
```

You will see that PyTorch is starting a federated training. Have a look to the [Flower Quickstarter documentation](https://flower.dev/docs/quickstart-pytorch.html) for a detailed explanation.
You will see that PyTorch is starting a federated training. Look at the [code](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch) for a detailed explanation.