diff --git a/.devcontainer/Dockerfile b/.devcontainer/Dockerfile
index 2cf82414df56..016c12af2426 100644
--- a/.devcontainer/Dockerfile
+++ b/.devcontainer/Dockerfile
@@ -29,8 +29,8 @@ RUN apt install -y curl wget gnupg python3 python-is-python3 python3-pip git \
build-essential tmux vim
RUN python -m pip install \
- pip==23.1.2 \
- setuptools==68.0.0 \
+ pip==23.3.1 \
+ setuptools==68.2.2 \
poetry==1.5.1
USER $USERNAME
diff --git a/.github/actions/bootstrap/action.yml b/.github/actions/bootstrap/action.yml
index 3865cad1def6..584ae2634d9e 100644
--- a/.github/actions/bootstrap/action.yml
+++ b/.github/actions/bootstrap/action.yml
@@ -6,10 +6,10 @@ inputs:
default: 3.8
pip-version:
description: "Version of pip to be installed using pip"
- default: 23.1.2
+ default: 23.3.1
setuptools-version:
description: "Version of setuptools to be installed using pip"
- default: 68.0.0
+ default: 68.2.2
poetry-version:
description: "Version of poetry to be installed using pip"
default: 1.5.1
diff --git a/.github/workflows/cpp.yml b/.github/workflows/cpp.yml
index 16cd672ef034..35fe9813329e 100644
--- a/.github/workflows/cpp.yml
+++ b/.github/workflows/cpp.yml
@@ -35,9 +35,14 @@ jobs:
sudo apt-get update
sudo apt-get install -y clang-format cmake g++ clang-tidy cppcheck
- - name: Check Formatting
+ - name: Check Source Formatting
run: |
- find src/cc/flwr -name '*.cc' -or -name '*.h' | xargs clang-format -i
+ find src/cc/flwr/src -name '*.cc' | xargs clang-format -i
+ git diff --exit-code
+
+ - name: Check Header Formatting
+ run: |
+ find src/cc/flwr/include -name '*.h' -not -path "src/cc/flwr/include/flwr/*" | xargs clang-format -i
git diff --exit-code
- name: Build
diff --git a/README.md b/README.md
index efed9b0e477e..002d16066e78 100644
--- a/README.md
+++ b/README.md
@@ -23,22 +23,21 @@
Flower (`flwr`) is a framework for building federated learning systems. The
design of Flower is based on a few guiding principles:
-* **Customizable**: Federated learning systems vary wildly from one use case to
+- **Customizable**: Federated learning systems vary wildly from one use case to
another. Flower allows for a wide range of different configurations depending
on the needs of each individual use case.
-* **Extendable**: Flower originated from a research project at the University of
+- **Extendable**: Flower originated from a research project at the University of
Oxford, so it was built with AI research in mind. Many components can be
extended and overridden to build new state-of-the-art systems.
-* **Framework-agnostic**: Different machine learning frameworks have different
+- **Framework-agnostic**: Different machine learning frameworks have different
strengths. Flower can be used with any machine learning framework, for
example, [PyTorch](https://pytorch.org),
- [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/
-) for federated analytics, or even raw [NumPy](https://numpy.org/)
+ [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/)
for users who enjoy computing gradients by hand.
-* **Understandable**: Flower is written with maintainability in mind. The
+- **Understandable**: Flower is written with maintainability in mind. The
community is encouraged to both read and contribute to the codebase.
Meet the Flower community on [flower.dev](https://flower.dev)!
@@ -58,11 +57,11 @@ Flower's goal is to make federated learning accessible to everyone. This series
2. **Using Strategies in Federated Learning**
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-use-a-federated-learning-strategy-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-use-a-federated-learning-strategy-pytorch.ipynb))
-
+
3. **Building Strategies for Federated Learning**
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb))
-
+
4. **Custom Clients for Federated Learning**
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb))
@@ -73,39 +72,39 @@ Stay tuned, more tutorials are coming soon. Topics include **Privacy and Securit
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb))
-
## Documentation
[Flower Docs](https://flower.dev/docs):
-* [Installation](https://flower.dev/docs/framework/how-to-install-flower.html)
-* [Quickstart (TensorFlow)](https://flower.dev/docs/framework/tutorial-quickstart-tensorflow.html)
-* [Quickstart (PyTorch)](https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html)
-* [Quickstart (Hugging Face)](https://flower.dev/docs/framework/tutorial-quickstart-huggingface.html)
-* [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/framework/tutorial-quickstart-pytorch-lightning.html)
-* [Quickstart (MXNet)](https://flower.dev/docs/framework/example-mxnet-walk-through.html)
-* [Quickstart (Pandas)](https://flower.dev/docs/framework/tutorial-quickstart-pandas.html)
-* [Quickstart (fastai)](https://flower.dev/docs/framework/tutorial-quickstart-fastai.html)
-* [Quickstart (JAX)](https://flower.dev/docs/framework/tutorial-quickstart-jax.html)
-* [Quickstart (scikit-learn)](https://flower.dev/docs/framework/tutorial-quickstart-scikitlearn.html)
-* [Quickstart (Android [TFLite])](https://flower.dev/docs/framework/tutorial-quickstart-android.html)
-* [Quickstart (iOS [CoreML])](https://flower.dev/docs/framework/tutorial-quickstart-ios.html)
+
+- [Installation](https://flower.dev/docs/framework/how-to-install-flower.html)
+- [Quickstart (TensorFlow)](https://flower.dev/docs/framework/tutorial-quickstart-tensorflow.html)
+- [Quickstart (PyTorch)](https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html)
+- [Quickstart (Hugging Face)](https://flower.dev/docs/framework/tutorial-quickstart-huggingface.html)
+- [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/framework/tutorial-quickstart-pytorch-lightning.html)
+- [Quickstart (MXNet)](https://flower.dev/docs/framework/example-mxnet-walk-through.html)
+- [Quickstart (Pandas)](https://flower.dev/docs/framework/tutorial-quickstart-pandas.html)
+- [Quickstart (fastai)](https://flower.dev/docs/framework/tutorial-quickstart-fastai.html)
+- [Quickstart (JAX)](https://flower.dev/docs/framework/tutorial-quickstart-jax.html)
+- [Quickstart (scikit-learn)](https://flower.dev/docs/framework/tutorial-quickstart-scikitlearn.html)
+- [Quickstart (Android [TFLite])](https://flower.dev/docs/framework/tutorial-quickstart-android.html)
+- [Quickstart (iOS [CoreML])](https://flower.dev/docs/framework/tutorial-quickstart-ios.html)
## Flower Baselines
Flower Baselines is a collection of community-contributed experiments that reproduce the experiments performed in popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas:
-* [FedAvg](https://arxiv.org/abs/1602.05629):
- * [MNIST](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedavg_mnist)
-* [FedProx](https://arxiv.org/abs/1812.06127):
- * [MNIST](https://github.com/adap/flower/tree/main/baselines/fedprox/)
-* [FedBN: Federated Learning on non-IID Features via Local Batch Normalization](https://arxiv.org/abs/2102.07623):
- * [Convergence Rate](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedbn/convergence_rate)
-* [Adaptive Federated Optimization](https://arxiv.org/abs/2003.00295):
- * [CIFAR-10/100](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/adaptive_federated_optimization)
+- [FedAvg](https://arxiv.org/abs/1602.05629):
+ - [MNIST](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedavg_mnist)
+- [FedProx](https://arxiv.org/abs/1812.06127):
+ - [MNIST](https://github.com/adap/flower/tree/main/baselines/fedprox/)
+- [FedBN: Federated Learning on non-IID Features via Local Batch Normalization](https://arxiv.org/abs/2102.07623):
+ - [Convergence Rate](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedbn/convergence_rate)
+- [Adaptive Federated Optimization](https://arxiv.org/abs/2003.00295):
+ - [CIFAR-10/100](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/adaptive_federated_optimization)
-Check the Flower documentation to learn more: [Using Baselines](https://flower.dev/docs/baselines/using-baselines.html)
+Check the Flower documentation to learn more: [Using Baselines](https://flower.dev/docs/baselines/how-to-use-baselines.html)
-The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline: [Contributing Baselines](https://flower.dev/docs/baselines/contributing-baselines.html)
+The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline: [Contributing Baselines](https://flower.dev/docs/baselines/how-to-contribute-baselines.html)
## Flower Usage Examples
@@ -113,26 +112,26 @@ Several code examples show different usage scenarios of Flower (in combination w
Quickstart examples:
-* [Quickstart (TensorFlow)](https://github.com/adap/flower/tree/main/examples/quickstart-tensorflow)
-* [Quickstart (PyTorch)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch)
-* [Quickstart (Hugging Face)](https://github.com/adap/flower/tree/main/examples/quickstart-huggingface)
-* [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning)
-* [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai)
-* [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas)
-* [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart-mxnet)
-* [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax)
-* [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
-* [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android)
-* [Quickstart (iOS [CoreML])](https://github.com/adap/flower/tree/main/examples/ios)
+- [Quickstart (TensorFlow)](https://github.com/adap/flower/tree/main/examples/quickstart-tensorflow)
+- [Quickstart (PyTorch)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch)
+- [Quickstart (Hugging Face)](https://github.com/adap/flower/tree/main/examples/quickstart-huggingface)
+- [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning)
+- [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai)
+- [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas)
+- [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart-mxnet)
+- [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax)
+- [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
+- [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android)
+- [Quickstart (iOS [CoreML])](https://github.com/adap/flower/tree/main/examples/ios)
Other [examples](https://github.com/adap/flower/tree/main/examples):
-* [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices)
-* [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated)
-* [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet-from-centralized-to-federated)
-* [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow)
-* [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch)
-* Single-Machine Simulation of Federated Learning Systems ([PyTorch](https://github.com/adap/flower/tree/main/examples/simulation_pytorch)) ([Tensorflow](https://github.com/adap/flower/tree/main/examples/simulation_tensorflow))
+- [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices)
+- [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated)
+- [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet-from-centralized-to-federated)
+- [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow)
+- [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch)
+- Single-Machine Simulation of Federated Learning Systems ([PyTorch](https://github.com/adap/flower/tree/main/examples/simulation_pytorch)) ([Tensorflow](https://github.com/adap/flower/tree/main/examples/simulation_tensorflow))
## Community
@@ -144,12 +143,12 @@ Flower is built by a wonderful community of researchers and engineers. [Join Sla
## Citation
-If you publish work that uses Flower, please cite Flower as follows:
+If you publish work that uses Flower, please cite Flower as follows:
```bibtex
@article{beutel2020flower,
title={Flower: A Friendly Federated Learning Research Framework},
- author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Fernandez-Marques, Javier and Gao, Yan and Sani, Lorenzo and Kwing, Hei Li and Parcollet, Titouan and Gusmão, Pedro PB de and Lane, Nicholas D},
+ author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Fernandez-Marques, Javier and Gao, Yan and Sani, Lorenzo and Kwing, Hei Li and Parcollet, Titouan and Gusmão, Pedro PB de and Lane, Nicholas D},
journal={arXiv preprint arXiv:2007.14390},
year={2020}
}
diff --git a/baselines/depthfl/.gitignore b/baselines/depthfl/.gitignore
new file mode 100644
index 000000000000..fb7448bbcb01
--- /dev/null
+++ b/baselines/depthfl/.gitignore
@@ -0,0 +1,4 @@
+dataset/
+outputs/
+prev_grads/
+multirun/
\ No newline at end of file
diff --git a/baselines/depthfl/LICENSE b/baselines/depthfl/LICENSE
new file mode 100644
index 000000000000..d64569567334
--- /dev/null
+++ b/baselines/depthfl/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/baselines/depthfl/README.md b/baselines/depthfl/README.md
new file mode 100644
index 000000000000..b8ab7ed18571
--- /dev/null
+++ b/baselines/depthfl/README.md
@@ -0,0 +1,171 @@
+---
+title: "DepthFL: Depthwise Federated Learning for Heterogeneous Clients"
+url: https://openreview.net/forum?id=pf8RIZTMU58
+labels: [image classification, system heterogeneity, cross-device, knowledge distillation]
+dataset: [CIFAR-100]
+---
+
+# DepthFL: Depthwise Federated Learning for Heterogeneous Clients
+
+> Note: If you use this baseline in your work, please remember to cite the original authors of the paper as well as the Flower paper.
+
+**Paper:** [openreview.net/forum?id=pf8RIZTMU58](https://openreview.net/forum?id=pf8RIZTMU58)
+
+**Authors:** Minjae Kim, Sangyoon Yu, Suhyun Kim, Soo-Mook Moon
+
+**Abstract:** Federated learning is for training a global model without collecting private local data from clients. As they repeatedly need to upload locally-updated weights or gradients instead, clients require both computation and communication resources enough to participate in learning, but in reality their resources are heterogeneous. To enable resource-constrained clients to train smaller local models, width scaling techniques have been used, which reduces the channels of a global model. Unfortunately, width scaling suffers from heterogeneity of local models when averaging them, leading to a lower accuracy than when simply excluding resource-constrained clients from training. This paper proposes a new approach based on depth scaling called DepthFL. DepthFL defines local models of different depths by pruning the deepest layers off the global model, and allocates them to clients depending on their available resources. Since many clients do not have enough resources to train deep local models, this would make deep layers partially-trained with insufficient data, unlike shallow layers that are fully trained. DepthFL alleviates this problem by mutual self-distillation of knowledge among the classifiers of various depths within a local model. Our experiments show that depth-scaled local models build a global model better than width-scaled ones, and that self-distillation is highly effective in training data-insufficient deep layers.
+
+
+## About this baseline
+
+**What’s implemented:** The code in this directory replicates the experiments in *DepthFL: Depthwise Federated Learning for Heterogeneous Clients* (Kim et al., 2023) for CIFAR-100, which proposed the DepthFL algorithm. Concretely, it replicates the results for the CIFAR-100 dataset in Tables 2, 3, and 4.
+
+**Datasets:** CIFAR100 from PyTorch's Torchvision
+
+**Hardware Setup:** These experiments were run on a server with Nvidia 3090 GPUs. Any machine with 1x 8GB GPU or more would be able to run it in a reasonable amount of time. With the default settings, clients make use of 1.3GB of VRAM. Lower `num_gpus` in `client_resources` to train more clients in parallel on your GPU(s).
+
+**Contributors:** Minjae Kim
+
+
+## Experimental Setup
+
+**Task:** Image Classification
+
+**Model:** ResNet18
+
+**Dataset:** This baseline only includes the CIFAR-100 dataset. By default it is partitioned into 100 clients following an IID distribution. The settings are as follows:
+
+| Dataset | #classes | #partitions | partitioning method |
+| :------ | :---: | :---: | :---: |
+| CIFAR100 | 100 | 100 | IID or Non-IID |
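The default IID partitioning can be sketched as follows (a minimal illustration, assuming an even shuffle-and-split over sample indices; the function name and the even-split behavior are illustrative, not the baseline's exact partitioning code):

```python
import random


def iid_partition(num_samples: int, num_clients: int, seed: int = 0):
    """Shuffle sample indices and split them evenly across clients (IID)."""
    rng = random.Random(seed)
    idxs = list(range(num_samples))
    rng.shuffle(idxs)
    shard = num_samples // num_clients
    return [idxs[i * shard:(i + 1) * shard] for i in range(num_clients)]


# CIFAR-100 train set (50,000 samples) split across 100 clients -> 500 each
shards = iid_partition(50_000, 100)
```

Because every client draws from the same shuffled pool, each local dataset has (approximately) the same class distribution, which is what makes the partition IID.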
+
+**Training Hyperparameters:**
+The following table shows the main hyperparameters for this baseline with their default values (i.e., the values used if you run `python -m depthfl.main` directly).
+
+| Description | Default Value |
+| ----------- | ----- |
+| total clients | 100 |
+| local epoch | 5 |
+| batch size | 50 |
+| number of rounds | 1000 |
+| participation ratio | 10% |
+| learning rate | 0.1 |
+| learning rate decay | 0.998 |
+| client resources | {'num_cpus': 1.0, 'num_gpus': 0.5 }|
+| data partition | IID |
+| optimizer | SGD with dynamic regularization |
+| alpha | 0.1 |
+
+
+## Environment Setup
+
+To construct the Python environment follow these steps:
+
+```bash
+# Set python version
+pyenv install 3.10.6
+pyenv local 3.10.6
+
+# Tell poetry to use python 3.10
+poetry env use 3.10.6
+
+# Install the base Poetry environment
+poetry install
+
+# Activate the environment
+poetry shell
+```
+
+
+## Running the Experiments
+
+To run DepthFL, first ensure you have activated your Poetry environment (execute `poetry shell` from this directory), then:
+
+```bash
+# this will run using the default settings in the `conf/config.yaml`
+python -m depthfl.main # 'accuracy' : accuracy of the ensemble model, 'accuracy_single' : accuracy of each classifier.
+
+# you can override settings directly from the command line
+python -m depthfl.main exclusive_learning=true model_size=1 # exclusive learning - 100% (a)
+python -m depthfl.main exclusive_learning=true model_size=4 # exclusive learning - 25% (d)
+python -m depthfl.main fit_config.feddyn=false fit_config.kd=false # DepthFL (FedAvg)
+python -m depthfl.main fit_config.feddyn=false fit_config.kd=false fit_config.extended=false # InclusiveFL
+```
+
+To run using HeteroFL:
+```bash
+# since sBN (static batch normalization) takes too long, we test the global model every 50 rounds.
+python -m depthfl.main --config-name="heterofl" # HeteroFL
+python -m depthfl.main --config-name="heterofl" exclusive_learning=true model_size=1 # exclusive learning - 100% (a)
+```
+
+### Stateful clients comment
+
+To implement `feddyn`, stateful clients that store `prev_grads` information are needed. Since Flower does not yet officially support stateful clients, this was implemented as a temporary measure: `prev_grads` is loaded from disk when a client is created and stored back to disk after local training. Specifically, the files that store each client's state live in the `prev_grads` folder. When the strategy is instantiated (for both `FedDyn` and `HeteroFL`), the contents of `prev_grads` are reset.
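The disk-backed state pattern described above can be sketched as follows (a minimal illustration; the helper names, the pickle format, and the one-file-per-client layout are assumptions, not the baseline's exact code):

```python
import pickle
from pathlib import Path

# Hypothetical location of per-client state files; the baseline keeps
# one file per client in a `prev_grads` folder.
STATE_DIR = Path("prev_grads")


def load_prev_grads(cid: str):
    """Load a client's stored state from disk, or return None on first use."""
    path = STATE_DIR / str(cid)
    if path.exists():
        with path.open("rb") as f:
            return pickle.load(f)
    return None


def save_prev_grads(cid: str, prev_grads) -> None:
    """Persist a client's state after local training finishes."""
    STATE_DIR.mkdir(exist_ok=True)
    with (STATE_DIR / str(cid)).open("wb") as f:
        pickle.dump(prev_grads, f)
```

A client factory would call `load_prev_grads(cid)` when constructing each client and `save_prev_grads(cid, ...)` at the end of `fit`, which emulates stateful clients on top of Flower's stateless client API.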
+
+
+## Expected Results
+
+The following commands run DepthFL (FedDyn / FedAvg), InclusiveFL, and HeteroFL to replicate Tables 2, 3, and 4 of the DepthFL paper. Note that the same experiment may contribute results to more than one table.
+
+```bash
+# table 2 (HeteroFL row)
+python -m depthfl.main --config-name="heterofl"
+python -m depthfl.main --config-name="heterofl" --multirun exclusive_learning=true model.scale=false model_size=1,2,3,4
+
+# table 2 (DepthFL(FedAvg) row)
+python -m depthfl.main fit_config.feddyn=false fit_config.kd=false
+python -m depthfl.main --multirun fit_config.feddyn=false fit_config.kd=false exclusive_learning=true model_size=1,2,3,4
+
+# table 2 (DepthFL row)
+python -m depthfl.main
+python -m depthfl.main --multirun exclusive_learning=true model_size=1,2,3,4
+```
+
+**Table 2**
+
+The 100% (a), 75% (b), 50% (c), and 25% (d) columns are exclusive-learning scenarios. In 100% (a) exclusive learning, the global model and every local model equal the smallest local model, and 100% of clients participate. Likewise, in 25% (d) exclusive learning, the global model and every local model equal the largest local model, and only 25% of clients participate.
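The participating-client count for each exclusive-learning scenario follows directly from `model_size`; this mirrors the computation in `depthfl/main.py`:

```python
def exclusive_num_clients(num_clients: int, model_size: int) -> int:
    """Clients participating in exclusive learning: 100%/75%/50%/25% for sizes 1-4."""
    return num_clients - (model_size - 1) * (num_clients // 4)


# With 100 clients: size 1 -> 100 (a), size 2 -> 75 (b), size 3 -> 50 (c), size 4 -> 25 (d)
counts = [exclusive_num_clients(100, size) for size in (1, 2, 3, 4)]
```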
+
+| Scaling Method | Dataset | Global Model | 100% (a) | 75% (b) | 50% (c) | 25% (d) |
+| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
+| HeteroFL | CIFAR100 | 57.61 | 64.39 | 66.08 | 62.03 | 51.99 |
+| DepthFL (FedAvg) | CIFAR100 | 72.67 | 67.08 | 70.78 | 68.41 | 59.17 |
+| DepthFL | CIFAR100 | 76.06 | 69.68 | 73.21 | 70.29 | 60.32 |
+
+```bash
+# table 3 (Width Scaling - Duplicate results from table 2)
+python -m depthfl.main --config-name="heterofl"
+python -m depthfl.main --config-name="heterofl" --multirun exclusive_learning=true model.scale=false model_size=1,2,3,4
+
+# table 3 (Depth Scaling : Exclusive Learning, DepthFL(FedAvg) rows - Duplicate results from table 2)
+python -m depthfl.main fit_config.feddyn=false fit_config.kd=false
+python -m depthfl.main --multirun fit_config.feddyn=false fit_config.kd=false exclusive_learning=true model_size=1,2,3,4
+
+# table 3 (Depth Scaling - InclusiveFL row)
+python -m depthfl.main fit_config.feddyn=false fit_config.kd=false fit_config.extended=false
+```
+
+**Table 3**
+
+Accuracy of global sub-models compared to exclusive learning on CIFAR-100.
+
+| Method | Algorithm | Classifier 1/4 | Classifier 2/4 | Classifier 3/4 | Classifier 4/4 |
+| :---: | :---: | :---: | :---: | :---: | :---: |
+| Width Scaling | Exclusive Learning | 64.39 | 66.08 | 62.03 | 51.99 |
+| Width Scaling | HeteroFL | 51.08 | 55.89 | 58.29 | 57.61 |
+
+| Method | Algorithm | Classifier 1/4 | Classifier 2/4 | Classifier 3/4 | Classifier 4/4 |
+| :---: | :---: | :---: | :---: | :---: | :---: |
+| Depth Scaling | Exclusive Learning | 67.08 | 68.00 | 66.19 | 56.78 |
+| Depth Scaling | InclusiveFL | 47.61 | 53.88 | 59.48 | 60.46 |
+| Depth Scaling | DepthFL (FedAvg) | 66.18 | 67.56 | 67.97 | 68.01 |
+
+```bash
+# table 4
+python -m depthfl.main --multirun fit_config.kd=true,false dataset_config.iid=true,false
+```
+
+**Table 4**
+
+Accuracy of the global model with/without self distillation on CIFAR-100.
+
+| Distribution | Dataset | KD | Classifier 1/4 | Classifier 2/4 | Classifier 3/4 | Classifier 4/4 | Ensemble |
+| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
+| IID | CIFAR100 | ✗ | 70.13 | 69.63 | 68.92 | 68.92 | 74.48 |
+| IID | CIFAR100 | ✓ | 71.74 | 73.35 | 73.57 | 73.55 | 76.06 |
+| non-IID | CIFAR100 | ✗ | 67.94 | 68.68 | 68.46 | 67.78 | 73.18 |
+| non-IID | CIFAR100 | ✓ | 70.33 | 71.88 | 72.43 | 72.34 | 74.92 |
+
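The "Ensemble" column averages the logits of all classifiers before taking the argmax, as done in `test()` in `depthfl/models.py`. A stdlib-only sketch of that averaging (the example logits are illustrative):

```python
def ensemble_logits(outputs: list) -> list:
    """Average per-class logits across classifiers (the ensemble prediction)."""
    num_classifiers = len(outputs)
    return [sum(col) / num_classifiers for col in zip(*outputs)]


def predict(logits: list) -> int:
    """Argmax over classes."""
    return max(range(len(logits)), key=logits.__getitem__)


# two classifiers, three classes: the ensemble favours class 1
avg = ensemble_logits([[0.1, 0.9, 0.0], [0.2, 0.6, 0.2]])
```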
diff --git a/baselines/depthfl/depthfl/__init__.py b/baselines/depthfl/depthfl/__init__.py
new file mode 100644
index 000000000000..3343905e1879
--- /dev/null
+++ b/baselines/depthfl/depthfl/__init__.py
@@ -0,0 +1 @@
+"""Flower summer of reproducibility : DepthFL (ICLR' 23)."""
diff --git a/baselines/depthfl/depthfl/client.py b/baselines/depthfl/depthfl/client.py
new file mode 100644
index 000000000000..481ac90f1c79
--- /dev/null
+++ b/baselines/depthfl/depthfl/client.py
@@ -0,0 +1,181 @@
+"""Defines the DepthFL Flower Client and a function to instantiate it."""
+
+import copy
+import pickle
+from collections import OrderedDict
+from typing import Callable, Dict, List, Tuple
+
+import flwr as fl
+import numpy as np
+import torch
+from flwr.common.typing import NDArrays, Scalar
+from hydra.utils import instantiate
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+from depthfl.models import test, train
+
+
+def prune(state_dict, param_idx):
+ """Prune width of DNN (for HeteroFL)."""
+ ret_dict = {}
+ for k in state_dict.keys():
+ if "num" not in k:
+ ret_dict[k] = state_dict[k][torch.meshgrid(param_idx[k])]
+ else:
+ ret_dict[k] = state_dict[k]
+ return copy.deepcopy(ret_dict)
+
+
+class FlowerClient(
+ fl.client.NumPyClient
+): # pylint: disable=too-many-instance-attributes
+ """Standard Flower client for CNN training."""
+
+ def __init__(
+ self,
+ net: torch.nn.Module,
+ trainloader: DataLoader,
+ valloader: DataLoader,
+ device: torch.device,
+ num_epochs: int,
+ learning_rate: float,
+ learning_rate_decay: float,
+ prev_grads: Dict,
+ cid: int,
+ ): # pylint: disable=too-many-arguments
+ self.net = net
+ self.trainloader = trainloader
+ self.valloader = valloader
+ self.device = device
+ self.num_epochs = num_epochs
+ self.learning_rate = learning_rate
+ self.learning_rate_decay = learning_rate_decay
+ self.prev_grads = prev_grads
+ self.cid = cid
+ self.param_idx = {}
+ state_dict = net.state_dict()
+
+ # for HeteroFL
+ for k in state_dict.keys():
+ self.param_idx[k] = [
+ torch.arange(size) for size in state_dict[k].shape
+ ] # store client's weights' shape (for HeteroFL)
+
+ def get_parameters(self, config: Dict[str, Scalar]) -> NDArrays:
+ """Return the parameters of the current net."""
+ return [val.cpu().numpy() for _, val in self.net.state_dict().items()]
+
+ def set_parameters(self, parameters: NDArrays) -> None:
+ """Change the parameters of the model using the given ones."""
+ params_dict = zip(self.net.state_dict().keys(), parameters)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+ self.net.load_state_dict(prune(state_dict, self.param_idx), strict=True)
+
+ def fit(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[NDArrays, int, Dict]:
+ """Implement distributed fit function for a given client."""
+ self.set_parameters(parameters)
+ num_epochs = self.num_epochs
+
+ curr_round = int(config["curr_round"]) - 1
+
+ # consistency weight for self distillation in DepthFL
+ consistency_weight_constant = 300
+ current = np.clip(curr_round, 0.0, consistency_weight_constant)
+ phase = 1.0 - current / consistency_weight_constant
+ consistency_weight = float(np.exp(-5.0 * phase * phase))
+
+ train(
+ self.net,
+ self.trainloader,
+ self.device,
+ epochs=num_epochs,
+ learning_rate=self.learning_rate * self.learning_rate_decay**curr_round,
+ config=config,
+ consistency_weight=consistency_weight,
+ prev_grads=self.prev_grads,
+ )
+
+ with open(f"prev_grads/client_{self.cid}", "wb") as prev_grads_file:
+ pickle.dump(self.prev_grads, prev_grads_file)
+
+ return self.get_parameters({}), len(self.trainloader), {"cid": self.cid}
+
+ def evaluate(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[float, int, Dict]:
+ """Implement distributed evaluation for a given client."""
+ self.set_parameters(parameters)
+ loss, accuracy, accuracy_single = test(self.net, self.valloader, self.device)
+ return (
+ float(loss),
+ len(self.valloader),
+ {"accuracy": float(accuracy), "accuracy_single": accuracy_single},
+ )
+
+
+def gen_client_fn( # pylint: disable=too-many-arguments
+ num_epochs: int,
+ trainloaders: List[DataLoader],
+ valloaders: List[DataLoader],
+ learning_rate: float,
+ learning_rate_decay: float,
+ models: List[DictConfig],
+) -> Callable[[str], FlowerClient]:
+ """Generate the client function that creates the Flower Clients.
+
+ Parameters
+ ----------
+ num_epochs : int
+ The number of local epochs each client should run the training for before
+ sending it to the server.
+ trainloaders: List[DataLoader]
+ A list of DataLoaders, each pointing to the dataset training partition
+ belonging to a particular client.
+ valloaders: List[DataLoader]
+ A list of DataLoaders, each pointing to the dataset validation partition
+ belonging to a particular client.
+ learning_rate : float
+ The learning rate for the SGD optimizer of clients.
+ learning_rate_decay : float
+ The learning rate decay ratio per round for the SGD optimizer of clients.
+ models : List[DictConfig]
+ A list of DictConfigs, each pointing to the model config of client's local model
+
+ Returns
+ -------
+ Callable[[str], FlowerClient]
+ client function that creates Flower Clients
+ """
+
+ def client_fn(cid: str) -> FlowerClient:
+ """Create a Flower client representing a single organization."""
+ # Load model
+ device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
+
+ # each client gets a different model config (different width / depth)
+ net = instantiate(models[int(cid)]).to(device)
+
+ # Note: each client gets a different trainloader/valloader, so each client
+ # will train and evaluate on their own unique data
+ trainloader = trainloaders[int(cid)]
+ valloader = valloaders[int(cid)]
+
+ with open(f"prev_grads/client_{int(cid)}", "rb") as prev_grads_file:
+ prev_grads = pickle.load(prev_grads_file)
+
+ return FlowerClient(
+ net,
+ trainloader,
+ valloader,
+ device,
+ num_epochs,
+ learning_rate,
+ learning_rate_decay,
+ prev_grads,
+ int(cid),
+ )
+
+ return client_fn
diff --git a/baselines/depthfl/depthfl/conf/config.yaml b/baselines/depthfl/depthfl/conf/config.yaml
new file mode 100644
index 000000000000..5a126229956e
--- /dev/null
+++ b/baselines/depthfl/depthfl/conf/config.yaml
@@ -0,0 +1,42 @@
+---
+
+num_clients: 100 # total number of clients
+num_epochs: 5 # number of local epochs
+batch_size: 50
+num_rounds: 1000
+fraction: 0.1 # participation ratio
+learning_rate: 0.1
+learning_rate_decay : 0.998 # per round
+static_bn: false # static batch normalization (HeteroFL)
+exclusive_learning: false # exclusive learning baseline in DepthFL paper
+model_size: 1 # model size for exclusive learning
+
+client_resources:
+ num_cpus: 1
+ num_gpus: 0.5
+
+server_device: cuda
+
+dataset_config:
+ iid: true
+ beta: 0.5
+
+fit_config:
+ feddyn: true
+ kd: true
+ alpha: 0.1 # alpha for FedDyn
+ extended: true # if not extended : InclusiveFL
+ drop_client: false # with FedProx, clients shouldn't be dropped even if they are stragglers
+
+model:
+ _target_: depthfl.resnet.multi_resnet18
+ n_blocks: 4 # depth (1 ~ 4)
+ num_classes: 100
+
+strategy:
+ _target_: depthfl.strategy.FedDyn
+ fraction_fit: 0.00001 # because we want the number of clients to sample on each round to be solely defined by min_fit_clients
+ fraction_evaluate: 0.0
+ # min_fit_clients: ${clients_per_round}
+ min_evaluate_clients: 0
+ # min_available_clients: ${clients_per_round}
\ No newline at end of file
diff --git a/baselines/depthfl/depthfl/conf/heterofl.yaml b/baselines/depthfl/depthfl/conf/heterofl.yaml
new file mode 100644
index 000000000000..ad0bb8c8f8b8
--- /dev/null
+++ b/baselines/depthfl/depthfl/conf/heterofl.yaml
@@ -0,0 +1,43 @@
+---
+
+num_clients: 100 # total number of clients
+num_epochs: 5 # number of local epochs
+batch_size: 50
+num_rounds: 1000
+fraction: 0.1 # participation ratio
+learning_rate: 0.1
+learning_rate_decay : 0.998 # per round
+static_bn: true # static batch normalization (HeteroFL)
+exclusive_learning: false # exclusive learning baseline in DepthFL paper
+model_size: 1 # model size for exclusive learning
+
+client_resources:
+ num_cpus: 1
+ num_gpus: 0.5
+
+server_device: cuda
+
+dataset_config:
+ iid: true
+ beta: 0.5
+
+fit_config:
+ feddyn: false
+ kd: false
+ alpha: 0.1 # unused
+ extended: false # unused
+ drop_client: false # with FedProx, clients shouldn't be dropped even if they are stragglers
+
+model:
+ _target_: depthfl.resnet_hetero.resnet18
+ n_blocks: 4 # width (1 ~ 4)
+ num_classes: 100
+ scale: true # scaler module in HeteroFL
+
+strategy:
+ _target_: depthfl.strategy_hetero.HeteroFL
+ fraction_fit: 0.00001 # because we want the number of clients to sample on each round to be solely defined by min_fit_clients
+ fraction_evaluate: 0.0
+ # min_fit_clients: ${clients_per_round}
+ min_evaluate_clients: 0
+ # min_available_clients: ${clients_per_round}
\ No newline at end of file
diff --git a/baselines/depthfl/depthfl/dataset.py b/baselines/depthfl/depthfl/dataset.py
new file mode 100644
index 000000000000..c2024fe068a0
--- /dev/null
+++ b/baselines/depthfl/depthfl/dataset.py
@@ -0,0 +1,60 @@
+"""CIFAR100 dataset utilities for federated learning."""
+
+from typing import Optional, Tuple
+
+import torch
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader, random_split
+
+from depthfl.dataset_preparation import _partition_data
+
+
+def load_datasets( # pylint: disable=too-many-arguments
+ config: DictConfig,
+ num_clients: int,
+ val_ratio: float = 0.0,
+ batch_size: Optional[int] = 32,
+ seed: Optional[int] = 41,
+) -> Tuple[DataLoader, DataLoader, DataLoader]:
+ """Create the dataloaders to be fed into the model.
+
+ Parameters
+ ----------
+ config: DictConfig
+ Parameterises the dataset partitioning process
+ num_clients : int
+ The number of clients that hold a part of the data
+    val_ratio : float, optional
+        The ratio of training data that will be used for validation (between 0 and 1),
+        by default 0.0
+    batch_size : int, optional
+        The size of the batches to be fed into the model, by default 32
+    seed : int, optional
+        Used to set a fixed seed to replicate experiments, by default 41
+
+ Returns
+ -------
+ Tuple[DataLoader, DataLoader, DataLoader]
+ The DataLoader for training, validation, and testing.
+ """
+ print(f"Dataset partitioning config: {config}")
+ datasets, testset = _partition_data(
+ num_clients,
+ iid=config.iid,
+ beta=config.beta,
+ seed=seed,
+ )
+ # Split each partition into train/val and create DataLoader
+ trainloaders = []
+ valloaders = []
+ for dataset in datasets:
+ len_val = 0
+ if val_ratio > 0:
+ len_val = int(len(dataset) / (1 / val_ratio))
+ lengths = [len(dataset) - len_val, len_val]
+ ds_train, ds_val = random_split(
+ dataset, lengths, torch.Generator().manual_seed(seed)
+ )
+ trainloaders.append(DataLoader(ds_train, batch_size=batch_size, shuffle=True))
+ valloaders.append(DataLoader(ds_val, batch_size=batch_size))
+ return trainloaders, valloaders, DataLoader(testset, batch_size=batch_size)
diff --git a/baselines/depthfl/depthfl/dataset_preparation.py b/baselines/depthfl/depthfl/dataset_preparation.py
new file mode 100644
index 000000000000..006491c7679e
--- /dev/null
+++ b/baselines/depthfl/depthfl/dataset_preparation.py
@@ -0,0 +1,125 @@
+"""Dataset(CIFAR100) preparation for DepthFL."""
+
+from typing import List, Optional, Tuple
+
+import numpy as np
+import torchvision.transforms as transforms
+from torch.utils.data import Dataset, Subset
+from torchvision.datasets import CIFAR100
+
+
+def _download_data() -> Tuple[Dataset, Dataset]:
+ """Download (if necessary) and returns the CIFAR-100 dataset.
+
+ Returns
+ -------
+ Tuple[CIFAR100, CIFAR100]
+ The dataset for training and the dataset for testing CIFAR100.
+ """
+ transform_train = transforms.Compose(
+ [
+ transforms.ToTensor(),
+ transforms.RandomCrop(32, padding=4),
+ transforms.RandomHorizontalFlip(),
+ transforms.Normalize((0.5071, 0.4867, 0.4408), (0.2675, 0.2565, 0.2761)),
+ ]
+ )
+
+ transform_test = transforms.Compose(
+ [
+ transforms.ToTensor(),
+ transforms.Normalize((0.5071, 0.4867, 0.4408), (0.2675, 0.2565, 0.2761)),
+ ]
+ )
+
+ trainset = CIFAR100(
+ "./dataset", train=True, download=True, transform=transform_train
+ )
+ testset = CIFAR100(
+ "./dataset", train=False, download=True, transform=transform_test
+ )
+ return trainset, testset
+
+
+def _partition_data(
+ num_clients,
+ iid: Optional[bool] = True,
+ beta=0.5,
+ seed=41,
+) -> Tuple[List[Dataset], Dataset]:
+ """Split training set to simulate the federated setting.
+
+ Parameters
+ ----------
+ num_clients : int
+ The number of clients that hold a part of the data
+    iid : bool, optional
+        Whether the data should be independent and identically distributed,
+        or first be sorted by labels and distributed to clients in a
+        non-IID manner, by default True
+    beta : float
+        Concentration parameter of the Dirichlet distribution used for
+        the non-IID split, by default 0.5
+    seed : int, optional
+        Used to set a fixed seed to replicate experiments, by default 41
+
+ Returns
+ -------
+ Tuple[List[Dataset], Dataset]
+        A list of datasets, one per client, and a
+        single dataset to be used for testing the model.
+ """
+ trainset, testset = _download_data()
+
+ datasets: List[Subset] = []
+
+ if iid:
+ distribute_iid(num_clients, seed, trainset, datasets)
+
+ else:
+ distribute_noniid(num_clients, beta, seed, trainset, datasets)
+
+ return datasets, testset
+
+
+def distribute_iid(num_clients, seed, trainset, datasets):
+ """Distribute dataset in iid manner."""
+ np.random.seed(seed)
+ num_sample = int(len(trainset) / (num_clients))
+ index = list(range(len(trainset)))
+ for _ in range(num_clients):
+ sample_idx = np.random.choice(index, num_sample, replace=False)
+ index = list(set(index) - set(sample_idx))
+ datasets.append(Subset(trainset, sample_idx))
+
+
+def distribute_noniid(num_clients, beta, seed, trainset, datasets):
+ """Distribute dataset in non-iid manner."""
+ labels = np.array([label for _, label in trainset])
+ min_size = 0
+ np.random.seed(seed)
+
+ while min_size < 10:
+ idx_batch = [[] for _ in range(num_clients)]
+ # for each class in the dataset
+ for k in range(np.max(labels) + 1):
+ idx_k = np.where(labels == k)[0]
+ np.random.shuffle(idx_k)
+ proportions = np.random.dirichlet(np.repeat(beta, num_clients))
+ # Balance
+ proportions = np.array(
+ [
+ p * (len(idx_j) < labels.shape[0] / num_clients)
+ for p, idx_j in zip(proportions, idx_batch)
+ ]
+ )
+ proportions = proportions / proportions.sum()
+ proportions = (np.cumsum(proportions) * len(idx_k)).astype(int)[:-1]
+ idx_batch = [
+ idx_j + idx.tolist()
+ for idx_j, idx in zip(idx_batch, np.split(idx_k, proportions))
+ ]
+ min_size = min([len(idx_j) for idx_j in idx_batch])
+
+ for j in range(num_clients):
+ np.random.shuffle(idx_batch[j])
+ # net_dataidx_map[j] = np.array(idx_batch[j])
+ datasets.append(Subset(trainset, np.array(idx_batch[j])))
diff --git a/baselines/depthfl/depthfl/main.py b/baselines/depthfl/depthfl/main.py
new file mode 100644
index 000000000000..7bf1d9563eae
--- /dev/null
+++ b/baselines/depthfl/depthfl/main.py
@@ -0,0 +1,135 @@
+"""DepthFL main."""
+
+import copy
+
+import flwr as fl
+import hydra
+from flwr.common import ndarrays_to_parameters
+from flwr.server.client_manager import SimpleClientManager
+from hydra.core.hydra_config import HydraConfig
+from hydra.utils import instantiate
+from omegaconf import DictConfig, OmegaConf
+
+from depthfl import client, server
+from depthfl.dataset import load_datasets
+from depthfl.utils import save_results_as_pickle
+
+
+@hydra.main(config_path="conf", config_name="config", version_base=None)
+def main(cfg: DictConfig) -> None:
+ """Run the baseline.
+
+ Parameters
+ ----------
+ cfg : DictConfig
+ An omegaconf object that stores the hydra config.
+ """
+ print(OmegaConf.to_yaml(cfg))
+
+ # partition dataset and get dataloaders
+ trainloaders, valloaders, testloader = load_datasets(
+ config=cfg.dataset_config,
+ num_clients=cfg.num_clients,
+ batch_size=cfg.batch_size,
+ )
+
+ # exclusive learning baseline in DepthFL paper
+ # (model_size, % of clients) = (a,100), (b,75), (c,50), (d,25)
+ if cfg.exclusive_learning:
+ cfg.num_clients = int(
+ cfg.num_clients - (cfg.model_size - 1) * (cfg.num_clients // 4)
+ )
+
+ models = []
+ for i in range(cfg.num_clients):
+ model = copy.deepcopy(cfg.model)
+
+ # each client gets different model depth / width
+ model.n_blocks = i // (cfg.num_clients // 4) + 1
+
+ # In exclusive learning, every client has same model depth / width
+ if cfg.exclusive_learning:
+ model.n_blocks = cfg.model_size
+
+ models.append(model)
+
+ # prepare function that will be used to spawn each client
+ client_fn = client.gen_client_fn(
+ num_epochs=cfg.num_epochs,
+ trainloaders=trainloaders,
+ valloaders=valloaders,
+ learning_rate=cfg.learning_rate,
+ learning_rate_decay=cfg.learning_rate_decay,
+ models=models,
+ )
+
+    # get function that will be executed by the strategy's evaluate() method
+ # Set server's device
+ device = cfg.server_device
+
+ # Static Batch Normalization for HeteroFL
+ if cfg.static_bn:
+ evaluate_fn = server.gen_evaluate_fn_hetero(
+ trainloaders, testloader, device=device, model_cfg=model
+ )
+ else:
+ evaluate_fn = server.gen_evaluate_fn(testloader, device=device, model=model)
+
+    # get a function that will be used to construct the config that the client's
+    # fit() method will receive
+ def get_on_fit_config():
+ def fit_config_fn(server_round):
+ # resolve and convert to python dict
+ fit_config = OmegaConf.to_container(cfg.fit_config, resolve=True)
+ fit_config["curr_round"] = server_round # add round info
+ return fit_config
+
+ return fit_config_fn
+
+ net = instantiate(cfg.model)
+ # instantiate strategy according to config. Here we pass other arguments
+ # that are only defined at run time.
+ strategy = instantiate(
+ cfg.strategy,
+ cfg,
+ net,
+ evaluate_fn=evaluate_fn,
+ on_fit_config_fn=get_on_fit_config(),
+ initial_parameters=ndarrays_to_parameters(
+ [val.cpu().numpy() for _, val in net.state_dict().items()]
+ ),
+ min_fit_clients=int(cfg.num_clients * cfg.fraction),
+ min_available_clients=int(cfg.num_clients * cfg.fraction),
+ )
+
+ # Start simulation
+ history = fl.simulation.start_simulation(
+ client_fn=client_fn,
+ num_clients=cfg.num_clients,
+ config=fl.server.ServerConfig(num_rounds=cfg.num_rounds),
+ client_resources={
+ "num_cpus": cfg.client_resources.num_cpus,
+ "num_gpus": cfg.client_resources.num_gpus,
+ },
+ strategy=strategy,
+ server=server.ServerFedDyn(
+ client_manager=SimpleClientManager(), strategy=strategy
+ ),
+ )
+
+ # Experiment completed. Now we save the results and
+ # generate plots using the `history`
+ print("................")
+ print(history)
+
+ # Hydra automatically creates an output directory
+ # Let's retrieve it and save some results there
+ save_path = HydraConfig.get().runtime.output_dir
+
+ # save results as a Python pickle using a file_path
+ # the directory created by Hydra for each run
+ save_results_as_pickle(history, file_path=save_path, extra_results={})
+
+
+if __name__ == "__main__":
+ main()
diff --git a/baselines/depthfl/depthfl/models.py b/baselines/depthfl/depthfl/models.py
new file mode 100644
index 000000000000..df3eebf9f9ce
--- /dev/null
+++ b/baselines/depthfl/depthfl/models.py
@@ -0,0 +1,301 @@
+"""ResNet18 model architecutre, training, and testing functions for CIFAR100."""
+
+
+from typing import List, Tuple
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+
+class KLLoss(nn.Module):
+ """KL divergence loss for self distillation."""
+
+ def __init__(self):
+ super().__init__()
+ self.temperature = 1
+
+ def forward(self, pred, label):
+ """KL loss forward."""
+ predict = F.log_softmax(pred / self.temperature, dim=1)
+ target_data = F.softmax(label / self.temperature, dim=1)
+ target_data = target_data + 10 ** (-7)
+ with torch.no_grad():
+ target = target_data.detach().clone()
+
+ loss = (
+ self.temperature
+ * self.temperature
+ * ((target * (target.log() - predict)).sum(1).sum() / target.size()[0])
+ )
+ return loss
+
+
+def train( # pylint: disable=too-many-arguments
+ net: nn.Module,
+ trainloader: DataLoader,
+ device: torch.device,
+ epochs: int,
+ learning_rate: float,
+ config: dict,
+ consistency_weight: float,
+ prev_grads: dict,
+) -> None:
+ """Train the network on the training set.
+
+ Parameters
+ ----------
+ net : nn.Module
+ The neural network to train.
+ trainloader : DataLoader
+ The DataLoader containing the data to train the network on.
+ device : torch.device
+ The device on which the model should be trained, either 'cpu' or 'cuda'.
+ epochs : int
+ The number of epochs the model should be trained for.
+ learning_rate : float
+ The learning rate for the SGD optimizer.
+ config : dict
+ training configuration
+ consistency_weight : float
+ hyperparameter for self distillation
+ prev_grads : dict
+ control variate for feddyn
+ """
+ criterion = torch.nn.CrossEntropyLoss()
+ optimizer = torch.optim.SGD(net.parameters(), lr=learning_rate, weight_decay=1e-3)
+ global_params = {
+ k: val.detach().clone().flatten() for (k, val) in net.named_parameters()
+ }
+
+ for k, _ in net.named_parameters():
+ prev_grads[k] = prev_grads[k].to(device)
+
+ net.train()
+ for _ in range(epochs):
+ _train_one_epoch(
+ net,
+ global_params,
+ trainloader,
+ device,
+ criterion,
+ optimizer,
+ config,
+ consistency_weight,
+ prev_grads,
+ )
+
+ # update prev_grads for FedDyn
+ if config["feddyn"]:
+ update_prev_grads(config, net, prev_grads, global_params)
+
+
+def update_prev_grads(config, net, prev_grads, global_params):
+ """Update prev_grads for FedDyn."""
+ for k, param in net.named_parameters():
+ curr_param = param.detach().clone().flatten()
+ prev_grads[k] = prev_grads[k] - config["alpha"] * (
+ curr_param - global_params[k]
+ )
+ prev_grads[k] = prev_grads[k].to(torch.device(torch.device("cpu")))
+
+
+def _train_one_epoch( # pylint: disable=too-many-locals, too-many-arguments
+ net: nn.Module,
+ global_params: dict,
+ trainloader: DataLoader,
+ device: torch.device,
+ criterion: torch.nn.CrossEntropyLoss,
+ optimizer: torch.optim.SGD,
+ config: dict,
+ consistency_weight: float,
+ prev_grads: dict,
+):
+ """Train for one epoch.
+
+ Parameters
+ ----------
+ net : nn.Module
+ The neural network to train.
+ global_params : List[Parameter]
+ The parameters of the global model (from the server).
+ trainloader : DataLoader
+ The DataLoader containing the data to train the network on.
+ device : torch.device
+ The device on which the model should be trained, either 'cpu' or 'cuda'.
+ criterion : torch.nn.CrossEntropyLoss
+ The loss function to use for training
+    optimizer : torch.optim.SGD
+        The optimizer to use for training
+ config : dict
+ training configuration
+ consistency_weight : float
+ hyperparameter for self distillation
+ prev_grads : dict
+ control variate for feddyn
+ """
+    criterion_kl = KLLoss().to(device)
+
+ for images, labels in trainloader:
+ images, labels = images.to(device), labels.to(device)
+ loss = torch.zeros(1).to(device)
+ optimizer.zero_grad()
+ output_lst = net(images)
+
+ for i, branch_output in enumerate(output_lst):
+ # only trains last classifier in InclusiveFL
+ if not config["extended"] and i != len(output_lst) - 1:
+ continue
+
+ loss += criterion(branch_output, labels)
+
+ # self distillation term
+ if config["kd"] and len(output_lst) > 1:
+ for j, output in enumerate(output_lst):
+ if j == i:
+ continue
+
+ loss += (
+ consistency_weight
+ * criterion_kl(branch_output, output.detach())
+ / (len(output_lst) - 1)
+ )
+
+ # Dynamic regularization in FedDyn
+ if config["feddyn"]:
+ for k, param in net.named_parameters():
+ curr_param = param.flatten()
+
+ lin_penalty = torch.dot(curr_param, prev_grads[k])
+ loss -= lin_penalty
+
+ quad_penalty = (
+ config["alpha"]
+ / 2.0
+ * torch.sum(torch.square(curr_param - global_params[k]))
+ )
+ loss += quad_penalty
+
+ loss.backward()
+ optimizer.step()
+
+
+def test( # pylint: disable=too-many-locals
+ net: nn.Module, testloader: DataLoader, device: torch.device
+) -> Tuple[float, float, List[float]]:
+ """Evaluate the network on the entire test set.
+
+ Parameters
+ ----------
+ net : nn.Module
+ The neural network to test.
+ testloader : DataLoader
+ The DataLoader containing the data to test the network on.
+ device : torch.device
+ The device on which the model should be tested, either 'cpu' or 'cuda'.
+
+ Returns
+ -------
+ Tuple[float, float, List[float]]
+ The loss and the accuracy of the global model
+ and the list of accuracy for each classifier on the given data.
+ """
+ criterion = torch.nn.CrossEntropyLoss()
+ correct, total, loss = 0, 0, 0.0
+ correct_single = [0] * 4 # accuracy of each classifier within model
+ net.eval()
+ with torch.no_grad():
+ for images, labels in testloader:
+ images, labels = images.to(device), labels.to(device)
+ output_lst = net(images)
+
+            # ensemble the classifiers' outputs
+ ensemble_output = torch.stack(output_lst, dim=2)
+ ensemble_output = torch.sum(ensemble_output, dim=2) / len(output_lst)
+
+ loss += criterion(ensemble_output, labels).item()
+ _, predicted = torch.max(ensemble_output, 1)
+ total += labels.size(0)
+ correct += (predicted == labels).sum().item()
+
+ for i, single in enumerate(output_lst):
+ _, predicted = torch.max(single, 1)
+ correct_single[i] += (predicted == labels).sum().item()
+
+ if len(testloader.dataset) == 0:
+ raise ValueError("Testloader can't be 0, exiting...")
+ loss /= len(testloader.dataset)
+ accuracy = correct / total
+ accuracy_single = [correct / total for correct in correct_single]
+ return loss, accuracy, accuracy_single
+
+
+def test_sbn( # pylint: disable=too-many-locals
+ nets: List[nn.Module],
+ trainloaders: List[DictConfig],
+ testloader: DataLoader,
+ device: torch.device,
+) -> Tuple[float, float, List[float]]:
+ """Evaluate the networks on the entire test set.
+
+ Parameters
+ ----------
+ nets : List[nn.Module]
+ The neural networks to test. Each neural network has different width
+ trainloaders : List[DataLoader]
+ The List of dataloaders containing the data to train the network on
+ testloader : DataLoader
+ The DataLoader containing the data to test the network on.
+ device : torch.device
+ The device on which the model should be tested, either 'cpu' or 'cuda'.
+
+ Returns
+ -------
+ Tuple[float, float, List[float]]
+ The loss and the accuracy of the global model
+ and the list of accuracy for each classifier on the given data.
+ """
+ # static batch normalization
+ for trainloader in trainloaders:
+ with torch.no_grad():
+ for model in nets:
+ model.train()
+ for _batch_idx, (images, labels) in enumerate(trainloader):
+ images, labels = images.to(device), labels.to(device)
+ output = model(images)
+
+ model.eval()
+
+ criterion = torch.nn.CrossEntropyLoss()
+ correct, total, loss = 0, 0, 0.0
+ correct_single = [0] * 4
+
+ # test each network of different width
+ with torch.no_grad():
+ for images, labels in testloader:
+ images, labels = images.to(device), labels.to(device)
+
+ output_lst = []
+
+ for model in nets:
+ output_lst.append(model(images)[0])
+
+ output = output_lst[-1]
+
+ loss += criterion(output, labels).item()
+ _, predicted = torch.max(output, 1)
+ total += labels.size(0)
+ correct += (predicted == labels).sum().item()
+
+ for i, single in enumerate(output_lst):
+ _, predicted = torch.max(single, 1)
+ correct_single[i] += (predicted == labels).sum().item()
+
+ if len(testloader.dataset) == 0:
+ raise ValueError("Testloader can't be 0, exiting...")
+ loss /= len(testloader.dataset)
+ accuracy = correct / total
+ accuracy_single = [correct / total for correct in correct_single]
+ return loss, accuracy, accuracy_single
diff --git a/baselines/depthfl/depthfl/resnet.py b/baselines/depthfl/depthfl/resnet.py
new file mode 100644
index 000000000000..04348ae17441
--- /dev/null
+++ b/baselines/depthfl/depthfl/resnet.py
@@ -0,0 +1,386 @@
+"""ResNet18 for DepthFL."""
+
+import torch.nn as nn
+
+
+class MyGroupNorm(nn.Module):
+ """Group Normalization layer."""
+
+ def __init__(self, num_channels):
+ super().__init__()
+        # GroupNorm with 16 groups
+ self.norm = nn.GroupNorm(
+ num_groups=16, num_channels=num_channels, eps=1e-5, affine=True
+ )
+
+ def forward(self, x):
+ """GN forward."""
+ x = self.norm(x)
+ return x
+
+
+class MyBatchNorm(nn.Module):
+ """Batch Normalization layer."""
+
+ def __init__(self, num_channels):
+ super().__init__()
+ self.norm = nn.BatchNorm2d(num_channels, track_running_stats=True)
+
+ def forward(self, x):
+ """BN forward."""
+ x = self.norm(x)
+ return x
+
+
+def conv3x3(in_planes, out_planes, stride=1):
+ """Convolution layer 3x3."""
+ return nn.Conv2d(
+ in_planes, out_planes, kernel_size=3, stride=stride, padding=1, bias=False
+ )
+
+
+def conv1x1(in_planes, planes, stride=1):
+ """Convolution layer 1x1."""
+ return nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride, bias=False)
+
+
+class SepConv(nn.Module):
+ """Bottleneck layer module."""
+
+ def __init__( # pylint: disable=too-many-arguments
+ self,
+ channel_in,
+ channel_out,
+ kernel_size=3,
+ stride=2,
+ padding=1,
+ norm_layer=MyGroupNorm,
+ ):
+ super().__init__()
+ self.operations = nn.Sequential(
+ nn.Conv2d(
+ channel_in,
+ channel_in,
+ kernel_size=kernel_size,
+ stride=stride,
+ padding=padding,
+ groups=channel_in,
+ bias=False,
+ ),
+ nn.Conv2d(channel_in, channel_in, kernel_size=1, padding=0, bias=False),
+ norm_layer(channel_in),
+ nn.ReLU(inplace=False),
+ nn.Conv2d(
+ channel_in,
+ channel_in,
+ kernel_size=kernel_size,
+ stride=1,
+ padding=padding,
+ groups=channel_in,
+ bias=False,
+ ),
+ nn.Conv2d(channel_in, channel_out, kernel_size=1, padding=0, bias=False),
+ norm_layer(channel_out),
+ nn.ReLU(inplace=False),
+ )
+
+ def forward(self, x):
+ """SepConv forward."""
+ return self.operations(x)
+
+
+class BasicBlock(nn.Module):
+ """Basic Block for ResNet18."""
+
+ expansion = 1
+
+ def __init__(
+ self, inplanes, planes, stride=1, downsample=None, norm_layer=None
+ ): # pylint: disable=too-many-arguments
+ super().__init__()
+ self.conv1 = conv3x3(inplanes, planes, stride)
+ self.bn1 = norm_layer(planes)
+ self.relu = nn.ReLU(inplace=True)
+ self.conv2 = conv3x3(planes, planes)
+ self.bn2 = norm_layer(planes)
+ self.downsample = downsample
+ self.stride = stride
+
+ def forward(self, x):
+ """BasicBlock forward."""
+ residual = x
+
+ output = self.conv1(x)
+ output = self.bn1(output)
+ output = self.relu(output)
+
+ output = self.conv2(output)
+ output = self.bn2(output)
+
+ if self.downsample is not None:
+ residual = self.downsample(x)
+
+ output += residual
+ output = self.relu(output)
+ return output
+
+
+class MultiResnet(nn.Module): # pylint: disable=too-many-instance-attributes
+ """Resnet model.
+
+    Args:
+        block (class): block type, BasicBlock or BottleneckBlock
+        layers (int list): number of layers in each block
+        n_blocks (int): depth of network (number of exit branches)
+        num_classes (int): number of classes.
+        norm_layer (class): type of normalization layer.
+ """
+
+ def __init__( # pylint: disable=too-many-arguments
+ self,
+ block,
+ layers,
+ n_blocks,
+ num_classes=1000,
+ norm_layer=MyBatchNorm,
+ ):
+ super().__init__()
+ self.n_blocks = n_blocks
+ self.inplanes = 64
+ self.norm_layer = norm_layer
+ self.conv1 = nn.Conv2d(
+ 3, self.inplanes, kernel_size=3, stride=1, padding=1, bias=False
+ )
+ self.bn1 = norm_layer(self.inplanes)
+
+ self.relu = nn.ReLU(inplace=True)
+ # self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
+
+ self.layer1 = self._make_layer(block, 64, layers[0])
+
+ self.middle_fc1 = nn.Linear(512 * block.expansion, num_classes)
+ # self.feature_fc1 = nn.Linear(512 * block.expansion, 512 * block.expansion)
+ self.scala1 = nn.Sequential(
+ SepConv(
+ channel_in=64 * block.expansion,
+ channel_out=128 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ SepConv(
+ channel_in=128 * block.expansion,
+ channel_out=256 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ SepConv(
+ channel_in=256 * block.expansion,
+ channel_out=512 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ nn.AdaptiveAvgPool2d(1),
+ )
+
+ self.attention1 = nn.Sequential(
+ SepConv(
+ channel_in=64 * block.expansion,
+ channel_out=64 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ norm_layer(64 * block.expansion),
+ nn.ReLU(),
+ nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
+ nn.Sigmoid(),
+ )
+
+ if n_blocks > 1:
+ self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
+ self.middle_fc2 = nn.Linear(512 * block.expansion, num_classes)
+ # self.feature_fc2 = nn.Linear(512 * block.expansion, 512 * block.expansion)
+ self.scala2 = nn.Sequential(
+ SepConv(
+ channel_in=128 * block.expansion,
+ channel_out=256 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ SepConv(
+ channel_in=256 * block.expansion,
+ channel_out=512 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ nn.AdaptiveAvgPool2d(1),
+ )
+ self.attention2 = nn.Sequential(
+ SepConv(
+ channel_in=128 * block.expansion,
+ channel_out=128 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ norm_layer(128 * block.expansion),
+ nn.ReLU(),
+ nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
+ nn.Sigmoid(),
+ )
+
+ if n_blocks > 2:
+ self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
+ self.middle_fc3 = nn.Linear(512 * block.expansion, num_classes)
+ # self.feature_fc3 = nn.Linear(512 * block.expansion, 512 * block.expansion)
+ self.scala3 = nn.Sequential(
+ SepConv(
+ channel_in=256 * block.expansion,
+ channel_out=512 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ nn.AdaptiveAvgPool2d(1),
+ )
+ self.attention3 = nn.Sequential(
+ SepConv(
+ channel_in=256 * block.expansion,
+ channel_out=256 * block.expansion,
+ norm_layer=norm_layer,
+ ),
+ norm_layer(256 * block.expansion),
+ nn.ReLU(),
+ nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
+ nn.Sigmoid(),
+ )
+
+ if n_blocks > 3:
+ self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
+ self.fc_layer = nn.Linear(512 * block.expansion, num_classes)
+ self.scala4 = nn.AdaptiveAvgPool2d(1)
+
+ for module in self.modules():
+ if isinstance(module, nn.Conv2d):
+ nn.init.kaiming_normal_(
+ module.weight, mode="fan_out", nonlinearity="relu"
+ )
+ elif isinstance(module, (nn.BatchNorm2d, nn.GroupNorm)):
+ nn.init.constant_(module.weight, 1)
+ nn.init.constant_(module.bias, 0)
+
+ def _make_layer(
+ self, block, planes, layers, stride=1, norm_layer=None
+ ): # pylint: disable=too-many-arguments
+ """Create a block with layers.
+
+        Args:
+            block (class): block type
+            planes (int): output channels = planes * expansion
+            layers (int): number of layers in the block
+            stride (int): stride of the first layer in the block.
+            norm_layer (class): type of normalization layer.
+ """
+ norm_layer = self.norm_layer
+ downsample = None
+ if stride != 1 or self.inplanes != planes * block.expansion:
+ downsample = nn.Sequential(
+ conv1x1(self.inplanes, planes * block.expansion, stride),
+ norm_layer(planes * block.expansion),
+ )
+ layer = []
+ layer.append(
+ block(
+ self.inplanes,
+ planes,
+ stride=stride,
+ downsample=downsample,
+ norm_layer=norm_layer,
+ )
+ )
+ self.inplanes = planes * block.expansion
+ for _i in range(1, layers):
+ layer.append(block(self.inplanes, planes, norm_layer=norm_layer))
+
+ return nn.Sequential(*layer)
+
+ def forward(self, x):
+ """Resnet forward."""
+ x = self.conv1(x)
+ x = self.bn1(x)
+ x = self.relu(x)
+ # x = self.maxpool(x)
+
+ x = self.layer1(x)
+ fea1 = self.attention1(x)
+ fea1 = fea1 * x
+ out1_feature = self.scala1(fea1).view(x.size(0), -1)
+ middle_output1 = self.middle_fc1(out1_feature)
+ # out1_feature = self.feature_fc1(out1_feature)
+
+ if self.n_blocks == 1:
+ return [middle_output1]
+
+ x = self.layer2(x)
+ fea2 = self.attention2(x)
+ fea2 = fea2 * x
+ out2_feature = self.scala2(fea2).view(x.size(0), -1)
+ middle_output2 = self.middle_fc2(out2_feature)
+ # out2_feature = self.feature_fc2(out2_feature)
+ if self.n_blocks == 2:
+ return [middle_output1, middle_output2]
+
+ x = self.layer3(x)
+ fea3 = self.attention3(x)
+ fea3 = fea3 * x
+ out3_feature = self.scala3(fea3).view(x.size(0), -1)
+ middle_output3 = self.middle_fc3(out3_feature)
+ # out3_feature = self.feature_fc3(out3_feature)
+
+ if self.n_blocks == 3:
+ return [middle_output1, middle_output2, middle_output3]
+
+ x = self.layer4(x)
+ out4_feature = self.scala4(x).view(x.size(0), -1)
+ output4 = self.fc_layer(out4_feature)
+
+ return [middle_output1, middle_output2, middle_output3, output4]
+
+
+def multi_resnet18(n_blocks=1, norm="bn", num_classes=100):
+ """Create resnet18 for HeteroFL.
+
+ Parameters
+ ----------
+ n_blocks: int
+ depth of network
+ norm: str
+ normalization layer type
+ num_classes: int
+ # of labels
+
+ Returns
+ -------
+ Callable [ [nn.Module,List[int],int,int,nn.Module], nn.Module]
+ """
+ if norm == "gn":
+ norm_layer = MyGroupNorm
+
+ elif norm == "bn":
+ norm_layer = MyBatchNorm
+
+ return MultiResnet(
+ BasicBlock,
+ [2, 2, 2, 2],
+ n_blocks,
+ num_classes=num_classes,
+ norm_layer=norm_layer,
+ )
+
+
+# if __name__ == "__main__":
+# from ptflops import get_model_complexity_info
+
+# model = multi_resnet18(n_blocks=4, num_classes=100)
+
+# with torch.cuda.device(0):
+# macs, params = get_model_complexity_info(
+# model,
+# (3, 32, 32),
+# as_strings=True,
+# print_per_layer_stat=False,
+# verbose=True,
+# units="MMac",
+# )
+
+# print("{:<30} {:<8}".format("Computational complexity: ", macs))
+# print("{:<30} {:<8}".format("Number of parameters: ", params))
diff --git a/baselines/depthfl/depthfl/resnet_hetero.py b/baselines/depthfl/depthfl/resnet_hetero.py
new file mode 100644
index 000000000000..a84c07b881b2
--- /dev/null
+++ b/baselines/depthfl/depthfl/resnet_hetero.py
@@ -0,0 +1,280 @@
+"""ResNet18 for HeteroFL."""
+
+import numpy as np
+import torch.nn as nn
+
+
+class Scaler(nn.Module):
+ """Scaler module for HeteroFL."""
+
+ def __init__(self, rate, scale):
+ super().__init__()
+ if scale:
+ self.rate = rate
+ else:
+ self.rate = 1
+
+ def forward(self, x):
+ """Scaler forward."""
+ output = x / self.rate if self.training else x
+ return output
+
+
+class MyBatchNorm(nn.Module):
+ """Static Batch Normalization for HeteroFL."""
+
+ def __init__(self, num_channels, track=True):
+ super().__init__()
+ self.norm = nn.BatchNorm2d(num_channels, track_running_stats=track)
+
+ def forward(self, x):
+ """BatchNorm forward."""
+ x = self.norm(x)
+ return x
+
+
+def conv3x3(in_planes, out_planes, stride=1):
+ """Convolution layer 3x3."""
+ return nn.Conv2d(
+ in_planes, out_planes, kernel_size=3, stride=stride, padding=1, bias=False
+ )
+
+
+def conv1x1(in_planes, planes, stride=1):
+ """Convolution layer 1x1."""
+ return nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride, bias=False)
+
+
+class BasicBlock(nn.Module): # pylint: disable=too-many-instance-attributes
+ """Basic Block for ResNet18."""
+
+ expansion = 1
+
+ def __init__( # pylint: disable=too-many-arguments
+ self,
+ inplanes,
+ planes,
+ stride=1,
+ scaler_rate=1,
+ downsample=None,
+ track=True,
+ scale=True,
+ ):
+ super().__init__()
+ self.conv1 = conv3x3(inplanes, planes, stride)
+ self.scaler = Scaler(scaler_rate, scale)
+ self.bn1 = MyBatchNorm(planes, track)
+ self.relu = nn.ReLU(inplace=True)
+ self.conv2 = conv3x3(planes, planes)
+ self.bn2 = MyBatchNorm(planes, track)
+ self.downsample = downsample
+ self.stride = stride
+
+ def forward(self, x):
+ """BasicBlock forward."""
+ residual = x
+
+ output = self.conv1(x)
+ output = self.scaler(output)
+ output = self.bn1(output)
+ output = self.relu(output)
+
+ output = self.conv2(output)
+ output = self.scaler(output)
+ output = self.bn2(output)
+
+ if self.downsample is not None:
+ residual = self.downsample(x)
+
+ output += residual
+ output = self.relu(output)
+ return output
+
+
+class Resnet(nn.Module): # pylint: disable=too-many-instance-attributes
+ """Resnet model."""
+
+ def __init__( # pylint: disable=too-many-arguments
+ self, hidden_size, block, layers, num_classes, scaler_rate, track, scale
+ ):
+ super().__init__()
+
+ self.inplanes = hidden_size[0]
+ self.norm_layer = MyBatchNorm
+ self.conv1 = nn.Conv2d(
+ 3, self.inplanes, kernel_size=3, stride=1, padding=1, bias=False
+ )
+ self.scaler = Scaler(scaler_rate, scale)
+ self.bn1 = self.norm_layer(self.inplanes, track)
+
+ self.relu = nn.ReLU(inplace=True)
+ # self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
+
+ self.layer1 = self._make_layer(
+ block,
+ hidden_size[0],
+ layers[0],
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+ self.layer2 = self._make_layer(
+ block,
+ hidden_size[1],
+ layers[1],
+ stride=2,
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+ self.layer3 = self._make_layer(
+ block,
+ hidden_size[2],
+ layers[2],
+ stride=2,
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+ self.layer4 = self._make_layer(
+ block,
+ hidden_size[3],
+ layers[3],
+ stride=2,
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+ self.fc_layer = nn.Linear(hidden_size[3] * block.expansion, num_classes)
+ self.scala = nn.AdaptiveAvgPool2d(1)
+
+ for module in self.modules():
+ if isinstance(module, nn.Conv2d):
+ nn.init.kaiming_normal_(
+ module.weight, mode="fan_out", nonlinearity="relu"
+ )
+ elif isinstance(module, (nn.BatchNorm2d, nn.GroupNorm)):
+ nn.init.constant_(module.weight, 1)
+ nn.init.constant_(module.bias, 0)
+
+ def _make_layer( # pylint: disable=too-many-arguments
+ self, block, planes, layers, stride=1, scaler_rate=1, track=True, scale=True
+ ):
+ """Create a block with layers.
+
+        Args:
+            block (class): block type
+            planes (int): output channels = planes * expansion
+            layers (int): number of layers in the block
+            stride (int): stride of the first layer in the block.
+            scaler_rate (float): rate for the Scaler module
+            track (bool): track running statistics (static BN if False)
+            scale (bool): whether to use the Scaler module.
+ """
+ norm_layer = self.norm_layer
+ downsample = None
+ if stride != 1 or self.inplanes != planes * block.expansion:
+ downsample = nn.Sequential(
+ conv1x1(self.inplanes, planes * block.expansion, stride),
+ norm_layer(planes * block.expansion, track),
+ )
+ layer = []
+ layer.append(
+ block(
+ self.inplanes,
+ planes,
+ stride=stride,
+ scaler_rate=scaler_rate,
+ downsample=downsample,
+ track=track,
+ scale=scale,
+ )
+ )
+ self.inplanes = planes * block.expansion
+ for _i in range(1, layers):
+ layer.append(
+ block(
+ self.inplanes,
+ planes,
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+ )
+
+ return nn.Sequential(*layer)
+
+ def forward(self, x):
+ """Resnet forward."""
+ x = self.conv1(x)
+ x = self.scaler(x)
+ x = self.bn1(x)
+ x = self.relu(x)
+ # x = self.maxpool(x)
+
+ x = self.layer1(x)
+ x = self.layer2(x)
+ x = self.layer3(x)
+ x = self.layer4(x)
+ out = self.scala(x).view(x.size(0), -1)
+ out = self.fc_layer(out)
+
+ return [out]
+
+
+def resnet18(n_blocks=4, track=False, scale=True, num_classes=100):
+ """Create resnet18 for HeteroFL.
+
+ Parameters
+ ----------
+ n_blocks: int
+ corresponds to width (divided by 4)
+ track: bool
+ static batch normalization
+ scale: bool
+ scaler module
+ num_classes: int
+ # of labels
+
+ Returns
+ -------
+ Callable [ [List[int],nn.Module,List[int],int,float,bool,bool], nn.Module]
+ """
+ # width pruning ratio : (0.25, 0.50, 0.75, 0.10)
+ model_rate = n_blocks / 4
+ classes_size = num_classes
+
+ hidden_size = [64, 128, 256, 512]
+ hidden_size = [int(np.ceil(model_rate * x)) for x in hidden_size]
+
+ scaler_rate = model_rate
+
+ return Resnet(
+ hidden_size,
+ BasicBlock,
+ [2, 2, 2, 2],
+ num_classes=classes_size,
+ scaler_rate=scaler_rate,
+ track=track,
+ scale=scale,
+ )
+
+
+# if __name__ == "__main__":
+# from ptflops import get_model_complexity_info
+
+# model = resnet18(n_blocks=4, num_classes=100)
+
+# with torch.cuda.device(0):
+# macs, params = get_model_complexity_info(
+# model,
+# (3, 32, 32),
+# as_strings=True,
+# print_per_layer_stat=False,
+# verbose=True,
+# units="MMac",
+# )
+
+# print("{:<30} {:<8}".format("Computational complexity: ", macs))
+# print("{:<30} {:<8}".format("Number of parameters: ", params))
diff --git a/baselines/depthfl/depthfl/server.py b/baselines/depthfl/depthfl/server.py
new file mode 100644
index 000000000000..dc99ae2fc5de
--- /dev/null
+++ b/baselines/depthfl/depthfl/server.py
@@ -0,0 +1,209 @@
+"""Server for DepthFL baseline."""
+
+import copy
+from collections import OrderedDict
+from logging import DEBUG, INFO
+from typing import Callable, Dict, List, Optional, Tuple, Union
+
+import torch
+from flwr.common import FitRes, Parameters, Scalar, parameters_to_ndarrays
+from flwr.common.logger import log
+from flwr.common.typing import NDArrays
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.server import Server, fit_clients
+from hydra.utils import instantiate
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+from depthfl.client import prune
+from depthfl.models import test, test_sbn
+from depthfl.strategy import aggregate_fit_depthfl
+from depthfl.strategy_hetero import aggregate_fit_hetero
+
+FitResultsAndFailures = Tuple[
+ List[Tuple[ClientProxy, FitRes]],
+ List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+]
+
+
+def gen_evaluate_fn(
+ testloader: DataLoader,
+ device: torch.device,
+ model: DictConfig,
+) -> Callable[
+ [int, NDArrays, Dict[str, Scalar]],
+ Tuple[float, Dict[str, Union[Scalar, List[float]]]],
+]:
+ """Generate the function for centralized evaluation.
+
+ Parameters
+ ----------
+ testloader : DataLoader
+ The dataloader to test the model with.
+ device : torch.device
+ The device to test the model on.
+ model : DictConfig
+ model configuration for instantiating
+
+ Returns
+ -------
+ Callable[ [int, NDArrays, Dict[str, Scalar]],
+ Optional[Tuple[float, Dict[str, Scalar]]] ]
+ The centralized evaluation function.
+ """
+
+ def evaluate(
+ server_round: int, parameters_ndarrays: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[float, Dict[str, Union[Scalar, List[float]]]]:
+ # pylint: disable=unused-argument
+ """Use the entire CIFAR-100 test set for evaluation."""
+ net = instantiate(model)
+ params_dict = zip(net.state_dict().keys(), parameters_ndarrays)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+ net.load_state_dict(state_dict, strict=True)
+ net.to(device)
+
+ loss, accuracy, accuracy_single = test(net, testloader, device=device)
+ # return statistics
+ return loss, {"accuracy": accuracy, "accuracy_single": accuracy_single}
+
+ return evaluate
+
+
+def gen_evaluate_fn_hetero(
+ trainloaders: List[DataLoader],
+ testloader: DataLoader,
+ device: torch.device,
+ model_cfg: DictConfig,
+) -> Callable[
+ [int, NDArrays, Dict[str, Scalar]],
+ Tuple[float, Dict[str, Union[Scalar, List[float]]]],
+]:
+ """Generate the function for centralized evaluation.
+
+ Parameters
+ ----------
+ trainloaders : List[DataLoader]
+ The list of dataloaders to calculate statistics for BN
+ testloader : DataLoader
+ The dataloader to test the model with.
+ device : torch.device
+ The device to test the model on.
+ model_cfg : DictConfig
+ model configuration for instantiating
+
+ Returns
+ -------
+ Callable[ [int, NDArrays, Dict[str, Scalar]],
+ Optional[Tuple[float, Dict[str, Scalar]]] ]
+ The centralized evaluation function.
+ """
+
+ def evaluate( # pylint: disable=too-many-locals
+ server_round: int, parameters_ndarrays: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[float, Dict[str, Union[Scalar, List[float]]]]:
+ # pylint: disable=unused-argument
+ """Use the entire CIFAR-100 test set for evaluation."""
+ # test per 50 rounds (sbn takes a long time)
+ if server_round % 50 != 0:
+ return 0.0, {"accuracy": 0.0, "accuracy_single": [0] * 4}
+
+ # models with different width
+ models = []
+ for i in range(4):
+ model_tmp = copy.deepcopy(model_cfg)
+ model_tmp.n_blocks = i + 1
+ models.append(model_tmp)
+
+ # load global parameters
+ param_idx_lst = []
+ nets = []
+ net_tmp = instantiate(models[-1], track=False)
+ for model in models:
+ net = instantiate(model, track=True, scale=False)
+ nets.append(net)
+ param_idx = {}
+ for k in net_tmp.state_dict().keys():
+ param_idx[k] = [
+ torch.arange(size) for size in net.state_dict()[k].shape
+ ]
+ param_idx_lst.append(param_idx)
+
+ params_dict = zip(net_tmp.state_dict().keys(), parameters_ndarrays)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+
+ for net, param_idx in zip(nets, param_idx_lst):
+ net.load_state_dict(prune(state_dict, param_idx), strict=False)
+ net.to(device)
+ net.train()
+
+ loss, accuracy, accuracy_single = test_sbn(
+ nets, trainloaders, testloader, device=device
+ )
+ # return statistics
+ return loss, {"accuracy": accuracy, "accuracy_single": accuracy_single}
+
+ return evaluate
+
+
+class ServerFedDyn(Server):
+ """Sever for FedDyn."""
+
+ def fit_round(
+ self,
+ server_round: int,
+ timeout: Optional[float],
+ ) -> Optional[
+ Tuple[Optional[Parameters], Dict[str, Scalar], FitResultsAndFailures]
+ ]:
+ """Perform a single round."""
+ # Get clients and their respective instructions from strategy
+ client_instructions = self.strategy.configure_fit(
+ server_round=server_round,
+ parameters=self.parameters,
+ client_manager=self._client_manager,
+ )
+
+ if not client_instructions:
+ log(INFO, "fit_round %s: no clients selected, cancel", server_round)
+ return None
+ log(
+ DEBUG,
+ "fit_round %s: strategy sampled %s clients (out of %s)",
+ server_round,
+ len(client_instructions),
+ self._client_manager.num_available(),
+ )
+
+ # Collect `fit` results from all clients participating in this round
+ results, failures = fit_clients(
+ client_instructions=client_instructions,
+ max_workers=self.max_workers,
+ timeout=timeout,
+ )
+ log(
+ DEBUG,
+ "fit_round %s received %s results and %s failures",
+ server_round,
+ len(results),
+ len(failures),
+ )
+
+ if "HeteroFL" in str(type(self.strategy)):
+ aggregate_fit = aggregate_fit_hetero
+ else:
+ aggregate_fit = aggregate_fit_depthfl
+
+ aggregated_result: Tuple[
+ Optional[Parameters],
+ Dict[str, Scalar],
+ ] = aggregate_fit(
+ self.strategy,
+ server_round,
+ results,
+ failures,
+ parameters_to_ndarrays(self.parameters),
+ )
+
+ parameters_aggregated, metrics_aggregated = aggregated_result
+ return parameters_aggregated, metrics_aggregated, (results, failures)
diff --git a/baselines/depthfl/depthfl/strategy.py b/baselines/depthfl/depthfl/strategy.py
new file mode 100644
index 000000000000..3414c28c4518
--- /dev/null
+++ b/baselines/depthfl/depthfl/strategy.py
@@ -0,0 +1,136 @@
+"""Strategy for DepthFL."""
+
+import os
+import pickle
+from logging import WARNING
+from typing import Dict, List, Optional, Tuple, Union
+
+import numpy as np
+import torch
+import torch.nn as nn
+from flwr.common import (
+ NDArrays,
+ Parameters,
+ Scalar,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from flwr.common.logger import log
+from flwr.common.typing import FitRes
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.strategy import FedAvg
+from omegaconf import DictConfig
+
+
+class FedDyn(FedAvg):
+ """Applying dynamic regularization in FedDyn paper."""
+
+ def __init__(self, cfg: DictConfig, net: nn.Module, *args, **kwargs):
+ self.cfg = cfg
+ self.h_variate = [np.zeros(v.shape) for (k, v) in net.state_dict().items()]
+
+ # tagging real weights / biases
+ self.is_weight = []
+ for k in net.state_dict().keys():
+ if "weight" not in k and "bias" not in k:
+ self.is_weight.append(False)
+ else:
+ self.is_weight.append(True)
+
+ # prev_grads file for each client
+ prev_grads = [
+ {k: torch.zeros(v.numel()) for (k, v) in net.named_parameters()}
+ ] * cfg.num_clients
+
+ if not os.path.exists("prev_grads"):
+ os.makedirs("prev_grads")
+
+ for idx in range(cfg.num_clients):
+ with open(f"prev_grads/client_{idx}", "wb") as prev_grads_file:
+ pickle.dump(prev_grads[idx], prev_grads_file)
+
+ super().__init__(*args, **kwargs)
+
+
+def aggregate_fit_depthfl(
+ strategy,
+ server_round: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ origin: NDArrays,
+) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
+ """Aggregate fit results using weighted average."""
+ if not results:
+ return None, {}
+ # Do not aggregate if there are failures and failures are not accepted
+ if not strategy.accept_failures and failures:
+ return None, {}
+
+ # Convert results
+ weights_results = [
+ (parameters_to_ndarrays(fit_res.parameters), fit_res.num_examples)
+ for _, fit_res in results
+ ]
+ parameters_aggregated = ndarrays_to_parameters(
+ aggregate(
+ weights_results,
+ origin,
+ strategy.h_variate,
+ strategy.is_weight,
+ strategy.cfg,
+ )
+ )
+
+ # Aggregate custom metrics if aggregation fn was provided
+ metrics_aggregated = {}
+ if strategy.fit_metrics_aggregation_fn:
+ fit_metrics = [(res.num_examples, res.metrics) for _, res in results]
+ metrics_aggregated = strategy.fit_metrics_aggregation_fn(fit_metrics)
+ elif server_round == 1: # Only log this warning once
+ log(WARNING, "No fit_metrics_aggregation_fn provided")
+
+ return parameters_aggregated, metrics_aggregated
+
+
+def aggregate(
+ results: List[Tuple[NDArrays, int]],
+ origin: NDArrays,
+ h_list: List,
+ is_weight: List,
+ cfg: DictConfig,
+) -> NDArrays:
+ """Aggregate model parameters with different depths."""
+ param_count = [0] * len(origin)
+ weights_sum = [np.zeros(v.shape) for v in origin]
+
+ # summation & counting of parameters
+ for parameters, _ in results:
+ for i, layer in enumerate(parameters):
+ weights_sum[i] += layer
+ param_count[i] += 1
+
+ # update parameters
+ for i, weight in enumerate(weights_sum):
+ if param_count[i] > 0:
+ weight = weight / param_count[i]
+
+ # update h variable for FedDyn
+ h_list[i] = (
+ h_list[i]
+ - cfg.fit_config.alpha
+ * param_count[i]
+ * (weight - origin[i])
+ / cfg.num_clients
+ )
+
+ # applying h only for weights / biases
+ if is_weight[i] and cfg.fit_config.feddyn:
+ weights_sum[i] = weight - h_list[i] / cfg.fit_config.alpha
+ else:
+ weights_sum[i] = weight
+
+ else:
+ weights_sum[i] = origin[i]
+
+ return weights_sum
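The `aggregate` function above averages each layer only over the clients that actually hold it; layers no sampled client covers keep their previous value. A dependency-free sketch of that per-layer counting, with scalars standing in for parameter tensors (client depths and values are made-up assumptions):

```python
# Sketch of DepthFL's layer-wise averaging: each layer is averaged only
# over the clients whose (shallower) model contains it. Scalars stand in
# for parameter tensors; client values are made-up assumptions.

def aggregate_depthwise(client_params, origin):
    """client_params: list of per-client layer lists; shorter = shallower."""
    n_layers = len(origin)
    sums = [0.0] * n_layers
    counts = [0] * n_layers
    for params in client_params:
        for i, layer in enumerate(params):
            sums[i] += layer
            counts[i] += 1
    # layers no client holds keep their previous (origin) value
    return [
        sums[i] / counts[i] if counts[i] > 0 else origin[i]
        for i in range(n_layers)
    ]

clients = [[1.0, 1.0], [3.0, 3.0, 3.0]]  # a depth-2 and a depth-3 client
print(aggregate_depthwise(clients, origin=[0.0, 0.0, 0.0, 9.0]))
```

The real function additionally folds in the FedDyn `h` correction for weight/bias entries; the counting-and-fallback structure is the same.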
diff --git a/baselines/depthfl/depthfl/strategy_hetero.py b/baselines/depthfl/depthfl/strategy_hetero.py
new file mode 100644
index 000000000000..7544204cde2f
--- /dev/null
+++ b/baselines/depthfl/depthfl/strategy_hetero.py
@@ -0,0 +1,136 @@
+"""Strategy for HeteroFL."""
+
+import os
+import pickle
+from logging import WARNING
+from typing import Dict, List, Optional, Tuple, Union
+
+import numpy as np
+import torch
+import torch.nn as nn
+from flwr.common import (
+ NDArrays,
+ Parameters,
+ Scalar,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from flwr.common.logger import log
+from flwr.common.typing import FitRes
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.strategy import FedAvg
+from hydra.utils import instantiate
+from omegaconf import DictConfig
+
+
+class HeteroFL(FedAvg):
+ """Custom FedAvg for HeteroFL."""
+
+ def __init__(self, cfg: DictConfig, net: nn.Module, *args, **kwargs):
+ self.cfg = cfg
+ self.parameters = [np.zeros(v.shape) for (k, v) in net.state_dict().items()]
+ self.param_idx_lst = []
+
+ model = cfg.model
+ # store parameter shapes of different width
+ for i in range(4):
+ model.n_blocks = i + 1
+ net_tmp = instantiate(model)
+ param_idx = []
+ for k in net_tmp.state_dict().keys():
+ param_idx.append(
+ [torch.arange(size) for size in net_tmp.state_dict()[k].shape]
+ )
+
+ self.param_idx_lst.append(param_idx)
+
+ self.is_weight = []
+
+ # tagging real weights / biases
+ for k in net.state_dict().keys():
+ if "num" in k:
+ self.is_weight.append(False)
+ else:
+ self.is_weight.append(True)
+
+ # prev_grads file for each client
+ prev_grads = [
+ {k: torch.zeros(v.numel()) for (k, v) in net.named_parameters()}
+ ] * cfg.num_clients
+
+ if not os.path.exists("prev_grads"):
+ os.makedirs("prev_grads")
+
+ for idx in range(cfg.num_clients):
+ with open(f"prev_grads/client_{idx}", "wb") as prev_grads_file:
+ pickle.dump(prev_grads[idx], prev_grads_file)
+
+ super().__init__(*args, **kwargs)
+
+ def aggregate_hetero(
+ self, results: List[Tuple[NDArrays, Union[bool, bytes, float, int, str]]]
+ ):
+ """Aggregate function for HeteroFL."""
+ for i, params in enumerate(self.parameters):
+ count = np.zeros(params.shape)
+ tmp_v = np.zeros(params.shape)
+ if self.is_weight[i]:
+ for weights, cid in results:
+ if self.cfg.exclusive_learning:
+ cid = self.cfg.model_size * (self.cfg.num_clients // 4) - 1
+
+ tmp_v[
+ torch.meshgrid(
+ self.param_idx_lst[cid // (self.cfg.num_clients // 4)][i]
+ )
+ ] += weights[i]
+ count[
+ torch.meshgrid(
+ self.param_idx_lst[cid // (self.cfg.num_clients // 4)][i]
+ )
+ ] += 1
+ tmp_v[count > 0] = np.divide(tmp_v[count > 0], count[count > 0])
+ params[count > 0] = tmp_v[count > 0]
+
+            else:
+                for weights, _ in results:
+                    tmp_v += weights[i]
+                    count += 1
+                tmp_v = np.divide(tmp_v, count)
+                # assign in place so that self.parameters is actually updated
+                params[...] = tmp_v
+
+
+def aggregate_fit_hetero(
+ strategy,
+ server_round: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ origin: NDArrays,
+) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
+ """Aggregate fit results using weighted average."""
+ if not results:
+ return None, {}
+ # Do not aggregate if there are failures and failures are not accepted
+ if not strategy.accept_failures and failures:
+ return None, {}
+
+ # Convert results
+ weights_results = [
+ (parameters_to_ndarrays(fit_res.parameters), fit_res.metrics["cid"])
+ for _, fit_res in results
+ ]
+
+ strategy.parameters = origin
+ strategy.aggregate_hetero(weights_results)
+ parameters_aggregated = ndarrays_to_parameters(strategy.parameters)
+
+ # Aggregate custom metrics if aggregation fn was provided
+ metrics_aggregated = {}
+ if strategy.fit_metrics_aggregation_fn:
+ fit_metrics = [(res.num_examples, res.metrics) for _, res in results]
+ metrics_aggregated = strategy.fit_metrics_aggregation_fn(fit_metrics)
+ elif server_round == 1: # Only log this warning once
+ log(WARNING, "No fit_metrics_aggregation_fn provided")
+
+ return parameters_aggregated, metrics_aggregated
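`aggregate_hetero` above averages overlapping top-left sub-blocks: each client updates only the slice of a global weight matrix matching its width, and every entry is averaged over the clients covering it. A minimal sketch with plain nested lists in place of the `torch.meshgrid` indexing (sizes and values are made-up assumptions):

```python
# Sketch of HeteroFL width-wise aggregation: each client holds the top-left
# sub-block of a global weight matrix; entries are averaged over the clients
# that cover them, and uncovered entries keep the global value.

def aggregate_widthwise(global_mat, client_mats):
    n = len(global_mat)
    sums = [[0.0] * n for _ in range(n)]
    counts = [[0] * n for _ in range(n)]
    for mat in client_mats:
        k = len(mat)  # this client's width
        for r in range(k):
            for c in range(k):
                sums[r][c] += mat[r][c]
                counts[r][c] += 1
    out = [row[:] for row in global_mat]
    for r in range(n):
        for c in range(n):
            if counts[r][c] > 0:
                out[r][c] = sums[r][c] / counts[r][c]
    return out

small = [[1.0]]                     # width-1 client
big = [[3.0, 3.0], [3.0, 3.0]]      # width-2 client
print(aggregate_widthwise([[0.0, 0.0], [0.0, 9.0]], [small, big]))
```

The entry covered by both clients is averaged over both; entries only the wide client covers take its value unchanged, mirroring the `count > 0` masking in the real code.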
diff --git a/baselines/depthfl/depthfl/utils.py b/baselines/depthfl/depthfl/utils.py
new file mode 100644
index 000000000000..fad2afcad4be
--- /dev/null
+++ b/baselines/depthfl/depthfl/utils.py
@@ -0,0 +1,66 @@
+"""Contains utility functions for CNN FL on MNIST."""
+
+import pickle
+from pathlib import Path
+from secrets import token_hex
+from typing import Dict, Union
+
+from flwr.server.history import History
+
+
+def save_results_as_pickle(
+ history: History,
+ file_path: Union[str, Path],
+ extra_results: Dict,
+ default_filename: str = "results.pkl",
+) -> None:
+ """Save results from simulation to pickle.
+
+ Parameters
+ ----------
+ history: History
+ History returned by start_simulation.
+    file_path: Union[str, Path]
+        Path to file to create and store both history and extra_results.
+        If the path is a directory, the default_filename will be used.
+        If the path doesn't exist, it will be created. If the file
+        already exists, a randomly generated suffix will be added to the
+        file name to avoid overwriting results.
+    extra_results : Dict
+        A dictionary containing additional results you would like
+        to be saved to disk. Default: {} (an empty dictionary)
+    default_filename: Optional[str]
+        File used by default if file_path points to a directory instead
+        of a file. Default: "results.pkl"
+ """
+ path = Path(file_path)
+
+ # ensure path exists
+ path.mkdir(exist_ok=True, parents=True)
+
+ def _add_random_suffix(path_: Path):
+ """Add a randomly generated suffix to the file name."""
+ print(f"File `{path_}` exists! ")
+ suffix = token_hex(4)
+ print(f"New results to be saved with suffix: {suffix}")
+ return path_.parent / (path_.stem + "_" + suffix + ".pkl")
+
+ def _complete_path_with_default_name(path_: Path):
+ """Append the default file name to the path."""
+ print("Using default filename")
+ return path_ / default_filename
+
+ if path.is_dir():
+ path = _complete_path_with_default_name(path)
+
+ if path.is_file():
+ # file exists already
+ path = _add_random_suffix(path)
+
+ print(f"Results will be saved into: {path}")
+
+ data = {"history": history, **extra_results}
+
+ # save results to pickle
+ with open(str(path), "wb") as handle:
+ pickle.dump(data, handle, protocol=pickle.HIGHEST_PROTOCOL)
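The collision-avoidance step in `save_results_as_pickle` above derives a sibling filename with a random hex suffix. A standalone sketch of that `pathlib` + `secrets` manipulation (the `outputs/results.pkl` path is an illustrative assumption):

```python
from pathlib import Path
from secrets import token_hex

# Sketch of the suffixing logic in save_results_as_pickle: build a new
# sibling path with a random 8-hex-char suffix before the extension.

def with_random_suffix(path: Path) -> Path:
    """Return path with a random suffix inserted before the extension."""
    suffix = token_hex(4)  # 4 bytes -> 8 hex characters
    return path.parent / (path.stem + "_" + suffix + ".pkl")

new_path = with_random_suffix(Path("outputs/results.pkl"))
print(new_path)
```

Because `token_hex(4)` draws from a cryptographic source, two simulation runs writing into the same directory are vanishingly unlikely to pick the same filename.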
diff --git a/baselines/depthfl/pyproject.toml b/baselines/depthfl/pyproject.toml
new file mode 100644
index 000000000000..2f928c2d3553
--- /dev/null
+++ b/baselines/depthfl/pyproject.toml
@@ -0,0 +1,141 @@
+[build-system]
+requires = ["poetry-core>=1.4.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "depthfl" # <----- Ensure it matches the name of your baseline directory containing all the source code
+version = "1.0.0"
+description = "DepthFL: Depthwise Federated Learning for Heterogeneous Clients"
+license = "Apache-2.0"
+authors = ["Minjae Kim "]
+readme = "README.md"
+homepage = "https://flower.dev"
+repository = "https://github.com/adap/flower"
+documentation = "https://flower.dev"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "License :: OSI Approved :: Apache Software License",
+ "Operating System :: MacOS :: MacOS X",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+ "Topic :: Scientific/Engineering :: Mathematics",
+ "Topic :: Software Development",
+ "Topic :: Software Development :: Libraries",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Typing :: Typed",
+]
+
+[tool.poetry.dependencies]
+python = ">=3.10.0, <3.11.0"
+flwr = { extras = ["simulation"], version = "1.5.0" }
+hydra-core = "1.3.2" # don't change this
+matplotlib = "3.7.1"
+torch = { url = "https://download.pytorch.org/whl/cu116/torch-1.13.1%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+torchvision = { url = "https://download.pytorch.org/whl/cu116/torchvision-0.14.1%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+
+
+[tool.poetry.dev-dependencies]
+isort = "==5.11.5"
+black = "==23.1.0"
+docformatter = "==1.5.1"
+mypy = "==1.4.1"
+pylint = "==2.8.2"
+flake8 = "==3.9.2"
+pytest = "==6.2.4"
+pytest-watch = "==4.2.0"
+ruff = "==0.0.272"
+types-requests = "==2.27.7"
+
+[tool.isort]
+line_length = 88
+indent = " "
+multi_line_output = 3
+include_trailing_comma = true
+force_grid_wrap = 0
+use_parentheses = true
+
+[tool.black]
+line-length = 88
+target-version = ["py38", "py39", "py310", "py311"]
+
+[tool.pytest.ini_options]
+minversion = "6.2"
+addopts = "-qq"
+testpaths = [
+ "flwr_baselines",
+]
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = false
+plugins = "numpy.typing.mypy_plugin"
+
+[tool.pylint."MESSAGES CONTROL"]
+disable = "bad-continuation,duplicate-code,too-few-public-methods,useless-import-alias"
+good-names = "i,j,k,_,x,y,X,Y"
+signature-mutators="hydra.main.main"
+
+[tool.pylint.typecheck]
+generated-members="numpy.*, torch.*, tensorflow.*"
+
+[[tool.mypy.overrides]]
+module = [
+ "importlib.metadata.*",
+ "importlib_metadata.*",
+]
+follow_imports = "skip"
+follow_imports_for_stubs = true
+disallow_untyped_calls = false
+
+[[tool.mypy.overrides]]
+module = "torch.*"
+follow_imports = "skip"
+follow_imports_for_stubs = true
+
+[tool.docformatter]
+wrap-summaries = 88
+wrap-descriptions = 88
+
+[tool.ruff]
+target-version = "py38"
+line-length = 88
+select = ["D", "E", "F", "W", "B", "ISC", "C4"]
+fixable = ["D", "E", "F", "W", "B", "ISC", "C4"]
+ignore = ["B024", "B027"]
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".pytype",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+ "proto",
+]
+
+[tool.ruff.pydocstyle]
+convention = "numpy"
diff --git a/baselines/fedper/LICENSE b/baselines/fedper/LICENSE
new file mode 100644
index 000000000000..d64569567334
--- /dev/null
+++ b/baselines/fedper/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/baselines/fedper/README.md b/baselines/fedper/README.md
new file mode 100644
index 000000000000..157bc22d2da5
--- /dev/null
+++ b/baselines/fedper/README.md
@@ -0,0 +1,152 @@
+---
+title: Federated Learning with Personalization Layers
+url: https://arxiv.org/abs/1912.00818
+labels: [system heterogeneity, image classification, personalization, horizontal data partition]
+dataset: [CIFAR-10, FLICKR-AES]
+---
+
+# Federated Learning with Personalization Layers
+
+> Note: If you use this baseline in your work, please remember to cite the original authors of the paper as well as the Flower paper.
+
+**Paper:** [arxiv.org/abs/1912.00818](https://arxiv.org/abs/1912.00818)
+
+**Authors:** Manoj Ghuhan Arivazhagan, Vinay Aggarwal, Aaditya Kumar Singh, and Sunav Choudhary
+
+**Abstract:** The emerging paradigm of federated learning strives to enable collaborative training of machine learning models on the network edge without centrally aggregating raw data and hence, improving data privacy. This sharply deviates from traditional machine learning and necessitates design of algorithms robust to various sources of heterogeneity. Specifically, statistical heterogeneity of data across user devices can severely degrade performance of standard federated averaging for traditional machine learning applications like personalization with deep learning. This paper proposes `FedPer`, a base + personalization layer approach for federated training of deep feed forward neural networks, which can combat the ill-effects of statistical heterogeneity. We demonstrate effectiveness of `FedPer` for non-identical data partitions of CIFAR datasets and on a personalized image aesthetics dataset from Flickr.
+
+## About this baseline
+
+**What’s implemented:** The code in this directory replicates the experiments in _Federated Learning with Personalization Layers_ (Arivazhagan et al., 2019) for the CIFAR10 and FLICKR-AES datasets, which proposed the `FedPer` model. Specifically, it replicates the results found in figures 2, 4, 7, and 8 of their paper. __Note__ that there is a typo in the caption of Figure 4 in the article: it should be CIFAR10 and __not__ CIFAR100.
+
+**Datasets:** CIFAR10 from PyTorch's Torchvision and FLICKR-AES. FLICKR-AES was proposed as a dataset in _Personalized Image Aesthetics_ (Ren et al., 2017) and can be downloaded using a link provided on their [GitHub](https://github.com/alanspike/personalizedImageAesthetics). One must first download FLICKR-AES-001.zip (5.76GB), extract its contents, and place them in baseline/FedPer/datasets. To this location, also download the other 2 related files: (1) FLICKR-AES_image_labeled_by_each_worker.csv, and (2) FLICKR-AES_image_score.txt. Images are scaled to 224x224 for both datasets. This is not explicitly stated in the paper but seems to boost performance. For the FLICKR dataset, the paper states that they use data from clients with more than 60 and fewer than 290 rated images. This amounts to circa 60 clients, out of which we randomly select 30 (as in the paper). Therefore, the results might differ, but only slightly. Since the pre-processing steps in the paper are somewhat obscure, the metric values in the plots below may differ slightly, but not the overall results and findings.
+
+```bash
+# These steps are not needed if you are only interested in CIFAR-10
+
+# Create the `datasets` directory if it doesn't exist already
+mkdir datasets
+
+# move/copy the downloaded FLICKR-AES-001.zip file to `datasets/`
+
+# unzip dataset to a directory named `flickr`
+cd datasets
+unzip FLICKR-AES-001.zip -d flickr
+
+# then move the .csv files inside flickr
+mv FLICKR-AES_image_labeled_by_each_worker.csv flickr
+mv FLICKR-AES_image_score.txt flickr
+```
+
+**Hardware Setup:** Experiments have been carried out on GPU. Two different machines were used to run the experiments:
+
+- GeForce RTX 3080 16GB
+- GeForce RTX 4090 24GB
+
+It's worth mentioning that the GPU memory required for each client is ~7.5GB. When training on powerful GPUs, one can reduce the fraction of a GPU allocated to each client in the configuration, e.g. by setting `num_gpus` to 0.33.
+
+> NOTE: One experiment carried out using 1 GPU (RTX 4090) takes somewhere between 1-3h depending on dataset and model. Running ResNet34 compared to MobileNet-v1 takes approximately 10-15% longer.
+
+**Contributors:** [William Lindskog](https://github.com/WilliamLindskog)
+
+
+## Experimental Setup
+
+**Task:** Image Classification
+
+**Model:** This directory implements 2 models:
+
+- ResNet34, which can be imported directly (after having installed the packages) from PyTorch, using `from torchvision.models import resnet34`
+- MobileNet-v1
+
+Please see how models are implemented using so-called model manager and model split classes, since FedPer uses separate head and base layers in a neural network. These classes are defined in the models.py file and are then used when building new models in the directory /implemented_models. Please extend and add new models as you wish.
+
+**Dataset:** CIFAR10, FLICKR-AES. CIFAR10 will be partitioned based on the number of classes each client shall receive, e.g. 4 allocated classes could be [1, 3, 5, 9]. FLICKR-AES is an unbalanced dataset, so there we only apply random sampling.
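As a rough illustration of this kind of class-based non-IID split (this is a sketch, not the baseline's actual partitioning code; all names are illustrative):

```python
import random
from collections import defaultdict


def partition_by_classes(labels, num_clients, classes_per_client, num_classes=10, seed=0):
    """Assign each client a random subset of classes, then split each
    class's sample indices among the clients that hold that class."""
    rng = random.Random(seed)
    # Indices of the samples belonging to each class
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)
    # Each client draws `classes_per_client` distinct classes
    client_classes = [
        rng.sample(range(num_classes), classes_per_client) for _ in range(num_clients)
    ]
    # Which clients hold each class
    holders = defaultdict(list)
    for cid, classes in enumerate(client_classes):
        for c in classes:
            holders[c].append(cid)
    # Deal each class's (shuffled) samples round-robin among its holders
    client_indices = defaultdict(list)
    for c, idxs in by_class.items():
        if not holders[c]:
            continue  # class drawn by no client
        rng.shuffle(idxs)
        for i, idx in enumerate(idxs):
            client_indices[holders[c][i % len(holders[c])]].append(idx)
    return client_classes, dict(client_indices)
```

Each client ends up with samples only from its allocated classes, mirroring the "4 allocated classes could be [1, 3, 5, 9]" example above.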
+
+**Training Hyperparameters:** The hyperparameters can be found in the conf/base.yaml file, which is the configuration file for the main script.
+
+| Description | Default Value |
+| ----------- | ----- |
+| num_clients | 10 |
+| clients per round | 10 |
+| number of rounds | 50 |
+| client resources | {'num_cpus': 4, 'num_gpus': 1 }|
+| learning_rate | 0.01 |
+| batch_size | 128 |
+| optimizer | SGD |
+| algorithm | fedavg|
+
+**Stateful Clients:**
+In this baseline (FedPer), we must store the state of the local client head while aggregation of the body parameters happens at the server. Flower is working on making this possible natively, but for the time being we resort to storing each client's _head_ state in a folder called client_states. We store the values after each fit and evaluate function carried out on each client, and load the state before executing these functions. The state of a unique client is accessed using the client ID.
+
+> NOTE: This is a workaround so that the local head parameters are not reset before each fit and evaluate. Nevertheless, this may change with future releases.
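The per-client state handling described above could be sketched roughly as follows. The helper names are illustrative, and plain pickle stands in here for however the baseline actually serializes the torch head state:

```python
import pickle
from pathlib import Path


def save_head_state(state_dir, client_id, head_state):
    """Persist a client's head parameters after fit/evaluate."""
    path = Path(state_dir)
    path.mkdir(parents=True, exist_ok=True)
    with open(path / f"client_{client_id}.pkl", "wb") as f:
        pickle.dump(head_state, f)


def load_head_state(state_dir, client_id):
    """Load the head parameters before fit/evaluate; None on the first round."""
    path = Path(state_dir) / f"client_{client_id}.pkl"
    if not path.exists():
        return None
    with open(path, "rb") as f:
        return pickle.load(f)
```

Keying the file name on the client ID is what makes the state survive across rounds even though Flower recreates the client object each time.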
+
+
+## Environment Setup
+
+To construct the Python environment follow these steps:
+
+```bash
+# Set Python 3.10
+pyenv local 3.10.6
+# Tell poetry to use python 3.10
+poetry env use 3.10.6
+
+# Install the base Poetry environment
+poetry install
+
+# Activate the environment
+poetry shell
+```
+
+## Running the Experiments
+```bash
+python -m fedper.main # this will run using the default settings in the `conf/base.yaml`
+
+# When running models for the flickr dataset, it is important to keep the batch size at 4 or lower, since some clients (for reproducing the experiment) will have very few examples of one class
+```
+
+While the config files contain a large number of settings, the ones below are the main ones you'd likely want to modify.
+```bash
+algorithm: fedavg, fedper # these are currently supported
+server_device: 'cuda:0', 'cpu'
+dataset.name: 'cifar10', 'flickr'
+num_classes: 10, 5 # respectively
+dataset.num_classes: 4, 8, 10 # for non-iid split assigning n num_classes to each client (these numbers for CIFAR10 experiments)
+model_name: mobile, resnet
+```
+
+To launch multiple runs, one can also use Hydra's multirun option.
+```bash
+# for CIFAR10
+python -m fedper.main --multirun --config_name cifar10 dataset.num_classes=4,8,10 model_name=resnet,mobile algorithm=fedper,fedavg model.num_head_layers=2,3
+
+# to repeat each run 5 times, one can also add
+python -m fedper.main --multirun --config_name cifar10 dataset.num_classes=4,8,10 model_name=resnet,mobile algorithm=fedper,fedavg model.num_head_layers=2,3 '+repeat_num=range(5)'
+```
+
+
+## Expected Results
+
+To reproduce the figures, make `fedper/run_figures.sh` executable and run it. By default, all experiments will be run:
+
+```bash
+# Make fedper/run_figures.sh executable
+chmod u+x fedper/run_figures.sh
+# Run the script
+bash fedper/run_figures.sh
+```
+
+Having run the `run_figures.sh`, the expected results should look something like this:
+
+**MobileNet-v1 and ResNet-34 on CIFAR10**
+
+
+
+**MobileNet-v1 and ResNet-34 on CIFAR10 using varying size of head**
+
+
+
+**MobileNet-v1 and ResNet-34 on FLICKR-AES**
+
+
\ No newline at end of file
diff --git a/baselines/fedper/_static/mobile_plot_figure_2.png b/baselines/fedper/_static/mobile_plot_figure_2.png
new file mode 100644
index 000000000000..b485b850fb39
Binary files /dev/null and b/baselines/fedper/_static/mobile_plot_figure_2.png differ
diff --git a/baselines/fedper/_static/mobile_plot_figure_flickr.png b/baselines/fedper/_static/mobile_plot_figure_flickr.png
new file mode 100644
index 000000000000..76e99927df36
Binary files /dev/null and b/baselines/fedper/_static/mobile_plot_figure_flickr.png differ
diff --git a/baselines/fedper/_static/mobile_plot_figure_num_head.png b/baselines/fedper/_static/mobile_plot_figure_num_head.png
new file mode 100644
index 000000000000..9dcb9f0a3f33
Binary files /dev/null and b/baselines/fedper/_static/mobile_plot_figure_num_head.png differ
diff --git a/baselines/fedper/_static/resnet_plot_figure_2.png b/baselines/fedper/_static/resnet_plot_figure_2.png
new file mode 100644
index 000000000000..14e3a7145a23
Binary files /dev/null and b/baselines/fedper/_static/resnet_plot_figure_2.png differ
diff --git a/baselines/fedper/_static/resnet_plot_figure_flickr.png b/baselines/fedper/_static/resnet_plot_figure_flickr.png
new file mode 100644
index 000000000000..4e6ba71489b7
Binary files /dev/null and b/baselines/fedper/_static/resnet_plot_figure_flickr.png differ
diff --git a/baselines/fedper/_static/resnet_plot_figure_num_head.png b/baselines/fedper/_static/resnet_plot_figure_num_head.png
new file mode 100644
index 000000000000..03c6ac88b84a
Binary files /dev/null and b/baselines/fedper/_static/resnet_plot_figure_num_head.png differ
diff --git a/baselines/fedper/fedper/__init__.py b/baselines/fedper/fedper/__init__.py
new file mode 100644
index 000000000000..a5e567b59135
--- /dev/null
+++ b/baselines/fedper/fedper/__init__.py
@@ -0,0 +1 @@
+"""FedPer baseline package."""
diff --git a/baselines/fedper/fedper/client.py b/baselines/fedper/fedper/client.py
new file mode 100644
index 000000000000..83babbd9613f
--- /dev/null
+++ b/baselines/fedper/fedper/client.py
@@ -0,0 +1,353 @@
+"""Client implementation - can call FedPer and FedAvg clients."""
+import pickle
+from collections import OrderedDict, defaultdict
+from pathlib import Path
+from typing import Any, Callable, Dict, List, Tuple, Type, Union
+
+import numpy as np
+import torch
+from flwr.client import NumPyClient
+from flwr.common import NDArrays, Scalar
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader, Subset, random_split
+from torchvision import transforms
+from torchvision.datasets import ImageFolder
+
+from fedper.constants import MEAN, STD
+from fedper.dataset_preparation import call_dataset
+from fedper.implemented_models.mobile_model import MobileNetModelManager
+from fedper.implemented_models.resnet_model import ResNetModelManager
+
+PROJECT_DIR = Path(__file__).parent.parent.absolute()
+
+
+class ClientDataloaders:
+ """Client dataloaders."""
+
+ def __init__(
+ self,
+ trainloader: DataLoader,
+ testloader: DataLoader,
+ ) -> None:
+ """Initialize the client dataloaders."""
+ self.trainloader = trainloader
+ self.testloader = testloader
+
+
+class ClientEssentials:
+ """Client essentials."""
+
+ def __init__(
+ self,
+ client_id: str,
+ client_state_save_path: str = "",
+ ) -> None:
+ """Set client state save path and client ID."""
+ self.client_id = int(client_id)
+ self.client_state_save_path = (
+ (client_state_save_path + f"/client_{self.client_id}")
+ if client_state_save_path != ""
+ else None
+ )
+
+
+class BaseClient(NumPyClient):
+ """Implementation of Federated Averaging (FedAvg) Client."""
+
+ def __init__(
+ self,
+ data_loaders: ClientDataloaders,
+ config: DictConfig,
+ client_essentials: ClientEssentials,
+ model_manager_class: Union[
+ Type[MobileNetModelManager], Type[ResNetModelManager]
+ ],
+ ):
+        """Initialize client attributes.
+
+        Args:
+            data_loaders: train and test dataloaders for the client.
+            config: dictionary containing the client configurations.
+            client_essentials: client ID and state save path for the client.
+            model_manager_class: class to be used as the model manager.
+        """
+ super().__init__()
+
+ self.train_id = 1
+ self.test_id = 1
+ self.client_id = int(client_essentials.client_id)
+ self.client_state_save_path = client_essentials.client_state_save_path
+ self.hist: Dict[str, Dict[str, Any]] = defaultdict(dict)
+ self.num_epochs: int = config["num_epochs"]
+ self.model_manager = model_manager_class(
+ client_id=self.client_id,
+ config=config,
+ trainloader=data_loaders.trainloader,
+ testloader=data_loaders.testloader,
+ client_save_path=self.client_state_save_path,
+ learning_rate=config["learning_rate"],
+ )
+
+ def get_parameters(self, config: Dict[str, Scalar]) -> NDArrays:
+ """Return the current local model parameters."""
+ return self.model_manager.model.get_parameters()
+
+ def set_parameters(
+ self, parameters: List[np.ndarray], evaluate: bool = False
+ ) -> None:
+ """Set the local model parameters to the received parameters.
+
+ Args:
+ parameters: parameters to set the model to.
+ """
+ _ = evaluate
+ model_keys = [
+ k
+ for k in self.model_manager.model.state_dict().keys()
+ if k.startswith("_body") or k.startswith("_head")
+ ]
+ params_dict = zip(model_keys, parameters)
+
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+
+ self.model_manager.model.set_parameters(state_dict)
+
+ def perform_train(
+ self,
+ ) -> Dict[str, Union[List[Dict[str, float]], int, float]]:
+ """Perform local training to the whole model.
+
+ Returns
+ -------
+ Dict with the train metrics.
+ """
+ epochs = self.num_epochs
+
+ self.model_manager.model.enable_body()
+ self.model_manager.model.enable_head()
+
+ return self.model_manager.train(
+ epochs=epochs,
+ )
+
+ def fit(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[NDArrays, int, Dict[str, Union[bool, bytes, float, int, str]]]:
+ """Train the provided parameters using the locally held dataset.
+
+ Args:
+ parameters: The current (global) model parameters.
+ config: configuration parameters for training sent by the server.
+
+ Returns
+ -------
+ Tuple containing the locally updated model parameters, \
+ the number of examples used for training and \
+ the training metrics.
+ """
+ self.set_parameters(parameters)
+
+ train_results = self.perform_train()
+
+ # Update train history
+ self.hist[str(self.train_id)] = {
+ **self.hist[str(self.train_id)],
+ "trn": train_results,
+ }
+ print("<------- TRAIN RESULTS -------> :", train_results)
+
+ self.train_id += 1
+
+ return self.get_parameters(config), self.model_manager.train_dataset_size(), {}
+
+ def evaluate(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[float, int, Dict[str, Union[bool, bytes, float, int, str]]]:
+ """Evaluate the provided global parameters using the locally held dataset.
+
+ Args:
+ parameters: The current (global) model parameters.
+ config: configuration parameters for training sent by the server.
+
+ Returns
+ -------
+ Tuple containing the test loss, \
+ the number of examples used for evaluation and \
+ the evaluation metrics.
+ """
+ self.set_parameters(parameters, evaluate=True)
+
+ # Test the model
+ tst_results = self.model_manager.test()
+ print("<------- TEST RESULTS -------> :", tst_results)
+
+ # Update test history
+ self.hist[str(self.test_id)] = {
+ **self.hist[str(self.test_id)],
+ "tst": tst_results,
+ }
+ self.test_id += 1
+
+ return (
+ tst_results.get("loss", 0.0),
+ self.model_manager.test_dataset_size(),
+ {k: v for k, v in tst_results.items() if not isinstance(v, (dict, list))},
+ )
+
+
+class FedPerClient(BaseClient):
+ """Implementation of Federated Personalization (FedPer) Client."""
+
+ def get_parameters(self, config: Dict[str, Scalar]) -> NDArrays:
+ """Return the current local body parameters."""
+ return [
+ val.cpu().numpy()
+ for _, val in self.model_manager.model.body.state_dict().items()
+ ]
+
+ def set_parameters(self, parameters: List[np.ndarray], evaluate=False) -> None:
+ """Set the local body parameters to the received parameters.
+
+ Args:
+ parameters: parameters to set the body to.
+ evaluate: whether the client is evaluating or not.
+ """
+ model_keys = [
+ k
+ for k in self.model_manager.model.state_dict().keys()
+ if k.startswith("_body")
+ ]
+
+ if not evaluate:
+ # Only update client's local head if it hasn't trained yet
+ print("Setting head parameters to global head parameters.")
+ model_keys.extend(
+ [
+ k
+ for k in self.model_manager.model.state_dict().keys()
+ if k.startswith("_head")
+ ]
+ )
+
+ params_dict = zip(model_keys, parameters)
+
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+
+ self.model_manager.model.set_parameters(state_dict)
+
+
+def get_client_fn_simulation(
+ config: DictConfig,
+ client_state_save_path: str = "",
+) -> Callable[[str], Union[FedPerClient, BaseClient]]:
+ """Generate the client function that creates the Flower Clients.
+
+ Parameters
+ ----------
+    config : DictConfig
+        The experiment configuration.
+    client_state_save_path : str
+        The path to save the client state.
+
+ Returns
+ -------
+    Callable[[str], Union[FedPerClient, BaseClient]]
+        The client function that creates the Flower Clients
+        used in the simulation.
+ """
+ assert config.model_name.lower() in [
+ "mobile",
+ "resnet",
+    ], f"Model {config.model_name} not implemented"
+
+ # load dataset and clients' data indices
+ if config.dataset.name.lower() == "cifar10":
+ try:
+ partition_path = (
+ PROJECT_DIR / "datasets" / config.dataset.name / "partition.pkl"
+ )
+ print(f"Loading partition from {partition_path}")
+ with open(partition_path, "rb") as pickle_file:
+ partition = pickle.load(pickle_file)
+ data_indices: Dict[int, Dict[str, List[int]]] = partition["data_indices"]
+ except FileNotFoundError as error:
+ print(f"Partition not found at {partition_path}")
+ raise error
+
+ # - you can define your own data transformation strategy here -
+ general_data_transform = transforms.Compose(
+ [
+ transforms.Resize((224, 224)),
+ transforms.RandomCrop(224, padding=4),
+ # transforms.RandomHorizontalFlip(),
+ # transforms.ToTensor(),
+ transforms.Normalize(
+ MEAN[config.dataset.name], STD[config.dataset.name]
+ ),
+ ]
+ )
+ # ------------------------------------------------------------
+
+ def client_fn(cid: str) -> BaseClient:
+ """Create a Flower client representing a single organization."""
+ cid_use = int(cid)
+ if config.dataset.name.lower() == "flickr":
+ transform = transforms.Compose(
+ [
+ transforms.Resize((224, 224)),
+ transforms.ToTensor(),
+ ]
+ )
+ data_path = (
+ PROJECT_DIR / "datasets" / config.dataset.name / "tmp" / f"client_{cid}"
+ )
+ dataset = ImageFolder(root=data_path, transform=transform)
+ trainset, testset = random_split(
+ dataset,
+ [int(len(dataset) * 0.8), len(dataset) - int(len(dataset) * 0.8)],
+ )
+ else:
+ dataset = call_dataset(
+ dataset_name=config.dataset.name,
+ root=PROJECT_DIR / "datasets" / config.dataset.name,
+ general_data_transform=general_data_transform,
+ )
+
+ trainset = Subset(dataset, indices=[])
+ testset = Subset(dataset, indices=[])
+ trainset.indices = data_indices[cid_use]["train"]
+ testset.indices = data_indices[cid_use]["test"]
+
+ # Create the train loader
+ trainloader = DataLoader(trainset, config.batch_size, shuffle=False)
+ # Create the test loader
+ testloader = DataLoader(testset, config.batch_size)
+
+ manager: Union[
+ Type[MobileNetModelManager], Type[ResNetModelManager]
+ ] = MobileNetModelManager
+ if config.model_name.lower() == "resnet":
+ manager = ResNetModelManager
+ elif config.model_name.lower() == "mobile":
+ manager = MobileNetModelManager
+ else:
+ raise NotImplementedError("Model not implemented, check name.")
+ client_data_loaders = ClientDataloaders(trainloader, testloader)
+ client_essentials = ClientEssentials(
+ client_id=cid,
+ client_state_save_path=client_state_save_path,
+ )
+ if client_state_save_path != "":
+ return FedPerClient(
+ data_loaders=client_data_loaders,
+ client_essentials=client_essentials,
+ config=config,
+ model_manager_class=manager,
+ )
+ return BaseClient(
+ data_loaders=client_data_loaders,
+ client_essentials=client_essentials,
+ config=config,
+ model_manager_class=manager,
+ )
+
+ return client_fn
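`get_client_fn_simulation` is a factory: it closes over the configuration and returns a `client_fn` that the simulation engine calls once per client id. A minimal sketch of the same pattern, with stand-in client classes rather than the real ones:

```python
class BaseClient:
    """Stand-in for the plain FedAvg client."""
    def __init__(self, cid: str):
        self.cid = cid

class FedPerClient(BaseClient):
    """Stand-in for the personalised FedPer client."""

def get_client_fn(client_state_save_path: str = ""):
    def client_fn(cid: str) -> BaseClient:
        # A non-empty save path selects the personalised FedPer client,
        # mirroring the branch at the end of client_fn above.
        if client_state_save_path != "":
            return FedPerClient(cid)
        return BaseClient(cid)
    return client_fn

client_fn = get_client_fn("state/")
print(type(client_fn("0")).__name__)  # FedPerClient
```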
diff --git a/baselines/fedper/fedper/conf/base.yaml b/baselines/fedper/fedper/conf/base.yaml
new file mode 100644
index 000000000000..b0b9778d4682
--- /dev/null
+++ b/baselines/fedper/fedper/conf/base.yaml
@@ -0,0 +1,44 @@
+---
+num_clients: 10 # total number of clients
+num_epochs: 4 # number of local epochs
+batch_size: 128
+num_rounds: 100
+clients_per_round: 10
+learning_rate: 0.01
+algorithm: fedper
+model_name: resnet
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 1
+
+server_device: cuda:0
+
+dataset:
+ name : "cifar10"
+ split: sample
+ num_classes: 10
+ seed: 42
+ num_clients: ${num_clients}
+ fraction: 0.83
+
+model:
+ _target_: null
+ num_head_layers: 2
+ num_classes: 10
+
+fit_config:
+ drop_client: false
+ epochs : ${num_epochs}
+ batch_size: ${batch_size}
+
+strategy:
+ _target_: fedPer.server.DefaultStrategyPipeline
+  fraction_fit: 0.00001 # so the number of clients sampled each round is defined solely by min_fit_clients
+ min_fit_clients: ${clients_per_round}
+ fraction_evaluate: 0.0
+ min_evaluate_clients: ${clients_per_round}
+ min_available_clients: ${num_clients}
+ algorithm: ${algorithm}
+ evaluate_fn: None
+ on_evaluate_config_fn: None
\ No newline at end of file
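The tiny `fraction_fit` works because Flower's FedAvg-style strategies compute the fit sample size as a fraction of available clients, floored by `min_fit_clients`. A sketch of that selection logic (simplified from the real strategy method, which also returns `min_available_clients`):

```python
def num_fit_clients(num_available: int, fraction_fit: float, min_fit_clients: int) -> int:
    # Sample a fraction of the available clients, but never fewer
    # than min_fit_clients.
    num_clients = int(num_available * fraction_fit)
    return max(num_clients, min_fit_clients)

# With fraction_fit ~ 0, the sample size is driven entirely by min_fit_clients
print(num_fit_clients(10, 0.00001, 10))  # 10
```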
diff --git a/baselines/fedper/fedper/conf/cifar10.yaml b/baselines/fedper/fedper/conf/cifar10.yaml
new file mode 100644
index 000000000000..66a06d481507
--- /dev/null
+++ b/baselines/fedper/fedper/conf/cifar10.yaml
@@ -0,0 +1,44 @@
+---
+num_clients: 10 # total number of clients
+num_epochs: 4 # number of local epochs
+batch_size: 128
+num_rounds: 50
+clients_per_round: 10
+learning_rate: 0.01
+algorithm: fedavg
+model_name: resnet
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 1
+
+server_device: cuda:0
+
+dataset:
+ name : "cifar10"
+ split: sample
+ num_classes: 10
+ seed: 42
+ num_clients: ${num_clients}
+ fraction: 0.83
+
+model:
+ _target_: null
+ num_head_layers: 2
+ num_classes: 10
+
+fit_config:
+ drop_client: false
+ epochs : ${num_epochs}
+ batch_size: ${batch_size}
+
+strategy:
+ _target_: fedPer.server.DefaultStrategyPipeline
+  fraction_fit: 0.00001 # so the number of clients sampled each round is defined solely by min_fit_clients
+ min_fit_clients: ${clients_per_round}
+ fraction_evaluate: 0.0
+ min_evaluate_clients: ${clients_per_round}
+ min_available_clients: ${num_clients}
+ algorithm: ${algorithm}
+ evaluate_fn: None
+ on_evaluate_config_fn: None
\ No newline at end of file
diff --git a/baselines/fedper/fedper/conf/flickr.yaml b/baselines/fedper/fedper/conf/flickr.yaml
new file mode 100644
index 000000000000..341b1c0ac6c2
--- /dev/null
+++ b/baselines/fedper/fedper/conf/flickr.yaml
@@ -0,0 +1,44 @@
+---
+num_clients: 30 # total number of clients
+num_epochs: 4 # number of local epochs
+batch_size: 4
+num_rounds: 35
+clients_per_round: 30
+learning_rate: 0.01
+algorithm: fedper
+model_name: resnet
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 1
+
+server_device: cuda:0
+
+dataset:
+ name : "flickr"
+ split: sample
+ num_classes: 5
+ seed: 42
+ num_clients: ${num_clients}
+ fraction: 0.80
+
+model:
+ _target_: null
+ num_head_layers: 2
+ num_classes: 5
+
+fit_config:
+ drop_client: false
+ epochs : ${num_epochs}
+ batch_size: ${batch_size}
+
+strategy:
+ _target_: fedPer.server.DefaultStrategyPipeline
+  fraction_fit: 0.00001 # so the number of clients sampled each round is defined solely by min_fit_clients
+ min_fit_clients: ${clients_per_round}
+ fraction_evaluate: 0.0
+ min_evaluate_clients: ${clients_per_round}
+ min_available_clients: ${num_clients}
+ algorithm: ${algorithm}
+ evaluate_fn: None
+ on_evaluate_config_fn: None
\ No newline at end of file
diff --git a/baselines/fedper/fedper/constants.py b/baselines/fedper/fedper/constants.py
new file mode 100644
index 000000000000..3eda77c5134e
--- /dev/null
+++ b/baselines/fedper/fedper/constants.py
@@ -0,0 +1,23 @@
+"""Constants used in machine learning pipeline."""
+from enum import Enum
+
+
+# FL Algorithms
+class Algorithms(Enum):
+ """Enum for FL algorithms."""
+
+ FEDAVG = "FedAvg"
+ FEDPER = "FedPer"
+
+
+# FL Default Train and Fine-Tuning Epochs
+DEFAULT_TRAIN_EP = 5
+DEFAULT_FT_EP = 5
+
+MEAN = {
+ "cifar10": [0.4915, 0.4823, 0.4468],
+}
+
+STD = {
+ "cifar10": [0.2470, 0.2435, 0.2616],
+}
diff --git a/baselines/fedper/fedper/dataset.py b/baselines/fedper/fedper/dataset.py
new file mode 100644
index 000000000000..81a95286b1b8
--- /dev/null
+++ b/baselines/fedper/fedper/dataset.py
@@ -0,0 +1,85 @@
+"""Handle basic dataset creation.
+
+In case of PyTorch it should return dataloaders for your dataset (for both the clients
+and the server). If you are using a custom dataset class, this module is the place to
+define it. If your dataset needs to be downloaded (and this is not done
+automatically -- e.g. as is the case for many datasets in TorchVision) and
+partitioned, please include all those functions and logic in the
+`dataset_preparation.py` module. You can of course use those functions from
+the methods defined here.
+"""
+import os
+import pickle
+import sys
+from pathlib import Path
+
+import numpy as np
+
+from fedper.dataset_preparation import (
+ call_dataset,
+ flickr_preprocess,
+ randomly_assign_classes,
+)
+
+# working dir is two up
+WORKING_DIR = Path(__file__).resolve().parent.parent
+FL_BENCH_ROOT = WORKING_DIR.parent
+
+sys.path.append(FL_BENCH_ROOT.as_posix())
+
+
+def dataset_main(config: dict) -> None:
+ """Prepare the dataset."""
+ dataset_name = config["name"].lower()
+ dataset_folder = Path(WORKING_DIR, "datasets")
+ dataset_root = Path(dataset_folder, dataset_name)
+
+ if not os.path.isdir(dataset_root):
+ os.makedirs(dataset_root)
+
+ if dataset_name == "cifar10":
+ dataset = call_dataset(dataset_name=dataset_name, root=dataset_root)
+
+ # randomly assign classes
+ assert config["num_classes"] > 0, "Number of classes must be positive"
+ config["num_classes"] = max(1, min(config["num_classes"], len(dataset.classes)))
+ # partition, stats = randomly_assign_classes(
+ partition = randomly_assign_classes(
+ dataset=dataset,
+ client_num=config["num_clients"],
+ class_num=config["num_classes"],
+ )
+
+ clients_4_train = list(range(config["num_clients"]))
+ clients_4_test = list(range(config["num_clients"]))
+
+ partition["separation"] = {
+ "train": clients_4_train,
+ "test": clients_4_test,
+ "total": config["num_clients"],
+ }
+ for client_id, idx in enumerate(partition["data_indices"]):
+ if config["split"] == "sample":
+ num_train_samples = int(len(idx) * config["fraction"])
+
+ np.random.shuffle(idx)
+ idx_train, idx_test = idx[:num_train_samples], idx[num_train_samples:]
+ partition["data_indices"][client_id] = {
+ "train": idx_train,
+ "test": idx_test,
+ }
+ else:
+ if client_id in clients_4_train:
+ partition["data_indices"][client_id] = {"train": idx, "test": []}
+ else:
+ partition["data_indices"][client_id] = {"train": [], "test": idx}
+ with open(dataset_root / "partition.pkl", "wb") as pickle_file:
+ pickle.dump(partition, pickle_file)
+
+ # with open(dataset_root / "all_stats.json", "w") as f:
+ # json.dump(stats, f)
+
+ elif dataset_name.lower() == "flickr":
+ flickr_preprocess(dataset_root, config)
+ else:
+ raise RuntimeError("Please implement the dataset preparation for your dataset.")
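For the `"sample"` split above, each client's indices are shuffled and divided into train/test by the configured `fraction`. A self-contained sketch of that per-client split (seeded here for reproducibility; the real code uses `np.random.shuffle`):

```python
import random

def sample_split(indices, fraction, seed=42):
    """Shuffle a client's indices and split them into train/test by `fraction`."""
    idx = list(indices)
    random.Random(seed).shuffle(idx)
    num_train = int(len(idx) * fraction)
    return idx[:num_train], idx[num_train:]

train, test = sample_split(range(100), 0.83)
print(len(train), len(test))  # 83 17
```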
diff --git a/baselines/fedper/fedper/dataset_preparation.py b/baselines/fedper/fedper/dataset_preparation.py
new file mode 100644
index 000000000000..0b8b53782aac
--- /dev/null
+++ b/baselines/fedper/fedper/dataset_preparation.py
@@ -0,0 +1,209 @@
+"""Dataset preparation."""
+import os
+import random
+from collections import Counter
+from pathlib import Path
+from typing import Any, Dict, List, Union
+
+import numpy as np
+import pandas as pd
+import torch
+import torchvision
+from torch.utils.data import Dataset
+from torchvision import transforms
+
+
+class BaseDataset(Dataset):
+ """Base class for all datasets."""
+
+ def __init__(
+ self,
+ root: Path = Path("datasets/cifar10"),
+ general_data_transform: transforms.transforms.Compose = None,
+ ) -> None:
+ """Initialize the dataset."""
+ self.root = root
+ self.classes = None
+ self.data: torch.tensor = None
+ self.targets: torch.tensor = None
+ self.general_data_transform = general_data_transform
+
+ def __getitem__(self, index):
+ """Get the item at the given index."""
+ data, targets = self.data[index], self.targets[index]
+ if self.general_data_transform is not None:
+ data = self.general_data_transform(data)
+ return data, targets
+
+ def __len__(self):
+ """Return the length of the dataset."""
+ return len(self.targets)
+
+
+class CIFAR10(BaseDataset):
+ """CIFAR10 dataset."""
+
+ def __init__(
+ self,
+ root: Path = Path("datasets/cifar10"),
+ general_data_transform=None,
+ ):
+ super().__init__()
+ train_part = torchvision.datasets.CIFAR10(root, True, download=True)
+ test_part = torchvision.datasets.CIFAR10(root, False, download=True)
+ train_data = torch.tensor(train_part.data).permute([0, -1, 1, 2]).float()
+ test_data = torch.tensor(test_part.data).permute([0, -1, 1, 2]).float()
+ train_targets = torch.tensor(train_part.targets).long().squeeze()
+ test_targets = torch.tensor(test_part.targets).long().squeeze()
+ self.data = torch.cat([train_data, test_data])
+ self.targets = torch.cat([train_targets, test_targets])
+ self.classes = train_part.classes
+ self.general_data_transform = general_data_transform
+
+
+def flickr_preprocess(root, config):
+ """Preprocess the FLICKR dataset."""
+ print("Preprocessing FLICKR dataset...")
+ # create a tmp folder to store the preprocessed data
+ tmp_folder = Path(root, "tmp")
+ if not os.path.isdir(tmp_folder):
+ os.makedirs(tmp_folder)
+
+ # remove any folder or file in tmp folder, even if it is not empty
+ os.system(f"rm -rf {tmp_folder.as_posix()}/*")
+
+ # get number of clients
+ num_clients = config["num_clients"]
+ # get flickr image labels per clients
+ df_labelled_igms = pd.read_csv(
+ Path(root, "FLICKR-AES_image_labeled_by_each_worker.csv")
+ )
+ # take num_clients random workers from df
+ # #where workers have minimum 60 images and maximum 290
+ df_labelled_igms = df_labelled_igms.groupby("worker").filter(
+ lambda x: len(x) >= 60 and len(x) <= 290
+ )
+ # only take workers that have at least 1 image for each score (1-5)
+ df_labelled_igms = df_labelled_igms.groupby("worker").filter(
+ lambda x: len(x[" score"].unique()) == 5
+ )
+    # only take workers that have at least 4 images for each score (1-5)
+    df_labelled_igms = df_labelled_igms.groupby("worker").filter(
+        lambda x: x[" score"].value_counts().min() >= 4
+    )
+
+ # get num_clients random workers
+ clients = np.random.choice(
+ df_labelled_igms["worker"].unique(), num_clients, replace=False
+ )
+ for i, client in enumerate(clients):
+ print(f"Processing client {i}...")
+ df_client = df_labelled_igms[df_labelled_igms["worker"] == client]
+ client_path = Path(tmp_folder, f"client_{i}")
+ if not os.path.isdir(client_path):
+ os.makedirs(client_path)
+ # create score folder in client folder, scores go from 1-5
+ for score in range(1, 6):
+ score_path = Path(client_path, str(score))
+ if not os.path.isdir(score_path):
+ os.makedirs(score_path)
+ # copy images to score folder
+ for _, row in df_client.iterrows():
+ img_path = Path(root, "40K", row[" imagePair"])
+ score_path = Path(client_path, str(row[" score"]))
+ if os.path.isfile(img_path):
+ os.system(f"cp {img_path} {score_path}")
+
+
+def call_dataset(dataset_name, root, **kwargs):
+ """Call the dataset."""
+ if dataset_name == "cifar10":
+ return CIFAR10(root, **kwargs)
+ raise ValueError(f"Dataset {dataset_name} not supported.")
+
+
+def randomly_assign_classes(
+ dataset: Dataset, client_num: int, class_num: int
+) -> Dict[str, Union[Dict[Any, Any], List[Any]]]:
+ # ) -> Dict[str, Any]:
+    """Randomly assign `class_num` classes to each client."""
+ partition: Dict[str, Union[Dict, List]] = {"separation": {}, "data_indices": []}
+ data_indices: List[List[int]] = [[] for _ in range(client_num)]
+ targets_numpy = np.array(dataset.targets, dtype=np.int32)
+ label_list = list(range(len(dataset.classes)))
+
+ data_idx_for_each_label = [
+ np.where(targets_numpy == i)[0].tolist() for i in label_list
+ ]
+
+ assigned_labels = []
+ selected_times = [0 for _ in label_list]
+ for _ in range(client_num):
+ sampled_labels = random.sample(label_list, class_num)
+ assigned_labels.append(sampled_labels)
+ for j in sampled_labels:
+ selected_times[j] += 1
+
+ batch_sizes = _get_batch_sizes(
+ targets_numpy=targets_numpy,
+ label_list=label_list,
+ selected_times=selected_times,
+ )
+
+ data_indices = _get_data_indices(
+ batch_sizes=batch_sizes,
+ data_indices=data_indices,
+ data_idx_for_each_label=data_idx_for_each_label,
+ assigned_labels=assigned_labels,
+ client_num=client_num,
+ )
+
+ partition["data_indices"] = data_indices
+
+ return partition # , stats
+
+
+def _get_batch_sizes(
+ targets_numpy: np.ndarray,
+ label_list: List[int],
+ selected_times: List[int],
+) -> np.ndarray:
+ """Get batch sizes for each label."""
+ labels_count = Counter(targets_numpy)
+ batch_sizes = np.zeros_like(label_list)
+ for i in label_list:
+ print(f"label: {i}, count: {labels_count[i]}")
+ print(f"selected times: {selected_times[i]}")
+ batch_sizes[i] = int(labels_count[i] / selected_times[i])
+
+ return batch_sizes
+
+
+def _get_data_indices(
+ batch_sizes: np.ndarray,
+ data_indices: List[List[int]],
+ data_idx_for_each_label: List[List[int]],
+ assigned_labels: List[List[int]],
+ client_num: int,
+) -> List[List[int]]:
+ for i in range(client_num):
+ for cls in assigned_labels[i]:
+ if len(data_idx_for_each_label[cls]) < 2 * batch_sizes[cls]:
+ batch_size = len(data_idx_for_each_label[cls])
+ else:
+ batch_size = batch_sizes[cls]
+ selected_idx = random.sample(data_idx_for_each_label[cls], batch_size)
+ data_indices_use: np.ndarray = np.concatenate(
+ [data_indices[i], selected_idx], axis=0
+ ).astype(np.int64)
+ data_indices[i] = data_indices_use.tolist()
+ # data_indices[i]: np.ndarray = np.concatenate(
+ # [data_indices[i], selected_idx], axis=0
+ # ).astype(np.int64)
+ data_idx_for_each_label[cls] = list(
+ set(data_idx_for_each_label[cls]) - set(selected_idx)
+ )
+
+ data_indices[i] = data_indices[i]
+
+ return data_indices
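The `_get_batch_sizes` helper above computes, for each label, how many samples every assigned client should receive: the label's total count divided by how many clients were assigned that label. A minimal sketch of that arithmetic:

```python
from collections import Counter

def label_batch_sizes(targets, selected_times):
    """Per-label shard size: samples of a label divided by the number of
    clients assigned that label (integer division, as in _get_batch_sizes)."""
    counts = Counter(targets)
    return [counts[label] // times for label, times in enumerate(selected_times)]

# 6 samples of label 0 shared by 2 clients, 4 samples of label 1 by 4 clients
print(label_batch_sizes([0] * 6 + [1] * 4, [2, 4]))  # [3, 1]
```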
diff --git a/baselines/fedper/fedper/implemented_models/mobile_model.py b/baselines/fedper/fedper/implemented_models/mobile_model.py
new file mode 100644
index 000000000000..57d3210c9511
--- /dev/null
+++ b/baselines/fedper/fedper/implemented_models/mobile_model.py
@@ -0,0 +1,258 @@
+"""MobileNet-v1 model, model manager and model split."""
+from typing import Dict, List, Optional, Tuple, Union
+
+import torch
+import torch.nn as nn
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+from fedper.models import ModelManager, ModelSplit
+
+# Set model architecture
+ARCHITECTURE = {
+ "layer_1": {"conv_dw": [32, 64, 1]},
+ "layer_2": {"conv_dw": [64, 128, 2]},
+ "layer_3": {"conv_dw": [128, 128, 1]},
+ "layer_4": {"conv_dw": [128, 256, 2]},
+ "layer_5": {"conv_dw": [256, 256, 1]},
+ "layer_6": {"conv_dw": [256, 512, 2]},
+ "layer_7": {"conv_dw": [512, 512, 1]},
+ "layer_8": {"conv_dw": [512, 512, 1]},
+ "layer_9": {"conv_dw": [512, 512, 1]},
+ "layer_10": {"conv_dw": [512, 512, 1]},
+ "layer_11": {"conv_dw": [512, 512, 1]},
+ "layer_12": {"conv_dw": [512, 1024, 2]},
+ "layer_13": {"conv_dw": [1024, 1024, 1]},
+}
+
+
+class MobileNet(nn.Module):
+ """Model from MobileNet-v1 (https://github.com/wjc852456/pytorch-mobilenet-v1)."""
+
+ def __init__(
+ self,
+ num_head_layers: int = 1,
+ num_classes: int = 10,
+ ) -> None:
+ super(MobileNet, self).__init__()
+
+ self.architecture = ARCHITECTURE
+
+ def conv_bn(inp, oup, stride):
+ return nn.Sequential(
+ nn.Conv2d(inp, oup, 3, stride, 1, bias=False),
+ nn.BatchNorm2d(oup),
+ nn.ReLU(inplace=True),
+ )
+
+ def conv_dw(inp, oup, stride):
+ return nn.Sequential(
+ nn.Conv2d(inp, inp, 3, stride, 1, groups=inp, bias=False),
+ nn.BatchNorm2d(inp),
+ nn.ReLU(inplace=True),
+ nn.Conv2d(inp, oup, 1, 1, 0, bias=False),
+ nn.BatchNorm2d(oup),
+ nn.ReLU(inplace=True),
+ )
+
+ self.body = nn.Sequential()
+ self.body.add_module("initial_batch_norm", conv_bn(3, 32, 2))
+ for i in range(1, 13):
+ for _, value in self.architecture[f"layer_{i}"].items():
+ self.body.add_module(f"conv_dw_{i}", conv_dw(*value))
+
+ self.body.add_module("avg_pool", nn.AvgPool2d([7]))
+ self.body.add_module("fc", nn.Linear(1024, num_classes))
+
+ if num_head_layers == 1:
+ self.head = nn.Sequential(
+ nn.AvgPool2d([7]), nn.Flatten(), nn.Linear(1024, num_classes)
+ )
+ self.body.avg_pool = nn.Identity()
+ self.body.fc = nn.Identity()
+ elif num_head_layers == 2:
+ self.head = nn.Sequential(
+ conv_dw(1024, 1024, 1),
+ nn.AvgPool2d([7]),
+ nn.Flatten(),
+ nn.Linear(1024, num_classes),
+ )
+ self.body.conv_dw_13 = nn.Identity()
+ self.body.avg_pool = nn.Identity()
+ self.body.fc = nn.Identity()
+ elif num_head_layers == 3:
+ self.head = nn.Sequential(
+ conv_dw(512, 1024, 2),
+ conv_dw(1024, 1024, 1),
+ nn.AvgPool2d([7]),
+ nn.Flatten(),
+ nn.Linear(1024, num_classes),
+ )
+ self.body.conv_dw_12 = nn.Identity()
+ self.body.conv_dw_13 = nn.Identity()
+ self.body.avg_pool = nn.Identity()
+ self.body.fc = nn.Identity()
+ elif num_head_layers == 4:
+ self.head = nn.Sequential(
+ conv_dw(512, 512, 1),
+ conv_dw(512, 1024, 2),
+ conv_dw(1024, 1024, 1),
+ nn.AvgPool2d([7]),
+ nn.Flatten(),
+ nn.Linear(1024, num_classes),
+ )
+ self.body.conv_dw_11 = nn.Identity()
+ self.body.conv_dw_12 = nn.Identity()
+ self.body.conv_dw_13 = nn.Identity()
+ self.body.avg_pool = nn.Identity()
+ self.body.fc = nn.Identity()
+ else:
+ raise NotImplementedError("Number of head layers not implemented.")
+
+ def forward(self, x: torch.Tensor) -> torch.Tensor:
+ """Forward pass of the model."""
+ x = self.body(x)
+ return self.head(x)
+
+
+class MobileNetModelSplit(ModelSplit):
+ """Split MobileNet model into body and head."""
+
+ def _get_model_parts(self, model: MobileNet) -> Tuple[nn.Module, nn.Module]:
+ return model.body, model.head
+
+
+class MobileNetModelManager(ModelManager):
+ """Manager for models with Body/Head split."""
+
+ def __init__(
+ self,
+ client_id: int,
+ config: DictConfig,
+ trainloader: DataLoader,
+ testloader: DataLoader,
+ client_save_path: Optional[str] = "",
+ learning_rate: float = 0.01,
+ ):
+ """Initialize the attributes of the model manager.
+
+ Args:
+ client_id: The id of the client.
+ config: Dict containing the configurations to be used by the manager.
+ """
+ super().__init__(
+ model_split_class=MobileNetModelSplit,
+ client_id=client_id,
+ config=config,
+ )
+ self.trainloader, self.testloader = trainloader, testloader
+ self.device = self.config["server_device"]
+ self.client_save_path = client_save_path if client_save_path != "" else None
+ self.learning_rate = learning_rate
+
+ def _create_model(self) -> nn.Module:
+        """Return MobileNet-v1 model to be split into head and body."""
+ try:
+ return MobileNet(
+ num_head_layers=self.config["model"]["num_head_layers"],
+ num_classes=self.config["model"]["num_classes"],
+ ).to(self.device)
+ except AttributeError:
+ self.device = self.config["server_device"]
+ return MobileNet(
+ num_head_layers=self.config["model"]["num_head_layers"],
+ num_classes=self.config["model"]["num_classes"],
+ ).to(self.device)
+
+ def train(
+ self,
+ epochs: int = 1,
+ ) -> Dict[str, Union[List[Dict[str, float]], int, float]]:
+ """Train the model maintained in self.model.
+
+ Method adapted from simple MobileNet-v1 (PyTorch) \
+ https://github.com/wjc852456/pytorch-mobilenet-v1.
+
+ Args:
+ epochs: number of training epochs.
+
+ Returns
+ -------
+ Dict containing the train metrics.
+ """
+ # Load client state (head) if client_save_path is not None and it is not empty
+ if self.client_save_path is not None:
+ try:
+ self.model.head.load_state_dict(torch.load(self.client_save_path))
+ except FileNotFoundError:
+ print("No client state found, training from scratch.")
+ pass
+
+ criterion = torch.nn.CrossEntropyLoss()
+ optimizer = torch.optim.SGD(
+ self.model.parameters(), lr=self.learning_rate, momentum=0.9
+ )
+ correct, total = 0, 0
+ loss: torch.Tensor = 0.0
+ # self.model.train()
+ for _ in range(epochs):
+ for images, labels in self.trainloader:
+ optimizer.zero_grad()
+ outputs = self.model(images.to(self.device))
+ labels = labels.to(self.device)
+ loss = criterion(outputs, labels)
+ loss.backward()
+ optimizer.step()
+ total += labels.size(0)
+ correct += (torch.max(outputs.data, 1)[1] == labels).sum().item()
+
+ # Save client state (head)
+ if self.client_save_path is not None:
+ torch.save(self.model.head.state_dict(), self.client_save_path)
+
+ return {"loss": loss.item(), "accuracy": correct / total}
+
+ def test(
+ self,
+ ) -> Dict[str, float]:
+ """Test the model maintained in self.model.
+
+ Returns
+ -------
+ Dict containing the test metrics.
+ """
+ # Load client state (head)
+ if self.client_save_path is not None:
+ self.model.head.load_state_dict(torch.load(self.client_save_path))
+
+ criterion = torch.nn.CrossEntropyLoss()
+ correct, total, loss = 0, 0, 0.0
+ # self.model.eval()
+ with torch.no_grad():
+ for images, labels in self.testloader:
+ outputs = self.model(images.to(self.device))
+ labels = labels.to(self.device)
+ loss += criterion(outputs, labels).item()
+ total += labels.size(0)
+ correct += (torch.max(outputs.data, 1)[1] == labels).sum().item()
+ print("Test Accuracy: {:.4f}".format(correct / total))
+
+ if self.client_save_path is not None:
+ torch.save(self.model.head.state_dict(), self.client_save_path)
+
+ return {
+ "loss": loss / len(self.testloader.dataset),
+ "accuracy": correct / total,
+ }
+
+    def train_dataset_size(self) -> int:
+        """Return train data set size."""
+        return len(self.trainloader.dataset)
+
+    def test_dataset_size(self) -> int:
+        """Return test data set size."""
+        return len(self.testloader.dataset)
+
+    def total_dataset_size(self) -> int:
+        """Return total data set size."""
+        return len(self.trainloader.dataset) + len(self.testloader.dataset)
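The constructors above realise the FedPer body/head split by keeping the last `num_head_layers` layers as a personalised head and replacing them in the shared body with `nn.Identity()`. The underlying partition can be sketched without PyTorch as a split over an ordered list of layer names (names below are illustrative):

```python
def split_body_head(layers, num_head_layers):
    """Split an ordered list of layers into a shared body and a personalised
    head consisting of the last `num_head_layers` layers."""
    if not 1 <= num_head_layers <= len(layers):
        raise ValueError("num_head_layers out of range")
    return layers[:-num_head_layers], layers[-num_head_layers:]

layers = ["conv_dw_12", "conv_dw_13", "avg_pool", "fc"]
print(split_body_head(layers, 2))  # (['conv_dw_12', 'conv_dw_13'], ['avg_pool', 'fc'])
```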
diff --git a/baselines/fedper/fedper/implemented_models/resnet_model.py b/baselines/fedper/fedper/implemented_models/resnet_model.py
new file mode 100644
index 000000000000..0d9837b118a3
--- /dev/null
+++ b/baselines/fedper/fedper/implemented_models/resnet_model.py
@@ -0,0 +1,272 @@
+"""ResNet model, model manager and split."""
+from typing import Dict, List, Optional, Tuple, Union
+
+import torch
+import torch.nn as nn
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+from torchvision.models.resnet import resnet34
+
+from fedper.models import ModelManager, ModelSplit
+
+
+def conv3x3(
+ in_planes: int, out_planes: int, stride: int = 1, groups: int = 1, dilation: int = 1
+) -> nn.Conv2d:
+ """3x3 convolution with padding."""
+ return nn.Conv2d(
+ in_planes,
+ out_planes,
+ kernel_size=3,
+ stride=stride,
+ padding=dilation,
+ groups=groups,
+ bias=False,
+ dilation=dilation,
+ )
+
+
+def conv1x1(in_planes: int, out_planes: int, stride: int = 1) -> nn.Conv2d:
+ """1x1 convolution."""
+ return nn.Conv2d(in_planes, out_planes, kernel_size=1, stride=stride, bias=False)
+
+
+class BasicBlock(nn.Module):
+ """Basic block for ResNet."""
+
+ expansion: int = 1
+
+ def __init__(
+ self,
+ inplanes: int,
+ planes: int,
+ stride: int = 1,
+ downsample: Optional[nn.Module] = None,
+ ) -> None:
+ super().__init__()
+ norm_layer = nn.BatchNorm2d
+ # Both self.conv1 and self.downsample layers downsample input when stride != 1
+ self.conv1 = conv3x3(inplanes, planes, stride)
+ self.bn1 = norm_layer(planes)
+ self.relu = nn.ReLU(inplace=True)
+ self.conv2 = conv3x3(planes, planes)
+ self.bn2 = norm_layer(planes)
+ self.downsample = downsample
+ self.stride = stride
+
+ def forward(self, x: torch.Tensor) -> torch.Tensor:
+ """Forward inputs through the block."""
+ identity = x
+
+ out = self.conv1(x)
+ out = self.bn1(out)
+ out = self.relu(out)
+
+ out = self.conv2(out)
+ out = self.bn2(out)
+
+ if self.downsample is not None:
+ identity = self.downsample(x)
+
+ out += identity
+ out = self.relu(out)
+
+ return out
+
+
+class ResNet(nn.Module):
+ """ResNet model."""
+
+ def __init__(
+ self,
+ num_head_layers: int = 1,
+ num_classes: int = 10,
+ ) -> None:
+ super(ResNet, self).__init__()
+ assert (
+ num_head_layers > 0 and num_head_layers <= 17
+        ), "num_head_layers must be between 1 and 17"
+
+ self.num_head_layers = num_head_layers
+ self.body = resnet34()
+
+ # if only one head layer
+ if self.num_head_layers == 1:
+ self.head = self.body.fc
+ self.body.fc = nn.Identity()
+ elif self.num_head_layers == 2:
+ self.head = nn.Sequential(
+ BasicBlock(512, 512),
+ nn.AdaptiveAvgPool2d((1, 1)),
+ nn.Flatten(),
+ nn.Linear(512, num_classes),
+ )
+ # remove head layers from body
+ self.body = nn.Sequential(*list(self.body.children())[:-2])
+ body_layer4 = list(self.body.children())[-1]
+ self.body = nn.Sequential(*list(self.body.children())[:-1])
+ self.body.layer4 = nn.Sequential(*list(body_layer4.children())[:-1])
+ elif self.num_head_layers == 3:
+ self.head = nn.Sequential(
+ BasicBlock(512, 512),
+ BasicBlock(512, 512),
+ nn.AdaptiveAvgPool2d((1, 1)),
+ nn.Flatten(),
+ nn.Linear(512, num_classes),
+ )
+ # remove head layers from body
+ self.body = nn.Sequential(*list(self.body.children())[:-2])
+ body_layer4 = list(self.body.children())[-1]
+ self.body = nn.Sequential(*list(self.body.children())[:-1])
+ self.body.layer4 = nn.Sequential(*list(body_layer4.children())[:-2])
+ else:
+            raise NotImplementedError("Only 1, 2 or 3 head layers supported")
+
+ def forward(self, x: torch.Tensor) -> torch.Tensor:
+ """Forward inputs through the model."""
+        # Forward through the shared body, then the personalised head
+ x = self.body(x)
+ return self.head(x)
+
+
+class ResNetModelSplit(ModelSplit):
+ """Split ResNet model into body and head."""
+
+ def _get_model_parts(self, model: ResNet) -> Tuple[nn.Module, nn.Module]:
+ return model.body, model.head
+
+
+class ResNetModelManager(ModelManager):
+ """Manager for models with Body/Head split."""
+
+ def __init__(
+ self,
+ client_save_path: Optional[str],
+ client_id: int,
+ config: DictConfig,
+ trainloader: DataLoader,
+ testloader: DataLoader,
+ learning_rate: float = 0.01,
+ ):
+ """Initialize the attributes of the model manager.
+
+ Args:
+ client_save_path: Path to save the client state.
+ client_id: The id of the client.
+ config: Dict containing the configurations to be used by the manager.
+ trainloader: DataLoader containing the train data.
+ testloader: DataLoader containing the test data.
+ learning_rate: Learning rate for the optimizer.
+ """
+ super().__init__(
+ model_split_class=ResNetModelSplit,
+ client_id=client_id,
+ config=config,
+ )
+ self.client_save_path = client_save_path
+ self.trainloader, self.testloader = trainloader, testloader
+ self.device = self.config["server_device"]
+ self.learning_rate = learning_rate
+
+ def _create_model(self) -> nn.Module:
+        """Return ResNet model to be split into head and body."""
+ try:
+ return ResNet(
+ num_head_layers=self.config["model"]["num_head_layers"],
+ num_classes=self.config["model"]["num_classes"],
+ ).to(self.device)
+ except AttributeError:
+ self.device = self.config["server_device"]
+ return ResNet(
+ num_head_layers=self.config["model"]["num_head_layers"],
+ num_classes=self.config["model"]["num_classes"],
+ ).to(self.device)
+
+ def train(
+ self,
+ epochs: int = 1,
+ ) -> Dict[str, Union[List[Dict[str, float]], int, float]]:
+ """Train the model maintained in self.model.
+
+ Method adapted from simple MobileNet-v1 (PyTorch) \
+ https://github.com/wjc852456/pytorch-mobilenet-v1.
+
+ Args:
+ epochs: number of training epochs.
+
+ Returns
+ -------
+ Dict containing the train metrics.
+ """
+ # Load client state (head) if client_save_path is not None and it is not empty
+ if self.client_save_path is not None:
+ try:
+ self.model.head.load_state_dict(torch.load(self.client_save_path))
+ except FileNotFoundError:
+ print("No client state found, training from scratch.")
+ pass
+
+ criterion = torch.nn.CrossEntropyLoss()
+ optimizer = torch.optim.SGD(
+ self.model.parameters(), lr=self.learning_rate, momentum=0.9
+ )
+ correct, total = 0, 0
+ loss: torch.Tensor = 0.0
+ # self.model.train()
+ for _ in range(epochs):
+ for images, labels in self.trainloader:
+ optimizer.zero_grad()
+ outputs = self.model(images.to(self.device))
+ labels = labels.to(self.device)
+ loss = criterion(outputs, labels)
+ loss.backward()
+
+ optimizer.step()
+ total += labels.size(0)
+ correct += (torch.max(outputs.data, 1)[1] == labels).sum().item()
+
+ # Save client state (head)
+ if self.client_save_path is not None:
+ torch.save(self.model.head.state_dict(), self.client_save_path)
+
+ return {"loss": loss.item(), "accuracy": correct / total}
+
+ def test(
+ self,
+ ) -> Dict[str, float]:
+ """Test the model maintained in self.model."""
+ # Load client state (head)
+ if self.client_save_path is not None:
+ self.model.head.load_state_dict(torch.load(self.client_save_path))
+
+ criterion = torch.nn.CrossEntropyLoss()
+ correct, total, loss = 0, 0, 0.0
+ # self.model.eval()
+ with torch.no_grad():
+ for images, labels in self.testloader:
+ outputs = self.model(images.to(self.device))
+ labels = labels.to(self.device)
+ loss += criterion(outputs, labels).item()
+ total += labels.size(0)
+ correct += (torch.max(outputs.data, 1)[1] == labels).sum().item()
+ print("Test Accuracy: {:.4f}".format(correct / total))
+
+ if self.client_save_path is not None:
+ torch.save(self.model.head.state_dict(), self.client_save_path)
+
+ return {
+ "loss": loss / len(self.testloader.dataset),
+ "accuracy": correct / total,
+ }
+
+ def train_dataset_size(self) -> int:
+ """Return train data set size."""
+ return len(self.trainloader.dataset)
+
+ def test_dataset_size(self) -> int:
+ """Return test data set size."""
+ return len(self.testloader.dataset)
+
+ def total_dataset_size(self) -> int:
+ """Return total data set size."""
+ return len(self.trainloader.dataset) + len(self.testloader.dataset)
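The `client_save_path` logic above persists only the personalized head between rounds, while the body is exchanged with the server. A minimal sketch of that save/load pattern, using a hypothetical two-part module rather than the baseline's ResNet:

```python
import torch
from torch import nn


class TwoPartModel(nn.Module):
    """Toy body/head model mirroring the client's save/load pattern."""

    def __init__(self) -> None:
        super().__init__()
        self.body = nn.Linear(4, 8)  # shared part, exchanged with the server
        self.head = nn.Linear(8, 2)  # personal part, kept on the client

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.body(x))


model = TwoPartModel()
save_path = "client_head.pt"

# End of train()/test(): persist only the head.
torch.save(model.head.state_dict(), save_path)

# Start of the next round: a fresh model receives the aggregated body
# from the server, then restores its own head from disk.
fresh = TwoPartModel()
try:
    fresh.head.load_state_dict(torch.load(save_path))
except FileNotFoundError:
    print("No client state found, training from scratch.")
```

Because only `head.state_dict()` is written, the file stays small and the client never leaks its personalized layers to the server.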
diff --git a/baselines/fedper/fedper/main.py b/baselines/fedper/fedper/main.py
new file mode 100644
index 000000000000..b421b2e0442c
--- /dev/null
+++ b/baselines/fedper/fedper/main.py
@@ -0,0 +1,126 @@
+"""Create and connect the building blocks for your experiments; start the simulation.
+
+It includes preprocessing the dataset, instantiating the strategy, specifying how
+the global model is evaluated, etc. At the end, this script saves the results.
+"""
+
+from pathlib import Path
+
+import flwr as fl
+import hydra
+from hydra.core.hydra_config import HydraConfig
+from hydra.utils import instantiate
+from omegaconf import DictConfig, OmegaConf
+
+from fedper.dataset import dataset_main
+from fedper.utils import (
+ get_client_fn,
+ get_create_model_fn,
+ plot_metric_from_history,
+ save_results_as_pickle,
+ set_client_state_save_path,
+ set_model_class,
+ set_num_classes,
+ set_server_target,
+)
+
+
+@hydra.main(config_path="conf", config_name="base", version_base=None)
+def main(cfg: DictConfig) -> None:
+ """Run the baseline.
+
+ Parameters
+ ----------
+ cfg : DictConfig
+ An omegaconf object that stores the hydra config.
+ """
+ # 1. Print parsed config
+ # Set the model class, server target, and number of classes
+ cfg = set_model_class(cfg)
+ cfg = set_server_target(cfg)
+ cfg = set_num_classes(cfg)
+
+ print(OmegaConf.to_yaml(cfg))
+
+ # Create directory to store client states if it does not exist
+ # Client state has subdirectories with the name of current time
+ client_state_save_path = set_client_state_save_path()
+
+ # 2. Prepare your dataset
+ dataset_main(cfg.dataset)
+
+ # 3. Define your clients
+ # Get client function
+ client_fn = get_client_fn(
+ config=cfg,
+ client_state_save_path=client_state_save_path,
+ )
+
+ # get a function that will be used to construct the config that the client's
+ # fit() method will receive
+ def get_on_fit_config():
+ def fit_config_fn(server_round: int):
+ # resolve and convert to python dict
+ fit_config = OmegaConf.to_container(cfg.fit_config, resolve=True)
+ _ = server_round
+ return fit_config
+
+ return fit_config_fn
+
+ # get a function that will be used to construct the model
+ create_model, split = get_create_model_fn(cfg)
+
+ # 4. Define your strategy
+ strategy = instantiate(
+ cfg.strategy,
+ create_model=create_model,
+ on_fit_config_fn=get_on_fit_config(),
+ model_split_class=split,
+ )
+
+ # 5. Start Simulation
+ history = fl.simulation.start_simulation(
+ client_fn=client_fn,
+ num_clients=cfg.num_clients,
+ config=fl.server.ServerConfig(num_rounds=cfg.num_rounds),
+ client_resources={
+ "num_cpus": cfg.client_resources.num_cpus,
+ "num_gpus": cfg.client_resources.num_gpus,
+ },
+ strategy=strategy,
+ )
+
+ # Experiment completed. Now we save the results and
+ # generate plots using the `history`
+ print("................")
+ print(history)
+
+ # 6. Save your results
+ save_path = Path(HydraConfig.get().runtime.output_dir)
+
+ # save results as a Python pickle using a file_path
+ # the directory created by Hydra for each run
+ save_results_as_pickle(
+ history,
+ file_path=save_path,
+ )
+ # plot results and include them in the readme
+ strategy_name = strategy.__class__.__name__
+ file_suffix: str = (
+ f"_{strategy_name}"
+ f"_C={cfg.num_clients}"
+ f"_B={cfg.batch_size}"
+ f"_E={cfg.num_epochs}"
+ f"_R={cfg.num_rounds}"
+ f"_lr={cfg.learning_rate}"
+ )
+
+ plot_metric_from_history(
+ history,
+ save_path,
+ (file_suffix),
+ )
+
+
+if __name__ == "__main__":
+ main()
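The `get_on_fit_config` helper above returns a closure over the Hydra config so the strategy can build the clients' fit config each round. The same pattern in isolation, with a plain dict standing in for `cfg.fit_config` (the `server_round` key is an illustrative addition, not something the baseline sends):

```python
def get_on_fit_config(fit_config: dict):
    """Return a function that produces the per-round fit config."""

    def fit_config_fn(server_round: int) -> dict:
        # The baseline returns the resolved config unchanged; a variant
        # could use server_round to schedule values over time.
        config = dict(fit_config)
        config["server_round"] = server_round
        return config

    return fit_config_fn


fit_fn = get_on_fit_config({"epochs": 1, "lr": 0.01})
print(fit_fn(3))  # {'epochs': 1, 'lr': 0.01, 'server_round': 3}
```

The closure keeps the strategy decoupled from Hydra: only `main` touches the config object, and the strategy just calls a `Callable[[int], dict]`.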
diff --git a/baselines/fedper/fedper/models.py b/baselines/fedper/fedper/models.py
new file mode 100644
index 000000000000..2a2ebde158f8
--- /dev/null
+++ b/baselines/fedper/fedper/models.py
@@ -0,0 +1,189 @@
+"""Abstract class for splitting a model into body and head."""
+from abc import ABC, abstractmethod
+from collections import OrderedDict
+from typing import Any, Dict, List, Tuple, Type, Union
+
+import numpy as np
+from omegaconf import DictConfig
+from torch import Tensor
+from torch import nn as nn
+
+
+class ModelSplit(ABC, nn.Module):
+ """Abstract class for splitting a model into body and head."""
+
+ def __init__(
+ self,
+ model: nn.Module,
+ ):
+ """Initialize the attributes of the model split.
+
+ Args:
+ model: model to be split into body and head.
+ """
+ super().__init__()
+
+ self._body, self._head = self._get_model_parts(model)
+
+ @abstractmethod
+ def _get_model_parts(self, model: nn.Module) -> Tuple[nn.Module, nn.Module]:
+ """Return the body and head of the model.
+
+ Args:
+ model: model to be split into head and body
+
+ Returns
+ -------
+ Tuple where the first element is the body of the model
+ and the second is the head.
+ """
+
+ @property
+ def body(self) -> nn.Module:
+ """Return model body."""
+ return self._body
+
+ @body.setter
+ def body(self, state_dict: "OrderedDict[str, Tensor]") -> None:
+ """Set model body.
+
+ Args:
+ state_dict: dictionary of the state to set the model body to.
+ """
+ self.body.load_state_dict(state_dict, strict=True)
+
+ @property
+ def head(self) -> nn.Module:
+ """Return model head."""
+ return self._head
+
+ @head.setter
+ def head(self, state_dict: "OrderedDict[str, Tensor]") -> None:
+ """Set model head.
+
+ Args:
+ state_dict: dictionary of the state to set the model head to.
+ """
+ self.head.load_state_dict(state_dict, strict=True)
+
+ def get_parameters(self) -> List[np.ndarray]:
+ """Get model parameters (without fixed head).
+
+ Returns
+ -------
+ Body and head parameters
+ """
+ return [
+ val.cpu().numpy()
+ for val in [
+ *self.body.state_dict().values(),
+ *self.head.state_dict().values(),
+ ]
+ ]
+
+ def set_parameters(self, state_dict: Dict[str, Tensor]) -> None:
+ """Set model parameters.
+
+ Args:
+ state_dict: dictionary of the state to set the model to.
+ """
+ ordered_state_dict = OrderedDict(self.state_dict().copy())
+ # Update with the values of the state_dict
+ ordered_state_dict.update(dict(state_dict.items()))
+ self.load_state_dict(ordered_state_dict, strict=False)
+
+ def enable_head(self) -> None:
+ """Enable gradient tracking for the head parameters."""
+ for param in self.head.parameters():
+ param.requires_grad = True
+
+ def enable_body(self) -> None:
+ """Enable gradient tracking for the body parameters."""
+ for param in self.body.parameters():
+ param.requires_grad = True
+
+ def disable_head(self) -> None:
+ """Disable gradient tracking for the head parameters."""
+ for param in self.head.parameters():
+ param.requires_grad = False
+
+ def disable_body(self) -> None:
+ """Disable gradient tracking for the body parameters."""
+ for param in self.body.parameters():
+ param.requires_grad = False
+
+ def forward(self, inputs: Any) -> Any:
+ """Forward inputs through the body and the head."""
+ x = self.body(inputs)
+ return self.head(x)
+
+
+class ModelManager(ABC):
+ """Manager for models with Body/Head split."""
+
+ def __init__(
+ self,
+ client_id: int,
+ config: DictConfig,
+ model_split_class: Type[Any], # ModelSplit
+ ):
+ """Initialize the attributes of the model manager.
+
+ Args:
+ client_id: The id of the client.
+ config: Dict containing the configurations to be used by the manager.
+ model_split_class: Class to be used to split the model into body and head\
+ (concrete implementation of ModelSplit).
+ """
+ super().__init__()
+
+ self.client_id = client_id
+ self.config = config
+ self._model = model_split_class(self._create_model())
+
+ @abstractmethod
+ def _create_model(self) -> nn.Module:
+ """Return model to be splitted into head and body."""
+
+ @abstractmethod
+ def train(
+ self,
+ epochs: int = 1,
+ ) -> Dict[str, Union[List[Dict[str, float]], int, float]]:
+ """Train the model maintained in self.model.
+
+ Args:
+ epochs: number of training epochs.
+
+ Returns
+ -------
+ Dict containing the train metrics.
+ """
+
+ @abstractmethod
+ def test(
+ self,
+ ) -> Dict[str, float]:
+ """Test the model maintained in self.model.
+
+ Returns
+ -------
+ Dict containing the test metrics.
+ """
+
+ @abstractmethod
+ def train_dataset_size(self) -> int:
+ """Return train data set size."""
+
+ @abstractmethod
+ def test_dataset_size(self) -> int:
+ """Return test data set size."""
+
+ @abstractmethod
+ def total_dataset_size(self) -> int:
+ """Return total data set size."""
+
+ @property
+ def model(self) -> nn.Module:
+ """Return model."""
+ return self._model
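A concrete split only has to implement `_get_model_parts`. A minimal, self-contained sketch (condensing the abstract class above and using a hypothetical `nn.Sequential`, not one of the baseline's models):

```python
from abc import ABC, abstractmethod
from typing import Tuple

import torch
from torch import nn


class ModelSplit(ABC, nn.Module):
    """Condensed version of the abstract class above."""

    def __init__(self, model: nn.Module) -> None:
        super().__init__()
        self._body, self._head = self._get_model_parts(model)

    @abstractmethod
    def _get_model_parts(self, model: nn.Module) -> Tuple[nn.Module, nn.Module]:
        """Return (body, head)."""

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        return self._head(self._body(inputs))


class TinyModelSplit(ModelSplit):
    """Body: everything but the last layer; head: the last layer."""

    def _get_model_parts(self, model: nn.Module) -> Tuple[nn.Module, nn.Module]:
        return model[:-1], model[-1]


net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
split = TinyModelSplit(net)

# FedPer-style local step: freeze the body so only the head trains.
for param in split._body.parameters():
    param.requires_grad = False

out = split(torch.randn(3, 4))
```

The `disable_body`/`enable_head` helpers in the class above wrap exactly this kind of `requires_grad` toggling.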
diff --git a/baselines/fedper/fedper/run_figures.sh b/baselines/fedper/fedper/run_figures.sh
new file mode 100755
index 000000000000..9f7382412465
--- /dev/null
+++ b/baselines/fedper/fedper/run_figures.sh
@@ -0,0 +1,36 @@
+#!/bin/bash
+
+# CIFAR10 Mobile and Resnet (non-iid n classes (FIGURE 2a&b))
+for model in mobile resnet
+do
+ for num_classes in 4 8 10
+ do
+ for algorithm in fedper fedavg
+ do
+ python -m fedper.main --config-path conf --config-name cifar10 dataset.num_classes=${num_classes} model_name=${model} algorithm=${algorithm}
+ done
+ done
+done
+
+
+# CIFAR10 Mobile (n head layers (FIGURE 4a))
+for num_head_layers in 2 3 4
+do
+ python -m fedper.main --config-path conf --config-name cifar10 dataset.num_classes=4 model.num_head_layers=${num_head_layers} num_rounds=25 model_name=mobile algorithm=fedper
+done
+python -m fedper.main --config-path conf --config-name cifar10 num_rounds=25 model_name=mobile dataset.num_classes=4
+
+# CIFAR10 Resnet (n head layers (FIGURE 4b))
+for num_head_layers in 1 2 3
+do
+ python -m fedper.main --config-path conf --config-name cifar10 dataset.num_classes=4 model.num_head_layers=${num_head_layers} num_rounds=25 model_name=resnet algorithm=fedper
+done
+python -m fedper.main --config-path conf --config-name cifar10 num_rounds=25 model_name=resnet dataset.num_classes=4
+
+# FLICKR
+for model in mobile resnet
+do
+ python -m fedper.main --config-path conf --config-name flickr model.num_head_layers=2 model_name=${model} algorithm=fedper num_rounds=35
+ python -m fedper.main --config-path conf --config-name flickr model_name=${model} algorithm=fedavg num_rounds=35
+done
+
diff --git a/baselines/fedper/fedper/server.py b/baselines/fedper/fedper/server.py
new file mode 100644
index 000000000000..93616f50f45a
--- /dev/null
+++ b/baselines/fedper/fedper/server.py
@@ -0,0 +1,24 @@
+"""Server strategies pipelines for FedPer."""
+from flwr.server.strategy.fedavg import FedAvg
+
+from fedper.strategy import (
+ AggregateBodyStrategy,
+ AggregateFullStrategy,
+ ServerInitializationStrategy,
+)
+
+
+class InitializationStrategyPipeline(ServerInitializationStrategy):
+ """Initialization strategy pipeline."""
+
+
+class AggregateBodyStrategyPipeline(
+ InitializationStrategyPipeline, AggregateBodyStrategy, FedAvg
+):
+ """Aggregate body strategy pipeline."""
+
+
+class DefaultStrategyPipeline(
+ InitializationStrategyPipeline, AggregateFullStrategy, FedAvg
+):
+ """Default strategy pipeline."""
diff --git a/baselines/fedper/fedper/strategy.py b/baselines/fedper/fedper/strategy.py
new file mode 100644
index 000000000000..5ae55086db2f
--- /dev/null
+++ b/baselines/fedper/fedper/strategy.py
@@ -0,0 +1,437 @@
+"""FL server strategies."""
+from collections import OrderedDict
+from pathlib import Path
+from typing import Any, Callable, Dict, List, Optional, Tuple, Type, Union
+
+import torch
+from flwr.common import (
+ EvaluateIns,
+ EvaluateRes,
+ FitIns,
+ FitRes,
+ NDArrays,
+ Parameters,
+ Scalar,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from flwr.server.client_manager import ClientManager
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.strategy.fedavg import FedAvg
+from torch import nn as nn
+
+from fedper.constants import Algorithms
+from fedper.implemented_models.mobile_model import MobileNetModelSplit
+from fedper.implemented_models.resnet_model import ResNetModelSplit
+from fedper.models import ModelSplit
+
+
+class ServerInitializationStrategy(FedAvg):
+ """Server FL Parameter Initialization strategy implementation."""
+
+ def __init__(
+ self,
+ *args: Any,
+ model_split_class: Union[
+ Type[MobileNetModelSplit], Type[ModelSplit], Type[ResNetModelSplit]
+ ],
+ create_model: Callable[[], nn.Module],
+ initial_parameters: Optional[Parameters] = None,
+ on_fit_config_fn: Optional[Callable[[int], Dict[str, Any]]] = None,
+ evaluate_fn: Optional[
+ Callable[
+ [int, NDArrays, Dict[str, Scalar]],
+ Optional[Tuple[float, Dict[str, Scalar]]],
+ ]
+ ] = None,
+ min_available_clients: int = 1,
+ min_evaluate_clients: int = 1,
+ min_fit_clients: int = 1,
+ algorithm: str = Algorithms.FEDPER.value,
+ **kwargs: Any,
+ ) -> None:
+ super().__init__(*args, **kwargs)
+ _ = evaluate_fn
+ self.on_fit_config_fn = on_fit_config_fn
+ self.initial_parameters = initial_parameters
+ self.min_available_clients = min_available_clients
+ self.min_evaluate_clients = min_evaluate_clients
+ self.min_fit_clients = min_fit_clients
+ self.algorithm = algorithm
+ self.model = model_split_class(model=create_model())
+
+ def initialize_parameters(
+ self, client_manager: ClientManager
+ ) -> Optional[Parameters]:
+ """Initialize the (global) model parameters.
+
+ Args:
+ client_manager: ClientManager. The client manager which holds all currently
+ connected clients.
+
+ Returns
+ -------
+ If parameters are returned, then the server will treat these as the
+ initial global model parameters.
+ """
+ initial_parameters: Optional[Parameters] = self.initial_parameters
+ self.initial_parameters = None # Don't keep initial parameters in memory
+ if initial_parameters is None and self.model is not None:
+ if self.algorithm == Algorithms.FEDPER.value:
+ initial_parameters_use = [
+ val.cpu().numpy() for _, val in self.model.body.state_dict().items()
+ ]
+ else: # FedAvg
+ initial_parameters_use = [
+ val.cpu().numpy() for _, val in self.model.state_dict().items()
+ ]
+
+ if isinstance(initial_parameters_use, list):
+ initial_parameters = ndarrays_to_parameters(initial_parameters_use)
+ return initial_parameters
+
+
+class AggregateFullStrategy(ServerInitializationStrategy):
+ """Full model aggregation strategy implementation."""
+
+ def __init__(self, *args, save_path: Path = Path(""), **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.save_path = save_path if save_path != "" else None
+ if save_path is not None:
+ self.save_path = save_path / "models"
+ self.save_path.mkdir(parents=True, exist_ok=True)
+
+ def configure_evaluate(
+ self, server_round: int, parameters: Parameters, client_manager: ClientManager
+ ) -> List[Tuple[ClientProxy, EvaluateIns]]:
+ """Configure the next round of evaluation.
+
+ Args:
+ server_round: The current round of federated learning.
+ parameters: The current (global) model parameters.
+ client_manager: The client manager which holds all currently
+ connected clients.
+
+ Returns
+ -------
+ A list of tuples. Each tuple in the list identifies a `ClientProxy` and the
+ `EvaluateIns` for this particular `ClientProxy`. If a particular
+ `ClientProxy` is not included in this list, it means that this
+ `ClientProxy` will not participate in the next round of federated
+ evaluation.
+ """
+ # Same as superclass method but adds the head
+
+ # Parameters and config
+ config: Dict[Any, Any] = {}
+
+ weights = parameters_to_ndarrays(parameters)
+
+ parameters = ndarrays_to_parameters(weights)
+
+ evaluate_ins = EvaluateIns(parameters, config)
+
+ # Sample clients
+ if server_round >= 0:
+ # Sample clients
+ sample_size, min_num_clients = self.num_evaluation_clients(
+ client_manager.num_available()
+ )
+ clients = client_manager.sample(
+ num_clients=sample_size,
+ min_num_clients=min_num_clients,
+ )
+ else:
+ clients = list(client_manager.all().values())
+
+ # Return client/config pairs
+ return [(client, evaluate_ins) for client in clients]
+
+ def aggregate_fit(
+ self,
+ server_round: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ ) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
+ """Aggregate received local parameters, set global model parameters and save.
+
+ Args:
+ server_round: The current round of federated learning.
+ results: Successful updates from the previously selected and configured
+ clients. Each pair of `(ClientProxy, FitRes)` constitutes a
+ successful update from one of the previously selected clients. Note
+ that not all previously selected clients are necessarily included in
+ this list: a client might drop out and not submit a result. For each
+ client that did not submit an update, there should be an `Exception`
+ in `failures`.
+ failures: Exceptions that occurred while the server was waiting for client
+ updates.
+
+ Returns
+ -------
+ If parameters are returned, then the server will treat these as the
+ new global model parameters (i.e., it will replace the previous
+ parameters with the ones returned from this method). If `None` is
+ returned (e.g., because there were only failures and no viable
+ results) then the server will not update the previous model
+ parameters, the updates received in this round are discarded, and
+ the global model parameters remain the same.
+ """
+ agg_params, agg_metrics = super().aggregate_fit(
+ server_round=server_round, results=results, failures=failures
+ )
+ if agg_params is not None:
+ # Update Server Model
+ parameters = parameters_to_ndarrays(agg_params)
+ model_keys = [
+ k
+ for k in self.model.state_dict().keys()
+ if k.startswith("_body") or k.startswith("_head")
+ ]
+ params_dict = zip(model_keys, parameters)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+ self.model.set_parameters(state_dict)
+
+ if self.save_path is not None:
+ # Save Model
+ torch.save(self.model, self.save_path / f"model-ep_{server_round}.pt")
+
+ return agg_params, agg_metrics
+
+ def aggregate_evaluate(
+ self,
+ server_round: int,
+ results: List[Tuple[ClientProxy, EvaluateRes]],
+ failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]],
+ ) -> Tuple[Optional[float], Dict[str, Scalar]]:
+ """Aggregate the received local parameters and store the test aggregated.
+
+ Args:
+ server_round: The current round of federated learning.
+ results: Successful updates from the
+ previously selected and configured clients. Each pair of
+ `(ClientProxy, EvaluateRes)` constitutes a successful update from one of the
+ previously selected clients. Note that not all previously selected
+ clients are necessarily included in this list: a client might drop out
+ and not submit a result. For each client that did not submit an update,
+ there should be an `Exception` in `failures`.
+ failures: Exceptions that occurred while the server
+ was waiting for client updates.
+
+ Returns
+ -------
+ Optional `float` representing the aggregated evaluation result. Aggregation
+ typically uses some variant of a weighted average.
+ """
+ aggregated_loss, aggregated_metrics = super().aggregate_evaluate(
+ server_round=server_round, results=results, failures=failures
+ )
+ _ = aggregated_metrics # Avoid unused variable warning
+
+ # Weigh accuracy of each client by number of examples used
+ accuracies: List[float] = []
+ for _, res in results:
+ accuracy: float = float(res.metrics["accuracy"])
+ accuracies.append(accuracy)
+ print(f"Round {server_round} accuracies: {accuracies}")
+
+ # Aggregate and print custom metric
+ averaged_accuracy = sum(accuracies) / len(accuracies)
+ print(f"Round {server_round} accuracy averaged: {averaged_accuracy}")
+ return aggregated_loss, {"accuracy": averaged_accuracy}
+
+
+class AggregateBodyStrategy(ServerInitializationStrategy):
+ """Body Aggregation strategy implementation."""
+
+ def __init__(self, *args, save_path: Path = Path(""), **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ # Treat the default empty path as "do not save".
+ self.save_path: Optional[Path] = None
+ if save_path != Path(""):
+ self.save_path = save_path / "models"
+ self.save_path.mkdir(parents=True, exist_ok=True)
+
+ def configure_fit(
+ self, server_round: int, parameters: Parameters, client_manager: ClientManager
+ ) -> List[Tuple[ClientProxy, FitIns]]:
+ """Configure the next round of training.
+
+ Args:
+ server_round: The current round of federated learning.
+ parameters: The current (global) model parameters.
+ client_manager: The client manager which holds all
+ currently connected clients.
+
+ Returns
+ -------
+ A list of tuples. Each tuple in the list identifies a `ClientProxy` and the
+ `FitIns` for this particular `ClientProxy`. If a particular `ClientProxy`
+ is not included in this list, it means that this `ClientProxy`
+ will not participate in the next round of federated learning.
+ """
+ # Same as superclass method but adds the head
+
+ config = {}
+ if self.on_fit_config_fn is not None:
+ # Custom fit config function provided
+ config = self.on_fit_config_fn(server_round)
+
+ weights = parameters_to_ndarrays(parameters)
+
+ # Add head parameters to received body parameters
+ weights.extend(
+ [val.cpu().numpy() for _, val in self.model.head.state_dict().items()]
+ )
+
+ parameters = ndarrays_to_parameters(weights)
+
+ fit_ins = FitIns(parameters, config)
+
+ # Sample clients
+ clients = client_manager.sample(
+ num_clients=self.min_available_clients, min_num_clients=self.min_fit_clients
+ )
+
+ # Return client/config pairs
+ return [(client, fit_ins) for client in clients]
+
+ def configure_evaluate(
+ self, server_round: int, parameters: Parameters, client_manager: ClientManager
+ ) -> List[Tuple[ClientProxy, EvaluateIns]]:
+ """Configure the next round of evaluation.
+
+ Args:
+ server_round: The current round of federated learning.
+ parameters: The current (global) model parameters.
+ client_manager: The client manager which holds all currently
+ connected clients.
+
+ Returns
+ -------
+ A list of tuples. Each tuple in the list identifies a `ClientProxy` and the
+ `EvaluateIns` for this particular `ClientProxy`. If a particular
+ `ClientProxy` is not included in this list, it means that this
+ `ClientProxy` will not participate in the next round of federated
+ evaluation.
+ """
+ # Same as superclass method but adds the head
+
+ # Parameters and config
+ config: Dict[Any, Any] = {}
+
+ weights = parameters_to_ndarrays(parameters)
+
+ # Add head parameters to received body parameters
+ weights.extend(
+ [val.cpu().numpy() for _, val in self.model.head.state_dict().items()]
+ )
+
+ parameters = ndarrays_to_parameters(weights)
+
+ evaluate_ins = EvaluateIns(parameters, config)
+
+ # Sample clients
+ if server_round >= 0:
+ # Sample clients
+ sample_size, min_num_clients = self.num_evaluation_clients(
+ client_manager.num_available()
+ )
+ clients = client_manager.sample(
+ num_clients=sample_size,
+ min_num_clients=min_num_clients,
+ )
+ else:
+ clients = list(client_manager.all().values())
+
+ # Return client/config pairs
+ return [(client, evaluate_ins) for client in clients]
+
+ def aggregate_fit(
+ self,
+ server_round: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ ) -> Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]]:
+ """Aggregate received local parameters, set global model parameters and save.
+
+ Args:
+ server_round: The current round of federated learning.
+ results: Successful updates from the previously selected and configured
+ clients. Each pair of `(ClientProxy, FitRes)` constitutes a
+ successful update from one of the previously selected clients. Note
+ that not all previously selected clients are necessarily included in
+ this list: a client might drop out and not submit a result. For each
+ client that did not submit an update, there should be an `Exception`
+ in `failures`.
+ failures: Exceptions that occurred while the server was waiting for client
+ updates.
+
+ Returns
+ -------
+ If parameters are returned, then the server will treat these as the
+ new global model parameters (i.e., it will replace the previous
+ parameters with the ones returned from this method). If `None` is
+ returned (e.g., because there were only failures and no viable
+ results) then the server will not update the previous model
+ parameters, the updates received in this round are discarded, and
+ the global model parameters remain the same.
+ """
+ agg_params, agg_metrics = super().aggregate_fit(
+ server_round=server_round, results=results, failures=failures
+ )
+ if agg_params is not None:
+ parameters = parameters_to_ndarrays(agg_params)
+ model_keys = [
+ k for k in self.model.state_dict().keys() if k.startswith("_body")
+ ]
+ params_dict = zip(model_keys, parameters)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+ self.model.set_parameters(state_dict)
+
+ if self.save_path is not None:
+ # Save Model
+ torch.save(self.model, self.save_path / f"model-ep_{server_round}.pt")
+
+ return agg_params, agg_metrics
+
+ def aggregate_evaluate(
+ self,
+ server_round: int,
+ results: List[Tuple[ClientProxy, EvaluateRes]],
+ failures: List[Union[Tuple[ClientProxy, EvaluateRes], BaseException]],
+ ) -> Tuple[Optional[float], Dict[str, Scalar]]:
+ """Aggregate the received local parameters and store the test aggregated.
+
+ Args:
+ server_round: The current round of federated learning.
+ results: Successful updates from the
+ previously selected and configured clients. Each pair of
+ `(ClientProxy, EvaluateRes)` constitutes a successful update from one of the
+ previously selected clients. Note that not all previously selected
+ clients are necessarily included in this list: a client might drop out
+ and not submit a result. For each client that did not submit an update,
+ there should be an `Exception` in `failures`.
+ failures: Exceptions that occurred while the server
+ was waiting for client updates.
+
+ Returns
+ -------
+ Optional `float` representing the aggregated evaluation result. Aggregation
+ typically uses some variant of a weighted average.
+ """
+ aggregated_loss, aggregated_metrics = super().aggregate_evaluate(
+ server_round=server_round, results=results, failures=failures
+ )
+ _ = aggregated_metrics # Avoid unused variable warning
+
+ # Weigh accuracy of each client by number of examples used
+ accuracies: List[float] = []
+ for _, res in results:
+ accuracy: float = float(res.metrics["accuracy"])
+ accuracies.append(accuracy)
+ print(f"Round {server_round} accuracies: {accuracies}")
+
+ # Aggregate and print custom metric
+ averaged_accuracy = sum(accuracies) / len(accuracies)
+ print(f"Round {server_round} accuracy averaged: {averaged_accuracy}")
+ return aggregated_loss, {"accuracy": averaged_accuracy}
diff --git a/baselines/fedper/fedper/utils.py b/baselines/fedper/fedper/utils.py
new file mode 100644
index 000000000000..00b4c5318729
--- /dev/null
+++ b/baselines/fedper/fedper/utils.py
@@ -0,0 +1,225 @@
+"""Utility functions for FedPer."""
+import os
+import pickle
+import time
+from pathlib import Path
+from secrets import token_hex
+from typing import Callable, Optional, Type, Union
+
+import matplotlib.pyplot as plt
+import numpy as np
+from flwr.server.history import History
+from omegaconf import DictConfig
+
+from fedper.client import BaseClient, FedPerClient, get_client_fn_simulation
+from fedper.implemented_models.mobile_model import MobileNet, MobileNetModelSplit
+from fedper.implemented_models.resnet_model import ResNet, ResNetModelSplit
+
+
+def set_model_class(config: DictConfig) -> DictConfig:
+ """Set model class based on the model name in the config file."""
+ # Set the model class
+ if config.model_name.lower() == "resnet":
+ config.model["_target_"] = "fedper.implemented_models.resnet_model.ResNet"
+ elif config.model_name.lower() == "mobile":
+ config.model["_target_"] = "fedper.implemented_models.mobile_model.MobileNet"
+ else:
+ raise NotImplementedError(f"Model {config.model.name} not implemented")
+ return config
+
+
+def set_num_classes(config: DictConfig) -> DictConfig:
+ """Set the number of classes based on the dataset name in the config file."""
+ # Set the number of classes
+ if config.dataset.name.lower() == "cifar10":
+ config.model.num_classes = 10
+ elif config.dataset.name.lower() == "flickr":
+ config.model.num_classes = 5
+ # additionally for flickr
+ config.batch_size = 4
+ config.num_clients = 30
+ config.clients_per_round = 30
+ else:
+ raise NotImplementedError(f"Dataset {config.dataset.name} not implemented")
+ return config
+
+
+def set_server_target(config: DictConfig) -> DictConfig:
+ """Set the server target based on the algorithm in the config file."""
+ # Set the server target
+ if config.algorithm.lower() == "fedper":
+ config.strategy["_target_"] = "fedper.server.AggregateBodyStrategyPipeline"
+ elif config.algorithm.lower() == "fedavg":
+ config.strategy["_target_"] = "fedper.server.DefaultStrategyPipeline"
+ else:
+ raise NotImplementedError(f"Algorithm {config.algorithm} not implemented")
+ return config
+
+
+def set_client_state_save_path() -> str:
+ """Set the client state save path."""
+ client_state_save_path = time.strftime("%Y-%m-%d")
+ client_state_sub_path = time.strftime("%H-%M-%S")
+ client_state_save_path = (
+ f"./client_states/{client_state_save_path}/{client_state_sub_path}"
+ )
+ if not os.path.exists(client_state_save_path):
+ os.makedirs(client_state_save_path)
+ return client_state_save_path
+
+
+def get_client_fn(
+ config: DictConfig, client_state_save_path: str = ""
+) -> Callable[[str], Union[FedPerClient, BaseClient]]:
+ """Get client function."""
+ # Get algorithm
+ algorithm = config.algorithm.lower()
+ # Get client fn
+ if algorithm == "fedper":
+ client_fn = get_client_fn_simulation(
+ config=config,
+ client_state_save_path=client_state_save_path,
+ )
+ elif algorithm == "fedavg":
+ client_fn = get_client_fn_simulation(
+ config=config,
+ )
+ else:
+ raise NotImplementedError
+ return client_fn
+
+
+def get_create_model_fn(
+ config: DictConfig,
+) -> tuple[
+ Callable[[], Union[MobileNet, ResNet]],
+ Union[Type[MobileNetModelSplit], Type[ResNetModelSplit]],
+]:
+ """Get create model function."""
+ device = config.server_device
+ split: Union[
+ Type[MobileNetModelSplit], Type[ResNetModelSplit]
+ ] = MobileNetModelSplit
+ if config.model_name.lower() == "mobile":
+
+ def create_model() -> Union[MobileNet, ResNet]:
+ """Create initial MobileNet-v1 model."""
+ return MobileNet(
+ num_head_layers=config.model.num_head_layers,
+ num_classes=config.model.num_classes,
+ ).to(device)
+
+ elif config.model_name.lower() == "resnet":
+ split = ResNetModelSplit
+
+ def create_model() -> Union[MobileNet, ResNet]:
+ """Create initial ResNet model."""
+ return ResNet(
+ num_head_layers=config.model.num_head_layers,
+ num_classes=config.model.num_classes,
+ ).to(device)
+
+ else:
+ raise NotImplementedError("Model not implemented, check name. ")
+ return create_model, split
+
+
+def plot_metric_from_history(
+ hist: History,
+ save_plot_path: Path,
+ suffix: Optional[str] = "",
+) -> None:
+ """Plot from Flower server History.
+
+ Parameters
+ ----------
+ hist : History
+ Object containing evaluation for all rounds.
+ save_plot_path : Path
+ Folder to save the plot to.
+ suffix: Optional[str]
+ Optional string to add at the end of the filename for the plot.
+ """
+ metric_type = "distributed"
+ metric_dict = (
+ hist.metrics_centralized
+ if metric_type == "centralized"
+ else hist.metrics_distributed
+ )
+ _, values = zip(*metric_dict["accuracy"])
+
+ # let's extract decentralized loss (main metric reported in FedProx paper)
+ rounds_loss, values_loss = zip(*hist.losses_distributed)
+
+ _, axs = plt.subplots(nrows=2, ncols=1, sharex="row")
+ axs[0].plot(np.asarray(rounds_loss), np.asarray(values_loss))
+ axs[1].plot(np.asarray(rounds_loss), np.asarray(values))
+
+ axs[0].set_ylabel("Loss")
+ axs[1].set_ylabel("Accuracy")
+
+ axs[0].grid()
+ axs[1].grid()
+ # plt.title(f"{metric_type.capitalize()} Validation - MNIST")
+ plt.xlabel("Rounds")
+ # plt.legend(loc="lower right")
+
+ plt.savefig(Path(save_plot_path) / Path(f"{metric_type}_metrics{suffix}.png"))
+ plt.close()
+
+
+def save_results_as_pickle(
+ history: History,
+ file_path: Union[str, Path],
+ default_filename: Optional[str] = "results.pkl",
+) -> None:
+ """Save results from simulation to pickle.
+
+ Parameters
+ ----------
+ history: History
+ History returned by start_simulation.
+ file_path: Union[str, Path]
+        Path to the file in which to store the history. If the path
+        is a directory, the default_filename will be used. If the path
+        doesn't exist, it will be created. If the file already exists,
+        a randomly generated suffix is added to the file name to avoid
+        overwriting results.
+    default_filename: Optional[str]
+        File name used by default if file_path points to a directory
+        instead of a file. Default: "results.pkl"
+ """
+ path = Path(file_path)
+
+ # ensure path exists
+ path.mkdir(exist_ok=True, parents=True)
+
+ def _add_random_suffix(path_: Path):
+ """Add a random suffix to the file name."""
+ print(f"File `{path_}` exists! ")
+ suffix = token_hex(4)
+ print(f"New results to be saved with suffix: {suffix}")
+ return path_.parent / (path_.stem + "_" + suffix + ".pkl")
+
+ def _complete_path_with_default_name(path_: Path):
+ """Append the default file name to the path."""
+ print("Using default filename")
+ if default_filename is None:
+ return path_
+ return path_ / default_filename
+
+ if path.is_dir():
+ path = _complete_path_with_default_name(path)
+
+ if path.is_file():
+ path = _add_random_suffix(path)
+
+ print(f"Results will be saved into: {path}")
+    data = {"history": history}
+ # save results to pickle
+ with open(str(path), "wb") as handle:
+ pickle.dump(data, handle, protocol=pickle.HIGHEST_PROTOCOL)
diff --git a/baselines/fedper/pyproject.toml b/baselines/fedper/pyproject.toml
new file mode 100644
index 000000000000..efcdf25eface
--- /dev/null
+++ b/baselines/fedper/pyproject.toml
@@ -0,0 +1,143 @@
+[build-system]
+requires = ["poetry-core>=1.4.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "fedper" # <----- Ensure it matches the name of your baseline directory containing all the source code
+version = "1.0.0"
+description = "Federated Learning with Personalization Layers"
+license = "Apache-2.0"
+authors = ["The Flower Authors ", "William Lindskog "]
+readme = "README.md"
+homepage = "https://flower.dev"
+repository = "https://github.com/adap/flower"
+documentation = "https://flower.dev"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "License :: OSI Approved :: Apache Software License",
+ "Operating System :: MacOS :: MacOS X",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+ "Topic :: Scientific/Engineering :: Mathematics",
+ "Topic :: Software Development",
+ "Topic :: Software Development :: Libraries",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Typing :: Typed",
+]
+
+[tool.poetry.dependencies]
+python = ">=3.10.0, <3.11.0" # don't change this
+flwr = {extras = ["simulation"], version = "1.5.0" }
+hydra-core = "1.3.2" # don't change this
+pandas = "^2.0.3"
+matplotlib = "^3.7.2"
+tqdm = "^4.66.1"
+torch = { url = "https://download.pytorch.org/whl/cu117/torch-2.0.1%2Bcu117-cp310-cp310-linux_x86_64.whl"}
+torchvision = { url = "https://download.pytorch.org/whl/cu117/torchvision-0.15.2%2Bcu117-cp310-cp310-linux_x86_64.whl"}
+
+
+[tool.poetry.dev-dependencies]
+isort = "==5.11.5"
+black = "==23.1.0"
+docformatter = "==1.5.1"
+mypy = "==1.4.1"
+pylint = "==2.8.2"
+flake8 = "==3.9.2"
+pytest = "==6.2.4"
+pytest-watch = "==4.2.0"
+ruff = "==0.0.272"
+types-requests = "==2.27.7"
+
+[tool.isort]
+line_length = 88
+indent = " "
+multi_line_output = 3
+include_trailing_comma = true
+force_grid_wrap = 0
+use_parentheses = true
+
+[tool.black]
+line-length = 88
+target-version = ["py38", "py39", "py310", "py311"]
+
+[tool.pytest.ini_options]
+minversion = "6.2"
+addopts = "-qq"
+testpaths = [
+ "flwr_baselines",
+]
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = false
+plugins = "numpy.typing.mypy_plugin"
+
+[tool.pylint."MESSAGES CONTROL"]
+disable = "bad-continuation,duplicate-code,too-few-public-methods,useless-import-alias"
+good-names = "i,j,k,_,x,y,X,Y"
+signature-mutators="hydra.main.main"
+
+[tool.pylint."TYPECHECK"]
+generated-members="numpy.*, torch.*, tensorflow.*"
+
+[[tool.mypy.overrides]]
+module = [
+ "importlib.metadata.*",
+ "importlib_metadata.*",
+]
+follow_imports = "skip"
+follow_imports_for_stubs = true
+disallow_untyped_calls = false
+
+[[tool.mypy.overrides]]
+module = "torch.*"
+follow_imports = "skip"
+follow_imports_for_stubs = true
+
+[tool.docformatter]
+wrap-summaries = 88
+wrap-descriptions = 88
+
+[tool.ruff]
+target-version = "py38"
+line-length = 88
+select = ["D", "E", "F", "W", "B", "ISC", "C4"]
+fixable = ["D", "E", "F", "W", "B", "ISC", "C4"]
+ignore = ["B024", "B027"]
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".pytype",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+ "proto",
+]
+
+[tool.ruff.pydocstyle]
+convention = "numpy"
\ No newline at end of file
diff --git a/baselines/fedwav2vec2/.gitignore b/baselines/fedwav2vec2/.gitignore
new file mode 100644
index 000000000000..df43bf9803df
--- /dev/null
+++ b/baselines/fedwav2vec2/.gitignore
@@ -0,0 +1,2 @@
+outputs/
+data/
\ No newline at end of file
diff --git a/baselines/fedwav2vec2/LICENSE b/baselines/fedwav2vec2/LICENSE
new file mode 100644
index 000000000000..d64569567334
--- /dev/null
+++ b/baselines/fedwav2vec2/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/baselines/fedwav2vec2/README.md b/baselines/fedwav2vec2/README.md
new file mode 100644
index 000000000000..0b41c6172976
--- /dev/null
+++ b/baselines/fedwav2vec2/README.md
@@ -0,0 +1,131 @@
+---
+title: Federated Learning for ASR based on Wav2vec2.0
+url: https://ieeexplore.ieee.org/document/10096426
+labels: [speech, asr, cross-device]
+dataset: [TED-LIUM 3]
+---
+
+# Federated Learning for ASR Based on wav2vec 2.0
+
+> Note: If you use this baseline in your work, please remember to cite the original authors of the paper as well as the Flower paper.
+
+**Paper:** [ieeexplore.ieee.org/document/10096426](https://ieeexplore.ieee.org/document/10096426)
+
+**Authors:** Tuan Nguyen, Salima Mdhaffar, Natalia Tomashenko, Jean-François Bonastre, Yannick Estève
+
+**Abstract:** This paper presents a study on the use of federated learning to train an ASR model based on a wav2vec 2.0 model pre-trained by self supervision. Carried out on the well-known TED-LIUM 3 dataset, our experiments show that such a model can obtain, with no use of a language model, a word error rate of 10.92% on the official TEDLIUM 3 test set, without sharing any data from the different users. We also analyse the ASR performance for speakers depending to their participation to the federated learning. Since federated learning was first introduced for privacy purposes, we also measure its ability to protect speaker identity. To do that, we exploit an approach to analyze information contained in exchanged models based on a neural network footprint on an indicator dataset. This analysis is made layer-wise and shows which layers in an exchanged wav2vec 2.0 based model bring the speaker identity information.
+
+
+## About this baseline
+
+**What’s implemented:** Figure 1 in the paper. This baseline exclusively offers the self-supervised learning (SSL) approach depicted in Figure 1, due to its superior performance. If you wish to implement non-SSL methods yourself, you can use the recipe and pre-trained model provided by SpeechBrain, available at this link: [SpeechBrain Recipe for Non-SSL](https://github.com/speechbrain/speechbrain/tree/develop/recipes/CommonVoice/ASR/seq2seq).
+
+**Datasets:** TED-LIUM 3 dataset. It requires a 54GB download. Once extracted, it is ~60GB. You can read more about this dataset in the [TED-LIUM 3](https://arxiv.org/abs/1805.04699) paper. A more concise description can be found on the [OpenSLR](https://www.openslr.org/51/) site.
+
+**Hardware Setup:** Training `wav2vec2.0` is fairly memory intensive, so you'll need at least a 24GB GPU. With the current settings, each client requires ~15GB of VRAM. This suggests you could run the experiment on a 16GB GPU, but not if you also need to fit the global model evaluation stage on the same GPU. On a single RTX 3090Ti (24GB VRAM), each round takes between 20 and 40 minutes (depending on which clients are sampled; some clients have more data than others).
+
+**Contributors:** [Tuan Nguyen](https://www.linkedin.com/in/manh-tuan-nguyen-595898203)
+
+## Experimental Setup
+
+**Task:** Automatic Speech Recognition (ASR)
+
+**Model:** Wav2vec2.0-large [from Huggingface](https://huggingface.co/facebook/wav2vec2-large-lv60) totalling 317M parameters. Read more in the [wav2vec2.0 paper](https://arxiv.org/abs/2006.11477).
+
+
+**Dataset:** In this paper, we divided the TED-LIUM 3 training set into 1943 clients, each corresponding to a speaker from TED-LIUM 3. The clients are ordered by CID, with `client_0` having the largest amount of speech hours and `client_1943` the smallest. Each client's data is divided into training, development, and test sets with an 80-10-10 ratio. For clients with more than 10 minutes of speech, we extract 5 minutes from their training set for analysis purposes; this portion is not used during training or in any other part of this baseline. For clients with less than 10 minutes, all of the speaker's data constitutes the client's local dataset. The full structure breakdown is below:
+```bash
+├── data
+│ ├── client_{cid}
+│ │ ├── ted_train.csv
+│ │ ├── ted_dev.csv
+│ │ ├── ted_test.csv
+│ │ ├── ted_train_full5.csv {analysis set: the 5 minutes extracted from ted_train.csv}
+│ │ ├── ted_train_wo5.csv {training file for clients with more than 10 minutes of speech}
+│ ├── server
+│ │ ├── ted_train.csv {all TED-LIUM 3 train set}
+│ │ ├── ted_dev.csv {all TED-LIUM 3 valid set}
+│ │ ├── ted_test.csv {all TED-LIUM 3 test set}
+
+```
+For more details, please refer to the relevant section in the paper.
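Given this layout, a client's CSV paths can be derived from its ID. A minimal sketch (the helper name and the `data_dir` default are illustrative, not part of the baseline's code):

```python
from pathlib import Path


def client_csv_paths(cid: int, data_dir: str = "data") -> dict:
    """Return the CSV paths for one client, following the layout above.

    Note: `ted_train_wo5.csv` only exists for clients with more than
    10 minutes of speech; callers should fall back to `ted_train.csv`.
    """
    base = Path(data_dir) / f"client_{cid}"
    return {
        "train": base / "ted_train.csv",
        "dev": base / "ted_dev.csv",
        "test": base / "ted_test.csv",
        "train_wo5": base / "ted_train_wo5.csv",
    }


paths = client_csv_paths(0)
print(paths["train"].name)  # ted_train.csv
```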
+
+**Training Hyperparameters:**
+| Hyperparameter | Default Value | Description |
+| ------- | ----- | ------- |
+| `pre_train_model_path` | `null` | Path to a pre-trained model or checkpoint. The best checkpoint can be found [here](https://github.com/tuanct1997/Federated-Learning-ASR-based-on-wav2vec-2.0/tree/main/material/pre-trained) |
+| `save_checkpoint` | `null` | Path to the folder where the server model will be saved at each round |
+| `label_path` | `docs/pretrained_wav2vec2` | Character label encoder shared by every client to ensure consistency during the training phase |
+| `sb_config` | `fedwav2vec2/conf/sb_config/w2v2.yaml` | SpeechBrain config file for the model architecture. Please refer to [SpeechBrain](https://github.com/speechbrain/speechbrain) for more information |
+| `rounds` | `100` | Number of federated learning (FL) rounds |
+| `local_epochs` | `20` | Number of training epochs on the client side |
+| `total_clients` | `1943` | Size of the client pool, with a maximum of 1943 clients |
+| `server_cid` | `19999` | ID of the server to distinguish from the client's ID |
+| `server_device` | `cuda` | You can choose between `cpu` or `cuda` for centralised evaluation, but it is recommended to use `cuda`|
+| `parallel_backend` | `false` | Multi-GPU training. Only active if you have more than one GPU per client |
+| `strategy.min_fit_client` | `20` | Number of clients involved per round. Default is 20, as indicated in the paper |
+| `strategy.fraction_fit` | `0.01` | Fraction of the client pool involved during training |
+| `strategy.weight_strategy` | `num` | Method used to average client weights. Choose between `num`, `loss`, and `wer` |
+| `client_resources.num_cpus` | `8` | Number of CPUs per client. Recommended: more than 8 |
+| `client_resources.num_gpus` | `1` | Number of GPUs per client. Recommended: at least 1 with VRAM > 24GB |
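The `strategy.weight_strategy` option controls how client updates are combined on the server. As a rough, illustrative sketch only (not the baseline's actual strategy code): with `num` the weights are the clients' example counts, while with `loss` or `wer` they are scores transformed so that better-performing clients receive larger weights.

```python
def aggregate(client_params: list, weights: list) -> list:
    """Weighted average of per-client parameter vectors.

    Illustrative only: with the `num` strategy, `weights` would be the
    clients' example counts; with `loss`/`wer`, transformed scores.
    """
    total = sum(weights)
    n_params = len(client_params[0])
    return [
        sum(w * params[i] for params, w in zip(client_params, weights)) / total
        for i in range(n_params)
    ]


# Two clients with 300 and 100 examples (num strategy):
avg = aggregate([[1.0, 2.0], [5.0, 6.0]], weights=[300, 100])
print(avg)  # [2.0, 3.0]
```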
+
+
+By default, long audio sequences (>10s) are excluded from training. This is done to keep the VRAM usage low enough to train a client on a 16GB GPU. This hyperparameter is defined in the `sb_config` under the `avoid_if_longer_than` tag.
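Conceptually, this filter amounts to dropping utterances whose duration exceeds the threshold before building the dataloader. The sketch below is illustrative only (the `duration` field name is an assumption; SpeechBrain applies the filter internally via `avoid_if_longer_than`):

```python
def filter_long_utterances(rows, max_duration=10.0):
    """Keep only utterances at most `max_duration` seconds long.

    Illustrative version of what `avoid_if_longer_than` does: lowering
    the threshold reduces peak VRAM at the cost of discarding more data.
    """
    return [row for row in rows if row["duration"] <= max_duration]


rows = [
    {"wav": "a.wav", "duration": 4.2},
    {"wav": "b.wav", "duration": 12.7},  # dropped: longer than 10 s
    {"wav": "c.wav", "duration": 9.9},
]
print([r["wav"] for r in filter_long_utterances(rows)])  # ['a.wav', 'c.wav']
```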
+
+## Environment Setup
+
+Once you have installed `pyenv` and `poetry`, run the commands below to setup your python environment:
+
+```bash
+# Set a recent version of Python for your environment
+pyenv local 3.10.6
+poetry env use 3.10.6
+
+# Install your environment
+poetry install
+
+# Activate your environment
+poetry shell
+```
+
+When you run this baseline for the first time, you first need to download the data-to-client mapping files as well as the `TED-LIUM 3` dataset.
+
+```bash
+# Create a directory using the same name as you'll use for `data_dir` in your config (see conf/base.yaml)
+mkdir data
+
+# Clone client mapping (note content will be moved to your data dir)
+git clone https://github.com/tuanct1997/Federated-Learning-ASR-based-on-wav2vec-2.0.git _temp && mv _temp/data/* data/ && rm -rf _temp
+
+# Download dataset, extract and prepare dataset partitions
+# This might take a while depending on your internet connection
+python -m fedwav2vec2.dataset_preparation
+```
+
+
+## Running the Experiments
+
+```bash
+# Run with default arguments (one client per GPU)
+python -m fedwav2vec2.main
+
+# if you have a large GPU (32GB+) you might want to fit two clients per GPU
+python -m fedwav2vec2.main client_resources.num_gpus=0.5
+
+# the global model can be saved at the end of each round if you specify a checkpoint path
+python -m fedwav2vec2.main save_checkpoint= # if directory doesn't exist, it will be created
+
+# then you can use it as the starting point for your global model like so:
+python -m fedwav2vec2.main pre_train_model_path=/last_checkpoint.pt
+```
+
+When running the experiment, a structure of directories `/` will be created by Hydra. Inside you'll find a directory for each client (where their log is recorded). Another directory at the same level, `/server`, is created where the server log is recorded. For this baseline the metric of interest is the Word Error Rate (`WER`), which is logged in `train_log.txt` at the end of each round.
+
+
+## Expected Results
+
+Running the command above will generate the `SSL` results shown in the plot below. The results should closely follow those of Figure 1 in the paper.
+
+![Expected WER results](_static/fedwav2vec.png)
+
+
diff --git a/baselines/fedwav2vec2/_static/fedwav2vec.png b/baselines/fedwav2vec2/_static/fedwav2vec.png
new file mode 100644
index 000000000000..27a2a7c5d7c1
Binary files /dev/null and b/baselines/fedwav2vec2/_static/fedwav2vec.png differ
diff --git a/baselines/fedwav2vec2/docs/label_encoder.txt b/baselines/fedwav2vec2/docs/label_encoder.txt
new file mode 100644
index 000000000000..654e01e1065d
--- /dev/null
+++ b/baselines/fedwav2vec2/docs/label_encoder.txt
@@ -0,0 +1,54 @@
+'t' => 50
+'h' => 1
+'e' => 2
+'a' => 3
+'_' => 4
+'o' => 5
+'n' => 6
+'l' => 7
+'i' => 8
+'r' => 9
+'s' => 10
+'p' => 11
+'d' => 12
+'w' => 13
+'u' => 14
+'k' => 15
+'c' => 16
+'m' => 17
+'y' => 18
+'v' => 19
+'z' => 20
+'f' => 21
+'b' => 22
+'g' => 23
+'j' => 24
+"'" => 25
+'x' => 26
+'q' => 27
+'4' => 28
+'2' => 29
+'7' => 30
+'[' => 31
+']' => 32
+'1' => 33
+'9' => 34
+'0' => 35
+'5' => 36
+'3' => 37
+'6' => 38
+'=' => 39
+'%' => 40
+'$' => 41
+'8' => 42
+'#' => 43
+'ā' => 44
+'&' => 45
+'+' => 46
+'@' => 47
+'^' => 48
+'\\' => 49
+'' => 0
+================
+'starting_index' => 0
+'blank_label' => ''
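The mapping above can be applied directly as a character-level encoder. A tiny illustrative sketch using a few of the entries (index 0 is the blank label, as stated in the file):

```python
# A few entries reproduced from label_encoder.txt (index 0 is the blank label)
LAB2IND = {"t": 50, "h": 1, "e": 2, "a": 3, "_": 4, "": 0}


def encode(text: str) -> list:
    """Encode a string into label indices, character by character."""
    return [LAB2IND[ch] for ch in text]


print(encode("the"))  # [50, 1, 2]
```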
diff --git a/baselines/fedwav2vec2/fedwav2vec2/__init__.py b/baselines/fedwav2vec2/fedwav2vec2/__init__.py
new file mode 100644
index 000000000000..a5e567b59135
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/__init__.py
@@ -0,0 +1 @@
+"""Fedwav2vec2 baseline package."""
diff --git a/baselines/fedwav2vec2/fedwav2vec2/client.py b/baselines/fedwav2vec2/fedwav2vec2/client.py
new file mode 100644
index 000000000000..319580a83845
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/client.py
@@ -0,0 +1,172 @@
+"""Define your client class and a function to construct such clients.
+
+Please overwrite `flwr.client.NumPyClient` or `flwr.client.Client` and create a function
+to instantiate your client.
+"""
+
+
+import gc
+import logging
+from math import exp
+
+import flwr as fl
+import speechbrain as sb
+import torch
+from flwr.common import (
+ Code,
+ EvaluateIns,
+ EvaluateRes,
+ FitIns,
+ FitRes,
+ GetParametersIns,
+ GetParametersRes,
+ NDArrays,
+ Status,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from omegaconf import DictConfig
+
+from fedwav2vec2.models import int_model
+from fedwav2vec2.sb_recipe import get_weights, set_weights
+
+
+class SpeechBrainClient(fl.client.Client):
+ """Flower client for SpeechBrain."""
+
+ def __init__(self, cid: str, asr_brain, dataset):
+ self.cid = cid
+ self.params = asr_brain.hparams
+ self.modules = asr_brain.modules
+ self.asr_brain = asr_brain
+ self.dataset = dataset
+
+ fl.common.logger.log(logging.DEBUG, "Starting client %s", cid)
+
+ def get_parameters(self, _: GetParametersIns) -> GetParametersRes:
+ """Return the parameters of the current net."""
+ weights: NDArrays = get_weights(self.modules)
+ parameters = ndarrays_to_parameters(weights)
+ gc.collect()
+ status = Status(code=Code.OK, message="Success")
+ return GetParametersRes(status=status, parameters=parameters)
+
+ def fit(self, ins: FitIns) -> FitRes:
+ """Implement distributed fit function for a given client."""
+ weights: NDArrays = fl.common.parameters_to_ndarrays(ins.parameters)
+ config = ins.config
+
+ # Read training configuration
+ epochs = int(config["epochs"])
+
+ (_, num_examples, avg_loss, avg_wer) = self._train_speech_recogniser(
+ weights, epochs
+ )
+ metrics = {"train_loss": avg_loss, "wer": avg_wer}
+
+ parameters = self.get_parameters(GetParametersIns(config={})).parameters
+ del self.asr_brain.modules
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+
+ status = Status(code=Code.OK, message="Success")
+
+ return FitRes(
+ status=status,
+ parameters=parameters,
+ num_examples=num_examples,
+ metrics=metrics,
+ )
+
+ def evaluate(self, ins: EvaluateIns) -> EvaluateRes:
+ """Implement distributed evaluation for a given client."""
+ weights = parameters_to_ndarrays(ins.parameters)
+
+ num_examples, loss, wer = self.evaluate_train_speech_recogniser(
+ server_params=weights,
+ epochs=1,
+ )
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+
+ status = Status(code=Code.OK, message="Success")
+ # Return the number of evaluation examples and the evaluation result (loss)
+ return EvaluateRes(
+ status=status,
+ num_examples=num_examples,
+ loss=float(loss),
+ metrics={"Error rate": float(wer)},
+ )
+
+ def evaluate_train_speech_recogniser(self, server_params, epochs):
+        """Evaluate the aggregated/server model."""
+ _, _, test_data = self._setup_task(server_params, epochs)
+ self.params.wer_file = self.params.output_folder + "/wer_test.txt"
+
+ batch_count, loss, wer = self.asr_brain.evaluate(
+ test_data,
+ test_loader_kwargs=self.params.test_dataloader_options,
+ )
+
+ return batch_count, float(loss), float(wer)
+
+ def _setup_task(
+ self,
+ server_params,
+ epochs,
+ ):
+ self.params.epoch_counter.limit = epochs
+ self.params.epoch_counter.current = 0
+
+ train_data, valid_data, test_data = self.dataset
+ # Set the parameters to the ones given by the server
+ if server_params is not None:
+ set_weights(server_params, self.modules, self.params.device)
+ return train_data, valid_data, test_data
+
+ def _train_speech_recogniser(self, server_params, epochs):
+ train_data, valid_data, _ = self._setup_task(server_params, epochs)
+
+ # Training
+ count_sample, avg_loss, avg_wer = self.asr_brain.fit(
+ self.params.epoch_counter,
+ train_data,
+ valid_data,
+ train_loader_kwargs=self.params.dataloader_options,
+ valid_loader_kwargs=self.params.test_dataloader_options,
+ )
+ # Clamp WER at 100, then exponentiate loss and WER so that lower
+ # values map to larger weights when the server weights clients by metric
+ avg_wer = 100 if avg_wer > 100 else avg_wer
+ avg_loss = exp(-avg_loss)
+ avg_wer = exp(100 - avg_wer)
+
+ # retrieve the parameters to return
+ params_list = get_weights(self.modules)
+
+ # Cap the sample count in case the last batch wasn't full w.r.t. the batch size
+ train_set = sb.dataio.dataloader.make_dataloader(
+ train_data, **self.params.dataloader_options
+ )
+ if count_sample > len(train_set) * self.params.batch_size * epochs:
+ count_sample = len(train_set) * self.params.batch_size * epochs
+
+ del train_data, valid_data
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+ return (params_list, count_sample, avg_loss, avg_wer)
+
+
+def get_client_fn(config: DictConfig, save_path: str):
+ """Return a function that creates a Flower client."""
+
+ def client_fn(cid: str) -> fl.client.Client:
+ """Generate the simulated clients."""
+ device = "cuda" if torch.cuda.is_available() else "cpu"
+
+ asr_brain, dataset = int_model(cid, config, device=device, save_path=save_path)
+ return SpeechBrainClient(cid, asr_brain, dataset)
+
+ return client_fn
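The loss/WER post-processing in `_train_speech_recogniser` can be isolated into a small helper. A minimal sketch (the helper name is hypothetical; the clamp-then-exponentiate logic mirrors the code above):

```python
from math import exp


def transform_metrics(avg_loss: float, avg_wer: float):
    # Clamp WER at 100, then exponentiate so that a lower loss / lower WER
    # maps to a larger value (useful when the server weights clients by metric)
    avg_wer = min(avg_wer, 100)
    return exp(-avg_loss), exp(100 - avg_wer)
```

For example, a client with zero loss and 100% WER yields weights `(1.0, 1.0)`, and any WER above 100 is clamped before exponentiation.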
diff --git a/baselines/fedwav2vec2/fedwav2vec2/conf/base.yaml b/baselines/fedwav2vec2/fedwav2vec2/conf/base.yaml
new file mode 100644
index 000000000000..0df942aa1c12
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/conf/base.yaml
@@ -0,0 +1,43 @@
+---
+# this is the config that will be loaded as default by main.py
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence are easy to customise)
+
+pre_train_model_path: null # Path to a checkpoint, e.g. docs/checkpoint/last_checkpoint.pt
+save_checkpoint: null # Path to folder for checkpoint
+
+# Path to the label encoder file, if you want to ensure the same encoding for every client
+label_path: docs/label_encoder.txt
+
+huggingface_model_save_path: docs/pretrained_wav2vec2
+sb_config: fedwav2vec2/conf/sb_config/w2v2.yaml # config with SpeechBrain recipe for Wav2Vec 2.0
+data_path: data # if you change this, ensure you cloned the author's repo into a directory with the new name
+rounds: 100 # global FL rounds
+local_epochs: 20 # local epochs for each client
+total_clients: 1943
+server_cid: 19999
+
+# Device setup
+server_device: cuda
+parallel_backend: false # set to true to use multiple GPUs per client (disable if server_device=cpu)
+
+client_resources:
+ num_cpus: 8
+ num_gpus: 1
+
+dataset:
+ download_filename: TEDLIUM_release-3.tgz
+
+ extract_subdirectory: audio
+
+
+strategy:
+ _target_: fedwav2vec2.strategy.CustomFedAvg
+ min_fit_clients: 20
+ fraction_fit: 0.01
+ fraction_evaluate: 0.00
+ min_available_clients: ${total_clients}
+ weight_strategy: num # strategy of weighting clients in: [num, loss, wer]
+ on_fit_config_fn:
+ _target_: fedwav2vec2.server.get_on_fit_config_fn
+ local_epochs: ${local_epochs}
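The `strategy` block above relies on Hydra's `_target_` mechanism: `instantiate(cfg.strategy, ...)` in `main.py` imports the dotted path, resolves `${...}` interpolations, and calls the target with the remaining keys as keyword arguments. A minimal stdlib sketch of just the `_target_` part, using a standard-library class as a stand-in target so the sketch stays runnable:

```python
import importlib


def instantiate(cfg: dict, **overrides):
    # Minimal sketch of what hydra.utils.instantiate does with `_target_`:
    # import the dotted path, then call it with the remaining config keys
    # (plus any runtime overrides) as keyword arguments.
    module_path, _, attr = cfg["_target_"].rpartition(".")
    target = getattr(importlib.import_module(module_path), attr)
    kwargs = {k: v for k, v in cfg.items() if k != "_target_"}
    kwargs.update(overrides)
    return target(**kwargs)


# Illustrative stand-in target; the baseline would use
# "fedwav2vec2.strategy.CustomFedAvg" here.
counter = instantiate({"_target_": "collections.Counter", "a": 2, "b": 3})
```

The real `hydra.utils.instantiate` additionally handles nested `_target_` blocks (such as `on_fit_config_fn` above) recursively.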
diff --git a/baselines/fedwav2vec2/fedwav2vec2/conf/sb_config/w2v2.yaml b/baselines/fedwav2vec2/fedwav2vec2/conf/sb_config/w2v2.yaml
new file mode 100644
index 000000000000..87bf21080b0e
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/conf/sb_config/w2v2.yaml
@@ -0,0 +1,178 @@
+# ################################
+# Model: seq2seq ASR on TIMIT with wav2vec2 + CTC/Attention
+# Authors:
+# * Titouan Parcollet 2021
+# ################################
+
+# Seed needs to be set at top of yaml, before objects with parameters are made
+seed: 1234
+__set_seed: !!python/object/apply:torch.manual_seed [!ref <seed>]
+output_folder: !ref docs/results/fl_wav2vec2/<seed>
+wer_file: !ref <output_folder>/wer.txt
+save_folder: !ref <output_folder>/save
+train_log: !ref <output_folder>/train_log.txt
+label_encode: docs/label_encoder.txt
+
+wav2vec_output: docs/pretrained_wav2vec2
+
+# URL for the largest HuggingFace English wav2vec2 model.
+
+wav2vec2_hub: "facebook/wav2vec2-large-lv60"
+# Data files
+data_folder: data/audio/TEDLIUM_release-3/data
+
+accented_letters: False
+language: en # use 'it' for Italian, 'rw' for Kinyarwanda, 'en' for English
+train_csv: data/ted_train.csv
+valid_csv: data/ted_dev.csv
+test_csv: data/ted_test.csv
+skip_prep: False # Skip data preparation
+device: cpu # will be overwritten by `server_device` in `conf/base.yaml`
+# Training parameters
+
+avoid_if_longer_than: 10.0
+avoid_if_smaller_than: 0.1
+
+
+# Decoding parameters
+blank_index: 0
+bos_index: 1
+eos_index: 2
+
+# Training parameters
+number_of_epochs: 30
+lr: 0.001
+lr_wav2vec: 0.0001
+
+ctc_weight: 0.2
+sorting: descending
+
+# With data_parallel batch_size is split into N jobs
+# With DDP batch_size is multiplied by N jobs
+# Must be 6 per GPU to fit 16GB of VRAM
+batch_size: 4
+test_batch_size: 1
+
+dataloader_options:
+ batch_size: !ref <batch_size>
+ num_workers: 4
+test_dataloader_options:
+ batch_size: !ref <test_batch_size>
+ num_workers: 4
+
+# BPE parameters
+token_type: unigram # ["unigram", "bpe", "char"]
+character_coverage: 1.0
+
+# Feature parameters (FBANKS etc)
+sample_rate: 16000
+
+
+# Model parameters
+activation: !name:torch.nn.LeakyReLU
+dropout: 0.5
+dnn_neurons: 500
+dec_neurons: 500
+
+#encoder with w2v
+enc_dnn_layers: 1
+enc_dnn_neurons: 1024
+
+# Outputs
+output_neurons: 52
+
+# Decoding parameters
+# Be sure that the bos and eos index match with the BPEs ones
+# blank_index: 0
+beam_size: 20
+temperature: 1.50
+
+#
+# Functions and classes
+#
+epoch_counter: !new:speechbrain.utils.epoch_loop.EpochCounter
+ limit: !ref <number_of_epochs>
+
+
+
+wav2vec2: !new:speechbrain.lobes.models.huggingface_wav2vec.HuggingFaceWav2Vec2
+ source: !ref <wav2vec2_hub>
+ output_norm: True
+ freeze: False
+
+ save_path: !ref <wav2vec_output>/wav2vec2_checkpoint
+
+# A simple DNN that receives as input the output of the wav2vec2 model.
+# Here the output dimensionality of the LARGE wav2vec2 is 1024.
+enc: !new:speechbrain.lobes.models.VanillaNN.VanillaNN
+ input_shape: [null, null, 1024]
+ activation: !ref <activation>
+ dnn_blocks: !ref <enc_dnn_layers>
+ dnn_neurons: !ref <enc_dnn_neurons>
+
+
+ctc_lin: !new:speechbrain.nnet.linear.Linear
+ input_size: !ref <enc_dnn_neurons>
+ n_neurons: !ref <output_neurons>
+ bias: True
+
+log_softmax: !new:speechbrain.nnet.activations.Softmax
+ apply_log: True
+
+
+modules:
+ wav2vec2: !ref <wav2vec2>
+ enc: !ref <enc>
+ ctc_lin: !ref <ctc_lin>
+
+model: !new:torch.nn.ModuleList
+ - [!ref <enc>, !ref <ctc_lin>]
+
+adam_opt_class: !name:torch.optim.Adam
+ lr: !ref <lr>
+
+wav2vec_opt_class: !name:torch.optim.Adam
+ lr: !ref <lr_wav2vec>
+
+ctc_cost: !name:speechbrain.nnet.losses.ctc_loss
+ blank_index: !ref <blank_index>
+
+lr_annealing_adam: !new:speechbrain.nnet.schedulers.NewBobScheduler
+ initial_value: !ref <lr>
+ improvement_threshold: 0.0025
+ annealing_factor: 0.8
+ patient: 0
+
+lr_annealing_wav2vec: !new:speechbrain.nnet.schedulers.NewBobScheduler
+ initial_value: !ref <lr_wav2vec>
+ improvement_threshold: 0.0025
+ annealing_factor: 0.9
+
+checkpointer: !new:speechbrain.utils.checkpoints.Checkpointer
+ checkpoints_dir: !ref <save_folder>
+ recoverables:
+ model: !ref <model>
+ wav2vec2: !ref <wav2vec2>
+ lr_annealing_adam: !ref <lr_annealing_adam>
+ lr_annealing_wav2vec: !ref <lr_annealing_wav2vec>
+ counter: !ref <epoch_counter>
+
+train_logger: !new:speechbrain.utils.train_logger.FileTrainLogger
+ save_file: !ref <train_log>
+
+
+ctc_computer: !name:speechbrain.utils.metric_stats.MetricStats
+ metric: !name:speechbrain.nnet.losses.ctc_loss
+ blank_index: !ref <blank_index>
+ reduction: batch
+
+
+error_rate_computer: !name:speechbrain.utils.metric_stats.ErrorRateStats
+
+cer_computer: !name:speechbrain.utils.metric_stats.ErrorRateStats
+ merge_tokens: True
+
+coer_computer: !name:speechbrain.utils.metric_stats.ErrorRateStats
+
+cver_computer: !name:speechbrain.utils.metric_stats.ErrorRateStats
+
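SpeechBrain loads this file with HyperPyYAML, whose `!ref` tag substitutes the value of another key into the current one (for instance composing `wer_file` from `output_folder`). A toy stdlib sketch of that substitution step — not the real `hyperpyyaml` implementation, which also supports arithmetic and object references:

```python
import re

REF = re.compile(r"<(\w+)>")


def resolve_refs(params: dict) -> dict:
    # Repeatedly substitute "<key>" tokens in string values with the value
    # of that key until no references remain (assumes no circular refs).
    resolved = dict(params)
    changed = True
    while changed:
        changed = False
        for key, value in resolved.items():
            if isinstance(value, str) and REF.search(value):
                resolved[key] = REF.sub(lambda m: str(resolved[m.group(1)]), value)
                changed = True
    return resolved


cfg = resolve_refs({
    "output_folder": "docs/results/fl_wav2vec2/1234",
    "wer_file": "<output_folder>/wer.txt",
    "train_log": "<output_folder>/train_log.txt",
})
```

This is why overriding a single key such as `output_folder` (as `int_model` does) automatically relocates every derived path.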
diff --git a/baselines/fedwav2vec2/fedwav2vec2/dataset.py b/baselines/fedwav2vec2/fedwav2vec2/dataset.py
new file mode 100644
index 000000000000..65fee90faf38
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/dataset.py
@@ -0,0 +1,125 @@
+"""Handle basic dataset creation.
+
+For PyTorch, this module should return dataloaders for your dataset (for both the
+clients and the server). If you are using a custom dataset class, this module is the
+place to define it. If your dataset needs to be downloaded (and this is not done
+automatically -- e.g. as is the case for many datasets in TorchVision) and
+partitioned, please include all that logic in the `dataset_preparation.py` module.
+You can of course use those functions from the functions/methods defined here.
+"""
+
+
+import speechbrain as sb
+import torchaudio
+
+
+# Define custom data procedure
+def dataio_prepare(hparams):
+ """Create Dataset objects from the CSV files."""
+ # 1. Define datasets
+ data_folder = hparams["data_folder"]
+
+ train_data = sb.dataio.dataset.DynamicItemDataset.from_csv(
+ csv_path=hparams["train_csv"],
+ replacements={"data_root": data_folder},
+ )
+
+ if hparams["sorting"] == "ascending":
+ # we sort training data to speed up training and get better results.
+ train_data = train_data.filtered_sorted(
+ sort_key="duration",
+ key_max_value={"duration": hparams["avoid_if_longer_than"]},
+ key_min_value={"duration": hparams["avoid_if_smaller_than"]},
+ )
+ # when sorting, do not shuffle in the dataloader; otherwise sorting is pointless
+ hparams["dataloader_options"]["shuffle"] = False
+
+ elif hparams["sorting"] == "descending":
+ train_data = train_data.filtered_sorted(
+ sort_key="duration",
+ reverse=True,
+ key_max_value={"duration": hparams["avoid_if_longer_than"]},
+ key_min_value={"duration": hparams["avoid_if_smaller_than"]},
+ )
+ # when sorting, do not shuffle in the dataloader; otherwise sorting is pointless
+ hparams["dataloader_options"]["shuffle"] = False
+
+ elif hparams["sorting"] == "random":
+ pass
+
+ else:
+ raise NotImplementedError("sorting must be random, ascending or descending")
+
+ valid_data = sb.dataio.dataset.DynamicItemDataset.from_csv(
+ csv_path=hparams["valid_csv"],
+ replacements={"data_root": data_folder},
+ )
+ # We also sort the validation data so it is faster to validate
+ valid_data = valid_data.filtered_sorted(
+ sort_key="duration",
+ reverse=True,
+ key_max_value={"duration": hparams["avoid_if_longer_than"]},
+ key_min_value={"duration": hparams["avoid_if_smaller_than"]},
+ )
+
+ test_data = sb.dataio.dataset.DynamicItemDataset.from_csv(
+ csv_path=hparams["test_csv"],
+ replacements={"data_root": data_folder},
+ )
+ # We also sort the test data so it is faster to evaluate
+ test_data = test_data.filtered_sorted(
+ sort_key="duration",
+ reverse=True,
+ key_max_value={"duration": hparams["avoid_if_longer_than"]},
+ key_min_value={"duration": hparams["avoid_if_smaller_than"]},
+ )
+
+ datasets = [train_data, valid_data, test_data]
+
+ label_encoder = sb.dataio.encoder.CTCTextEncoder()
+
+ # 2. Define audio pipeline:
+ @sb.utils.data_pipeline.takes("wav", "start_seg", "end_seg")
+ @sb.utils.data_pipeline.provides("sig")
+ def audio_pipeline(wav, start_seg, end_seg):
+ info = torchaudio.info(wav)
+ start = int(float(start_seg) * hparams["sample_rate"])
+ stop = int(float(end_seg) * hparams["sample_rate"])
+ speech_segment = {"file": wav, "start": start, "stop": stop}
+ sig = sb.dataio.dataio.read_audio(speech_segment)
+ # resample to the target 16 kHz rate if the source sample rate differs
+ resampled = torchaudio.transforms.Resample(
+ info.sample_rate,
+ hparams["sample_rate"],
+ )(sig)
+ return resampled
+
+ sb.dataio.dataset.add_dynamic_item(datasets, audio_pipeline)
+
+ # 3. Define text pipeline:
+ @sb.utils.data_pipeline.takes("char")
+ @sb.utils.data_pipeline.provides("char_list", "char_encoded")
+ def text_pipeline(char):
+ char_list = char.strip().split()
+ yield char_list
+ char_encoded = label_encoder.encode_sequence_torch(char_list)
+ yield char_encoded
+
+ sb.dataio.dataset.add_dynamic_item(datasets, text_pipeline)
+
+ lab_enc_file = hparams["label_encoder"]
+ label_encoder.load_or_create(
+ path=lab_enc_file,
+ from_didatasets=[train_data],
+ output_key="char_list",
+ special_labels={"blank_label": hparams["blank_index"]},
+ sequence_input=True,
+ )
+
+ # 4. Set output:
+ sb.dataio.dataset.set_output_keys(
+ datasets,
+ ["id", "sig", "char_encoded"],
+ )
+ return train_data, valid_data, test_data, label_encoder
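The `audio_pipeline` above converts segment boundaries given in seconds into sample offsets before reading the audio slice; the arithmetic, pulled out into a hypothetical helper, is just:

```python
def segment_to_samples(start_seg, end_seg, sample_rate=16000):
    # TED-LIUM CSV rows store start/end in seconds, while read_audio expects
    # integer sample offsets, so multiply by the sample rate and truncate.
    start = int(float(start_seg) * sample_rate)
    stop = int(float(end_seg) * sample_rate)
    return start, stop
```

For example, a segment from 1.5 s to 2.0 s at 16 kHz maps to samples 24000 through 32000.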
diff --git a/baselines/fedwav2vec2/fedwav2vec2/dataset_preparation.py b/baselines/fedwav2vec2/fedwav2vec2/dataset_preparation.py
new file mode 100644
index 000000000000..255664d3d2c3
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/dataset_preparation.py
@@ -0,0 +1,130 @@
+"""Handle the dataset partitioning and (optionally) complex downloads.
+
+Please add here all the necessary logic to either download, uncompress, pre/post-process
+your dataset (or all of the above). If the desired way of running your baseline is to
+first download the dataset and partition it and then run the experiments, please
+uncomment the lines below and tell us in the README.md (see the "Running the Experiment"
+block) that this file should be executed first.
+"""
+
+
+import os
+import ssl
+import tarfile
+import urllib.request
+from shutil import rmtree
+
+import hydra
+import pandas as pd
+from hydra.core.hydra_config import HydraConfig
+from omegaconf import DictConfig, OmegaConf
+
+
+def _download_file(url, filename):
+ """Download the file and show a progress bar."""
+ print(f"Downloading {url}...")
+ retries = 3
+ while retries > 0:
+ try:
+ with urllib.request.urlopen(
+ url,
+ # pylint: disable=protected-access
+ context=ssl._create_unverified_context(),
+ ) as response, open(filename, "wb") as out_file:
+ total_size = int(response.getheader("Content-Length"))
+ block_size = 1024 * 8
+ count = 0
+ while True:
+ data = response.read(block_size)
+ if not data:
+ break
+ count += 1
+ out_file.write(data)
+ percent = int(count * block_size * 100 / total_size)
+ print(
+ f"\rDownload: {percent}% [{count * block_size}/{total_size}]",
+ end="",
+ )
+ print("\nDownload complete.")
+ break
+ except Exception as error: # pylint: disable=broad-except
+ print(f"\nError occurred during download: {error}")
+ retries -= 1
+ if retries > 0:
+ print(f"Retrying ({retries} retries left)...")
+ else:
+ print("Download failed.")
+ raise error
+
+
+def _extract_file(filename, extract_path):
+ """Extract the contents and show a progress bar."""
+ print(f"Extracting {filename}...")
+ with tarfile.open(filename, "r:gz") as tar:
+ members = tar.getmembers()
+ total_files = len(members)
+ current_file = 0
+ for member in members:
+ current_file += 1
+ tar.extract(member, path=extract_path)
+ percent = int(current_file * 100 / total_files)
+ print(f"\rExtracting: {percent}% [{current_file}/{total_files}]", end="")
+ print("\nExtraction complete.")
+
+
+def _delete_file(filename):
+ """Delete the downloaded file."""
+ os.remove(filename)
+ print(f"Deleted {filename}.")
+
+
+def _csv_path_audio(extract_path: str):
+ """Rewrite the `wav` paths in each CSV to match the actual extraction path."""
+ for subdir, _dirs, files in os.walk("./data"):
+ for file in files:
+ if file.endswith(".csv"):
+ if "client" in subdir:
+ path = os.path.join(extract_path, "legacy/train/sph")
+ else:
+ if "train" in file:
+ path = os.path.join(extract_path, "legacy/train/sph")
+ elif "dev" in file:
+ path = os.path.join(extract_path, "legacy/dev/sph")
+ else:
+ path = os.path.join(extract_path, "legacy/test/sph")
+ d_f = pd.read_csv(os.path.join(subdir, file))
+ d_f["wav"] = d_f["wav"].str.replace("path", path)
+ d_f.to_csv(os.path.join(subdir, file), index=False)
+
+
+@hydra.main(config_path="./conf", config_name="base", version_base=None)
+def download_and_extract(cfg: DictConfig) -> None:
+ """Download and extract the TEDLIUM-3 dataset."""
+ print(OmegaConf.to_yaml(cfg))
+ url = (
+ "https://projets-lium.univ-lemans.fr"
+ "/wp-content/uploads/corpus/TED-LIUM/TEDLIUM_release-3.tgz"
+ )
+ # URL = "https://www.openslr.org/resources/51/TEDLIUM_release-3.tgz"
+ filename = f"{cfg.data_path}/{cfg.dataset.download_filename}"
+ extract_path = f"{cfg.data_path}/{cfg.dataset.extract_subdirectory}"
+
+ print(f"{extract_path = }")
+ print(f"{filename = }")
+
+ if not os.path.exists(extract_path):
+ try:
+ _download_file(url, filename)
+ _extract_file(filename, extract_path)
+ finally:
+ _delete_file(filename)
+
+ _csv_path_audio(f"{extract_path}/TEDLIUM_release-3")
+
+ # remove output dir. No need to keep it around
+ save_path = HydraConfig.get().runtime.output_dir
+ rmtree(save_path)
+
+
+if __name__ == "__main__":
+ download_and_extract()
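`_csv_path_audio` relies on the TED-LIUM CSVs shipping a literal `path` placeholder in their `wav` column. A stdlib sketch of the same rewrite (the real code uses `pandas.Series.str.replace` over files on disk):

```python
import csv
import io


def rewrite_wav_paths(csv_text: str, real_path: str) -> str:
    # Replace the "path" placeholder in the `wav` column with the actual
    # extraction directory, leaving all other columns untouched.
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [{**row, "wav": row["wav"].replace("path", real_path)} for row in reader]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

Since the replacement is a plain substring match, the placeholder must not appear elsewhere in the `wav` values, which holds for the TED-LIUM CSV layout.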
diff --git a/baselines/fedwav2vec2/fedwav2vec2/main.py b/baselines/fedwav2vec2/fedwav2vec2/main.py
new file mode 100644
index 000000000000..5011b2ac15e2
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/main.py
@@ -0,0 +1,60 @@
+"""Create and connect the building blocks for your experiments; start the simulation.
+
+This includes processing the dataset, instantiating the strategy, specifying how the
+global model is evaluated, etc. At the end, this script saves the results.
+"""
+
+import flwr as fl
+import hydra
+from hydra.core.hydra_config import HydraConfig
+from hydra.utils import instantiate
+from omegaconf import DictConfig, OmegaConf
+
+from fedwav2vec2.client import get_client_fn
+from fedwav2vec2.models import pre_trained_point
+from fedwav2vec2.server import get_evaluate_fn
+
+
+@hydra.main(config_path="conf", config_name="base", version_base=None)
+def main(cfg: DictConfig) -> None:
+ """Run the baseline.
+
+ Parameters
+ ----------
+ cfg : DictConfig
+ An omegaconf object that stores the hydra config.
+ """
+ # 1. Print parsed config
+ print(OmegaConf.to_yaml(cfg))
+
+ # Hydra automatically creates an output directory
+ # Let's retrieve it and save some results there
+ save_path = HydraConfig.get().runtime.output_dir
+
+ if cfg.pre_train_model_path is not None:
+ print("Initializing global model from pre-trained checkpoint")
+
+ pretrained = pre_trained_point(save_path, cfg, cfg.server_device)
+ else:
+ pretrained = None
+
+ strategy = instantiate(
+ cfg.strategy,
+ initial_parameters=pretrained,
+ evaluate_fn=get_evaluate_fn(
+ cfg, server_device=cfg.server_device, save_path=save_path
+ ),
+ )
+
+ fl.simulation.start_simulation(
+ client_fn=get_client_fn(cfg, save_path),
+ num_clients=cfg.total_clients,
+ client_resources=cfg.client_resources,
+ config=fl.server.ServerConfig(num_rounds=cfg.rounds),
+ strategy=strategy,
+ ray_init_args={"include_dashboard": False},
+ )
+
+
+if __name__ == "__main__":
+ main()
diff --git a/baselines/fedwav2vec2/fedwav2vec2/models.py b/baselines/fedwav2vec2/fedwav2vec2/models.py
new file mode 100644
index 000000000000..38916aaac555
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/models.py
@@ -0,0 +1,148 @@
+"""Define our models, and training and eval functions.
+
+If your model is 100% off-the-shelf (e.g. directly from torchvision without requiring
+modifications) you might be better off instantiating your model directly from the Hydra
+config. In this way, swapping your model for another one can be done without changing
+the python code at all
+"""
+
+
+import gc
+import os
+
+import speechbrain as sb
+import torch
+from flwr.common import ndarrays_to_parameters
+from hyperpyyaml import load_hyperpyyaml
+from omegaconf import DictConfig
+
+from fedwav2vec2.dataset import dataio_prepare
+from fedwav2vec2.sb_recipe import ASR, get_weights
+
+
+def int_model( # pylint: disable=too-many-arguments,too-many-locals
+ cid,
+ config: DictConfig,
+ device: str,
+ save_path,
+ evaluate=False,
+):
+ """Set up the experiment.
+
+ Load the hyperparameters from the config files and command-line overrides, set
+ the correct path for the corresponding client, and create the model.
+ """
+ # Load hyperparameters file with command-line overrides
+
+ if cid == 19999:
+ save_path = save_path + "server"
+ else:
+ save_path = save_path + "/client_" + str(cid)
+
+ # Override with FLOWER PARAMS
+ if evaluate:
+ overrides = {
+ "output_folder": save_path,
+ "number_of_epochs": 1,
+ "test_batch_size": 4,
+ "device": device,
+ "wav2vec_output": config.huggingface_model_save_path,
+ }
+
+ else:
+ overrides = {
+ "output_folder": save_path,
+ "wav2vec_output": config.huggingface_model_save_path,
+ }
+
+ label_path_ = config.label_path
+ if label_path_ is None:
+ label_path_ = os.path.join(save_path, "label_encoder.txt")
+
+ _, run_opts, _ = sb.parse_arguments(config.sb_config)
+ run_opts["device"] = device
+ run_opts["data_parallel_backend"] = config.parallel_backend
+ run_opts["noprogressbar"] = True # disable tqdm progress bar
+
+ with open(config.sb_config) as fin:
+ params = load_hyperpyyaml(fin, overrides)
+
+ # This logic assumes `data_path` points to a folder of CSV files.
+ # All train/dev/test CSV files use the same naming format for the server
+ # and the clients, for example:
+ # server: /users/server/train.csv
+ # client: /users/client_1/train.csv
+ # Modify the if/else logic below if your paths follow a different format.
+
+ if int(cid) != config.server_cid:
+ params["data_folder"] = os.path.join(config.data_path, "client_" + str(cid))
+ else:
+ params["data_folder"] = os.path.join(config.data_path, "server")
+
+ print(f'{params["data_folder"] = }')
+ params["train_csv"] = params["data_folder"] + "/ted_train.csv"
+ params["valid_csv"] = params["data_folder"] + "/ted_dev.csv"
+ params["test_csv"] = params["data_folder"] + "/ted_test.csv"
+
+ if int(cid) < 1341:
+ params["train_csv"] = params["data_folder"] + "/ted_train_wo5.csv"
+ params["label_encoder"] = label_path_
+
+ # Create experiment directory
+ sb.create_experiment_directory(
+ experiment_directory=params["output_folder"],
+ hyperparams_to_save=config.sb_config,
+ overrides=overrides,
+ )
+
+ # Create the datasets objects as well as tokenization and encoding :-D
+ train_data, valid_data, test_data, label_encoder = dataio_prepare(params)
+ # Trainer initialization
+
+ asr_brain = ASR(
+ modules=params["modules"],
+ hparams=params,
+ run_opts=run_opts,
+ checkpointer=params["checkpointer"],
+ )
+ asr_brain.label_encoder = label_encoder
+ asr_brain.label_encoder.add_unk()
+
+ # Adding objects to trainer.
+ gc.collect()
+ return asr_brain, [train_data, valid_data, test_data]
+
+
+def pre_trained_point(save, config: DictConfig, server_device: str):
+ """Return a pre-trained model from a path and hyperparameters."""
+ state_dict = torch.load(config.pre_train_model_path)
+
+ overrides = {"output_folder": save}
+
+ hparams = config.sb_config
+ _, run_opts, _ = sb.parse_arguments(hparams)
+ with open(hparams) as fin:
+ params = load_hyperpyyaml(fin, overrides)
+
+ run_opts["device"] = server_device
+ run_opts["data_parallel_backend"] = config.parallel_backend
+ run_opts["noprogressbar"] = True # disable tqdm progress bar
+
+ asr_brain = ASR(
+ modules=params["modules"],
+ hparams=params,
+ run_opts=run_opts,
+ checkpointer=params["checkpointer"],
+ )
+
+ asr_brain.modules.load_state_dict(state_dict)
+ weights = get_weights(asr_brain.modules)
+ pre_trained = ndarrays_to_parameters(weights)
+
+ # Free up space after initialized
+ del asr_brain, weights
+ gc.collect()
+
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ return pre_trained
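`pre_trained_point` turns a checkpoint into Flower `Parameters` via `get_weights`; the state-dict to ndarray-list round-trip it relies on can be sketched without torch (plain numpy arrays stand in for tensors, and dict insertion order plays the role of the state dict's key order):

```python
from collections import OrderedDict

import numpy as np


def get_weights(state_dict):
    # Flatten the state dict into a list of ndarrays, in key order
    # (the real code calls .cpu().numpy() on each torch tensor).
    return [np.asarray(v) for v in state_dict.values()]


def set_weights(weights, state_dict):
    # Zip the arrays back onto the same keys; correctness depends on
    # get_weights and set_weights seeing the keys in the same order.
    return OrderedDict(zip(state_dict.keys(), weights))


sd = OrderedDict(w=np.ones((2, 2)), b=np.zeros(2))
restored = set_weights(get_weights(sd), sd)
```

This ordering assumption is why both `get_weights` in `sb_recipe.py` and `set_weights` iterate `modules.state_dict()` rather than sorting keys.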
diff --git a/baselines/fedwav2vec2/fedwav2vec2/sb_recipe.py b/baselines/fedwav2vec2/fedwav2vec2/sb_recipe.py
new file mode 100644
index 000000000000..390edfe246d3
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/sb_recipe.py
@@ -0,0 +1,473 @@
+"""Main SpeechBrain training and testing logic."""
+
+
+import gc
+import time
+from collections import OrderedDict
+from enum import Enum, auto
+from typing import Dict, Optional
+
+import flwr as fl
+import numpy as np
+import speechbrain as sb
+import torch
+from speechbrain.dataio.dataloader import LoopedLoader
+from torch.utils.data import DataLoader
+from tqdm.contrib import tqdm
+
+# Recipe for training a sequence-to-sequence ASR system with CommonVoice.
+# The system employs a wav2vec2 encoder and a CTC decoder.
+# Decoding is performed with greedy decoding (will be extended to beam search).
+
+# To run this recipe, do the following:
+# > python train_with_wav2vec2.py hparams/train_with_wav2vec2.yaml
+
+# With the default hyperparameters, the system employs a pretrained wav2vec2 encoder.
+# The wav2vec2 model is pretrained following the model given in the hparams file.
+# It may be dependent on the language.
+
+# The neural network is trained with CTC on sub-word units estimated with
+# Byte-Pair Encoding (BPE).
+
+# The experiment file is flexible enough to support a large variety of
+# different systems. By properly changing the parameter files, you can try
+# different encoders, decoders, tokens (e.g, characters instead of BPE),
+# training languages (all CommonVoice languages), and many
+# other possible variations.
+
+# Authors
+# * Titouan Parcollet 2021
+
+
+class Stage(Enum):
+ """Simple enum to track stage of experiments."""
+
+ TRAIN = auto()
+ VALID = auto()
+ TEST = auto()
+
+
+def set_weights(weights: fl.common.NDArrays, modules, device) -> None:
+ """Set model weights from a list of NumPy ndarrays."""
+ state_dict = OrderedDict()
+ valid_keys = modules.state_dict().keys()
+ for key, value in zip(valid_keys, weights):
+ weight = torch.Tensor(np.array(value))
+ weight = weight.to(device)
+ state_dict[key] = weight
+
+ modules.load_state_dict(state_dict, strict=True)
+
+
+def get_weights(modules) -> fl.common.NDArrays:
+ """Get model weights as a list of NumPy ndarrays."""
+ weights = []
+ for _, value in modules.state_dict().items():
+ weights.append(value.cpu().numpy())
+ return weights
+
+
+# pylint: disable=E1101,W0201,R0902
+class ASR(sb.core.Brain):
+ """Override of SpeechBrain default Brain class."""
+
+ def compute_forward(self, batch, _):
+ """Forward computations from the waveform batches to the output.
+
+ probabilities.
+ """
+ batch = batch.to(self.device)
+ wavs, wav_lens = batch.sig
+ # Forward pass
+ self.feats = self.modules.wav2vec2(wavs)
+
+ encoded_features = self.modules.enc(self.feats)
+ logits = self.modules.ctc_lin(encoded_features)
+ p_ctc = self.hparams.log_softmax(logits)
+
+ return p_ctc, wav_lens
+
+ def compute_objectives(self, predictions, batch, stage):
+ """Compute the CTC loss given predictions and targets."""
+ ids = batch.id
+ p_ctc, wav_lens = predictions
+ chars, char_lens = batch.char_encoded
+
+ loss = self.hparams.ctc_cost(p_ctc, chars, wav_lens, char_lens)
+ sequence = sb.decoders.ctc_greedy_decode(
+ p_ctc, wav_lens, self.hparams.blank_index
+ )
+ # ==============================Add by Salima=======================
+ # ==================================================================
+
+ if stage != sb.Stage.TRAIN:
+ self.cer_metric.append(
+ ids=ids,
+ predict=sequence,
+ target=chars,
+ target_len=char_lens,
+ ind2lab=self.label_encoder.decode_ndim,
+ )
+ self.coer_metric.append(
+ ids=ids,
+ predict=sequence,
+ target=chars,
+ target_len=char_lens,
+ ind2lab=self.label_encoder.decode_ndim,
+ )
+ self.cver_metric.append(
+ ids=ids,
+ predict=sequence,
+ target=chars,
+ target_len=char_lens,
+ ind2lab=self.label_encoder.decode_ndim,
+ )
+ self.ctc_metric.append(ids, p_ctc, chars, wav_lens, char_lens)
+
+ return loss
+
+ def init_optimizers(self):
+ """Initialize the wav2vec2 optimizer and model optimizer."""
+ self.wav2vec_optimizer = self.hparams.wav2vec_opt_class(
+ self.modules.wav2vec2.parameters()
+ )
+ self.adam_optimizer = self.hparams.adam_opt_class(
+ self.hparams.model.parameters()
+ )
+
+ def fit_batch(self, batch):
+ """Train the parameters given a single batch in input."""
+ batch = batch.to(self.device)
+ wavs, wav_lens = batch.sig
+
+ wavs, wav_lens = wavs.to(self.device), wav_lens.to(self.device)
+
+ stage = sb.Stage.TRAIN
+
+ predictions = self.compute_forward(batch, stage)
+ loss = self.compute_objectives(predictions, batch, stage)
+ loss.backward()
+ if self.check_gradients(loss):
+ self.wav2vec_optimizer.step()
+ self.adam_optimizer.step()
+
+ self.wav2vec_optimizer.zero_grad()
+ self.adam_optimizer.zero_grad()
+
+ return loss.detach().cpu()
+
+ def evaluate_batch(self, batch, stage):
+ """Compute validation/test batches."""
+ # Get data.
+ batch = batch.to(self.device)
+
+ predictions = self.compute_forward(batch, stage)
+ with torch.no_grad():
+ loss = self.compute_objectives(predictions, batch, stage=stage)
+ return loss.detach()
+
+ def on_stage_start(self, stage, epoch=None):
+ """Call when a stage (either training, validation, test) starts."""
+ _ = epoch
+ # self.ctc_metrics = self.hparams.ctc_stats()
+ if stage != sb.Stage.TRAIN:
+ self.cer_metric = self.hparams.cer_computer()
+ self.ctc_metric = self.hparams.ctc_computer()
+ self.coer_metric = self.hparams.coer_computer()
+ self.cver_metric = self.hparams.cver_computer()
+ # self.wer_metric = self.hparams.error_rate_computer()
+
+ def on_stage_end(self, stage, stage_loss, epoch=None):
+ """Call at the end of a stage."""
+ # Compute/store important stats
+ stage_stats = {"loss": stage_loss}
+
+ # if stage == sb.Stage.TRAIN:
+ # self.train_loss = stage_loss
+ if stage == sb.Stage.TRAIN:
+ self.train_loss = stage_loss
+ else:
+ # cer = self.cer_metrics.summarize("error_rate")
+ stage_stats["WER"] = self.cer_metric.summarize("error_rate")
+ stage_stats["COER"] = self.coer_metric.summarize("error_rate")
+ stage_stats["CVER"] = self.cver_metric.summarize("error_rate")
+
+ # Perform end-of-iteration things, like annealing, logging, etc.
+ if stage == sb.Stage.VALID:
+ old_lr_adam, new_lr_adam = self.hparams.lr_annealing_adam(
+ stage_stats["loss"]
+ )
+ old_lr_wav2vec, new_lr_wav2vec = self.hparams.lr_annealing_wav2vec(
+ stage_stats["loss"]
+ )
+ sb.nnet.schedulers.update_learning_rate(self.adam_optimizer, new_lr_adam)
+ sb.nnet.schedulers.update_learning_rate(
+ self.wav2vec_optimizer, new_lr_wav2vec
+ )
+
+ self.hparams.train_logger.log_stats(
+ stats_meta={
+ "epoch": epoch,
+ "lr_adam": old_lr_adam,
+ "lr_wav2vec": old_lr_wav2vec,
+ },
+ train_stats={"loss": self.train_loss},
+ valid_stats=stage_stats,
+ )
+
+ self.stage_wer = stage_stats["WER"]
+
+ elif stage == sb.Stage.TEST:
+ self.hparams.train_logger.log_stats(
+ stats_meta={"Epoch loaded": self.hparams.epoch_counter.current},
+ test_stats=stage_stats,
+ )
+ with open(self.hparams.wer_file, "w") as wer_file:
+ wer_file.write("CTC loss stats:\n")
+ self.ctc_metric.write_stats(wer_file)
+ wer_file.write("\nCER stats:\n")
+ self.cer_metric.write_stats(wer_file)
+ print("CTC and WER stats written to ", self.hparams.wer_file)
+
+ self.stage_wer = stage_stats["WER"]
+
+ def fit( # pylint: disable=W0102,R0912,R0913,R0914,R0915
+ self,
+ epoch_counter,
+ train_set,
+ valid_set=None,
+ progressbar=None,
+ train_loader_kwargs={},
+ valid_loader_kwargs={},
+ ):
+ """Iterate epochs and datasets to improve objective.
+
+ Relies on the existence of multiple functions that can (or should) be
+ overridden. The following methods are used and expected to have a
+ certain behavior:
+
+ * ``fit_batch()``
+ * ``evaluate_batch()``
+ * ``update_average()``
+
+ If the initialization was done with distributed_count > 0 and the
+ distributed_backend is ddp, this will generally handle multiprocess
+ logic, like splitting the training data into subsets for each device and
+ only saving a checkpoint on the main process.
+
+ Arguments
+ ---------
+ epoch_counter : iterable
+ Each call should return an integer indicating the epoch count.
+ train_set : Dataset, DataLoader
+ A set of data to use for training. If a Dataset is given, a
+ DataLoader is automatically created. If a DataLoader is given, it is
+ used directly.
+ valid_set : Dataset, DataLoader
+ A set of data to use for validation. If a Dataset is given, a
+ DataLoader is automatically created. If a DataLoader is given, it is
+ used directly.
+ train_loader_kwargs : Optional[Dict]
+ Kwargs passed to `make_dataloader()` for making the train_loader
+ (if train_set is a Dataset, not DataLoader).
+            E.g., batch_size, num_workers.
+ DataLoader kwargs are all valid.
+ valid_loader_kwargs : Optional[Dict]
+ Kwargs passed to `make_dataloader()` for making the valid_loader
+ (if valid_set is a Dataset, not DataLoader).
+ E.g., batch_size, num_workers.
+ DataLoader kwargs are all valid.
+ progressbar : bool
+ Whether to display the progress of each epoch in a progressbar.
+ """
+ if not isinstance(train_set, (DataLoader, LoopedLoader)):
+ train_set = self.make_dataloader(
+ train_set, stage=sb.Stage.TRAIN, **train_loader_kwargs
+ )
+ if valid_set is not None and not isinstance(
+ valid_set, (DataLoader, LoopedLoader)
+ ):
+ valid_set = self.make_dataloader(
+ valid_set,
+ stage=sb.Stage.VALID,
+ ckpt_prefix=None,
+ **valid_loader_kwargs,
+ )
+
+ self.on_fit_start()
+
+ if progressbar is None:
+ progressbar = not self.noprogressbar
+ self.modules = self.modules.to(self.device)
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+        # Iterate epochs
+        batch_count = 0
+        # Initialise so early exits (e.g. debug mode) cannot leave these unbound
+        avg_loss = 0.0
+        valid_wer_last = None
+ for epoch in epoch_counter:
+ # Training stage
+ self.on_stage_start(sb.Stage.TRAIN, epoch)
+ self.modules.train()
+
+ # Reset nonfinite count to 0 each epoch
+ self.nonfinite_count = 0
+
+ if self.train_sampler is not None and hasattr(
+ self.train_sampler, "set_epoch"
+ ):
+ self.train_sampler.set_epoch(epoch)
+
+ # Time since last intra-epoch checkpoint
+ last_ckpt_time = time.time()
+
+ # Only show progressbar if requested and main_process
+ enable = progressbar and sb.utils.distributed.if_main_process()
+ with tqdm(
+ train_set,
+ initial=self.step,
+ dynamic_ncols=True,
+ disable=not enable,
+ ) as progress_bar:
+ for batch in progress_bar:
+ self.step += 1
+ loss = self.fit_batch(batch)
+ _, wav_lens = batch.sig
+ batch_count += wav_lens.shape[0]
+ self.avg_train_loss = self.update_average(loss, self.avg_train_loss)
+ progress_bar.set_postfix(train_loss=self.avg_train_loss)
+
+ # Debug mode only runs a few batches
+ if self.debug and self.step == self.debug_batches:
+ break
+
+ if (
+ self.checkpointer is not None
+ and self.ckpt_interval_minutes > 0
+ and time.time() - last_ckpt_time
+ >= self.ckpt_interval_minutes * 60.0
+ ):
+ # This should not use run_on_main, because that
+ # includes a DDP barrier. That eventually leads to a
+ # crash when the processes'
+ # time.time() - last_ckpt_time differ and some
+ # processes enter this block while others don't,
+ # missing the barrier.
+ if sb.utils.distributed.if_main_process():
+ self._save_intra_epoch_ckpt()
+ last_ckpt_time = time.time()
+
+ if epoch == epoch_counter.limit:
+ avg_loss = self.avg_train_loss
+ # Run train "on_stage_end" on all processes
+ self.on_stage_end(sb.Stage.TRAIN, self.avg_train_loss, epoch)
+ self.avg_train_loss = 0.0
+ self.step = 0
+
+ # Validation stage
+ if valid_set is not None:
+ self.on_stage_start(sb.Stage.VALID, epoch)
+ self.modules.eval()
+ avg_valid_loss = 0.0
+ with torch.no_grad():
+ for batch in tqdm(
+ valid_set, dynamic_ncols=True, disable=not enable
+ ):
+ self.step += 1
+ loss = self.evaluate_batch(batch, stage=sb.Stage.VALID)
+ avg_valid_loss = self.update_average(loss, avg_valid_loss)
+
+ # Debug mode only runs a few batches
+ if self.debug and self.step == self.debug_batches:
+ break
+
+ # Only run validation "on_stage_end" on main process
+ self.step = 0
+ self.on_stage_end(sb.Stage.VALID, avg_valid_loss, epoch)
+ valid_wer = self.stage_wer
+ if epoch == epoch_counter.limit:
+ valid_wer_last = valid_wer
+
+ # Debug mode only runs a few epochs
+ if self.debug and epoch == self.debug_epochs:
+ break
+ if self.device == "cpu":
+ self.modules = self.modules.to("cpu")
+ gc.collect()
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ return batch_count, avg_loss, valid_wer_last
+
+ def evaluate( # pylint: disable=W0102,R0913
+ self,
+ test_set,
+ max_key=None,
+ min_key=None,
+ progressbar=None,
+        test_loader_kwargs={},
+ ):
+        """Iterate test_set and evaluate brain performance.
+
+        By default, loads the best-performing checkpoint (as recorded using the
+        checkpointer).
+
+ Arguments
+ ---------
+ test_set : Dataset, DataLoader
+ If a DataLoader is given, it is iterated directly. Otherwise passed
+ to ``self.make_dataloader()``.
+ max_key : str
+ Key to use for finding best checkpoint, passed to
+ ``on_evaluate_start()``.
+ min_key : str
+ Key to use for finding best checkpoint, passed to
+ ``on_evaluate_start()``.
+ progressbar : bool
+ Whether to display the progress in a progressbar.
+ test_loader_kwargs : Optional[Dict]
+ Kwargs passed to ``make_dataloader()`` if ``test_set`` is not a
+ DataLoader. NOTE: ``loader_kwargs["ckpt_prefix"]`` gets
+ automatically overwritten to ``None`` (so that the test DataLoader
+ is not added to the checkpointer).
+
+ Returns
+ -------
+ average test loss
+ """
+ if progressbar is None:
+ progressbar = not self.noprogressbar
+
+ if not isinstance(test_set, (DataLoader, LoopedLoader)):
+ test_loader_kwargs["ckpt_prefix"] = None
+ test_set = self.make_dataloader(
+ test_set, sb.Stage.TEST, **test_loader_kwargs
+ )
+ self.modules = self.modules.to(self.device)
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+ self.on_evaluate_start(max_key=max_key, min_key=min_key)
+ self.on_stage_start(sb.Stage.TEST, None)
+ self.modules.eval()
+ avg_test_loss = 0.0
+ batch_count = 0
+ with torch.no_grad():
+ for batch in tqdm(test_set, dynamic_ncols=True, disable=not progressbar):
+ self.step += 1
+ _, wav_lens = batch.sig
+ batch_count += wav_lens.shape[0]
+ loss = self.evaluate_batch(batch, stage=sb.Stage.TEST)
+ avg_test_loss = self.update_average(loss, avg_test_loss)
+
+ # Debug mode only runs a few batches
+ if self.debug and self.step == self.debug_batches:
+ break
+
+ self.on_stage_end(sb.Stage.TEST, avg_test_loss, None)
+ cer = self.stage_wer
+ self.step = 0
+ if self.device == "cpu":
+ self.modules = self.modules.to("cpu")
+ gc.collect()
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+
+ return batch_count, avg_test_loss, cer
diff --git a/baselines/fedwav2vec2/fedwav2vec2/server.py b/baselines/fedwav2vec2/fedwav2vec2/server.py
new file mode 100644
index 000000000000..db42c0f069a5
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/server.py
@@ -0,0 +1,69 @@
+"""Create global evaluation function.
+
+Optionally, also define a new Server class (please note this is not needed in most
+settings).
+"""
+
+import gc
+import os
+from typing import Callable, Dict
+
+import flwr as fl
+import torch
+from flwr.common import Scalar
+from omegaconf import DictConfig
+
+from fedwav2vec2.client import SpeechBrainClient
+from fedwav2vec2.models import int_model
+
+
+def get_on_fit_config_fn(local_epochs: int) -> Callable[[int], Dict[str, str]]:
+ """Return a function which returns training configurations."""
+
+ def fit_config(rnd: int) -> Dict[str, str]:
+ """Return a configuration with static batch size and (local) epochs."""
+ config = {"epoch_global": str(rnd), "epochs": str(local_epochs)}
+ return config
+
+ return fit_config
+
+
+def get_evaluate_fn(config: DictConfig, server_device: str, save_path: str):
+ """Return function to execute during global evaluation."""
+ config_ = config
+
+ def evaluate_fn(
+ server_round: int, weights: fl.common.NDArrays, config: Dict[str, Scalar]
+ ):
+ """Run centralized evaluation."""
+        _ = config  # config is unused; server_round is used when saving
+ # int model
+ asr_brain, dataset = int_model(
+ config_.server_cid,
+ config_,
+ server_device,
+ save_path,
+ evaluate=True,
+ )
+
+ client = SpeechBrainClient(config_.server_cid, asr_brain, dataset)
+
+ _, lss, err = client.evaluate_train_speech_recogniser(
+ server_params=weights,
+ epochs=1,
+ )
+ # Save model if indicated
+ if config_.save_checkpoint is not None:
+            os.makedirs(config_.save_checkpoint, exist_ok=True)
+ checkpoint = os.path.join(config_.save_checkpoint, "last_checkpoint.pt")
+ torch.save(asr_brain.modules.state_dict(), checkpoint)
+ print(f"Checkpoint saved for round {server_round}")
+
+ del client, asr_brain, dataset
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+ return lss, {"Error rate": err}
+
+ return evaluate_fn
diff --git a/baselines/fedwav2vec2/fedwav2vec2/strategy.py b/baselines/fedwav2vec2/fedwav2vec2/strategy.py
new file mode 100644
index 000000000000..6aa6007e4ccd
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/strategy.py
@@ -0,0 +1,69 @@
+"""Optionally define a custom strategy.
+
+Needed only when the strategy is not yet implemented in Flower or when you want to
+extend or modify the functionality of an existing strategy.
+"""
+
+
+import gc
+from typing import Dict, List, Optional, Tuple, Union
+
+import flwr as fl
+import torch
+from flwr.common import (
+ FitRes,
+ Parameters,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.strategy.aggregate import aggregate
+
+
+class CustomFedAvg(fl.server.strategy.FedAvg):
+ """Custom strategy to aggregate using metrics instead of number of samples."""
+
+ def __init__(self, *args, weight_strategy, **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.weight_strategy = weight_strategy
+
+ def aggregate_fit(
+ self,
+ _: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ ) -> Tuple[Optional[Parameters], Dict[str, Union[bool, bytes, float, int, str]]]:
+ """Aggregate results using different weighing metrics (train_loss or WER)."""
+ if not results:
+ return None, {}
+ # Do not aggregate if there are failures and failures are not accepted
+ if not self.accept_failures and failures:
+ return None, {}
+
+ # Convert results
+ key_name = "train_loss" if self.weight_strategy == "loss" else "wer"
+ weights = None
+
+        # Define aggregation weights
+ if self.weight_strategy == "num":
+ weights_results = [
+ (parameters_to_ndarrays(fit_res.parameters), fit_res.num_examples)
+ for _, fit_res in results
+ ]
+ weights = aggregate(weights_results)
+ else:
+ weights_results = [
+ (
+ parameters_to_ndarrays(fit_res.parameters),
+ int(fit_res.metrics[key_name]),
+ )
+ for _, fit_res in results
+ ]
+ weights = aggregate(weights_results)
+
+ # Free memory for next round
+ del results, weights_results
+ if torch.cuda.is_available():
+ torch.cuda.empty_cache()
+ gc.collect()
+ return ndarrays_to_parameters(weights), {}
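The `aggregate` helper invoked above computes a weighted average of the clients' parameter lists, with the second tuple element as the weight (number of examples, loss, or WER, depending on `weight_strategy`). A minimal NumPy sketch of that computation (illustrative, not Flower's implementation):

```python
import numpy as np


def weighted_aggregate(results):
    """Weighted average over (list_of_ndarrays, weight) pairs,
    mirroring the pairs built in aggregate_fit."""
    total = sum(w for _, w in results)
    n_layers = len(results[0][0])
    return [
        sum(params[i] * (w / total) for params, w in results)
        for i in range(n_layers)
    ]


c1 = [np.array([0.0, 0.0])]  # client 1, weight 1
c2 = [np.array([4.0, 8.0])]  # client 2, weight 3
agg = weighted_aggregate([(c1, 1), (c2, 3)])
# layer 0 becomes 0.25 * c1 + 0.75 * c2 = [3.0, 6.0]
```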
diff --git a/baselines/fedwav2vec2/fedwav2vec2/utils.py b/baselines/fedwav2vec2/fedwav2vec2/utils.py
new file mode 100644
index 000000000000..9a831719d623
--- /dev/null
+++ b/baselines/fedwav2vec2/fedwav2vec2/utils.py
@@ -0,0 +1,6 @@
+"""Define any utility function.
+
+They are not directly relevant to the other (more FL specific) python modules. For
+example, you may define here things like: loading a model from a checkpoint, saving
+results, plotting.
+"""
diff --git a/baselines/fedwav2vec2/pyproject.toml b/baselines/fedwav2vec2/pyproject.toml
new file mode 100644
index 000000000000..1e7dbf55154b
--- /dev/null
+++ b/baselines/fedwav2vec2/pyproject.toml
@@ -0,0 +1,142 @@
+[build-system]
+requires = ["poetry-core>=1.4.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "fedwav2vec2" # <----- Ensure it matches the name of your baseline directory containing all the source code
+version = "1.0.0"
+description = "Federated Learning for ASR Based on wav2vec 2.0"
+license = "Apache-2.0"
+authors = ["The Flower Authors ", "Tuan Nguyen "]
+readme = "README.md"
+homepage = "https://flower.dev"
+repository = "https://github.com/adap/flower"
+documentation = "https://flower.dev"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "License :: OSI Approved :: Apache Software License",
+ "Operating System :: MacOS :: MacOS X",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+ "Topic :: Scientific/Engineering :: Mathematics",
+ "Topic :: Software Development",
+ "Topic :: Software Development :: Libraries",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Typing :: Typed",
+]
+
+[tool.poetry.dependencies]
+python = ">=3.10.0, <3.12.0" # don't change this
+flwr = { extras = ["simulation"], version = "1.5.0" }
+hydra-core = "1.3.2" # don't change this
+speechbrain = "0.5.15"
+pandas = "2.1.1"
+torch = { url = "https://download.pytorch.org/whl/cu116/torch-1.13.1%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+torchaudio = { url = "https://download.pytorch.org/whl/cu116/torchaudio-0.13.1%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+transformers = "4.33.2"
+
+[tool.poetry.dev-dependencies]
+isort = "==5.11.5"
+black = "==23.1.0"
+docformatter = "==1.5.1"
+mypy = "==1.4.1"
+pylint = "==2.8.2"
+flake8 = "==3.9.2"
+pytest = "==6.2.4"
+pytest-watch = "==4.2.0"
+ruff = "==0.0.272"
+types-requests = "==2.27.7"
+
+[tool.isort]
+line_length = 88
+indent = " "
+multi_line_output = 3
+include_trailing_comma = true
+force_grid_wrap = 0
+use_parentheses = true
+
+[tool.black]
+line-length = 88
+target-version = ["py38", "py39", "py310", "py311"]
+
+[tool.pytest.ini_options]
+minversion = "6.2"
+addopts = "-qq"
+testpaths = [
+ "flwr_baselines",
+]
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = false
+plugins = "numpy.typing.mypy_plugin"
+
+[tool.pylint."MESSAGES CONTROL"]
+disable = "bad-continuation,duplicate-code,too-few-public-methods,useless-import-alias"
+good-names = "i,j,k,_,x,y,X,Y"
+signature-mutators="hydra.main.main"
+
+[tool.pylint.typecheck]
+generated-members="numpy.*, torch.*, tensorflow.*"
+
+[[tool.mypy.overrides]]
+module = [
+ "importlib.metadata.*",
+ "importlib_metadata.*",
+]
+follow_imports = "skip"
+follow_imports_for_stubs = true
+disallow_untyped_calls = false
+
+[[tool.mypy.overrides]]
+module = "torch.*"
+follow_imports = "skip"
+follow_imports_for_stubs = true
+
+[tool.docformatter]
+wrap-summaries = 88
+wrap-descriptions = 88
+
+[tool.ruff]
+target-version = "py38"
+line-length = 88
+select = ["D", "E", "F", "W", "B", "ISC", "C4"]
+fixable = ["D", "E", "F", "W", "B", "ISC", "C4"]
+ignore = ["B024", "B027"]
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".pytype",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+ "proto",
+]
+
+[tool.ruff.pydocstyle]
+convention = "numpy"
diff --git a/baselines/fjord/.gitignore b/baselines/fjord/.gitignore
new file mode 100644
index 000000000000..8199f9d1a17f
--- /dev/null
+++ b/baselines/fjord/.gitignore
@@ -0,0 +1,3 @@
+data/
+runs/
+exp_logs/
diff --git a/baselines/fjord/LICENSE b/baselines/fjord/LICENSE
new file mode 100644
index 000000000000..d64569567334
--- /dev/null
+++ b/baselines/fjord/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/baselines/fjord/README.md b/baselines/fjord/README.md
new file mode 100644
index 000000000000..563f583082a8
--- /dev/null
+++ b/baselines/fjord/README.md
@@ -0,0 +1,118 @@
+---
+title: "FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout"
+url: "https://openreview.net/forum?id=4fLr7H5D_eT"
+labels: ["Federated Learning", "Heterogeneity", "Efficient DNNs", "Distributed Systems"]
+dataset: ["CIFAR-10"]
+---
+
+# FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
+
+**Paper:** [openreview.net/forum?id=4fLr7H5D_eT](https://openreview.net/forum?id=4fLr7H5D_eT)
+
+**Authors:** Samuel Horváth\*, Stefanos Laskaridis\*, Mario Almeida\*, Ilias Leontiadis, Stylianos Venieris, Nicholas Donald Lane
+
+
+**Abstract:** Federated Learning (FL) has been gaining significant traction across different ML tasks, ranging from vision to keyboard predictions. In large-scale deployments, client heterogeneity is a fact and constitutes a primary problem for fairness, training performance and accuracy. Although significant efforts have been made into tackling statistical data heterogeneity, the diversity in the processing capabilities and network bandwidth of clients, termed system heterogeneity, has remained largely unexplored. Current solutions either disregard a large portion of available devices or set a uniform limit on the model's capacity, restricted by the least capable participants.
+
+In this work, we introduce Ordered Dropout, a mechanism that achieves an ordered, nested representation of knowledge in Neural Networks and enables the extraction of lower footprint submodels without the need for retraining. We further show that for linear maps our Ordered Dropout is equivalent to SVD. We employ this technique, along with a self-distillation methodology, in the realm of FL in a framework called FjORD. FjORD alleviates the problem of client system heterogeneity by tailoring the model width to the client's capabilities.
+Extensive evaluation on both CNNs and RNNs across diverse modalities shows that FjORD consistently leads to significant performance gains over state-of-the-art baselines while maintaining its nested structure.
+
+
+## About this baseline
+
+**What’s implemented:** The code in this directory implements the two variants of FjORD, with and without knowledge distillation.
+
+**Datasets:** CIFAR-10
+
+**Hardware Setup:** We trained the baseline on an Nvidia RTX 4090.
+
+**Contributors:** @stevelaskaridis ([Brave Software](https://brave.com/)), @SamuelHorvath ([MBZUAI](https://mbzuai.ac.ae/))
+
+
+## Experimental Setup
+
+**Task:** Image Classification
+
+**Model:** ResNet-18
+
+**Dataset:**
+
+| **Feature** | **Value** |
+| -------------------------- | ---------------------------- |
+| **Dataset** | CIFAR-10 |
+| **Partition** | Randomised Sequential Split |
+| **Number of Partitions** | 100 clients |
+| **Data points per client** | 500 samples |
+
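With 100 clients at 500 samples each, the partition covers CIFAR-10's 50,000 training images exactly. A sketch of the assumed randomised sequential split (shuffle once, then cut into contiguous blocks; the seed and helper are illustrative, not the baseline's code):

```python
import numpy as np


def partition(n_samples=50_000, n_clients=100, seed=0):
    """Shuffle indices once, then split sequentially into equal blocks."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)       # randomise
    return np.array_split(idx, n_clients)  # sequential 500-sample blocks


parts = partition()
assert len(parts) == 100 and all(len(p) == 500 for p in parts)
# partitions are disjoint and jointly cover the whole training set
assert len(np.unique(np.concatenate(parts))) == 50_000
```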
+**Training Hyperparameters:**
+
+| **Hyperparameter** | **Value** |
+| ----------------------- | ------------------------- |
+| batch size | 32 |
+| learning rate | 0.1 |
+| learning rate scheduler | static |
+| optimiser | sgd |
+| momentum | 0 |
+| nesterov | False |
+| weight decay | 1e-4 |
+| sample per round | 10 |
+| local epochs | 1 |
+| p-values | [0.2, 0.4, 0.6, 0.8, 1.0] |
+| client tier allocation | uniform |
+
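The `p-values` are the Ordered Dropout widths: a client at tier `p` keeps only the first ⌈p·K⌉ units of each layer, so every smaller submodel is a nested prefix of the full model. A minimal sketch of this nesting on a single linear layer (illustrative only, not the baseline's `ODLinear` implementation):

```python
import numpy as np


def od_linear_forward(weight, bias, x, p):
    """Forward pass keeping only the first ceil(p * out_features) units."""
    k = int(np.ceil(p * weight.shape[0]))
    return x @ weight[:k].T + bias[:k]


rng = np.random.default_rng(0)
w, b = rng.standard_normal((8, 4)), np.zeros(8)
x = rng.standard_normal((2, 4))

full = od_linear_forward(w, b, x, 1.0)  # all 8 output units
half = od_linear_forward(w, b, x, 0.5)  # first 4 units only
# nested: the p=0.5 output is exactly a prefix of the p=1.0 output
assert np.allclose(half, full[:, :4])
```

This prefix structure is what lets lower-footprint submodels be extracted without retraining, as described in the abstract.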
+
+## Environment Setup
+
+### Through regular pip
+
+```bash
+pip install -r requirements.txt
+python setup.py install
+```
+
+### Through poetry
+
+```bash
+# Set python version
+pyenv install 3.10.6
+pyenv local 3.10.6
+
+# Tell poetry to use python 3.10
+poetry env use 3.10.6
+
+# install the base Poetry environment
+poetry install
+
+# activate the environment
+poetry shell
+```
+
+## Running the Experiments
+
+### Through your environment
+
+
+```bash
+python -m fjord.main # without knowledge distillation
+# or
+python -m fjord.main +train_mode=fjord_kd # with knowledge distillation
+```
+
+### Through poetry
+
+```bash
+poetry run python -m fjord.main # without knowledge distillation
+# or
+poetry run python -m fjord.main +train_mode=fjord_kd # with knowledge distillation
+```
+
+## Expected Results
+
+To reproduce the expected results, run:
+
+```bash
+cd scripts/
+./run.sh
+```
+
+Plots and the associated code reside in `fjord/notebooks/visualise.ipynb`.
+
+![resnet18_cifar10_500_global_rounds_acc_pvalues](./_static/resnet18_cifar10_500_global_rounds_acc_pvalues.png)
\ No newline at end of file
diff --git a/baselines/fjord/_static/resnet18_cifar10_500_global_rounds_acc_pvalues.png b/baselines/fjord/_static/resnet18_cifar10_500_global_rounds_acc_pvalues.png
new file mode 100644
index 000000000000..de3ad61a5d55
Binary files /dev/null and b/baselines/fjord/_static/resnet18_cifar10_500_global_rounds_acc_pvalues.png differ
diff --git a/baselines/fjord/_static/resnet18_cifar10_fjord_convergence.png b/baselines/fjord/_static/resnet18_cifar10_fjord_convergence.png
new file mode 100644
index 000000000000..12b137e3d196
Binary files /dev/null and b/baselines/fjord/_static/resnet18_cifar10_fjord_convergence.png differ
diff --git a/baselines/fjord/_static/resnet18_cifar10_fjord_kd_convergence.png b/baselines/fjord/_static/resnet18_cifar10_fjord_kd_convergence.png
new file mode 100644
index 000000000000..358a5d19a281
Binary files /dev/null and b/baselines/fjord/_static/resnet18_cifar10_fjord_kd_convergence.png differ
diff --git a/baselines/fjord/fjord/__init__.py b/baselines/fjord/fjord/__init__.py
new file mode 100644
index 000000000000..7aa11d2a7b9f
--- /dev/null
+++ b/baselines/fjord/fjord/__init__.py
@@ -0,0 +1 @@
+"""FjORD package."""
diff --git a/baselines/fjord/fjord/client.py b/baselines/fjord/fjord/client.py
new file mode 100644
index 000000000000..2b18d9547086
--- /dev/null
+++ b/baselines/fjord/fjord/client.py
@@ -0,0 +1,240 @@
+"""Flower client implementing FjORD."""
+from collections import OrderedDict
+from copy import deepcopy
+from types import SimpleNamespace
+from typing import Any, Dict, List, Tuple, Union
+
+import flwr as fl
+import numpy as np
+import torch
+from torch import Tensor
+from torch.nn import Module
+from torch.utils.data import DataLoader
+
+from .dataset import load_data
+from .models import get_net, test, train
+from .od.layers import ODBatchNorm2d, ODConv2d, ODLinear
+from .od.samplers import ODSampler
+from .utils.logger import Logger
+from .utils.utils import save_model
+
+FJORD_CONFIG_TYPE = Dict[
+ Union[str, float],
+ List[Any],
+]
+
+
+def get_layer_from_state_dict(model: Module, state_dict_key: str) -> Module:
+ """Get the layer corresponding to the given state dict key.
+
+ :param model: The model.
+ :param state_dict_key: The state dict key.
+ :return: The module corresponding to the given state dict key.
+ """
+ keys = state_dict_key.split(".")
+ module = model
+    # The last key corresponds to the parameter name
+    # (e.g., weight or bias)
+ for key in keys[:-1]:
+ module = getattr(module, key)
+ return module
+
+
+def net_to_state_dict_layers(net: Module) -> List[Module]:
+    """Get the layers backing each entry of the model's state_dict.
+
+    :param net: The model.
+    :return: One module per state_dict entry, in state_dict order.
+ """
+ layers = []
+ for key, _ in net.state_dict().items():
+ layer = get_layer_from_state_dict(net, key)
+ layers.append(layer)
+ return layers
+
+
+def get_agg_config(
+ net: Module, trainloader: DataLoader, p_s: List[float]
+) -> FJORD_CONFIG_TYPE:
+ """Get the aggregation configuration of the model.
+
+ :param net: The model.
+ :param trainloader: The training set.
+ :param p_s: The p values used
+ :return: The aggregation configuration of the model.
+ """
+ Logger.get().info("Constructing OD model configuration for aggregation.")
+ device = next(net.parameters()).device
+ images, _ = next(iter(trainloader))
+ images = images.to(device)
+ layers = net_to_state_dict_layers(net)
+ # init min dims in networks
+ config: FJORD_CONFIG_TYPE = {p: [{} for _ in layers] for p in p_s}
+ config["layer"] = []
+ config["layer_p"] = []
+ with torch.no_grad():
+ for p in p_s:
+ max_sampler = ODSampler(
+ p_s=[p],
+ max_p=p,
+ model=net,
+ )
+ net(images, sampler=max_sampler)
+ for i, layer in enumerate(layers):
+ if isinstance(layer, (ODConv2d, ODLinear)):
+ config[p][i]["in_dim"] = layer.last_input_dim
+ config[p][i]["out_dim"] = layer.last_output_dim
+ elif isinstance(layer, ODBatchNorm2d):
+ config[p][i]["in_dim"] = None
+ config[p][i]["out_dim"] = layer.p_to_num_features[p]
+ elif isinstance(layer, torch.nn.BatchNorm2d):
+ pass
+ else:
+ raise ValueError(f"Unsupported layer {layer.__class__.__name__}")
+ for layer in layers:
+ config["layer"].append(layer.__class__.__name__)
+ if hasattr(layer, "p"):
+ config["layer_p"].append(layer.p)
+ else:
+ config["layer_p"].append(None)
+ return config
+
+
+# Define Flower client
+class FjORDClient(
+ fl.client.NumPyClient
+): # pylint: disable=too-many-instance-attributes
+ """Flower client training on CIFAR-10."""
+
+ def __init__( # pylint: disable=too-many-arguments
+ self,
+ cid: int,
+ model_name: str,
+ model_path: str,
+ data_path: str,
+ know_distill: bool,
+ max_p: float,
+ p_s: List[float],
+ train_config: SimpleNamespace,
+ fjord_config: FJORD_CONFIG_TYPE,
+ log_config: Dict[str, str],
+ device: torch.device,
+ seed: int,
+ ) -> None:
+ """Initialise the client.
+
+ :param cid: The client ID.
+ :param model_name: The model name.
+ :param model_path: The path to save the model.
+ :param data_path: The path to the dataset.
+ :param know_distill: Whether the model uses knowledge distillation.
+ :param max_p: The maximum p value.
+ :param p_s: The p values to use for training.
+ :param train_config: The training configuration.
+ :param fjord_config: The configuration for Fjord.
+ :param log_config: The logging configuration.
+ :param device: The device to use.
+ :param seed: The seed to use for the random number generator.
+ """
+ Logger.setup_logging(**log_config)
+ self.cid = cid
+ self.p_s = p_s
+ self.net = get_net(model_name, p_s, device)
+ self.trainloader, self.valloader = load_data(
+ data_path, int(cid), train_config.batch_size, seed
+ )
+
+ self.know_distill = know_distill
+ self.max_p = max_p
+ self.fjord_config = fjord_config
+ self.train_config = train_config
+ self.model_path = model_path
+
+ def get_parameters(self, config: Dict[str, fl.common.Scalar]) -> List[np.ndarray]:
+ """Get the parameters of the model to return to the server.
+
+ :param config: The configuration.
+ :return: The parameters of the model.
+ """
+ Logger.get().info(f"Getting parameters from client {self.cid}")
+ return [val.cpu().numpy() for _, val in self.net.state_dict().items()]
+
+ def net_to_state_dict_layers(self) -> List[Module]:
+        """Return the layers backing the model's state_dict entries."""
+ return net_to_state_dict_layers(self.net)
+
+ def set_parameters(self, parameters: List[np.ndarray]) -> None:
+ """Set the parameters of the model.
+
+ :param parameters: The parameters of the model.
+ """
+ params_dict = zip(self.net.state_dict().keys(), parameters)
+ state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
+ self.net.load_state_dict(state_dict, strict=True)
+
+ def fit(
+ self, parameters: List[Tensor], config: Dict[str, fl.common.Scalar]
+ ) -> Tuple[List[np.ndarray], int, Dict[str, Any]]:
+ """Train the model on the training set.
+
+ :param parameters: The parameters of the model.
+ :param config: The train configuration.
+ :return: The parameters of the model, the number of samples used for training,
+ and the training metrics
+ """
+ Logger.get().info(
+ f"Training on client {self.cid} for round "
+ f"{config['current_round']!r}/{config['total_rounds']!r}"
+ )
+
+ original_parameters = deepcopy(parameters)
+
+ self.set_parameters(parameters)
+ self.train_config.lr = config["lr"]
+
+ loss = train(
+ self.net,
+ self.trainloader,
+ self.know_distill,
+ self.max_p,
+ p_s=self.p_s,
+ epochs=self.train_config.local_epochs,
+ current_round=int(config["current_round"]),
+ total_rounds=int(config["total_rounds"]),
+ train_config=self.train_config,
+ )
+
+ final_parameters = self.get_parameters(config={})
+
+ return (
+ final_parameters,
+ len(self.trainloader.dataset),
+ {
+ "max_p": self.max_p,
+ "p_s": self.p_s,
+ "fjord_config": self.fjord_config,
+ "original_parameters": original_parameters,
+ "loss": loss,
+ },
+ )
+
+ def evaluate(
+ self, parameters: List[np.ndarray], config: Dict[str, fl.common.Scalar]
+ ) -> Tuple[float, int, Dict[str, Union[bool, bytes, float, int, str]]]:
+ """Validate the model on the test set.
+
+ :param parameters: The parameters of the model.
+ :param config: The eval configuration.
+ :return: The loss on the test set, the number of samples used for evaluation,
+ and the evaluation metrics.
+ """
+ Logger.get().info(
+ f"Evaluating on client {self.cid} for round "
+ f"{config['current_round']!r}/{config['total_rounds']!r}"
+ )
+
+ self.set_parameters(parameters)
+ loss, accuracy = test(self.net, self.valloader, [self.max_p])
+ save_model(self.net, self.model_path, cid=self.cid)
+
+ return loss[0], len(self.valloader.dataset), {"accuracy": accuracy[0]}
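For orientation, the `get_parameters`/`set_parameters` pair in `FjORDClient` above is a plain state_dict round-trip through NumPy. A minimal sketch of the same round-trip using a stand-in `nn.Linear` (the real client uses the OD ResNet):

```python
from collections import OrderedDict

import torch
from torch import nn

net = nn.Linear(4, 2)  # stand-in for the OD ResNet

# get_parameters: state_dict tensors to NumPy arrays, in state_dict order
params = [val.cpu().numpy() for _, val in net.state_dict().items()]

# set_parameters: zip the keys back with the arrays and reload strictly
params_dict = zip(net.state_dict().keys(), params)
state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
net.load_state_dict(state_dict, strict=True)
```

The strict reload is what makes the ordering guarantee matter: the server must return arrays in exactly the order `state_dict()` yields them.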
diff --git a/baselines/fjord/fjord/conf/__init__.py b/baselines/fjord/fjord/conf/__init__.py
new file mode 100644
index 000000000000..39fdacc8e90b
--- /dev/null
+++ b/baselines/fjord/fjord/conf/__init__.py
@@ -0,0 +1 @@
+"""FjORD configuration."""
diff --git a/baselines/fjord/fjord/conf/common.yaml b/baselines/fjord/fjord/conf/common.yaml
new file mode 100644
index 000000000000..d0f392faf4ff
--- /dev/null
+++ b/baselines/fjord/fjord/conf/common.yaml
@@ -0,0 +1,39 @@
+# @package _global_
+---
+loglevel: info
+logfile: run.log
+
+manual_seed: 123
+model: resnet18
+dataset: cifar10
+num_clients: 100
+data_path: "./data"
+num_workers: 4
+evaluate_every: 10
+
+cuda: true
+batch_size: 32
+lr: 0.1
+lr_scheduler: static
+optimiser: sgd
+momentum: 0
+nesterov: false
+weight_decay: 1e-4
+
+sampled_clients: 10
+min_fit_clients: 2
+client_selection: random # or balanced
+num_rounds: 500
+local_epochs: 1
+strategy: fjord_fedavg
+client_resources:
+ num_cpus: 1
+ num_gpus: 0.2
+knowledge_distillation: ???
+p_s:
+ - 0.2
+ - 0.4
+ - 0.6
+ - 0.8
+ - 1.0
+client_tier_allocation: uniform
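For reference, the `uniform` value of `client_tier_allocation` corresponds (in `main.py`) to splitting clients evenly across the `p_s` tiers; a sketch of the resulting mapping for the defaults above:

```python
# 100 clients split evenly across 5 p-values: clients 0-19 train at
# max p=0.2, clients 20-39 at p=0.4, ..., clients 80-99 at p=1.0.
p_s = [0.2, 0.4, 0.6, 0.8, 1.0]
num_clients = 100

tier_size = num_clients // len(p_s)
cid_to_max_p = {cid: p_s[cid // tier_size] for cid in range(num_clients)}
```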
diff --git a/baselines/fjord/fjord/conf/config.yaml b/baselines/fjord/fjord/conf/config.yaml
new file mode 100644
index 000000000000..a1cdc87c63ce
--- /dev/null
+++ b/baselines/fjord/fjord/conf/config.yaml
@@ -0,0 +1,8 @@
+---
+hydra:
+ run:
+ dir: ./runs/${now:%Y-%m-%d}:${now:%H-%M-%S}
+
+defaults:
+ - train_mode/fjord
+ - override hydra/job_logging: disabled
diff --git a/baselines/fjord/fjord/conf/train_mode/fjord.yaml b/baselines/fjord/fjord/conf/train_mode/fjord.yaml
new file mode 100644
index 000000000000..33b0e17957e0
--- /dev/null
+++ b/baselines/fjord/fjord/conf/train_mode/fjord.yaml
@@ -0,0 +1,6 @@
+# @package _global_
+---
+defaults:
+ - ../common@
+
+knowledge_distillation: false
\ No newline at end of file
diff --git a/baselines/fjord/fjord/conf/train_mode/fjord_kd.yaml b/baselines/fjord/fjord/conf/train_mode/fjord_kd.yaml
new file mode 100644
index 000000000000..d344d95314e9
--- /dev/null
+++ b/baselines/fjord/fjord/conf/train_mode/fjord_kd.yaml
@@ -0,0 +1,6 @@
+# @package _global_
+---
+defaults:
+ - ../common@
+
+knowledge_distillation: true
\ No newline at end of file
diff --git a/baselines/fjord/fjord/dataset.py b/baselines/fjord/fjord/dataset.py
new file mode 100644
index 000000000000..478826c2cf64
--- /dev/null
+++ b/baselines/fjord/fjord/dataset.py
@@ -0,0 +1,174 @@
+"""Dataset for CIFAR10."""
+import random
+from typing import Optional, Tuple
+
+import numpy as np
+import torch
+from PIL import Image
+from torch.nn import Module
+from torch.utils.data import DataLoader, Dataset
+from torchvision import transforms
+from torchvision.datasets import CIFAR10
+
+CIFAR_NORMALIZATION = ((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010))
+
+
+class FLCifar10Client(Dataset):
+ """Class implementing the partitioned CIFAR10 dataset."""
+
+ def __init__(self, fl_dataset: Dataset, client_id: Optional[int] = None) -> None:
+        """Initialise the client-partitioned dataset view.
+
+ Args:
+ :param fl_dataset: The CIFAR10 dataset.
+ :param client_id: The client id to be used.
+ """
+ self.fl_dataset = fl_dataset
+ self.set_client(client_id)
+
+ def set_client(self, index: Optional[int] = None) -> None:
+ """Set the client to the given index. If index is None, use the whole dataset.
+
+ Args:
+ :param index: Index of the client to be used.
+ """
+ fl = self.fl_dataset
+        if index is None:
+            self.client_id = None
+            self.length = len(fl.data)
+            self.data = fl.data
+            self.targets = fl.targets
+        else:
+            if index < 0 or index >= fl.num_clients:
+                raise ValueError("Client index is out of bounds.")
+ self.client_id = index
+ indices = fl.partition[self.client_id]
+ self.length = len(indices)
+ self.data = fl.data[indices]
+ self.targets = [fl.targets[i] for i in indices]
+
+ def __getitem__(self, index: int):
+ """Return the item at the given index.
+
+ :param index: Index of the item to be returned.
+ :return: The item at the given index.
+ """
+ fl = self.fl_dataset
+ img, target = self.data[index], self.targets[index]
+
+ # doing this so that it is consistent with all other fl_datasets
+ # to return a PIL Image
+ img = Image.fromarray(img)
+
+ if fl.transform is not None:
+ img = fl.transform(img)
+
+ if fl.target_transform is not None:
+ target = fl.target_transform(target)
+
+ return img, target
+
+ def __len__(self):
+ """Return the length of the dataset."""
+ return self.length
+
+
+class FLCifar10(CIFAR10):
+ """CIFAR10 Federated Dataset."""
+
+ def __init__( # pylint: disable=too-many-arguments
+ self,
+ root: str,
+        train: bool = True,
+ transform: Optional[Module] = None,
+ target_transform: Optional[Module] = None,
+        download: bool = False,
+ ) -> None:
+        """Initialise the federated CIFAR10 dataset.
+
+ :param root: Root directory of dataset
+ :param train: If True, creates dataset from training set
+ :param transform: A function/transform that takes in an PIL image and returns a
+ transformed version.
+ :param target_transform: A function/transform that takes in the target and
+ transforms it.
+ :param download: If true, downloads the dataset from the internet.
+ """
+ super().__init__(
+ root,
+ train=train,
+ transform=transform,
+ target_transform=target_transform,
+ download=download,
+ )
+
+ # Uniform shuffle
+ shuffle = np.arange(len(self.data))
+ rng = np.random.default_rng(12345)
+ rng.shuffle(shuffle)
+ self.partition = shuffle.reshape([100, -1])
+ self.num_clients = len(self.partition)
+
+
+def get_transforms() -> Tuple[transforms.Compose, transforms.Compose]:
+ """Get the transforms for the CIFAR10 dataset.
+
+ :return: The transforms for the CIFAR10 dataset.
+ """
+ transform_train = transforms.Compose(
+ [
+ transforms.RandomCrop(32, padding=4),
+ transforms.RandomHorizontalFlip(),
+ transforms.ToTensor(),
+ transforms.Normalize(*CIFAR_NORMALIZATION),
+ ]
+ )
+
+ transform_test = transforms.Compose(
+ [
+ transforms.ToTensor(),
+ transforms.Normalize(*CIFAR_NORMALIZATION),
+ ]
+ )
+
+ return transform_train, transform_test
+
+
+def load_data(
+ path: str, cid: int, train_bs: int, seed: int, eval_bs: int = 1024
+) -> Tuple[DataLoader, DataLoader]:
+ """Load the CIFAR10 dataset.
+
+ :param path: The path to the dataset.
+ :param cid: The client ID.
+ :param train_bs: The batch size for training.
+ :param seed: The seed to use for the random number generator.
+ :param eval_bs: The batch size for evaluation.
+ :return: The training and test sets.
+ """
+
+ def seed_worker(worker_id): # pylint: disable=unused-argument
+ worker_seed = torch.initial_seed() % 2**32
+ np.random.seed(worker_seed)
+ random.seed(worker_seed)
+
+ g = torch.Generator()
+ g.manual_seed(seed)
+ transform_train, transform_test = get_transforms()
+
+ fl_dataset = FLCifar10(
+ root=path, train=True, download=True, transform=transform_train
+ )
+
+ trainset = FLCifar10Client(fl_dataset, client_id=cid)
+ testset = CIFAR10(root=path, train=False, download=True, transform=transform_test)
+
+ train_loader = DataLoader(
+ trainset,
+ batch_size=train_bs,
+ shuffle=True,
+ worker_init_fn=seed_worker,
+ generator=g,
+ )
+ test_loader = DataLoader(testset, batch_size=eval_bs)
+
+ return train_loader, test_loader
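The client shards produced by `FLCifar10` come from a single seeded shuffle-and-reshape of the training indices; the partition can be reproduced standalone:

```python
import numpy as np

# Same seeded IID partition as FLCifar10: all 50,000 CIFAR-10 training
# indices, shuffled once with a fixed seed and split into 100 shards.
num_examples = 50_000
num_clients = 100

shuffle = np.arange(num_examples)
rng = np.random.default_rng(12345)  # fixed seed, matching FLCifar10
rng.shuffle(shuffle)
partition = shuffle.reshape([num_clients, -1])  # one row per client
```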
diff --git a/baselines/fjord/fjord/dataset_preparation.py b/baselines/fjord/fjord/dataset_preparation.py
new file mode 100644
index 000000000000..fe70679d3351
--- /dev/null
+++ b/baselines/fjord/fjord/dataset_preparation.py
@@ -0,0 +1 @@
+"""All dataset-related logic happens in dataset.py."""
diff --git a/baselines/fjord/fjord/main.py b/baselines/fjord/fjord/main.py
new file mode 100644
index 000000000000..f85fb9ccf158
--- /dev/null
+++ b/baselines/fjord/fjord/main.py
@@ -0,0 +1,278 @@
+"""Main script for FjORD."""
+import math
+import os
+import random
+from types import SimpleNamespace
+from typing import Any, Callable, Dict, List, Optional, Union
+
+import flwr as fl
+import hydra
+import numpy as np
+import torch
+from flwr.client import Client, NumPyClient
+from omegaconf import OmegaConf, open_dict
+
+from .client import FJORD_CONFIG_TYPE, FjORDClient, get_agg_config
+from .dataset import load_data
+from .models import get_net
+from .server import get_eval_fn
+from .strategy import FjORDFedAVG
+from .utils.logger import Logger
+from .utils.utils import get_parameters
+
+
+def get_fit_config_fn(
+ total_rounds: int, lr: float
+) -> Callable[[int], Dict[str, fl.common.Scalar]]:
+ """Get fit config function.
+
+ :param total_rounds: Total number of rounds
+ :param lr: Learning rate
+ :return: Fit config function
+ """
+
+ def fit_config(rnd: int) -> Dict[str, fl.common.Scalar]:
+ config: Dict[str, fl.common.Scalar] = {
+ "current_round": rnd,
+ "total_rounds": total_rounds,
+ "lr": lr,
+ }
+ return config
+
+ return fit_config
+
+
+def get_client_fn( # pylint: disable=too-many-arguments
+ args: Any,
+ model_path: str,
+ cid_to_max_p: Dict[int, float],
+ config: FJORD_CONFIG_TYPE,
+ train_config: SimpleNamespace,
+ device: torch.device,
+) -> Callable[[str], Union[Client, NumPyClient]]:
+ """Get client function that creates Flower client.
+
+ :param args: CLI/Config Arguments
+ :param model_path: Path to save the model
+ :param cid_to_max_p: Dictionary mapping client id to max p-value
+ :param config: Aggregation config
+ :param train_config: Training config
+ :param device: Device to be used
+ :return: Client function that returns Flower client
+ """
+
+ def client_fn(cid) -> FjORDClient:
+ max_p = cid_to_max_p[int(cid)]
+ log_config = {
+ "loglevel": args.loglevel,
+ "logfile": args.logfile,
+ }
+ return FjORDClient(
+ cid=cid,
+ model_name=args.model,
+ data_path=args.data_path,
+ model_path=model_path,
+ know_distill=args.knowledge_distillation,
+ max_p=max_p,
+ p_s=args.p_s,
+ fjord_config=config,
+ train_config=train_config,
+ log_config=log_config,
+ seed=args.manual_seed,
+ device=device,
+ )
+
+ return client_fn
+
+
+class FjORDBalancedClientManager(fl.server.SimpleClientManager):
+ """Balanced client manager for FjORD.
+
+    Samples an equal number of clients per p-value tier and assigns the
+    remainder in round-robin fashion.
+ """
+
+ def __init__(self, cid_to_max_p: Dict[int, float]) -> None:
+        """Initialise the balanced client manager.
+
+ Args:
+ :param cid_to_max_p: Dictionary mapping client id to max p-value
+ """
+ super().__init__()
+ self.cid_to_max_p = cid_to_max_p
+ self.p_s = sorted(set(self.cid_to_max_p.values()))
+
+ def sample(
+ self,
+ num_clients: int,
+ min_num_clients: Optional[int] = None,
+ criterion: Optional[fl.server.criterion.Criterion] = None,
+ ) -> List[fl.server.client_proxy.ClientProxy]:
+ """Sample clients in a balanced way (equal per tier, remainder in Round-Robin).
+
+ Args:
+ :param num_clients: Number of clients to sample
+ :param min_num_clients: Minimum number of clients to sample
+ :param criterion: Client selection criterion
+ :return: List of sampled clients
+ """
+ if min_num_clients is None:
+ min_num_clients = num_clients
+ self.wait_for(min_num_clients)
+ available_cids = list(self.clients)
+ if criterion is not None:
+ available_cids = [
+ cid for cid in available_cids if criterion.select(self.clients[cid])
+ ]
+ if num_clients > len(available_cids):
+ Logger.get().info(
+ "Sampling failed: number of available clients"
+ " (%s) is less than number of requested clients (%s).",
+ len(available_cids),
+ num_clients,
+ )
+ return []
+
+ # construct p to available cids
+ max_p_to_cids: Dict[float, List[int]] = {p: [] for p in self.p_s}
+ random.shuffle(available_cids)
+ for cid_s in available_cids:
+ client_id = int(cid_s)
+ client_p = self.cid_to_max_p[client_id]
+ max_p_to_cids[client_p].append(client_id)
+
+ cl_per_tier = math.floor(num_clients / len(self.p_s))
+ remainder = num_clients - cl_per_tier * len(self.p_s)
+
+ selected_cids = set()
+ for p in self.p_s:
+ for cid in random.sample(max_p_to_cids[p], cl_per_tier):
+ selected_cids.add(cid)
+
+ for p in self.p_s:
+ if remainder == 0:
+ break
+ cid = random.choice(max_p_to_cids[p])
+            while cid in selected_cids:  # resample until an unselected client is found
+ cid = random.choice(max_p_to_cids[p])
+ selected_cids.add(cid)
+ remainder -= 1
+
+ Logger.get().debug(f"Sampled {selected_cids}")
+ return [self.clients[str(cid)] for cid in selected_cids]
+
+
+def main(args: Any) -> None:
+    """Run FjORD training via Flower simulation.
+
+ Args:
+ :param args: CLI/Config Arguments
+ """
+ torch.manual_seed(args.manual_seed)
+ torch.use_deterministic_algorithms(True)
+ np.random.seed(args.manual_seed)
+ random.seed(args.manual_seed)
+
+ path = args.data_path
+ device = torch.device("cuda") if args.cuda else torch.device("cpu")
+ model_path = hydra.core.hydra_config.HydraConfig.get().runtime.output_dir
+
+ Logger.get().info(
+ f"Training on {device} using PyTorch "
+ f"{torch.__version__} and Flower {fl.__version__}"
+ )
+
+ trainloader, testloader = load_data(
+ path, cid=0, seed=args.manual_seed, train_bs=args.batch_size
+ )
+ NUM_CLIENTS = args.num_clients
+ if args.client_tier_allocation == "uniform":
+        tier_size = NUM_CLIENTS // len(args.p_s)
+        cid_to_max_p = {
+            cid: args.p_s[min(cid // tier_size, len(args.p_s) - 1)]
+            for cid in range(NUM_CLIENTS)
+        }
+ else:
+ raise ValueError(
+ f"Client to tier allocation strategy "
+ f"{args.client_tier_allocation} not currently"
+ "supported"
+ )
+
+ model = get_net(args.model, args.p_s, device=device)
+ config = get_agg_config(model, trainloader, args.p_s)
+ train_config = SimpleNamespace(
+ **{
+ "batch_size": args.batch_size,
+ "lr": args.lr,
+ "optimiser": args.optimiser,
+ "momentum": args.momentum,
+ "nesterov": args.nesterov,
+ "lr_scheduler": args.lr_scheduler,
+ "weight_decay": args.weight_decay,
+ "local_epochs": args.local_epochs,
+ }
+ )
+
+ if args.strategy == "fjord_fedavg":
+ strategy = FjORDFedAVG(
+ fraction_fit=args.sampled_clients / args.num_clients,
+ fraction_evaluate=0.0,
+ min_fit_clients=args.min_fit_clients,
+ min_evaluate_clients=1,
+ min_available_clients=NUM_CLIENTS,
+ evaluate_fn=get_eval_fn(args, model_path, testloader, device),
+ on_fit_config_fn=get_fit_config_fn(args.num_rounds, args.lr),
+ initial_parameters=fl.common.ndarrays_to_parameters(
+ get_parameters(get_net(args.model, args.p_s, device=device))
+ ),
+ )
+ else:
+ raise ValueError(f"Strategy {args.strategy} is not currently supported")
+
+ client_resources = args.client_resources
+ if device.type != "cuda":
+ client_resources = {
+ "num_cpus": args.client_resources["num_cpus"],
+ "num_gpus": 0,
+ }
+
+ if args.client_selection == "balanced":
+ cl_manager = FjORDBalancedClientManager(cid_to_max_p)
+ elif args.client_selection == "random":
+ cl_manager = None
+ else:
+ raise ValueError(
+ f"Client selection {args.client_selection} is not currently supported"
+ )
+
+ Logger.get().info("Starting simulated run.")
+ # Start simulation
+ fl.simulation.start_simulation(
+ client_fn=get_client_fn(
+ args, model_path, cid_to_max_p, config, train_config, device
+ ),
+ num_clients=NUM_CLIENTS,
+ config=fl.server.ServerConfig(num_rounds=args.num_rounds),
+ strategy=strategy,
+ client_resources=client_resources,
+ client_manager=cl_manager,
+ ray_init_args={"include_dashboard": False},
+ )
+
+
+@hydra.main(version_base=None, config_path="conf", config_name="config")
+def run_app(cfg):
+ """Run the application.
+
+ Args:
+ :param cfg: Hydra configuration
+ """
+ OmegaConf.resolve(cfg)
+ logfile = os.path.join(
+ hydra.core.hydra_config.HydraConfig.get()["runtime"]["output_dir"], cfg.logfile
+ )
+ with open_dict(cfg):
+ cfg.logfile = logfile
+ Logger.setup_logging(loglevel=cfg.loglevel, logfile=logfile)
+ Logger.get().info(f"Hydra configuration: {OmegaConf.to_yaml(cfg)}")
+ main(cfg)
+
+
+if __name__ == "__main__":
+ run_app()
diff --git a/baselines/fjord/fjord/models.py b/baselines/fjord/fjord/models.py
new file mode 100644
index 000000000000..0f3fc276decf
--- /dev/null
+++ b/baselines/fjord/fjord/models.py
@@ -0,0 +1,319 @@
+"""ResNet model for FjORD."""
+from types import SimpleNamespace
+from typing import List, Optional, Tuple
+
+import torch
+import torch.nn.functional as F
+from torch import nn
+from torch.nn import Module
+from torch.optim import Optimizer
+from torch.optim.lr_scheduler import MultiStepLR
+from torch.utils.data import DataLoader
+from tqdm import tqdm
+
+from .od.models.utils import (
+ SequentialWithSampler,
+ create_bn_layer,
+ create_conv_layer,
+ create_linear_layer,
+)
+from .od.samplers import BaseSampler, ODSampler
+
+
+class BasicBlock(nn.Module):
+ """Basic Block for resnet."""
+
+ expansion = 1
+
+ def __init__(
+ self, od, p_s, in_planes, planes, stride=1
+ ): # pylint: disable=too-many-arguments
+ super().__init__()
+ self.od = od
+ self.conv1 = create_conv_layer(
+ od,
+ True,
+ in_planes,
+ planes,
+ kernel_size=3,
+ stride=stride,
+ padding=1,
+ bias=False,
+ )
+ self.bn1 = create_bn_layer(od=od, p_s=p_s, num_features=planes)
+ self.conv2 = create_conv_layer(
+ od, True, planes, planes, kernel_size=3, stride=1, padding=1, bias=False
+ )
+ self.bn2 = create_bn_layer(od=od, p_s=p_s, num_features=planes)
+
+ self.shortcut = SequentialWithSampler()
+ if stride != 1 or in_planes != self.expansion * planes:
+ self.shortcut = SequentialWithSampler(
+ create_conv_layer(
+ od,
+ True,
+ in_planes,
+ self.expansion * planes,
+ kernel_size=1,
+ stride=stride,
+ bias=False,
+ ),
+ create_bn_layer(od=od, p_s=p_s, num_features=self.expansion * planes),
+ )
+
+ def forward(self, x, sampler):
+ """Forward method for basic block.
+
+ Args:
+ :param x: input
+ :param sampler: sampler
+ :return: Output of forward pass
+ """
+ if sampler is None:
+ out = F.relu(self.bn1(self.conv1(x)))
+ out = self.bn2(self.conv2(out))
+ out += self.shortcut(x)
+ out = F.relu(out)
+ else:
+ out = F.relu(self.bn1(self.conv1(x, p=sampler())))
+ out = self.bn2(self.conv2(out, p=sampler()))
+ shortcut = self.shortcut(x, sampler=sampler)
+ assert (
+ shortcut.shape == out.shape
+ ), f"Shortcut shape: {shortcut.shape} out.shape: {out.shape}"
+ out += shortcut
+ out = F.relu(out)
+ return out
+
+
+# Adapted from:
+# https://github.com/kuangliu/pytorch-cifar/blob/master/models/resnet.py
+class ResNet(nn.Module): # pylint: disable=too-many-instance-attributes
+ """ResNet in PyTorch.
+
+ Reference:
+ [1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
+ Deep Residual Learning for Image Recognition. arXiv:1512.03385
+ """
+
+ def __init__(
+ self, od, p_s, block, num_blocks, num_classes=10
+ ): # pylint: disable=too-many-arguments
+ super().__init__()
+ self.od = od
+ self.in_planes = 64
+
+ self.conv1 = create_conv_layer(
+ od, True, 3, 64, kernel_size=3, stride=1, padding=1, bias=False
+ )
+ self.bn1 = create_bn_layer(od=od, p_s=p_s, num_features=64)
+ self.layer1 = self._make_layer(od, p_s, block, 64, num_blocks[0], stride=1)
+ self.layer2 = self._make_layer(od, p_s, block, 128, num_blocks[1], stride=2)
+ self.layer3 = self._make_layer(od, p_s, block, 256, num_blocks[2], stride=2)
+ self.layer4 = self._make_layer(od, p_s, block, 512, num_blocks[3], stride=2)
+ self.linear = create_linear_layer(od, False, 512 * block.expansion, num_classes)
+
+ def _make_layer(
+ self, od, p_s, block, planes, num_blocks, stride
+ ): # pylint: disable=too-many-arguments
+ strides = [stride] + [1] * (num_blocks - 1)
+ layers = []
+ for strd in strides:
+ layers.append(block(od, p_s, self.in_planes, planes, strd))
+ self.in_planes = planes * block.expansion
+ return SequentialWithSampler(*layers)
+
+ def forward(self, x, sampler=None):
+ """Forward method for ResNet.
+
+ Args:
+ :param x: input
+ :param sampler: sampler
+ :return: Output of forward pass
+ """
+ if self.od:
+ if sampler is None:
+ sampler = BaseSampler(self)
+ out = F.relu(self.bn1(self.conv1(x, p=sampler())))
+ out = self.layer1(out, sampler=sampler)
+ out = self.layer2(out, sampler=sampler)
+ out = self.layer3(out, sampler=sampler)
+ out = self.layer4(out, sampler=sampler)
+ out = F.avg_pool2d(out, 4) # pylint: disable=not-callable
+ out = out.view(out.size(0), -1)
+ out = self.linear(out)
+ else:
+ out = F.relu(self.bn1(self.conv1(x)))
+ out = self.layer1(out)
+ out = self.layer2(out)
+ out = self.layer3(out)
+ out = self.layer4(out)
+ out = F.avg_pool2d(out, 4) # pylint: disable=not-callable
+ out = out.view(out.size(0), -1)
+ out = self.linear(out)
+ return out
+
+
+def ResNet18(od=False, p_s=(1.0,)):
+ """Construct a ResNet-18 model.
+
+ Args:
+ :param od: whether to create OD (Ordered Dropout) layer
+ :param p_s: list of p-values
+ """
+ return ResNet(od, p_s, BasicBlock, [2, 2, 2, 2])
+
+
+def get_net(
+ model_name: str,
+ p_s: List[float],
+ device: torch.device,
+) -> torch.nn.Module:
+ """Initialise model.
+
+ :param model_name: name of the model
+ :param p_s: list of p-values
+ :param device: device to be used
+ :return: initialised model
+ """
+ if model_name == "resnet18":
+ net = ResNet18(od=True, p_s=p_s).to(device)
+ else:
+ raise ValueError(f"Model {model_name} is not supported")
+
+ return net
+
+
+def train( # pylint: disable=too-many-locals, too-many-arguments
+ net: Module,
+ trainloader: DataLoader,
+ know_distill: bool,
+ max_p: float,
+ current_round: int,
+ total_rounds: int,
+ p_s: List[float],
+ epochs: int,
+ train_config: SimpleNamespace,
+) -> float:
+ """Train the model on the training set.
+
+ :param net: The model to train.
+ :param trainloader: The training set.
+ :param know_distill: Whether the model being trained uses knowledge distillation.
+ :param max_p: The maximum p value.
+ :param current_round: The current round of training.
+ :param total_rounds: The total number of rounds of training.
+ :param p_s: The p values to use for training.
+ :param epochs: The number of epochs to train for.
+ :param train_config: The training configuration.
+ :return: The loss on the training set.
+ """
+ device = next(net.parameters()).device
+ criterion = torch.nn.CrossEntropyLoss()
+ net.train()
+ if train_config.optimiser == "sgd":
+ optimizer = torch.optim.SGD(
+ net.parameters(),
+ lr=train_config.lr,
+ momentum=train_config.momentum,
+ nesterov=train_config.nesterov,
+ weight_decay=train_config.weight_decay,
+ )
+ else:
+ raise ValueError(f"Optimiser {train_config.optimiser} not supported")
+ lr_scheduler = get_lr_scheduler(
+ optimizer, total_rounds, method=train_config.lr_scheduler
+ )
+ for _ in range(current_round):
+ lr_scheduler.step()
+
+ sampler = ODSampler(
+ p_s=p_s,
+ max_p=max_p,
+ model=net,
+ )
+ max_sampler = ODSampler(
+ p_s=[max_p],
+ max_p=max_p,
+ model=net,
+ )
+
+ loss = 0.0
+ samples = 0
+ for _ in range(epochs):
+ for images, labels in trainloader:
+ optimizer.zero_grad()
+ target = labels.to(device)
+ images = images.to(device)
+ batch_size = images.shape[0]
+ if know_distill:
+                full_output = net(images, sampler=max_sampler)
+ full_loss = criterion(full_output, target)
+ full_loss.backward()
+ target = full_output.detach().softmax(dim=1)
+ partial_loss = criterion(net(images, sampler=sampler), target)
+ partial_loss.backward()
+ optimizer.step()
+ loss += partial_loss.item() * batch_size
+ samples += batch_size
+
+ return loss / samples
+
+
+def test(
+ net: Module, testloader: DataLoader, p_s: List[float]
+) -> Tuple[List[float], List[float]]:
+ """Validate the model on the test set.
+
+ :param net: The model to validate.
+ :param testloader: The test set.
+ :param p_s: The p values to use for validation.
+ :return: The loss and accuracy on the test set.
+ """
+ device = next(net.parameters()).device
+ criterion = torch.nn.CrossEntropyLoss()
+ losses = []
+ accuracies = []
+ net.eval()
+
+ for p in p_s:
+ correct, loss = 0, 0.0
+ p_sampler = ODSampler(
+ p_s=[p],
+ max_p=p,
+ model=net,
+ )
+
+ with torch.no_grad():
+ for images, labels in tqdm(testloader):
+ outputs = net(images.to(device), sampler=p_sampler)
+ labels = labels.to(device)
+ loss += criterion(outputs, labels).item() * images.shape[0]
+ correct += (torch.max(outputs.data, 1)[1] == labels).sum().item()
+ accuracy = correct / len(testloader.dataset)
+ losses.append(loss / len(testloader.dataset))
+ accuracies.append(accuracy)
+
+ return losses, accuracies
+
+
+def get_lr_scheduler(
+ optimiser: Optimizer,
+ total_epochs: int,
+ method: Optional[str] = "static",
+) -> torch.optim.lr_scheduler.LRScheduler:
+ """Get the learning rate scheduler.
+
+ :param optimiser: The optimiser for which to get the scheduler.
+ :param total_epochs: The total number of epochs.
+ :param method: The method to use for the scheduler. Supports static and cifar10.
+ :return: The learning rate scheduler.
+ """
+ if method == "static":
+ return MultiStepLR(optimiser, [total_epochs + 1])
+ if method == "cifar10":
+ return MultiStepLR(
+ optimiser, [int(0.5 * total_epochs), int(0.75 * total_epochs)], gamma=0.1
+ )
+ raise ValueError(f"{method} scheduler not currently supported.")
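The `cifar10` branch of `get_lr_scheduler` can be sanity-checked in isolation; a small sketch (500 rounds, as in the baseline config) showing where the decays land:

```python
import torch
from torch.optim.lr_scheduler import MultiStepLR

# "cifar10" mode: decay the LR by 10x at 50% and 75% of total rounds.
total_rounds = 500
param = torch.nn.Parameter(torch.zeros(1))
opt = torch.optim.SGD([param], lr=0.1)
sched = MultiStepLR(
    opt, [int(0.5 * total_rounds), int(0.75 * total_rounds)], gamma=0.1
)

lrs = []
for _ in range(total_rounds):
    lrs.append(opt.param_groups[0]["lr"])  # LR used in this round
    opt.step()
    sched.step()
```

The `static` mode, by contrast, places its single milestone at `total_epochs + 1`, so the LR never actually decays during training.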
diff --git a/baselines/fjord/fjord/od/__init__.py b/baselines/fjord/fjord/od/__init__.py
new file mode 100644
index 000000000000..f2b055c479f2
--- /dev/null
+++ b/baselines/fjord/fjord/od/__init__.py
@@ -0,0 +1 @@
+"""Ordered dropout package."""
diff --git a/baselines/fjord/fjord/od/layers/__init__.py b/baselines/fjord/fjord/od/layers/__init__.py
new file mode 100644
index 000000000000..a87c70401d4c
--- /dev/null
+++ b/baselines/fjord/fjord/od/layers/__init__.py
@@ -0,0 +1,6 @@
+"""Ordered Dropout layers."""
+from .batch_norm import ODBatchNorm2d
+from .conv import ODConv2d
+from .linear import ODLinear
+
+__all__ = ["ODBatchNorm2d", "ODConv2d", "ODLinear"]
diff --git a/baselines/fjord/fjord/od/layers/batch_norm.py b/baselines/fjord/fjord/od/layers/batch_norm.py
new file mode 100644
index 000000000000..5fce4dff0910
--- /dev/null
+++ b/baselines/fjord/fjord/od/layers/batch_norm.py
@@ -0,0 +1,75 @@
+"""BatchNorm using Ordered Dropout."""
+from typing import List, Optional
+
+import numpy as np
+import torch
+from torch import Tensor, nn
+
+__all__ = ["ODBatchNorm2d"]
+
+
+class ODBatchNorm2d(nn.Module): # pylint: disable=too-many-instance-attributes
+ """Ordered Dropout BatchNorm2d."""
+
+ def __init__(
+ self,
+ *args,
+ p_s: List[float],
+ num_features: int,
+ affine: Optional[bool] = True,
+ **kwargs,
+ ) -> None:
+ super().__init__()
+ self.p_s = p_s
+ self.is_od = False # no sampling is happening here
+ self.num_features = num_features
+ self.num_features_s = [int(np.ceil(num_features * p)) for p in p_s]
+ self.p_to_num_features = dict(zip(p_s, self.num_features_s))
+ self.width = np.max(self.num_features_s)
+ self.last_input_dim = None
+
+ self.bn = nn.ModuleDict(
+ {
+ str(num_features): nn.BatchNorm2d(
+ num_features, *args, **kwargs, affine=False
+ )
+ for num_features in self.num_features_s
+ }
+ )
+
+        # a single set of affine parameters is shared across all widths
+        if affine:
+            self.affine = True
+            self.weight = nn.Parameter(torch.Tensor(self.width, 1, 1))
+            self.bias = nn.Parameter(torch.Tensor(self.width, 1, 1))
+
+            self.reset_parameters()
+        else:
+            self.affine = False
+
+ # get p into the layer
+ for m, p in zip(self.bn, self.p_s):
+ self.bn[m].p = p
+ self.bn[m].num_batches_tracked = torch.tensor(1, dtype=torch.long)
+
+ def reset_parameters(self):
+ """Reset parameters."""
+ if self.affine:
+ nn.init.ones_(self.weight)
+ nn.init.zeros_(self.bias)
+ for m in self.bn:
+ self.bn[m].reset_parameters()
+
+ def forward(self, x: Tensor) -> Tensor:
+ """Forward pass.
+
+ Args:
+ :param x: Input tensor.
+ :return: Output of forward pass.
+ """
+ in_dim = x.size(1) # second dimension is input dimension
+ assert (
+ in_dim in self.num_features_s
+ ), "input dimension not in selected num_features_s"
+ out = self.bn[str(in_dim)](x)
+ if self.affine:
+ out = out * self.weight[:in_dim] + self.bias[:in_dim]
+ return out
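`ODBatchNorm2d` keeps one independent `BatchNorm2d` per width `ceil(num_features * p)`, because running statistics computed at one width are not valid at another. The width bookkeeping can be sketched without torch (`widths_per_p` is our name; the diff uses `np.ceil`, which agrees with `math.ceil` here):

```python
import math

def widths_per_p(num_features, p_s):
    # one BatchNorm instance (and one set of running stats) per kept width
    return {p: math.ceil(num_features * p) for p in p_s}
```

For 64 channels and `p_s = [0.25, 0.5, 1.0]`, the layer holds separate statistics for the first 16, 32, and 64 channels, while the affine `weight`/`bias` are shared and simply sliced to the active width.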
diff --git a/baselines/fjord/fjord/od/layers/conv.py b/baselines/fjord/fjord/od/layers/conv.py
new file mode 100644
index 000000000000..544f3a578418
--- /dev/null
+++ b/baselines/fjord/fjord/od/layers/conv.py
@@ -0,0 +1,140 @@
+"""Convolutional layer using Ordered Dropout."""
+from typing import Optional, Tuple, Union
+
+import numpy as np
+from torch import Tensor, nn
+from torch.nn import Module
+
+from .utils import check_layer
+
+__all__ = ["ODConv1d", "ODConv2d", "ODConv3d"]
+
+
+def od_conv_forward(
+    layer: Module, x: Tensor, p: Optional[Union[Tuple[float, Module], float]] = None
+) -> Tensor:
+ """Ordered dropout forward pass for convolution networks.
+
+ Args:
+ :param layer: The layer being forwarded.
+ :param x: Input tensor.
+    :param p: The p-value, or a (p, layer) tuple.
+ :return: Output of forward pass.
+ """
+ p = check_layer(layer, p)
+ if not layer.is_od and p is not None:
+ raise ValueError("p must be None if is_od is False")
+ in_dim = x.size(1) # second dimension is input dimension
+ layer.last_input_dim = in_dim
+ if not p: # i.e., don't apply OD
+ out_dim = layer.width
+ else:
+ out_dim = int(np.ceil(layer.width * p))
+ layer.last_output_dim = out_dim
+ # subsampled weights and bias
+ weights_red = layer.weight[:out_dim, :in_dim]
+ bias_red = layer.bias[:out_dim] if layer.bias is not None else None
+ return layer._conv_forward( # pylint: disable=protected-access
+ x, weights_red, bias_red
+ )
+
+
+def get_slice(layer: Module, in_dim: int, out_dim: int) -> Tuple[Tensor, Tensor]:
+ """Get slice of weights and bias.
+
+ Args:
+ :param layer: The layer.
+ :param in_dim: The input dimension.
+ :param out_dim: The output dimension.
+ :return: The slice of weights and bias.
+ """
+    weight_slice = layer.weight[:out_dim, :in_dim]
+ bias_slice = layer.bias[:out_dim] if layer.bias is not None else None
+ return weight_slice, bias_slice
+
+
+class ODConv1d(nn.Conv1d):
+ """Ordered Dropout Conv1d."""
+
+ def __init__(self, *args, is_od: bool = True, **kwargs) -> None:
+ self.is_od = is_od
+ super().__init__(*args, **kwargs)
+ self.width = self.out_channels
+ self.last_input_dim = None
+ self.last_output_dim = None
+
+ def forward( # pylint: disable=arguments-differ
+ self,
+ input: Tensor, # pylint: disable=redefined-builtin
+        p: Optional[Union[Tuple[float, Module], float]] = None,
+ ) -> Tensor:
+ """Forward pass.
+
+ Args:
+ :param input: Input tensor.
+        :param p: The p-value, or a (p, layer) tuple.
+ :return: Output of forward pass.
+ """
+ return od_conv_forward(self, input, p)
+
+ def get_slice(self, *args, **kwargs) -> Tuple[Tensor, Tensor]:
+ """Get slice of weights and bias."""
+ return get_slice(self, *args, **kwargs)
+
+
+class ODConv2d(nn.Conv2d):
+ """Ordered Dropout Conv2d."""
+
+ def __init__(self, *args, is_od: bool = True, **kwargs) -> None:
+ self.is_od = is_od
+ super().__init__(*args, **kwargs)
+ self.width = self.out_channels
+ self.last_input_dim = None
+ self.last_output_dim = None
+
+ def forward( # pylint: disable=arguments-differ
+ self,
+ input: Tensor, # pylint: disable=redefined-builtin
+        p: Optional[Union[Tuple[float, Module], float]] = None,
+ ) -> Tensor:
+ """Forward pass.
+
+ Args:
+ :param input: Input tensor.
+        :param p: The p-value, or a (p, layer) tuple.
+ :return: Output of forward pass.
+ """
+ return od_conv_forward(self, input, p)
+
+ def get_slice(self, *args, **kwargs) -> Tuple[Tensor, Tensor]:
+ """Get slice of weights and bias."""
+ return get_slice(self, *args, **kwargs)
+
+
+class ODConv3d(nn.Conv3d):
+ """Ordered Dropout Conv3d."""
+
+ def __init__(self, *args, is_od: bool = True, **kwargs) -> None:
+ self.is_od = is_od
+ super().__init__(*args, **kwargs)
+ self.width = self.out_channels
+ self.last_input_dim = None
+ self.last_output_dim = None
+
+ def forward( # pylint: disable=arguments-differ
+ self,
+ input: Tensor, # pylint: disable=redefined-builtin
+        p: Optional[Union[Tuple[float, Module], float]] = None,
+ ) -> Tensor:
+ """Forward pass.
+
+ Args:
+ :param input: Input tensor.
+        :param p: The p-value, or a (p, layer) tuple.
+ :return: Output of forward pass.
+ """
+ return od_conv_forward(self, input, p)
+
+ def get_slice(self, *args, **kwargs) -> Tuple[Tensor, Tensor]:
+ """Get slice of weights and bias."""
+ return get_slice(self, *args, **kwargs)
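The slicing in `od_conv_forward` is pure shape arithmetic: at rate `p`, only the first `ceil(out_channels * p)` filters are kept, and the input side is cut to the incoming tensor's channel count. A standalone sketch (`sliced_shape` is our own helper, not part of the diff; conv weights are laid out `(out, in, *kernel)`):

```python
import math

def sliced_shape(weight_shape, in_dim, p):
    """Shape of the reduced conv weight used for a forward pass at rate p."""
    out_channels = weight_shape[0]
    # p=None (or 0) means ordered dropout is not applied: keep the full width
    out_dim = math.ceil(out_channels * p) if p else out_channels
    return (out_dim, in_dim) + tuple(weight_shape[2:])
```

A `(64, 32, 3, 3)` Conv2d weight at `p=0.5` with 16 active input channels is reduced to `(32, 16, 3, 3)`; because the slice always starts at index 0, smaller widths are strict prefixes of larger ones, which is what makes the nested federated averaging in `strategy.py` possible.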
diff --git a/baselines/fjord/fjord/od/layers/linear.py b/baselines/fjord/fjord/od/layers/linear.py
new file mode 100644
index 000000000000..927ae4c8d516
--- /dev/null
+++ b/baselines/fjord/fjord/od/layers/linear.py
@@ -0,0 +1,62 @@
+"""Linear layer using Ordered Dropout."""
+from typing import Optional, Tuple, Union
+
+import numpy as np
+import torch.nn.functional as F
+from torch import Tensor, nn
+from torch.nn import Module
+
+from .utils import check_layer
+
+__all__ = ["ODLinear"]
+
+
+class ODLinear(nn.Linear):
+ """Ordered Dropout Linear."""
+
+ def __init__(self, *args, is_od: bool = True, **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.is_od = is_od
+ self.width = self.out_features
+ self.last_input_dim = None
+ self.last_output_dim = None
+
+ def forward( # pylint: disable=arguments-differ
+ self,
+ input: Tensor, # pylint: disable=redefined-builtin
+        p: Optional[Union[Tuple[float, Module], float]] = None,
+ ) -> Tensor:
+ """Forward pass.
+
+ Args:
+ :param input: Input tensor.
+        :param p: The p-value, or a (p, layer) tuple.
+ :return: Output of forward pass.
+ """
+ if not self.is_od and p is not None:
+ raise ValueError("p must be None if is_od is False")
+ p = check_layer(self, p)
+ in_dim = input.size(1) # second dimension is input dimension
+ self.last_input_dim = in_dim
+ if not p: # i.e., don't apply OD
+ out_dim = self.width
+ else:
+ out_dim = int(np.ceil(self.width * p))
+ self.last_output_dim = out_dim
+ # subsampled weights and bias
+ weights_red = self.weight[:out_dim, :in_dim]
+ bias_red = self.bias[:out_dim] if self.bias is not None else None
+ return F.linear(input, weights_red, bias_red) # pylint: disable=not-callable
+
+ def get_slice(self, in_dim: int, out_dim: int) -> Tuple[Tensor, Tensor]:
+ """Get slice of weights and bias.
+
+ Args:
+ :param in_dim: The input dimension.
+ :param out_dim: The output dimension.
+ :return: The slice of weights and bias.
+ """
+        weight_slice = self.weight[:out_dim, :in_dim]
+ bias_slice = self.bias[:out_dim] if self.bias is not None else None
+ return weight_slice, bias_slice
diff --git a/baselines/fjord/fjord/od/layers/utils.py b/baselines/fjord/fjord/od/layers/utils.py
new file mode 100644
index 000000000000..46649a51de96
--- /dev/null
+++ b/baselines/fjord/fjord/od/layers/utils.py
@@ -0,0 +1,23 @@
+"""Utils function for Ordered Dropout layers."""
+from typing import Optional, Tuple, Union
+
+from torch.nn import Module
+
+
+def check_layer(
+    layer: Module, p: Union[Tuple[Optional[float], Module], Optional[float]]
+) -> Optional[float]:
+ """Check if layer is valid and return p.
+
+ Args:
+ layer: PyTorch layer
+ p: Ordered dropout p
+ """
+ # if p is tuple, check layer validity
+ if isinstance(p, tuple):
+ p_, sampled_layer = p
+ assert layer == sampled_layer, "Layer mismatch"
+ else:
+ p_ = p
+
+ return p_
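`check_layer` accepts either a bare `p` or a `(p, layer)` tuple, the latter produced by a sampler created with `with_layer=True`; the tuple form lets a layer assert that the sample it consumes was drawn for it. A minimal standalone sketch of the same logic (`unpack_p` is our own name):

```python
def unpack_p(layer, p):
    """Return the p-value, verifying the target layer if one was attached."""
    if isinstance(p, tuple):
        p_value, sampled_layer = p
        assert layer is sampled_layer, "Layer mismatch"
        return p_value
    return p
```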
diff --git a/baselines/fjord/fjord/od/models/__init__.py b/baselines/fjord/fjord/od/models/__init__.py
new file mode 100644
index 000000000000..b0e5ede4f93b
--- /dev/null
+++ b/baselines/fjord/fjord/od/models/__init__.py
@@ -0,0 +1 @@
+"""Functions for creating OD models."""
diff --git a/baselines/fjord/fjord/od/models/utils.py b/baselines/fjord/fjord/od/models/utils.py
new file mode 100644
index 000000000000..4a1707587ef4
--- /dev/null
+++ b/baselines/fjord/fjord/od/models/utils.py
@@ -0,0 +1,77 @@
+"""Utility functions for models."""
+from torch import nn
+
+from ..layers import ODBatchNorm2d, ODConv2d, ODLinear
+
+
+def create_linear_layer(od, is_od, *args, **kwargs):
+ """Create linear layer.
+
+    :param od: whether to create an Ordered Dropout layer
+    :param is_od: whether ordered dropout sampling is applied in the layer
+ :param args: arguments for nn.Linear
+ :param kwargs: keyword arguments for nn.Linear
+ :return: nn.Linear or ODLinear
+ """
+ if od:
+ return ODLinear(*args, is_od=is_od, **kwargs)
+
+ return nn.Linear(*args, **kwargs)
+
+
+def create_conv_layer(od, is_od, *args, **kwargs):
+ """Create conv layer.
+
+    :param od: whether to create an Ordered Dropout layer
+    :param is_od: whether ordered dropout sampling is applied in the layer
+ :param args: arguments for nn.Conv2d
+ :param kwargs: keyword arguments for nn.Conv2d
+ :return: nn.Conv2d or ODConv2d
+ """
+ if od:
+ return ODConv2d(*args, is_od=is_od, **kwargs)
+
+ return nn.Conv2d(*args, **kwargs)
+
+
+def create_bn_layer(od, p_s, *args, **kwargs):
+ """Create batch norm layer.
+
+ :param od: whether to create OD layer
+ :param p_s: list of p-values
+ :param args: arguments for nn.BatchNorm2d
+ :param kwargs: keyword arguments for nn.BatchNorm2d
+ :return: nn.BatchNorm2d or ODBatchNorm2d
+ """
+ if od:
+        num_features = kwargs.pop("num_features")
+ return ODBatchNorm2d(*args, p_s=p_s, num_features=num_features, **kwargs)
+
+ return nn.BatchNorm2d(*args, **kwargs)
+
+
+class SequentialWithSampler(nn.Sequential):
+ """Implements sequential model with sampler."""
+
+ def forward(
+ self, input, sampler=None
+ ): # pylint: disable=redefined-builtin, arguments-differ
+ """Forward method for custom Sequential.
+
+ :param input: input
+ :param sampler: the sampler to use.
+ :return: Output of sequential
+ """
+ if sampler is None:
+ for module in self:
+ input = module(input)
+ else:
+ for module in self:
+ if hasattr(module, "od") and module.od:
+ input = module(input, sampler=sampler)
+ elif hasattr(module, "is_od") and module.is_od:
+ input = module(input, p=sampler())
+ else:
+ input = module(input)
+ return input
diff --git a/baselines/fjord/fjord/od/samplers/__init__.py b/baselines/fjord/fjord/od/samplers/__init__.py
new file mode 100644
index 000000000000..dad08b4236c4
--- /dev/null
+++ b/baselines/fjord/fjord/od/samplers/__init__.py
@@ -0,0 +1,5 @@
+"""OD samplers."""
+from .base_sampler import BaseSampler
+from .fixed_od import ODSampler
+
+__all__ = ["BaseSampler", "ODSampler"]
diff --git a/baselines/fjord/fjord/od/samplers/base_sampler.py b/baselines/fjord/fjord/od/samplers/base_sampler.py
new file mode 100644
index 000000000000..28eac929df81
--- /dev/null
+++ b/baselines/fjord/fjord/od/samplers/base_sampler.py
@@ -0,0 +1,49 @@
+"""Base sampler class."""
+from collections.abc import Generator
+
+from torch.nn import Module
+
+
+class BaseSampler:
+ """Base class implementing p-value sampling per layer."""
+
+ def __init__(self, model: Module, with_layer: bool = False) -> None:
+ """Initialise sampler.
+
+ :param model: OD model
+ :param with_layer: whether to return layer upon call.
+ """
+ self.model = model
+ self.with_layer = with_layer
+ self.prepare_sampler()
+ self.width_samples = self.width_sampler()
+ self.layer_samples = self.layer_sampler()
+
+ def prepare_sampler(self) -> None:
+ """Prepare sampler."""
+ self.num_od_layers = 0
+ self.widths = []
+ self.od_layers = []
+ for m in self.model.modules():
+ if hasattr(m, "is_od") and m.is_od:
+ self.num_od_layers += 1
+ self.widths.append(m.width)
+ self.od_layers.append(m)
+
+ def width_sampler(self) -> Generator: # pylint: disable=no-self-use
+ """Sample width."""
+ while True:
+ yield None
+
+ def layer_sampler(self) -> Module:
+ """Sample layer."""
+ while True:
+ for m in self.od_layers:
+ yield m
+
+ def __call__(self):
+ """Call sampler."""
+ if self.with_layer:
+ return next(self.width_samples), next(self.layer_samples)
+
+ return next(self.width_samples)
diff --git a/baselines/fjord/fjord/od/samplers/fixed_od.py b/baselines/fjord/fjord/od/samplers/fixed_od.py
new file mode 100644
index 000000000000..b90912a7b5c2
--- /dev/null
+++ b/baselines/fjord/fjord/od/samplers/fixed_od.py
@@ -0,0 +1,27 @@
+"""Ordered Dropout stochastic sampler."""
+from collections.abc import Generator
+from typing import List
+
+import numpy as np
+
+from .base_sampler import BaseSampler
+
+
+class ODSampler(BaseSampler):
+ """Implements OD sampling per layer up to p-max value.
+
+ :param p_s: list of p-values
+ :param max_p: maximum p-value
+ """
+
+ def __init__(self, p_s: List[float], max_p: float, *args, **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.p_s = np.array([p for p in p_s if p <= max_p])
+ self.max_p = max_p
+
+ def width_sampler(self) -> Generator:
+ """Sample width."""
+ while True:
+ p = np.random.choice(self.p_s)
+ for _ in range(self.num_od_layers):
+ yield p
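`ODSampler`'s generator draws one `p <= max_p` per forward pass and repeats it once per OD layer, so every layer in the same pass runs at the same width. A plain-Python re-creation (names and the seeded RNG are ours):

```python
import random

def width_stream(p_s, max_p, num_od_layers, rng):
    """Yield one shared p per OD layer for each simulated forward pass."""
    eligible = [p for p in p_s if p <= max_p]
    while True:
        p = rng.choice(eligible)      # one draw per pass
        for _ in range(num_od_layers):
            yield p                   # repeated for every OD layer

stream = width_stream([0.25, 0.5, 1.0], max_p=0.5, num_od_layers=3,
                      rng=random.Random(0))
first_pass = [next(stream) for _ in range(3)]
```

All three values in `first_pass` are identical and never exceed `max_p`, which models a device whose capability caps its local model width.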
diff --git a/baselines/fjord/fjord/server.py b/baselines/fjord/fjord/server.py
new file mode 100644
index 000000000000..d25e8f17156a
--- /dev/null
+++ b/baselines/fjord/fjord/server.py
@@ -0,0 +1,50 @@
+"""Global evaluation function."""
+from typing import Any, Dict, Optional, Tuple
+
+import flwr as fl
+import torch
+from torch.utils.data import DataLoader
+
+from .models import get_net, test
+from .utils.logger import Logger
+from .utils.utils import save_model, set_parameters
+
+
+def get_eval_fn(
+ args: Any, model_path: str, testloader: DataLoader, device: torch.device
+):
+ """Get evaluation function.
+
+ :param args: Arguments
+ :param model_path: Path to save the model
+ :param testloader: Test data loader
+ :param device: Device to be used
+ :return: Evaluation function
+ """
+
+ def evaluate(
+ server_round: int,
+ parameters: fl.common.NDArrays,
+ config: Dict[str, fl.common.Scalar], # pylint: disable=unused-argument
+ ) -> Optional[Tuple[float, Dict[str, fl.common.Scalar]]]:
+ if server_round and (server_round % args.evaluate_every == 0):
+ net = get_net(args.model, args.p_s, device)
+ set_parameters(net, parameters)
+ # Update model with the latest parameters
+ losses, accuracies = test(net, testloader, args.p_s)
+ avg_loss = sum(losses) / len(losses)
+ for p, loss, accuracy in zip(args.p_s, losses, accuracies):
+ Logger.get().info(
+ f"Server-side evaluation (global round={server_round})"
+ f" {p=}: {loss=} / {accuracy=}"
+ )
+ save_model(net, model_path)
+
+ return avg_loss, {
+ f"Accuracy[{p}]": acc for p, acc in zip(args.p_s, accuracies)
+ }
+
+ Logger.get().debug(f"Evaluation skipped for global round={server_round}.")
+ return float("inf"), {"accuracy": "None"}
+
+ return evaluate
diff --git a/baselines/fjord/fjord/strategy.py b/baselines/fjord/fjord/strategy.py
new file mode 100644
index 000000000000..d3ec99a419bd
--- /dev/null
+++ b/baselines/fjord/fjord/strategy.py
@@ -0,0 +1,235 @@
+"""FjORD strategy."""
+from copy import deepcopy
+from functools import reduce
+from typing import Dict, List, Optional, Tuple, Union
+
+import numpy as np
+from flwr.common import (
+ FitRes,
+ Metrics,
+ NDArrays,
+ Parameters,
+ Scalar,
+ ndarrays_to_parameters,
+ parameters_to_ndarrays,
+)
+from flwr.server.client_proxy import ClientProxy
+from flwr.server.strategy import FedAvg
+
+from .client import FJORD_CONFIG_TYPE
+from .utils.logger import Logger
+
+
+# Define metric aggregation function
+def weighted_average(metrics: List[Tuple[int, Metrics]]) -> Metrics:
+ """Aggregate using weighted average based on number of samples.
+
+ :param metrics: List of tuples (num_examples, metrics)
+ :return: Aggregated metrics
+ """
+ # Multiply accuracy of each client by number of examples used
+ accuracies = np.array([num_examples * m["accuracy"] for num_examples, m in metrics])
+ examples = np.array([num_examples for num_examples, _ in metrics])
+
+ # Aggregate and return custom metric (weighted average)
+ return {"accuracy": accuracies.sum() / examples.sum()}
+
+
+def get_p_layer_updates(
+ p: float,
+ layer_updates: List[np.ndarray],
+ num_examples: List[int],
+ p_max_s: List[float],
+) -> Tuple[List[np.ndarray], int]:
+ """Get layer updates for given p width.
+
+ :param p: p-value
+ :param layer_updates: list of layer updates from clients
+ :param num_examples: list of number of examples from clients
+ :param p_max_s: list of p_max values from clients
+ """
+ # get layers that were updated for given p
+ # i.e., for the clients with p_max >= p
+ layer_updates_p = [
+ layer_update
+ for p_max, layer_update in zip(p_max_s, layer_updates)
+ if p_max >= p
+ ]
+ num_examples_p = sum(n for p_max, n in zip(p_max_s, num_examples) if p_max >= p)
+ return layer_updates_p, num_examples_p
+
+
+def fjord_average( # pylint: disable=too-many-arguments
+ i: int,
+ layer_updates: List[np.ndarray],
+ num_examples: List[int],
+ p_max_s: List[float],
+ p_s: List[float],
+ fjord_config: FJORD_CONFIG_TYPE,
+ original_parameters: List[np.ndarray],
+) -> np.ndarray:
+ """Compute average per layer for given updates.
+
+ :param i: index of the layer
+ :param layer_updates: list of layer updates from clients
+ :param num_examples: list of number of examples from clients
+ :param p_max_s: list of p_max values from clients
+ :param p_s: list of p values
+ :param fjord_config: fjord config
+ :param original_parameters: original model parameters
+ :return: average of layer
+ """
+ # if no client updated the given part of the model,
+ # reuse previous parameters
+ update = deepcopy(original_parameters[i])
+
+ # BatchNorm2d layers, only average over the p_max_s
+ # that are greater than corresponding p of the layer
+ # i.e., only update the layers that were updated
+ if fjord_config["layer_p"][i] is not None:
+ p = fjord_config["layer_p"][i]
+ layer_updates_p, num_examples_p = get_p_layer_updates(
+ p, layer_updates, num_examples, p_max_s
+ )
+ if len(layer_updates_p) == 0:
+ return update
+
+ assert num_examples_p > 0
+ return reduce(np.add, layer_updates_p) / num_examples_p
+ if fjord_config["layer"][i] in ["ODLinear", "ODConv2d", "ODBatchNorm2d"]:
+ # perform nested updates
+ for p in p_s[::-1]:
+ layer_updates_p, num_examples_p = get_p_layer_updates(
+ p, layer_updates, num_examples, p_max_s
+ )
+ if len(layer_updates_p) == 0:
+ continue
+ in_dim = (
+ int(fjord_config[p][i]["in_dim"])
+ if fjord_config[p][i]["in_dim"]
+ else None
+ )
+ out_dim = (
+ int(fjord_config[p][i]["out_dim"])
+ if fjord_config[p][i]["out_dim"]
+ else None
+ )
+ assert num_examples_p > 0
+ # check whether the parameter to update is bias or weight
+ if len(update.shape) == 1:
+ # bias or ODBatchNorm2d
+ layer_updates_p = [
+ layer_update[:out_dim] for layer_update in layer_updates_p
+ ]
+ update[:out_dim] = reduce(np.add, layer_updates_p) / num_examples_p
+ else:
+ # weight
+ layer_updates_p = [
+ layer_update[:out_dim, :in_dim] for layer_update in layer_updates_p
+ ]
+ update[:out_dim, :in_dim] = (
+ reduce(np.add, layer_updates_p) / num_examples_p
+ )
+ return update
+
+ raise ValueError(f"Unsupported layer {fjord_config['layer'][i]}")
+
+
+def aggregate(
+ results: List[Tuple[NDArrays, int, float, List[float], FJORD_CONFIG_TYPE]],
+ original_parameters,
+) -> NDArrays:
+ """Compute weighted average.
+
+ :param results: list of tuples (layer_updates, num_examples, p_max, p_s)
+ :param original_parameters: original model parameters
+ :return: weighted average of layer updates
+ """
+ # Create a list of weights, each multiplied
+ # by the related number of examples
+ weights = [
+ [param * num_examples for param in params]
+ for params, num_examples, _, _, _ in results
+ ]
+ p_max_s = [p_max for _, _, p_max, _, _ in results]
+
+ # Calculate the total number of examples used during training
+ num_examples = [num_examples for _, num_examples, _, _, _ in results]
+ p_s = results[0][3]
+ fjord_config = results[0][4]
+
+ weights_prime: NDArrays = [
+ fjord_average(
+ i,
+ layer_updates,
+ num_examples,
+ p_max_s,
+ p_s,
+ fjord_config,
+ original_parameters,
+ )
+ for i, layer_updates in enumerate(zip(*weights))
+ ]
+ return weights_prime
+
+
+class FjORDFedAVG(FedAvg):
+ """FedAvg strategy with FjORD aggregation."""
+
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+
+ def aggregate_fit(
+ self,
+ server_round: int,
+ results: List[Tuple[ClientProxy, FitRes]],
+ failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
+ ) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
+ """Aggregate fit results using weighted average."""
+ if not results:
+ return None, {}
+ # Do not aggregate if there are failures and failures are not accepted
+ if not self.accept_failures and failures:
+ return None, {}
+
+ Logger.get().info(f"Aggregating for global round {server_round}")
+ # Convert results
+ weights_results: List[
+ Tuple[NDArrays, int, float, List[float], FJORD_CONFIG_TYPE]
+ ] = [
+ ( # type: ignore
+ parameters_to_ndarrays(fit_res.parameters),
+ fit_res.num_examples,
+ fit_res.metrics["max_p"],
+ fit_res.metrics["p_s"],
+ fit_res.metrics["fjord_config"],
+ )
+ for _, fit_res in results
+ ]
+
+ p_max_values_str = ", ".join([str(val[2]) for val in weights_results])
+ Logger.get().info(f"\t - p_max values: {p_max_values_str}")
+
+        # all clients start with the same model, so take the first client's copy
+        original_parameters = results[0][1].metrics["original_parameters"]
+
+ training_losses_str = ", ".join(
+ [str(fit_res.metrics["loss"]) for _, fit_res in results]
+ )
+ Logger.get().info(f"\t - train losses: {training_losses_str}")
+
+ agg = aggregate(weights_results, original_parameters)
+
+ parameters_aggregated = ndarrays_to_parameters(agg)
+
+ # Aggregate custom metrics if aggregation fn was provided
+ metrics_aggregated = {}
+ if self.fit_metrics_aggregation_fn:
+ fit_metrics = [(res.num_examples, res.metrics) for _, res in results]
+ metrics_aggregated = self.fit_metrics_aggregation_fn(fit_metrics)
+ elif server_round == 1: # Only log this warning once
+ Logger.get().warn("No fit_metrics_aggregation_fn provided")
+
+ return parameters_aggregated, metrics_aggregated
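The nested averaging in `fjord_average` can be seen on a toy example: the `[:out_dim, :in_dim]` block for width `p` is averaged only over clients whose `p_max` covers that width, weighted by example counts, while entries no client trained keep the previous round's values. All names and numbers below are ours, and the per-client scaling by example count is folded into the helper for brevity:

```python
import numpy as np

def nested_average(updates, num_examples, p_max_s, p, out_dim, in_dim, original):
    """Average one nested block over the clients that actually trained it."""
    chosen = [(u, n) for u, n, pm in zip(updates, num_examples, p_max_s)
              if pm >= p]
    total = sum(n for _, n in chosen)
    result = original.copy()
    result[:out_dim, :in_dim] = sum(
        u[:out_dim, :in_dim] * n for u, n in chosen
    ) / total
    return result

prev = np.zeros((2, 2))
updates = [np.ones((2, 2)), np.full((2, 2), 2.0)]
# client 0 trained at p_max=0.5, client 1 at p_max=1.0
full = nested_average(updates, [10, 30], [0.5, 1.0], 1.0, 2, 2, prev)
inner = nested_average(updates, [10, 30], [0.5, 1.0], 0.5, 1, 1, prev)
```

Only the `p_max=1.0` client contributes to the full-width block, while both clients contribute to the inner block; iterating from the largest `p` down, as `aggregate_fit`'s helper does, overwrites inner blocks with averages over progressively more clients.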
diff --git a/baselines/fjord/fjord/utils.py b/baselines/fjord/fjord/utils.py
new file mode 100644
index 000000000000..77b28f3d68ad
--- /dev/null
+++ b/baselines/fjord/fjord/utils.py
@@ -0,0 +1 @@
+"""Find the utils in the utils/ directory."""
diff --git a/baselines/fjord/fjord/utils/__init__.py b/baselines/fjord/fjord/utils/__init__.py
new file mode 100644
index 000000000000..46856dadddd5
--- /dev/null
+++ b/baselines/fjord/fjord/utils/__init__.py
@@ -0,0 +1 @@
+"""Utility functions for Fjord."""
diff --git a/baselines/fjord/fjord/utils/logger.py b/baselines/fjord/fjord/utils/logger.py
new file mode 100644
index 000000000000..b0eb2194bfef
--- /dev/null
+++ b/baselines/fjord/fjord/utils/logger.py
@@ -0,0 +1,129 @@
+"""Logger functionality."""
+import logging
+
+import coloredlogs
+
+
+class Logger:
+ """Logger class to be used by all modules in the project."""
+
+ log_format = (
+ "[%(asctime)s] (%(process)s) {%(filename)s:%(lineno)d}"
+ " %(levelname)s - %(message)s"
+ )
+    log_level = None
+    registered_loggers: dict = {}
+
+ @classmethod
+ def setup_logging(cls, loglevel="INFO", logfile=""):
+ """Stateful setup of the logging infrastructure.
+
+ :param loglevel: log level to be used
+ :param logfile: file to log to
+ """
+ cls.registered_loggers = {}
+ cls.log_level = loglevel
+ numeric_level = getattr(logging, loglevel.upper(), None)
+
+ if not isinstance(numeric_level, int):
+ raise ValueError(f"Invalid log level: {loglevel}")
+ if logfile:
+ logging.basicConfig(
+ handlers=[logging.FileHandler(logfile), logging.StreamHandler()],
+ level=numeric_level,
+ format=cls.log_format,
+ datefmt="%Y-%m-%d %H:%M:%S",
+ )
+ else:
+ logging.basicConfig(
+ level=numeric_level,
+ format=cls.log_format,
+ datefmt="%Y-%m-%d %H:%M:%S",
+ )
+
+ @classmethod
+ def get(cls, logger_name="default"):
+ """Get logger instance.
+
+ :param logger_name: name of the logger
+ :return: logger instance
+ """
+ if logger_name in cls.registered_loggers:
+ return cls.registered_loggers[logger_name]
+
+ return cls(logger_name)
+
+ def __init__(self, logger_name="default"):
+ """Initialise logger not previously registered.
+
+ :param logger_name: name of the logger
+ """
+ if logger_name in self.registered_loggers:
+ raise ValueError(
+ f"Logger {logger_name} already exists. "
+ f'Call with Logger.get("{logger_name}")'
+ )
+
+ self.name = logger_name
+ self.logger = logging.getLogger(self.name)
+ self.registered_loggers[self.name] = self.logger
+ coloredlogs.install(
+ level=self.log_level,
+ logger=self.logger,
+ fmt=self.log_format,
+ datefmt="%Y-%m-%d %H:%M:%S",
+ )
+
+ self.warn = self.warning
+
+ def log(self, loglevel, msg):
+ """Log message.
+
+ :param loglevel: log level to be used
+ :param msg: message to be logged
+ """
+ loglevel = loglevel.upper()
+ if loglevel == "DEBUG":
+ self.logger.debug(msg)
+ elif loglevel == "INFO":
+ self.logger.info(msg)
+ elif loglevel == "WARNING":
+ self.logger.warning(msg)
+ elif loglevel == "ERROR":
+ self.logger.error(msg)
+ elif loglevel == "CRITICAL":
+ self.logger.critical(msg)
+
+ def debug(self, msg):
+ """Log debug message.
+
+ :param msg: message to be logged
+ """
+ self.log("debug", msg)
+
+ def info(self, msg):
+ """Log info message.
+
+ :param msg: message to be logged
+ """
+ self.log("info", msg)
+
+ def warning(self, msg):
+ """Log warning message.
+
+ :param msg: message to be logged
+ """
+ self.log("warning", msg)
+
+ def error(self, msg):
+ """Log error message.
+
+ :param msg: message to be logged
+ """
+ self.log("error", msg)
+
+ def critical(self, msg):
+ """Log critical message.
+
+ :param msg: message to be logged
+ """
+ self.log("critical", msg)
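The registry pattern above, in miniature and stdlib-only (`coloredlogs` and the formatting details omitted; `MiniLogger` is our own name): `get()` hands back an already-registered logger or creates one, so all modules asking for the same name share an instance.

```python
import logging

class MiniLogger:
    registered = {}

    @classmethod
    def get(cls, name="default"):
        # return the cached logger for this name, creating it on first use
        if name not in cls.registered:
            cls.registered[name] = logging.getLogger(name)
        return cls.registered[name]
```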
diff --git a/baselines/fjord/fjord/utils/utils.py b/baselines/fjord/fjord/utils/utils.py
new file mode 100644
index 000000000000..3a1a327dd555
--- /dev/null
+++ b/baselines/fjord/fjord/utils/utils.py
@@ -0,0 +1,52 @@
+"""Utility functions for fjord."""
+import os
+from typing import List, Optional, OrderedDict
+
+import numpy as np
+import torch
+from torch.nn import Module
+
+from .logger import Logger
+
+
+def get_parameters(net: Module) -> List[np.ndarray]:
+ """Get statedict parameters as a list of numpy arrays.
+
+ :param net: PyTorch model
+ :return: List of numpy arrays
+ """
+ return [val.cpu().numpy() for _, val in net.state_dict().items()]
+
+
+def set_parameters(net: Module, parameters: List[np.ndarray]) -> None:
+ """Load parameters into PyTorch model.
+
+ :param net: PyTorch model
+ :param parameters: List of numpy arrays
+ """
+ params_dict = zip(net.state_dict().keys(), parameters)
+ state_dict = OrderedDict({k: torch.Tensor(v) for k, v in params_dict})
+ net.load_state_dict(state_dict, strict=True)
+
+
+def save_model(
+ model: torch.nn.Module,
+ model_path: str,
+ is_best: bool = False,
+ cid: Optional[int] = None,
+) -> None:
+ """Checkpoint model.
+
+ :param model: model to be saved
+ :param model_path: path to save the model
+ :param is_best: whether this is the best model
+ :param cid: client id
+ """
+ suffix = "best" if is_best else "last"
+    if cid is not None:
+ suffix += f"_{cid}"
+ filename = os.path.join(model_path, f"model_{suffix}.checkpoint")
+    Logger.get().info(f"Persisting model in {filename}")
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ torch.save(model.state_dict(), filename)
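The `get_parameters`/`set_parameters` pair round-trips a model through Flower's wire format: weights travel as a list of numpy arrays, and state-dict key order is what matches them back up. A sketch with a plain `OrderedDict` standing in for a torch state dict (helper names are ours):

```python
from collections import OrderedDict

import numpy as np

def params_to_list(state):
    # flatten a state dict to the list-of-arrays form Flower ships around
    return [np.asarray(v) for v in state.values()]

def list_to_state(keys, arrays):
    # rebuild the state dict by position, relying on stable key order
    return OrderedDict(zip(keys, arrays))

state = OrderedDict(weight=np.ones((2, 2)), bias=np.zeros(2))
restored = list_to_state(state.keys(), params_to_list(state))
```

Because the mapping is purely positional, client and server must construct the model identically; any mismatch in layer order would silently load weights into the wrong tensors.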
diff --git a/baselines/fjord/notebooks/visualise.ipynb b/baselines/fjord/notebooks/visualise.ipynb
new file mode 100644
index 000000000000..04f9a7f768ec
--- /dev/null
+++ b/baselines/fjord/notebooks/visualise.ipynb
@@ -0,0 +1,277 @@
+{
+ "cells": [
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import re\n",
+ "import os\n",
+ "import glob\n",
+ "\n",
+ "import pandas as pd\n",
+ "import numpy as np\n",
+ "import matplotlib.pyplot as plt\n",
+ "\n",
+ "%matplotlib inline"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "['2023-09-23:12-53-16', '2023-09-23:11-25-43', '2023-09-23:13-32-56', '2023-09-23:14-20-22', '2023-09-23:12-06-14', '2023-09-23:15-00-26']\n"
+ ]
+ }
+ ],
+ "source": [
+ "log_root = \"../runs/best_config\"\n",
+ "\n",
+ "filenames = [os.path.basename(f) for f in glob.glob(os.path.join(log_root, \"*\"))]\n",
+ "print(filenames)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "setups = {}\n",
+ "for f in filenames:\n",
+ " fq = os.path.join(log_root, f, \"run.log\")\n",
+ " with open(fq, \"r\") as fr:\n",
+ " s = fr.readlines()\n",
+ " # get CLI params\n",
+ " args_str = \"\\n\".join(s[:100])\n",
+ " manual_seed = re.search(r\"manual_seed: (\\d+)\", args_str).group(1)\n",
+ " knowledge_distillation = re.search(\n",
+ " r\"knowledge_distillation: (\\w+)\", args_str\n",
+ " ).group(1)\n",
+ " knowledge_distillation = \"kd\" if knowledge_distillation == \"true\" else \"nokd\"\n",
+ " client_selection = re.search(r\"client_selection: (\\w+)\", args_str).group(1)\n",
+ "\n",
+ " # get evaluation results\n",
+ " eval_timeline = []\n",
+ " eval_regex = r\".*Server-side evaluation \\(global round=(\\d+)\\) p=(\\d\\.\\d+): loss=(\\d+\\.\\d+) / accuracy=(\\d+\\.\\d+)\"\n",
+ " for line in s:\n",
+ " if re.match(eval_regex, line):\n",
+ " global_round, p, loss, accuracy = re.match(eval_regex, line).groups()\n",
+ " global_round, p, loss, accuracy = (\n",
+ " int(global_round),\n",
+ " float(p),\n",
+ " float(loss),\n",
+ " float(accuracy),\n",
+ " )\n",
+ " eval_timeline.append(\n",
+ " {\n",
+ " \"global_round\": global_round,\n",
+ " \"p\": p,\n",
+ " \"loss\": loss,\n",
+ " \"accuracy\": accuracy,\n",
+ " }\n",
+ " )\n",
+ "\n",
+ " setups[\n",
+ " f\"{client_selection}_{knowledge_distillation}_{manual_seed}\"\n",
+ " ] = eval_timeline"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ " global_round p loss accuracy kd seed client_selection\n",
+ "0 10 0.2 1.879755 0.2835 False 124 random\n",
+ "1 10 0.4 1.863002 0.3122 False 124 random\n",
+ "2 10 0.6 1.828429 0.3165 False 124 random\n",
+ "3 10 0.8 1.885398 0.2739 False 124 random\n",
+ "4 10 1.0 1.943324 0.2384 False 124 random\n"
+ ]
+ }
+ ],
+ "source": [
+ "dfs = []\n",
+ "for k, v in setups.items():\n",
+ " df = pd.DataFrame(v)\n",
+ " client_selection, kd, seed = k.split(\"_\")\n",
+ " df[\"kd\"] = kd != \"nokd\"\n",
+ " df[\"seed\"] = seed\n",
+ " df[\"client_selection\"] = client_selection\n",
+ " dfs.append(df)\n",
+ "df = pd.concat(dfs)\n",
+ "print(df.head())"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAA1cAAAGJCAYAAABmacmGAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8qNh9FAAAACXBIWXMAAA9hAAAPYQGoP6dpAADUxElEQVR4nOzdd3wUdfrA8c9s3/SEJCShhN4iIooFVIqK2BXPxnlH0VPv1BMsnIoFOOX42Tju9MDzzoIniu1s51lAAfVQESnSpHcS0nu2zczvj9ndZLMhJCHJJuF5v16jZHZ25rvJZrPPfp/n+Sq6rusIIYQQQgghhDgupkgPQAghhBBCCCE6AgmuhBBCCCGEEKIZSHAlhBBCCCGEEM1AgishhBBCCCGEaAYSXAkhhBBCCCFEM5DgSgghhBBCCCGagQRXQgghhBBCCNEMJLgSQgghhBBCiGYgwZUQQgghhBBCNAMJroQQQghxVJMnT6ZHjx5Nuq+iKNx5553NNpYVK1agKAorVqxotnO2BT169GDy5MmRHoYQohlIcCWEaFNeeeUVFEUJbhaLhS5dujB58mQOHTrUYtedNWsWiqLQuXNnKisrw27v0aMHl112WZPOvWDBAl555ZU6b5szZw5XXHEFnTt3RlEUZs2addTzLFu2jDFjxpCcnExCQgJnnHEG//rXv5o0poZ47733uPjii0lOTsZms5GRkcF1113Hl19+GTwm8Gb3nXfeCe6r/TOsuT3wwAMh11iwYAGKonDmmWcedRy1zxEXF8eoUaP4+OOPw44tLy9n5syZXHTRRSQlJaEoylG/9wBbt27loosuIiYmhqSkJH7961+Tl5fXiO9Sw4wePbrO78dFF10Udqzb7eb+++8nIyMDp9PJmWeeydKlS+s876pVqzjnnHOIiooiLS2Nu+66i/Ly8mYfvxBCiIaxRHoAQghRlz/+8Y/07NkTl8vFd999xyuvvMI333zDpk2bcDgcLXbd3NxcFi5cyL333tts51ywYAHJycl1fjL98MMPk5aWxtChQ/nss8+Oeo4PP/yQq666iuHDhwcDwbfeeouJEyeSn5/P3Xff3Wzj1XWdm266iVdeeYWhQ4dyzz33kJaWRnZ2Nu+99x7nn38+//vf/xgxYkS95wn8DGs66aSTQr5evHgxPXr0YPXq1ezcuZM+ffrUea6xY8cyceJEdF1n3759LFy4kMsvv5xPPvmEcePGBY/Lz8/nj3/8I927d2fIkCH1znAcPHiQkSNHEh8fz5/+9CfKy8t5+umn2bhxI6tXr8Zmsx3jO9U4Xbt2Ze7cuSH7MjIywo6bPHky77zzDtOmTaNv37688sorXHLJJSxfvpxzzjkneNz69es5//zzGThwIPPmzePgwYM8/fTT7Nixg08++aRZxy6EEKKBdCGEaENefvllHdB/+OGHkP3333+/Duhvvvlmi1x35syZOqCfcsopeufOnfXKysqQ2zMzM/VLL720SefOysrSR40aVedte/bs0XVd1/Py8nRAnzlzZp3HjR07Vs/IyNBdLldwn9fr1Xv37q2ffPLJTRrX0Tz11FM6oE+bNk3XNC3s9ldffVX//vvvdV3X9eXLl+uA/vbbbwdvP9rPsLbdu3frgP7vf/9bT0lJ0WfNmlXncYB+xx13hOzbsmWLDugXX3xxyH6Xy6VnZ2fruq7rP/zwgw7oL7/8cp3n/d3vfqc7nU593759wX1Lly7VAf3vf/97vWNvrFGjRulZWVnHPO7777/XAf2pp54K7quqqtJ79+6tDx8+POTYiy++WE9PT9dLSkqC+/7xj3/ogP7ZZ58129gnTZqkZ2ZmNum+df3sjkfg+bZ8+fJG37e8vLzZxtHcMjMz9UmTJkV6GEKIZiBpgUKIduHcc88FYNeuXSH7f/75Z6655hqSkpJwOBwMGzaMDz/8MOQYr9fL7Nmz6du3Lw6Hg06dOnHOOefUmWr16KOPcuTIERYuXHjM
MWmaxvz588nKysLhcNC5c2duu+02ioqKgsf06NGDzZs3s3LlymAq2OjRo0Nub4jS0lISExOx2+3BfRaLheTkZJxOZ4PO0RBVVVXMnTuXAQMG8PTTT6MoStgxv/71rznjjDOO+1qLFy8mMTGRSy+9lGuuuYbFixc3+L4DBw4kOTk57Plgt9tJS0tr0DneffddLrvsMrp37x7cd8EFF9CvXz/eeuutBo+lMXw+X71pe++88w5ms5lbb701uM/hcHDzzTfz7bffcuDAAcB4PixdupRf/epXxMXFBY+dOHEiMTExDRr/vn37uOKKK4iOjiY1NZW7776bzz77rEE1TRUVFdx7771069YNu91O//79efrpp9F1vc7jFy9eTP/+/XE4HJx22ml89dVXYWO5/fbb6d+/P06nk06dOnHttdeyd+/eYz6OugRmd7ds2cIvf/lLEhMTg7N+Pp+Pxx57jN69e2O32+nRowczZszA7XaHnONoabq166MCabD/+9//uOeee0hJSSE6Oprx48eHpZjqus7jjz9O165diYqKYsyYMWzevDnsGo15zRJCtC2SFiiEaBcCb7ISExOD+zZv3szZZ59Nly5deOCBB4iOjuatt97iqquu4t1332X8+PGA8UZr7ty5/OY3v+GMM86gtLSUNWvWsHbtWsaOHRtynXPPPZfzzjuPJ598kt/97nf1Bi633XYbr7zyClOmTOGuu+5iz549PPfcc6xbt47//e9/WK1W5s+fz+9//3tiYmJ46KGHAOjcuXOjH//o0aN54okneOSRR5g0aRKKovD666+zZs2aZg0EvvnmGwoLC5k2bRpms/m4zlVSUkJ+fn7IvuTk5OC/Fy9ezNVXX43NZmPChAksXLiQH374gdNPP71B5y4qKqJ3795NGtuhQ4fIzc1l2LBhYbedccYZ/Pe//23Seeuzfft2oqOj8Xg8dO7cmVtuuYVHH30Uq9UaPGbdunX069cvJGAKjAmMVMBu3bqxceNGfD5f2PhtNhunnHIK69atq3csFRUVnHfeeWRnZzN16lTS0tJ4/fXXWb58+TEfh67rXHHFFSxfvpybb76ZU045hc8++4zp06dz6NAh/vznP4ccv3LlSt58803uuusu7HY7CxYs4KKLLmL16tXBNNEffviBVatWccMNN9C1a1f27t3LwoULGT16NFu2bCEqKuqY46rLtddeS9++ffnTn/4UDPx+85vfsGjRIq655hruvfdevv/+e+bOncvWrVt57733mnQdgN///vckJiYyc+ZM9u7dy/z587nzzjt58803g8c8+uijPP7441xyySVccsklrF27lgsvvBCPxxNyrsa8Zgkh2pjITpwJIUSoQErZsmXL9Ly8PP3AgQP6O++8o6ekpOh2u10/cOBA8Njzzz9fHzx4cEiqnKZp+ogRI/S+ffsG9w0ZMuSYKX2BtMC8vDx95cqVOqDPmzcveHvttMCvv/5aB/TFixeHnOfTTz8N219fWmDAsdICy8vL9euuu05XFEUHdECPiorS33///XrP21h/+ctfdEB/7733GnR8fWmBdW0Ba9as0QF96dKluq4bP7euXbvqU6dODbsGoN988816Xl6enpubq69Zs0a/6KKLwtLnaqsvLTBw26uvvhp22/Tp03Ug5Hl1vG666SZ91qxZ+rvvvqu/+uqr+hVXXKED+nXXXRdyXFZWln7eeeeF3X/z5s06oD///PO6ruv622+/rQP6V199FXbstddeq6elpdU7nmeeeUYHQp4/VVVV+oABA8LS7mqnBb7//vs6oD/++OMh57zmmmt0RVH0nTt3BvcFfu5r1qwJ7tu3b5/ucDj08ePHB/fVTsPVdV3/9ttvw35GDU0LDPw+T5gwIWT/+vXrdUD/zW9+E7L/vvvu0wH9yy+/DBl7Xb+PtVP4As/3Cy64ICSN9u6779bNZrNeXFys67qu5+bm6jabTb/00ktDjpsxY4YOhJyzIa9ZQoi2SdIChRBt0gUX
XEBKSgrdunXjmmuuITo6mg8//JCuXbsCUFhYyJdffsl1111HWVkZ+fn55OfnU1BQwLhx49ixY0ewu2BCQgKbN29mx44dDbr2yJEjGTNmDE8++SRVVVV1HvP2228THx/P2LFjg9fOz8/ntNNOIyYmpkEzAI1ht9vp168f11xzDW+88QavvfYaw4YN41e/+hXfffdds12ntLQUgNjY2OM+19/+9jeWLl0asgUsXryYzp07M2bMGMBIwbr++utZsmQJqqqGnevFF18kJSWF1NRUhg0bxhdffMEf/vAH7rnnniaNLfBzrZlmGRBomHK0n31TvPjii8ycOZOrr76aX//613zwwQfccsstvPXWWyE/v6qqqgaN6VjjP9bYP/30U7p06cIVV1wRcr9bbrnlmI/lv//9L2azmbvuuitk/7333ouu62HNNIYPH85pp50W/Lp79+5ceeWVfPbZZ8Gfdc0ZYq/XS0FBAX369CEhIYG1a9cec0xH89vf/jZs7EDY8ybQwKauDpQNdeutt4ak0Z577rmoqsq+ffsAo9unx+Ph97//fchx06ZNCztXY1+zhBBthwRXQog2KfDG/J133uGSSy4hPz8/5I3kzp070XWdRx55hJSUlJBt5syZgNH5D4yudcXFxfTr14/Bgwczffp0fvrpp3qvP2vWLHJycnj++efrvH3Hjh2UlJSQmpoadv3y8vLgtZvLnXfeyUcffcSSJUu44YYbuPHGG1m2bBnp6elMnTq13vsWFhaSk5MT3EpKSo56bCAdrays7LjHfMYZZ3DBBReEbACqqrJkyRLGjBnDnj172LlzJzt37uTMM8/kyJEjfPHFF2HnuvLKK1m6dCkff/xxsJ6msrISk6lpf8YCb+Zr19kAuFyukGPqkpeXF/I9bUr788Ab+mXLloWMqyFjOtb4j1WHt2/fPnr37h1WU3e0bo2175uRkREWgA8cODB4e019+/YNO0e/fv2orKwM1iRVVVXx6KOPBmu4kpOTSUlJobi4uN7n67HU7la5b98+TCZT2ONMS0sjISEhbOyNUbN2D6pTmAM1mIFz1/5+pKSkhKQ7Q9Nes4QQbYPUXAkh2qQzzjgjWE9y1VVXcc455/DLX/6Sbdu2ERMTg6ZpANx3330hrbhrCryBGjlyJLt27eKDDz7g888/55///Cd//vOfef755/nNb35T531HjhzJ6NGjefLJJ8M+/QajmUVqaupRmzCkpKQ0+jEfjcfj4cUXX+QPf/hDSDBhtVq5+OKLee655/B4PEdtHX711VezcuXK4NeTJk066tpPAwYMAGDjxo1cddVVzfYYavryyy/Jzs5myZIlLFmyJOz2xYsXc+GFF4bs69q1azA4u+SSS0hOTubOO+9kzJgxXH311Y0eQ3p6OgDZ2dlht2VnZ5OUlFTnrFDA6aefHvJGfObMmfWuUVaXbt26AUbwW3Ncda3nFhhnoHX7scZfV4v3tuz3v/89L7/8MtOmTWP48OHEx8ejKAo33HBD8He9KY4WZNbVqKWh6ppZBY5ao6gfpclHfZrymiWEaBskuBJCtHlms5m5c+cyZswYnnvuOR544AF69eoFGAFG4E13fZKSkpgyZQpTpkyhvLyckSNHMmvWrHrfqMyaNYvRo0fz97//Pey23r17s2zZMs4+++xjzhIczxs5gIKCAnw+X51v6rxeL5qmHfUNH8AzzzwT0sGwvjfe55xzDomJibzxxhvMmDHjuJta1GXx4sWkpqbyt7/9Ley2f//737z33ns8//zzx2wm8uc//5mHH36Y8ePHN/p73KVLF1JSUlizZk3YbatXr+aUU0455mOomXoXeD42xu7du4HQQPyUU05h+fLllJaWhjS1+P7774O3g7FemMViYc2aNVx33XXB4zweD+vXrw/ZV5fMzEy2bNmCrush37udO3cec9yZmZksW7aMsrKykNmrn3/+OXh7TXWltm3fvp2oqKjgY3/nnXeYNGkSzzzzTPAYl8tFcXHx
McfTGJmZmWiaxo4dO4IzbQBHjhyhuLg4ZOyJiYlh1/d4PHUGtA29Nhjfj5rPl7y8vJDfz4CmvGYJISJP0gKFEO3C6NGjOeOMM5g/fz4ul4vU1NRg4FPXm52aLZALCgpCbouJiaFPnz51plTVNGrUqGCXvkBaVsB1112Hqqo89thjYffz+Xwhb8qio6OP601iamoqCQkJvPfeeyFdxcrLy/noo48YMGBAvYHIaaedFpKaN2jQoKMeGxUVxf3338/WrVu5//776/zU/bXXXmP16tVNeixVVVX8+9//5rLLLuOaa64J2+68807KysrC2unXZrFYuPfee9m6dSsffPBBk8byi1/8gv/85z/B9uYAX3zxBdu3b+faa6+t975nn312yPe0vuCqtLQ07Lmm+1tyAyEzr9dccw2qqvLCCy8E97ndbl5++WXOPPPM4GxXfHw8F1xwAa+99lpICue//vUvysvLjzn+cePGcejQoZDvs8vl4h//+Ee99wNj5lBVVZ577rmQ/X/+859RFIWLL744ZP+3334bUjd14MABPvjgAy688MJg8G42m8Oea88++2y9Hxo0xSWXXALA/PnzQ/bPmzcPgEsvvTS4r3fv3mEt41944YUmj+mCCy7AarXy7LPPhjzW2mOBpr9mCSEiT2auhBDtxvTp07n22mt55ZVX+O1vf8vf/vY3zjnnHAYPHswtt9xCr169OHLkCN9++y0HDx5kw4YNAAwaNIjRo0dz2mmnkZSUxJo1a3jnnXe48847j3nNmTNnBpsu1DRq1Chuu+025s6dy/r167nwwguxWq3s2LGDt99+m7/85S9cc801gBHcLFy4kMcff5w+ffqQmprKeeedBxhvhvft20dlZSUAX331VfBN969//WsyMzMxm83cd999PPzww5x11llMnDgRVVV58cUXOXjwIK+99lqzfH8Dpk+fzubNm3nmmWdYvnw511xzDWlpaeTk5PD++++zevVqVq1a1aRzf/jhh5SVlYU0UqjprLPOIiUlhcWLF3P99dfXe67Jkyfz6KOP8sQTT4SkMD733HMUFxdz+PBhAD766CMOHjwIGOln8fHxAMyYMYO3336bMWPGMHXqVMrLy3nqqacYPHgwU6ZMadLjq8vatWuZMGECEyZMoE+fPlRVVfHee+/xv//9j1tvvZVTTz01eOyZZ57Jtddey4MPPkhubi59+vRh0aJF7N27lxdffDHkvHPmzGHEiBGMGjWKW2+9lYMHD/LMM89w4YUXctFFF9U7pttuu43nnnuOCRMmMHXqVNLT01m8eHGwcUZ9M4GXX345Y8aM4aGHHmLv3r0MGTKEzz//nA8++IBp06aFtcc/6aSTGDduXEgrdoDZs2cHj7nsssv417/+RXx8PIMGDeLbb79l2bJldOrUqWHf5AYaMmQIkyZN4oUXXqC4uJhRo0axevVqFi1axFVXXRXyu/6b3/yG3/72t/ziF79g7NixbNiwgc8++yxkOYHGSElJ4b777mPu3LlcdtllXHLJJaxbt45PPvkk7JzH85olhIiwyDUqFEKIcIG2xj/88EPYbaqq6r1799Z79+6t+3w+Xdd1fdeuXfrEiRP1tLQ03Wq16l26dNEvu+wy/Z133gne7/HHH9fPOOMMPSEhQXc6nfqAAQP0OXPm6B6PJ3hMzVbstY0aNUoH6myN/MILL+innXaa7nQ69djYWH3w4MH6H/7wB/3w4cPBY3JycvRLL71Uj42N1YGQtuyBc9e11W43vXjx4pDHceaZZ4Y8zub2zjvv6BdeeKGelJSkWywWPT09Xb/++uv1FStWBI+prxV7XT/Dyy+/XHc4HHpFRcVRrzt58mTdarXq+fn5uq4bLbHvuOOOOo+dNWtW2PcqMzPzqN/TPXv2hNx/06ZN+oUXXqhHRUXpCQkJ+o033qjn5OQ05NvTYLt379avvfZavUePHrrD4dCjoqL00047TX/++edDWnIHVFVV6ffdd5+elpam2+12/fTT
T9c//fTTOs/99ddf6yNGjNAdDoeekpKi33HHHXppaWmDx3XppZfqTqdTT0lJ0e+991793Xff1QH9u+++Cx5XuxW7rut6WVmZfvfdd+sZGRm61WrV+/btqz/11FNhjyfws3vttdf0vn376na7XR86dGjYc7uoqEifMmWKnpycrMfExOjjxo3Tf/7557C2541txV7X77PX69Vnz56t9+zZU7darXq3bt30Bx98MKz1vqqq+v33368nJyfrUVFR+rhx4/SdO3cetRV77ed7XWNVVVWfPXu2np6erjudTn306NH6pk2bws7ZkNcsIUTbpOh6EyothRBCCNHhzJ8/n7vvvpuDBw/SpUuXSA9HCCHaHQmuhBBCiBNQVVVVSK2ey+Vi6NChqKrK9u3bIzgyIYRov6TmSgghhDgBXX311XTv3p1TTjmFkpISXnvtNX7++eejLi8ghBDi2CS4EkIIIU5A48aN45///CeLFy9GVVUGDRrEkiVLjtlIRAghxNFJWqAQQgghhBBCNANZ50oIIYQQQgghmoEEV0IIIYQQQgjRDKTmqg6apnH48GFiY2PrXUhRCCGEEEII0bHpuk5ZWRkZGRmYTPXPTUlwVYfDhw/TrVu3SA9DCCGEEEII0UYcOHCArl271nuMBFd1iI2NBYxvYFxcXETH4vV6+fzzz7nwwguxWq0RHYsQTSHPYdGeyfNXtGfy/BXtWVt6/paWltKtW7dgjFAfCa7qEEgFjIuLaxPBVVRUFHFxcRF/YgnRFPIcFu2ZPH9FeybPX9GetcXnb0PKhaShhRBCCCGEEEI0AwmuhBBCCCGEEKIZSHAlhBBCCCGEEM1Aaq6EEEIIIUSHp+s6Pp8PVVUjPRTRAF6vF4vFgsvlavGfmdlsxmKxNMsSTBENrlRVZdasWbz22mvk5OSQkZHB5MmTefjhh4MP7mgP8sknn2T69Ol13jZr1ixmz54dsq9///78/PPPzfsAhBBCCCFEm+fxeMjOzqaysjLSQxENpOs6aWlpHDhwoFXWnY2KiiI9PR2bzXZc54locPXEE0+wcOFCFi1aRFZWFmvWrGHKlCnEx8dz1113AZCdnR1yn08++YSbb76ZX/ziF/WeOysri2XLlgW/tlhkkk4IIYQQ4kSjaRp79uzBbDaTkZGBzWZrlTfr4vhomkZ5eTkxMTHHXLj3eOi6jsfjIS8vjz179tC3b9/jul5EI45Vq1Zx5ZVXcumllwLQo0cP3njjDVavXh08Ji0tLeQ+H3zwAWPGjKFXr171nttisYTdVwghhBBCnFg8Hg+aptGtWzeioqIiPRzRQJqm4fF4cDgcLRpcATidTqxWK/v27Qtes6kiGlyNGDGCF154ge3bt9OvXz82bNjAN998w7x58+o8/siRI3z88ccsWrTomOfesWMHGRkZOBwOhg8fzty5c+nevXudx7rdbtxud/Dr0tJSwMj19Hq9TXhkzSdw/UiPQ4imkuewaM/k+SvaM3n+GrxeL7quA8YbdtE+BH5muq632s9N13W8Xi9mszlkf2N+hxQ9MPII0DSNGTNm8OSTT2I2m1FVlTlz5vDggw/WefyTTz7J//3f/3H48OF6I8pPPvmE8vJy+vfvT3Z2NrNnz+bQoUNs2rSpzpWV66rRAnj99dflEw4hhBBCiHYskM3UrVu3466nER2Xx+PhwIED5OTk4PP5Qm6rrKzkl7/8JSUlJcTFxdV7nogGV0uWLGH69Ok89dRTZGVlsX79eqZNm8a8efOYNGlS2PEDBgxg7NixPPvss426TnFxMZmZmcybN4+bb7457Pa6Zq66detGfn7+Mb+BLc3r9bJ06VLGjh3bZlanFqIx5Dks2jN5/or2TJ6/BpfLxYEDB+jRo8dxpXuJ1qXrOmVlZcTGxrZKjZzL5WLv3r1069Yt7HlSWlpKcnJyg4KriKYFTp8+nQceeIAbbrgBgMGDB7Nv3z7mzp0bFlx9/fXXbNu2jTfffLPR10lISKBfv37s3Lmz
ztvtdjt2uz1sv9VqbTMvRm1pLEI0hTyHRXsmz1/RrmgaaF7QPYA8f1VVRVEUTCZTi9futKbJkydTXFzM+++/H+mhtIhAKmDgZ9fSTCYTiqLU+fvSmN+fiD7DKisrw75ZZrO5zrzKF198kdNOO40hQ4Y0+jrl5eXs2rWL9PT0Jo9VCCGEECLiVB94q8BVAhUFUJoNxfuhYBfk/gzZP0HOBsjdAoW7jPsU7jGOVX31n1sc1Z+XbuevX+yo87a/frGDPy/d3iLXnTx5MoqihG07d+7kL3/5C6+88krI8QcOHOCmm24KdkXMzMxk6tSpFBQUhBw3evTo4LkcDgf9+vVj7ty51Exo27t3b8g1Y2NjycrK4o477mDHjrq/FyLCM1eXX345c+bMoXv37mRlZbFu3TrmzZvHTTfdFHJcaWkpb7/9Ns8880yd5zn//PMZP348d955JwD33Xcfl19+OZmZmRw+fJiZM2diNpuZMGFCiz8mIYQQQohG03VQvaB6jFkn1b/V/LfqAZpQzeEpg5IKKFHAHguOBHDEg1mWqWkos0lhnj+Auuv8vsH9f/1iB/OWbueesf1a7NoXXXQRL7/8csi+lJSUsKYLu3fvZvjw4fTr14833niDnj17snnzZqZPn84nn3zCd999R1JSUvD4W265hT/+8Y+43W6+/PJLbr31VhISEvjd734Xct5ly5aRlZVFZWUlGzdu5C9/+QtDhgzho48+4vzzz2+xx91eRfS36tlnn+WRRx7h9ttvJzc3l4yMDG677TYeffTRkOOWLFmCrutHDY527dpFfn5+8OuDBw8yYcIECgoKSElJ4ZxzzuG7774jJSWlRR+PEEIIIUQYTTUCI9ULmq/636qn+mutNWaVdHCXGtsJHmjpuk6VV23w8b85tydeVWPe0u14VY3fje7NwhW7ePbLnfz+vD785tyeVHoa9jN0Ws2NqiGy2+11Li9UOy3wjjvuwGaz8fnnn+N0OgHo3r07Q4cOpXfv3jz00EMsXLgweP+oqKjgeadMmcJzzz3H0qVLw4KrTp06BY/r1asXl19+Oeeffz4333wzu3btCgvyTnQR/U2KjY1l/vz5zJ8/v97jbr31Vm699daj3r53796Qr5csWdIMoxNCCCGEqIeu1wqWAjNNHiMFL/BvvS22/64VaNliwJlgBFsnQKBV5VUZ9OhnTbrvs1/u5Nkvdx7162PZ8sdxRNma93tcWFjIZ599xpw5c4KBVUBaWho33ngjb775JgsWLAgL7HRd55tvvuHnn3+mb9++HIvJZGLq1KmMHz+eH3/8kTPOOKNZH0t71/F/e4QQQgghGkvTjpKi5w+cgrNNEWu63Ix0I3XQUwYlB8AW6w+04sF84jbCaCv+85//EBMTE/z64osv5u233w45ZseOHei6zsCBA+s8x8CBAykqKiIvL4/U1FQAFixYwD//+U88Hg9erxeHw8Fdd93VoDENGDAAMCY4JLgKJcGVEEIIIU4sweCoVj2T5qv+t97wlLEOJyTQijFms5wJHSrQclrNbPnjuEbfL5AKaDUreFWd35/Xh9+N7t3oazfGmDFjQtL5oqOjj3psY1ZYuvHGG3nooYcoKipi5syZjBgxghEjRjTovoHrtEaL9PZGgishhBBCdAy6Xkc9U63aJtVLx5htaiWecmMrPVgdaDniwdK+F+NVFKXRqXl//WIHz365k3vG9uOu8/sGm1lYzaaQJhfNLTo6mj59+tR7TJ8+fVAUha1btzJ+/Piw27du3UpiYmJI/4H4+Pjged966y369OnDWWedxQUXXHDMMW3duhWAnj17NuahnBAkuBJCCCFE21ezKURIil6Nr1ulKcQJrGagZY2urtFq54FWQ9TsChgIpAL/r6uLYGvr1KkTY8eOZcGCBdx9990hdVc5OTksXryYiRMnHnWmKSYmhqlTp3Lfffexbt26emekNE3jr3/9Kz179mTo0KHN/ljaOwmuhBBCCBE5gRbkdTWCqJm+
1yabQpzAvBXGVnrohAi0VE0PCawCAl+rWuRnQ5977jlGjBjBuHHjePzxx0NasXfp0oU5c+bUe//bbruNxx57jHfffZdrrrkmuL+goICcnBwqKyvZtGkT8+fPZ/Xq1Xz88cfSKbAOElwJIYQQomXU1RSiriYRon2rHWg54o1gy2KP9Miazd31rGMVyRmrmvr27cuaNWuYOXMm1113HYWFhaSlpXHVVVcxc+bMkDWu6pKUlMTEiROZNWsWV199dXB/IE0wKiqKzMxMxowZwwsvvHDMVMUTlQRXQgghhGi8sEYQdazjdCI3hThRBQKtssNgjapuhtGBAq3W9Morrxz1NrfbHdJFECAzM7Pe+wSsWLGizv3PP/988N89evRoVIMMYZDgSgghhBDVNK3GzJLnKOs4SVMI0QDeSmOTQKtZ+Xw+tm/fzrfffsttt90W6eGIWiS4EkIIIU4UR6tnkqYQoqXVDLQszuoaLasj0iNrdzZt2sSIESMYM2YMv/3tbyM9HFGLBFdCCCFEexdoChEMlo4SOElTCNEW+KqgrArKsiXQaoJTTjmFysrKSA9DHIUEV0IIIURbpqlHSdGrsY6TNIUQ7VVIoOWoTh20Oo91TyHaJAmuhBBCiEgJBEm1G0HUbE0us03iROFzQXmOsUmgJdopCa6EEEKI5hZsClHXorc+aQohxLHUDLTM9urUQVtUpEcmRL0kuBJCCCEa42j1TDVbk0sLciGaj+qG8iPGJoGWaOMkuBJCCCFq03XwlENFifF1wS5QVJltEiLSagdagQWLbdGRHpkQgARXQgghhMHnAXepfys3Zp9Uf72TtwLMpsiOTwgRSnVDRa6xmW3VNVoSaIkIkuBKCCHEiUnTwFMGbv/mc0V6REKIplI9tQKteCPYssdEemQtavLkyRQXF/P+++9HeijCTz6GE0IIceLwVkF5rpHml/MTFO6GijwJrIToSFSP8XtdsAOObIaynOPvurl8Lqx8su7bVj5p3N4CJk+ejKIoYdvOnTsB+Mtf/sIrr7wScp8DBw5w0003kZGRgc1mIzMzk6lTp1JQUBBy3OjRo4Pnczgc9OvXj7lz56Lr1anPe/fuDblubGwsWVlZ3HHHHezYsaNFHvOUKVN4+OGH67xt8uTJXHXVVSH73nnnHRwOB88880zwmMB4rVYrnTt3ZuzYsbz00ktoWst3X5XgSgghRMelqVBVBMX7jTdZeT9D6SEj9U9qp4To+FQPVBUaSx14XUb6r9aEhjMmMyyfEx5grXzS2G8yN89463DRRReRnZ0dsvXs2ROA+Ph4EhISgsfu3r2bYcOGsWPHDt544w127tzJ888/zxdffMHw4cMpLCwMOfctt9xCdnY227Zt48EHH+TRRx/l+eefDxvDsmXLyM7OZsOGDfzpT39i69atDBkyhC+++KJZH6uqqvznP//hiiuuaNDx//znP7nxxhtZuHAh9957b3B/4Hu2d+9ePvnkE8aMGcPUqVO57LLL8Pl8zTrm2iQtUAghRMfiqTDS/Fyl4K1EgighBGA0qtF8xgbgcxtBkWICRan/vsPvMAK15XOM/59zN3zzZ/jqKRg53bjdU9GwcVijjn29Gux2O2lpaXXeVjst8I477sBms/H555/jdBrrg3Xv3p2hQ4fSu3dvHnroIRYuXBi8f1RUVPDcU6ZM4bnnnmPp0qX87ne/C7lOp06dgsf16tWLyy+/nPPPP5+bb76ZXbt2YTaHB5fXXHMNaWlpPPfccwBMmzaNv/zlL2zdupUBAwbg8XhITEzkgw8+4IILLgBg1apVWK1WTj/99JAZtLo8+eSTzJw5kyVLljB+/Pijfs+6dOnCqaeeyllnncX555/PK6+8wm9+85t6z308JLgSQgjRvqlef91UqfF/rWU/lRRCdACeSni6d9Pu+9VTxna0r49lxuEWabpRWFjIZ599xpw5c4KBVUBaWho33ngjb775JgsWLECpFdzpus4333zDzz//TN++fY95
LZPJxNSpUxk/fjw//vgjZ5xxRtgxo0aN4u9//3vw65UrV5KcnMyKFSsYMGAAP/zwA16vlxEjRgSP+fDDD7n88stRFKXe4Or+++9nwYIF/Oc//+H8888/5ngBzjvvPIYMGcK///3vFg2uJC1QCCFE+6LrRhBVehhyf4Yjm6B4n5H+J4GVEKKD+c9//kNMTExwu/baa+s8bseOHei6zsCBA+u8feDAgRQVFZGXlxfct2DBAmJiYrDb7YwcORJN07jrrrsaNK4BAwYARl1WXUaPHs2WLVvIy8ujqKiILVu2MHXqVFasWAHAihUrOP3004mKql6v7IMPPjhmSuAnn3zCk08+yQcffNDgwKrmmI823uYiM1dCCCHaPp/bn+pXYqw/dbzF6UKIE5vVCfftqv8YBSNl0GSpTh0MpAKabUZ64MjpRopgo67duMWPx4wZE5LKFx1d/6zXsdLparrxxht56KGHKCoqYubMmYwYMSJkJqkh16k9CxZw0kknkZSUxMqVK7HZbAwdOpTLLruMv/3tb4AxkzV69Ojg8Vu3buXw4cPHDJhOPvlk8vPzmTlzJmeccQYxMQ3vCKnr+lHH21wkuBJCCNH2BNqku/ypfqo70iMSQnQkigK2RgQ5CvDNfCOwGj0DRt9f3czCbINRf2ipkRIdHU2fPn2OeVyfPn1QFIWtW7eG1SCBEbwkJiaSkpIS3BcfHx8891tvvUWfPn0466yzgjVQ9dm6dStAsLlGbYqiMHLkSFasWIHdbmf06NGcfPLJuN1uNm3axKpVq7jvvvuCx3/44YeMHTsWh8NR73W7dOnCO++8w5gxY7jooov45JNPiI2NPeZ4A2M+2nibS0TTAlVV5ZFHHqFnz544nU569+7NY489FhJx19V+UlEUnnqq/tzWv/3tb/To0QOHw8GZZ57J6tWrW/rhCCGEOB6BNun5O6vbpFfmS2AlhIi8r+fByieMmaoRvzdm08+9xwi06uoiGAGdOnVi7NixLFiwgKqqqpDbcnJyWLx4Mddff/1RZ25iYmKYOnUq99133zFnvzRN469//Ss9e/Zk6NChRz1u1KhRrFixghUrVjB69GhMJhMjR47kqaeewu12c/bZZweP/eCDD7jyyisb9FgzMzNZuXIlOTk5XHTRRZSVlR3zPl9++SUbN27kF7/4RYOu0VQRDa6eeOIJFi5cyHPPPcfWrVt54oknePLJJ3n22WeDx9RuPfnSSy+hKEq935g333yTe+65h5kzZ7J27VqGDBnCuHHjyM3NbY2HJYQQoiFUn1EnVbQPcjZVt0n3lCEd/oQQbYqm+VMA7/F/rRpt3Uf8HkbdbzTWaUQ6Xkt57rnncLvdjBs3jq+++ooDBw7w6aefMnbsWLp06cKcOXPqvf9tt93G9u3beffdd0P2FxQUkJOTw+7du/nwww+54IILWL16NS+++GKdnQIDAnVXmzdv5pxzzgnuW7x4McOGDQumOObm5rJmzRouu+yyBj/Wbt26sWLFCnJzcxk3bhylpaXB29xuNzk5ORw6dIi1a9fypz/9iSuvvJLLLruMiRMnNvgaTRHRtMBVq1Zx5ZVXcumllwLQo0cP3njjjZBZptqtJz/44APGjBlDr169jnreefPmccsttzBlyhQAnn/+eT7++GNeeuklHnjggRZ4JEIIIRrEU1Gd6udtYNtiIYSItJH3Hf22s6cZ//dWGa3dTWZQzI1qt95c+vbty5o1a5g5cybXXXcdhYWFpKWlcdVVVzFz5kySkpLqvX9SUhITJ05k1qxZXH311cH9gTTBqKgoMjMzGTNmDC+88MIx0xUHDx5MQkIC/fr1C9ZGjR49GlVVQ+qtPvroI8444wySk5Mb9Xi7du3KihUrGDNmDOPGjeOzzz4D4NNPPyU9PR2LxUJiYiJDhgzhr3/9K5MmTcJkatm5pYgGVyNGjOCFF15g+/bt9OvXjw0bNvDNN98wb968Oo8/cuQIH3/8MYsWLTrqOT0eDz/++CMP
PvhgcJ/JZOKCCy7g22+/rfM+brcbt7s67SQQ+Xq9Xrxeb1MeWrMJXD/S4xCiqeQ5fIILtkkvMwIrvX118/OqWsj/hWhP5Plr8Kr+Ja7Q0Vp6Vjy4jpbib4bhD7Sa6KWXXjJOq9X9M3S5XERHR4fc3q1bt+D9woZX47gvv/yyznMvWLAg+O/u3bujqkdfdPlo46opPz8/5NiTTz45eM7Avvfff5/LL7885HyB1ERd14P76/p+pKen8/PPPwe/fumll476+Osbs6Zp6LqO1+sNm41rzHuYiAZXDzzwAKWlpQwYMACz2YyqqsyZM4cbb7yxzuMXLVpEbGxsSCRdW35+Pqqq0rlz55D9nTt3DvnG1zR37lxmz54dtv/zzz8PaQ8ZSUuXLo30EIQ4LvIcFu3Z0g2HIz0EIZrsRH/+WiwW0tLSKHf58GgdYxUin8/Hzp07WbVqFZMnTw5JiWuPhg0bxqWXXlrn42hIPVVz8Hg8VFVV8dVXX+HzhX4QWFlZ2eDzRDS4euutt1i8eDGvv/46WVlZrF+/nmnTppGRkcGkSZPCjn/ppZe48cYbj9lFpLEefPBB7rnnnuDXpaWldOvWjQsvvJC4uLhmvVZjeb1eli5dytixY7FarREdixBNIc/hE4DPU72Ar7eiQ7VJ96oaSzccZuyQDKzmjvGmTJw45PlrcPngQLmFGIcFhyPSf4eaZ0Zr/fr1nHfeeYwePZqpU6dG/P3q8XrkkUfC9um6TllZGbGxsS3ePh2MWUCn08nIkSPDYo3GBK8RDa6mT5/OAw88wA033AAYeZn79u1j7ty5YcHV119/zbZt23jzzTfrPWdycjJms5kjR46E7D9y5EhY/VaA3W7HbreH7bdarW3mzWBbGosQTSHP4Q5E06qDqdpt0k3B/3QoVrPphH5zKtq3E/35q+o6igImFEy0fh1UGF0DVQO81UGWqXE1WqeeemqjZlPao0D6nqIoLV4nBUYZkaIodb5facz7l4j+plVWVoZ9s8xmc525kC+++CKnnXYaQ4YMqfecNpuN0047jS+++CK4T9M0vvjiC4YPH948AxdCiBONpxLKjlS3SS/aI23ShRDieGmqsRixt8po76762kTXQdF0EZ25uvzyy5kzZw7du3cnKyuLdevWMW/ePG666aaQ40pLS3n77bd55pln6jzP+eefz/jx47nzzjsBuOeee5g0aRLDhg3jjDPOYP78+VRUVAS7BwohhDgG1Rc6O6VJQxIhRDumg97Wl3jQVEAFFTCZQLE0ekZLNN2x1vZqqIgGV88++yyPPPIIt99+O7m5uWRkZHDbbbfx6KOPhhy3ZMkSdF1nwoQJdZ5n165dwU4kANdffz15eXk8+uij5OTkcMopp/Dpp5+GNbkQQgjhp+tGN79AMCVt0oUQHYTVBKBTWeXC2cx1+y1G0wCPBFqtKJBmebwlDBENrmJjY5k/fz7z58+v97hbb72VW2+99ai37927N2zfnXfeGZzJEkIIUQefxx9M+Weo9KO32xVCiPbKbFJIsGnk5hkfxEc5HShtofaqKUymJtVotUeapuHxeHC5XC1ac6XrOpWVleTm5pKQkFDvosgNEdHgSgghRCvS9eqZKXcp+FyRHpEQQrSKtBig3Edubi5Gx75Ij6g5+DsPKqYOGWjpuk5VVRVOp7NVugUmJCQctfldY0hwJYQQHZnXVR1Meco7VJt0IYRoKEVRSI+FVE3D2+FeBhWwRoM91tjMHePtvdfr5auvvmLkyJEt3m3YarUe94xVQMf47gshhDBoaujslOqJ9IiEEKLNMJsUOmRXer0CXBXgOgK2GHAmgCMezO13CRSz2YzP58PhcLSrpVwkuBJCiPbOU1ljdqoC2npHLCGEEC1EB0+ZsZUcAFtshwi02hMJroQQor0Jtkn3N6LQfJEekRBCiLYoJNCKAUeCEWxJoNViJLgSQoi2LqRNeil4KyM9IiGEEO2Np9zYSg9WB1qOeLDYIj2yDkWCKyGE
aIuCbdJLwF0ubdKFEEI0n5qBljXanzqYIIFWM5DgSggh2gJNM/7QBVL9pE26EEKI1uCtMLbSQxJoNQMJroQQIlK8rupgStqkCyGEiLSQQCuqukbLYo/0yNoNCa6EEKK1aGp1MOUukzbpQggh2i5vpbGVHZZAqxEkuBJCiJbkqawxOyVt0oUQQrRDYYFWvBFsWR2RHlmbI8GVEEI0J9Vb3dVP2qQLIYToaIKBVjZYnNU1WhJoARJcCSHE8dF1fyMKf6qftEkXQghxovBVQVmVBFo1SHAlhBCN5fPUWMRX2qQLIYQQoYGWo7pGy+qM9MhalQRXQghxLJpmrHAfmJ2SNulCCCHE0flcUJ5jbCdYoCXBlRBC1MVbVR1MucuQRhRCCCFEE9QMtMz26tRBW1SkR9YiJLgSQgiQNumi2pqXwWSGUyeG37b2VeO5MmxK649LCCHaO9UN5UeMrYMGWhJcCSFOXJ4KI5BylfobUcjslMAIrNa8ZPx7yK+q96991dg/7KbIjEsIITqS2oGWI94ItmzRkR7ZcZHgSghx4pA26aIhAjNWa17CpOnAGEzr/gVrXzYCq7pmtIQQQjSd6oaKXGMz24zZLEtMpEfVJBJcCSE6rppt0l2lRicjIY6lqhg69YG0kzGvfZkreAUFHZJ6Q0U+fLfAaDlscRjthi2OWv92hu8320FRIv3IhBCi7VM9RpCl5Ud6JE0iwZUQomPxuf3BVIkRWOlapEck2jLNB4W74cgWOLIZcrdA6aGQQ5RAumjhLmNrEgUsdqNTlqWugMxR47Zax1lrBmz28MDO6gST/DkXQoi2QF6NhRDtW6BNusuf6qe6Iz0i0ZZVFhoBVCCQyttWd2v9hEwjNaVgBxomTGjQ9XRIHWQE8D6XMRPqdfn/7TI6TAZu81YZ/w82RtGrj2sJirlWEHaMGbWwY+qYiav5tWJqmXELIUQHI8GVEKL9CbRJd5Uas1PSiELURfNBwU5jVioQUJVlhx9nizaCptRB0DkLUgfC5vdgzUuop07hP/oYLlOWY177MqQNhrN+24gxqP6Aq47Aq2ZgFvy66ij76wrmXNULWOuq0aDFU9E837vazPZjpEAeZUbNYj9KYFfjNrNNUiaFEB2GBFdCiLZP9YXOTmneSI9ItEWVBUYAdWQL5G42ZqXCWuorkNgDOg+C1CwjmEroFjozU6MroDbkV7D2INrQX2M2KdVdBBva1MJkNloMt1SbYdUbGrDVDsgCAZvPZQR3YYFdIGBzVx9XM4ALXsftnxUuaf7HoJiOkibpCA/Y6kyVrCPoq5liKSmTQohWFNFXHFVVmTVrFq+99ho5OTlkZGQwefJkHn74YZQan2Jt3bqV+++/n5UrV+Lz+Rg0aBDvvvsu3bt3r/O8r7zyClOmhK5BYrfbcblaKB1DCNH8PBXVwZS3hT6NF+2X6oWCHaG1UuVHwo+zx9aYkRoEqQPAdowOVJpa3RVQrVGzFwioNLX5HsfxMluNzR7b/OfWNSM4rWtGra4UyJBgzh0a2NU1Gxf4kETXjKUQvJXQEj1nTNaj17bVN6MWlirpv61mYNcWG5XIOm1CRFREg6snnniChQsXsmjRIrKyslizZg1TpkwhPj6eu+66C4Bdu3ZxzjnncPPNNzN79mzi4uLYvHkzDoej3nPHxcWxbdu24NdKW3vxE0KEUr3+YMqf6idt0kVN5bn+1D7/rFT+duM5U5NigsSe/lkpf0AV363xb37re+N5IrVhrzmjRELzn1/zhQde9aVAhgR2/n3qUWbpvC6C6cKaF9z+ZRianVIjGKudKlnfjFrt/TVm6Wo2NDFbGz8kWadNiIiKaHC1atUqrrzySi699FIAevTowRtvvMHq1auDxzz00ENccsklPPnkk8F9vXv3Pua5FUUhLS2t+QcthGgegTbpgdkpaZMuAnxuf63U5upZqYq88OPscUYAFZiVShnQcul3ovmZLMYs4rFmEptC141Ztzrr2epIgaw5E9fU
RiUtkRwT1qikjtq22jNqtmjIPNtYp61oH9G2CzGt+xLWviLrtAnRCiIaXI0YMYIXXniB7du3069fPzZs2MA333zDvHnzANA0jY8//pg//OEPjBs3jnXr1tGzZ08efPBBrrrqqnrPXV5eTmZmJpqmceqpp/KnP/2JrKysOo91u9243dUdxkpLSwHwer14vZGt7QhcP9LjEKKpQp7DPk/1Ar7eCmmTLow3wRW5KLlbUHI3o+RuRSnYiVKrrk5XTJDUCy1lEHrqIPTOWRCbET4rpTbvc8rrP5+3mc8rWoFiBasVrLHgbOZzBxqV1Jg5U2rPrPncYfsUX43grvb9agSCSjM1KjHv+oIL+MIYckImuiUK7cg2SOplzHAJ0YZ5/S+7beE9cGPGoOi6HrE2W5qmMWPGDJ588knMZjOqqjJnzhwefPBBAHJyckhPTycqKorHH3+cMWPG8OmnnzJjxgyWL1/OqFGj6jzvt99+y44dOzj55JMpKSnh6aef5quvvmLz5s107do17PhZs2Yxe/bssP2vv/46UVHyKagQQjQXk+YhoXIPSRU7SazYRVLFThy+4rDj3JZYCqP7UBTVh8LovhRH9UQ121t/wEJEgKL5sGhuzJo7+H9zyNce4/+qG7PuxqLWvt04Jrl8K3UlxXpNTgpj+lIQ3Z+CmP4UR/VEMzUhBVGIE0RlZSW//OUvKSkpIS4urt5jIxpcLVmyhOnTp/PUU0+RlZXF+vXrmTZtGvPmzWPSpEkcPnyYLl26MGHCBF5//fXg/a644gqio6N54403GnQdr9fLwIEDmTBhAo899ljY7XXNXHXr1o38/PxjfgNbmtfrZenSpYwdOxarVV74RDvjLsWbv4elGw4xdkgGVrOslXNC0XUoz/HPSvm3gp3Vn8oHDlPM6J16GzNSqVnoKQMhNr1NNArwqhpLNxyW569od0zr/oV57cuoigWz7kPLOBVMFpQjm1C8lSHH6mYbespA9LTB6J0Ho6dmSYqtiDivZmLp+v1t4j1waWkpycnJDQquIpoWOH36dB544AFuuOEGAAYPHsy+ffuYO3cukyZNIjk5GYvFwqBBg0LuN3DgQL755psGX8dqtTJ06FB27txZ5+12ux27PfwTUavVGvEfZkBbGosQDeIqgdIDYDbeIFvNJnlz2tF5q4xGEzVrpaqKwo9zJoXUSikp/VAs9TcpijR5/op2Ze2rsPbl8HXaht0EF/0JCndD9gbI3gg5P6G4ilFyNkDOBuP+igmS+0LayZB+srG+myMhog9JnID8S2S0hffAjbl+RIOryspKTKbQP1ZmsxlNM5IsbTYbp59+ekjXP4Dt27eTmZnZ4OuoqsrGjRu55JJLjn/QQohjc5VC0V5kcd8OTNeh9FCNDn5bjCYUtevoTBbo1De0g19M5zYxKyVEh9TQddqS+8Hga43f5ZL9wUCL7A3GsgZ524xt49vGfRIy/YGWP+CK6Ry5xyhEGxbR4Oryyy9nzpw5dO/enaysLNatW8e8efO46abqNqHTp0/n+uuvZ+TIkcGaq48++ogVK1YEj5k4cSJdunRh7ty5APzxj3/krLPOok+fPhQXF/PUU0+xb98+fvOb37T2QxTixOMug6I90qyio/FWGm+0gov0bgFXcfhxUclGINU5y1ikN7mv0clMCNE6GrtOm6IYgVNCJgy8zNhXnlsdaOVsND4sK95nbFs/Mo6J6ewPtIZA+mCI7y4fmghBhIOrZ599lkceeYTbb7+d3NxcMjIyuO2223j00UeDx4wfP57nn3+euXPnctddd9G/f3/effddzjnnnOAx+/fvD5kBKyoq4pZbbiEnJ4fExEROO+00Vq1aFZZeKIRoZu5yI91EAqv2Tdeh5KCxnlRgXanCOgJmk9UInmq2Q49JjcyYhRCG5linLSYV+lxgbGB8kJKzsXp2K3+7Mbu1c6mxgZE2mDa4enarU29j5lqIE0xEG1q0VaWlpcTHxzeoaK2leb1e/vvf/3LJ
JZdEPN9UiHq5y6FwV9gbcK+q8d+1B7nk1K5Ss9JWeSogd2toip+7NPy46FR/IOVP8UvuC2Zb64+3FcnzV7RnLfb89VYarxWB2a3crTXW/vKzRhmvF4HZrZT+MostGsWrmfjvj3vbxHvgxsQG8pGCEOL4eSpkxqq90DUoPuAPpPxNJwr3EFYfZ7ZCcv/qGanOgyA6JSJDFkK0MdYo6DrM2MAIrPK2+YOtjcYsl7cCDv5gbGDMdKcOgDR/GmHnk4wFj4XoYCS4EkIcH08lFOwyFrsUbY+7DPJ+rg6kjmwBT3n4cbFpRo1U50HG/zv1NgIsIYQ4FrPNSAlMGwynYNR1Fe72pxJuMIKuqiLj65yNsB6jE1ynPv77+ZtkOBMj+ziEaAYSXAkhms5T6e8QJ4FVm6BrULSvRq3UFuPrsFkpu/EJcmBGKnUQRHWKyJCFEB2QyWykDSf3hZOuru4uGgi0sjdC2WGjdit/O2x617hffDd/zZZ/dismTZpkiHZHgishRNN4q/w1VhJYRYy7rDqIOrLZqHvwVoQfF5dR3QY9dZAUmgshWpeiQHxXYxtwqbGvIg+yf6qe3SraAyUHjO3nj41jolOMeq3A7FZiZnDtIyHaKvnrKoRoPK/LmLHSfJEeyYlDU412yDVrpYr3hx9ncUDKgNBaKUm1EUK0NdEp0Od8YwNjfcQjm6pnt/K2GwHYzmXGBmCP83ckHGIEW8l95IMi0ebIM1II0TgSWLUOV7ExExUIpHJ/Njp01RbftUat1CBI6ilvNoQQ7Y8jDjJHGBsY2RG5W/yzWz8Zs/TuUtj3P2MD48Okzif5UwkHG6+B0pFQRJj8BRZCNJzP7Q+svJEeScei+YyOfTVnpUoOhh9ndULqwBopfgONtWWEEKKjsTqhy2nGBqB6jfqsQLCVs9FoznNojbGB8cFSyoAas1sngS0mco9BnJAkuBJCNIwEVs2nqqh6cd4jW4xufj5X+HEJ3UNrpRJ7GIXiotnpOng1DZ+q41V1vKqGT9NweY2awuJKL9F2C3aLGVnuSogIMFurFyxngtHAp3CPv0GGP+CqLDBSC49sgg1vAAp06uVvkOGf3ZLmPaKFSXAlhDg2n8cIrGovEimOTfMZreoDs1JHthhdsmqzRhszUYFAKnWgkSYjmo1X0/H5NHyaETwZm45X01BVvXZPRQBUzdhbUOGhuMr4YMFiVnBYzNgtJmxWE3azGatZOpoJ0aoUk9Gcp1NvyBpvfEJSdtgItALBVukh4/W3YBds/rdxv/iu1a3f006G2HTpSCialQRXQoj6SWDVOJUF1etJHdlsLKypusOPS+xRoxV6lnTBagaqRjBoCgRQPn/w5PVpdQZPTeFTdcpVH+U1fqxmRcFuNWOzGIGXzWLCbpGfpxCtRlEgroux9b/Y2FdZUB1oZf9krL1VctDYtv3XOCYquTrQSj/ZeG2W12JxHCS4EkIcner1B1Z1BAfC//3Z5U/v89dKleWEH2eLqW440TnLqAmwx7b+eNs5TTNS97yahs+nB9P4fKqGR9XQmit6agJV16n0+Kj0ABgzXApgt5ixW03YLUbgZTebMcn7NiFaR1Qn6D3G2MC/fMWm6oArbxtU5sOuL40NjNfmzoONQCv9ZEjuJ02CRKPIs0UIUTcJrMJV5PuDKH96X/72Omb0FKNjXyCQ6jzIWBhTPgk9Jl2nRsqejk/zp+75jH/7Wil6enWjC5Oi8KuTwruOvbbJjabrTBzsOOZ5dMDlU3H5VGoGXDaLyT+zZaQWSh2XEK3EHgvdhxsbGLWuuVtrdCTcbARg+1cZGxgdCVMHVc9udR5k7BPiKCS4EkKEU31GYFVXk4UTheqB/B2hi/RW5IYfZ4+rTu8LzErZolt/vO2EVzNmmnxqaN2Tz78/gpNPQSZFYdEm40OFCYNswf2vbXKzaJObSXUEXQ2lA26fhtunUUb1cgZSxyVEBFgckDHU2MCokc3fUaNJxkaj/fvhtcYGoJghpX+N
uq3BkokgQkhwJYQIdaIGVuW51al9RzYbf2Brd0ZUTP5ZqazqxhPxXaUYuoZg3VONlD1vC9Q9taTAjNWiTW4qvRrDE2DxZjf/2uxh0kn2Ome0jpfUcQnRBpgs/uUuBsLJ1xsdCYv21Qi2NhgZDLn+D91+WkIwW6Fmk4zo5Eg/EhFBElwJIappKhTuAl9VpEfSsnxuI3iqWStVkR9+nCO+OpDqPMj4tNIa1frjbUN0HTxq26x7Ol6qprO3RGNTvsq+UpUoC7y9zcvbmAEP/ZNMpEQp7ClW6R5nwmxq2aC6IXVcdn/AJfG9EC0g8IFaUk8YdKW/I2GOP9jaYMxslRwwGmUU7oYt7xv3i8swgqxAwBXXRT6EO4FIcCWEMGiqMWPlrYz0SJqXrkN5To30vi1QsMNI/6hJMUGnPqEpfrEZJ9wfxJp1T8FOe4HUPbX16p5ag0fV2VagsjFPZVO+ypZ8HxV1LuNmPAe2FWpsW23M6DrM0DvRTL8kM/2STPRLMtM11oSphZ8vUsclRAQpCsSlG1u/cca+ykIjyArMbhXshNLDxrb9U+MYZ1KtjoQ9Zc3CDkyCKyGEP7Da1TECK58L8raHpvhVFYYf50ys0XQiy+gIZXW2/ngjoK425W2t7qkllLp1tuT72JRvBFQ7ClW8WugxTgsMSjYzOMVCdrnGZ3u8mBUdVVcY1MmMxQQ7ilSqfLA5X2Vzvhpy3z7BgMsIujJiWifgkjouISIkKgl6jTI2AE855Gyunt3K22b8Ddq9wtjAqMtNG1w9u5XS31gkWXQIElwJcaLTNCOdwVsR6ZE0nq5DWXZoIFWwC3Q19DjFDMl9QmulYtM67KyUqoFPM9L0atY9+TQNTzupe2oOuRWaf1bKx6Y8lb0lWtgxSQ6Fk1LMnJRiBFQ94410v9c2uflsj5dfZ9kYFlfJmtKoYM3VU+dFcbBMY3uhyvZC4/+7/AHXxjwjcAuItkLfkIDLTFq0gtIKz7366rgC6YRSxyVEM7PFQPczjQ2MNPS8n6vTCI9sAk8F7P/O2ADMdqPOq2ZHwhM8Bb09k+BKiBOZphk1Vp7ySI+kYbxVxh+pQHpf7haoKgo/LqqTP5Dyp/gl9wdL8zchiJRA3ZNPC7QpD8xEGUGUqp8o4VM1TdfZV6KxKU9lY54xO5VXGf596BZrCgZTJyVbSI8JD3RqdgWcMMjGtgOV3JhlD+ki+KuT7HSPM3NBD+M+qqZzIBhwGUHXrmKVCi+sz1VZn1sdcMXawgOu1KjWCbhC67gMUsclRAuy2CF9iLGBkZJesBOy/amEOT+BqwSy1xsbGGnqyf0hPTC7NdioARbtggRXQpyoAjNWbTWw0nUoPeQPovzrShXuMro31WSyQHLfGsFUFkSntvtZKa96YtQ9NZVH1dleqLLJXy+1Oc9Hea16KbMCfRNNZKVYGJxiJivZTILj2LM0mq4HuwKqNb7XgS6BWh3Bq9mk0CPeTI94Mxf2NPb5NCPg216osqPICLp2F2uUeWDtEZW1R6oDrni7YtRu1Qi6OjlbJ+CSOi4hWpHJYizZkTIATr7W+FtXvD+0SUb5Ecjbamw/vWXcL7GHv2ZriBFsxaRG9GGIo5PgSogTka5D0R7wlLXM+de8bBTrnjox/La1rxo1XsOmhO73VBqzUoEUv9wtxqd5tUWnVKf2dc4ymlC0w1mpuuqefP425h257qmpyj06m/3pfZvyVLbVUS/lsMCgTmZOSrFwUoqZAZ3MOC2ND07qWyC4MW3YLSaF3olmeieaudi/z6saHQmDM1xFKnuKNUrcOj9kq/yQXR1wJTkU+iaZ6ZdoCgZcSc7WiW6OVsdlNZuqUwr9M13WFu6aKESHpiiQmGlsAy839gU7Em402r8X74eivca29UPjmNg0SBtSPbsV363df6jYUUhwJcSJRteNGSt3actdw2SGNS8Z/x7yq+r9a1819p82xfhjEayV2mIEe7VnpcxWIzUi0MEvdVC7+bRO08ArdU9N
llthtETflFddL1X7e5boUDgp2Z/il2Khd0LLt0c/XlazETD1TTJzqX+fR9XZUxwacO0t0Sh06Xx/2Mf3h6vvn+xU6Jdk9qcVGkFXQ2bjmktg4Wep4xKiBcWmGVvfC42vq4qNGa3sDUbQVbDTCMDKcmDHZ8YxzsTqJhnpQyCpl3QkjJBGBVeaprFy5Uq+/vpr9u3bR2VlJSkpKQwdOpQLLriAbt26tdQ4hRDNITBj1ZKBFVTPWK15CZOmY1HPxLTy/2Dn5xDXFTa9Az++HH6/mM41OvgNMmalzLaWHWsT6TrV6XohdU/G/0/Euqem0nSd/aVG84nNeUZAdaSOeqmusSZOSjaT5W8+kVFHvVRzUoBoqzn475ZiMyv072Smf6fqN0Iun87u4uqGGTuKVPaXauRX6eQf8rHqUPVsUmqUElK/1TfRTJy99YJMqeMSooU5E6DnucYGRkOMI/6OhDkbq+uP93xlbADWaEjLqp7dShnQZv+edjQNCq6qqqp45plnWLhwIYWFhZxyyilkZGTgdDrZuXMn77//PrfccgsXXnghjz76KGeddVZLj1sI0ViBwKquVLuWcOpEKD2Mee3LXMLL1W9OSw8a/zfbjPazgVqp1EFtblV7b+0Zpxr/9qkSPDWVR9XZUaQGU/w25/so84QeY1KgT6KJk5ItwQYUia0wQ6MA0XYLMXYL0TYLqn82tUenaDRdx+XTcPtUXF4jsG4pDovCoGQLg2r8SlR5dXYVVzfM2F6kcrBUI7dSJ7fSxzcHqwOutOjwgCvG1nqRjdRxCdGCbNHQ7QxjA1A9/o6E/jTCnE1GB+ADq40NjEyQ1EHG7Fb6EONvr006EraEBgVX/fr1Y/jw4fzjH/9g7NixWK3hvfj37dvH66+/zg033MBDDz3ELbfccszzqqrKrFmzeO2118jJySEjI4PJkyfz8MMPh3wauXXrVu6//35WrlyJz+dj0KBBvPvuu3Tv3v2o53777bd55JFH2Lt3L3379uWJJ57gkksuacjDFaLj0XUjV7u1AiuAPV/Dzi8A402VDih9LqhO8evUxyjsjaDadU/GrJMudU/NrCJQL5VfXS/lqdUt32GGgclmf5qfhYGdzDitrRMMKIDTZiHWbiHabgl5s6/6x2kygd1sxmkzA8bfQJ+m4/KpuL0aLq+Gy+ujJXuNOK2Kv56s+vemwquzs0gN6VJ4uFwjp0Inp8LHVweqA64usSb6JZqMOi5/wBXVSt9jkDouIVqM2Va9ZhY3GnXNhbur0wizfwJXsfF19gZY95rRkbBTH3+DDH9HQmdChB9Ix9Cgdzaff/45AwcOrPeYzMxMHnzwQe677z7279/foIs/8cQTLFy4kEWLFpGVlcWaNWuYMmUK8fHx3HXXXQDs2rWLc845h5tvvpnZs2cTFxfH5s2bcTiOXnC8atUqJkyYwNy5c7nssst4/fXXueqqq1i7di0nnXRSg8YmRIdSvM94YW0t2z+DlU8Ea6g0xYxJVyGhO5x0dasNI1D35A20LPd32gvUjUjTvZaRX6n5F+o16qX2FIcHqgl2xZ/eZ7RE751owtLKb6idVjMxdgsxDkuTrm0xKcTYLMTUyLRx+zQj2PKpuL0q7haur4u2KgxJtTAktfrPeblHD3YnDGw5FTqHyjQOlWks328ENgrQNS7QodCo3+qd2LQmIMdD6riEaGYms9FFN7kvDL7G+IC15EB1oJXzk1Gvlb/d2Da+bdwvIdNfs+Wf3YrpHNnH0U4puh65woDLLruMzp078+KLLwb3/eIXv8DpdPLaa68BcMMNN2C1WvnXv/7V4PNef/31VFRU8J///Ce476yzzuKUU07h+eefP+b9S0tLiY+Pp6SkhLi4uEY8oubn9Xr573//yyWXXFLnjKEQx1S0z1gdvrVs+jes+mvwS3XoJP7D+VymLMe89mUYdlPdXQSboGbdk0+tfpMmdU+tR/fXSwVaom/K85FTEf59z4jxry+VbARUXWJNrdJm
vDZHjYAqfHZEAYvD6D5pcYDVgVcz899ly7nk3NOw6m7wuYwUnAbSNIw0Qp+Gy2ukyUUipbTUrQXrt7YXqewoVMmto67NpED3OFONdbhM9E4wY2/lgKsuCtQIuKSOqyG8qsZ/1x7kklO7YpX8S1FTeW5osFW0N/yYmM7VaYRpJxsfkLbiL5xXM/HfH/e2iffAjYkNmpyT4/P5+Pvf/86KFStQVZWzzz6bO+64o94ZpdpGjBjBCy+8wPbt2+nXrx8bNmzgm2++Yd68eYDRQOPjjz/mD3/4A+PGjWPdunX07NmTBx98kKuuuuqo5/3222+55557QvaNGzeO999/v87j3W43bnf1R2alpUaxv9frxev11nmf1hK4fqTHIdqpkoOtF1jpOqYNizH/+FJwl3rqZNwn/wo2HMZ98o3YAfOal1A1HW3orxt0Wq+mo/pC655UzUjjU1Ukda+VeVWdncWav1ZKZUu+Sqkn9KdgUqBXgr/5hH+r3UJc0zGi41bgsChE261E2y1YzYqRDmO247XYqwMpswMstrA3DsHXYEcSBP64qz7wVRmLWvtcxv9Vd+3LBlnMCjFmMzF2o2GFV9ONWS2v0TnS5VNbfBY12qowtLOZoZ2rm2YUuzR2FAUaZmjsKFIpqDJaxe8t0Vi613jsJgUy40zBtvB9k8z0jDdhM7d+VONz+6io8a026rgUf2qh2T/LJXVcAYG6wJasDxTtlDMZep5nbACuEpQjm1ByfjK2gh0o5Udg5xHYuQwA3ZGA3vkk9LTBaGlDIKl3i3YkDCy30RbeAzdmDE2eubr99tvZvn07V199NV6vl1dffZV+/frxxhtvNPgcmqYxY8YMnnzyScxmM6qqMmfOHB588EEAcnJySE9PJyoqiscff5wxY8bw6aefMmPGDJYvX86oUaPqPK/NZmPRokVMmDAhuG/BggXMnj2bI0eOhB0/a9YsZs+eHbb/9ddfJypKiv2EOCZdZ9DhJfTN/QSA/OgB5MUOYnv6VWGH9st5H0XX2JbeeumBoulcPthTrrC7VGF3mcK+MvDqoW+qrSadHjE6vWKhV5xOj1gdh3QAbpdKPHCgQuFAucL+cthfoVDuDQ+izIpOehR0j9bpFqPTLdr4WjL3hOgYzKqLpIqddKrYRlL5dpIqdmLWQwMMn8lBYXQfCmIGUBDTj6KoXmimjtmRsLKykl/+8pfNO3P13nvvMX78+ODXn3/+Odu2bcNsNv6Cjhs3rtFdAt966y0WL17M66+/TlZWFuvXr2fatGlkZGQwadIkNM0IWa+88kruvvtuAE455RRWrVrF888/f9TgqrEefPDBkJmu0tJSunXrxoUXXtgm0gKXLl161EYiQtSp9DBU5rfOtTQV86r5mPyBlXrm7cSfdA3xQB/A49VYtvEwZw9MRdcVfP1uQVV1TtaM2Sipe2pbCqo0NuerwZmpPSXhtWnxdiU4I5WVbKZPBOqlQihmNLMdi81JTEw0cTHR2B3RRnes43Rcr8G6Dj539SyX12X8W1ePeVdNA48/ndDtVXGpKr5j361ZnFHj37quk18VqOHS/M0zNEo9cLACDlYokGscazVBzwSjhqtvopm+SSYy49rO2mNmxWgPb6tRx2Xr4NGgV9VYuuEwY4dkSFqgaII+wEUAaKoHPX87Ss5GY2YrdxMWTwWpZZtILdsEgG6yoqcMQE8bjN55MHrnLLDFNPnqXs3E0vX728R74EBWW0M0OLh66aWXWLRoEQsWLCAjI4NTTz2V3/72t/ziF7/A6/Xyj3/8g9NPP71RA50+fToPPPAAN9xwAwCDBw9m3759zJ07l0mTJpGcnIzFYmHQoEEh9xs4cCDffPPNUc+blpYWNkN15MgR0tLS6jzebrdjt9vD9lut1oj/MAPa0lhEG1dyCNyFtEpejOqFlX+C3cuNVKtz78M84BICkxblHh/ZJVUA5Jd7w95kKYpCBDKLhJ+u6xwo04It0Tfl+ciuo14qPTrQpc5oid4t
/vdkZmayadMm7r//fubPn8+4ceNC55WXl3Po0CEKCgp48cUX+e9//8vatWtJSUmp1/scOXKEDh06MH/+fG6//fYaz9e2cpWenk5BQcFJf4Cnms/nY+nSpQzv3w1bcmc1rpBWJzSHhw/HZrOFOxyRBtH8ldZM81das5Y0f0tKSkhKSqpXchXWssDJkyczZcoUbrzxRgB69+7Nnj17mDNnTrXkKjIykq5du9K1a1cGDx5Mt27dWLBgQY0VrrrExcXRvXt3duzYUevzDocDh6Pm/lA2my3sv8xjbCldsdnt4Q5DpNFa0t8nkYbS/JXWTPNXWrOWMH8b8v5hbcVeUVGB2Vw9BIvFQiAQOOHrAoFAtZWmkykrKyM7O5u2bds2Ks4WwWQKdwQiIiIiInICYU2uRo8ezezZs3n33XfZvXs3b7zxBvPnz+eaa64BguWADz/8MGvWrGHPnj2sX7+e2267jQMHDvCzn/0sNM4VV1zBn//859Dj3/72t3zyySfs3r2bVatWcc0112CxWLjppptO+zWKiIiIiMjZIaxlgc888wzTpk3j3nvvJS8vj7S0NO6++26mT58OBFexvvvuOxYtWkRBQQGJiYkMGDCAFStWkJmZGRonOzubgoKC0OP9+/dz0003UVhYSHJyMhdddBFr1qw54b5YIiIiIiIiTRHW5Co6OpqsrCyysrJqfd7pdPLvf//7pOPs3r272uPFixc3Q3QiIiIiIiL1F9ayQBERERERkTOFkisREREREZFmoORKRERERESkGSi5EhERERERaQZKrkRERERERJqBkisREREREZFmoORKRERERESkGSi5EhERERERaQZKrkRERERERJqBNdwBtESGYQBQUlIS5kjA5/NRUVFBSUkJNpst3OGINJjmsLRmmr/Smmn+SmvWkubvsZzgWI5wIkqualFaWgpAenp6mCMREREREZGWoLS0lNjY2BOeYzLqk4KdZQKBAAcPHiQ6OhqTyRTWWEpKSkhPT2ffvn3ExMSENRaRxtAcltZM81daM81fac1a0vw1DIPS0lLS0tIwm098V5VWrmphNptp3759uMOoJiYmJuwTS6QpNIelNdP8ldZM81das5Yyf0+2YnWMGlqIiIiIiIg0AyVXIiIiIiIizUDJVQvncDiYMWMGDocj3KGINIrmsLRmmr/Smmn+SmvWWuevGlqIiIiIiIg0A61ciYiIiIiINAMlVyIiIiIiIs1AyZWIiIiIiEgzUHIlIiIiIiLSDJRctQDPPvssHTt2xOl0MmjQID7//PM6z33xxRe5+OKLiY+PJz4+nmHDhp3wfJFTrSHz93iLFy/GZDJx9dVXn9oARU6ioXP4yJEj/OpXv6Jt27Y4HA66d+/Oe++9d5qiFamuofM3KyuLHj164HK5SE9P54EHHsDtdp+maEWCPv30U0aPHk1aWhomk4k333zzpK9Zvnw5/fr1w+Fw0LVrVxYuXHjK42wMJVdhtmTJEiZNmsSMGTPYsGEDffv2ZeTIkeTl5dV6/vLly7npppv4+OOPWb16Nenp6YwYMYIDBw6c5shFGj5/j9m9eze//e1vufjii09TpCK1a+gc9nq9DB8+nN27d/P666+zdetWXnzxRdq1a3eaIxdp+Px95ZVXmDJlCjNmzODbb79lwYIFLFmyhIcffvg0Ry5nu/Lycvr27cuzzz5br/N37drFqFGjuOyyy9i0aRP3338/d9xxBx988MEpjrQRDAmrgQMHGr/61a9Cj/1+v5GWlmbMmTOnXq+vqqoyoqOjjUWLFp2qEEXq1Jj5W1VVZQwdOtT429/+ZowbN84YM2bMaYhUpHYNncPPPfec0blzZ8Pr9Z6uEEXq1ND5+6tf/cq4/PLLqx2bNGmSceGFF57SOEVOBDDeeOONE57z4IMPGpmZmdWO3XDDDcbIkSNPYWSNo5WrMPJ6vaxfv55hw4aFjpnNZoYNG8bq1avrNUZFRQU+n4+EhIRTFaZIrRo7fx977DFS
UlK4/fbbT0eYInVqzBx+6623GDJkCL/61a9o06YN5557Lk888QR+v/90hS0CNG7+Dh06lPXr14dKB3fu3Ml7773HVVdddVpiFmms1atXV5vrACNHjqz35+XTyRruAM5mBQUF+P1+2rRpU+14mzZt+O677+o1xkMPPURaWlqNCSdyqjVm/q5cuZIFCxawadOm0xChyIk1Zg7v3LmT//73v9xyyy2899577Nixg3vvvRefz8eMGTNOR9giQOPm780330xBQQEXXXQRhmFQVVXFPffco7JAafFycnJqneslJSVUVlbicrnCFFlNWrlqxZ588kkWL17MG2+8gdPpDHc4IidUWlrK2LFjefHFF0lKSgp3OCKNEggESElJ4YUXXqB///7ccMMN/O53v+P5558Pd2giJ7V8+XKeeOIJ/vKXv7Bhwwb+/e9/8+677zJr1qxwhyZyxtDKVRglJSVhsVjIzc2tdjw3N5fU1NQTvvYPf/gDTz75JB999BF9+vQ5lWGK1Kqh8zc7O5vdu3czevTo0LFAIACA1Wpl69atdOnS5dQGLXKcxvwb3LZtW2w2GxaLJXTsnHPOIScnB6/Xi91uP6UxixzTmPk7bdo0xo4dyx133AFA7969KS8v56677uJ3v/sdZrO+c5eWKTU1tda5HhMT06JWrUArV2Flt9vp378/y5YtCx0LBAIsW7aMIUOG1Pm6efPmMWvWLN5//30uuOCC0xGqSA0Nnb89e/bkq6++YtOmTaE/P/3pT0Odf9LT009n+CKN+jf4wgsvZMeOHaEvBgC2bdtG27ZtlVjJadWY+VtRUVEjgTr2RYFhGKcuWJEmGjJkSLW5DrB06dITfl4Om3B31DjbLV682HA4HMbChQuNb775xrjrrruMuLg4IycnxzAMwxg7dqwxZcqU0PlPPvmkYbfbjddff904dOhQ6E9paWm4LkHOYg2dvz+kboESbg2dw3v37jWio6ONCRMmGFu3bjXeeecdIyUlxXj88cfDdQlyFmvo/J0xY4YRHR1tvPrqq8bOnTuNDz/80OjSpYvx85//PFyXIGep0tJSY+PGjcbGjRsNwJg/f76xceNGY8+ePYZhGMaUKVOMsWPHhs7fuXOnERERYUyePNn49ttvjWeffdawWCzG+++/H65LqJPKAsPshhtuID8/n+nTp5OTk8N5553H+++/H7ppb+/evdW+ZXruuefwer1cf/311caZMWMGjz766OkMXaTB81ekpWnoHE5PT+eDDz7ggQceoE+fPrRr14777ruPhx56KFyXIGexhs7fRx55BJPJxCOPPMKBAwdITk5m9OjRzJ49O1yXIGepdevWcdlll4UeT5o0CYBx48axcOFCDh06xN69e0PPd+rUiXfffZcHHniAp59+mvbt2/O3v/2NkSNHnvbYT8ZkGFoHFhERERERaSp9pSwiIiIiItIMlFyJiIiIiIg0AyVXIiIiIiIizUDJlYiIiIiISDNQciUiIiIiItIMlFyJiIiIiIg0AyVXIiIiIiIizUDJlYiIiIiISDNQciUiIqfE8uXLMZlMHDlyJNyhALBw4ULi4uIa9BqTycSbb755wnMKCwtJSUlh9+7djY6tudQn3oYaPHgw//rXv5p1TBGRM5WSKxERqSE/P59f/vKXZGRk4HA4SE1NZeTIkXz22WfhDq3FmT17NmPGjKFjx47hDuWUeOSRR5gyZQqBQCDcoYiItHhKrkREpIbrrruOjRs3smjRIrZt28Zbb73FpZdeSmFhYbhDa1EqKipYsGABt99+e7hDOWV+/OMfU1payn/+859whyIi0uIpuRIRkWqOHDnCihUrmDt3LpdddhkdOnRg4MCBTJ06lZ/+9KcA7N69G5PJxKZNm6q9zmQysXz58mrjffbZZ/Tp0wen08ngwYPZsmVL6LljpXrvvPMOPXr0ICIiguuvv56KigoWLVpEx44diY+PZ+LEifj9/tDrioqK+MUvfkF8fDwRERH8+Mc/Zvv2
7dXed+HChWRkZBAREcE111xTa2L4f//3f/Tr1w+n00nnzp2ZOXMmVVVV9f5ZvffeezgcDgYPHhw6dqwc8t13363zun/o4YcfZtCgQTWO9+3bl8ceewyAL774guHDh5OUlERsbCyXXHIJGzZsqHPM2soyN23ahMlkqlbCuHLlSi6++GJcLhfp6elMnDiR8vLy0PMWi4WrrrqKxYsX1+dHIiJyVlNyJSIi1URFRREVFcWbb76Jx+Np8niTJ0/mj3/8I1988QXJycmMHj0an88Xer6iooI//elPLF68mPfff5/ly5dzzTXX8N577/Hee+/xj3/8g7/+9a+8/vrrodfceuutrFu3jrfeeovVq1djGAZXXXVVaNy1a9dy++23M2HCBDZt2sRll13G448/Xi2uFStW8Itf/IL77ruPb775hr/+9a8sXLiQ2bNn1/vaVqxYQf/+/Rt13ce75ZZb+Pzzz8nOzg4d+/rrr/nyyy+5+eabASgtLWXcuHGsXLmSNWvW0K1bN6666ipKS0vrHe8PZWdnc+WVV3Ldddfx5ZdfsmTJElauXMmECROqnTdw4EBWrFjR6PcRETlrGCIiIj/w+uuvG/Hx8YbT6TSGDh1qTJ061di8eXPo+V27dhmAsXHjxtCxoqIiAzA+/vhjwzAM4+OPPzYAY/HixaFzCgsLDZfLZSxZssQwDMP4+9//bgDGjh07QufcfffdRkREhFFaWho6NnLkSOPuu+82DMMwtm3bZgDGZ599Fnq+oKDAcLlcxmuvvWYYhmHcdNNNxlVXXVXtmm644QYjNjY29PiKK64wnnjiiWrn/OMf/zDatm0begwYb7zxRp0/pzFjxhi33XZbtWP1ue7a9O3b13jsscdCj6dOnWoMGjSozvP9fr8RHR1tvP3227XGeyyOoqKi0PMbN240AGPXrl2GYRjG7bffbtx1113Vxl2xYoVhNpuNysrK0LH/+7//M8xms+H3++uMR0REDEMrVyIiUsN1113HwYMHeeutt7jyyitZvnw5/fr1Y+HChQ0ea8iQIaH/TkhIoEePHnz77behYxEREXTp0iX0uE2bNnTs2JGoqKhqx/Ly8gD49ttvsVqt1croEhMTq4377bff1iizOz4OgM2bN/PYY4+FVuqioqK48847OXToEBUVFfW6tsrKSpxOZ4Ov+/j3vOeee4Dg6tUrr7wCgGEYvPrqq9xyyy2hMXJzc7nzzjvp1q0bsbGxxMTEUFZWxt69e+sVa202b97MwoULq8UzcuRIAoEAu3btCp3ncrkIBALNspIpInIms4Y7ABERaZmcTifDhw9n+PDhTJs2jTvuuIMZM2Zw6623YjYHv5szDCN0fl0lbydjs9mqPTaZTLUea+5udWVlZcycOZNrr722xnN1JUw/lJSURFFRUYPf+/h71WJiYgC46aabeOihh9iwYQOVlZXs27ePG264IXTeuHHjKCws5Omnn6ZDhw44HA6GDBmC1+ut9T3q8zsqKyvj7rvvZuLEiTVen5GREfrvw4cPExkZicvlavC1ioicTZRciYhIvfTq1Su0h1JycjIAhw4d4vzzzweqJwzHW7NmTeiDelFREdu2beOcc85pdBznnHMOVVVVrF27lqFDhwLBvaa2bt1Kr169QuesXbu2RhzH69evH1u3bqVr166NjuX888/nn//8Z63Pnei6a3vP9u3bc8kll/Dyyy9TWVnJ8OHDSUlJCT3/2Wef8Ze//IWrrroKgH379lFQUFBnbMf/juLj44Gav6N+/frxzTffnPRnsGXLltDvWURE6qbkSkREqiksLORnP/sZt912G3369CE6Opp169Yxb948xowZAwTLxAYPHsyTTz5Jp06dyMvL45FHHql1vMcee4zExETatGnD7373O5KSkrj66qsbHV+3bt0YM2YMd955J3/961+Jjo5mypQptGvXLhTfxIkTufDCC/nDH/7AmDFj+OCDD3j//ferjTN9+nR+8pOfkJGRwfXXX4/Z
bGbz5s1s2bKlRvOLuowcOZKpU6dSVFQUSmCact233HILM2bMwOv18tRTT9W47n/84x9ccMEFlJSUMHny5BOuJHXt2pX09HQeffRRZs+ezbZt2/jjH/9Y7ZyHHnqIwYMHM2HCBO644w4iIyP55ptvWLp0KX/+859D561YsYIRI0bU62ciInI20z1XIiJSTVRUFIMGDeKpp57iRz/6Eeeeey7Tpk3jzjvvrPaB+6WXXqKqqor+/ftz//3315mQPPnkk9x3333079+fnJwc3n77bex2e5Ni/Pvf/07//v35yU9+wpAhQzAMg/feey9UTjh48GBefPFFnn76afr27cuHH35YI/kbOXIk77zzDh9++CEDBgxg8ODBPPXUU3To0KHecfTu3Zt+/frx2muvNct1X3/99RQWFlJRUVEjEVuwYAFFRUX069ePsWPHMnHixGorWz9ks9l49dVX+e677+jTpw9z586t8Tvq06cPn3zyCdu2bePiiy/m/PPPZ/r06aSlpYXOOXDgAKtWrWL8+PH1+ImIiJzdTMbxxdgiIiLSIO+++y6TJ09my5YtmM1mli9fzmWXXUZRURFxcXHhDq/JHnroIYqKinjhhRfCHYqISIunskAREZEmGDVqFNu3b+fAgQOkp6eHO5xml5KSwqRJk8IdhohIq6CVKxERkWZ0pq1ciYhI/Sm5EhERERERaQZqaCEiIiIiItIMlFyJiIiIiIg0AyVXIiIiIiIizUDJlYiIiIiISDNQciUiIiIiItIMlFyJiIiIiIg0AyVXIiIiIiIizUDJlYiIiIiISDP4/59mHQc4L3PiAAAAAElFTkSuQmCC",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "grouped_df = df.groupby([\"kd\", \"global_round\", \"p\"])\n",
+ "df_mean = grouped_df[[\"loss\", \"accuracy\"]].mean()\n",
+ "df_std = grouped_df[[\"loss\", \"accuracy\"]].std()\n",
+ "\n",
+ "df_plot = df_mean.merge(\n",
+ " df_std, left_index=True, right_index=True, suffixes=(\"_mean\", \"_std\")\n",
+ ")\n",
+ "df_plot = df_plot.loc[:, 500, :]\n",
+ "grouped_df = df_plot.reset_index().groupby(\"kd\")\n",
+ "\n",
+ "plt.figure(figsize=(10, 4))\n",
+ "for i, (group_name, group_data) in enumerate(grouped_df):\n",
+ " label = \"FjORD w/ KD\" if group_name else \"FjORD\"\n",
+ " plt.plot(group_data.p, group_data.accuracy_mean * 100, label=label, marker=\"x\")\n",
+ " plt.fill_between(\n",
+ " group_data.p,\n",
+ " (group_data.accuracy_mean - group_data.accuracy_std) * 100,\n",
+ " (group_data.accuracy_mean + group_data.accuracy_std) * 100,\n",
+ " alpha=0.2,\n",
+ " )\n",
+ "\n",
+ "plt.legend()\n",
+ "plt.grid()\n",
+ "plt.title(\"ResNet18 - CIFAR10 - 500 global rounds\")\n",
+ "plt.xlabel(\"Submodel (p-value)\")\n",
+ "plt.ylabel(\"Accuracy (%)\")\n",
+ "plt.xticks(np.linspace(0.2, 1, 5))\n",
+ "\n",
+ "plt.savefig(\n",
+ " \"../_static/resnet18_cifar10_500_global_rounds_acc_pvalues.png\",\n",
+ " dpi=300,\n",
+ " bbox_inches=\"tight\",\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "False\n",
+ "True\n"
+ ]
+ },
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAA04AAAGJCAYAAAC90mOkAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8qNh9FAAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOzdd5hVxf348fecc/u92ytLl95BkCIIi1JERLEnJmILilFjYtTE2BDbLyoRv3ajERXUBKNo1ChYEBEQUFB6h6Vs73f39jO/PxY2rtRdFhbk83qe++gpM+dzzs7y3M/OnBmltdYIIYQQQgghhDggo6kDEEIIIYQQQojjnSROQgghhBBCCHEIkjgJIYQQQgghxCFI4iSEEEIIIYQQhyCJkxBCCCGEEEIcgiROQgghhBBCCHEIkjgJIYQQQgghxCFI4iSEEEIIIYQQhyCJkxBCCCGEEEIcgiROQgghhDik6dOno5Ri27ZtTR2KEEI0CUmchBCiHvZ+edz7sdlsNG/enKuuuopdu3YdtetOnjwZpRQZGRlUV1fvc7xNmzace+65Dar72WefZfr06fs99tBDD3HeeeeRkZGBUorJkycfsJ5PP/2U4cOHk5qaSmJiIv379+f1119vUEyH491332XMmDGkpqbicDjIysri0ksv5fPPP689Z968eSilePvtt2v3/fRn+OPPn//85zrXePbZZ1FKMWDAgAPG8dM64uPjGTZsGB9++OE+5/r9fu677z7OPvtskpOTUUod8NkDrF27lrPPPhufz0dycjJXXHEFhYWF9XhKh6dNmzYHfCbBYPCgZbXWvP766wwdOpTExEQ8Hg89evRgypQpVFVV7XN+dnZ2nfrdbjc9e/Zk2rRpWJZV59xt27bVOddut5Oamsrpp5/OX/7yF3Jychr1OQghxMHYmjoAIYQ4EU2ZMoW2bdsSDAZZvHgx06dPZ8GCBaxatQqXy3XUrltQUMBzzz3HH//4x0ar89lnnyU1NZWrrrpqn2N33303mZmZ9OnTh08++eSAdbz//vuMHz+eQYMG1SZ5//rXv5gwYQJFRUX84Q9/aLR4tdZcc801TJ8+nT59+nDrrbeSmZlJbm4u7777LmeddRZff/01p59++kHr2fsz/LHu3bvX2Z45cyZt2rRhyZIlbNq0ifbt2++3rpEjRzJhwgS01mzfvp3nnnuOcePG8d///pfRo0fXnldUVMSUKVNo1aoVvXr1Yt68eQeMb+fOnQwdOpSEhAQefvhh/H4/jz/+OCtXrmTJkiU4HI5DPKn66d27937b1d7rXHHFFfziF7/A6XTWHovFYlx++eX861//4owzzmDy5Ml4PB6++uor7r//fmbNmsWnn35KRkZGnTpbtGjBI488AtQ8kzfeeIM//OEPFBYW8tBDD+0Twy9/+UvOOeccLMuitLSUpUuXMm3aNJ588klefvllfvGLXzTmoxBCiP3TQgghDtsrr7yiAb106dI6+//0pz9pQP/zn/88Kte97777NKB79+6tMzIydHV1dZ3jrVu31mPHjm1Q3d26ddPDhg3b77GtW7dqrbUuLCzUgL7vvvv2e97IkSN1VlaWDgaDtfsikYhu166d7tmzZ4PiOpDHHntMA/r3v/+9tixrn+Ovvfaa/uabb7TWWn/xxRca0LNmzao9fqCf4U9t2bJFA/qdd97RaWlpevLkyfs9D9A33nhjnX1r1qzRgB4zZkyd/cFgUOfm5mqttV66dKkG9CuvvLLfem+44Qbtdrv19u3ba/fNnTtXA/qFF144aOz11dD28/DDD2tA33bbbfsce//997VhGPrss8+us3/YsGG6W7dudfYFAgHdunVrHRcXp6PRaO3+rVu3akA/9thj+9S/bds23bFjR+1wOPSKFSvqHbsQQtSXDNUTQohGcMYZZwCwefPmOvvXrVvHxRdfTHJyMi6Xi379+vH+++/XOScSiXD//ffToUMHXC4XKSkpDBky
hLlz5+5znXvvvZf8/Hyee+65Q8ZkWRbTpk2jW7duuFwuMjIyuP766yktLa09p02bNqxevZovv/yydjhUdnZ2neOHo6KigqSkpDq9ETabjdTUVNxu92HVcTgCgQCPPPIInTt35vHHH0cptc85V1xxBf379z/ia82cOZOkpCTGjh3LxRdfzMyZMw+7bJcuXUhNTd2nPTidTjIzMw+rjn//+9+ce+65tGrVqnbfiBEj6NixI//6178OO5bG8tN3nAKBAI899hgdO3as7T36sXHjxnHllVfy8ccfs3jx4oPW7XK5OO2006isrKSgoOCw4mndujXTp08nHA7z6KOP1vt+hBCiviRxEkKIRrD3y2RSUlLtvtWrVzNw4EDWrl3Ln//8Z6ZOnYrX62X8+PG8++67tedNnjyZ+++/n+HDh/P0009z11130apVK7777rt9rnPGGWdw5pln8uijjxIIBA4a0/XXX8/tt9/O4MGDefLJJ7n66quZOXMmo0ePJhKJADBt2jRatGhB586def3113n99de566676n3/2dnZrF69mnvuuYdNmzaxefNmHnjgAZYtW8Ydd9xR7/oOZMGCBZSUlHD55ZdjmuYR1VVeXk5RUVGdz4/NnDmTCy+8EIfDwS9/+Us2btzI0qVLD7vu0tLSOu2hPnbt2kVBQQH9+vXb51j//v1Zvnx5g+o9mEgkss/z2N/7dHstWLCA0tJSLr/8cmy2/Y/8nzBhAgAffPDBIa+/932mxMTEw4550KBBtGvXbr9/ZBBCiMYm7zgJIUQD7P3SHQwG+eabb7j//vtxOp11Jmi45ZZbaNWqFUuXLq3tifntb3/LkCFD+NOf/sQFF1wAwIcffsg555zDiy++eFjXvu+++xg2bBjPP//8Ad8dWrBgAS+99BIzZ87k8ssvr90/fPhwzj77bGbNmsXll1/O+PHjufvuu0lNTeXXv/51Qx8H99xzD1u3buWhhx7iwQcfBMDj8fDvf/+b888/v8H1/tTatWsB6NGjxxHXNWLEiH32aa0B+Pbbb1m3bh1PPfUUAEOGDKFFixbMnDmT0047bZ9ywWCQoqIitNbk5ORw9913E4vFuPjiixsUW25uLgDNmjXb51izZs0oKSkhFArV6eE7UnPmzCEtLa3Ovvvuu++AE4KsWbMGgF69eh2wzr3H9v7c9orFYrWJanFxMS+//DLLli1j7Nix9e6h7N69O++99x4VFRXEx8fXq6wQQtSHJE5CCNEAP/3S3aZNG2bMmEGLFi0AKCkp4fPPP2fKlClUVlZSWVlZe+7o0aO577772LVrF82bNycxMZHVq1ezceNGOnTocMhrDx06lOHDh/Poo48yadKk/X7RnDVrFgkJCYwcObJOT0rfvn3x+Xx88cUXdRKqI+V0OunYsSMXX3wxF154IbFYjBdffJFf//rXzJ07l4EDBzbKdSoqKgCIi4s74rqeeeYZOnbsuN9jM2fOJCMjg+HDhwM1M+dddtllzJgxg6lTp+7T2/Xyyy/z8ssv127b7XbuuOMObr311gbFtrc3cX+J0d7JRwKBQKMmTgMGDKhNevc65ZRTDnj+3jZ9sJ/F3mN7f257rVu3bp8k7bzzzqvzDA+Xz+erjUcSJyHE0SSJkxBCNMDeL93l5eX84x//YP78+XW+xG7atAmtNffccw/33HPPfusoKCigefPmTJkyhfPPP5+OHTvSvXt3zj77bK644gp69ux5wOtPnjz5oL1OGzdupLy8nPT09ANeuzHddNNNLF68mO+++w7DqBkFfumll9KtWzduueUWvvnmmwOWLSkpIRwO12673W4SEhL2e+7eL8Y/TkQbqn///vsdCheLxXjrrbcYPnw4W7durd0/YMAApk6dymeffcaoUaPqlDn//PO56aabCIfDLF26lIcffpjq6uraZ1Ffe5PhUCi0z7G904MfrGemsLCQWCxWu+3z+WoTjANJTU3dby/cgexNig72szhQctWmTRv+/ve/Y1kW
mzdv5qGHHqKwsLBBM1L6/f79XkMIIRqbJE5CCNEAP/7SPX78eIYMGcLll1/O+vXr8fl8tevR3HbbbXWmo/6xvVNbDx06lM2bN/Pee+8xZ84cXnrpJZ544gmef/55fvOb3+y37NChQ8nOzq7tdfopy7JIT08/4IQGP/1r/5EIh8O8/PLL3HHHHXUSBbvdzpgxY3j66acJh8MHnD77wgsv5Msvv6zdvvLKKw+4tlHnzp0BWLlyJePHj2+0e/ixzz//nNzcXN566y3eeuutfY7PnDlzn8SpRYsWtUnHOeecQ2pqKjfddBPDhw/nwgsvrHcMe4fo7R2y92O5ubkkJycftLfptNNOY/v27bXbBxty11BdunQB4Icffjjgz+KHH34AoGvXrnX2e73eOkna4MGDOfXUU/nLX/7C//3f/9UrjlWrVpGeni69TUKIo04SJyGEOEKmafLII4/UTu7w5z//uXaIk91uP6y/4icnJ3P11Vdz9dVX4/f7GTp0KJMnTz5g4gQ1vU7Z2dm88MIL+xxr164dn376KYMHDz7kOyP7m5muPoqLi4lGo3V6OPaKRCJYlrXfY3tNnTq1zkx/WVlZBzx3yJAhJCUl8eabb/KXv/zliCeI2J+ZM2eSnp7OM888s8+xd955h3fffZfnn3/+oM/1+uuv54knnuDuu+/mggsuqPczbt68OWlpaSxbtmyfY0uWLKF3796HvIcfTx5ysCF3DTVkyBASExN54403uOuuu/b7s3jttdcADrk4c8+ePfn1r3/NCy+8wG233VZnJsGDWbRoEZs3bz6i9/OEEOJwyax6QgjRCLKzs+nfvz/Tpk0jGAySnp5em9Tsr9egsLCw9v+Li4vrHPP5fLRv336/w7R+bNiwYWRnZ/PXv/61dvjWXpdeeimxWIwHHnhgn3LRaJSysrLaba/XW2e7vtLT00lMTOTdd9+tM+TO7/fzn//8h86dOx80yejbty8jRoyo/fy0d+LHPB4Pf/rTn1i7di1/+tOfaidz+LEZM2awZMmSBt1LIBDgnXfe4dxzz+Xiiy/e53PTTTdRWVm5z5TyP2Wz2fjjH//I2rVree+99xoUy0UXXcQHH3zAjh07avd99tlnbNiwgUsuueSgZQcPHlznmR6NxMnj8XDbbbexfv36/c7E+OGHHzJ9+nRGjx59WO+43XHHHUQiEf72t78d1vW3b9/OVVddhcPh4Pbbb693/EIIUV/S4ySEEI3k9ttv55JLLmH69OlMmjSJZ555hiFDhtCjRw8mTpzIKaecQn5+PosWLWLnzp18//33QM0wpuzsbPr27UtycjLLli3j7bff5qabbjrkNe+7777aCQx+bNiwYVx//fU88sgjrFixglGjRmG329m4cSOzZs3iySefrJ3xrW/fvjz33HM8+OCDtG/fnvT0dM4880wAXn/9dbZv3147LfX8+fNrJxC44ooraN26NaZpctttt3H33XczcOBAJkyYQCwW4+WXX2bnzp3MmDGjUZ7vXrfffjurV69m6tSpfPHFF1x88cVkZmaSl5fH7NmzWbJkCQsXLmxQ3e+//z6VlZWcd955+z0+cOBA0tLSmDlzJpdddtlB67rqqqu49957+etf/1pnKNvTTz9NWVkZu3fvBuA///kPO3fuBODmm2+ufb/rL3/5C7NmzWL48OHccsst+P1+HnvsMXr06MHVV1/doPtrbH/+859Zvnw5f/3rX1m0aBEXXXQRbrebBQsWMGPGDLp06cKrr756WHV17dqVc845h5deeol77rmHlJSU2mPfffcdM2bMwLIsysrKWLp0Kf/+979RSvH6668f9H1AIYRoNE27/q4QQpxYXnnlFQ3opUuX7nMsFovpdu3a6Xbt2uloNKq11nrz5s16woQJOjMzU9vtdt28eXN97rnn6rfffru23IMPPqj79++vExMTtdvt1p07d9YPPfSQDofDtefcd999GtCFhYX7XHfYsGEa0GPHjt3n2Isvvqj79u2r
3W63jouL0z169NB33HGH3r17d+05eXl5euzYsTouLk4DetiwYfvUvb/PF198UedaM2fOrHMfAwYMqHOfje3tt9/Wo0aN0snJydpms+lmzZrpyy67TM+bN6/2nC+++EIDetasWbX7DvYzHDdunHa5XLqqquqA173qqqu03W7XRUVFWmutAX3jjTfu99zJkyfv86xat259wGe6devWOuVXrVqlR40apT0ej05MTNS/+tWvdF5e3uE8nnpp3br1ftvPj+19bj+NMRaL6VdeeUUPHjxYx8fHa5fLpbt166bvv/9+7ff796ln2LBhulu3bvu9xrx58zSg77vvPq211lu3bq3zfGw2m05OTtYDBgzQd955p96+fXuD7lcIIRpCab2fcQ5CCCGEED/y8ssv85vf/IYdO3bUTrsvhBAnE3nHSQghhBCHlJubi1KK5OTkpg5FCCGahLzjJIQQQogDys/P5+233+b5559n0KBBeDyepg5JCCGahPQ4CSGEEOKA1q5dy+2330779u0PuL6WEEKcDOQdJyGEEEIIIYQ4BOlxEkIIIYQQQohDkMRJCCGEEEIIIQ7hpJscwrIsdu/eTVxcHEqppg5HCCGEEEII0US01lRWVpKVlYVhHLxP6aRLnHbv3k3Lli2bOgwhhBBCCCHEceJw1qg76RKnuLg4oObhxMfHN0qdkUiEOXPmMGrUKOx2e6PUKU4e0n7EkZD2I46EtB/RUNJ2xJE4ntpPRUUFLVu2rM0RDuakS5z2Ds+Lj49v1MTJ4/EQHx/f5D98ceKR9iOOhLQfcSSk/YiGkrYjjsTx2H4O5xWeJp0c4pFHHuG0004jLi6O9PR0xo8fz/r16w9ZbtasWXTu3BmXy0WPHj346KOPjkG0QgghhBBCiJNVkyZOX375JTfeeCOLFy9m7ty5RCIRRo0aRVVV1QHLLFy4kF/+8pdce+21LF++nPHjxzN+/HhWrVp1DCMXQgghhBBCnEyadKjexx9/XGd7+vTppKen8+233zJ06ND9lnnyySc5++yzuf322wF44IEHmDt3Lk8//TTPP//8UY9ZCCGEEEIIcfI5rt5xKi8vByA5OfmA5yxatIhbb721zr7Ro0cze/bs/Z4fCoUIhUK12xUVFUDN2MpIJHKEEVNb14//K0R9SPsRR0LajzgS0n5EQ0nbaRxaa2KxGLFYDK11U4dzzESjUWw2G36/H5vt6KYjSilsNhumae73eH3a8HGTOFmWxe9//3sGDx5M9+7dD3heXl4eGRkZdfZlZGSQl5e33/MfeeQR7r///n32z5kzB4/Hc2RB/8TcuXMbtT5xcpH2I46EtB9xJKT9iIaSttNwhmGQmJiI2+0+KdcWzczMZMuWLcfkWtFolJKSEsLh8D7HqqurD7ue4yZxuvHGG1m1ahULFixo1HrvvPPOOj1Ue6ccHDVqVKPOqjd37lxGjhx53MwMIk4c0n7EkZD2I46EtB/RUNJ2joxlWWzduhXTNElLS8Nut59UyZPWmqqqKrxe71G/b601xcXFxMfH07Zt2316nvaORjscx0XidNNNN/HBBx8wf/78Qy48lZmZSX5+fp19+fn5ZGZm7vd8p9OJ0+ncZ7/dbm/0X/SjUac4eUj7EUdC2o84EtJ+RENJ22mYYDCI1prmzZs3+gioE4FlWUQiEdxuN4Zx9OeqMwyjdvK5n7bX+rTfJp1VT2vNTTfdxLvvvsvnn39O27ZtD1lm0KBBfPbZZ3X2zZ07l0GDBh2tMIUQQgghhGh0xyJpEIe3RtPhaNIepxtvvJE33niD9957j7i4uNr3lBISEnC73QBMmDCB5s2b88gjjwBwyy23MGzYMKZOncrYsWN56623WLZsGS+++GKT3YcQQgghhBDi561J09znnnuO8vJysrOzadasWe3nn//8Z+05OTk55Obm1m6ffvrpvPHGG7z44ov06tWLt99+m9mzZx90QgkhhBBCCCGEOBJN2uN0ONMuzps3b599l1xyCZdc
cslRiOjY0zoKWCjlaOpQhBBCCCGEEAcgAyubmg6iYwVoHTr0uUIIIYQQQvyM5OTkMHbsWDweD+np6dx+++1Eo9EDnr9t2zauvfZa2rZti9vtpl27dtx33337nWq8sR0Xs+qd3DRYfjQKzHSUkplphBBCCCHEz18sFmPs2LFkZmaycOFCcnNzmTBhAna7nYcffni/ZdatW4dlWbzwwgu0b9+eVatWMXHiRKqqqnj88ceParySOB0PlAG6Ch0rBjMNpfa/srEQQgghhPh50loTDh79XpOfcrgc9Zp1Ljs7u3Zugddffx273c4NN9zAlClT6j173Zw5c1izZg2ffvopGRkZ9O7dmwceeIA//elPTJ48GYdj31dZzj77bM4+++za7VNOOYX169fz3HPPSeJ00lA+0JVoywZGykm1CJoQQgghxMkuHAzzh6H3HvPrPjF/Ck73vmueHsyrr77Ktddey5IlS1i2bBnXXXcdrVq1YuLEiUyaNIkZM2YctPzeRWcXL15Mjx49yMjIqD02evRobrjhBlavXk2fPn0OK57y8nKSk5PrdQ8NIYlTE9I6gA58AEYyytEbjQesEjQmykxq6vCEEEIIIYTYR8uWLXniiSdQStGpUydWrlzJE088wcSJE5kyZQq33XbbYdWTl5dXJ2kCarf3LlN0KJs2beKpp5466r1NIIlT06p+AwLvg60dOHqjlA2NqyZ5UnaU4WvqCIUQQgghxDHgcDl4Yv6UJrlufQ0cOLDO6KhBgwYxdepUYrEY6enppKenH7S8ZVn1vub+7Nq1i7PPPptLLrmEiRMnNkqdByOJU1NyjYXABxBZh45uRdnaopQTjYWOFYIyUcrd1FEKIYQQQoijTClV7yFzx6P6DNXLzMxk6dKldY7l5+fXHjuY3bt3M3z4cE4//XRefPHFI4j48Eni1ISUmYl2DILQPAh+BL4ba/YrN1r7a5InM1PWeBJCCCGEEMeNb775ps724sWL6dChA6Zp1muo3sCBA3n44YcpKCio7aWaO3cu8fHxdO3a9YDldu3axfDhw+nbty+vvPIKhnFsVliSxKmpucfVJE6R5ejYbpSZBYAyfGirYk/ylIFS8qMSQgghhBBNLycnh1tvvZXrr7+e7777jqeeeoqpU6cC1Guo3qhRo+jatStXXHEFjz76KHl5edx9993ceOONOJ01vW9LlixhwoQJfPbZZzRv3pxdu3aRnZ1N69atefzxxyksLKyt91C9VEdKvo03MWW2QNt7QnR1Ta+T9zc/OhgHuuJH05TLesVCCCGEEKJpTZgwgUAgQP/+/TFNk1tuuYXrrruu3vWYpskHH3zADTfcwKBBg/B6vVx55ZVMmfK/d72qq6tZv349kUgEqOmR2rRpE5s2baJFixZ16tNaH9mNHYIkTscD58iaxCm8BO06D2XWZOlKKTQ+0OV7pilPlmnKhRBCCCFEk7Lb7UybNo3nnnvuiOtq3bo1H3300QGPZ2dn10mIrrrqKq666qojvm5DSBdGE9PaQpvNwdYNsCD4cZ3jSpmgvGCVgK5omiCFEEIIIYQ4yUni1MRCOoo/WgXOMTU7wgvRVmmdc5SygXKiY0Voy98EUQohhBBCCHFyk6F6TUxrB0FL47Rn4bB1gOhGCH4Cnl/UOa9mmvKYTFMuhBBCCCGazLx585o6hCYjPU5NLBLVVAY0gWgZOM+p2Rn6Em1V7nOuUh4gho4Vo3X4mMYphBBCCCHEyUwSpyYWCYSpLo8RjAUJq/ZgtgYiEPp0/wWUD3SgZtiejh7TWIUQQgghhDhZSeJ0HAgHTSJBRUBX/KjX6TO0Vb3PuUqpPcmTH22VHPVpF4UQQgghhBCSODU5rTWhiCYachGKVhIxu4GZBToIoS/2W0YpY89MexVA8NgGLIQQQgghxElIEqcmFghF8FcFCVaZxKIxqq3g/2bYC81B69B+yyllAzQc4LgQQgghhBCi8Uji1ITKQmW8u/I9NhduIRAAK2QnGK0g
YvYGIw10FYTmH7gCZUNbVTJcTwghhBBCiKNMEqcm9Pd3X2V5+XJWln1PVBuE/CbaClGtw+A8u+ak4CdoHTlADY49PU4yw54QQgghhBBHkyROTejcnmMgBlWeSjaXbicQUKgoBCMBIrZ+YCSBLoPwwv2WrxmuF5XhekIIIYQQ4oSUk5PD2LFj8Xg8pKenc/vttxONHt7M0aFQiN69e6OUYsWKFUc3UCRxalJdunYkuSAVgKW5i6kOm0QDCghRraPgHFVzYvC/aB07QC02tOU/JvEKIYQQQgjRWGKxGGPHjiUcDrNw4UJeffVVpk+fzr333ntY5e+44w6ysrKOcpT/I4lTE8tuOxRtQYmziNxAIVVVBjYrRjAaImIbBCoOrCIIL9l/BapmuJ4siCuEEEIIceLSWhMKRY75p77vymdnZ3PTTTdx0003kZCQQGpqKvfcc0+D3rmfM2cOa9asYcaMGfTu3ZsxY8bwwAMP8MwzzxAOH/y77X//+1/mzJnD448/Xu/rNpTtmF1J7KOkrIotJQorz4NxSoBFOxfR0ns2CaEYuGMEtIXdOQKC70LwI7RjQM1U5D+ilB2tA6DDNUmUEEIIIYQ44YTDUX57z1vH/LrPPvALnE57vcq8+uqrXHvttSxZsoRly5Zx3XXX0apVKyZOnMikSZOYMWPGQctXVFQAsHjxYnr06EFGRkbtsdGjR3PDDTewevVq+vTps9/y+fn5TJw4kdmzZ+PxeOoV+5Fo0h6n+fPnM27cOLKyslBKMXv27EOWmTlzJr169cLj8dCsWTOuueYaiouLj36wR8FzcxcxY/d6CoM1XYz5tlwKQ1UE/DHsBgSiQSL2IaDcYOVCZPkBalJove9iuUIIIYQQQjS2li1b8sQTT9CpUyd+9atfcfPNN/PEE08AMGXKFFasWHHQz155eXl1kiagdjsvL2+/19Zac9VVVzFp0iT69et3dG7wAJq0x6mqqopevXpxzTXXcOGFFx7y/K+//poJEybwxBNPMG7cOHbt2sWkSZOYOHEi77zzzjGIuHF17dyMt7atIt9ykLndCa1CLNy5hObeM/CFAuDwEtAKu+NMCH0IwQ/R9lNRStWtSDnAqkIb0T0TRgghhBBCiBOJw2Hj2Qd+0STXra+BAwfW+T46aNAgpk6dSiwWIz09nfT09IOWtyyr3tfc66mnnqKyspI777yzwXU0VJN+yx4zZgxjxow57PMXLVpEmzZt+N3vfgdA27Ztuf766/nrX/96tEI8qsZ06cjj8+ZTagWoqmyJT21ip9pOUbA/iQE7HpeNYDSI2z4Me2guxHIgugrsPX5SkwOoAB0E5WuKWxFCCCGEEEdAKVXvIXPHo/oM1cvMzGTp0qV1juXn59ce25/PP/+cRYsW4XQ66+zv168fv/rVr3j11VcbGvohnVDdE4MGDeIvf/kLH330EWPGjKGgoIC3336bc84554BlQqEQodD/puve+4OKRCJEIgdaH6l+9tZT3/pMYFjbVry3ej07cNI5x060ZYSFu5eT4T0VuydIVBn4sRNvOwMj8im6+j9Y7s7wk14nbWmI+TFM5/4vJo5bDW0/QoC0H3FkpP2IhpK2c2QikZpJGSzLOqLel6byzTff1Il70aJFdOjQAaUUkydP5tZbbz1o+b0TSQwYMICHH36YvLy82l6qTz75hPj4eDp37rzfZzNt2jSmTJlSu717927GjBnDm2++yYABA/ZbxrIstNZEIhFM06xzrD5t+IRKnAYPHszMmTO57LLLCAaDRKNRxo0bxzPPPHPAMo888gj333//PvvnzJnT6C+TzZ07t95l2kciOGwG1R6LaF4GZosctkQ3sXpTaxwbK2vPc9jSOK1jEEOt5Iet71Fe3aIxQxfHgYa0HyH2kvYjjoS0H9FQ0nYaxmazkZmZid/vP+TsccebaDRKTk4ON998M1dddRXff/89Tz/9NA888AAVFRW4XC5cLtdB66isrPmOO2jQoNr3pCZPnkxB
QQH33HMP1157bW3nx7fffssNN9zA7NmzycrKIjExkcTExH3qzMzMJD4+vraT5MfC4TCBQID58+fvs0ZUdfXhzxNwQiVOa9as4ZZbbuHee+9l9OjR5ObmcvvttzNp0iRefvnl/Za5884762S9FRUVtGzZklGjRhEfH98ocUUiEebOncvIkSOx2+vXxVpS6eeD92ezPr+UnbYEWufaiGTF2OHZzXkdepCQnknICuO2uYiP7sSIfMWQvtuwPOPr1KO1Bu1HmVkow90o9yWOjSNpP0JI+xFHQtqPaChpO0cmGAyyY8cOfD7fIZOM443NZuOKK64gFosxYsQITNPkd7/7Hb/73e/2fQ//ALTWVFZWkpiYyIcffshvf/tbRo8ejdfrZcKECTzyyCPYbDVpilKKjRs34nK59vvd3eereU3F6/Ue8Lt9MBjE7XYzdOjQfZ73/hKtA977YZ95HHjkkUcYPHgwt99+OwA9e/bE6/Vyxhln8OCDD9KsWbN9yjidzn3GQALY7fZG/0VvSJ1ep4dRHdqxuWQ55fEWxu7mkLWD9eFNVAQ6EB8L4na5icQiaPdoVHQhylqHoXagbG3r1KUtE4wohin/gJ2IjkabFCcPaT/iSEj7EQ0lbadhYrEYSikMw8AwTrxlVR0OB9OmTeP5559vUPm9w+mUUrRt25b//ve/Bzz3zDPPPOgaUaeccsoh15AyDAOl1H7ba33a7wn1k6qurt6nce0dp9iQRbeOFwMyWpMS7yJqg9xYAlahQTAUZFnxNsLBKkxlorEIaC84+tcUCn64b0XKDroKrU+8sbJCCCGEEEIcz5o0cfL7/XXmc9+6dSsrVqwgJycHqBlmN2HChNrzx40bxzvvvMNzzz3Hli1b+Prrr/nd735H//79ycrKaopbOGJKQYLDQ3bbNpimojReY2xLAw0rKtZTVlFBLBLFbtgJRINEHaMABZEV6NjOn9TmqFkIlxNrrKwQQgghhBDHuyYdqrds2TKGDx9eu733XaQrr7yS6dOnk5ubW5tEAVx11VVUVlby9NNP88c//pHExETOPPPME3Y6cgCnw4bdbnJmi1P4aMMm/LEwpYXpJJYWUGVUs6Ikh5TkDNxx8USsCAESibP3gch3EPwIvNfV1qWUidYx0CFQJ9Z4WSGEEEIIcfybN29eU4fQZJo0ccrOzj7oELvp06fvs+/mm2/m5ptvPopRHVs2m0m810VaII6eWRks2riTkkRN4vpE9MBSlhZvYFBmR1y+eOyGnWAsiNt5NrbIdxBehnZdiDJTf1SjHW35QcUf9gt6QgghhBBCiIM7od5x+rnyel14PU6ys9rgdtuoiLMIVmdBpaIiVMUPpVuIBkPYDBuWtgiQBrYugAWheXUrU3uH68m6CkIIIYQQQjQWSZyOAzbTIN7npmdyJplJPrQJRXEmeqWXWFTzdcE6AsEqgD29TgGijuyawuH5aP2/BX6VsgGRmuF6QgghhBBCiEYhidNxwuN2kOjxcHqzlrjcNioSYsQCrSAIJQE/q4vXE4ta/+t1Uu3BSANdDeHFP6nNhrYq93sdIYQQQgghRP1J4nScsJkGCfEehmS2wetxEnRqSl02rOUuouEYC/LWEKoOAnt6nawQMcewmsKhz+q+K6YcoENoLbPrCSGEEEII0RgkcTqOeFwOWiYk0TM9HafLRkWihspWEFbkV5ezoWwzWoPNsBHTMarNPoATYrshura2HqXsQHTPu05CCCGEEEKIIyWJ03HENA1SErwMzmiN22Onwhujwu7G+s5OOBBlQe5yoqGaSR8choOgpYg5BtYUDn32k9oUWlcf2xsQQgghhBDiZ0oSp+OMx+2gb7MWNPPFYXea+JOA0hYQVezwF7OlombR29p3ncw9iVPkB3Ss4H8VKQdY1WgdPfY3IYQQQgghxGHIyclh7NixeDwe0tPTuf3224lGD/399cMPP2TAgAG43W6SkpIYP378UY9VEqfjjGEoMpLiGJTZArfHTllc
jCpHPNYKk2BVhK92LyMWs4Cad52qdQKWrQugIfT5j2pyILPrCSGEEEKI41UsFmPs2LGEw2EWLlzIq6++yvTp07n33nsPWu7f//43V1xxBVdffTXff/89X3/9NZdffvlRj7dJF8AV++d1OxnRuj3/3b6RKneYYJzCm5uFtnLYXLaLnRV5tE7KwmbYiFhRArbBeKNrIbwA7R6PUi6UUmgNWgdReJv6loQQQgghxEForQnFjv1IIadpQyl12OdnZ2fTvXt3AF5//XXsdjs33HADU6ZMqVc9AHPmzGHNmjV8+umnZGRk0Lt3bx544AH+9Kc/MXnyZBwOxz5lotEot9xyC4899hjXXntt7f6uXbvW69oNIYnTccgwFC1Tk+iVmsHC4A7K42N4K5Nxr9xJoHeYBXnLaZ2UBYDDtOOPtsat0jB0IYQWguvMmoqUA3QVWieilNmEdySEEEIIIQ4mFIty6dv/PObX/dfFl+Gy2etV5tVXX+Xaa69lyZIlLFu2jOuuu45WrVoxceJEJk2axIwZMw5avqKiAoDFixfTo0cPMjIyao+NHj2aG264gdWrV9OnT599yn733Xfs2rULwzDo06cPeXl59O7dm8cee6w2oTtaJHE6TiX63JzVugNL83Mp90WIuJ04N6dh9MxnTfEW8v1FZPhSMZUJyiBgG4Q38j6EPkc7s1HKAGoSJwgD7ia+IyGEEEII8XPQsmVLnnjiCZRSdOrUiZUrV/LEE08wceJEpkyZwm233XZY9eTl5dVJmoDa7by8vP2W2bJlCwCTJ0/mb3/7G23atGHq1KlkZ2ezYcMGkpOTj+DODk4Sp+OUUop+rVrQcm08oVCEirgorqpmGGsLqO4RZGHB91zgOwsAp+nAT0/ckY8xrDyIrgF7d5Qy0FqjrSDKlMRJCCGEEOJ45TRt/Oviy5rkuvU1cODAOsPyBg0axNSpU4nFYqSnp5Oenn7Q8pZl1fuaPy171113cdFFFwHwyiuv0KJFC2bNmsX111/f4LoPRSaHOI4let0Ma3kKTqeNirgYUadJbHUCVtRiRf56SkM13ZyGMjCUh6DZr6bgj6cmV/Y9w/X0fq4ghBBCCCGOB0opXDb7Mf/U972kQ5k0aRI+n++gn70yMzPJz8+vU37vdmZm5n7rb9asGVD3nSan08kpp5xCTk5Oo97LT0nidByzmyYjO7THa3dg+kyqfBod3wprs8JfFuCbwh/+d65hx2+ehqU1RFaiY3sboWPPzHoyu54QQgghhDhy33zzTZ3txYsX06FDB0zTZMqUKaxYseKgn70GDhzIypUrKSj435I6c+fOJT4+/oCTPfTt2xen08n69etr90UiEbZt20br1q0b90Z/QhKn41x6nI8BzVridtkpj4sRcTmIfeslEo6yLG8NJeH/9TopM4Ow2amm4J5ep5pJIWIyLbkQQgghhGgUOTk53Hrrraxfv54333yTp556iltuuQWA9PR02rdvf9DPXqNGjaJr165cccUVfP/993zyySfcfffd3HjjjTidTgCWLFlC586d2bVrFwDx8fFMmjSJ++67jzlz5rB+/XpuuOEGAC655JKjet/yjtNxzm23M6ZjB+bt3EY0EaIl4IprBZvWUu7280HOPH7Zeix2u4nDcFBlDsIRW4cRXoh2X4BSbsCOtqpAxTd6d6wQQgghhDi5TJgwgUAgQP/+/TFNk1tuuYXrrruu3vWYpskHH3zADTfcwKBBg/B6vVx55ZVMmTKl9pzq6mrWr19PJBKp3ffYY49hs9m44oorCAQCDBgwgM8//5ykpKRGub8DkcTpOKeUonNGBp2SU1kdzaPUF8UMuHHMcaJahVi9fQsLYssY2Lw3Lq8TbXYiqtKw62JU6GtwjdgzLXkIiFCzMK4QQgghhBANY7fbmTZtGs8999wR19W6dWs++uijAx7Pzs7e5119u93O448/zuOPP37E168PGap3AnDZ7Iw4pT020ySUCspuktKvG3yuCFaF+WjT
QtZv2Ep5YTlEFVW2gVjaqpmaXFsoZQOiMlxPCCGEEEKIBpLE6QTgME2GtG5NktuNzW1S4YlRHDM5a/BpGDtNwrEIby77hJy1+ZTmlVFR1Z2YttdMEBFdtacWo2a4nhBCCCGEEKLeJHE6QcQ5nZzRui0Ou51QOsQsRZEtkbEdT8PAIJgZYubbn5CzupDqUoui6m5EYzEIfFpTgXKADqB15OAXEkIIIYQQ4gDmzZvHtGnTmjqMJiGJ0wnCZbMx4pRTsJkGES8E7RbLV1WQlt6CQWndsTttBIcG+ee0z1m/aCfVkQFEIlEigR8I+rehlAMZrieEEEIIIUTDSOJ0gjANgxbxCfTOyMLltBNpZhAKW7z23m46xbWhZWoGzmQ70dEhPnx6IfPf2E2AjsSsKKGS/1JWWEEsEkPr6qa+FSGEEEIIIU44kjidQDx2B9lt22IaJuFURUKKG39llH99VMSp7k74kj14+juxusRY/M5qPvyHGw24nN9RVVJAYW4V1WVFxKLS6ySEEEIIIUR9SOJ0AnGYJr3SM2kWF4cGmg9Iwut1UlwcZf7nITrYWuGIs5P4Gw8qXrHgnSCbvreIRQPEJ65EKQel+UUU7cwlFJDkSQghhBBCiMMlidMJRClFvMvF4JZtsBkmP5QWc/a5nbDbbezcEaZgWRxxpg/Lq2l7XwbueBdffZBM0c4yYtWf4XDZ8CR4CAUqKSsoJxaLNfUtCSGEEEIIcUJo0sRp/vz5jBs3jqysLJRSzJ49+5BlQqEQd911F61bt8bpdNKmTRv+8Y9/HP1gjxNuu50zWrfG43BQHo7wr50bGDLqFFAm69aF8G1rgWEY5HmLOG1KF3bvOgV/hUF53jZ2r/0MlB1PvEW1vxp/qUxPLoQQQgghxOFo0sSpqqqKXr168cwzzxx2mUsvvZTPPvuMl19+mfXr1/Pmm2/SqVOnoxjl8cVmGKR5vdzYfwCpHi8lgRCzCzbTZUgzNCYrvo6RUtkMwzD43rmBM+4cxLat7bEsi+Jt77LoPxswVBSXR1FRXEmwWobsCSGEEEIIcShNmjiNGTOGBx98kAsuuOCwzv/444/58ssv+eijjxgxYgRt2rRh0KBBDB48+ChHenzx2h20SUjm3mFn0j45lWA0xryqXST1TkBjsPYjJ86oB8uMsjy6jvgB43H6XLTqUM6i2Z/w/nPfYLNXYFkhyosqZMieEEIIIYRoEjk5OYwdOxaPx0N6ejq333470Wj0oGU2bNjA+eefT2pqKvHx8QwZMoQvvvjiqMdqO+pXaETvv/8+/fr149FHH+X111/H6/Vy3nnn8cADD+B2u/dbJhQKEQr9r1eloqICgEgkQiTSOIvB7q2nseo7FFNr7IBSBn8ZPJQXvl3Col05rFNlJHVwYN9okfd5IokjAxTYC9nkz6B5s554HCs5dehuPnnLgyfeydBLT8VfFsbmtBGf7DsmsYt9Hev2I35epP2IIyHtRzSUtJ0jE4lE0FpjWRaWZTV1OMec1hqAaDTK2LFjycjIYMGCBeTm5nLVVVdhs9l46KGHDlj+3HPPpX379nz66ae43W6efPJJzj33XDZu3EhmZuY+51uWhdaaSCSCaZp1jtWnDSu9N/ImppTi3XffZfz48Qc85+yzz2bevHmMGDGCe++9l6KiIn77298yfPhwXnnllf2WmTx5Mvfff/8++9944w08Hk9jhd+ktNZ8W+Xnq8oyLA2UWyRth7geRSR1r8SDm5GhZpza9l2qK2Lcd017wiEHo287g7hUb1OHL4QQQghxUrHZbGRmZtKyZUscDgdQ830uog/e03I02JUNpdRhn3/uuefSpUsXAP75z39it9u55ppr+Mtf/lKvegDmzp3LL37xC9auXUt6ejoA//jHP5g8eTKbNm2qfTY/VlxcTPv27fnwww85/fTTAaisrKRVq1a8++67ZGdn71MmHA6zY8cO8vLy9unNqq6u5vLLL6e8vJz4+PiDxntC9ThZloVSipkz
Z5KQkADA3/72Ny6++GKeffbZ/fY63Xnnndx666212xUVFbRs2ZJRo0Yd8uEcrkgkwty5cxk5ciR2u71R6jyUcDRKrr8SULjtdvpZms5bNvLamu8JuzR59hBqeyqOltXYMyKsMIJ0IZ24tEJGXRHjk+k2dn21kyunnEmwIoDTl0Vys5qJJcSx1RTtR/x8SPsRR0Laj2goaTtHJhgMsmPHDnw+Hy6XC4CwFWbaukePeSx3dr4Dh7FvgnIgNpuNt956i2uuuYZvvvmGZcuWMWnSJNq3b8/EiRO54YYbmDlz5kHrKC8vp7Kyku+//54ePXrQvn372mPnn38+f/zjH9mxYwd9+vTZp2xcXBydOnXinXfe4YwzzsDpdPL3v/+d9PR0zjjjjP1+vw8Gg7jdboYOHVr7vPfaOxrtsO79sM88DjRr1ozmzZvXJk0AXbp0QWvNzp076dChwz5lnE4nTqdzn/12u73Rf9GPRp0Hu1aGYVBQ5SesNR6HnewOnUhwO/jHD9+jk2C7DmCtSUV5ciG+gPVl7emTVsCw80qY+2YcG7/byZqv8+h5RnMClbmE/C7ikrPq/dcC0TiOZfsRPz/SfsSRkPYjGkraTsPEYjGUUhiGUftHa6OJph74cQyHq2XLlkybNg2lFF26dGH16tU8+eSTXH/99TzwwAPcfvvtBy2/97tmfn4+GRkZda7frFkzAAoKCg4Y16effsr48eNJSEjAMAzS09P5+OOPSUlJOeA9KqX2217r035PqMRp8ODBzJo1C7/fj89X807Ohg0bMAyDFi1aNHF0x16c04lSioIqP1WRMF67g55Zrfmt3eSN1avZrDXbSi3ULi+GUc1XcU66Rpw4nX4u/1OE16eY/OeFBXTsezl2p4eq0m24XCZ2TzpKSc+TEEIIIcSxYld27ur65ya5bn0NHDiwzh/aBw0axNSpU4nFYqSnp9cOuzuQI3mvS2vNjTfeSHp6Ol999RVut5uXXnqJcePGsXTp0trE62ho0m/Hfr+fFStWsGLFCgC2bt3KihUryMnJAWqG2U2YMKH2/Msvv5yUlBSuvvpq1qxZw/z587n99tu55pprDjg5xM+dz+Eg3VuTRFZFwjjtdtqnNOOKbl05vWUzEhPcbCtMprRas7uqih/C7dExTZ/+67nktzlUVlQw57XF2F1eohEnlaU56GgRWstMe0IIIYQQx4pSCofhOOafxh5pNGnSJHw+30E/e2VmZpKfn1+n/N7t/U3yAPD555/zwQcf8NZbbzF48GBOPfXU2ld2Xn311Ua9l59q0h6nZcuWMXz48Nrtve8iXXnllUyfPp3c3NzaJArA5/Mxd+5cbr75Zvr160dKSgqXXnopDz744DGP/XjiczhQ+Mjf2/PkdNEyKZNRWpPp9fDO6o3s3JmK0S6fd4odJCQPpb1nAaedVY7bVcE7L1mcelZnWrRPo7qiCoc7F2+CBWYqSp1QnZJCCCGEEOIo++abb+psL168mA4dOmCaJlOmTOG22247rHoGDhzIww8/TEFBQW0v1dy5c4mPj6dr1677LVNdXQ2wzzA+wzCO+gyFTfqtODs7m4NN6jd9+vR99nXu3Jm5c+cexahOTF6Hgwx8NcP2wmHi3B4yE9KJWjHSTuvJS0t+oKzEj072M6PQzaW+8XRJ+YgOvcq46o8/MGemh6vuvRaby0VlaQSnswSba2/ydPgvDAohhBBCiJ+3nJwcbr31Vq6//nq+++47nnrqKaZOnQpQr6F6o0aNomvXrlxxxRU8+uij5OXlcffdd3PjjTfWzlGwZMkSJkyYwGeffUbz5s0ZNGgQSUlJXHnlldx777243W7+/ve/s3XrVsaOHXtU71teZPkZ8e4ZtqcUVIXDJHl9ZCakkex0cscZfTFLWxCNmuTFyphdUcGK3Zdi+DJJSgsz7tIFLJ//LjanQSwMleVArBIdK0Tr0CGvLYQQQgghTg4TJkwgEAjQv39/brzxRm655Rauu+66etdjmiYf
fPABpmkyaNAgfv3rXzNhwgSmTJlSe051dTXr16+vXW8pNTWVjz/+GL/fz5lnnkm/fv1YsGAB7733Hr169Wq0e9wfGYf1M+N1OMhQcbUTRqT6EolEo+RVFHHn8EE88VUA3TqHHHJZGHMSLbyMlryNy7mV9q1mU1lux+0bQ3V5EJc7EXdcAB3LAyMNZfw81r0SQgghhBANZ7fbmTZtGs8999wR19W6dWs++uijAx7f3wi1fv368cknnxzxtetLepx+hjx2O+leH6ZhUB0Nk5GQTIovkeqI5tbTh+DflonWsDq6jTU6jxxjAhvWtwKtcQZmYzPeIuqIUF5aRTTqBh1Fx/LRlr+pb00IIYQQQogmIYnTz5THbidjT/IUjEXJSkwlzuWlGhuTuveleHsqMa35NrKZLeRQnnwpc99tRdAfwQwsJMH9D6pC+VSUlIPyggIdK0BbFQd9L00IIYQQQoifI0mcfsbce5Inm2EQillkJaXjsnvwJSZzXnovinamELIslkU34m/pp9QczJvPt6d4VxCn3kJa/N8prdhAVWUVSnlA2fb0PBWhdbSpb08IIYQQQhxj8+bNY9q0aU0dRpOQxOlnzr1n2J7dNIhpRVZ8GoZy0rtTa7pGOlCcm4Q/GmVZZDXpFyaxc2cWLzzSiYJdCqdRRkbcyxSVLKQ6VI1STlAesEr3JFDVTX17QgghhBBCHBOSOJ0E3HY7GT4fDpsBpo3MhDQiUc1lQ3vizW9BSUECJaEI3xtraH99S/J3enjsj60pq0jDboRJdbxGWfF/qI5WAyaoeNBBdCwfK1aK1kd3znwhhBBCCCGamiROJwmXrabnyWkzsNmcpPqSqYqE+e1ZAwhsz6S0KI6CQIjdHXeTNjqRylIbT93XCn+sOzbTIFG9T6DsVaoilQAowwfKBKuo5t0nHW7iOxRCCCGEEOLokcTpJLI3eXLZTTwuD0kuH4YjxvUD+1K4NYPSEg+F1UG4OIxximLHqlL++0F3KqyzUMrAE/0a7Z9KILwZYM/QPS/oCnQsT2bdE0IIIYQQP1uSOJ1kapMnh4MEXwJOu4MWWT7Oa92B3VszKSpxURGLEnejDZ1hMX/6Crbln0qF/iWWZcce2YLpf4iw/w20DqKUiTISQMf2DN0rRutYU9+mEEIIIYQQjUoSp5OQy2YnzePFaXeS5EkkGo0y4rTW9HKlsXNLJnkldnS8iXMiBJ1B5ry8jLJIF4qit+APdcCKRtHBT7DK70KHl6K1rlkcVznAKqmZOEKHmvo2hRBCCCGEaDSSOJ2kvA4HaV4vHqebJG8CVaEg147oQ3rUQ86GZuwqNnE1d2FMiLL++62s/nY7USOJSuMq8iovIxyJIxotQvufB//f0LE8lHKA8oGuRkdz0ValrPkkhBBCCCF+FiRxOonFOZykerz4XD7iXF6COsjvzuqPM2Jny4ZMCqscuJo7Ma6I8tlriynxV6MUKHdPdlf8lsKqQYRjoCNroOI+dOAdIIIy4vYsmJuPtoplzSchhBBCCLFfv/vd7+jbty9Op5PevXsfVplgMMiNN95ISkoKPp+Piy66iPz8/KMbKJI4ndSUUiS53aTFxRHvicdpt+NLMJnQoxuETTasTSdk82GkGvhHVfLV7CVURyMopXDFxVEdOYstpVfij51SkxwFP4KKu9HhbwEXKPePhu4Fmvp2hRBCCCHEceiaa67hsssuO+zz//CHP/Cf//yHWbNm8eWXX7J7924uvPDCoxhhDdtRv4I4rimlSHa5iVkW4WiE4spienfJIHtXcz4v38XKNan07mRhpZXxfWwjHbc2o8MpbXHZXDg9LoxQCzYXnUtWYj5p7k9RVglUPQe2buC5HIx00H50NA/MjJp3oYQQQgghRB01rzc0xTviTpRSh312dnY23bt3B+D111/Hbrdzww03MGXKlHrVs9f//d//AVBYWMgPP/xwyPPLy8t5+eWXeeONNzjzzDMBeOWVV+jSpQuLFy9m4MCB9Y7h
cEniJDANgxS3h6iVQiASxh+o5OIzu7L9nXI2B/2s2pRB12ZBrMwgszcsYrw26NI+E5vpwe60E2fGs7scApHraJ60Alv0U4iuhor7wDUaXOeADqOtUlBOlDKb+paFEEIIIY4zIXTx5cf8qirlDcBVrzKvvvoq1157LUuWLGHZsmVcd911tGrViokTJzJp0iRmzJhx0PIVFRUNjvfbb78lEokwYsSI2n2dO3emVatWLFq0SBIncfTZTZM0j4dwYgrbYhGqwgEmjerLQx9+TRmwJb81bRI3EUkOM2v1fAatP5WR53TCYQtitzmJ9yZQWlWBjp1GZmp/XPy7JnkKfgjhxeD+Bdjaoa0KlJnU1LcrhBBCCCEaqGXLljzxxBMopejUqRMrV67kiSeeYOLEiUyZMoXbbrvtqF07Ly8Ph8NBYmJinf0ZGRnk5eUdteuCJE7iR5w2G83iEghHo2wv2Y3pVVzTrztPffc9JX4biY5upMevJ5wZ4uu8Zex4tJRf3DiUhAQLuxkiweeksqoCnR9PZvp1uD1rIfhPsIqh6hnw3gw2G9pwo1T9/rIhhBBCCPHz5tzT+3Psr1tfAwcOrDMsb9CgQUydOpVYLEZ6ejrp6ekHLW9ZVr2veTyQySFEHW67neYJSaTFpRCKRmnbNpmxGS1RFmwpiWGjH/FxCRjNIKfbZp65fTYbvw9jkYzNMIjzWQQoJTc3H39VZ/BNAUf/msoDb4EOoGMlaH1i/sIIIYQQQhwNSimUcjXBp/7vJR3MpEmT8Pl8B/0ciczMTMLhMGVlZXX25+fnk5mZeUR1H4r0OIl9xDmdtEvJIBAKUlpVRvZpbdn+cQXLKePr7WUMb9WLtMz1FNuKqDqvlNceeYehYwYx7Nf9sJkhnK4SotFSCgvDRKMZJCT9ChXdCFYhhD4D19mg/aDim/pWhRBCCCFEPX3zzTd1thcvXkyHDh0wTfOoD9Xr27cvdrudzz77jIsuugiA9evXk5OTw6BBg47adUESJ3EA8U4nHdKzWJkboTro59IhXaj88gc22ar4fHsxZ1R1oFlXB0XOAoJXhpj32kK2rsjhwj+PITGjBVFbKZYRoLxkF7FwIglJl2ELPw+hT8HeB40NlKtm0VwhhBBCCHHCyMnJ4dZbb+X666/nu+++46mnnmLq1KkA9R6qt2nTJvx+P3l5eQQCAVasWAFA165dcTgc7Nq1i7POOovXXnuN/v37k5CQwLXXXsutt95KcnIy8fHx3HzzzQwaNOioTgwBkjiJA1BKker10SGtGavzthH1RLjm9G68/sMG1gbL+KqwlNOWNCdrgEmJu5iqa6vIeWUnL/52BufcdBbdh3cmSBX4olRVBYnFkkiO74GNlRB4E7w3o2OlYKY3ehexEEIIIYQ4eiZMmEAgEKB///6Ypsktt9zCdddd16C6fvOb3/Dll1/Wbvfp0weArVu30qZNGyKRCOvXr6e6urr2nCeeeALDMLjooosIhUKMHj2aZ5999shu6jDIO07igAylaB6fRNvkZmi7QnkUv+7ZgR5JyaBgWaCSrZ+lkeRKJaFjAo7fQrU3wLuP/pf3Hv8EFbQRwkHUl0oo4qOkdAixmBNiORBeBLoCdFVT36YQQgghhKgHu93Oc889R3l5OSUlJTz00EMN/kP4vHnz0Frv82nTpg0Abdq0QWtNdnZ2bRmXy8UzzzxDSUkJVVVVvPPOO0f9/SaQxEkcgmkYnJKSQcvEdEK2GDaPg190aEePtGSUoVhlVLPhwyQ8sQR8bX34bnNgZVl8/9kaXrr5TUo2lhGwwsTcXiwzk7KKYUSjMXTgP2CVoa1itA439W0KIYQQQghxUJI4iUNymCad0rNolphC0BUlLtnHLzp2pHNaIobdYKMvzIb/JGFW+HCkOUi6y4Orm4PiXaW8etsslr+9kspogIjNi3L2JxhsTSwSwPL/E6wg2qrYs1q2EEIIIYQQx6cmTZzmz5/PuHHjyMrKQinF7NmzD7vs119/
jc1mo3fv3kctPvE/HoeTbs3akOxNoFwFSEzxcmWPbrRPTcDmNtiWGGHrR8lE891ol8L3BwdZ52YQicT44tWFzL7vvxRVVRM2vERt5xGzTGLB1YQrvwWrTIbsCSGEEEKcAObNm8e0adOaOowm0aSJU1VVFb169eKZZ56pV7mysjImTJjAWWeddZQiE/uT6PZwatYpZMYlURapxhPn5NpTe9EyJR57vI2clAi5c9OpznESJkrk4mpO/WMPlM1k6/KdzJ4yh6LqGBGVTtQYgUajq2fhLy1AR0vQOtrUtyiEEEIIIcR+NWniNGbMGB588EEuuOCCepWbNGkSl19++VGfq13sK8HtpkeztrRIyiAQDWPaNBNP7UVGkg9nioNdaVFKv8qgYpOTQCTMrh7bGPzQaRgOkx0rdzH7/82lLGQQ1APAaI5hBtFV71JenI8VLm3q2xNCCCGEOGbkVYVjo7Ge8wk3Hfkrr7zCli1bmDFjBg8++OAhzw+FQoRCodrtiooKACKRCJFIpFFi2ltPY9V3vHMaBu2T0rFjkFdRgrIiXNm9C/9YuZpKM0yhEYWFmUTDuVgdgqxNW0n/+3qx+J7lbPpmO28/No9Lbzsd7OcQb/sHLucqisu6EQpYxKUpHK6Epr7FY+pkaz+icUn7EUdC2o9oKGk7R05rjd/vx+l0NnUox9zeREZrXWdNp6MlFArVztb30zZbnzZ8QiVOGzdu5M9//jNfffUVNtvhhf7II49w//3377N/zpw5eDyeRo1v7ty5jVrficQOjHX5+Hd1IZFkjV+BXpxGLFRAtIOfdSk/0HVSK354cgOrP13H69UBTr2gK6c060KLlO9QvMOy1VcQs/Ka+laazMncfsSRk/YjjoS0H9FQ0nYaLi4ujlAoRDAYxOFwnJTrWhYXFx/1a2itKSwspKSkhI0bN+5z/MfrQx3KCZM4xWIxLr/8cu6//346dux42OXuvPNObr311trtiooKWrZsyahRo4iPj2+U2CKRCHPnzmXkyJHY7fZGqfNEURUOk1dZQZG/FGfAz2WVaczauJmIW+NKchD6wUalyiXW3o+zXynD/jSEBX9bwtalu0ltFU+/a8bgdu7GSylDT11PRdVQopEk4lKa40vyYhg//4kfT+b2I46ctB9xJKT9iIaStnPktNYUFBTUjoY6mWitCQaDuFyuY5Iw2mw2+vXrt9+2Wp/nf8IkTpWVlSxbtozly5dz0003AWBZFlprbDYbc+bM4cwzz9ynnNPp3G8XqN1ub/Rf9KNR5/Eu0W7HZrdjdzgoq/LgcJYxJtaKj7bkUOCJ0H1oJnnfmAS8WymkEmeP7Zxx/SC+emERS/+9Do/XYPyvxhJvm4lLLcTy9SIc8lBZXIy2NIlpCdjs9WumWsdAh4AoKA9KnRjN/GRsP6LxSPsRR0Laj2goaTtHpkWLFsRisZNuyGMkEmH+/PkMHTr0mLQfh8NxwD/G1+f6J8Y3SiA+Pp6VK1fW2ffss8/y+eef8/bbb9O2bdsmikz4HA7wxaGUwm7aOM2w4Q+Hmb8zj1XhUoaObM3aeZqQeyO7VCHuQR4GVZzG4jeX8uVrq3B5e3LeBb1wmd/jUu9gOSfhtYXxl1YSDcdIykjA6T74+F+tLSCEtgKg/aDDgAXKC2YySrmPybMQQgghhKgP0zQxTbOpwzimTNMkGo3icrlOqMS7SRMnv9/Ppk2bare3bt3KihUrSE5OplWrVtx5553s2rWL1157DcMw6N69e53y6enpuFyuffaLY8/ncAA+AOw2G6NPcVAdibI0v5CvinczfFhbflgYIThwM5vIocdoH338vVj+n++Z88IPeHxdGTNqI6aRj53FYA7Gl+jFXx6icFcJKZmJuH11k5+aFwt/nCyFAA3KUZMwoUD70dE8tJGIMhJQ6uc/9E8IIYQQQjS+Jv0WuWzZMvr06UOfPn0AuPXWW+nTpw/33nsvALm5ueTk5DRliKIefA4H6V4fTruDlLhELuvS
kx4pyUStGF+W7qbHaR2IrmhBKBhlVcUafBf56HpWFxQm7z+xmu+Wd8eyLJzqUxQlKFWOL9GOtjRFu0qoKq/aMyNKCG2Vo2O70dFdYBUBMVBelBGPUi6UMlBKoYw4UCZYRehYPlqHDnkfQgghhBBC/FSTJk7Z2dm1UwP++DN9+nQApk+fzrx58w5YfvLkyaxYseKYxCoOz97kyTRtJMQl8Ju+A+icnEQoFmW+P5eunboQXZtOMBhlceU3NP9VK9oPag+WyT/uzWfX7gxisQhOZqMIoSjH7XNg2GIU786hsmgjOrITHcsHwqDce5Il9wF7k5RygvKBrkJHd6OtSlk3QQghhBBC1IuMWxKNbm/ypJTC6/Lwu/5D6JycTCgWZXGkgHbp3YntTCAQiPCVfz4dr+1Gi16tiIYVz9/loLpag7UFFVuBohJFAS53MS5XGRWFhZSXRIA4lPKg1OGNCVbK2NP7pNCxPLRViNbho/sghBBCCCHEz4YkTuKo2Js8Qc3Mhrf0H0z7xESC0SjfGWW0MntilbrxB6tZUPUlvW7oT0bHZhTuNpn9UgKRUAS7/oBwqBIIAQ5MZzIObxKVJdWUFpZjReu/YJpSblAesMr2JFD+xr1xIYQQQgjxsySJkzhqfpo83dx3IK3j4wnFoqx0VpFW2QMdtFEWKmNJYBGn3TyMxJYpLPhvImuXgRWrxrRmUx0w0bqmZ8m0m3ji3FSVVVFaUEY0Eq13XErZQMWDjqFjeVixYrSufz1CCCGEEOLkIYmTOKp+nDzFeXz8ts8AWvi8hGJRNnojJOR3RVsGeZHdrIr8wMBbz8KTHs+/X2pOye5K3OZabLF/EazehrZq3ktSpoEn3kOgMkhpbhnhYP2H3NVMHOEB5QKreM/EEYFGvXchhBBCCPHzIYmTOOp8DgepHi8xNGkJCUzs2ZdmXg8hy2Kr18CT2w4NbIltYIfeyZDbRlAZTOKz99Ipzasg3rWEROcT2GOPY1pfA0GUYeBOcBMKRSjOKyNYFWxQbErZ9/Q+BdDRPKxY6Z41oYQQQgghhPgfSZzEMRHvdJLidhO2LJonpnB1195ket2EtcUOuxdnYXO01vwQ/ZYyM8DwO85k0Vctmf63tnwz14kV1ZhqBw7+iZd7cfEmNrUNT7wLbWlK8sqorqhuUGx1pi2P7UZHNqF1QGbeE0IIIYQQtSRxEsdMottNostJVEGr5DR+3akHGV43YSx2WynYK1PQWCwMfkXA7WTkbWewY3cK0//Wgnuvace61X0IRVKwYkFMvsGj/g+v+ivx3kWYtgAleWVUlvgPO+HROoyObkEHP0dXvQKVD0PFnVBxKzrwCTqWi7b80gMlhBBCCCGwNXUA4uRhKEWy20PE0lQTpk1SOr/q2I0ZG1aTXxkgvzqLdGeIqMPP52WfMzZlJBc8MJRPn11O/ppS/u/2ck4d248zL08jzfMDiZ51mEY+TvUeTvcHhO1d8ZecSizWm/jkeAzzf38X0DoCsZ0Q2wbRbRDLgdguYH9JkYLqGWBkoM1UUG4w4kF5aiaWEEIIIYQQJx35FiiOKbtpkubxkGvFiLgVzeNTubxjF2auX0t+ZTWFZa1ITdtEzBPgo13zubDVEM6/8zQW/nMLqz7YzHcfbCRvcwln3XwGvrihZMVtJcXzAw5jFw7bSpLifiAajqe6aBCeuEyUzqlJlg6UJKl4sLUGc8/HaA7Vz9eUCbwKvj8DkT0L7trRRjzK8KGU45g+NyGEEEII0bQkcRLHnNNmI9XjJc9fic/nJjOWzK+6dOX1NaspqKzGLGtNUsoWYsllzF73LZd0O5Whv2pPRrsUvnzhO3atLeadO+cx6vcDCLVrT164C2neSpJsK3Eb32OzVaCtj4lWKkzTQBkGSgHKB7Y2e5KkNjX/rxJRSgHUDPHTFeC9ESrvr0m2AjPA8xtQAGGwitBWOdrwoQwf4KotL4QQQgghfr4kcRJNwudwkOJ2U1hdRZLPRyQa44ruXXh1
5RoKKzVmZQvi4nOItMhl1g/LubBbDzoNSCa5xXDmPrmYsp2VvPfAlwz5VW+6jjiFQisBv3sYTvtQfMZGfMYqiFhEAy3QZmucnvY4vZnYXfvvKapJmipB+VC2ZLR3IvifhPA3YLZDuc4EnKCcNcP+rHK0VfGTYXzyyqAQQgghxM+VfNMTTSbR5SbJ5SFmUyTHxZFkxHNlj66keNyUlPmoDmZgmIpI6928nfs1W4sDpDW3ccH9Z9J+cCuilsWXry/ns+eWEK6MEawCK+ykyupObvRS8vkl1Y7RhGPdKSsyKdxVQkl+KcGqINr6ybA97a9JfsxUlHKBvS+4xtUcC7yFjm6qPVUp+55Z+Fw105jHcvdMJFEhC+kKIYQQQvxMSeIkmoxSiiS3mzi7A2yKRG88CUYcV/XuQqLbSWFBMoGq1pjKRizOzyeBRSwp3oHTFWL4Df0YfGUflKlYv3gHs+77lOIdZVT4g0SCGqdyopRBVayKqD2MO8GN3WGnujxA0a4SinaVUl1RTSwaQ1t+UI49SVNNj5QyvOAYCfY+gAX+59FW+U/it9UM11M+IFyziG6sQJInIYQQQoifIUmcRJOyGQYpHg8epx2b0yDe4SXZjOfKPl2IczopKPZSUtIJM5CApSy+Da3jv2XLCatyuo1qx7h7huFNclOyu4JZ937GxsU7qPSHqKwKYWgbdsOO3wrgj1Wh7OBJ8OCOcxGNRinOLaVw504qiquIhOOB/w3jU8oBZgK4LwajGegyqHphv0mRUgZKeWoSKO1HW2WyBpQQQgghxM+MJE6iyTltNlLcXuw2E7fHgUu5yXQlcGXPjiQ5nVQEYGd5K8hvhY4Z5ASK+U/FInZY20hvn8yFD59F824ZhENRPn56MfNnLqesoopKfxAsA5fhIGAFqIhVEtVRlGHg9DjxJpoYhqK82EVBjp+iXSVUVwaIxWIAe3qT4sA7sWZYXnQDBN454H0oZYDyglVa876UEEIIIYT42ZDESRwXvA4HaV4vym7g9bkwIm6ax6dwba/OnOKLIxqz2GnFE9jeiViJh1J/mK/9q1gS+xYVB+f8eQh9zuuEAn74ZCPvPTyf3N2l+KtDKBQuw0VUR6iMVRK2IkAYRRSbMwNvUioOj5OAP0hBTiEF24sIVAVrep2MODASwHN1TaChOejwsgPeh1K2mgkkYsVoHTgmz04IIYQQQhx9kjiJ40acw0mS242yG/g8TlTYRUp8Ipf37sTgzGZoS1PkNCktaUdofQaVlRabq3L5PLqA3eRz2mXdGXXrIJxuB/mbSnj77k9Z/tVmqqpDKKVwGi4sHaMyVkTIqkCTgsYHgGkz8cS78SR4iIRrhvGFAqGaSSCwg70nOM+uCbTqFXRs9wHvQyknYO1JniJH/8EJIYQQQoijThIncdxQSpHscpPgcmE4TTxOB2bEidflYXTXNlzasSN2pah2WRQ6UqlY1JbKPDsl/mqWxlawLPo9zU5NY/yDZ5LaOpGwP8LcJxfxzv8toKysCgCHYWKqMOUxg8qowtJ1Z9czDANPvJtYJEZpXhmRsAGGF3QQ3BeArTMQAv+zB+9RUl7Q1ehYCVrvZ+FdIYQQQghxQpHESRxXzD2TRcS5nNjdNhymA0fMg92007NVGpNO7U2iy4nlgKJmToq+bUPlyiQqKyLstGp6nwLpAc6/P5te53bENAzWfbGNF3/3Hzav2okigKFSMFQKlVYV/ph/n+QJwBPvJlgdpjSvFCvmpmYFXA3e68BIAiuvpufpAJNAKKX2TBZRsc9sfEIIIYQQ4sQjiZM47jhMk1SPB5fTjt1tQ8cMnNqDiUFGoptb+vejXVIiNqdBeWtFXkEqFZ+3oKpQEdQhvol9xw/GWvr8ogvn/OUM4lM9lOVV8urt/2Xu6xuIxuKxKTsuw0mVVU1FrILoT2bLU0rhTfQQ8AcpLQihdc2aTcqIB+8kwITIdxD65ID3oZRZM6mEVVIz5bkQQgghhDhhNShx2rFjBzt3
7qzdXrJkCb///e958cUXGy0wcXJz2+2kerw4nXZ8cS50xMBJTfJkd8B1/U5lSMuWuFw2glmKHIeT4jktCayLw7I0OdYu5ka/pLRTCWMeOZ1OgzOJxuCLGSv5x22zKNldiqEM3IaLoA5SHisn8pP3kZRSeBI8VJVXU1YMWBZax1C2duD5Rc1JgXfQkbUHvI+adaEMtFWM1uGj98CEEEIIIcRR1aDE6fLLL+eLL74AIC8vj5EjR7JkyRLuuusupkyZ0qgBipNXnMNBituNYVd4vA50xMSJFwODiA5xfueOXNa1Cz6PAyPdxrYMi9zvkgkuyMIVcxEhyiZrK185F+Ge5KbPHb0xXSY5a3bz4s1vsHzOagBcykVURymLlhG0gnViqHnnyUNFcZTyUg1Wdc0BRzY4BgEWVL2ItkoPeB/K8IAOo2NFaB07Sk9LCCGEEEIcTQ1KnFatWkX//v0B+Ne//kX37t1ZuHAhM2fOZPr06Y0ZnziJKaVIcrtJ83ixOU3cXgdWxMC1J3kKWyH6ZjXjhr59SUvw4kpxsq1ZlG25JgXvZ9E91IMMIxGNIs8qZXe37WT+XyLxF7qptgL8Z9oc3n74QwIVQVyGC4DyWDnVVnWdd5cM08Ad56GiWOMv86OtWM07TJ5fg9myZs0m/7MHn0FPFscVQgghhDihNShxikQiOJ1OAD799FPOO+88ADp37kxubm7jRSdOerXJk9eL3WXi9thrkiflRaEIWyFaxMfxu9P60TE9mfhkN7szLTbrEF+8W07i5g4MU6Npa3TGrhxEHCGc4wziHrKjx4dZs209L9z4Olu+3Y7DcGBTNiqiFZTFanqf9k4cYbObODwJlBVFCFSW7InNCd4bQHkgthUC/zrIffx4cVx530kIIYQQ4kTToMSpW7duPP/883z11VfMnTuXs8+uWd9m9+7dpKSkNGqAQiilSHS5yfDG4XTZcXnsxEIKJ57a5MnncPCb3r05o00rkpK8lKTC+vgo7y0u4tOPC2hW2YmzjHH0NE4j0UjGlegkYZQP8/ooFRcWM/OdWfz3xc/RYXAaTiI6Qmm0jJJoCVWxKqI6it3pxO5MpqKolOqKmunNlZkO3t/UBBr6Ah1aeJD7sIGy7RmyFzzgeUIIIYQQ4vjToMTpr3/9Ky+88ALZ2dn88pe/pFevXgC8//77tUP4Dsf8+fMZN24cWVlZKKWYPXv2Qc9/5513GDlyJGlpacTHxzNo0CA++eTAs5qJnw+lFAkuFxlxPtxuBy6PDStcM2EEe5Inm2FwfscOXNKtM2mpPki1sz0zzCJdxD8+XsGKFXlk6dYMMUcy2DaCLHsbkrKScXdyYp0bZmmHb3jqzRfYsnUbTsOJ23CBgopYBSXREiqiFVhONzHloryokGB1qCY2e09wjasJtPp1dHT7Qe7DDcRkcVwhhBBCiBNMgxKn7OxsioqKKCoq4h//+Eft/uuuu47nn3/+sOupqqqiV69ePPPMM4d1/vz58xk5ciQfffQR3377LcOHD2fcuHEsX7683vcgTkzxThcZPh8erwunx4aOmLhrk6eaWetOa9aMP/Q/jeHtW5OW7CHqhZyUEP/M28xzHy9nZ24FSSqFvvaBDIqNpl/a6WSkZaC8Cn+nct7M/yf/+fK/ANiVHY/pwaZsVOtqSmNlBJ0OQtpPcUEJ4eCemfJc48DeA4hA1TNoq/LAN6F8oKvQVqm87ySEEEIIcYKwNaRQIBBAa01SUhIA27dv591336VLly6MHj36sOsZM2YMY8aMOezzp02bVmf74Ycf5r333uM///kPffr0Oex6xIktzulEKUUBUGYFiAbB7fAQoIqIDmNXDlI9Hi7s3JmzWrdm3qatfLV9F6XVYTZpP0/8sILuW5K4oE8H4pxeXKFT6JDShXzfTub/MI/KuHKWO5aTO7OAX517Cd5ED6YycSs3lrYIa41yQGVVLpG8CBnN0nA73WjPRKh8EKwCqHoB7ftDzVpOP6GUQuMFqwyN
HfAd82cohBBCCCHqp0GJ0/nnn8+FF17IpEmTKCsrY8CAAdjtdoqKivjb3/7GDTfc0Nhx7pdlWVRWVpKcnHzAc0KhEKFQqHa7oqICqJngIhJpnKFSe+tprPrEoTmVItXlJhaNUR4NEAmA0+EiQDWWCmJXdqBmSvOxndpzeloGX+fsYv7uPMpCYb4PlLB60RL6pqVx1iktUX5NVlxLfjXoCj754RO2mRvJbbeLp177O+cNPpd2PVthM2o6aG04UToZ011Asb8E/64AaalJ+Bw+7M7rsVX/FSJrsfz/QrsuOeA9aG0HXUR0zwQU0n5EQ8i/P+JISPsRDSVtRxyJ46n91CcGpRswVig1NZUvv/ySbt268dJLL/HUU0+xfPly/v3vf3Pvvfeydu2BFwQ9YCBK8e677zJ+/PjDLvPoo4/y//7f/2PdunWkp6fv95zJkydz//3377P/jTfewOPx1DtOcWKLac3i/HK+LC7Fb9YkLE6bQc8EH6cnJpBhd6DRrAmtZ4OxmVg0hl5g0EV3pNuIDihDHfIaKfGb6NryQwDW7RxNYXnno3pPQgghhBCiYaqrq7n88sspLy8nPj7+oOc2KHHyeDysW7eOVq1acemll9KtWzfuu+8+duzYQadOnaiurq530PVNnN544w0mTpzIe++9x4gRIw543v56nFq2bElRUdEhH87hikQizJ07l5EjR2K32xulTnH4QtEIBf4qisqriIVi2JyaANUopTB+9BqfFbOoKq8mUBVEmSafr9rBN4UFVLksFOD22OiUmsjwVi3okpTKuurVLMpfQMgfQi8zydjWkqE3DiMhxYfbZsNtC+A2K7AsH8GqEJ54F7Z4E6fNSaL1OfbIHMBOzPOnmvWeDiASLuPTLzYxYsQIHA7H0X9g4mdF/v0RR0Laj2goaTviSBxP7aeiooLU1NTDSpwaNFSvffv2zJ49mwsuuIBPPvmEP/zhDwAUFBQ0WjJyMG+99Ra/+c1vmDVr1kGTJgCn01m75tSP2e32Rv9BHY06xaHVPHcHNrudvOIKrIiF1xlHkGr2/lVAAYZh4Elyow0I+IOM7tuagRVZfLxoC5uilVToKD+Eitla4Se7dRbntu2B2+Xm68IvqTzNT4FzBx/c/T7Zv80ms2cWfksRZ0Zx2QLYvA6C/iA+m5dIXJhK23AS9U7M2Fpsoech7m6UEbff+LVOAMBmVmG3e4/NQxM/O/LvjzgS0n5EQ0nbEUfieGg/9bl+g2bVu/fee7ntttto06YN/fv3Z9CgQQDMmTPnqE/S8Oabb3L11Vfz5ptvMnbs2KN6LXHicNpsZMbFkZkcD3ZFJKTwGQkk2pLqfFKcqTRPaU5GXCb2gJuWCelcd87pTOjYk24FcSSU2SgrC/LZlp3MXLuGVrZ2DEsfQXKLZGx9FcFRFXz814/5/q3vcOAgor34I5WUhIP4DYuiknIChUGqq8OU2C4ippLBKoaqF9E6tt/Yldoz/M8qR1v1760VQgghhBBHX4N6nC6++GKGDBlCbm5u7RpOAGeddRYXXHDBYdfj9/vZtGlT7fbWrVtZsWIFycnJtGrVijvvvJNdu3bx2muvATXD86688kqefPJJBgwYQF5eHgBut5uEhISG3Ir4GXGYJpnxcSgFu4vLCQSimMpAq5oeJwBFzYbD5yYciVJeWoXb56Rz+3RaNEtg/jdbWFGYz24dYomVT3kwzIQuXRmacCYL7fOpcFUQsIdZ/q8V7F6Ty+g/DCM100XMgpAFARvoYAh3MEY4ziTmu5Q0XsKMroXAu+C5+CB3oNFWKShHzWK5QgghhBDiuNGgHieAzMxM+vTpw+7du9m5cycA/fv3p3Pnw38RftmyZfTp06e2l+rWW2+lT58+3HvvvQDk5uaSk5NTe/6LL75INBrlxhtvpFmzZrWfW265paG3IX5mHKZJZlwczVMTcXodOLw2XF4Hbq8Dj9eBz+fA53EQ53WS1iyZ1PREYqEYCkiId3POWV0ZlNWC
VvkuApUR1heV8uz3P2BVxDHIOYzE9ETiT/diXBklf3s+/7ztPdYvLMJmhHHZbDhtNgKGRdCuCVVGKS2MIz88lqgVg9DH6PCSAwevPHvWdyqT9Z2EEEIIIY4zDfqztmVZPPjgg0ydOhW/3w9AXFwcf/zjH7nrrrswjMPLx7Kzsw/6BXH69Ol1tufNm9eQcMVJxr4neYpzOolYFpFYlKhlEbUsYlqjtUZrcCiDJHcCTp+NqtJqPB4HDruN8aN6YHwKtq27ydFBdlHJS+vWcHHrdvSNH8K3ngXYBtiojg8SfC7ER48tZOfolgy9pi82hxOP3UEgEkY77Li1QWlJB1RcX1JdS7FVvQJGJsrWap+4lVKgatZ3QrlqFsoVQgghhBDHhQYlTnfddRcvv/wy/+///T8GDx4MwIIFC5g8eTLBYJCHHnqoUYMUor5shkGCy1W7rbUmpjUxy8LSmuie/0YsC5/DQbEyKC+uJOJxgglnZXcCFObmXWyPBSkhyFvbNjGueWt6xw1muW0hRncD118syv4a4Ic528ldX8yIm4eQfkrSnuQpQswwifPYqagaiV3nEufcga3yKcyEyShj34kglLKhMdGxoj1D9mSWPSGEEEKI40GDEqdXX32Vl156ifPOO692X8+ePWnevDm//e1vJXESxx2lFDalahex/alMn49iTzmlReUYdgclBBg+uD3KALVhFzusIOXA7F3bOKd5K3r7hrDC/BrdPEj6w/FUPB6kaHsFb972XzoMasWAy3qQ0iqBYCRKWShEgstJRexX2MNP47ByMUqfwpl4G8rc91dQKQ9aV6BjJWCmo1SDR9QKIYQQQohG0qBvZCUlJft9l6lz586UlJQccVBCHGumaZKakUh6RhK2YAyPZWI4DIacdgr9uragVb4ToyCGvyrMBzu3s7E8zACVjSPmJuKJkHiXh9ajsjDQbFy0g5m//4hPpi0iUFiNaRiUhUJUazuV6iosbSMWWkl5wWuEg+H9B6S8oCvRVvkxfQ5CCCGEEGL/GpQ49erVi6effnqf/U8//TQ9e/Y84qCEaAqGYZCYlkBysyTiHU5sYYtINEKfLi04rXcrsopsOPMsqqvDfJG7i4V5pZxuOxOfiidohqi+JMSIR0+lXf9MNLBu/jZm3PwRXz63jGBRgIpwmOJwIpXWxRimgZ3PKMz9mMoS/z6xKGWCcoJVhtaBY/8whBBCCCFEHQ0aqvfoo48yduxYPv3009o1nBYtWsSOHTv46KOPGjVAIY4lpRQJKXH4Ej3EVfrYkldIWZGfLq3SsKIRvl21m7Kohb9FmGWFBVSGI5zTehjfqQVU6DKWJW2i16Q29Dq/I9+9s4lt3+5mzedbWP/lNrqOOIVu53UgltEBm3sIceYCElzvsLs0EXAQqg5hT7D9KBYnWlfuGbKXIVOUCyGEEEI0oQb1OA0bNowNGzZwwQUXUFZWRllZGRdeeCGrV6/m9ddfb+wYhTjmTNMkOTGedq2bkXFKOunNU+jXqy2n9m5GYrlBwjYIVkdZX17KrM3b6aWHkK6aoZVihbWFDRmbGXJzby584Exa9cwkFrNY+ckmZv3+E754+VtW7z6NgHUKNsMiI2EWNjNI0e5iKkv8WDHrf4Eo354pystlinIhhBBCiCbU4D9hZ2Vl7TMJxPfff8/LL7/Miy++eMSBCXE8SHC5CcXHahbNdTo4c6gXl9fOoq93Ytsco7RthB26khkbNvOLdv1Itm9nvbWS3aqEiujX9G7Wh1F3nE7xpjKW/HMlu9YWsvq/m1j36VZWn38Kv7qyCKetjC4tPyLk+BX5hRaJ1XHEp8bhcDlQSqHxgFVaM3RPpigXQgghhGgSMl2XEAdhKEWSy02c24XDbWJi54xBncke0ZY4y076VhOr2qI4EOCVdWuxVzZjgJmNCx9VKswivmF9YBveVj7G3jOU8fdmk9khhVgkxtJZ23jsdwmUFYZI8OSQ5X0B07OBkqpSCnYVU1VejbYslLID
JtoqQesDTCYhhBBCCCGOKkmchDgEp81GituDw2nH7jAhYqNvn9ZkjzgFt7aRvtXEHlIEtMUbWzeyIzfGaZFsUnQzLKVZY6zhe2sVpVXVJLVP5IIHzuS8O4eSfkoyu7e4ePqe1mxbGSFaVUymYwbxcXMIqHLy84ooLSgnEo6iDA/oIDpWKkP2hBBCCCGagCROQhwGn8NBsteD6TQxlYFDO+neI5MRIzvg0CZJmxS+oIk2FB8V7mJFSSXdAwM4JdwZtGa3ymWJsYyiUBmV/iDp3VO59P+NYuwdQwhEmvPUvZ354t9uyvP9JKiFZHj+jvLsoKS8jMJdxVRXVgNe0OWgK5r6cQghhBBCnHTq9Y7ThRdeeNDjZWVlRxKLEMcttWfIXtAbpTjqJxwAl8NNx65J2MxOfPLJerzrY9jb2ilLiLKgOJ9KFWN0s1NJCcfxvfqOSipZbCyju+pCRnU6YVuMlqdmclmvTOY+s5T333SycVUp46/eTrO2u8lwv0SZcSZVwYFEd0eIS/IRn+TAoASUE6VcTf1YhBBCCCFOGvVKnBISEg55fMKECUcUkBDHK5thkOLxEIxE0bEA4bCJw+mkXedEzjG78PHH61Bbo7gTLMqzLL4vKqIiGuGyjl0YaaWwpHoh+bEKVrCSVmZzulqdqfBbOOwGPc/pRJfRbZn/3FKemRzHuF9tptfpFSSlzsHl2UBZ5CJKS6OEg17ik8Dl2ztFudnUj0UIIYQQ4qRQr8TplVdeOVpxCHFC8NjtpHjcBKMRrJhGx5xEjSjtOibz69S+zJmzHnIrUEGLwqwgm2IWf1+5igk9ujAi5UxWVf3AyuA2cqxdlFhl9DF64Ap6AMjqkML4v47gh7fW8K/n7WxcWcCYX+wgteVmMpzPUGKeQ2V1b8K7TRKSc/El2TGdaU38RIQQQgghTg7yjpMQ9ZTodpPi8WC6TKyogQMXEStMcrKHyy7rw7Bh7UmKOcnaYaeyIMiO4nKe++4HNlaY9Irvyaik0/A6PFTbAyxSy8i3cgHw+4M4nTZOu7onF917JpvWtuH5Kd1Y+XWM6pJSUs13SPT9i7DLT3FxmJLcrQSrSpv4aQghhBBCnBwkcRKingylSPF4iPc4sTlMrJCJ03AR0iEMQ9G3bwsmTOhHx6wU2ua7iBVH2V1QwYvfrWbh7hCZ9nTOTRxCM2c62GC1awOr4tayO1JAeVkVkViMlB4pXPvUeTTr2pVXp3bj45kpFO2owBX9nnTnM5i+LVQEqinauYnKEkmehBBCCCGOtgYvgCvEycxps5Hq9dW872RprKjCMKNEdQSbspOY6Oaii3qyalUeX3y5iW3RKoqiVby+fAP5Vc25oL2Hs+IG8ENgM6sC6yhwFFDiLMaIGaSWp9DCnUWnuBaMv+0MOg1qxUdPL2LzmkTGX7WJlp0KSEt8lYq4gVQHB1GSp4EOxCUnNfVjEUIIIYT42ZIeJyEaKM7hINXrxeY0IaZwahcxHSVshdBao5SiR49mXH1lf4YmNyet3I7fH+L9H7bz9JJ8orEgPT0dGeE7g6xQM9w2F9quybMVsCT0HW+VfsRH5V9j72dw9TPn4E7rzN8f6cn8932U7KrEZy0k2T0Dy7mNkrxN+EvLmvqRCCGEEEL8bEmPkxANpJQiye0mFI1SGPUTCYLPHU9AVxPUARw4MZVJXJyT88/rTpcNGbz99Wq2eqtZsr2IXcV+bh/aklR3Ip39HWneJpViXUpOaBfbqnfij1WxqXoHOZFcbMog6/fpdFvTibnPO9i4spBxV2wlo81OUuL+SQm/pCQPoD2+JOl5EkIIIYRobNLjJMQRsJsmyR4PXo8D06bQERvxZiJe5SWqI4SsYG3vU6dO6fzhF4MZ7muOGYOdlQH+/N9NfLepZnIIpRTp9lT6+XpxUdo5nOU9g5ZWK5xRJ1HLYmckn50dcon7fw52nJnK8x91Ycu6GFUFuSQ6/knUvoOSvI1Ulck7T0IIIYQQjU0SJyGOkNfh
IM3nw+62o5WmOhDFZXiJNxOwKTshK0BMRwHweBz8amxvbujTBx92ApbFUyt2Mnt1PhUVQSAGxFDKomV8Gv0TuzPIHMBI1Z9eRntSzQRMu8LXx4Ue7+U13Y7dxVC5O4d44w1ittw9yVNJkz4TIYQQQoifGxmqJ0QjSHC5CMZFKTUDWGGL6kAEl9Mk3ownqIIErGoiOopTOVFK0adzFu1aJvPkvKVs9ZezxPSz/d1VjGyRzOkD0khMcAAQ5zEAk2jMSXezNb2s1oTcEXboQjbbd7PDZWNGAVwZ3UhiZDvOpFcJ2a+iNE+D6oA3IblpH4wQQgghxM+E9DgJ0QhMw6iZotztwu62kRDvIhq1CARiuJSHeDMBx57ep6iOABDvdXHXOUMY0a41dlNR4I0xO7+Ep6dv4/1PKsgvjSdMJk5Pa4I6gyKVjhHXBnukJW2jfRkbfxHt4jtgb57Om/7OlJfZCBbnYJW8TFgVUpa7gary4iZ+MkIIIYQQPw+SOAnRSJw2Gxk+H/FOJzgMkhI9OJ0mVYEQWCZxZgJeMw5La0JWAEtbGErxiz5d+XWLTDJSfUTjFBvTAyzZmMsr05fz4UcbKSoOkRiXQHXEojRq4U1NwuF0UV0e4QznUBIcyai2qXwQ6ENVpR0V3kXV9meojBRSnr+JqvKipn40QgghhBAnPEmchGhEDtMkw+cjyekijCY+3k1SgodwJEooFMNteEiwJeBULsJWkIgOA9DB7eEPA/rRuUUqCWlucltEKPCFWLsuj9deW8ZHH64hUh2jNBCgNBDCm+IjMS0OQjDEPAPTcOJvF8cSK5tAlR2vq5CKDU+RX7CL8vxNVFdI8iSEEEIIcSQkcRKikZmGQarXS5rHQ1jHcLhspCb5sNkMKqtCYBn4zDjibAmgIWQFAEhxubi+dy/6NW9GQqKLSFsbFe0UEWKs31DAG298x6dz17Fy2052lZVguTXeZm6SXAn0VadhaYNNLSx2eC4kEnaQmlZC1bZnWL9yM2X5m6iuKGyS56G1Rusw2vLXfHQYrXWTxCKEEEII0VCSOAlxFBh71njK8PqIWhaWCSnJPhLj3IRCUUJhC5fhJt6WiFO5AYjoMMqwGN/xFM7r0A63w0Y4QRHu66RZhyRMZbJ7q58P/72Bd99Zz45NAVLcaaRkpdInqxcd7J2IRRWL4soIpF8NykVWq3J06ct8Ofs7SvM2EqgsOCb3r3UMrQNoqwwd242O7kLHcms+e//fKkfrIFpbxyQmIYQQQogj0aSJ0/z58xk3bhxZWVkopZg9e/Yhy8ybN49TTz0Vp9NJ+/btmT59+lGPU4iGUEoR73SR6YvDUIrqSIT4OBcpSV6UAZVVQZQ28Jo+AOJtiSTakkiyJzOiZRduPXUIaW4fQa3ZkFjNaeM60rNrC2yGybbtJbzw+kL+7+/zKcuPYPfZGNN2FBnuTIJWjE+jeZB5PQ63lzYdy0mN+zf/mjaP/G1rCVTmN/q9/rhXyYoVomM79yRIhaDDoBwoIx5lxINygA6hY/l7ztmFFStBW1XoPdO2CyGEEEIcb5o0caqqqqJXr14888wzh3X+1q1bGTt2LMOHD2fFihX8/ve/5ze/+Q2ffPLJUY5UiIbzOhxk+ny4bDYqIyFcLgepST58HifVgTDRSE2Pi03ZsCkbpjIxlEGrhERuP20w3VLTiVoW/921iepTTK698gy6dW0OhmL9lgIee/oLPvt0CxEd5aK24/G64iiimi+riokkXosvJZH23crp2u1zXrr7E7Z8v4JAZd4RD5c7UK8SVjmgQHn3JEselPrfygdK2Wr2GQmg3EAMrOI9dezEihXUDukTQgghhDheNOk6TmPGjGHMmDGHff7zzz9P27ZtmTp1KgBdunRhwYIFPPHEE4wePfpohSnEEXPZ7GT4fBRVVVEeChLncJKc6MXptFNW5gcgGIyiDAsNoEGj0Roua9edr1zbmbNjE4t27WBraSm/7N+D7j2yWLp4K9u2FzPn
s418v3oXl13Ym3Obn8M7O2ezMVZAVjSZ9uavSW72Bp2MMmKxZTx3RzWX/r6SfqMHYne6MAwDZSiUYQBqP9Hvu0/rAOhATW8SALaaXiXlqddzUcoE3KDcexK5MFgVaMoBG1o5UYYPlBOlnPWqWwghhBCiMZ1QC+AuWrSIESNG1Nk3evRofv/73x+wTCgUIhQK1W5XVFQAEIlEiEQijRLX3noaqz7x86SAZKcTpTWl1dV47HZcDpPEOCc7AbtdYbMZKKUwlEKpPQmLUpzfuRNd01N5ZfUKCkNVvLhuGZd36cHFF5/KxrX5fPzpGnJ3V/Lkc19x5pD2nNrvVL6t+JavI5tIj++PEf4lyVlv0cNWTiy2htcfjLBpRS6DL+iCN94FhsIAlM3EZjcwlYEyDQxToQzz/7P33/GSXfWd7/1Za8fKJ5/TfU7nHJUaRSwQSMiASTZjbGzAeAaPzTD2Y17zYDMO2K+xh3sfvy4Xj+0x42szHgZzjTHBYLCwECggCanVakmtbnU+nU5OlWvHtZ4/dnW3GgUkFFphvfUq1Tl1Kuyqs7vO/tZa6/frbhPZ9aRAWllQgtz57QTguU616wYpyKbt6RZQAxyQlWwES5ilmY9n3n+M58LsP8aPy+w7xnPxUtp/ns02CP0SKW8lhOCrX/0q73znO5/yOhs3buSDH/wgH//4x89d9q1vfYu3vvWttNttcrncE27zB3/wB/zhH/7hEy7/whe+QD7/7D4dN4yLrZmmfLO6wJko+zBgV6HENcUySaT5wb5FzsxmFfp6yw5DN83QKtTIqRy7qpczXDzFlrF/plNrc9+tJb76t6uxXYcNr13FxuvX4BXci/nUDMMwDMMwXnTtdpv3vve91Go1yuXy0173ZTXi9OP4+Mc/zkc/+tFz39frdVasWMGb3vSmH/niPFNxHHPrrbdy00034TjO83Kfxiub1ppmHDHfaqGSlEO7d3PZdddh2T/6n+RrleKrhw7w7ePHeAw4EbW5ZnQF77psO0cPznDb7Y9RDRSNfx1k+KcS6NXUls3w+qFr0cEg/YXPc8PPtNmw8wgzJ23qS6doPbKH/k0b2Xj1ZXjFETQlstGfC7cZpUkTTdwJsVybQjlPvpxHWk82xe/5pbXKRqCEh5B9CGk++ADz/mM8N2b/MX5cZt8xnouX0v5zdjbaM/GyCk4jIyPMzFxYEWxmZoZyufyko00AnufheU9cG+E4zvP+i3oh7tN45epzXTzHZaZeA7JJbmen6Mnu6cnYwM/vuJRNA0N8cf8+ZltNvndynO+dHGdT/wCvf/dWDv3gNKeO15n8Tj/F608S9J6i3+nj2v4NRMn7KFh/x7pLI8Y2RLSrDZIwAY6RLtyCSHKUegtIqxdFD4petO6e04tyh1D5QeIwpbXUJOxElHsL+AUfab2w0+i0dkC3gQWQCiErZupel3n/MZ4Ls/8YPy6z7xjPxUth/3k2j/+yCk7XXHMN3/rWty647NZbb+Waa665SFtkGM9NwXUZLhTZB4AmVilKazSgVHcWrThbnkFgnQtWsG1wiP9ywxs5ND/P906M89DMFIcW5jm4ME9uTDKyrIfFh2yCh4eZ3zHJv3R2Y8cOGypj+N7Hca1JongC2b+EWlxg8dQJbKtOGndo1ToUKi1KvYs4lnxCfQjFELF3KbG7kyAYYGFqCT/vU+wt4Be8H1r39PwRIqvWp3UIaj47t/oQwkwzNAzDMAzjhXVRg1Oz2eTo0aPnvh8fH+ehhx6ir6+PlStX8vGPf5yJiQk+97nPAfCrv/qr/Pmf/zkf+9jH+OVf/mW++93v8g//8A9885vfvFhPwTCeM7c7PW+0XMGy7Sw4aU2qNercSZEoTapU1lBXq+znSjNW6eFDl++iHUfccfIEd5w8QS3scEQ3UFs0bq2fwnQTRmp8+fC9/NTAdWxY2QNiLfnCesJ6izAfMrSrl1N75vn6X+ymNjtJuTekbyjh8huG2HJVhVy+hWAJi0kks3ji
X/HEv5LPDxL7l9DsbGVhYohcKUehp4CXc1/AAOWhsUHX0WkEsh8hCy/IYxmGYRiGYcBFDk4PPPAAN9xww7nvz65F+sAHPsDf/u3fMjU1xalTp879fM2aNXzzm9/kN3/zN/nTP/1TxsbG+Ou//mtTitx4RZBCYMtnNu1MdwNVqjVBErPU6SCl5C0bNvL2TZt5cGqKbx87xKHFOZI+wXRnOQNJh5wb8uXjP2DX4Ut4w/UjNByfXDGHh2SxtcSKXQN8+Kqf4dD9p7jtCw+wf/cc+3fHOF6Vq966net/5lKKvQKb/TjiIWwOIpnDk9/BK3wHlR+gHWyjOrUDr7SWYqWI470wQ/BCWGjKoNtZM13dg5A9ZuqeYRiGYRgviIsanF7/+tc/bRPOv/3bv33S2+zdu/cF3CrDeOkTIpu2ZwGuZZF3XBphQC0MaUYxlwyPsGv5ch5bPMOdJ0/x4OQsrdoa7JHDMNjktonHePhvm/zMjcsYGrHxbImtBIvNWVS+n01XrWTzVas5eP9JvvuFB5g4Msv3v/IQ933zUa58yzaue8dOeoZ2AQE2j+KIh7MQJeYp5u6goG8niftozezEyl1JrrIB233+A9T5qXtR1kRXR2bqnmEYhmEYL4iX1RonwzCenC0lvbk8RdejEYbUwoAgTFhVHuCnt3m8fdMGHpqa445JiyB/gPzYEseVzX/7VsSbNoxxyWU92FaMlyRESyeJihV6c71suWoZm698J4cfmOA7n9/NxJFZ7v7qw9zztUfYeu1aXvuunazccgUJZ0NUdyRKHMRxlrDt29HJ94gXBkjcXbilNyHd/uf9+QvhorHM1D3DMAzDMF4wJjgZxiuIY1n05fMUXJdGFFELOqjEQhNw3coxrls5xi0TRfbWHmTZqjk6g1X+5XSNw/+8kjffsIlcqYIK2sxPtugr1RgsJBQ8m/WXV9i0600c2jPDXV9+lOMPT7H/7mPsv/sYoxuGuPbtO9hx/XpwriDRV5CFqAPZSJR1AKEW0NEtxIvfRdk3YxV+EscvPK9roISwQFTQqo1Op9G610zdMwzDMAzjeWOCk2G8Anm2jWfblFyXpcDhTGuGxahOj1PiJ0dfz0ixxF1z9zMvG/gbp5jrzPO53RPcMHwp2y4bJW4WmGs0aUaaiuOTdyx8V7Fi+xi/dMky5k8vcvfXj/LQ944wcWSWL/1ft/Evn72Xq9+6nSvfvJVib56Ey0n05UCALQ7giruQehwZf4N06S464p04havw8z7Sfv7CjZB5M3XPMAzDMIznnQlOhvEK5tk2I8UyOUdysjlFKwmQWFxSvoKtpR3cOXU/P1jcg8gHqO2nuas1y2N3ruKmnVeTr5QImi1sq4MjekgDh3pL4ciYypDmp399Fzf/0tXc/y8HuO+f99NYavGdz9/P9764h0tfv4Fr37GTZWsHAD8LUVyGLfbiy69j6ypS/y1R7Xbml96FX1yHX/BwPOd5GYW6cOqeAmsIIczbnWEYhmEYPz5zJGEYrwIVr8gqhpgLq6SJpBXHSCF4w/LXcklpJ187eRcnk4OIQsTimiN8efoUW6wdXD66hXorImWBwUI/RZEnTR0W6hAlDcqFkDf8/BVc/+7L2HfXMe75p0eYODLLnlsPsufWg6zdOcp179zJpitXIaUk4XKaejsu38WTt+F7p/DUn9Ju7mJ+6U14+X5yJR8/99xHobKqeyXQDbSyQQ6YaXuGYRiGYfzYTHAyjFeJolsk1DGJHVP0PGpBSDOOKeXz/OzqN3J4dgf/PH0vsXeGuBhwIH2Ak1OPsb1yCWPRMmI1w0hpkIJVIJ8r0uwI4qRKpbiE51W47A0bufSGDZx6bIa7v/Yw++8+zvFHJjj+yAR9yypc87btXPbGTeRLPhE/SayvwhPfwJF7KeQeIO/voxm8kYXJq7A9n0Ip95xHoYSQaAqgatkIlOx7wXpLGYZhGIbxymaCk2G8SljCouwUWQgXcS3BUKFAM8oq8AlHsHagn1/O
vYk7p0+xe+Eh8uU5lNtiT/MHHPNKbBAbCasJKyrDeNLDzjl0ojJBdZ5SfppCvh/bchjbMsjPbbmR6myD+/75ALtveYzFqRrf/Ku7ueV//oAtV63m8hs3sfGKlQTW+4m5Dk98FUtMUMp9i0LuAZrhT1GfX0djsYmXd8mX8+QKHuIZ9rl6PCFsNDlQiyBsEJUX4NU1DMMwDOOVzgQnw3gV8S2fgl2gETco2Hkqvk/OdqiFAVUR4CQpb1i+mi39g3zxwAGq6jSVgQXSqE472ct47jCrq2vZWNhAySkgLdDSpdFskE+Ok/P7cJ3u20qf4Mr3b+by96zn0e+N8+A/H2b+VI1H7jrKvruOUuzJc+kbNnL5GzexbO1HcbgPT3wLySxl77Pkva104rfRCXoImovkK3lKfSUc99m/bQnhoFHodAGwELL4/L6whmEYhmG84pngZBivMiWnSKRiAhXgShfHkgzkC+Qch0Upmau2GHJzfPSqq/j6kX5+cOo0ljvP4NASUdKgkzvA8eAQ/Z1BVtlrWOeuJe+NkcbzBEkVnevF91ykFGg0jm9z2Zs3cMnN65g5vsi+747z2B0nqFWb3PmVvdz1lYdYtnaAK27czGWv/036eu/CFXdic4CScxDPvp4gfSPtWocwiKn0F8kVc896yp0QHpoUnc6BsBAi9wK9woZhGIZhvBKZ4GQYrzJnp+zVohqxilFaAxohoSfvgPCZXKzTaYf8zKZ1bOnv48uPHebE0R4KxRojlSaFckiSn2VBzfFQ5wGG1TJWO6tYZZXQnRoqLVHI5bFt63EPDCs3LGPlhmW86Zev5PADp9j33eMcvX+CieMzTPzVLN/6G4uNu1Zy7Vvfw44r9uJaB3HF7bj298n3bqITbmFpaiNhzyCl3iK28+zewoTIo3UDnc6DNWzKlBuGYRiG8YyZ4GQYr0K+5eP6LqlO0WiUVtkJRcVLKbo5Ts4uMtdssb6vxH+8aidfPXSMRydtDp0oUyBmNB/Qs7KN358y404xpSZ5ILJZoftZ2SkzFqykUi7jeU98m3Ecm23XrGXbNWtp1QIevfMYj9x2jKmjC+y/7zgH7hsnX/R443uu4CduPkqpXMVmPyVvPyVXEIaraMzsxCtfRa607NmNPoliVmkvnTdlyg3DMAzDeMbMEYNhvEpJIZFPUZ67x60w4g0yPjPHfLtDnyf5tUuu5L6hSb559DDz9TaHWg7lPRWWKcXI1oT86iaJE3JCz3BCTZKLDrNsepjNubWs7lmG+xRrkwoVn6veto2r3raNuVNVHrntKI987xiNxTZf/+uAr/9NL1uvGuYt78uxet0stpjC807gquPo1teIwtXY+V1I/wqEtfxHPm8hBJqz4ckCa9CUKTcMwzAM40cywckwjCdVyHlsWDZMbq5Ktd0hQXPNstXsGl7BN449xvfPnKLZCjncVizuyzFwT4k1l7kMbo+ZdyYIVYcT1gTH4wnKcyXW2aPszK+jnCthPUWPpsGVPbzxg7u44QNXMP7QJI/cdoxD957ksftiDtwXs3zTKDf/8vVs3VEnJx7DEifQyThp8yQEX0Xay8G9DJzLwVr9lCNRWZnyIugaWlkg+02ZcsMwDMMwnpYJToZhPCXPc1g+UMFekLSTmJAUDbxn005+YnQ1Xzqyj8MLC9RzEdVCQOOgovKQzfYdV7B6l82CdZSJ8DQN1eQhfZj9rWNsDlez015Pyc1ju86ThigpBesuH2Xd5aO06yH3fnkfu79xkKlDi/ztby2wYscwr3vfzazdUqYgj+DE+/DUcSw1gUynEcG/gOxFO1eA/xaELD/hMbIGuQVQS2gshNX7IryihmEYhmG8XJngZBjG08r5LoN9JRaWmnhKEdmKWtBhqFDkNy9/LbtnTvNPRx9jPt9mNh+zWI8JHpvk4AGHSy7ZwE++5jKW5DEOdk6zkNR5ND3OYX2azaxmR7KWHB6WLZ8yROXLHm/84C6ufMdW7v6HfTx4yyHO7Jvj8x+7
lXWvWc51793BsnXbsKIEv32Ycu4oOe8okiUIvwPRvejcz4F79RNGlbIeTx6oRbRwTJlywzAMwzCekglOhmH8SPmcCxSZX2zgCBs3X2QxaJMqxZUjK9k5sIxvnzzEd0+doFOIOV2KKSwlBHtPs2+fw5W7BrnpsuXM6iX2to5S1TUejY9xWJ5kq7+ObWIN+UATKo3tSGzXRVoXhpxSX56f/NWruPpd27jr7x/mkduOcXz3FMd3T7H5tSu57r3b6Fm+hcXGBrwGDPXNUXRuQ+oJaP8NxPej8+9DyL4L7jcrU66yMuVIhMy/eC+sYRiGYRgvG2ZFtGEYz0g+59LXU0QpjUygYvtIIWmEIa5l84512/nPV/0Ely8fodzrocZsTq4ImXI63HXPJJ/5n2dYOOzwlspVXJe7koooE6uEhzuH+MfObeyxj0LFRlgWUTukXe8QBREq1RdsR89wkbf9xnX86l++k63Xrwbg4PdP8dmP3MKtf/YAYSshcuD4Qg/H6++lzc2ABfE+qP8+OrwdrdUF95n1dFJoNY/W4Y/1+midoHWAVq0n3L9hGIZhGC9/ZsTJMIxnrFjwsCxBpxPTDiO0UlSThMW4TTnnMZyv8Cs7rmL/4iRfO3KQKbdFu6Bo1iP65xS3fGeeh/bVeNMNg9w8dD0nO1McTo9Q0w0eah3k0fZRtpc2cvnAJoqpQ9yOCNsBWoPt2diOjZTZSFT/aJmf/tjruPbdO7jj83s5cv8ZHrntGI/ecZzLf3IT1/zMNjoi4Wh7E/2FdQznv4klTiLan4fofnT+Awhr+NxzE7KIVo8vU+486WuQhaIEdAIkaB2BDkFHQAookD0g+0ypc8MwDMN4BTF/1Q3DeFZyvkvOd6mkOcIoobcTMl1vMN9sYwtJOeexrXeUDVcOcueZY3x7fJzAt1goJlQXIzoLis/9/Wl2blviuutWsMwbYVbPcDg5ylJa46H6AR5tHGZ7aQO7erZRpoCOEzrNgLAVAALbtbBdGyEEI2v7eM/vv5EzB+e443/vZfzhKR7454M89O3DvObtW9j1zs0stAWN1s8yUtxPyftXpD6EqP8BOvcO8G5CiG6jXlEEXUenC2ANApqzIelcQCKCs+EJDUgQdnbCzy5TVbROwBowTXYNwzAM4xXCBCfDMH4sliXJ51zyOZfecp6FVpupap1mGOFGEoTg+uH1XDqwjFtOHuH+qWmUbzNZDvEWID7Y4PDRg7z2umVs2zLCkBhm0Z/jsegIS0mNh+uPsb9xhK3FjVzes4XyYAmZauJORNAO6TQChBA4no3t2oxtHuQX/vhNnHhkits/t5czB+e498v72XvLEa792e1sf/NqZsMdNNvr6St+Hdc5hmx9CRk9gC78EsIa6/Z4KnXDU9wdVUrJAhKADcIC4SLEU6+Fyu6jgU51Nzx5L8avxDAMwzCMF5AJToZhPGe2bTFcKVEp+Mw0mtQ6HRwkUZiST/K8beVWXjO0jO+cHueQrBLnFOONkPKSoPW90zyyb5E3vH6UgYEhXu8NsVhY4JH2QZbiGo80DvBI4wB9di+juSHWFkZZMTCEnVpEQUjYDmnVYixL4Pguq3cu4wN/MsLRByb43v/aw+yJKt/9nw/ywD8f5Lpf3M7O12+gmvwyXvAAxdw3sdJjyPgPkbm3Qu6tCOFkPZ6IQTiA/6wb5GZ9osqgm+h0GuSgKTphGIZhGC9zJjgZhvG88W2H5eUyvmNTDUIqOQ+hNFGcknNcfm5dgePDc9x+ZpLTTpNOLuZoKWSp2mDmS0fYtqWfa68eppLv5+bS65lllocbj7EQV1lKllhoLPFI4xASwZDXz+r8claWhxi2+olbCXEQErY1li1Ze9ly1l0xyr7vHuP2/72X+lybf/m/72P31w5ywy9dwY5dV9FMt+MmX8KzDyCSryE692MVfxnprQes5/RaCCFAlLJiEekMMICQpefldTYMwzAM48VngpNhGM8rx7IYLBRxLYuFTgcpBeVijmLe
pzcqUPB9xgolDlcXuWtihhm/RS0XsNCMqJ2c48ixGtdePczWzb305wd4e/+NhCJiMpxlMpxhMpihkbSYDueZDuf5AeAIi+X+EGP+EMtEP8UgR1Bvo5Ri07Ur2PoTq9n9jce4+x/2MT9e5x8+cRv3XbKfN//ba1mz/kNoHsJTX0ZEE+ilP0Y5b8DKX4Pt+Nmok3DJpuk5gHN+TdQzIGQBrTvodBatE4TseUI/KcMwDMMwXvpMcDIM43knhaA3l8exbBbaLepRQNHxyPkuy90BKmGBgp9ndaXMYws1vj8xyWKuzYwXMNdps3D/GfY9usj1r13G6lUl8jmPMWs5o7nlpF5KM2kxGc4yHc8yE8/TUQHH4jMcb5zJ1j0JmyHZxyD9LK8O0ePmueqd27j0pg18/x8e4YFvHuTkwzN85te/wiU3rOfN77+OvpH/jM9XsfUDiOhWVPwdEiEQlkQKgZDnw45GAk63KMTZUOVml1nLIfcuhOw9d30hcmhCUPNoFMjeZz39zzAMwzCMi8sEJ8MwXjBF18WRkoV2m1oYUHJdbGlRyuXJex7ldp5izmdTfw8PTy9y7/QZlpptzrgx80GdM7cEXLKqh11XDNJb8ZCWQACWcFllr2CVvQJymqZuMp3MMxPPM5fME+mYSTXHlJjnUecwY8kI62dXMlYa5E0fupLXvG0Lt39uL4/eeZyHvnuU/Xed4Lp37OSG97ybUulyPHkLQjVIVQRphBApQqZIKRFSIIUCEXbLkLcufNLpCYj3onPvAfe6c6NLWaNdC9QCmtSUKzcMwzCMl5mXxF/tv/iLv+BP/uRPmJ6e5pJLLuHP/uzPuPLKK5/y+p/+9Kf5y7/8S06dOsXAwADvfve7+eQnP4nv+y/iVhuG8Ux4ts1wsYgTWFQ7AZZIyTsOlrQYKPZSyuVY7FQp5RwuHR7k3skz3D81Qa0RctwLmV+Y5+GvVCljMTyQZ3gox/BQjsHBHAP9PrYt6aOXPreXrWxAa81iXGUynuVkdIaarnHKmuSEmGCo0cfm5mo296/mXR+7nqveuZXb/ucDjD8yxR1f3ssD3z7IDe+5gmvf/uvYrg3dQSaVpMRBRBIFSJHi+JDLWzgeuC4IK836OOkAgm9l4an9txDtRhfej5D9AAhhZ4UnTLlywzAMw3jZuejB6Ytf/CIf/ehH+cxnPsNVV13Fpz/9aW6++WYOHTrE0NDQE67/hS98gd/+7d/ms5/9LNdeey2HDx/ml37plxBC8KlPfeoiPAPDMH4US0oGcnlytsNSp0MtCsnbDq5l4Vk+Q4VB8q5PPdfgzcV1XDkywp2nj7Fnep66m3I8H4IGP+6QOyHJHZHkIkkhlQz0+wwN+QwPegwPeQwPugx5LkPuKDtzy5iKFnksPMlUMs+cWGAmnWfv3GNs9zewc/16fvG/3syxPRPc+tn7mT9V51t/cw/3fuNRXnPzFtZdOsrohiFsx8Kzc3j5HFop4iiltpiA1liejZ938fM+ru8gS5dAeCt0/gmS/VD/BDr3bnCvRwiJENYPlSvvRwjzoY9hGIZhvNRd9OD0qU99ig996EN88IMfBOAzn/kM3/zmN/nsZz/Lb//2bz/h+vfccw/XXXcd733vewFYvXo1P//zP8999933om63YRjPjhCCouvi2za1IGAp6BAmCQXXxRIWZaeCa7k03Ca+J3hP5XJeOzrB7adPcrwe0ggT4lgTJYpWnBAnGpTOwtSEJDeehSkvEfRUcgwNFRgZKbFz+zquL69jKZrnaHiMU8kZmmmLe8IHeXBmHzvyq7nk8vX86uVvYc93jnDX3+1nabbOdz5/P9/5PDiezaqty1h36Shrd44yun4Q13dwfQetNUmU0qq2aS21sVwLL+dS7HkDTvnSbNQpOQrtz2ejT/kPIKyhHypXPgNyACELF/tXZBiGYRjG07iowSmKIvbs2cPHP/7xc5dJKbnxxhu59957n/Q21157LZ///Oe5//77ufLKKzl+/Djf
+ta3eN/73vek1w/DkDAMz31fr9cBiOOYOI6fl+dx9n6er/szXl1ejftP2XFwgKUgYLHVImfbeLaNg0tZFGlYgtgLWD00xi/29BIHcyy0E2aClMlOzJlmwESjTStKiBNFmCiasSJJEnSi8eOE3GyL3JkFvn/fJLsuHePyy1ZwSX4la6MWp+OjnEiO0I5b3N88xt72OBtzK7jkdWvYfO0befSOE0w8XOXUo/N06hFHHjzFkQdPA+DlHVZvX8aaHaOs2bmckTV9uEUPrTVppGgstWg1Aoo9BfLF38SSdyDDr0J8EGqfIPXeiXZuACGBHFq1gSmE1Y+QxWf9Wr4a9x/j+WP2H+PHZfYd47l4Ke0/z2YbhNZav4Db8rQmJycZHR3lnnvu4Zprrjl3+cc+9jHuuOOOpxxF+m//7b/xn/7Tf8o+7U0SfvVXf5W//Mu/fNLr/sEf/AF/+Id/+ITLv/CFL5DPm4aUhvFypbWmmibMxDEzccR0HDEbR4RKkaSaNNWEkSJNNcW2oLdtceVIhR1ryji2RKOZdec4nZugbtfP3W9v3MvKzih9cR8oqM00mT26wOzRBeaOLxJ1LnyDdXMOQ+v6GFrfz9D6fsrDxSeUG/edGhtGv0NP4QwA9fYyDk/cSCfqe+FfKMMwDMMwnlK73ea9730vtVqNcrn8tNd92QWn22+/nZ/7uZ/jj/7oj7jqqqs4evQov/Ebv8GHPvQhfu/3fu8J13+yEacVK1YwPz//I1+cZyqOY2699VZuuukmHMd5Xu7TePUw+w9EScJSGNAIA1xp4zvZNLhIhbSSBolOsYSN1hBHKUGQEMUKKcB1LaTMSnsrrVnodDjTaDLZbHJkaYkzSw2arZAkUUgt6E0crlk1ys1XrMNxJGGiWIpnOdo+wIw6g5QCy5b02r1s87ay2h1DkICOkTqmemKBM/tmOblvjlMH5ok6CVkViSws9QyXuO5dO7n8jRtJoxRpSYq9RfJFH0t9Hxl+GQgAB+X9FNq5CYSF1nFWXEKWEbL3GVfcM/uP8VyY/cf4cZl9x3guXkr7T71eZ2Bg4BkFp4s6VW9gYADLspiZmbng8pmZGUZGRp70Nr/3e7/H+973Pv7dv/t3AOzYsYNWq8Wv/Mqv8Du/8zvnDqDO8jwPz/OecD+O4zzvv6gX4j6NV49X8/7jOA4536cZRSx12rTThLzjUHCKuI5HJ+2Q6ASlE4QjsHyBE8cEQUIzSpBIfNfBsWyGSyWGSyWuIBuVmmm1eHh2jnvGzzBVbbAgIv55cpxvT51k++AgN21fw+reZfR6wzQ6S4y3HmNSnWRRLXJ3cg97gxxbc1vY4m3Blg49G3qprF/FJT+dYqUJC8fnOf3INOMPz3DysTlqs3W+9T/u5o6/f5Br37GTK27aTHOxQRrFlPpei9VzCbT/N8SPYkVfg/QhKPwSwhpDawd0E4RGWAMI8cT3rqd7DZ/t/qN1+qya+RqvXK/m9x/juTH7jvFcvBT2n2fz+Bc1OLmuyxVXXMFtt93GO9/5TgCUUtx222185CMfedLbtNvtJ4Qjy8r+8F/EwTPDMJ4jKQRlzztXPKIanC9dXnayT4BSnaK0QjkK5aXEuZh2FNDqBLSjgEag8DyJbdlIJFJIBgs+N65ZyRtXr+BMvcl3Hh1n79QMHZWwd3aGR+6Yo7+U46pVo2wb7Gdr71Wsa27jVHiMCes4ddnkvvgB9sqH2OBuYEdhB71OD6lOia2EwvoC2zaMcem7QYYxj952mHu/so+l2Ra3fu4+7vzSXq58y1auuGkTURBT6i1SKP9HhPMD6HwxK11e/y9o/63gvwXE2aIR0y9Y0QitVfYYqooWRYTsMQ15DcMwDONHuOhV9T760Y/ygQ98gF27dnHllVfy6U9/mlarda7K3vvf/35GR0f55Cc/CcDb3vY2PvWpT3HZZZedm6r3e7/3e7ztbW87F6AMw3j5ci2LgXyenHNh6XJLCISQ2EKeW0OUs6DsVlAFRRBF
LbfcwhNPPMHixYu59NJLGR4enu6miWlUqVRYvHgx3/rWtw65/x//8R/5xje+wZ133smjjz5KS0sLl156KfV6vXnMVVddxdNPP82qVav42c9+xq9+9SuuvfbaE3UJYhqtWbOG6667jkceeYRVq1YRRRGXXHIJlUqlecwnP/lJ/uu//osf//jHrFmzht27d/Mnf/Inzf1JkvDud7+bMAz53//9X77//e+zcuVKbr755um4JHECzZkzhy9+8YusXbuWxx9/nIsuuojLL7+cp59+GpDYEUfmscce45//+Z8577zzpmyX+BEv55xzzmHPnj3N24MPPtjc97qIHSuOyQUXXGCvu+665vMkSeysWbPsbbfdNo2tEicTwN59993N58YY29/fb7/0pS81t42Pj9tMJmN/8IMfWGutfeaZZyxgH3vsseYxv/zlL61Syu7ateuEtV2cHIaHhy1g16xZY61N48XzPPvjH/+4eczGjRstYB9++GFrrbW/+MUvrNbaDg4ONo+54447bHt7uw2C4MRegJh2nZ2d9rvf/a7EjjgipVLJnnbaaXbVqlX2D/7gD+zHP/5xa6387REv75ZbbrGLFy8+5L7XS+xIj9MxCMOQtWvXsnz58uY2rTXLly/n4YcfnsaWiZPZli1bGBwcnBI3HR0dLF26tBk3Dz/8MIVCgSVLljSPWb58OVprHn300RPeZjG9JiYmAOjq6gJg7dq1RFE0JYbOPPNM5s6dOyWGzj33XPr6+prHXHrppRSLxWbPg3j9S5KEu+66i0qlwrJlyyR2xBG57rrrePe73z0lTkD+9ojD27x5M7NmzWLBggVcddVVbN++HXj9xI473Q14LRsZGSFJkik/YIC+vj6effbZaWqVONkNDg4CHDJuJvcNDg4yY8aMKftd16Wrq6t5jHhjMMbwiU98ggsvvJBFixYBaXz4vk+hUJhy7Itj6FAxNrlPvL5t2LCBZcuWUa/XaW1t5e677+bss89m/fr1EjviZd1111088cQTPPbYYwftk7894uUsXbqUlStXcsYZZ7Bnzx5uvfVW3va2t/HUU0+9bmJHEichhDiJXXfddTz11FNTxokLcThnnHEG69evZ2Jign//93/n6quvZs2aNdPdLHGS27FjBx//+MdZtWoV2Wx2upsjXmPe9a53NR+fd955LF26lHnz5vGjH/2IXC43jS07fmSo3jHo6enBcZyDKoIMDQ3R398/Ta0SJ7vJ2Hi5uOnv7z+owEgcx+zbt09i6w1kxYoV/OxnP+P+++9nzpw5ze39/f2EYcj4+PiU418cQ4eKscl94vXN930WLlzI+eefz2233cbixYv5+te/LrEjXtbatWsZHh7mzW9+M67r4roua9as4Rvf+Aau69LX1yfxI45YoVDg9NNP57nnnnvd/O2RxOkY+L7P+eefz+rVq5vbjDGsXr2aZcuWTWPLxMnslFNOob+/f0rcFItFHn300WbcLFu2jPHxcdauXds85r777sMYw9KlS094m8WJZa1lxYoV3H333dx3332ccsopU/aff/75eJ43JYY2bdrE9u3bp8TQhg0bpiTgq1ator29nbPPPvvEXIg4aRhjCIJAYke8rIsvvpgNGzawfv365m3JkiVcddVVzccSP+JIlctlnn/+eWbOnPn6+dsz3dUpXuvuuusum8lk7MqVK+0zzzxjr732WlsoFKZUBBFvPKVSya5bt86uW7fOAvarX/2qXbdund22bZu11tovfvGLtlAo2P/8z/+0Tz75pL388svtKaecYmu1WvMcf/iHf2h/7/d+zz766KP2wQcftKeddpq98sorp+uSxAn0V3/1V7ajo8M+8MADds+ePc1btVptHvPRj37Uzp07195333328ccft8uWLbPLli1r7o/j2C5atMhecskldv369faee+6xvb299qabbpqOSxIn0Kc//Wm7Zs0au2XLFvvkk0/aT3/601YpZe+9915rrcSOODoHVtWzVuJHvLQb
brjBPvDAA3bLli32oYcessuXL7c9PT12eHjYWvv6iB1JnI6Db37zm3bu3LnW9317wQUX2EceeWS6mySm2f3332+Bg25XX321tTYtSf7Zz37W9vX12UwmYy+++GK7adOmKecYHR21V155pW1tbbXt7e32Ix/5iC2VStNwNeJEO1TsAPZ73/te85harWY/9rGP2c7OTpvP5+373vc+u2fPninn2bp1q33Xu95lc7mc7enpsTfccIONougEX4040f7iL/7Czps3z/q+b3t7e+3FF1/cTJqsldgRR+fFiZPEj3gpV1xxhZ05c6b1fd/Onj3bXnHFFfa5555r7n89xI6y1trp6esSQgghhBBCiNcGmeMkhBBCCCGEEIchiZMQQgghhBBCHIYkTkIIIYQQQghxGJI4CSGEEEIIIcRhSOIkhBBCCCGEEIchiZMQQgghhBBCHIYkTkIIIYQQQghxGJI4CSGEEEIIIcRhSOIkhBBiWiil+OlPf3rEx3/4wx/mve997zG959atW1FKsX79+mM6z4m0cuVKCoXCdDdDCCHe8CRxEkIIcVwNDg7y8Y9/nIULF5LNZunr6+PCCy/kjjvuoFqtTnfzhBBCiFfEne4GCCGEeP144YUXuPDCCykUCvzDP/wD5557LplMhg0bNvDtb3+b2bNn8573vGe6m3lUwjDE9/3pboYQQohpJj1OQgghjpuPfexjuK7L448/zvvf/37OOussFixYwOWXX87Pf/5zLrvsspd87YYNG7jooovI5XJ0d3dz7bXXUi6XDzru1ltvpbe3l/b2dj760Y8ShmFz3z333MPv//7vUygU6O7u5o//+I95/vnnj+oa5s+fzxe+8AU+9KEP0d7ezrXXXgvAf/zHf3DOOeeQyWSYP38+X/nKV6a87lBDDwuFAitXrgT2DxP8yU9+wjve8Q7y+TyLFy/m4YcfnvKalStXMnfuXPL5PO973/sYHR09qvYLIYR4dUjiJIQQ4rgYHR3l3nvv5brrrqOlpeWQxyilDrm9Uqlw6aWX0tnZyWOPPcaPf/xj/ud//ocVK1ZMOW716tVs3LiRBx54gB/84Af85Cc/4dZbb51ynuuvv57HH3+c1atXo7Xmfe97H8aYo7qWL3/5yyxevJh169bx2c9+lrVr1/L+97+fP//zP2fDhg187nOf47Of/WwzKToaf/d3f8eNN97I+vXrOf3007nyyiuJ4xiARx99lGuuuYYVK1awfv163vGOd/D3f//3R/0eQgghXgVWCCGEOA4eeeQRC9if/OQnU7Z3d3fblpYW29LSYv/2b/+2uR2wd999t7XW2m9/+9u2s7PTlsvl5v6f//znVmttBwcHrbXWXn311barq8tWKpXmMXfccYdtbW21SZIcsk179+61gN2wYYO11totW7ZYwK5bt+4lr2PevHn2ve9975RtH/jAB+w73/nOKds+9alP2bPPPvuQ1zOpo6PDfu9735vy3t/97neb+59++mkL2I0bN1prrb3yyivtH/3RH005xxVXXGE7Ojpesr1CCCFODOlxEkII8ar6zW9+w/r16znnnHMIguCQx2zcuJHFixdP6am68MILMcawadOm5rbFixeTz+ebz5ctW0a5XGbHjh0AbN68mSuvvJIFCxbQ3t7O/PnzAdi+fftRtXnJkiUHte/CCy+csu3CCy9k8+bNJElyVOc+77zzmo9nzpwJwPDwcPN9li5dOuX4ZcuWHdX5hRBCvDqkOIQQQojjYuHChSilpiQ6AAsWLAAgl8u96m247LLLmDdvHt/5zneYNWsWxhgWLVo0ZR7UkXipoYYvRymFtXbKtiiKDjrO87wprwGOeiihEEKIE096nIQQQhwX3d3dvPOd7+T222+nUqkc1WvPOussfvvb30553UMPPYTWmjPOOKO57be//S21Wq35/JFHHqG1tZWBgQFGR0fZtGkTn/nMZ7j44os566yzGBsbO/YLa7TvoYcemrLtoYce4vTTT8dxHAB6
e3vZs2dPc//mzZuPuvz6WWedxaOPPjpl2yOPPPIKWy2EEOJ4ksRJCCHEcfNP//RPxHHMkiVL+OEPf8jGjRvZtGkT//Zv/8azzz7bTDJe7KqrriKbzXL11Vfz1FNPcf/99/PXf/3XfPCDH6Svr695XBiGXHPNNTzzzDP84he/4JZbbmHFihVorens7KS7u5tvf/vbPPfcc9x3331cf/31x+W6brjhBlavXs0XvvAFfve73/H973+f22+/nRtvvLF5zEUXXcTtt9/OunXrePzxx/noRz86pXfpSPzN3/wN99xzD1/+8pfZvHkzt99+O/fcc89xuQYhhBDHRhInIYQQx82pp57KunXrWL58OTfddBOLFy9myZIlfPOb3+TGG2/kC1/4wiFfl8/n+e///m/27dvHW97yFv7sz/6Miy++mNtvv33KcRdffDGnnXYab3/727niiit4z3vew+c+9zkAtNbcddddrF27lkWLFvHJT36SL33pS8flut785jfzox/9iLvuuotFixZx88038/nPf54Pf/jDzWO+8pWvMDAwwNve9jY+8IEPcOONN06Zj3Uk3vrWt/Kd73yHr3/96yxevJh7772Xz3zmM8flGoQQQhwbZV88IFsIIYQQQgghxBTS4ySEEEIIIYQQhyGJkxBCCCGEEEIchiROQgghhBBCCHEYkjgJIYQQQgghxGFI4iSEEEIIIYQQhyGJkxBCCCGEEEIchiROQgghhBBCCHEYkjgJIYQQQgghxGFI4iSEEEIIIYQQhyGJkxBCCCGEEEIchiROQgghhBBCCHEY/x+f2ukbg32wJgAAAABJRU5ErkJggg==",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAA1cAAAGJCAYAAABmacmGAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8qNh9FAAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOzdd3xUVdrA8d8t0ye90nvvIFVKYEFEREEUlRWE10XxhX3ZZa1rAbHtqiy4dtcCCjas2EGKgPQqIF0gkN6TyfSZ+/4RE81SpARC4Pl+PnfdueXcc29O+MyTc85zFMMwDIQQQgghhBBCnBW1uisghBBCCCGEEBcDCa6EEEIIIYQQogpIcCWEEEIIIYQQVUCCKyGEEEIIIYSoAhJcCSGEEEIIIUQVkOBKCCGEEEIIIaqABFdCCCGEEEIIUQUkuBJCCCGEEEKIKiDBlRBCCCGEEEJUAQmuhBBCiIvYnDlzUBSFQ4cOVXdVhBDioifBlRDiklD+BbN803WdOnXqMG7cONLS0s7ZfadPn46iKCQlJeF2u4853rBhQ66++uozKvvFF19kzpw5xz32+OOPc80115CUlISiKEyfPv2E5Xz33Xf079+f+Ph4oqOj6datG2+//fYZ1elUfPLJJwwZMoT4+HjMZjO1a9dm1KhRLF26tOKc5cuXoygKH374YcW+//4Z/na77777Kt3jxRdfRFEUunfvfsJ6/HcZkZGR9OvXjy+//PKYc10uF9OmTePKK68kNjYWRVFO+O4Bdu3axZVXXonT6SQ2NpYxY8aQk5NzGm/p1DRs2PCE78Tr9Z70WsMwePvtt+nbty/R0dHY7XbatWvHjBkzKC0tPeb8lJSUSuXbbDbat2/P7NmzCYfDlc49dOhQpXNNJhPx8fH06tWLv//976SmplbpeziRcDhMQkICTz311AnPKf8dzc3NrbT/yJEjNGnShNjYWDZv3gzAuHHjKj2X0+mkcePGXH/99Xz00UfHvAchxKVHr+4KCCHE+TRjxgwaNWqE1+tl7dq1zJkzh1WrVrFjxw6sVus5u292djYvvfQSf/vb36qszBdffJH4+HjGjRt3zLEHH3yQ5ORkOnXqxLfffnvCMhYuXMjw4cPp2bNnxZfMDz74gLFjx5Kbm8tf//rXKquvYRj8z//8D3PmzKFTp05MnTqV5ORkMjIy+OSTT/jDH/7ADz/8QK9evU5aTvnP8Lfatm1b6fP8+fNp2LAh69evZ//+/TRt2vS4ZQ0aNIixY8diGAaHDx/mpZdeYtiwYXz99dcMHjy44rzc3FxmzJhB/fr16dChA8uXLz9h/Y4ePUrfvn2JioriiSeewOVy8cwzz7B9+3bWr1+P2Wz+nTd1ejp27HjcdlV+nzFjxnDTTTdhsVgqjoVCIUaPHs0HH3xAnz59mD59Ona7nZUrV/LII4+wYMECvvvuO5KSkiqVWbduXZ588kmg7J288847/PWvfyUnJ4fHH3/8mDrcfPPNXHXVVYTDYQoKCtiwYQOzZ8/m2Wef5fXXX+emm26qyldxjPXr15Obm8vQoUNP67q0tDT69+9Pfn4+3333HZ07d644ZrFYeO211wDweDwcPnyYzz//nOuvv56UlBQ+++wzIiMjq/Q5hBA1iCGEEJeAN9980wCMDRs2VNp/7733GoDx/vvvn5P7Tps2zQCMjh07GklJSYbb7a50vEGDBsbQoUPPqOw2bdoY/fr1O+6xgwcPGoZhGDk5OQZgTJs27bjnDRo0yKhdu7bh9Xor9gUCAaNJkyZG+/btz6heJ/L0008bgPGXv/zFCIfDxxx/6623jHXr1hmGYRjLli0zAGPBggUVx0/0M/xvP//8swEYH3/8sZGQkGBMnz79uOcBxqRJkyrt++mnnwzAGDJkSKX9Xq/XyMjIMAzDMDZs2GAAxptvvnnccu+8807DZrMZhw8frti3ePFiAzBeeeWVk9b9dJ1p+3niiScMwLjrrruOObZw
4UJDVVXjyiuvrLS/X79+Rps2bSrt83g8RoMGDYyIiAgjGAxW7D948KABGE8//fQx5R86dMho3ry5YTabja1bt5523U/HQw89ZDRo0OCk55T/jubk5BiGYRhpaWlGs2bNjOjo6GPa2q233mo4HI7jlvPkk08agDFq1KgqqbsQomaSYYFCiEtanz59ADhw4ECl/bt37+b6668nNjYWq9XKZZddxsKFCyudEwgEeOSRR2jWrBlWq5W4uDh69+7N4sWLj7nPww8/TFZWFi+99NLv1ikcDjN79mzatGmD1WolKSmJO+64g4KCgopzGjZsyM6dO/n+++8rhiilpKRUOn4qiouLiYmJqdSroes68fHx2Gy2UyrjVHg8Hp588klatmzJM888g6Iox5wzZswYunXrdtb3mj9/PjExMQwdOpTrr7+e+fPnn/K1rVq1Ij4+/pj2YLFYSE5OPqUyPvroI66++mrq169fsW/gwIE0b96cDz744JTrUlX+e86Vx+Ph6aefpnnz5hW9UL81bNgwbr31Vr755hvWrl170rKtVitdu3alpKSE7OzsU6pPgwYNmDNnDn6//6TD9QA6d+7MddddV2lfu3btUBSFH3/8sWLf+++/j6Io7Nq1q9K5X3755Wn1WmVkZNC/f3+ys7NZtGgRl1122Slfe99993HFFVewYMEC9u7de8rXCSEuLhJcCSEuaeVfOGNiYir27dy5kx49erBr1y7uu+8+Zs6cicPhYPjw4XzyyScV502fPp1HHnmE/v378/zzz/PAAw9Qv379ivkZv9WnTx8GDBjAU089hcfjOWmd7rjjDu6++24uv/xynn32WcaPH8/8+fMZPHgwgUAAgNmzZ1O3bl1atmzJ22+/zdtvv80DDzxw2s+fkpLCzp07eeihh9i/fz8HDhzg0UcfZePGjdxzzz2nXd6JrFq1ivz8fEaPHo2maWdVVlFREbm5uZW235o/fz7XXXcdZrOZm2++mX379rFhw4ZTLrugoKBSezgdaWlpZGdnH/dLebdu3diyZcsZlXsygUDgmPdxvPl95VatWkVBQQGjR49G148/O2Ds2LEAfPHFF797//L5VdHR0adc5549e9KkSZPj/iHit/r06cOqVasqPufn57Nz505UVWXlypUV+1euXElCQgKtWrWq2JeZmcmWLVu46qqrTqlOWVlZDBgwgMzMTL799lu6du16ys9TbsyYMRiG8bvPJYS4eMmcKyHEJaX8i7nX62XdunU88sgjWCyWSkklpkyZQv369dmwYUNFj87//u//0rt3b+69915GjBgBlP1V/KqrruLVV189pXtPmzaNfv368fLLL59wLtOqVat47bXXmD9/PqNHj67Y379/f6688koWLFjA6NGjGT58OA8++CDx8fHccsstZ/o6eOihhzh48CCPP/44jz32GAB2u52PPvqIa6+99ozL/W/lPQrt2rU767IGDhx4zD7DMADYtGkTu3fv5rnnngOgd+/e1K1bl/nz5x/3y7LX6yU3NxfDMEhNTeXBBx8kFApx/fXXn1HdMjIyAKhVq9Yxx2rVqkV+fj4+n69ST+HZWrRoEQkJCZX2TZs27YRJTH766ScAOnTocMIyy4/9d09QKBSqCGbz8vJ4/fXX2bhxI0OHDj3tns62bdvy2WefUVxcfMI5Sn369OHf//43u3btolWrVvzwww+YzWYGDx7MypUrmTRpElAWXPXu3bvStV999RVWq5UBAwacUn2GDh1KQUEB33777UkTofzeM8GxPeFCiEuHBFdCiEvKf38xb9iwIfPmzaNu3bpA2V/Gly5dyowZMygpKaGkpKTi3MGDBzNt2jTS0tKoU6cO0dHR7Ny5k3379tGsWbPfvXffvn3p378/Tz31FBMnTjzul9EFCxYQFRXFoEGDKvXIdOnSBafTybJlyyoFXWfLYrHQvHlzrr/+eq677jpCoRCvvvoqt9xyC4sXL6ZHjx5Vcp/i4mIAIiIizrqsF154gebN
mx/32Pz580lKSqJ///5AWUbAG2+8kXnz5jFz5sxjes1ef/11Xn/99YrPJpOJe+65h6lTp55R3cp7JY8XPJUnTPF4PFUaXHXv3r0iMC7XuHHjE55f3qZP9rMoP1b+cyu3e/fuYwK5a665ptI7PFVOp7OiPicLrgBWrFhBq1atWLlyJV27dmXQoEEVQxoLCwvZsWPHMYldvvrqK/r373/KQV9WVhaxsbHHDYxP1W+fSQhxaZLgSghxSSn/Yl5UVMQbb7zBihUrKn3R3b9/P4Zh8NBDD/HQQw8dt4zs7Gzq1KnDjBkzuPbaa2nevDlt27blyiuvZMyYMbRv3/6E958+ffpJe6/27dtHUVERiYmJJ7x3VZo8eTJr165l8+bNqGrZSPFRo0bRpk0bpkyZwrp16054bX5+Pn6/v+KzzWYjKirquOeWf3muii+d3bp1O+6wu1AoxHvvvUf//v05ePBgxf7u3bszc+ZMlixZwhVXXFHpmmuvvZbJkyfj9/vZsGEDTzzxBG63u+JdnK7yL/I+n++YY+Wp0U/2ZT8nJ4dQKFTx2el0VnxhP5H4+Pjj9uadSHngdLKfxYkCsIYNG/Kf//yHcDjMgQMHePzxx8nJyTmjTJsul+u49/itpKQkmjVrxsqVK7njjjtYuXIl/fv3p2/fvvz5z3/m559/ZteuXYTD4YpADMqGSi5evPi4c8pOZN68edxyyy0MGjSIVatWnfB38GyfSQhxcZPgSghxSfntF/Phw4fTu3dvRo8ezZ49e3A6nRXr1Nx1112VUnH/Vnla7759+3LgwAE+++wzFi1axGuvvcasWbN4+eWX+dOf/nTca/v27UtKSkpF79V/C4fDJCYmnjAJw3/3GpwNv9/P66+/zj333FMpmDCZTAwZMoTnn38ev99/wtTh1113Hd9//33F51tvvfWEaz+1bNkSgO3btzN8+PAqe4bfWrp0KRkZGbz33nu89957xxyfP3/+McFV3bp1KwKTq666ivj4eCZPnkz//v2PSaRwKsp7PcqHB/5WRkYGsbGxJ+216tq1K4cPH674fLLhfWeqfF7Sjz/+eMKfRXmyiNatW1fa73A4KgVyl19+OZ07d+bvf/87//73v0+rHjt27CAxMfF305b37t2bJUuW4PF42LRpEw8//DBt27YlOjqalStXsmvXLpxOJ506daq4ZtWqVRQXF5/yfCuAfv368cEHH3DdddcxePBgli9ffsI/FpzsmYATpv4XQlz8JLgSQlyyNE3jySefrEhIcd9991UMpzKZTKfUGxAbG8v48eMZP348LpeLvn37Mn369BMGV1DWe5WSksIrr7xyzLEmTZrw3Xffcfnll//ucKbjZdw7HXl5eQSDwUo9JeUCgQDhcPi4x8rNnDmzUgbD2rVrn/Dc3r17ExMTw7vvvsvf//73s05qcTzz588nMTGRF1544ZhjH3/8MZ988gkvv/zySd/rHXfcwaxZs3jwwQcZMWLEab/jOnXqkJCQwMaNG485tn79ejp27Pi7z/DbhCcnG953pnr37k10dDTvvPMODzzwwHF/Fm+99RbA7y5w3b59e2655RZeeeUV7rrrrkoZEk9mzZo1HDhw4JTmC/bp04c333yT9957j1AoRK9evVBVld69e1cEV7169ar0HF9++SWtW7c+5ayZ5YYNG8Ybb7zBrbfeytVXX82iRYtOay7Z22+/jaIoDBo06LTuK4S4eEi2QCHEJS0lJYVu3boxe/ZsvF4viYmJFYHP8XofcnJyKv5/Xl5epWNOp5OmTZsed0jYb/Xr14+UlBT++c9/VgwVKzdq1ChCoRCPPvroMdcFg0EKCwsrPjscjkqfT1diYiLR0dF88sknlYb3uVwuPv/8c1q2bHnSL5ZdunRh4MCBFdt/93L8lt1u595772XXrl3ce++9FQkofmvevHmsX7/+jJ7F4/Hw8ccfc/XVV3P99dcfs02ePJmSkpJj0un/N13X+dvf/sauXbv47LPPzqgu
I0eO5IsvvuDIkSMV+5YsWcLevXu54YYbTnrt5ZdfXumdnovgym63c9ddd7Fnz57jZpj88ssvmTNnDoMHDz6lOXf33HMPgUCAf/3rX6d0/8OHDzNu3DjMZjN33333755fPtzvn//8J+3bt6/oTerTpw9Llixh48aNlYYEQtl8q9NdOLjcmDFjmD17NqtWrWLkyJEVGTp/zz/+8Q8WLVrEjTfeeEpzMIUQFyfpuRJCXPLuvvtubrjhBubMmcPEiRN54YUX6N27N+3atWPChAk0btyYrKws1qxZw9GjR9m2bRtQNmQqJSWFLl26EBsby8aNG/nwww+ZPHny795z2rRpFUkXfqtfv37ccccdPPnkk2zdupUrrrgCk8nEvn37WLBgAc8++2xFJrsuXbrw0ksv8dhjj9G0aVMSExMrMqO9/fbbHD58uCIl94oVKyqSHowZM4YGDRqgaRp33XUXDz74ID169GDs2LGEQiFef/11jh49yrx586rk/Za7++672blzJzNnzmTZsmVcf/31JCcnk5mZyaeffsr69etZvXr1GZW9cOFCSkpKuOaaa457vEePHiQkJDB//nxuvPHGk5Y1btw4Hn74Yf75z39WGjb3/PPPU1hYSHp6OgCff/45R48eBeDPf/5zxZf+v//97yxYsID+/fszZcoUXC4XTz/9NO3atWP8+PFn9HxV7b777mPLli3885//ZM2aNYwcORKbzcaqVauYN28erVq1Yu7cuadUVuvWrbnqqqt47bXXeOihh4iLi6s4tnnzZubNm0c4HKawsJANGzbw0UcfoSgKb7/99knnJ5Zr2rQpycnJ7Nmzhz//+c8V+/v27cu9994LUCm4OnjwILt27TqlNeVO5P/+7//Iz8/nkUceYezYscyfP79i6GwwGKz43fB6vRw+fJiFCxfy448/0r9//1POHiqEuEhV6xLGQghxnrz55psGYGzYsOGYY6FQyGjSpInRpEkTIxgMGoZhGAcOHDDGjh1rJCcnGyaTyahTp45x9dVXGx9++GHFdY899pjRrVs3Izo62rDZbEbLli2Nxx9/3PD7/RXnTJs2zQCMnJycY+7br18/AzCGDh16zLFXX33V6NKli2Gz2YyIiAijXbt2xj333GOkp6dXnJOZmWkMHTrUiIiIMACjX79+x5R9vG3ZsmWV7jV//vxKz9G9e/dKz1nVPvzwQ+OKK64wYmNjDV3XjVq1ahk33nijsXz58opzli1bZgDGggULKvad7Gc4bNgww2q1GqWlpSe877hx4wyTyWTk5uYahmEYgDFp0qTjnjt9+vRj3lWDBg1O+E4PHjxY6fodO3YYV1xxhWG3243o6Gjjj3/8o5GZmXkqr+e0NGjQ4Ljt57fK39t/1zEUChlvvvmmcfnllxuRkZGG1Wo12rRpYzzyyCOGy+U6ppx+/foZbdq0Oe49li9fbgDGtGnTDMMwjIMHD1Z6P7quG7GxsUb37t2N+++/3zh8+PBpPecNN9xgAMb7779fsc/v9xt2u90wm82Gx+Op2P/8888bUVFRRiAQOKWyT/Y7+uc//9kAjIkTJxqGYRi33nprpeey2+1Gw4YNjZEjRxoffvihEQqFTuu5hBAXH8UwjjM2QwghhBAXhddff50//elPHDlypGLJgYvZVVddhdPp5IMPPqjuqgghLkEyLFAIIYS4iGVkZKAoCrGxsdVdlfMiJSXlmDlYQghxvkjPlRBCCHERysrK4sMPP+TJJ5+kQYMG/PDDD9VdJSGEuOhJtkAhhBDiIrRr1y7uvvtumjZtesL1x4QQQlQt6bkSQgghhBBCiCogPVdCCCGEEEIIUQUkuBJCCCGEEEKIKiDZAo8jHA6Tnp5OREQEiqJUd3WEEEIIIYQQ1cQwDEpKSqhdu3bFguInIsHVcaSnp1OvXr3qroYQQgghhBDiAnEq6wVKcHUcERERQNkLjIyMrJIyA4EAixYt4oorrsBkMlVJmeLSIe1HnClpO+JsSPsRZ0Pa
jzgbF1L7KS4upl69ehUxwslIcHUc5UMBIyMjqzS4stvtREZGVnsDETWPtB9xpqTtiLMh7UecDWk/4mxciO3nVKYLSUILIYQQQgghhKgCElwJIYQQQgghRBWQ4EoIIYQQQgghqoDMuRJCCCGEEOICFQqFCAQC1V2N8y4QCKDrOl6vl1AodE7vpWkauq5XyRJMElwJIYQQQghxAXK5XBw9ehTDMKq7KuedYRgkJydz5MiR87LurN1up1atWpjN5rMqR4IrIYQQQgghLjChUIijR49it9tJSEg4LwHGhSQcDuNyuXA6nb+7cO/ZMAwDv99PTk4OBw8epFmzZmd1PwmuhBBCCCGEuMAEAgEMwyAhIQGbzVbd1TnvwuEwfr8fq9V6ToMrAJvNhslk4vDhwxX3PFOS0EIIIYQQQogL1KXWY1VdqiqAk+BKCCGEEEIIIaqABFdCCCGEEEIIUQUkuKoB/L4A4XC4uqshhBBCCCGEOAkJrmqA4rwSPCWe6q6GEEIIIYQQ51VqaipDhw7FbreTmJjI3XffTTAYPOH5hw4d4rbbbqNRo0bYbDaaNGnCtGnT8Pv956W+ki2wBggFQ3hcXhxRjuquihBCCCGEEOdFKBRi2LBhJCcns3r1ajIyMhg7diwmk4knnnjiuNfs3r2bcDjMK6+8QtOmTdmxYwcTJkygtLSUZ5555pzXWYKrGsLr9hEMBNFN8iMTQgghhLjUGIaB33t+el/+m9lqPuWshSkpKbRt2xaAt99+G5PJxJ133smMGTNOO/Ph0qVL+emnn/juu+9ISkqiY8eOPProo9x7771Mnz79uAv+XnnllVx55ZUVnxs3bsyePXt46aWXJLgSvwr4Avi9AQmuhBBCCCEuQX6vn7/2fbha7j1rxQwsNsspnz937lxuu+021q9fz8aNG7n99tupX78+EyZMYOLEicybN++k17tcLgA2bNhAu3btSEpKqjg2ePBg7rzzTnbu3EmnTp1OqT5FRUXExsaecv3PhnxTryEC/hBetw97xKW3iJwQQgghhKg56tWrx6xZs1AUhRYtWrB9+3ZmzZrFhAkTmDFjBnfdddcplZOdnU1iYmKlfeWBVmZm5imVsX//fp577rnz0msFElzVGLpZx+PyEooPoWladVdHCCGEEEKcR2armVkrZlTbvU9Hjx49Kg0B7NmzJzNnziQUCpGYmHhMwHSupKWlceWVV3LDDTcwYcKE83JPCa5qCJNZJ+gN4Pf4sTml90oIIYQQ4lKiKMppDc27UJ3OsMDExES2bt1a6VhWVhYAycnJJy0jPT2d/v3706tXL1599dUzr/BpkuCqhlCUsv/xSXAlhBBCCCEuYOvWrav0ee3atTRr1gxN005rWGDXrl2ZOXNmpeGBixcvJjIyktatW5/wurS0NPr370+XLl148803UdXzt/qUBFc1iG7RcZd4iIyLOK+NRAghhBBCiFOVmprK1KlTueOOO9i8eTPPPfccM2fOBDitYYEDBgygdevWjBkzhqeeeorMzEwefPBBJk2ahMVS1ou3fv16xo4dy5IlS6hTpw5paWmkpKTQoEEDnnnmGXJycirK+73erqpQrd/Qn3zySbp27UpERASJiYkMHz6cPXv2/O51CxYsoGXLllitVtq1a8dXX31V6bhhGDz88MPUqlULm83GwIED2bdv37l6jPPGZNEJeMuyBgohhBBCCHEhGjt2LB6Ph27dujFp0iSmTJnC7bffftrlaJrGwoUL0TSNnj17cssttzB27FhmzPh17pnb7WbPnj0EAmXfjxcvXsz+/ftZsmQJdevWpVatWhXb+VCtwdX333/PpEmTWLt2LYsXLyYQCHDFFVdQWlp6wmtWr17NzTffzG233caWLVsYPnw4w4cPZ8eOHRXnPPXUU/z73//m5ZdfZt26dTgcDgYPHozX6z0fj3XOqKqKYRj4PNWzxoEQQgghhBC/x2Qy8dJLL1FUVER+fj6PP/74aa9xVa5BgwZ89dVX
uN1ucnJyeOaZZ9D1XwffpaSkYBgGDRs2BGDcuHEYhnHc7Xyo1uDqm2++Ydy4cbRp04YOHTowZ84cUlNT2bRp0wmvefbZZ7nyyiu5++67adWqFY8++iidO3fm+eefB8p6rWbPns2DDz7ItddeS/v27XnrrbdIT0/n008/PU9Pdu7oZh1Piee8NRAhhBBCCCHEqbmg5lwVFRUBnHSRrzVr1jB16tRK+wYPHlwROB08eJDMzEwGDhxYcTwqKoru3buzZs0abrrppmPK9Pl8+Hy+is/FxcUABAKBii7Gs1VezpmUFwwFCYVCBENBFF3BW+rBXerGbDm9tJii5jqb9iMubdJ2xNmQ9iPOhrSfsxMIBDAMg3A4TDgcru7qnJbyep9tGVVV1qkIh8MYhkEgEDhm2aPTacMXTHAVDof5y1/+wuWXX07btm1PeF5mZmalVZqhbDGx8oXEyv97snP+25NPPskjjzxyzP5FixZht9tP6zl+z+LFi6umoP1VU4yoWaqs/YhLjrQdcTak/YizIe3nzOi6TnJyMi6XC7+/5kwJKe/wKO+sOFslJSVVUs7v8fv9eDweVqxYQTAYrHTM7XafcjkXTHA1adIkduzYwapVq877ve+///5KvWHFxcXUq1ePK664gsjIyCq5RyAQYPHixQwaNAiTyXTK17mL3Xy/YA3t+rbCFlGWgt3r8mGy6iTWja+SuokL35m2HyGk7YizIe1HnA1pP2fH6/Vy5MgRnE4nVqu1uqtz3hmGQUlJCREREWc8X+t0eL1ebDYbffv2PeZ9n06geEEEV5MnT+aLL75gxYoV1K1b96TnJicnVyweVi4rK6sitWL5f7OysiplBcnKyqJjx47HLdNisVSkc/wtk8lU5f8YnE6ZoVCIJ0b/m4LMQqITomjdswUAVrtCwOvHCIPZIv9YXUrORZsUlwZpO+JsSPsRZ0Paz5kJhUIoioKqqpfkEjzlQwHL38G5pqoqiqIct72eTvut1p+UYRhMnjyZTz75hKVLl9KoUaPfvaZnz54sWbKk0r7FixfTs2dPABo1akRycnKlc4qLi1m3bl3FOTWFpml06l82RHLzdz9W7NdNGsFgiIC35nQRCyGEEEIIcbGr1uBq0qRJzJs3j3feeYeIiAgyMzPJzMzE4/FUnDN27Fjuv//+is9Tpkzhm2++YebMmezevZvp06ezceNGJk+eDJRFt3/5y1947LHHWLhwIdu3b2fs2LHUrl2b4cOHn+9HPGuXj+gGwK61+3AV/pqiXtM0PK6anVpeCCGEEEKIi0m1Blfl+e9TUlIqLfD1/vvvV5yTmppKRkZGxedevXrxzjvv8Oqrr9KhQwc+/PBDPv3000pJMO655x7+/Oc/c/vtt9O1a1dcLhfffPNNjRyvWq9FHeo0SyYcDLNlyfaK/SaLjtftIxgInuRqIYQQQgghxPlSrXOuTmWtpuXLlx+z74YbbuCGG2444TWKojBjxoxKqzfXZF2u6MjRPRls+HoLva/rjqIo6GYdn9uP3xtAN10QU+eEEEIIIYS4pF16s+NqoPb9WmOymshNz+fQjiNAWQCJAl6373euFkIIIYQQQpwPElzVABabmXZ9WgGw4ZstFfvNFhMel5dQKFRdVRNCCCGEEOKcSU1NZejQodjtdhITE7n77ruPWYfqRHw+Hx07dkRRFLZu3XpuK/oLCa5qiM6D2gOwY9Vu3CVlCT9MVhNBbwC/V1Y+F0IIIYQQF5dQKMSwYcPw+/2sXr2auXPnMmfOHB5++OFTuv6ee+6hdu3a57iWlUlwVUPUbpJErUZJBAPBisQWZUMDFXwyNFAIIYQQ4qJmGAY+X6BatlPJk1AuJSWFyZMnM3nyZKKiooiPj+ehhx46rTLKLV26lJ9++ol58+bRsWNHhgwZwqOPPsoLL7yA33/yJYm+/vprFi1axDPPPHPa9z0bkgmhhlAUhW5DOvHZi9+w4Zut9Lq2a1liC4uOu8RDZFzEJbnAnBBCCCHE
pcDvD/K/D71XLfd+8dGbsFhOfSHduXPnctttt7F+/Xo2btzI7bffTv369ZkwYQITJ05k3rx5J73e5XIBsGHDBtq1a0dSUlLFscGDB3PnnXeyc+dOOnXqdNzrs7KymDBhAp9++il2u/2U610VJLiqQTr0b8PXry8hOzWH1F1HadC6HiaLjqfYg98bwGq3VHcVhRBCCCHEJa5evXrMmjULRVFo0aIF27dvZ9asWUyYMIEZM2Zw1113nVI52dnZJCYmVtpXHmhlZmYe9xrDMBg3bhwTJ07ksssu49ChQ2f1LKdLgqsaxOqw0q5vazYt3saGr7fSoHU9VFUt6yb2+CW4EkIIIYS4SJnNOi8+elO13ft09OjRo2z6yi969uzJzJkzCYVCJCYmHhMwVaXnnnuOkpIS7r///nN2j5ORcWQ1zGWDOwKwfeUuPC4vALpZx1PiOaOxrEIIIYQQ4sKnKAoWi6latt8GSmdr4sSJOJ3Ok27lEhMTyc7OrnR9VlYWAMnJycctf+nSpaxZswaLxYKu6zRt2hSAyy67jFtvvbXKnuNEpOeqhqnfqg5JDRLIOpzD1mU76DnsMkwWE363j4AvgNlqru4qCiGEEEKIS9i6desqfV67di3NmjVD07TTGhbYtWtXZs6cWWl44OLFi4mMjKR169bHvebf//43jz32WMXn9PR0Bg8ezPvvv0/37t3P8IlOnQRXNUx5YovPX17Ehq+30OPqLmi6RigUxu+V4EoIIYQQQlSv1NRUpk6dyh133MHmzZt57rnnmDlzJsBpDQscMGAArVu3ZsyYMTz11FNkZmby4IMPMmnSJCyWsukw69evZ+zYsSxZsoQ6depQv379SmWU94Q1adKEunXrVuFTHp8MC6yBOvRvi27SyTyUzdE96QCoulax/pUQQgghhBDVZezYsXg8Hrp168akSZOYMmUKt99++2mXo2kaCxcuRNM0evbsyS233MLYsWOZMWNGxTlut5s9e/YQCFwY675Kz1UNZI+w0a53K7Ys286Gb7ZSr2UdzFYzPrcPvy+A+TRSZQohhBBCCFGVTCYTs2fP5qWXXjrrsho0aMBXX311wuMpKSknzTvQsGHD85qXQHquagBNUwkGQpX2XTakIwA/fr8Tr9uHbtIIBkMEvCdfUE0IIYQQQghxbkhwVQM4oh2oilIpwGrYph4JdePx+wJsW7YTKOs6Lc8gKIQQQgghhDi/JLiqAax2C45oO97fBE7liS0ANnyzBQCTRcfr9hEMBKulnkIIIYQQ4tK2fPlyZs+eXd3VqDYSXNUAiqLgjHag6SoB36+T9ToOaIuua6QfyCRtXwa6WSfoD+H3XhgT+oQQQgghhLiUSHBVQ1hsFpwxDnxuX8U+R5SdNpe3BMp6rxRFAQW8vzlHCCGEEEIIcX5IcFUD+EI+guEgzmgHulnH7/k1acVlV3YEYNvynfg8fswWEx6Xl1AodILShBBCCCGEEOeCBFc1QK6rkHxPIbpJJyI2Ap/HX5FSsnH7BsTVisXn8fPj9z9hspoI+gIyNFAIIYQQQojzTIKrGuBIej5H8rIp8BTjjLJjsZvx/dJ7pSgKXX9Jy14+NNAwqDR8UAghhBBCCHHuSXB1AfMHgrz27ipeeWMtGVklHM7JoMTnITI2goAvQDgcBqDzH9qjaRpH96aT8XMWJqsJd4mn4rgQQgghhBDi3JPg6gKmaxoFxR78gRBLvjtIIBQkNTeLgGpgc1jxlpb1TjljHLTu2RyADd9sxWTRCXhlaKAQQgghhBDnkwRXFzBVVfjTTb2wWSErs4jNG7MJ4CMtPxfDrBMOhAiHynqnuv6y5tWWJdsJ+kMYhlExdFAIIYQQQoiaKDU1laFDh2K320lMTOTuu+8mGPz9NV2//PJLunfvjs1mIyYmhuHDh5/7yiLB1QUvJsrBsMH1gBA/rDlIbrafoO6jyO8moCqUFnsAaNyhATFJ0fg8Prav/AndrOMp8VQkvhBCCCGEEKImCYVC
DBs2DL/fz+rVq5k7dy5z5szh4YcfPul1H330EWPGjGH8+PFs27aNH374gdGjR5+XOktwVQO0axVLh9YOIMhnn+8gFDAImfx4VYOiEg9+fwBVVen6S1r2sqGBJgJef6VFh4UQQgghRM1kGAbeYKBattP5Y31KSgqTJ09m8uTJREVFER8fz0MPPXRGf/BfunQpP/30E/PmzaNjx44MGTKERx99lBdeeAG///gjtILBIFOmTOHpp59m4sSJNG/enNatWzNq1KjTvv+Z0M/LXcRZ8RGmd59EMjLTyCt08+23+7n6muZYI3VKShSyMoqoXTeWzoM68N1bK0jddZTctHwckTb83gBmq7m6H0EIIYQQQpwFXyjIqA/fr5Z7f3D9jVh10ymfP3fuXG677TbWr1/Pxo0buf3226lfvz4TJkxg4sSJzJs376TXu1wuADZs2EC7du1ISkqqODZ48GDuvPNOdu7cSadOnY65dvPmzaSlpaGqKp06dSIzM5OOHTvy9NNP07Zt21N+hjMlPVc1gKEaqFa48or6qEqYn/ZksHtnHn7FR2SSFU8gQHZuMbZIG616NAPKeq80k46rsFSyBgohhBBCiPOmXr16zJo1ixYtWvDHP/6RP//5z8yaNQuAGTNmsHXr1pNu5bKzs0lMTKxUdnmglZmZedx7//zzzwBMnz6dBx98kC+++IKYmBhSUlLIz88/B09bmfRcXcDCRpjPDq/g66OLublWTyLi7PS7PJFlq7L5ZvFuatfpRlysRnSCnfxMF6qu0mFge3au2cOWJdsZNLYvvlIfHpcXR6S9uh9HCCGEEEKcIYum88H1N1bbvU9Hjx49UBSl4nPPnj2ZOXMmoVCIxMTEYwKmqlTeqfDAAw8wcuRIAN58803q1q3LggULuOOOO87ZvUF6ri5onoCfTw5/S0m4mMXZu0FVadE6nkb17AQCfhZ+/hOBYAAiwOE04yr2EtM4kci4CDwuDz+t3ouiq7gKXNJ7JYQQQghRgymKglU3Vcv220DpbE2cOBGn03nSrVxiYiLZ2dmVrs/KygIgOTn5uOXXqlULgNatW1fss1gsNG7cmNTU1Cp7jhOR4OoC5jBbuarWYAAO+VPJN0rxGUGuGFQfm1UhI7OQNavTCKoB9Cgds6pgGND88paEwwYbv92K1WHB6/JVrIklhBBCCCHEubRu3bpKn9euXUuzZs3QNO20hgV27dqV7du3VwqwFi9eTGRkZKXg6be6dOmCxWJhz549FfsCgQCHDh2iQYMGVfugxyHB1QVuVJPLaWMxMBSDz9LXEWG34FNCDL2iHhBk1ZqfyUxzY9hDhM0GKgZtUtoQDhsc+PEwuUfzUHSVkvwS6b0SQgghhBDnXGpqKlOnTmXPnj28++67PPfcc0yZMgUo641q2rTpSbdyAwYMoHXr1owZM4Zt27bx7bff8uCDDzJp0iQsFgsA69evp2XLlqSlpQEQGRnJxIkTmTZtGosWLWLPnj3ceeedANxwww3n/NmrNbhasWIFw4YNo3bt2iiKwqeffnrS88eNG4eiKMdsbdq0qThn+vTpxxxv2bLlOX6Sc8MwfFByP5Pq7qKe2YVbyeOHvENoukp8XSed2kUDQT79fAf+QBg1Cnw+H/HJ0TTu3IhgKMzKTzdI75UQQgghhDhvxo4di8fjoVu3bkyaNIkpU6Zw++23n3Y5mqaxcOFCNE2jZ8+e3HLLLYwdO5YZM2ZUnON2u9mzZw+BwK/LDz399NPcdNNNjBkzhq5du3L48GGWLl1KTExMlTzfyVRrcFVaWkqHDh144YUXTun8Z599loyMjIrtyJEjxMbGHhOFtmnTptJ5q1atOhfVP+cUxQJaPSy6zh8TMlAJs754K4qu4fb56HV5HeJiTJSUuFn07X4USxjDGsZT6qXrVZ1RUNjw9RbyMgrLeq9k7pUQQgghhDjHTCYTL730EkVFReTn5/P444+f8bytBg0a8NVXX+F2u8nJyeGZZ55B139NsJGS
koJhGDRs2LDS/Z955hmysrIoLi5m8eLFlTpjzqVqDa6GDBnCY489xogRI07p/KioKJKTkyu2jRs3UlBQwPjx4yudp+t6pfPi4+PPRfXPC8UxHkVx0MgeoF9kNmg+PkrbjNNqwRMOctWQBhXp2ffuLkaNAF/QS/029WjUoQHBQIivXluC1W7BW+KV3ishhBBCCCHOkRqdiv31119n4MCBx0xO27dvH7Vr18ZqtdKzZ0+efPJJ6tevf8JyfD4fPt+vQUdxcTFQNvntt12MZ6O8nNMvz4ZXH4Y59A5Xx+WyzR1Njv8Qe4ubUN/kwBZppnevBL5flc3Xi3YxbkxnNFuQElcJKbf04dCPh9nxwy5+3nGYpPoJFOQWoplVVFWm29UkZ95+xKVO2o44G9J+xNmQ9nN2AoEAhmEQDodr3Mij8nqfbRlVVdapCIfDGIZBIBBA07RKx06nDStGec2rmaIofPLJJwwfPvyUzk9PT6d+/fq88847jBo1qmL/119/jcvlokWLFmRkZPDII4+QlpbGjh07iIiIOG5Z06dP55FHHjlm/zvvvIPdfiGsD2XQruEnRDtS2eaz8Ep2E0IlMYwKdcGsqhiGwaK12WTl+4iLMjOkVxKqWtb1uvHDHRxYm0psvSgG/rkXilp1qTSFEEIIIcS5UT4Sq169epjN5uquzkXP7/dz5MgRMjMzCQaDlY653W5Gjx5NUVERkZGRJy2nxgZXTz75JDNnziQ9Pf2kDa6wsJAGDRrwr3/9i9tuu+245xyv56pevXrk5ub+7gs8VYFAgMWLFzNo0CBMJtNpXVvg2U04lEqk7yV8ITevZSWxxZVAw1AnhiQ2wdAh6Anw7nv78XgVenVvTNvWkWilJiyGlVf+/AYhf5BRdw+jxWVNsTjMxNeOrdI1C8S5dTbtR1zapO2IsyHtR5wNaT9nx+v1cuTIERo2bIjVaq3u6px3hmFQUlJCRETEefnO6vV6OXToEPXq1TvmfRcXFxMfH39KwVWNHBZoGAZvvPEGY8aM+d1IPjo6mubNm7N///4TnmOxWCrSOf6WyWSq8n8MzqRMPaASVBMJMhiz7wtGxmWxxx3Nfv9usgJ1qGO2oThNDBpYj8+/PMya9YdoUL89dpMPp9lJzxHdWfneKhbPXUHbXq0IuAOE/CFsTluVPps4985FmxSXBmk74mxI+xFnQ9rPmQmFQiiKgqpemtM5yocClr+Dc01VVRRFOW57PZ32WyN/Ut9//z379+8/YU/Ub7lcLg4cOFCxWnNNFrb+AUNNJsGsMSzuKLrZw8L0HQQCYSyqRp2GEbRvEwUE+erbvaBrFHoK6X5NF5xxERTlFrP6sw0oqkpJQSkXSKelEEIIIYQQF4VqDa5cLlellZgPHjzI1q1bSU1NBeD+++9n7Nixx1z3+uuv0717d9q2bXvMsbvuuovvv/+eQ4cOsXr1akaMGIGmadx8883n9FnONQMDFJ2A9SZURaNXZDFNrEX4bUf4ISebcMhAVxW6965NTLROcUkpq9dmENbDeANe+t7cm3DY4PsPVuP3+vGUePCWeqv7sYQQQgghhLhoVGtwtXHjRjp16kSnTp0AmDp1Kp06deLhhx8GICMjoyLQKldUVMRHH310wl6ro0ePcvPNN9OiRQtGjRpFXFwca9euJSEh4dw+zDlk0SyEjbIMJoq5CQG9FzbVwg0JRzBpATa6dpFV6kVXVDRdY9AV9VCUMLv3ZZGaWYov4KNVz+YkNU7C5/WzZP5K6b0SQgghhBCiilXrnKvyRb9OZM6cOcfsi4qKwu12n/Ca9957ryqqdkGxqhY8qolAOIBZMxO0XYMW+pH6lnwGxaTztaHyxZGfGW9ric1iIibBTs8eiaxek82a9UdoMjIar8dLn1v68NGjH7L5ux/pPrQzMYlReEu9MvdKCCGEEEKIKlAj51xdahTFhE01ETRCGIaBrtnxmkeiqyYGRmeTZHZTYD7IhuwCDMPApmm0ah+H06lT6vKy
61A+mA2SGyXSrHtzwmGDb95YhgHSeyWEEEIIIUQVkeCqBlDUSKyaFbOiEAiXLWKmWTrjV1oTodsYlXAIi83F8px95Lq8qKqK3WKi02XxhI0w6zccQXWoKMEwXUd0RdVUfv7xEId3HpG5V0IIIYQQ4oKVmprK0KFDsdvtJCYmcvfddx+zDtV/27t3L9deey3x8fFERkbSu3dvli1bdl7qK8FVDaCodhQtGptGRe+Vqir4bTeAYqWZzU+PyGzM0Rl8eSidUDiESdVo1SqOiAgVt9vPlj0ZWCLNOJx2Og3pBAZ8++YygsEwrkK39F4JIYQQQogLSigUYtiwYfj9flavXs3cuXOZM2dORX6GE7n66qsJBoMsXbqUTZs20aFDB66++moyMzPPeZ0luKohFDUKq+rArBgVvVcWUzwudTA21cpVsWlEmUs5HDrIj1lFANjNJi7rGg9GmNXrDmPYFGwOE60GtMEeaSc3PZ8fv9+Ju9iN1+072e2FEEIIIUQ1MgwDf9hfLdvp/BE+JSWFyZMnM3nyZKKiooiPj+ehhx46oz/kL126lJ9++ol58+bRsWNHhgwZwqOPPsoLL7yA3+8/7jW5ubns27eP++67j/bt29OsWTP+8Y9/4Ha72bFjx2nX4XTVyEWEL0WKYkHRYrCFXRQFgphUE4qioNhS8JduJlY/yIiEVN4KWfgy9RDN45w4zGZatYpn86Y8ior8rN12iAEdWlLq8tN9ZHeWvbmM5e+vplWP5rgKSrHaLedlBWwhhBBCCHF6AkaAx3/6R7Xc+4HW92FWzKd8/ty5c7nttttYv349Gzdu5Pbbb6d+/fpMmDCBiRMnMm/evJNe73K5ANiwYQPt2rUjKSmp4tjgwYO588472blzZ0XG8d+Ki4ujRYsWvPXWW3Tu3BmLxcIrr7xCYmIiXbp0OeVnOFMSXNUgihqBVYvEEyqoyBxo0c2UmG4gNvAsHR2FbHTks8WZzlcH4rmhVX2sJjNdusay9LscVq8/RN/LmhMV66R+hwYkNEggJzWHNQs3MODm3nhjHNgc1up+TCGEEEIIUYPVq1ePWbNmoSgKLVq0YPv27cyaNYsJEyYwY8YM7rrrrlMqJzs7m8TExEr7ygOtEw3xUxSF7777juHDhxMREYGqqiQmJvLNN98QExNzdg92CiS4qkEUxYSqxWHTiikKBCp6r6yWBhQHLsepfc+I+EMc8ESwJf0IXQpiaBwTQcsW8WzeWEBBoZ9VG/YzuGdbigpL6XNjLz5++jPWfbmZ9v1a44iyS++VEEIIIcQFyKSYeKD1fdV279PRo0ePSt8ne/bsycyZMwmFQiQmJh4TMFUlwzCYNGkSiYmJrFy5EpvNxmuvvcawYcPYsGEDtWrVOmf3BplzVfMoDqxaDGYlWDH3yqzr+C1XElbiSTLBkNh0nHGZfLj3CKGwgdVkoVu3aCDMyvU/4wsHiUuKJLpeHM0va0I4HGb5+6spLZK5V0IIIYQQFyJFUTCr5mrZqvIP7xMnTsTpdJ50K5eYmEh2dnal67OysgBITk4+bvlLly7liy++4L333uPyyy+nc+fOvPjii9hsNubOnVtlz3EiElzVMIqioWqx2HQrwd9MMHSYnOQr12DTbPSMzKRRRB4lWibLUjNRFYVmzWKJjdHxev0sX7uX6NgIdKuZ7sO7oioqezbs5+ftqZQWyrpXQgghhBDizK1bt67S57Vr19KsWTM0TWPGjBls3br1pFu5rl27sn379koB1uLFi4mMjKR169bHvbfb7QZAVSuHOaqqEg6Hq+gJT0yCq5pIcWDVYzErgYreK5OuoVva4aIDTs3KDfEHiYjNYumRdIq8fiwmO917RGEQZuX6/Xh9QaLinDjjI+lyRQcAlr27CldhqfReCSGEEEKIM5aamsrUqVPZs2cP7777Ls899xxTpkwBynqjmjZtetKt3IABA2jdujVjxoxh27ZtfPvttzz44INMmjQJi8UCwPr162nZsiVpaWlA
2RDEmJgYbr31VrZt28bevXu5++67OXjwIEOHDj3nzy7BVQ2kKMovvVcOgmFvRU+T3WSiSL0akxpFHYuXfrHpWKKy+epAOqqi0KRxFAlxZry+AEvX7iYyyoHmsHHZ0E5YHVYyD2WzddlO6b0SQgghhBBnbOzYsXg8Hrp168akSZOYMmUKt99++2mXo2kaCxcuRNM0evbsyS233MLYsWOZMWNGxTlut5s9e/YQCJR1OMTHx/PNN9/gcrkYMGAAl112GatWreKzzz6jQ4cOVfaMJyIJLWooRbFh1RPwBH6uyByoayoOSxw5niuJ1z5kUPQRtpXEsPVwLL2K4qkXaad79yi++CqXlev3079HSxyRFgIYXD68G0vmr2DFgjW06dUCZ4wTq91S3Y8phBBCCCFqGJPJxOzZs3nppZfOuqwGDRrw1VdfnfB4SkrKMZ0Cl112Gd9+++1Z3/tMSM9VDaZqMdjMUQTD7opGZTOZCGjdCChNcWg61yUcIiI2i4X70sFQadLYTmKCGZ8/wNLVu7FZLegOC50HtiM2KRpXUSk/fLoeV6FLeq+EEEIIIYQ4DRJc1WCKYsaqJWIiRCBUtkq1pio4LVayuAa77qCFrZCO8YfJCOSyNasAs+6ge7dowoRYtWE/Pn8QRdOwx0XQ76bLwYD1X20mfV8mxXkl52XinxBCCCGEEBcDCa5qOFWLxm6OIWSUVuq90vXalNAXq2bhmrjDxMSn8/XBDPwhhcaNzCQmWvAHgyz5YQ9mk45h0mjftzX1WtYhGAyx/IPV5GcWUphTTCgUquanFEIIIYQQNcHy5cuZPXt2dVej2khwVcMpio5VT0bHIBAqy/KnKhBhsZBPCmYtmVhTgMGJ+/GY8ll+KAezZqN792hCRojVmw7g9QUIBsPYou0MvrUfGLDjh90smb+SgqxCCjILCQUlwBJCCCGEEOJkJLi6CKhqJHZzAiHj13lSVl3HbnaQyzBsmpU+URk0SjrIioxsCjwKjRqYSUq2EggGWbZmD6AQVlSad23KoLF9UVBY//VmPn3ua/IzC8lNz8fvC1TvgwohhBBCCHEBk+DqIqAoKlY9ERMqwbDvl33gNJvx0hJDbY9ZVRiRuBfdmceiw9mYVJ2u3aIJGiHWbjmIx+fH7fFhcVrpM7In1/11KLpJZ8+G/bz75CfkHMkjPz0fn0fWwBJCCCGEEOJ4JLi6SKhqBDZzIoFwcUXvlUXXibLayGIYNs1JU1sJPer8xNbcPA7nB2lcz0xybRuBYIjv1+4jGDLwB0JExUfQqlszxk4fhSPSTtr+DN565AOO7MsgJy0fj8tTzU8rhBBCCCHEhUeCq4uEoihYTUmYFTPB8K/Bj91sxqwn4FIGYVI1hsUdJDY2g0VH81AI06VbDCEjxPqth3CVenG5fZhtFmJrxVCnSRLjHr2JuFqxFGYXMe+RBfz842Fy0/IpLSqtxqcVQgghhBDiwiPB1UVEUx3YTMkEwyUVvVeaqhBltVCi9EVVaxOlBxlabyuH3UX8lBugfm2NWnWdBEMhVm04gM8fxOcP4ox2EF83jrg6sYyZdgP1W9bFU+rl3cc/Zvuq3eSm5VOUVyJrYQkhhBBCCPELCa4uMlZTArpiIxj+tWepbHigkyJlJCZVp1dUFk2T97P4aD4EgnS6LJqQEWbDtkMUFnsocfswDAOb00ZCnVhik6IYde+1tOnVklAoxKfPfcWazzdRkFFAUW6xrIUlhBBCCCEEElxddDTVht1Ui2CoFOM3QY/dbEY1tcKrdEJX4fq62ygOF7E5z01SItRtEEE4bLBq/X68Hj9+fxAAi81CXJ1YouMjGXbnIC4f3g2Ape+u5Js5y8hNz6cgq1DWwhJCCCGEEFVuypQpdOnSBYvFQseOHU/pGq/Xy6RJk4iLi8PpdDJy5EiysrLObUV/IcHVRchqikdXHQQNV8W+8uGBXm0EmmKnodVFnwZbWZZWjBIM0KZzFCHCbNl5hKy8Ytxef8W1JrOJ
uFoxRMVH0veGngydMBBVUdn83Y98PPtLctPyyc8oJBgIVsfjCiGEEEKIi9j//M//cOONN57y+X/961/5/PPPWbBgAd9//z3p6elcd91157CGv9LPy13EeaWpFuzm2hR796ErDhRVA8Cs6zitSZSUXolD+YRhSXvYktGcH3LsdEmIoX7DaNIOF7Nq/QFqJUZht1mwmMuaiKZrxCRFo+kaHfu3JSI2go9nf8H+LQeZ//hHXP/XqwmHQsQkx2C2mKrz8YUQQgghLjpl89yra0kcC4qinNKZKSkptG3bFoC3334bk8nEnXfeyYwZM065jN969tlnUVWVnJwcfvzxx989v6ioiNdff5133nmHAQMGAPDmm2/SqlUr1q5dS48ePU67DqdDgquLlEWLRVOjCBkl6ERX7HdYLHgDA/F7V+PUMhneeB1zt8fSMd5Bi45RHDlUxI+70rj8siZoqkZcjKMiwFJVlaj4SDRdQ1UVbpk2ig+e+pTMg9m89cgCRt19LaFQmNjkGKx2SzU9uRBCCCHExciHkTe6Wu6sxL0DWE/5/Llz53Lbbbexfv16Nm7cyO233079+vWZMGECEydOZN68eSe93uVynfT4yWzatIlAIMDAgQMr9rVs2ZL69euzZs0aCa7EmdE1Mw5LbYq9+1GVIlQlEhQFVYEom5280M2YA7PpHp3O6oSDLEuz84d6Dho2iiH1UCGrNuznuiGdyM13ER/rrAiwFEUhIsaJpmsousatj9zIB09/Rs7RPN5+5AOu+8vVNOvSmLhaMdgcp/5LKIQQQgghLg716tVj1qxZKIpCixYt2L59O7NmzWLChAnMmDGDu+6665zdOzMzE7PZTHR0dKX9SUlJZGZmnrP7lpPg6iJm02MJmBtS6j+CXS1CUSNBUX8ZHtiWAl97opRtjGq4nic31qNLYjRN28Vw6GABP+5OJyEugj5dm5KbX0J8jBPLb4b72SNsaLqKrqnc8tANfPLvrzi0M5X3//kpQ+8YRKcBbcsCLKetGt+AEEIIIcTFwvJLD1L13Pt09OjRo9IQwJ49ezJz5kxCoRCJiYkkJiZWdQUvGJLQ4iKmKiqR5kRs5ga4QzpGuASMsqx+DosFq+2PhLBQ1+qif4OtLD6SQ1yiQY8eDQGDJT/sYdHKXfh8IXILXPh8gUrll2cSjKsdw6i7h9G+b2vC4TCfv/Qtqz5eT05aPqXF7vP/4EIIIYQQFxlFUVAUazVtpz9X6kQmTpyI0+k86XY2kpOT8fv9FBYWVtqflZVFcnLyWZV9KqTn6iKnKipR5jgMw8ATyMSOC1Q7qmIi2p7IIe9AkpUvGVp7BxszWvBzSRzN2zQiMSKSL7/bwepNP+P1Brh6YDtyClwk/FcPVnkmQU1XGTLhD0TEOvnh0/UsfXclJfklXHFrP6gThyPKUY1vQQghhBBCnC/r1q2r9Hnt2rU0a9YMTdPO+bDALl26YDKZWLJkCSNHjgRgz549pKam0rNnz3N233LV2nO1YsUKhg0bRu3atVEUhU8//fSk5y9fvvyXqL3y9t/jJ1944QUaNmyI1Wqle/furF+//hw+xYVPUzSizHGY9dq4DTuEPWD4MOs6Sc5rKQzHYddCXNd0NcsP5RBQS6nXJIY/Du+Kqips3nmEBV9uxu3xk1PgwvtfPVjlmQTjkmNIuaEXg2/tD8CGb7fy8bNfk3UoB1dh6fGqJoQQQgghLjKpqalMnTqVPXv28O677/Lcc88xZcoUABITE2natOlJt9/av38/W7duJTMzE4/Hw9atW9m6dSt+f9myQWlpabRs2bLi+35UVBS33XYbU6dOZdmyZWzatInx48fTs2fPc57MAqo5uCotLaVDhw688MILp3Xdnj17yMjIqNh+O27z/fffZ+rUqUybNo3NmzfToUMHBg8eTHZ2dlVXv0bRVZ1oSwwmLRE3EWAEwXATYbPj1UehKAqXxRyhbsQhNuVk4lH8JNWNZtz1PdF1lV37M3n3sw2UlvrI
/u7vXvf4zs7Oa65/5StfIZVKXReuHMehv7//rdtwTdNuyJAOWXcRXtyNH03SDCdoRXWkSGEZLnJ2PpYpBVnXwTVNKoHfHioYC5zvmxdkSYvFqfksTs0nUQnjQbshxneaJ1hl7me5Pc6awjAAldhmV22AvfUBfGykC51uhBAgEUhhYQiBKSWOtDGlwbRfQ8opoqjIl8+Os2hkER9eOojy3k8l2UxBfpusPMHNyw+xYdk5dp+4nX94tIfJmQb/8J1jfOfpM9yxeZA7Ng8hpaRYadBoRWQzDp5jIeS1VRilEoSQCOGgMEGVUXE0G7CuVPoMwyCdT+NlPVp1n1qpjl9vte+Us9UeIbA8G8MyMM12EBJSzA6/a5/HcYw8Jrnl/Zu4+X3rqRRrnD14gRM7T3Ni12lKk2VO7D7Nid2nedgwWHzTAlZtW86yjYtI51IIKUiSBAAhJYYpkYaBmbaQrxQgryKlRNov/9jLITGJE5JEUZ6sEDQDOnrzOJ7+xZimaZr27vG2hqsgCNi9ezef+cxn5m6TUnLfffexffv21/QcX/jCF/j4xz9OOp2+5vannnqK3t5eCoUC9957L3/wB39AV1fXDZ/D931835+7Xqm0J76HYUgYhq/3bd3Q5ed5s55P+/Hy7tx/HCwxiGkV8KMJmmGRRtgA4WEb9lzIEkDOcrCFQdX3qTV9FApTGljy+jk6vUY3vZluVHo95egBTvp7MZMDlOMuSizBSzncnbGxhYUtbRxpYWFhKImIDVQESdAOIaahmHAn2d04zGirhOwaZSyc4U8OjLCtazH3zu/FN3+eGXWcPutRPGOKW1c9zpYVQ+w+uY3dh4oEQZ2xS0f4zlTEisUeKxd7OEZAUGygzBDLDDBEC2giVBOIScwtJO7Pg3BRKgWq1r7f6AKRvm5YnOWadPTlCANvtr28eM3zlxJ1ORgJUnkPJ22T60yzZMMC7vm52xk9O8GZ/ec4+uIppi/NcGrvWU7tPYuQgoVrhxhaMY9cd5ZMIU22kCbbmSFTyCAtMVt9uhEFNAGb1/zPjABhznZ5tBxq5QbNhk9HT45UztNDBd8m787vHu2dQu8/2hvxTtp/Xs82CKXU97ez+qEZGRlhcHCQF154gdtuu23u9t/5nd/h6aef5qWXXnrFn9+xYwfbtm3jpZdeumaO1uVq1uLFizl9+jT//t//ezKZDNu3b8cwrh9m8r/8L/8Lv//7v3/d7X/3d39HKpV6A+9Q07R3g4SES+4oJ7yzlJRPkCS06mnCqX5ud/pY46UwRMJA1wEW9r6IKS8PTVb4YULLT4jiK1+ltiVwHQPLENx4/hXU/S4On/8Qfpi/4f1vh/JYlUuHxrl4YIzSSOUVH2s5Jm7Owcs5uFkXN2vj5dz2bVmHVMEj05W6rnKnaZqmae82jUaDT3ziE5TLZXK53Cs+9l0drj71qU+xfft2Dhw48IqPO3PmDEuXLuXxxx/nfe9733X336hyNX/+fKampl71A3ytwjDkscce4/7778eyXt9aNpr2o7L/KBWjkhpRPEUrrtKMAWFiyvbpaomCKI4IooRGFBLGMYkCU0pMKefmZ70RcZIQRTEoRZwoGmGLg81jHGqcohQEhLGiWe0g48/nwfkLWJjxSFSZ+enn6LTPIoWDwkPhUqoanL0YcHE0wQ8tWoFFJpNn5ZIBBgd6MYwUnpcn7dQwgy8gVAUl0iTur4O5eu7zQdVBZBBGJ0LYb/g9vpZ9J/ADWjWfWrlB6IeYlkmtUufkzjPMjBapFutUZ2rUijWqM3WiIHqZV1NAPHuSmLagb2EffYsG6VvcQ//iXvoX9eBlvZf5+evFUUyr1sLxbPI9edyUHib4w/Sj8t2jvT30/qO9Ee+k/adSqdDd3f2awtXbOiywu7sbwzAYHx+/5vbx8fFXnS9Vr9f5yle+wn/8j//xVV9nyZIldHd3c+rUqRuGK8dxbtjwwrKsN/0P8614Tu3Hx7t//7EAF1vl8ZIq2WiKVlyj
mYQEJFjSwhTG3BAw2zJIAXnlEcbRXPv0II4JkgRTCixp/MBBy0Bi21e+BrN49HEbW4N1vFjay6HyOcqyTJxU+drYBMvtpTywsI/zrQcZDhSmACkkpikxPEnnSqDPZ+euKQ6dLBLH8PiuhI78DFs2Gaxb7ZNNZ+lw/w0dfBkzuYhs/AmR9TMk1vva64OpNIIqQgUI2R4miBAI0W5UYb1iy/dX+ORfYd+xLIt0Jk1Hd55Wvd1EQyjYdO9NuCkH86rPSCmF3/CpztSozNRmQ1ed6kyZanGS2kyFykxAcbxGFISMnBrl0qlJ4MqfUb4r1w5aV526Bztv2MDCNExs26ZZbVEaLZHryZMtpHVL9x+yd/93j/Z20vuP9ka8E/af1/P6b2u4sm2bzZs388QTT/CRj3wEaHeVeuKJJ/it3/qtV/zZf/iHf8D3fX7hF37hVV9neHiY6elpBgYG3ozN1jTtDRLCQhidWDKDmZTxohn8pEUzjmgSAgqBwBAGUkgMYWCbJrYJadsmimP8OKYRhvhxRBK1D91fbsnea177mvtnX0dKrKuqYXk7ywO9d7Ehv5pnp3dzpjaOlBNcimf4i5MDvLdvFbcP9iBUgh/ENFshUZwgpcC0De64ax6bt/Zx8MA0+/dPM11q8d3vDfPU8yOsXdvB4JBLZ+4nuKn3e3TahxDRV/Ebp6kmH0WpdkMPoUogiiQUUCKLwEBKQTrlkMtcbvn+5jLMK000/IZPvdKgWW3RqrcwLBPHs5GGxE27uGmXnvndsz/ZRDCNIESRBtpNMGZGK4yeHWHsbJHRs03Gzs5QHC9Rnq5Qnq5wfNepudc2TYOe+d0MLOlj1a3LWXXLcszZICmEIJXzCFohM2NFwlZAvieHZeuDNU3TNO2d5W3vFvjpT3+aT37yk2zZsoVbbrmFz372s9Tr9bnugb/0S7/E4OAgf/iHf3jNz33hC1/gIx/5yHVNKmq1Gr//+7/Pz/zMz9Df38/p06f5nd/5HZYtW8YDDzzwQ3tfmqa9OiFshNGDkBmMuIxnlImSmBiDSAkCFZOomDCJUCRXApeUpA1rLmgFSUyYXG5Dfm2AElx7g+D7u/cpGtGVkGYKgWVIDGnQ7/TwswMPcLpxgWem9zBSLyELF3mhNsX2PYOsyPezqtDJks4MecvE9yNaYUgUJNimZNstvWzd0sORoyV275mkVAzYvWeG3XugHe0Wcc+miA9s2YtlPEu9dZpjEx8lleqjM+/SmbPIpSogIBEFksSgWG7QbAV05FKkXuciw6+VlBIv4+FlPAI/bAetcoNWrUWSJJiOhe1aSCkR1IDp2feTueY5ugc76B7sYP2dNRQuih5ajZjxc5OMnZ1g7OwEo2fHGT83gd8MGD07zujZcfY8cYB0LsWGe9ax+f71DCzpA8B2LUzLoFqqE/ghHT15Uq9jiKGmaZqmvdXe9nD1sY99jMnJSX7v936PsbExNm7cyCOPPEJfX/sf0wsXLlw3/OP48eM899xzPProo9c9n2EYHDhwgL/+67+mVCoxb9483v/+9/Of/tN/0mtdado7lBAeGC5C5rBUE0vVQAWgDBIcYmUQo4hURJCE1wUuyzBwzPZwwrn/XkfoyChnLqQ1gpAgiWlGMRKwDMmy1AIWpQY5VDnBc9MHmJINYvsUpznFqZIgmTaxceiw0/S6WealMnRKBzuysJXJmlU2m9YOce58kwOHSxSLIeVqQhgKnt63kotjOT5+z3OknAsszv85f//kXVycaK/zZ5qSQs6gkM8w2N/DPbetJUlgYrpKOuWQz3o49lv3VW47FrZjkelIE7QCWvXZilalgRRVbLeGYTsI8fLNfxTp2RBWwk11snDNEAvXDM3dnyQJpYkKY2cnuHBkmL1PHqRarPHCv+zghX/Zwbwl/Wx+/3o23L2OVM4jW8jQrLWYGp4m150l25m5YbMiTdM0Tfthe1sbWrxTVSoV8vn8a5q09lqFYci3v/1tHnroobd93Kj27vPj
tv+0Gzv4KNVqtylXAZCAsAGbREGsYmIVzwUupRISEpRq11AUCVeXrATtFubt4CXbFa7Z65fbwl8WxQl+HONHEa04IoxjBO05VrFqsatyhCPVi1TDJq04IbhBS3JDCBzTwDYMbClxhY0rUrjSY6HdxwqnkyC0KVehUgkJWuPcPO8bZJ1pokjynZ238NyBJVc9Y4ICHMvmvbcs4+5bV6OExDIkuYxLJu3ecOHit2LfSeIQvzGOXx+nURfEvgApcFwbw365kBMjqKPoQtHxKs+fcHLPGXY/up+jL54kjmOgPXRw9a0r2Pz+DSy7eTFJlNCst0jnUnT05LDdN94ARLvWj9t3j/bm0vuP9ka8k/af15MN3vbKlaZp2vcTwgCRQpBCqQ7ARyWtdic91UQSI6WFhT238K5SCYlSXP4vUcnsuQJUO4yRECcxoFCqfVuSKBQRggRDCAwpMKXAFAkZs91RMFQQRIpmFJPEBptzG9ia34oUJj4RxaDOmfI056szjDbKlIIqQgYERog0IoQMkCLAMZpYhmQ0HOdoK81N9gIGs/0UCmmkXEpL/Fs6rH+mQx7mFx/cw0/cZzBcfpBqVVKq+OzeN8LwWIUnXjjC9t2nuOfWZdxy81KmSxGNVkg+6+G51lu6JpRSIahpHLeB4/WTLUj8VkCz3qRVD2g1WkjTwHEt5DXzwgwUHoIiCgtIv9xLIA3Jyq3LWLl1GfVyg/1PHWb3o/sZPTvOweeOcvC5o+Q6s2x6301suu8mpJAEfkihJ4eTcjAt/U+bpmma9vbQ/wJpmvaOJoQEPIThzQUtlI9KarPVrTpggTCQoh2krjnN3QYgwDDa5yiSJCZBEWPOztuKiRIIlEQJB0NaSMPCEja2bZBSkjBOCBJFLQjx4xAF9Hm9DKWXzW1zEMecK5U5NT3DiZkZzpdnUPhIM8Swmni5SUrWDONBmX77PNvSy+hiPkEsORv8FL1WL33OU3SYO7A7x5jIfYKh+V1sXNvD8TNFHn/2LNNTTb7zzDGe3Xmae25byKZ1i2n5KfKZLNmM+wN3FXwlSvmoeApUA0S2XQE0wcu4eBmXKIwIWgHNagu/4RPHCUIITMfCNCXStGaD7AwKE3j1odrpfIrbf3Irt//kVkbPjLP70f3s+94hKjNVnv6HF3j6H15g4er53PTe1azYvIR0Po1lm3gZF8uxMG0Tyzb1IsSapmnaD4UOV5qmvWu0D5BdEC6IHBCAaqGSOhDS7hkoAUG7d7k5d12Iy6HqyskwBIaQWEhcJEpJYtpDDaMkohX7RCokSBIUIYYwMA2TtCnJ2A5+HFEPQ+qBTzOK2sP/DBPbMFjR1cmKrk4eAsI45kK5wqmZIiemZzg52o2bnSDITlINpjjTmKbLOMG27EbWFebRSN7HpWiAAfPrpLjIgPFnnGn+DMjFLF2cZcXim9lzdIwXXhylUvb51pOneWbHBd536xDrVg/R9PPkMwVSKfdN++xVUkclU6DC2WB1fVgxLRPTMvEyHlEQEQYRQSskaAYEzeBK2LJbmNYUwuzj9fwzNLCkj5/4zffzgV+9l2M7T7H7u/s5ufsM549e5PzRiwgh6FvYw/xVgwwtH2Bo1Tw6unOYlomTdnDcdtgybVPP0dI0TdPeEjpcaZr2rtQ+uHdAOLNBS8HrbGRx/XOCRGJhgQFZK0uURMQqJlQhfuwTJbNzvFAIIG1LXNMliGNqQUA19NvzrQxrrrW7ZRgs7SywtLPAA8uWUA9CDk1Msm/8IueDI1jpSSaSGb458wT/Y6ST5e4qtvQuxiz8G3rVl0mpKdZk/p7zrfcz2VyHJ9NsXNXD6uXdHD42xY6dY5TLPv/8+Dme2nmJe28bYP3KPlLpLtJe9g19zkopUFVU3O4IKOSrP58QAsuxsByLVNZDKUUUxoR+SOiHBA2LyC8R1iIS1YlpW+3Ta6y2mbbJujtWse6OVVSmq+x98hB7Hz/AxMUpxs5NMHZugp2P7AWgs7/AglWDDC4f
YGjlPLrnFTAdC8e12kMIbRPbsW64xpamaZqmvV46XGma9q7XDlRvzbAvU5qYmDg4ZMxMu4lGEhGpmCgJiVRMLKL2gsaGoBmF1H2fYtBE0A5ZrmEhhEQiEUKQti22Dc1j29A8/Ggze8cvsLP0IjPJMLgznFLbOXS6E1XvZ1PPB/nYomcY8s6xNPUIfd4xLvj3Ug76MZXH2pWdrFzRwdmTFZ57cYRixefrj1zk6R0T3LOtl3UrCgAErSmS2OLqACpmF/YVYrbSN/dZXh6OCagWqCIICyF+sEqYEAJrdngel8NWkCMKy4S+oNkwCf2QVq3Z3ipTXrUdXLNtgquvQyrrccdPbuWOj9xCdabKxWMjnDt8kXOHLzJ+boKZsSIzY0X2PXUIgHQ+zYLVs5WtFQP0LerBdm1Mq719ltsOWtKQGKaBYUpd5dI0TdNeMx2uNE3TXgdDGBiGcc1sIaUUCQmxSlAqIUpFtOKQkt+k3KpT9puYRruz3+V+fpZ0MISBY5rcOriEWweXMNGc4HsTT3KxNYyUM8SZInsq3Wx/YT0fmGfx4YVH6XLPsNo7T9HdwkR4F0mYI/El8xdn+dSqDRw7OsOT2y8wVfT5h0cu8vSOSYa6DI6fOUEmbeHZJlKK2XwiEKJd8ZsjoD1Xrb2lEoVtZbFtB9OKsQyJkC8fZJWKIDoOqgnWeoS4voNfu7LlYNp53FRItqtAHLtEs8MIoyAiSRKS2bXLVKJQly/PNiOZa3SrFJcvup7NkvULWbJ+IaZtEAYxI6dGOXfoIucPX2T4xAj1cp2jL57g6IsnAHBcm6GV8xhY2sfA4j76F/WS7cwACsOQCNPANA0sx8Rybhy8Lm/LtdvFdbfN9eadvWBYhg5umqZpP2J0uNI0TXuDhBAYGBiifaDsGA5pC7rcDoJMTD3wKflNGmGAEAlCJATKRyQCR9pzVaJer5ePLfw4Y83T7Cy9wHBzioZTpJkv8WRxgBd3LeSnFuzltr4ROtwd5O1DTFr3Uve2Ercsis0Gi5dn+fS6zezaN8H3XrzAxHSLMxea7D5ea1d+hCDlmaRci5RnkvYsUl77csqzSLkWac/C80xSrklXwSMIFUm9ipQC0zSwLQvHNrBMA9OQCNGA8CCE+yE61K52AYgMyrkXnLsR8vrWtUI47bb7yTSmNYBltxcufjkvF2IuX1eK9lwvP6RZa6EUDC7rZ3DZAHd99DYEMHp2nHOHhzl/+CLnjwzjN31O7z/H6f3n5l4nW8gwuHyAweX97cC1pBc35ZLE9bnhoIZpIAyJISWKK+Hp6tA3d3aj2wBjtvGGm3La4c3Wrao1TdPe7XS40jRNewvZhoHtpci7Ho0wpOr71AKfRBkkBPhJCylMLHGlhXq/t5QPOfO42DzMzvJ+SmGD7kyZWlPw38fX8tzkMn524W6W5sr0ON+iQ+6mmPkQXmoZ9UZEJWxx8/oeNt/Ux3M7h3nqhdNYjkkQxigU9WZIvRlC8dW337VN1q/uZfO6fhYP5UiUotH0adbHcOURXOMItjiHkFxZO0zmQRiQzEDrf0DrOyj7NnDfjzD6r3l+IVOopNruQmj0IcTL/7Mkvm/o4g0/b8eCrEeuK0sURnPVsFa9RRhE9A510z3UxbYP3oxhSqZHilw8eonhk6NcOjnKxPkpqsUax3ac5NiOk3PPW+jrYGj5AIMrBhhcNkD/kl4sKUmUmqv7XQ6v7VGW4vKNXLl4+QIwGwQr01XKU5V20w3Pxsu42G57vpqU169bpmmapr2z6XClaZr2QyCFIGPbpC2LVuRS9X3KfpNW3CISAbFoYAkHU85+LUuP+akNLHCHONk4xu7KKWyjRcYrUW4qPnv+Fm5JTfHQ4CF6vAv02n9JXW7AyP4EYVQgUeAkgru3zaPgNtm8eSVCSPwgptGKaMwGrHozpDF7qjcj6o2ARjOi3gyp1gKafsiO/SPsPDDM2kUl3rOpxKoFF0k5M+2qEYpEQRTOw1driORNSGMRrmNg
yP2Y0ePI5Dy0nobWMyTGTcTW/Si5DCFn538pF6HKIAXIHozZitgb0Z7nZWHZFl7GI9+dIwrbHQxDP6RV9wn8kFxnhrV3rOSm967Bsk2SOGH07ASXTowyfHKESydGmRqZoTheojhe4uBzR+deo2ewi2xXFtu15gKR7Vo4no3l2tiz123Pvup+G8ezcVI2qVyKTEd7va8oiGg1fBqVBlJKTMcklfXaz+VYb3jtLqWC9lBNkZntnPnO1a7+xa8YtDVN096p9DeXpmnaD5EQAs+y8CyLnOtQDQJKrQb1sE4TH8sIsKWDFAYICyV7WJk2WZ4a4nh9nAO1k9hGjaxb41jT5PC5W3lfbpjbu8+Rc/bQ4xylJu+hpt5LbKZJOe3ZYQqFEArLgqxhkMuYgNce0mhIpBDtk3GlmUQSN5ia2EWtvIuMfQzbag/3m56BkmERqmVkO7ZipzahRFf7oDhKCOKYRjVCqFUgVmJyhrR8Ckcchmgvwt9LxAIayd34yU2AASJBMIwSTaTZSTblkPKcN3W9rrlW8WmXXGeWOIrnwpbf8Gk1fOIwpnteJ/2Letj2E5sxLYNmrcWlk6NcOjXG8PERRk6NUpwoM3lpmslL029omyzbIpXzSOdSpPLtcy/j4qQdXM/By7hkOtMU+vJ09hfo6M3jZdzXVdWaa6OftEDmwehCiHfWEESl4tl16/z2OmpEILsQMvN2b5qmadrrosOVpmna28Q1LVzTImc71II0080q1bCGL5rt9bKkjRAGMZ1ITNZmTFanF3C6Ncn+ylFcs0jDDfleq5O9lwo8VDjLknSJnPNteuydlJMPUUrWA9BdyGKZBnGSEM82iIjimDDyUdE4gjFEPIGMJjHEFAaTGFTJdoHsFihsas0Uxy8M8vyBbk4M9xOENgLBskXnuXmtz/pVvXiuicn3B6K1RKwlUePYyZNY6iUchnGML5PQRSDvIWAbqBxK1WnFJtMlm0rNIZtxSXk29hus3NxIuymFgZtyyBYyxFFM4IftYYS1JoEf0qq1A+XQynksvmnBXMv2WrHO6JlxmrUWQSvAbwYErdlW862QoNVe2+vK9RD/quutWos4jgmDkPJUSHmqcuONnJtTBiiFNCT9S/pYun4By25ewrJNi8l0pNtNNmYbbVwOyO02+pXZNvoCZHa2rX48G7DevHXQfhDtalqAUs12oFJB+w5htt93PAkIhEy/rdupaZr2euhwpWma9jZzTBPHNMk6DtVWhim/Qimo0qBKxnSxDIdEdKCUhUGR5V4nS9z7uRRMsL9ylBFrnHoQ8d9ri1jZqPFAx3k67BHyzhdJGStJOVuYrJ8gZZZx5DSmmsRUUzjJBEKV2seySqEMrnS5Q6GUIFLdNKI1BKzDSC9n5TqL+UtD9h+dYPehMc5cLHHyXJGT54p8/ZHjrF3RzZZ1/axc0oVpXltdSUQfLePn8NUHsdRz2MnTSKZx43/E4lvUktuYDjegqOCaKWKVYqboUq2lSKfSpFMOjv3W/bNlmAaeacxWtjLtYYR+hN/0adV8goZPFCdIKbFdi6WbFv3A86KUUgStkHq5QaPSoFFpzl5uUq+0b6uXm+3zyux5udEetnh6jNFTYzz7TzsQAvoX97Fo7RCL181n4dr5eFkPyxZYZg3TqiKNNNJ0MUyFYWRA1VHRGBg9P9TgolRCe+Fvv73wt/KBiPai3/bskMXZz1OAShqoeALoQ8jUD207NU3T3ggdrjRN094hbMOgK50m57pU/CyTzTJFv4qULXJWCkOmiTGRagZD1Blyehjq7WfCn2Z/9SjnmsOMBDZ/PpPiDm+M27Nj2PIQa5cewQtMCNuHsrOHs3PrRiXCJRI9JPSQGN0o2QOiB2G2D76DMKbZDGjVpzGEgeuYrF+fY8P6DoplnwNHpth7eJLJqSb7joyz78g4linJpG2yaYds2iaTtsmkrCvnqW247ha6MnvJO89hiSk89RhD1vdoJsuoRUtpJIsxjE5U7NIqOtTr
OVJenkw6heO8tcParp6zlcp6qB5F2GoR+i1azQZ+o4ZfbaJUgsJCCANpmAjDQkoDaVhIw3jZ5htCiPbcK8+ms79j7vYkSUiihCROiOP2eZIk7T+nJKFZbXH24AXOH7nExWPDzIyVGD8/wfj5CV56eA9CCAaW9rBoTScL12SZv2IBjpcAVaRpYFhGex6Y7WNYDUy7D8MuvGXNM5SKZof7ta6qTsWANbt22suHpnazk3o7YIk+hHj5TpKapmnvFDpcaZqmvcNYhkFXKkPeTVHx84zWp5nxa5hSkDJdbLowRQVJlUS59Dpd3O/cSSmscKB6jOO1s+wKXA5MF3h/9iJL7TJTNcVkK8dkK8dEK8dkK8tEK8dEM0stsgHBlRWvfIS4hOAS87Ip7lzQw+1D3WQdGz+IqbYaKMC2JHZWsmVbnptvyTIx6XPg0AyHjhSp10Na5ZCZcoN2hBOXm+ShFCRKtatjCCR3smH5KHeuO8r8nglM8yBZ8zAFUxKLfhpqGS2xlHo4n0bgUq1lSacKZNJZXDf1it0DX692dSUCElAREM8NXzPNENNM8NIxKkkIA0kcQRIHxFFMFCYkkSJOIAoFSSxRykRhAgZCGO3wZZgIYaIUxFFMEiez/dlFew0yQ2KYcq5RhmmZV62vJVmyYRGteot6pcn0yAzDx0a4eHyE80cuMj06w+jpS4yeHmb7NyVCvMjgil4W3zSPvgUFUjkXL+OQyTm4aYFpF5FGF1aqB8d1MSwD0zIwLfMNfa5KBaikCkkVCGmXoiQqcUgQqEQRxwkqabUDagJJnBDFMUkYo5QilfNwPQ8hZytYRt/bPpRR0zTt1ehwpWma9g5lSkmnlyHvpCj6dcbqMzTjFtU4IkHgChNPVLENBylTdFg57uq8hc25dRysHedw9RTfrOVo+U0iaQIClRgkiUViWMROjDDqpCKfKLJIIoskNlFKtAMQiouVBl85dJ5/OHyBm/o6uGOom1WdeaIgIQwiIsCzLVzLZFF/mkX9nfzEvYpiuUm1EVCtB5SrLSr1kGotpNaIaTZi/GZCqxnTasUkSrH/1CD7Tw3Sky+xcv4wK4YusaB3EinOYpjnSZnfI8GjGi2hbq2gEaykWC2QS3eRy3TiummkvHbB4nZQUsDl89mT+r7rJHMVFohAxUAMQs0Ok5TtRZWFAVgIYSIkODf4F1QlESqJSZKQJI7apyQiiUPiKCaOIA4VUSIwRBYnk8Fy8nOLEsur5k29XDXJsq25phyF3jyDS/tZ/97VJHGZZmWMi8enOXtomjMHRyiOVRg+Ps7w8fHrnscwDdIdDpm8TSqXIZXvJJPPkOvKku/O0jmvQK47SxzGBK2AOExm53FdWbj56utKKVQSgKpBUkYlATEOSSiJI9UO1EltblFolSTM9aVHtP+X7UWqlVI0ay0s1yadS+GmWpj25YDlXPdetDdfu2tj+xcMqAiEoauHmvYa6HClaZr2DmdISbeXpcvN0IpDWlFAM2pRDZu0whmCeBIVVklEGktYmNJmS249G7NrOFw5yf7hY8gchETtJ1QKNTtAUNGcex2lFAKBa7ikDQ9PpijWY84WmxQbMcfrUxw7epa06bCmu8Dazg5yhoOqgIlF3kvhWiaGkGTzNl7GIhfZdCceUZwgRIwhBUK0X0dgIJRguhpzcqrK6WKNkZrJRNjNi8c2Yu1osjQ/zMqhS6wYuoRrV7HZD9F+EILpah+nGosIxTI6u5ezbEEfhXRmtvnFbICaC1JwbdD6foJ210KjPf+H2SF9r7N4I6SJkCaS6wNA+2C1XRFTSYQQIQgfhGx3xRPOlTlHr4FhGu0ug1mHsBkQNCMaqU7S+U5WbV2O5VpUS03OHR7l7IFLFMerVIsNasUGzZpPHMVUphpUphrANDCMUgYgrlmYuRU0ePb/2Etnfwed/QUK/R109OXal/s6cD0bRYAQDSRVhIhQuCBshIgRMkHIdjdKw2wHUyEF4qrw2B7y6FOdaVArNfEy
Nn2LOonDhPJEiaplksrW8dIRdnoeUuoK1pvlSrU2BnWlUgvhXPW2/XfGBKMbIbNv5+Zq2jueDleapmnvEkIIPNPGM20KtFtUh3E/rahKKxilFRdpRIpW5OMnCSjJUmcJTinDwqX9hDKmETdoJs2rTi2aSZNa3KAeN0hUQkxANQmpJhWwFfP6oBBF1PyIZhSRKDgWwbEJcE2DtGVgS4loCAwMLGmRMlzyMk+n2UGPVaDT6kAIm0QpJhotzpSqnC3XOFOuUfGDq96kRNgCbAUFuMh8DjQXkzlqsNCcZLF7gYWdF+kvFOnKjNKVGUWxnXLd48VnexBmJx35QRYMLKW/bzGO3Y0wLWZnmQHyTR1G+Hq0X7cd4IScbZGvQlA1VFwF4YLMgfAQwn7lJ5ulVICKpzGtKqbdTSonCfyIVr1Fs+7jugarti5gw3uWYdhXujhGQUSt1KQ605gNXHWqM0WqpYBqEarTTaqlOtWZGk0fauU69UqDiydGrtuGTIdD10CKroE0nQOddM3rpGseFHotAj+kXmpSKzWol1vUSu3wVCs1qZebs5cbNMqt2fB5hePZLFw7wJL185i/opfueTkazghupkEqNx8nnX1HLLSslE97v3pntbe/kXZwmv3Figpnq7Xh7C8hoqseefkXDRbgIoREqeZsB0d0wNK0V6DDlaZp2ruYZVhYRicZO4eKp4mTaSJl0YwVzbhF3fcBcCxBwUlhyCyONBFSYgjB1TlDKUUjblENy1SjCtWwSisJ8ZOQVhxRiwIqQcBEs8F0s0ktatFMElpRgBSCtGWQshSximiGTWZEkTMBhElCFAsC36VUs2g1HULfJYnaQxWlMJiXSbMgn6cvnWa6FTBcqTBcreJHEZNxzKSIOBdnebq+jkK8icW1iLXpEZZ455mXGaYz06IjfQHFBWAfSQ3GGxLbtpBGN+n0AF56AMPsRRndIHtAds92qHt7whbQPiAX1mz1wEfF44CFkmmESM8GrRsHCJU0UMn07OLA2fbjBHONMrIdCX4roFlr0moEtBqtqwpxAts26OrP0tV/+UBZIUSjPSxUdKJUhiiK2HVwJ/3eENWpOpXpKqWJMjOjM0yPTNGo1KmXfOqlGheOCuDcG/o8vIxDKudRKzbwmwEndp3nxK7zAFiuxYKVvcxfmWf+ykEW3XQThb5u3LQ71yL/h6k9V6wESbl9XTjt7ovCBux3zGLNSiWgmihVh6TOlRD1/dVa7xX/LgjhoWjNdnBUCJl76zde096FdLjSNE37ESCECUYvhrAxkiKOFHTYWZpGwFFG6E3nUDIhSmJCAkjAFAZSSAwkQkQIFZCRCWk3Rz+9KJGaXXMoQqgW4BNETWqtFvU4odSEXWNVXrw0xYzfYlomCBkzlHPoTsOYP01VVTBsHyESoIWVATsDtiHxpEOX1c381CC9dh8Fs5vUVa3BE6WYbDS4WKlwvlzkQqXCZKPBTDNipqnYPdOPoh/P2MqtfWWWp2v0qxoymsGWRfLpOnESIBihUR/BNCSOY+A5Jq5rYRoSIV2U7AbZC0Zf+/zyZZF/04OXUookVsRKYRnt+UXtPz8JeCC8dkUhqaCozFazstdUs9rznKqz61clIHI33E5pSryMi5dx2/Pjguia++d+5KqfFUKAaiJEBMImigu4Zx023rmWsBXRrNdJoiqGrGHZCUFoUBzzmR6tMD1SZma0zPRImenRCtWZOoZlkOlIkenwSOc9soX2+eXrmY4U6YJHNm+QzkWYVntoYZT0cOm04OzBEc4eGuXcoRGaNZ/T+y9xev8llDqIYT7OwNJBlqxfzOpbV7DqlqWkc299a/k4iomCFlEwRRLM4AcS23FxvADLrs3Oz7NQwmvPUZoLWz/cEN+uUjXajUVUi/akNvcVOzS+GiFcFP5cwHq5fU/TfpzpcKVpmvYjQgiBMAooYc1WNOqYs93VsmYW05TEKiFSEXES0oprqKSFr6LZduIe0shiCA8pr5ozJECRBRVj
2SF5K8IJKliyxP2Lszy0JMexmSbPXyxxcLLC+Wk4Pw3QDXSRsSSLuqArG2O7EYFoUIoqxCqhSYnjzRLHm4cBcKSLiTHbVbDdTxAbjB7Foh5YoBR+HOLHEX4c04ojEqU4hOJA0yaoD7DE3sLmrl7C6RbF6Smi6jQiLNGVrtGRrdGRqVHIVOnMB7h2A8+t4DgXkLNzrASXA4+DMnpB9s0Gr57Z877ZStHLH1QmcUKs1Gw79csLNieEUdxut54oTFPiuRaObWHPNrFov+7V1azWVdWsDEKk24vuJsVXbWV+Ncs2sV7zGmH27IF5GeL2UD0v65ArCCI/IvBj/KZLqyEgjin02vTML2DZxrXzqKIyhlFFijqCKoIakhqCcYSoIajN3na5o+BVDMHi5Q8wuPx+7vzpjSRJwvi5Gc4dHuXMgRHOHR6hXqozfHyYi8dGeOorz2NYBgtWDzG0fIB5y/qZt6SPgWV95LtzSPn6h4MqpWa7QMbEYUTgh/iNgCiso6IpoAkqjTAlzUodISW2a+NlbGw3wrLLKFGmXR2yZ8PW5blob80QQqViUC1UUmu3vie8fg2xN0gIp/33M54EmYDs0AFL066iw5WmadqPmHZzBBsVT7UPkOcoDBFiEIIBKbNAohxi4RApia8SwjiglSSopNEeNCQur9UkkIi5A3/XSWEaPdTCOjW/xpKCz7qeTqp+kxdHZphqhCzKuyztzNOTyqNkCoXTroQBsYqZ8aeZ8MeZDCaYCiYphUVC5bcbd89OkRJcvnL5MjimwMEC2geoYRzjxxH1IMS0xxlT4/xzMUM27mbrksWszi4kCWJmxhUXRiNeOFZnYqqBIWM6MlU60lW6OqosGvBZ0N+kt1Aj69UwjAYiOg/i/Nx2zB1CCne2utWDEt3EdBGrLoKkkyDMEilIYjXXGEIIgZQSQwpMw0BagiiKqVRbCFoYloFnWziOhW0ZGIacPRhOtcOtCtod+CjTrhh41x2gKxW1Q1cy0z6pmauul9rDIO0tYK1/1ZbmQtgoZPvnARWPo0SAYVl4Vi+prCCJEgI/wG8GtBo+zWoLqJL2juBa+zGt8zd+8qu6DaJmF6yGdtv6JIPCwpCTWHwboqNUmh8lUQXynSk23LWUDe9dBgKmR0pcOHKOc0dKnDs8TWW6xpn95zhz4PzcvgKCdD5F38Ju+hb2tAPX0n4GlvSRKaSRUiANORcK4zAimu2OGDRDoqgdrGY/FCyrhW2WMTzaFc6rBlqqOCHwQ0oTrStBK+tiOxLLCduhhwSwrgwhxJrtQtk+/aAhpd36vg6qOjuP6o1XqV7J5YBFMt3+05MFHbA0bZYOV5qmaT+ChLDB6IXZg0aVVEDZ7d9iy9zsb9AdTGFgAg6QBhKVEKmYWEUEcUCYRCQkKJWQoEhUAkCiAKFwbRslc5RbLUq+j2c43LGoA6kCkC4Cl0g67Wh21cGXIQx6nB467W5WsIZEJQRxSDEskqgESbu7nBASKdrRzpASMXu7FPJKhQmBQjHqj7CveJCL9REQNXxqPNO8yNMzHaxKD7F5qIfFK9LcTR/Ctzl/ocbp8yXODpc5Peqz4xhzjQQNGbN8QcLKxSGL5/kMdNVJ2UUMMYlBCahDdBY4y+WkYACeEHjI9pwls70gcyK6SOghTDqpNHIUKwLDEAz0ZEinLJRShFFMre5TrTcxzATXMvBcsA2FNBLmOrcpvx14VBF1dZBKZtoH1jfshDgrvgDhHsBCWevB3grWTS/b2lwIE8Tl+VjhddU6aUpc08VJCXKZw8St7RAeIUkiVKJIYkhUhiRJk6g0SmVIVJqELEpkUWQRIouQWYSZxzBTGJZJImKieDup+J8xrQu43l8QGp8gFhvaFcEkgUThLXcYXNrNbR+sEsdppsYMLhwbY+L8JOPnJ5m8OE15qkKj0uDswQucPXCh/enM/nllCxl65nfRM7+LvgXdGLZJ2IoIg/DKAs5RQhzHREFEHNYJ/RphmBD5Ym6opWmb
9C/qon9RJ32Luuhf3Iltm+2gNV6aC1qpnIdl21iOaC+sHNcvf5LtkzDaa6IJp/3ZX54PNRe8rp3D1a5SNWerVE3eiirVK2kHLDEbsNABS9Nm6XClaZr2I6q9HlNn+7LRizDTtOd+vPyBlxQSW0jAwjOurGlzJVQlV1UaZm9zEvq9hGrgU2o1aUUBnmmhREyQxMSRT6wiYqVIEtUORkpiSAMp2pUcQwgylkuHOx8JxCiSJCFKFIqERM2uk6Quv/aVGHF5paT59lIWz1tBM65wuHKUvcVDlIIKUXqak0xzbNShIHrZWpjHpt4B1m/o4tYt80hCmJiuceZimeHRGuculZmYbnDsvMGx8xaQAgr0dq5gyYICyxZkWL4gIp8uI9UkUk6TRJMk8SQimUYpnzgeIU4uEccJcayIk/Z7Vygc36HRcrlUTsh4grQn8FxBh6Mw5OziygEQKCLRXli4HTSvbUByYxbIAsjOq847QeQhPgvBTkgmINzdPmGjrA2zFa2brutSePlgWYhrmx0oFUN0BIKXINgL+BjQ7tZtLCWWWwniDcRxtr12l6Q9NE/KK9UiIdot2a963iAOqEYtWsk6ErGInP9VZHIegy+CdRfkP3ZdGFSqE5IavQuyrN62nDiCMIgIWgGVqSqjZyfmwtb0yAyTw9NUpqvUqw3qRxqcO3zxqjd8409VXG5Vjrzhg84duraLYqEvR//iLvoWddK/sJPOgRwdXe3wOBe0HBfTMmfXVUtmT81290hx9RpgZvt1hTl72Z6t7DZ+KFWqV9KucApIpmYrWJ3vyoClZjsotjspfv8vKF7r9XY30rl18eb2FYO3s0up9sOnw5WmadqPsLmDY5l7Q/M85Gwgky8XzAzIWGk6nYhyq0Ul8FFKYQqFMEGIpF2BEgJEPHvw2F5z6nIlypQSKYzZ4YfiqsqUbAcrmBtm166itU/MXg6ihFYcYoo0G3Jb2JTfwkQwys7pA5ypn6GJT4WLPF69yJMzOZalB3hocBlLO+axcCjHQF+Wph+SRIpGM2RkvM6Z4TJnLpQYnagzMdNgYqbBi/vab7mQcxHCo1LrJYq7gdUIFJlUg85slY5Mjc5slUKmRke2Sme2Str1yXghaTcgSRQIaPntU5H2wtG2bWBbBrZtYJkmcWIQqXZbbClthOwA2YkSBZBd7YrBbIiSMtsOLHBdcIGNKPcjEF+EcAcEuyCZgnBn+yTcdtCytoC17gbDDhXEZyB4sf2zqnrVDtID9jawb0UY/VyOAa+VUopW3KIaVVEoUoZHMxbEzqfoiJ9CBo9B8AzEp1CpX0eY8+d+VggDJTOgqghhYDld2LOLD3f05BlY2k80G7ZadZ/QD2lUGkxdKjI9OsP0pSJTIzMkicKyTUzLwLItTMfEssG0fGxHYdgelm21H+NcfpxBsx4wdnaa8XMzjJ2dplqsUxyvUByvcPTFs1f+ilgGvfML9Ax10DOUp3dBJ6m8h2kbWI45Ny/OckwM6/LBuAJaQIyYC2DxbNC2MSwPw7QwrQApw7lFmA1DvmyAfbMJYaFIQTJzVQXr7W+RfyNza3rNtaMP2gFVBVy7cPjrfuarLl/+rK8KWEKiZkNWe66ked39V8LZtT+rQ9m7jw5XmqZp2pvGMU160mmyTru6IIWYDU3XHiQkKmmvqaXiuVOQhLMVrnhuAdvZOs7s7/AvDwCc/Z2+YPZ5wUTgmAYZZRIpRRgmNOOQbmuAB/sHiAg5Vj7OrpmDzIQTJHaFk2GF/3rmFAXRw6rUPNZ2DDCY6cSwHZRS9A96DAymuPfO+cSx4uJIlXMXy5y9WOHSWJVipXXNe8+mbHJZh3y2h1zGIZexyWZNMhkTNy2J0wLf8THlDFI1qVQF0xOK8fGYC2MtRieahJEkSiRxbJAkEsMwGOzPsmgwx1Bflr7eFNm0hSll+0Dw6ubqIkGIytzix3I2oBpSIgyJIZjtkNiHEB9G2B9GcuH/396dR9l11Qe+/+59pjvXrBo0
j7YsY9l4EGoDabDBCYQASR7EsDpOmg6LxE5DgLDgdZhCVmAF0i8B06Qha8Wsl9eB0AG6mxAHY8AOnrCFZMuWLEvWPFSVarrzGfd+f5xzr6pkeZBdlmSzP17X99a9p+49595dpfOr396/H1ayDRlvAzWDCB9Is1Eih3YuB/lKCt40Ivjf0HoQ1MlTByzK6dRC91VgrX4B64UUzbhFI2mgY0EUQTVqUs57RLZizr6Oin0xdvvrkByH+p+j878J3uvnZdYsNMV0XZpuo0WxO/XV9Rxcz6FQzqMHNXEUEwUxI6uHaTd8ojBGRQkKjZRpUCItiW21saxZpKXT5z4tYyVoYPMQkgYxF5OwBbBoVtuMH5xh4uA0Jw5MM3FwmolDM0RBzIn9U5zYP5Udd+d/ZyBEFnBlgZdnY2eBV+c+y5ZYtoVlS2xbYjkWtiOxnWz77HvyBUXfwAS9AxMUCg0cr4DtlXDzRSyngCbXfa/AA5nL1vR5CJkDciAdLCmR9pmDpjTAElmA1clgnd8AS3caIOsIiNHKB8IsU9jJUGUZQWHT6ed1to3Dn/71O8Fwp3F51k9MN7PfYt0tORVYdfrx0Q240qBMZFNEJekU0U6GzAU8E4BdYExwZRiGYSwqIQR555mzZFLINFt12j9D3WmHWncDq05zWUVnamInW6W62yfZbUWSZstcja0tcgnEStGKLDZWLubiyiXUwlnuP/kw+1p7CWWbOca5vz3OPTWXqFWhXwyxstTPWKnMWL5AMbQQWpLvt7liZIgtW0eJIs3Jky0cS1IuexQLNlIKFAmxilE6xtcRSsckBMQa5lSTE7UpTsQTAGzMrWPVqmFWrs/x760ixBZHTjQ5dLTKoeNVDh6t0mxHHDpW5dCxavbH8fS9KORdSgWHYtGlXHAp5G1KRZdS3qGQdygVHAoFh2LeJp9zEOJU1o9svVxaKaMfId+IlG/EFofIiR24PIxFFeJ7gXu5cl0b/DxaynRKnntFmqWyL3nBvZziJGbWrzIXNAh8RcOPaIUhsVYUHZcllRL5fIx2xqgU/2+c4P+D6BFo/wPEj6ELv9ttaCuEjaYERKDmshNYOyseUcqCBzfLPjlQzlMZKBNH6dqpOEqywCskiWZQ8QxJIEkSF00rzaRKcJ1j5N378ayHgRiExuWHaIrE+lLsnkspbr6ItZuXnhrXSjEzXmP8wDTjB9LA6+TROUI/IgrTYC/y4wWNlONsTVe7EZzVe1oohSxbU2f5uhqja6ssWdpCSg0R+HNpHqwjnaIpsqBSICzZvZ1mvzoFPxwSsQllvRG7sB7HtbBdG8u25gW49rwMVhpgnWtax/PKz0ekUzlV9qiVBVLP3tNrMXQDoAV3Pv32aTCm5106+x2lX6tT9+nuNjaIAsgSiMJ5D2iNlAmuDMMwjAtGd9rh8zzvmZ8RU6TXsUoIVUgQR7STEMeqcP3Sa3ltfA2Pzx1gd30vVTVOICJsZ4o2U+wM8mw7WsZv9uBJj6XlPCOFHKO5HGOFIoOFPANLXBCChJBa3CYiRGWviwKNpKZqjMcnORGNU1eNBft6d/gAOxplNubWsbawlIpbYvWKEutX9QFpMDQ12+bQ0TkOHZvm0LEaxyfbKKVpt31a7RBmWs/6nggEhbxNT9mjt5Kjt5Kjr8ejr5Kjp+LRU/aoFBwQa2mxhqZ+K1IdxGM7LtvR+DSjDUTyGpS8DNcq4MZpWOxYutur67nSKivgEbSZaMww5zcIQ1AILEfSW0hPfBthyLHZKr2tPIViTJBzGcj/Hp59H7S/lQZZtU+hi+9BOJekxyok4GWBFGnPMO2jkwZkxSJOlUT30mmEnWAr214nPmiJTpaRaIskTlCxj0weRMR3IdWR7hRVP15CK+mj7OzHFnUs7sfiflxtE8YbCONLCOKNaF2gUHBYs2mENZtG53843al8CNCJJokS4jAhjjvXiiiICYOYKIiI/E4gmBbasO05SuVj9PaeoKfvBMVSDa2y02+l0dqh
Xs0zfrifiWMFVBygVYDjxri5BNebf1HZfTGOl2Dbat4ndxeW81OmJ0Y4fvRK7MImxtYsYXTdEPlSrjulElHsBrdan5tGw+nn3EKratbTy8myUd5zDji09tMpgp2G3OfYs77mGX7M0mCygU7qIPIgK2esJHo+dRqkpwfwi5FlM8GVYRiG8bLxdBkx4FTQpdO1We0oZLTYz1XRJuphg2PBEQ60DjDuHydwY6LCNEH/JGG7xNFmD/urFZQSaK0pOhajBY+SY+MIgWMJXMvCkZLEbtC2p6nLKRIRpRN9RLpvQ9YAy9wxEkKeCJ6krhs80NrOY+29rHVWscIdpc+p0OOUcG2L3nJE/8Y8r9x0EdKqkKg8rbZPszlNszVLoxVTbwqa7Zh6M6LZCmm0QhrNiEYrpNWO0Wia7YhmO+L4ZOOpbxrpFMJKyc0Cr04A9mrKxV/iyMHDXP+6K3Asi0QpGo0ApdtIKbBsC9e28Vwb25Y4VjpN7SnvfaII44QwjGn5EXPtGtNhlVAlSOngeTaulRY46Sh7Hr4VU09CwrqF3w5p+RFLyluolNdD86ugTkDj/0F7N0D+bdlallM6rQOgM00sAtVK83/CRYvCvEa/CTqZAd0EUUI6FiIZx45/AuoeoJ2dNeWJ7StoiqtoJkPYwmUm9slxmLLYh6N2IfQ0jruHInuA76LkOpS4jERchhYD3QyVVmkPNBUnqFijLIW0JbZnpyXrlUarNMARBAjRQMoIxzqBYx3EsQ4gZZ2FisTJMFGyiiheRaRWUyz1sm4ZrBMiq+SoCNoRrZpPu5muRas3QtqzAe1GQKvu06z5tGstQr+JJWe4bMshNl11kt7+Y/T2H2Pi6J3ccdtS9uwYpH+0hyXL+xldM8DYuiFWbBhiyYo20jm7rNvZSkvQN0DVgDCrlvjMjY21qqfTS9U4JCfSizrebTsANlp21jOefunPpjxeGKfPnYqeT+2JV0bI0tNWAn2xaa3SP2poH3QjK7wiswCwnBVguXACwMV2YYwOwzAMw3iRdQIvAM/y6HFhpNhPlCTUwiZLgyE2lNdTCxucCI5xxD/ATDRFXIoJ+08SxScJWyWm5sr47RL7a37Ww0rhFhp4hRpOvoHQCSLriauVReiXCFsVonaZXdoC2thSsKS0gXLPNElugkg22BE9xiF9mIvEKlbST68q49l92KKfJM6htQQitLAplkaoVHqxZB1LNAEXKfNIa+E0vUSlxTkazZC5esBc1Weu5jNb9ZnNrqv1gDhRzNZ8Zmo+HCU9sSedghm0Q+7cfhejQyWWjaRrv5aPVhgZKiIQtP2QZitI66LZEkdKPE+QzgzVBEFCO1LEocJPFHXVoqrqSCkpOnkcaRFFiqkZn9m5gNm5ANeVrFlVobfXw5KCMEmwsWm0Atr+cZYU++krfgw3+ScI7oLgdogfRxd/D2ENn/HzF51qeyKfnYxGWe+wOU6dDiVAEaId6OAnEO+eN4CGwH0tLftKGnEaMJdcN50GS44gyVPTF1Gy/y8KYgaiHRBth+QIkieBJ0F/B+TydGqlc2k2SBqgGtl1PQsWGtnUtuwxmtlaIbpTZgUi3QkctFiBkuvQci1arsOSRSwhyGdrEtN1eOltrfSpUvOJIopikjBOK1vGCp0otDo1PVFYEiGhXQ+ZHT+EZ93NwOAjLF8f8Osr9jIzcYQHfjTGI/cvYff9B9K1klpjuxZDy0oEluTYXRMUSkUK5SJeIUe+5JEr5ckXPbyiR6GUI1fMkSvlyJdz5Is5pFwYpHcDUq2zUvb1rK9XBOTQopQ9noDWCGYRahzUiew6vY0+8x8YsqMF4nRt4fz1hadto0UPWAMg+tNrOZCuQRTF9CLL2TS9cxPcLOyJF6RTM1U1XXsoS1k268XNxnUyVFq1s4AqBHR6rWbAWj4vi5wFgKKQBVovr+mMJrgyDMMwfqE5lsVAvkJ/rkw78ZkNaoyG/WyMNlKNahxsHeCw/yStpI7IBywZaCPUNLlk
kKbyqTOTrv3SZIv688ioD+33EAclZKKRiULIhCBJ0GgiJThW86FWQso8ufI0+fI0E3KSJ+tTeJQYYi3rShezvrfA+v5eCraLRJAonU4ZS1ySuECsG+hkBhVNp/dpt7teDdITayenGci59A+7IHqytWpp8JQkMbVWSLXWploPqNZ9avWAej2kXouYPiHxw4gDx2c5eHwu7T2WFcoYGsgzNlxg2ZIiy0dyLF2SQ3o2tQZobaeL8nVCIhIaUYvD0zNMzjRp16BW08zORszORdQbMdgJYkkDOVyDhuDOR3vpSwZYt7qH1avK9C3xKLkOwhIcb0xTbxcZLP4GJW8jdvj/QnIwnSYo+7PpUWWQPdn1Gb4mh5Dzpg+qOQjvg/Df5mUxJDivAO/fo+TF1FWLdtzGlQ62XHgK5VkesYqpxXUSu5dS7leR+V9Lm3lH29NLvC+t1pgcAf9/n3E8itOuTy3bkSgcEnLE9JNYaxHOBhxnA44s4sjnvv7tTDkDrRQqy2qdKfiybItCZQM6WYdSdfLuffS59zG0rMHay07y1nqVXdvWcP+dA5x4skkUJowfrNJsh8zsm06L0OjTCzd0ijecOuJOMNgpWZ+fH3SVbAolQb4E+bJNrlQkXyySL1v09J2kUj5Ezt2PLcdBhN16o6dfKfrRYhglRtCMgBxFy5Gs5cAcQs+kwZmeQTCTXmf3pcHcDCQzp39SpwLZznuKk62HKs67lLprpBCl7P2IsucN0+v5t4myQCW7v3MbJw3Q3StBjs1b+5ZOi02nDNbPMGVw8U79FwZUzSxDpYEYot1ZJdLH0EqhZR8y/8vgvSb9ZjWHZnZeNiv/lFYQL1VC66crVXPufPnLX+bzn/884+PjbN68mS996Utcc801Z9z2tttu43d/93cX3Od5Hr5/aomm1ppPfvKTfO1rX2Nubo5rr72Wr3zlK6xfv/457U+tVqOnp4dqtUqlsjjzhaMo4vvf/z5vetObcJ5lobdhnM6MH+P5MmPn7Gmt8ZVPM2pSj1rESpPEcNQ/wZPNfRxpP0mk/bQ0tkj/Ce2xi6zKjbEqv5Qhdygt9CAslE6IdYLSMVLY2MLGwqURKg7XahypVTlWrzHeaDLZ9rEK0+TKMwiZBj9xmKc5N0zs9zBW7mF5pRfXstKgSKVFPWKlUCpBqRAISFSC0hZKpQvf0567Gk9KSo5N0bYouhZl16LoSso5QcGxce2sqiNphk8iQcGRRyYpr+xhYqrFxESd8ckWJycDWq0EuqWiJSI7ae6t5Bkb6WN0qJdGK2B8usbJmRq1ero+LH0NAA1WDMM11NIa1lATbI3Kzk1lDLLloI/2oY724qkcy5YXWbumwiXr+ygUJHbkUnIKlPMBRfH3WOqJs/ikneykrpJOJ4v30S0iIMrgvRbc1yCsQYIkoBE3CFVIzsotaEmgVLdXNwCJTgiSgJyVo2yXFwRhWtXTtWLRdoj3pmvDRCkL+LITb1HK9uvU1zE5mkriqxiJxJFO1vA7RmmNJdL7PJnDkQ62sFjstS1a63RNmDoVeGnVQsT3YiU/RKi5bC2aRyt4FcePvIJjTwY8uucooz0VIj8kbAcEfkDYjgnaMaGfELQ1QSsm8GPCdoRKB272OXR6u6l5pehTfUMBazbOsXpjlZUbqrhekn1Lp/y8RaNWol7rodXqI/AHCIJBEgZxvSJewSVf8vAKDrmCS67gYlkSvxUQtEL8ZoTfDLOvI/xWelsndSw5h+PU8LwaXr5Bsdii0g/lfk2pR1EoJbiexHbtbF9Y9M9jATmcBlnOK8FamX5eKu0VqOIYlbRQKkQrhzgqIKwCXiGHm3MRMqtGmF2eLYt0KqCaP+Uv68sWPZG2eYh2pusXs4A9SRyECNJed1YZmXsDonAdkAP8LIh00gCrm2mzLqh/v84mNjjvmatvfvObfPCDH+Rv/uZv2LJlC3/1V3/FDTfcwJ49e1iyZMkZv6dSqbBnz57u16cP2L/4i7/gi1/8Il//
+tdZvXo1H//4x7nhhhvYtWsXuVzuRT0ewzAM46VNCEHeyuNJj6JTpBm1CByfi7zlbOhZhR//Ek/WD3C4dZCi5bE8t5R+u4gQCltEJCpMTywIEVLiCJecXcASHrZwAYuyLRgtWGwZKRHr5QTKpZHAST/gWL3KnuZuJuInCa0Q1z1EGOY4OTfC8WNV5lcgO/10TaA51RMpLapBdqvbJwydVQDPrjWAouAKSh4UXE3eU+QdhWPHxE7CqqSPsbFerl0zhGN7xNqi2kgYn/Q5eTJgajJgarJFteYzUw2YqY3z6J7xbJqSBqEQQuJ5Nt6gRTJaI+qfQRVnSUh7oGkESZTDb/QgREKuNIvujbF7prDWnySeKrL/SB/77ijxrz84yrKxIhvWVdi4epBl/b003d+jmJsj57SwRAOLBlADVQddS6eQqexa+6TTAmeAmVNvoL0OvNeB88q0vLhWNOMm9aiJFJCTBZI4IUxiZv0aexoHiVXCpvJ6BoolHMfGEhY5K0eQhCRqjrJTxrPSDJmQZfCuTS/PQaKSNNCPmyid4FouVlahcf7awljFRCrCT3wkFra0yVkejnBwpL0o066EEAhbpEF3Vw74VbT+5bSEv387OjlBj/tTKhvvZ90l1+As6+eqyy9FWr1o5KmTfh2BCtEqC6iUTZJIgtDGb2j8lqJda9CuzdBqVon9OsXiOD19RxkYOka+UEclOsu2aZo1hwOP93JwTw/HDpSZPZlDqfn72gaOZJfFUskudH+WNBq0xvFiiqWEJSs9Rld6DC1zGVrqMjBq0zMAxVKClO2sIbRDp0G0xs2KcTiASzrt00mvs20QTpppjbYjkl2QnEBH30Pr/4NSfYTJJvz4FQTBUoJmRLsR4DcDglaLMGgzMNpD/2iJXD5HvlzAzbvYjkPaj2t+dlFyqgR89ttGt7KfHwVYEO+H6KH0DwY6nSqtlCaOB2j5ryCIL8fOLcGVD+GoO7GSGVT0T4jW99Hu63CKNyDsnqwoSQOd1ECk0zy1fmlmss575mrLli1cffXV3HrrrUBarnT58uX84R/+IR/96Eefsv1tt93GBz7wAebm5s74fFprxsbG+NCHPsSHP/xhAKrVKsPDw9x222381m/91rPuk8lcGRcaM36M58uMnRdOaUWgAppRC1+1sYSFhUOsNYlKiBNNmCSESUigfBKtsBA4WLhSYiOxRIQgwhI660el0XgoyijyaGGRZFUN/aRNqALaqs2+1hPsb+0hUCFRorB1CTfrubTwP5lN15MIBDYKKWIsFFLYCOEQJjGNOKCtAnwVEKqQSIckRAiZdI93XsX39Gt1qiKg1hId57GTMnlRoSz76LP7qLh5yp5NTlskDUVrOqE6F4CnaeVDGlbEnHOSmjWOlZtDiFMZiCTyIBhgibWclcURVlQqnGg0uffYYZpiglx5BjffwJMCR0mStiQ4VCI53Ido5BAC+so5Nq4dYu3ynqwgh0ep5OHZNo4rsS0Ly5LpRQiEjLKgqw6qmk5pslcirGXd/QrjiGpYoxG1sJVFHAuqQYP9/hEOBEeYjKezhtZgY7Ext5arejYxUKrgOjaWJQiSAK3TBtsFq/CcsxdaK/wkoJW0CFWIK10EFvvrR3ikuhe0ZH1hBatLS/EsGynTYxMizZwlKiHWMQKJI21c6eJKF0c6T98IfBForSB6GPx/gWQ/Wmuq1RY9Pdmxi1I2NbOSXfem09VkD1BMp4hRQOOBtkiiJ9Hx44h4H0IdQuuku6YLLQnjVUTxeiJ9EYkYIwlJM03tiHY9oN0M8Bsh7UaQBRgh7YZPu5EW8WjV06Cj3QhIouxnQAhy3ayWS66Y3s4VXXJFj3wpzXK5OQc37+DlHaQlmJtsMDNeY+ZEjenjVaZPVImC+LSg69TaMS/vMDDWQ6mvgGWnvdXS3mVZnzXbSkviO1b6+XYfSwvJoDV+MyTyG/QNHGJk6SFGlp9AyrhbCKU257Dn4X6eeHiAw/t60OrU+Cv3FVix
cQkrLhpi9WWjLF0/RKGUw8s5CDtrYN3960vnF4IGrKwR+UNZM/F0DZtSmkRVaPmvYGZqPYf3eZzYP8PRJyaZGZ9j6boh1m5exqVbphnsvxfJRPZ2u2j3Ncj8L+Pkl2TvT9rYOYrg9jt2XRD/fr1kMldhGLJt2zY+9rGPde+TUnL99ddz3333Pe33NRoNVq5ciVKKV77ylfz5n/85mzZtAuDAgQOMj49z/fXXd7fv6elhy5Yt3HfffWcMroIgIAhOVbSp1WpAelISRdELPs7Oc82/NoyzYcaP8XyZsbM4bGzKsoSHSyNu0lZNLGFhCwtkjBSaouPiyQqOcLGzdQ2J0iSkJ1SRUsRJQJzEaJWAsBHCxhJpLkDaDmXhISmjSIiUz/LCEFfFl7Gr8RiPN3aT6BAIs1OdU/9XaYpo4f1ZRkqke4C2BcIGjzTfILJ1LhoPpUEpgcQB5aC1QxLbRJGk2qhBMSaxmmkGymmhnBZNJmgAJwDVdoiqOZIgRxR56DiPVywSyhly+RpeoZZm9rLXdXSBfpayOr+GDUNL6fEWlmheWS5zzcgwe2ZmuPf4cQ5OTZAvz5IvzZIrxVQuqyM2NVBzOZp7S8wcrXD/jpCfPXwibTWt04CwXHQoF1zKJYdy0T1VEbGSp683x0BvP8XCCLZjISJB3G7TasecrFYZr87RaAbU/ZBxJjjpTFB35tJMiVIkCvRsAWlrrL42D7Z38WhzLxvza7m8dDGD+R68nAVSM9ueI7BDilZpQUXEM4mSiKZqEiQBEpsk0Txc382O2uPMxjVUokHAnsZ+nJM2y50x1njLGHOHyTsejmOllRwtO52pJSJ81QYSLGGRt4vknsN+PG/iFZC7FJK9EPyAMH60mylNg9nTqxueiY1GIglPLceyQMulaLmRWF5MotaiEweZKGQYkQQJwkrw8g6uZ1HuzaeviciKcojuRQqFkOlFyjR4iMOYOFa4ORshbbSSqNhOp9UmChUn2Q9V1qTbtpC2xLIEtpP2kovCiCTRqDjdvjbVYGa8zvSJKjPj9W7wNTdZJwpjThyYhv3T2ft2lm/zwv8BY9jOMGsumeOizdOsv3SWUiXkyteMc+VrJwh9l8P7hjm0b4wjTwRo3WD6yAQzx2DHjzTFisPI6gGWbRhk1SXDLFnZh+Na2dRnjdAJqP2I6CGETtckag2JKjB5YjW7tw+x+0GL409OMzvxb3R+M3X2duZElZ3/to/v3irpGRzj2jePcuVrD9A/NItUPyQJfow/ezXkfgW3sAxpFYiTNHC7EP79Opt9OK+Zq+PHj7N06VLuvfdetm7d2r3/Ix/5CHfddRcPPPDAU77nvvvuY+/evVx22WVUq1W+8IUvcPfdd/PYY4+xbNky7r33Xq699lqOHz/O6OipXhLveMc7EELwzW9+8ynP+alPfYpPf/rTT7n/f/yP/0GhUFikozUMwzCM5y8UIVPuNInIGi13rtEokd0SGo1CdW+fura0haMcHG1n1w6ucrq3bW0jnuEMT6OpihYnRY2q3aBmNWjbLSIrQKFJsiyOOu20whICRwhyymMkXMKqaITecdtDCwAAMj9JREFUpPyMr3W6k1HIz5sNHvebWPkmhd45iuUmBUuQFxIVC8R4D+0DJYJpG78usulNz86SgryXrmMLQkWiNLgx1mgDe2kdOdjurq0DUDN5kmNl4uNl8G1AI5e0cC6eQvb5SCFwpEV/bQlro5WMFIvPa71NG5991hFO5MYJidJCJoEgPJD+1dxaWscqJEiRNQNOLIrVPiqNfvqjXvKuQ96zyHsS15Hnub+QwrF8HLuFa7dw7Sau00yv7daC+23r1B+748RjtrmC2cYK5horCKJz0zfrxZbEiuZMi/pkM11nlqh0PVt2rboFRbLbcVZsJM4eV2kBHTfv4ORsnLyDm+9cp/d5RcnwkknGhg8w2HsAx1pYmyAOE+IgTi9hwunRgJQC27OxPQsnZ2ctFgQqUfgtyb7dQ/z8pz3s
uNsiDhSnqywpMbCyl4EVvRQHCkwfmmVy3zRTB+dQSWd7zdpLalz/G1OsuaSJ49nYrs104yIOn7yaVjD44n0IZ6nVavGud73rOWWuXnLB1emiKGLjxo3ceOONfOYzn3lewdWZMlfLly9nampqUacF3nHHHbzhDW8476lN46XHjB/j+TJj58XTaVh8etW4F4PWmkjHhElAqEMSrdBadSsCKt2pDphO5+lWnBPZtEEhEFojhJX9tVkRqXQ6YxBHadEODbYAS1o4Mi2IkMQJT/7sEdZec1k6FekMQhUyG80wE00zG88wFUwxFU4RqoiSVWR1bjUrvNUM2EMIIU7NONSnjq1zDJ2vtU6XCXROIjv31cOIbRMTPDQ5QTNpUijPUajMUs5HFB0Lx8qmRgpwlYuduMjIRgYuum0RN22iukW7KmnOgt84tb5E2xF6uAYjc8jBJtISyKwQgeOXKLWG6AtH6PMq2J5NSybUiAijBLeWUB2vMxFNwpqTiL7sRFYJ5PF+RtsrWTcywMpVJdat7mOo0Itrud3jnW7UeeLwSQ4dq3KwepITzmFaPTNZyXWg7aIO9aOOVNBxJ3DUaTA3WkeM1BBeTDeLEVmoyQr6RA96upQ1knYplz1GhvIsGc6xdNRl7dIiQ+UCrmWn35ut7Vms0thRnPDDO7dz/XVX4DzN+HkKHWXr4gK0WJKlWDtT09Spaz3/69NJTq0XSnudpX2VZDq1VQkSJUGn09k6vb+UUlhWOi1PWmBZCiFj0qILQVZ4IVtr1H2vnnvQqrVGq/TnT8VJVhTk1GsrpUjiBJ0VoUGnVRzTQ00DrLQkf1rcQ6d1O7PC/Kfvx7z3RWhc+wA5Zxeu8wSCCLLKjRqB1gK/GdKshTSrPs1qmK1nSys8ag2WbdNqFNjx0zJPPNJLEsuseAjkig7LLx5k+UVDLLtojKUbVpEv9xAoRVM1iXWa9RGAHcac3HOEwzuPsf+RKY7tnUYrzdI1Na5941HWbprFdm3cnEOrvYYnprbwmut+57z/+1Wr1RgcHLzwpwUODg5iWRYTExML7p+YmGBkZOQ5PYfjOFxxxRXs27cPoPt9ExMTC4KriYkJLr/88jM+h+d5eN5TexE4jrPoH+aL8ZzGLw4zfozny4ydlz4XlyKnZlOoLLhK1/2oBdcaTaxiFCprnqyyKl+gUOk6MA0eFkUs4kQRJoogjvGTgEacPrfM+h0F2kdqGylEVu2vUyVQ4EibYWeYYUZOlb7WipbyKVqFF3yirpVOM3FaM6g1q0b7eUu8gR3jE9x9+ChHj1epeU1yxWl6yi0cJ8yCxAhbtrFyEpkX0HuqyLkFVISgV1s4KodMHBpiDoTO+iuVKMlelohlFJJR/MjiZL7FiVaL7a1pZmdPZQE6z7pkbZGL+jYxpAT1xnEOWPtpuXXUsmmO6xmOHulDbRtERg7Lx0qsXd7P7FzAoeNzzNbasKSGXDWLGGx1n9uqFumrjjBqD9K3Jk/vVWUGBnpxXY96M6beiGi1Y5rNgDl/ilnrOI38BLEboFZWUcvnUKGFnuihPd5Le6rExGQT/ZjuThvt78uxfLSHdct6WLO0yOplRXp6bNLgJC2o8Hw+Q60VYZL+4VqLENvysjdrXqCU7cPCdT2AyIHOAVnz2U4RDdG5bWWBk51W5uxuI+ZtI7PHFydjp3VCpyy61u20qIP2s/2WWRGKTtW9zvuV7tPZBWBZVUQdg46zintR+kcGlYbbWgm0ttAk6bRfQRZc2elnlhU9OVWeHWAEIf5d9hqnAjqdaKIwApGQKyicQUWxHXF07ySHdo1zaNc4h/dMEAVxtm4ORlb1s/yiQZZv6GXZhkH6RgdBlNA6B9pBac1MrUFDNZBIXNx03aal8aVF76WrWH75St5oR8StgP0759j/yAR3/+tR7v7nQ2x9wzEuumIaeIwVud1Y8l04zvmdSXY2/36e1+DKdV2uvPJK7rzzTt72trcB6V+q7rzzTm655Zbn9BxJkrBz
507e9KY3AbB69WpGRka48847u8FUrVbjgQce4Pd///dfjMMwDMMwjHOuW5jgOZ63qU5wpTuVBDsVBOd/nZbaDpOYMEmo+y1OArbw0FoQ6/REWGXBWvc7tUYLEPOrEQpoR+kaZkl6oiezwhuWkPMCtM5hnHYgIs0cgciCumwbCTnL4pqVI1y5fIgDc3P828Hj7Jwo05xVaBTCihEywrJipB3hOIqCp8i5CtdOkFaMsGJsoUnsFpYjsJTAVhVE0E+zUWF/A+5vtYjVk50FbAtyJEXHYbhQJFKKo/U6J+oNTjTSNSJ522Z97xWsKvrU5JPMqTmi9XNEK+eID/dw8MkBDh6pI+wEsWwWefksdjHGcSSO5TDGEi4urGZgxSC12Ga8GXPSj9lVazMxPgVoer0cvV6OHtejMuiyylvJJmcdjhDM6pNMqCNMqqOEOoDhNurSNjKx8ep9JBMlagddGrOC2bmQ6dkJHt41iSQt+FEpe6wYLbFyWYGVYwVWjBVZMlDMSs53MqQdp9YhoTWxjgiTCD8J8KMQgLlwHJ88OSuHK7OqmcJNAyFhAVYWJJ3WB0vMC5qygOl8NZwV2X6mgV+ZRIUk+MSqTZzUgSautNK1mN19zzJMZ5wjNr+j2alpcpAFi0KCyCEsF1s48+7PAl8s0qbXMegIrTvZtSzT1lmo1mmejSTWMaFKM995N0dOLqygPb/X2fDqIa64biM6SfBbPoceP0HkRyxdW8bLO+nnJ0tockjpIbLy/zExddVEk9Av0gxtolSarQsT7EQRBSEnmz7TxBSkZsWmMmsv6+eNN11Nux5w4NFxfvbTXfT33ku1lWPdvy8u2ud4Lpz3Uuwf/OAHuemmm7jqqqu45ppr+Ku/+iuazWa3l9Vv//Zvs3TpUj772c8C8Kd/+qe86lWvYt26dczNzfH5z3+eQ4cO8Z/+038C0l/eH/jAB/izP/sz1q9f3y3FPjY21g3gDMMwDOMXTScYe9ZqcdlMKoABt8wuYH3fGLbjdKfwaTRKp6XC02bEqhuoqSyDplQadCUoYpWQEJOohIQkPdGa17dIa5F2yhKdfllpkBfrhEQn6XPq9PlEVhFRCoklbFZXhll3+Rhz7ZA90yeZabeZafvMtlvMtNvUWgEJ0NYLy9EjFMKKsKwIy0oIQxcV57Kgr0knYMi5FiOlEmOlIiPlEmPlMqOlIiXXRaQzt2iGIbtPTvPYySn2TM/QjmN2zdR4bAakWM7y/jHKfeNQrmINtIgvbeLVewiKdSxHYzsWOZFnWCwFf4zxOvzrbMjJ9jFqQcip4iW6G5BOtNMM1+lBqWtZWeDVT9kbppRrEDtTtK1JlB0R5qaRQ9NUXiFYQT+F1hBMlZk7ppk82aQ6GzBT9Zmrhex8YgYhdFqG3rNZNlph5dLe7NLP2HAF204rXQYqop34hMpC42JbfTgIYAbXWUUiBXNJjEhsPOmRdwp40n3KtNpEKdpxTCsMsaSm5Fp49rOfriY6IVZx1lcuSXu1ZRU0598+20yW1jqtwJj1rItUlAUoMYlWhCqkFYdUnDJSpVldR0pc6eBKG1vKtO9Y9xk7AZfq3k4zbOm0xU6wmQaczxZISrr9oahkP5+dJsQRWrVRqk2oavhxm1BHpFMBJX4syVkeeelhdxu1aaRM+7bZzqnXKFTy9I+szQK+IkLk08BPLJzq2U7a1MIaee3SZ1VAS9p+SLPRRgiLUn8Bz7VBK5JYEUQhftSmFbdxojaumsXLuay/fJT1ly8lTv4d2/dMntXndSE478HVO9/5Tk6ePMknPvEJxsfHufzyy7n99tsZHh4G4PDhw1mKPjU7O8vv/d7vMT4+Tl9fH1deeSX33nsvl1xySXebj3zkIzSbTd773vcyNzfHq1/9am6//XbT48owDMMwngfHsnCsM62ZOfuppp21agrVDZwSFRPpmEQnWUZMZSfFaSbAEp1iG2mxebIAq7PWTGnNWFGzYWCIKFFZOfK0yEYQx8z6bWbb
7TTwym7PtttMt1vMBT5JlHYQGioXusHTWLnESKVIfz6H6Ob1sgANTUJMWkUN8nnBK5cv4eoVI2gtODRb47GTUzw6eZKJRpOjszbMLkO6PQwMTlEsNBGD9bR8dZynOjXE+FSBIAEhprNjTV9NWjCQL7Cs0sPKnj5W9vRhCclkq85kq85Uq5EFlAHNMCbRmmm/zbTfnveuF4FV2F6TnkqbcqmF5bRJ7GlqpVlEWVBYm+fq3DL6xTKs2R4ak3DyZJvxyRonJms02wmPPznLnv2z3We1bMGSoRLDI0WGR4qMjZUoD1gEtGmoBrWoytHCESrVPobzS+h1etNgNPaZC+rY2Hgyhytc0BZBnFCPQvwoyvo/Qd716CvkKbtuN8jqBDyxjol1TJAE3cCqs04tDUeztWnILPspu+PJltZTAy9Ed1wm8wKpSEVUoyqz4RxzUZVaXGMunGMumqMZp0GuIx2W5ZexorCMpfll9DhlGkmAIA2uHMtNg0nhYUsbSz7HNWhnIQ0cXRQ2obLwFfixItYCW+ZwAEukTX9jldCMQ3yhyFtlCnYZy3J5+sxh9nMnnrrfSisaUYN63EAKQcEuEAQRe2afZHv150wlk4w6Y6xpbmB1cSXlUo6c5+LlXSqUiFVMkLQRuo2b+HhopCoQBjZWp5riS8h573N1ITJ9rowLjRk/xvNlxo7xQpzr8TN/3VjnpPf5Pk+i0+xX51ppTagSwjhZEHxFSUI1CCg6Np7tdEsDiM40xO6Zus6qBqZl7zslqiFdExariJg4DQ5FZz2RYKbls3tylt1T0+ybnkVpjXQbOLk6cVAg9MtZtgwsKRktFRitlFjR08OqSj8re/upuPlu8+DTdYIAX/k0I5/pdoOaH1P3Y+b8gFnfzzJ5baZa7W72UVohdq5OudyiWGzh2pB3bNysmMmQN8iawmouKl3MUmcpJyZrPH7oGHvGD3OsNsm0P0PiBIh8hMiHkI/ASRACbNvCsdNeTSoOsd1sXbsS5OISXlTGC0qIdh7dcojakjDWadVHJREqfe97ewr0DxTpGSgwNlphtL9EKW8jrXT6YWeKqxQSW9hnHDO6G4BnVTSz4F1rTazSDEqSaKJI0QhaVJM5GqpGQ9doqDq1pEY9rqPShgedBUwL8oWWsEh0suB1i3aRlYUVLC8sY1l+KZ7ldYu2OMLGljae5WGLtPH0882szReqiFAFtGKfSAegwREuQss0ExwroigBrcgXPFzHJtExgQqxhUXRLpK38mdVpCdSEbWoRjvx8aSLUoJdM4/z0Ow2TkaTafn7ee9WQRRZ627gktJGhnsGyHlO5y3NgqwWNm0KMsJWijt+ePCC+PfrJdPnyjAMwzAMo0MIgYV11j1/zvQ8thDzpjst9HTBl0ana07IpjrqTqatU83w1JqyU1MM07VmtnCwtM6mPWYBnI4pOi5XjA1y2Vg/QRyxb7rKE1NzHK016C3kGBnOM1IqsLRSZqRQIWfnsElPvjsn2s0wQhM+ZQrgqbZLAos8RcshV8gzkG8TqRiNxhEWjkyLUoRxzMG5Kvtn5tg7M8vhqsd0UzONxvKaOLkapVKTfC6iFgQca41z38z9FOy0mIA/4KP6NZ7WjGqII0UYJUSRIo4FUSzQoUVUc4jaLrrtECchTn+CKPtgJVRpAZNoCyilFx1Y6EYOXcuh6zmo56GZh+MaJRI0CdpSODlJT5/L4GCB0SUVRpaUGBgokC9ap2WcYiIVEemISEXEOv3aj0NagU8rTC9+FBImUXdbpMa2LGxbYltpw97OOy6xqFgVeqxeeu300uf1M+D1kbM8ZpMZjvpHOeof4bh/gkbcZFdtN7tquwEY9AZZWVjBisJyRnMjKKVoJ36WE+pk1ixsaePMD7iElfXCO3Pglei0+Xg78QkSnzCJEUpCIkkSTSNsEycJcaxROqGmqxRECa8dkvMcSgWPnJNHiZhqVKUVtyjYBQp24WkD+s7PkK98amGNWCfY2OyYepSfzTxE
NZ5Lq2xaDuvzG1iTX8sh/yB720/QUk12Btt5xN/O0rnlbKps4qLetRRyaUbPlhViVaCWNCBbs/lSY4IrwzAMwzB+oTxb8PVMOgHVma6BMwZfnSlriY65uC8mXBsRJWl5ald6uNLFs92nnMyeHkzNP7dOXyMNBrv197KAMFElQhXhJyGBaqdFLTRIabGqr8yK3jKvXbOMMI45Uq1zYLbK/tkqR2sl2lUNMsDJ1/HydQqFFjmnlQYdQlCwCvQ4PfR7vQx6vZStEnlKuEkeR+VozMVMTTc4ebLJ8dkaE8cmGWgPYtkC5YVEhQZRvkmSb6JyLRLXRxdB94doEaKpZhlMEApINCrRJEn6/jaAutYcSECMCxhPezI5toXjpBkzyBp4ZxXxkkSTZIUa5k/XOmMMHwh004GmhxXk6bV7WVIYYLgywGB/gb6+PJWihxZp36lq2GJWNwDBiFzNMmcdwtOcjCcZj45zIj7OTDzFRHuSSf8kD85swxIWY/lRVhVWsqywlAF3AEukAWKQ+LSyjFy6f2mRkc4aQ0fYWDKt3NmOApphCz8KiROFjgRaQZxoyLLAVWaYjMeZiMeZCMeJdYxAMOgMMdwaZbg+xtL8GOVijpyXBllzUZV20qZol8hbuadkA+dPA4xVxGNzu3lwZjvNuAFCkLNyXFK8lE2FTQQNwczxFleMbuGayqs40N7P461djIfjHEsOc3T6EP82W2Rj6RJeOXAZA4W+LMjqpZnYwMFn+Ym88JjgyjAMwzAM4zlKe4exMNJ5Vu5T7pk/pe3FpLQiUhHtuE078Ym1SqfDKU2iEiq5HBcP9RMrjR/FHJqtcrBa48BslWNTDeo6wXJ9tBLEiZv1RwKIgSkKTpWy59LjeZQcl4rrUvAcCisd1q4eYvBwgBzpYzYMqUUWfuwSRD20/JhmPaIZ+winje342J6P4/rYro+UCZ3q4tKRaVCBhaUkMhYkIUS+Igo1KhEkWuLrdFohSqJjCYlAJxLi9D6RpNeedMm7Hnk3R8FzKXp5SrkcOraYmfKZmm4yM9siTDSTaCbFDI/qaZg3fbO/r8DAQJElAyWG+stIW5DQRIuEtDCEpCxW0ivWEBEwK08yIyaYFhM0RYu9/gH2zR5Mp58KQcWuMOAM0G8PMOAM0m/3U5QlgGxtYtZSgVP97dK+WRY2aZZLiYRZNcVEPM54eJyJaCJtyaCzjKzS6ftgKcbVOJNigp3swG46LJkZYWluKWsrqxkpDqGEYjacoSlzlOwCuSzIClVEPapxMpji0bndPDK3Ez9OWxOU7BKrxcXYJ5Zw5EiNuw89SLXmp+vgpGDl8j42rl/Cqze8EXp8Hm/tZm/7CdpJi221B/l57SFW5ldxRf8r2NC7DvE8/vhxITDBlWEYhmEYxjn2YgdV81/Hszw8y6OSFRPpNJfuUFoTJ+kUycuXpD3P2lFEMwrZMz3FkzPTVP2AauBTDwPqgU8tDFFaEcQKP2pzstE+VfIjy6pppUn8CNmsImRaUn++zuolHRdQqoyHQ1E5lJSDZSkmmm2mGsGp/mnz9rnkOIyWy4z0FnAjgecLdF3RaoW4tkWh4JDPOxTyLoW8i5u3sF2JsCWBSmgnaZW/VhwTJDHNRBEphRh0GcClJ+ml2QqpNyPqLZ+GH9HyQ1pBRKIVSjTQs6DnNHo/uLEkF0lyocSLBG7YqXt56jjT930YCiFioIEcbCB62ohcxDh1BMe6jXmFEEhl4/gl3LBEPqqQTyoUqZB3XKQl8MOQmpylbk/T8mYI8zUUSZr5U1nwFVqomSJ6poieLqIbOciFWEtaFJb5WINNrFxA0z7AkfAQDzTuoyiLLM8vZ015FcuLSwlVkFZ4tHMcbR7jwdmH2FPbS5TEhFGC4xdwjo9yfJfLnvoMWs8QC0XTiWmXFSovsBua5uFJ9h2Y4nt37GbJUImL1g/x79a9maRvmif8JzgRHOdg+wAHjx2gPFnm4vJFKBm+qD8fLwYTXBmGYRiGYfwC
eLoiIVKAfYbqdYlSrO7tI169lkSn2a5IJcSJIlIJjTBk1vep+j7VIL2uhT4136cRhVT9NvWpWcaGh+nN5yl7HmXPo+K6lD2XspvLrl1y9lMLFmgU9cDnYHWWw7U5jtbqHKvWGW+0aMQR+2Zn2Dc726054lqSsWVlPNuiHTfxo4R2LcafifDjNKPUCf7mhTzZq4k0KMwCQ7r3CoQFoiQQJYciDipRxElaICLJrmM0dRT1rC2BAHKJJJ9Y5GKZXQR2DFoX4GQBPbEkfQ4ZIso+suIjKn56u+yDCAndGZruDLOn3hRUw4PQQvS2EVKfymQloCKLsFokrBZo1/P4vkNiCZQDqh/iQR8r1LjNPPlHi3jhIPlcgNPfxBpsYQ80sd0WJ6wZfn5yJ65jMVJYwqrKCqrxHHvn9uP7MX4QEU/lifYNoU9WCIWi6bZoVRKSHkHoaRzXwnVcLCGIE8VEEGP5CntW0ajGjN9T4+579lMqeaxdu4JLNlxEPDDBgfBJalGdn808SNIb8Rb1FpznUZn0fDHBlWEYhmEYhvEUlpRYUuI9zeOdQiCdS9K9rdJphoHPtrv/jUuu3oqWaRF9S6TP6WTP/WzyhQJLCv1cOZL2r0q0wk9CDlVnOTw3y+FanaO1GifqTYIk5sDc7Bmb9qbVH9PXzNkOOdsmbzvkbIe8bePZNjnbxhZpUXZbinTNlxYIrZFapGu7pIVnW3iOg2tZWFk2bqrdYqLVZKLV4HijTjM6lXGJgFBrqmhytsVQIc9QwWMolyNvOyQK4hjCOCGMY8JI4QcRvq4TyDqx1SBxGmi3BSKGfJCusQPixMJv5QlaRUK/SBLl0qxXXiAKAlfMm8qa0UCSpFnHZqyIYweiHuwDFby9UCr4FHraeP1NRLnJ1MwBdlsHQadr2dRkmfDwKPW2R8tTJCtCYjcNphzHwRECB+jNuayolOnP5TlQrXK4VkcXQPdq5sKE6SDGqUKhFVPdGfDww+A4FstXbWToooh23zG8UOHIl05gBSa4MgzDMAzDMJ4HS0qeqVtTlJXPHitXEJZFlCS044ggjmnFUboGSIAjLRwpseXTlyK3sp5nDpCzcvQMltnUv4yYOJ2elkQcrVc5VJ0l1oqi45G3PEquR9FJL2XXTQMiKbFEWhHQzoK9tFrfU19bQzplMk6I4oQgjAnCmCRJM3gIgSUlcY+iFYb4SYxSiiBOOOm3mGw3OdFsMN6qc7LdIogVR2tNjtaaIE71aXtKZyTdaQPggugH+hFopBXheD7SSoiCAknkZaX8BX05h2LZpeQ6lJxT10XXoeS6lByHvO0w47c50WhwotHkeKPByVbrVPGVWDEXO0zFRZITA3hHY4pOk2KljU4k1Zke2nYOPZA2GnZsG0sIckKwpFhgZaWcthEo99Lr5dE67Y2lUfhxyJNzUzw+M80Ts7O0chFUBM0oYTaIsJuaXF3QOBTh7ZNoVWaoJEiuVbyUOomY4MowDMMwDMN40bhWmtHAceghh8r6i4VJQpQkNKOISKWBl9ZgyU4lR5GuH8ouet6EvbQXWRoQOdLFszwuHaxw+fBK7CxwEogscGLe7TMHUWdDa00UJ+l0wDihHWSBYrGQZpNQNJOIpaoHIQWeZeHaNgjBeKPB0XqVo7UaR2pVwiTO9lek+yl0GkSJ9H2whMCRNjZpQGgJgUX6/hSsNPtWdBxyVlrUAtHJ0qXPpUXak01rlT2gGHU8Rkoul+teIA0eJ5s+480m480mE802k602YZIADr7O0QjSAix2v8QV4AiL0WKR5eUyy0s9rKz0UnBcHOni2jaubeFKC8+2sYQkVgmNKKQ0VOSyweVIqThSn+Wx6Uken5lhvNGEsiAeUIwHMdqPcWs67XsmX2BvhnPMBFeGYRiGYRjGOSOFwMum4gH0A1GSEKt0LZcfxfhJhNY6nZYoBY5lY0uRrRsTWXny9NIJms4VIQSuY+Nm2ZRKOf+UbZTWBHHcLQzixwlaJ4yU0gbRzzQl
Mq0MmPbsilVEqNNeXUqnxSq0FiAkSZKgYkWk47RJsI7RKq0GqRKNVqCSdAWY1BKJhSRdd6eF7mYJbWBZqcCKyhCWlQZ1CsFc4DPeaDDeTLNcQsDqnj5W9/QwVumlKN20/xYCS6fNtSUCoQVCpwGtjkALhSMkvbZHJBTtJKKdxIzm+1mxYpC3rBVUgxY7p8Z5fHqaJ+fmCEsJUU/CTKhehE/wxWWCK8MwDMMwDOO8ciwLx7LI41DxOk2bOadB02KSQpB3HPKOQ6/OE8QxfhzRCCNacRo42tLCtdIgq9OoOu3xlV1rAThIbWPrhIS0JHukA5ROkBJczyYvcjjCxrOdtEcUEtuysqmUEqEFCxp8obPMlgSZZswQojs1sNswe14vtU5jbdeysKWFLUU6tXLeVE6ts35iSnVLvycqLfqRZEVAnFiRxyGMY9phRC0MaLRjEJIrelZwdd9KIiL216Z5eHIce67+khsDJrgyDMMwDMMwLigvtRPqZzI/0OrJacIkzWg1wgg/jrvTG6XISufLNBvXWQ8m5mXpOlMcNQorq/74XAqDnAtCCGxLgPXM+6M6AZhOGzs3o4hWGFINAoIowtE2m70VbCgvYfKJR8/R3i8eE1wZhmEYhmEYxjkghciqFTr05nTaWyu7XwrxtAU9Xk6kFMhO6X8HcjmXAYqMqlP91ZphiJ0XTJ7fXX1eTHBlGIZhGIZhGOeYEALXeqZ6i79YbCm7vdCCOKbh+zx5vnfqeTDBlWEYhmEYhmEYFwzPtpHe03VYu7BdGJM0DcMwDMMwDMMwXuJMcGUYhmEYhmEYhrEITHBlGIZhGIZhGIaxCExwZRiGYRiGYRiGsQhMcGUYhmEYhmEYhrEITHBlGIZhGIZhGIaxCExwZRiGYRiGYRiGsQhMcGUYhmEYhmEYhrEITHBlGIZhGIZhGIaxCExwZRiGYRiGYRiGsQhMcGUYhmEYhmEYhrEI7PO9AxcirTUAtVpt0Z4ziiJarRa1Wg3HcRbteY1fDGb8GM+XGTvGC2HGj/FCmPFjvBAX0vjpxASdGOGZmODqDOr1OgDLly8/z3tiGIZhGIZhGMaFoF6v09PT84zbCP1cQrBfMEopjh8/TrlcRgixKM9Zq9VYvnw5R44coVKpLMpzGr84zPgxni8zdowXwowf44Uw48d4IS6k8aO1pl6vMzY2hpTPvKrKZK7OQErJsmXLXpTnrlQq532AGC9dZvwYz5cZO8YLYcaP8UKY8WO8EBfK+Hm2jFWHKWhhGIZhGIZhGIaxCExwZRiGYRiGYRiGsQhMcHWOeJ7HJz/5STzPO9+7YrwEmfFjPF9m7BgvhBk/xgthxo/xQrxUx48paGEYhmEYhmEYhrEITObKMAzDMAzDMAxjEZjgyjAMwzAMwzAMYxGY4MowDMMwDMMwDGMRmODKMAzDMAzDMAxjEZjg6hz48pe/zKpVq8jlcmzZsoWf/exn53uXjAvA3XffzVve8hbGxsYQQvDd7353weNaaz7xiU8wOjpKPp/n+uuvZ+/evQu2mZmZ4d3vfjeVSoXe3l7e85730Gg0zuFRGOfDZz/7Wa6++mrK5TJLlizhbW97G3v27Fmwje/73HzzzQwMDFAqlfiN3/gNJiYmFmxz+PBh3vzmN1MoFFiyZAl//Md/TBzH5/JQjPPgK1/5Cpdddlm3MefWrVv5l3/5l+7jZuwYZ+Nzn/scQgg+8IEPdO8zY8h4Op/61KcQQiy4XHzxxd3HXw5jxwRXL7JvfvObfPCDH+STn/wkP//5z9m8eTM33HADk5OT53vXjPOs2WyyefNmvvzlL5/x8b/4i7/gi1/8In/zN3/DAw88QLFY5IYbbsD3/e427373u3nssce44447+N73vsfdd9/Ne9/73nN1CMZ5ctddd3HzzTdz//33c8cddxBFEW984xtpNpvdbf7oj/6I//N//g/f+ta3uOuuuzh+/Di//uu/3n08SRLe/OY3E4Yh9957L1//+te57bbb
+MQnPnE+Dsk4h5YtW8bnPvc5tm3bxkMPPcTrX/963vrWt/LYY48BZuwYz92DDz7If//v/53LLrtswf1mDBnPZNOmTZw4caJ7+elPf9p97GUxdrTxorrmmmv0zTff3P06SRI9NjamP/vZz57HvTIuNID+zne+0/1aKaVHRkb05z//+e59c3Nz2vM8/Q//8A9aa6137dqlAf3ggw92t/mXf/kXLYTQx44dO2f7bpx/k5OTGtB33XWX1jodK47j6G9961vdbXbv3q0Bfd9992mttf7+97+vpZR6fHy8u81XvvIVXalUdBAE5/YAjPOur69P/+3f/q0ZO8ZzVq/X9fr16/Udd9yhf+mXfkm///3v11qb3z/GM/vkJz+pN2/efMbHXi5jx2SuXkRhGLJt2zauv/767n1SSq6//nruu+++87hnxoXuwIEDjI+PLxg7PT09bNmypTt27rvvPnp7e7nqqqu621x//fVIKXnggQfO+T4b50+1WgWgv78fgG3bthFF0YLxc/HFF7NixYoF4+cVr3gFw8PD3W1uuOEGarVaN4NhvPwlScI3vvENms0mW7duNWPHeM5uvvlm3vzmNy8YK2B+/xjPbu/evYyNjbFmzRre/e53c/jwYeDlM3bs870DL2dTU1MkSbJgAAAMDw/z+OOPn6e9Ml4KxsfHAc44djqPjY+Ps2TJkgWP27ZNf39/dxvj5U8pxQc+8AGuvfZaLr30UiAdG67r0tvbu2Db08fPmcZX5zHj5W3nzp1s3boV3/cplUp85zvf4ZJLLmHHjh1m7BjP6hvf+AY///nPefDBB5/ymPn9YzyTLVu2cNttt3HRRRdx4sQJPv3pT/Oa17yGRx999GUzdkxwZRiG8RJ288038+ijjy6Ys24Yz+aiiy5ix44dVKtV/uf//J/cdNNN3HXXXed7t4yXgCNHjvD+97+fO+64g1wud753x3iJ+ZVf+ZXu7csuu4wtW7awcuVK/vEf/5F8Pn8e92zxmGmBL6LBwUEsy3pKlZOJiQlGRkbO014ZLwWd8fFMY2dkZOQphVHiOGZmZsaMr18Qt9xyC9/73vf48Y9/zLJly7r3j4yMEIYhc3NzC7Y/ffycaXx1HjNe3lzXZd26dVx55ZV89rOfZfPmzfz1X/+1GTvGs9q2bRuTk5O88pWvxLZtbNvmrrvu4otf/CK2bTM8PGzGkPGc9fb2smHDBvbt2/ey+f1jgqsXkeu6XHnlldx5553d+5RS3HnnnWzduvU87plxoVu9ejUjIyMLxk6tVuOBBx7ojp2tW7cyNzfHtm3butv86Ec/QinFli1bzvk+G+eO1ppbbrmF73znO/zoRz9i9erVCx6/8sorcRxnwfjZs2cPhw8fXjB+du7cuSBAv+OOO6hUKlxyySXn5kCMC4ZSiiAIzNgxntV1113Hzp072bFjR/dy1VVX8e53v7t724wh47lqNBo8+eSTjI6Ovnx+/5zvihovd9/4xje053n6tttu07t27dLvfe97dW9v74IqJ8Yvpnq9rrdv3663b9+uAf1f/+t/1du3b9eHDh3SWmv9uc99Tvf29ur/9b/+l37kkUf0W9/6Vr169Wrdbre7z/HLv/zL+oorrtAPPPCA/ulPf6rXr1+vb7zxxvN1SMY58vu///u6p6dH/+QnP9EnTpzoXlqtVneb973vfXrFihX6Rz/6kX7ooYf01q1b9datW7uPx3GsL730Uv3GN75R79ixQ99+++16aGhIf+xjHzsfh2ScQx/96Ef1XXfdpQ8cOKAfeeQR/dGPflQLIfQPfvADrbUZO8bZm18tUGszhoyn96EPfUj/5Cc/0QcOHND33HOPvv766/Xg4KCenJzUWr88xo4Jrs6BL33pS3rFihXadV19zTXX6Pvvv/9875JxAfjxj3+sgadcbrrpJq11Wo794x//uB4eHtae5+nrrrtO79mzZ8FzTE9P6xtvvFGXSiVdqVT07/7u7+p6vX4ejsY4l840bgD9d3/3
d91t2u22/oM/+APd19enC4WCfvvb365PnDix4HkOHjyof+VXfkXn83k9ODioP/ShD+kois7x0Rjn2n/8j/9Rr1y5Uruuq4eGhvR1113XDay0NmPHOHunB1dmDBlP553vfKceHR3VruvqpUuX6ne+851637593cdfDmNHaK31+cmZGYZhGIZhGIZhvHyYNVeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGIZhGIZhGIvABFeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGIZhGIZhGIvABFeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGMYFSwjBd7/73ee8/e/8zu/wtre97QW95sGDBxFCsGPHjhf0POfSbbfdRm9v7/neDcMwjF94JrgyDMMwzrnx8XHe//73s27dOnK5HMPDw1x77bV85StfodVqne/dMwzDMIznxT7fO2AYhmH8Ytm/fz/XXnstvb29/Pmf/zmveMUr8DyPnTt38tWvfpWlS5fya7/2a+d7N89KGIa4rnu+d8MwDMM4z0zmyjAMwzin/uAP/gDbtnnooYd4xzvewcaNG1mzZg1vfetb+ed//mfe8pa3PO337ty5k9e//vXk83kGBgZ473vfS6PReMp2n/70pxkaGqJSqfC+972PMAy7j91+++28+tWvpre3l4GBAX71V3+VJ5988qyOYdWqVXzmM5/ht3/7t6lUKrz3ve8F4J/+6Z/YtGkTnuexatUq/vIv/3LB951pmmNvby+33XYbcGpK4re//W1e97rXUSgU2Lx5M/fdd9+C77nttttYsWIFhUKBt7/97UxPT5/V/huGYRgvDhNcGYZhGOfM9PQ0P/jBD7j55pspFotn3EYIccb7m80mN9xwA319fTz44IN861vf4oc//CG33HLLgu3uvPNOdu/ezU9+8hP+4R/+gW9/+9t8+tOfXvA8H/zgB3nooYe48847kVLy9re/HaXUWR3LF77wBTZv3sz27dv5+Mc/zrZt23jHO97Bb/3Wb7Fz504+9alP8fGPf7wbOJ2N//Jf/gsf/vCH2bFjBxs2bODGG28kjmMAHnjgAd7znvdwyy23sGPHDl73utfxZ3/2Z2f9GoZhGMaLQBuGYRjGOXL//fdrQH/7299ecP/AwIAuFou6WCzqj3zkI937Af2d73xHa631V7/6Vd3X16cbjUb38X/+53/WUko9Pj6utdb6pptu0v39/brZbHa3+cpXvqJLpZJOkuSM+3Ty5EkN6J07d2qttT5w4IAG9Pbt25/2OFauXKnf9ra3LbjvXe96l37DG96w4L4//uM/1pdccskZj6ejp6dH/93f/d2C1/7bv/3b7uOPPfaYBvTu3bu11lrfeOON+k1vetOC53jnO9+pe3p6nnZ/DcMwjHPDZK4MwzCM8+5nP/sZO3bsYNOmTQRBcMZtdu/ezebNmxdkvK699lqUUuzZs6d73+bNmykUCt2vt27dSqPR4MiRIwDs3buXG2+8kTVr1lCpVFi1ahUAhw8fPqt9vuqqq56yf9dee+2C+6699lr27t1LkiRn9dyXXXZZ9/bo6CgAk5OT3dfZsmXLgu23bt16Vs9vGIZhvDhMQQvDMAzjnFm3bh1CiAXBEMCaNWsAyOfzL/o+vOUtb2HlypV87WtfY2xsDKUUl1566YJ1Wc/F001rfCZCCLTWC+6Lougp2zmOs+B7gLOetmgYhmGceyZzZRiGYZwzAwMDvOENb+DWW2+l2Wye1fdu3LiRhx9+eMH33XPPPUgpueiii7r3Pfzww7Tb7e7X999/P6VSieXLlzM9Pc2ePXv4kz/5E6677jo2btzI7OzsCz+wbP/uueeeBffdc889bNiwAcuyABgaGuLEiRPdx/fu3XvWpec3btzIAw88sOC++++//3nutWEYhrGYTHBlGIZhnFP/7b/9N+I45qqrruKb3/wmu3fvZs+ePfz93/89jz/+eDcQOd273/1ucrkcN910
E48++ig//vGP+cM//EP+w3/4DwwPD3e3C8OQ97znPezatYvvf//7fPKTn+SWW25BSklfXx8DAwN89atfZd++ffzoRz/igx/84KIc14c+9CHuvPNOPvOZz/DEE0/w9a9/nVtvvZUPf/jD3W1e//rXc+utt7J9+3Yeeugh3ve+9y3IUj0X//k//2duv/12vvCFL7B3715uvfVWbr/99kU5BsMwDOOFMcGVYRiGcU6tXbuW7du3c/311/Oxj32MzZs3c9VVV/GlL32JD3/4w3zmM5854/cVCgX+9V//lZmZGa6++mp+8zd/k+uuu45bb711wXbXXXcd69ev57WvfS3vfOc7+bVf+zU+9alPASCl5Bvf+Abbtm3j0ksv5Y/+6I/4/Oc/vyjH9cpXvpJ//Md/5Bvf+AaXXnopn/jEJ/jTP/1Tfud3fqe7zV/+5V+yfPlyXvOa1/Cud72LD3/4wwvWhz0Xr3rVq/ja177GX//1X7N582Z+8IMf8Cd/8ieLcgyGYRjGCyP06ZO/DcMwDMMwDMMwjLNmMleGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGIZhGIZhGIvABFeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGIZhGIZhGIvABFeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi8AEV4ZhGIZhGIZhGIvABFeGYRiGYRiGYRiLwARXhmEYhmEYhmEYi+D/B0RxISt1qvBhAAAAAElFTkSuQmCC",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "grouped_df = df.groupby([\"kd\", \"global_round\", \"p\"])\n",
+ "df_mean = grouped_df[[\"loss\", \"accuracy\"]].mean()\n",
+ "df_std = grouped_df[[\"loss\", \"accuracy\"]].std()\n",
+ "\n",
+ "df_plot = df_mean.merge(\n",
+ " df_std, left_index=True, right_index=True, suffixes=(\"_mean\", \"_std\")\n",
+ ")\n",
+ "grouped_df = df_plot.reset_index().groupby(\"kd\")\n",
+ "\n",
+ "for i, (group_name, group_data) in enumerate(grouped_df):\n",
+ " gd = group_data.groupby(\"p\")\n",
+ " plt.figure(figsize=(10, 4))\n",
+ " title = \"FjORD w/ KD\" if bool(group_name) else \"FjORD\"\n",
+ " filename_suffix = \"fjord_kd\" if bool(group_name) else \"fjord\"\n",
+ " plt.title(f\"ResNet18 - CIFAR10 - {title}\")\n",
+ " colors = plt.cm.viridis(np.linspace(0, 1, len(gd)))\n",
+ " for j, (p, p_data) in enumerate(gd):\n",
+ " plt.plot(\n",
+ " p_data[\"global_round\"],\n",
+ " p_data[\"loss_mean\"],\n",
+ " color=colors[j],\n",
+ " alpha=0.8,\n",
+ " label=f\"p={p}\",\n",
+ " )\n",
+ " plt.fill_between(\n",
+ " p_data[\"global_round\"],\n",
+ " p_data[\"loss_mean\"] - p_data[\"loss_std\"],\n",
+ " p_data[\"loss_mean\"] + p_data[\"loss_std\"],\n",
+ " alpha=0.1,\n",
+ " color=colors[j],\n",
+ " )\n",
+ " plt.xlabel(\"Global round\")\n",
+ " plt.ylabel(\"Loss\")\n",
+ " plt.legend()\n",
+ " plt.grid()\n",
+ "\n",
+ " plt.savefig(\n",
+ " f\"../_static/resnet18_cifar10_{filename_suffix}_convergence.png\",\n",
+ " dpi=300,\n",
+ " bbox_inches=\"tight\",\n",
+ " )"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "fjord",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.9.12"
+ },
+ "orig_nbformat": 4
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/baselines/fjord/pyproject.toml b/baselines/fjord/pyproject.toml
new file mode 100644
index 000000000000..d8a9ae307d7c
--- /dev/null
+++ b/baselines/fjord/pyproject.toml
@@ -0,0 +1,146 @@
+[build-system]
+requires = ["poetry-core>=1.4.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "fjord"
+version = "1.0.0"
+description = "FjORD implementation of Federated Ordered Dropout in Flower"
+license = "Apache-2.0"
+authors = ["Steve Laskaridis ", "Samuel Horvath "]
+readme = "README.md"
+homepage = "https://flower.dev"
+repository = "https://github.com/adap/flower"
+documentation = "https://flower.dev"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "License :: OSI Approved :: Apache Software License",
+ "Operating System :: MacOS :: MacOS X",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+ "Topic :: Scientific/Engineering :: Mathematics",
+ "Topic :: Software Development",
+ "Topic :: Software Development :: Libraries",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Typing :: Typed",
+]
+
+[tool.poetry.dependencies]
+python = ">=3.10.0, <3.11.0"
+flwr = { extras = ["simulation"], version = "1.5.0" }
+hydra-core = "1.3.2"
+matplotlib = "3.7.1"
+coloredlogs = "15.0.1"
+omegaconf = "2.3.0"
+tqdm = "4.65.0"
+torch = { url = "https://download.pytorch.org/whl/cu117/torch-2.0.1%2Bcu117-cp310-cp310-linux_x86_64.whl"}
+torchvision = { url = "https://download.pytorch.org/whl/cu117/torchvision-0.15.2%2Bcu117-cp310-cp310-linux_x86_64.whl"}
+
+
+
+[tool.poetry.dev-dependencies]
+isort = "==5.11.5"
+black = "==23.1.0"
+docformatter = "==1.5.1"
+mypy = "==1.4.1"
+pylint = "==2.8.2"
+flake8 = "==3.9.2"
+pytest = "==6.2.4"
+pytest-watch = "==4.2.0"
+ruff = "==0.0.272"
+types-requests = "==2.27.7"
+
+[tool.isort]
+line_length = 88
+indent = " "
+multi_line_output = 3
+include_trailing_comma = true
+force_grid_wrap = 0
+use_parentheses = true
+
+[tool.black]
+line-length = 88
+target-version = ["py38", "py39", "py310", "py311"]
+
+[tool.pytest.ini_options]
+minversion = "6.2"
+addopts = "-qq"
+testpaths = [
+ "fjord",
+]
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = false
+plugins = "numpy.typing.mypy_plugin"
+
+[tool.pylint."MESSAGES CONTROL"]
+disable = "duplicate-code,too-few-public-methods,useless-import-alias"
+good-names = "i,j,k,_,x,y,X,Y,fl,lr,p,p_,bn,NUM_CLIENTS,od,m,g,ResNet18,FJORD_CONFIG_TYPE"
+signature-mutators="hydra.main.main"
+
+[tool.pylint.typecheck]
+generated-members="numpy.*, torch.*, tensorflow.*"
+
+
+[[tool.mypy.overrides]]
+module = [
+ "importlib.metadata.*",
+ "importlib_metadata.*",
+]
+follow_imports = "skip"
+follow_imports_for_stubs = true
+disallow_untyped_calls = false
+
+[[tool.mypy.overrides]]
+module = "torch.*"
+follow_imports = "skip"
+follow_imports_for_stubs = true
+
+[tool.docformatter]
+wrap-summaries = 88
+wrap-descriptions = 88
+
+[tool.ruff]
+target-version = "py38"
+line-length = 88
+select = ["D", "E", "F", "W", "B", "ISC", "C4"]
+fixable = ["D", "E", "F", "W", "B", "ISC", "C4"]
+ignore = ["B024", "B027"]
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".pytype",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+ "proto",
+]
+
+[tool.ruff.pydocstyle]
+convention = "numpy"
diff --git a/baselines/fjord/requirements.txt b/baselines/fjord/requirements.txt
new file mode 100644
index 000000000000..35583b1a45c4
--- /dev/null
+++ b/baselines/fjord/requirements.txt
@@ -0,0 +1,8 @@
+coloredlogs==15.0.1
+hydra-core==1.3.2
+flwr==1.5.0
+omegaconf==2.3.0
+ray==2.6.3
+torch==2.0.1
+torchvision==0.15.2
+tqdm==4.65.0
diff --git a/baselines/fjord/scripts/run.sh b/baselines/fjord/scripts/run.sh
new file mode 100755
index 000000000000..ab4571724e2f
--- /dev/null
+++ b/baselines/fjord/scripts/run.sh
@@ -0,0 +1,16 @@
+#!/bin/bash
+
+RUN_LOG_DIR=${RUN_LOG_DIR:-"exp_logs"}
+
+pushd ../
+mkdir -p "$RUN_LOG_DIR"
+for seed in 123 124 125; do
+ echo "Running seed $seed"
+
+ echo "Running without KD ..."
+ poetry run python -m fjord.main ++manual_seed=$seed |& tee "$RUN_LOG_DIR/wout_kd_$seed.log"
+
+ echo "Running with KD ..."
+ poetry run python -m fjord.main +train_mode=fjord_kd ++manual_seed=$seed |& tee "$RUN_LOG_DIR/w_kd_$seed.log"
+done
+popd
diff --git a/baselines/fjord/setup.py b/baselines/fjord/setup.py
new file mode 100644
index 000000000000..aa09948c34fc
--- /dev/null
+++ b/baselines/fjord/setup.py
@@ -0,0 +1,14 @@
+"""Setup fjord package."""
+from setuptools import find_packages, setup
+
+VERSION = "0.1.0"
+DESCRIPTION = "FjORD Flwr package"
+LONG_DESCRIPTION = "Implementation of FjORD as a flwr baseline"
+
+setup(
+ name="fjord",
+ version=VERSION,
+ description=DESCRIPTION,
+ long_description=LONG_DESCRIPTION,
+ packages=find_packages(),
+)
diff --git a/baselines/flwr_baselines/dev/bootstrap.sh b/baselines/flwr_baselines/dev/bootstrap.sh
index eaa3a0bb046b..0bc322edc0de 100755
--- a/baselines/flwr_baselines/dev/bootstrap.sh
+++ b/baselines/flwr_baselines/dev/bootstrap.sh
@@ -6,8 +6,8 @@ cd "$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"/../
./dev/rm-caches.sh
# Upgrade/install specific versions of `pip`, `setuptools`, and `poetry`
-python -m pip install -U pip==23.1.2
-python -m pip install -U setuptools==68.0.0
+python -m pip install -U pip==23.3.1
+python -m pip install -U setuptools==68.2.2
python -m pip install -U poetry==1.5.1
# Use `poetry` to install project dependencies
diff --git a/baselines/moon/LICENSE b/baselines/moon/LICENSE
new file mode 100644
index 000000000000..d64569567334
--- /dev/null
+++ b/baselines/moon/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/baselines/moon/README.md b/baselines/moon/README.md
new file mode 100644
index 000000000000..05ab4ef68469
--- /dev/null
+++ b/baselines/moon/README.md
@@ -0,0 +1,146 @@
+---
+title: Model-Contrastive Federated Learning
+url: https://arxiv.org/abs/2103.16257
+labels: [data heterogeneity, image classification, cross-silo, contrastive-learning]
+dataset: [CIFAR-10, CIFAR-100]
+---
+
+# Model-Contrastive Federated Learning
+
+> Note: If you use this baseline in your work, please remember to cite the original authors of the paper as well as the Flower paper.
+
+
+**Paper:** [arxiv.org/abs/2103.16257](https://arxiv.org/abs/2103.16257)
+
+**Authors:** Qinbin Li, Bingsheng He, Dawn Song
+
+**Abstract:** Federated learning enables multiple parties to collaboratively train a machine learning model without communicating their local data. A key challenge in federated learning is to handle the heterogeneity of local data distribution across parties. Although many studies have been proposed to address this challenge, we find that they fail to achieve high performance in image datasets with deep learning models. In this paper, we propose MOON: model-contrastive federated learning. MOON is a simple and effective federated learning framework. The key idea of MOON is to utilize the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning in model-level. Our extensive experiments show that MOON significantly outperforms the other state-of-the-art federated learning algorithms on various image classification tasks.
+
+
+
+## About this baseline
+
+**What’s implemented:** The code in this directory replicates the experiments in *Model-Contrastive Federated Learning* (Li et al., 2021), which proposed the MOON algorithm. Concretely, it replicates the results of MOON for CIFAR-10 and CIFAR-100 in Table 1.
+
+**Datasets:** CIFAR-10 and CIFAR-100
+
+**Hardware Setup:** The experiments are run on a server with 4x Intel Xeon Gold 6226R and 8x Nvidia GeForce RTX 3090. A machine with at least 1x 16GB GPU should be able to run the experiments in a reasonable time.
+
+**Contributors:** [Qinbin Li](https://qinbinli.com)
+
+**Description:** MOON computes a model-contrastive loss during local training, which requires access to the local model from the previous round (Lines 14-17 of Algorithm 1 in the paper). Since `FlowerClient` currently does not preserve state across rounds, local training stores each client's model under the specified `model.dir`, indexed by client ID, and the corresponding client loads it again in the next round.
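
The load/save round-trip described above can be sketched as follows. This is an illustrative, self-contained sketch of the pattern, not the baseline's actual code: the real client calls `torch.save`/`torch.load` on the model's `state_dict` (see `moon/client.py`), while here a plain dict and `pickle` stand in so the snippet runs without PyTorch.

```python
import os
import pickle
import tempfile


def save_prev_model(model_dir: str, client_id: int, state: dict) -> None:
    """Persist this client's local model so the next round can reload it."""
    client_dir = os.path.join(model_dir, str(client_id))
    os.makedirs(client_dir, exist_ok=True)
    with open(os.path.join(client_dir, "prev_net.pkl"), "wb") as f:
        pickle.dump(state, f)


def load_prev_model(model_dir: str, client_id: int, fallback: dict) -> dict:
    """Load the previous-round model; fall back to the current one on round 1."""
    path = os.path.join(model_dir, str(client_id), "prev_net.pkl")
    if not os.path.exists(path):
        return fallback
    with open(path, "rb") as f:
        return pickle.load(f)


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        save_prev_model(d, 3, {"w": [1.0, 2.0]})
        print(load_prev_model(d, 3, fallback={}))  # {'w': [1.0, 2.0]}
```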
+
+## Experimental Setup
+
+**Task:** Image classification.
+
+**Model:** This directory implements the same two models as the paper:
+* A simple CNN with a projection head for CIFAR-10
+* A ResNet-50 with a projection head for CIFAR-100
+
+**Dataset:** This directory includes CIFAR-10 and CIFAR-100, partitioned in the same way as in the paper. The settings are as follows:
+
+| Dataset | partitioning method |
+| :------ | :---: |
+| CIFAR-10 | Dirichlet with beta 0.5 |
+| CIFAR-100 | Dirichlet with beta 0.5 |
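
The Dirichlet partitioning above works by drawing, for each class label, a vector of per-client proportions from Dirichlet(beta) and splitting that label's samples accordingly (see `partition_data` in `moon/dataset_preparation.py`). A minimal standard-library sketch of the sampler (the baseline itself uses `numpy.random.dirichlet`; this version normalizes Gamma draws, which yields the same distribution):

```python
import random


def dirichlet_shares(beta: float, num_clients: int) -> list[float]:
    """Draw one Dirichlet(beta) sample by normalizing Gamma(beta, 1) draws."""
    gammas = [random.gammavariate(beta, 1.0) for _ in range(num_clients)]
    total = sum(gammas)
    return [g / total for g in gammas]


random.seed(0)
# For each class label, split its samples across 10 clients with beta = 0.5;
# smaller beta makes the shares more skewed (more heterogeneity).
shares = dirichlet_shares(0.5, 10)
assert abs(sum(shares) - 1.0) < 1e-9  # shares form a probability vector
```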
+
+
+**Training Hyperparameters:**
+
+| Description | Default Value |
+| ----------- | ----- |
+| number of clients | 10 |
+| number of local epochs | 10 |
+| fraction fit | 1.0 |
+| batch size | 64 |
+| learning rate | 0.01 |
+| mu | 1 |
+| temperature | 0.5 |
+| alg | moon |
+| seed | 0 |
+| server_device | cpu |
+| number of rounds | 100 |
+| client resources | {'num_cpus': 2, 'num_gpus': 1} |
+
+## Environment Setup
+
+To construct the Python environment, follow these steps:
+
+```bash
+# Set local python version via pyenv
+pyenv local 3.10.6
+# Then fix that for poetry
+poetry env use 3.10.6
+# Then install poetry env
+poetry install
+
+# Activate the environment
+poetry shell
+```
+
+
+## Running the Experiments
+
+First ensure you have activated your Poetry environment (execute `poetry shell` from this directory). To run MOON on CIFAR-10 (Table 1 of the paper), you should run:
+```bash
+python -m moon.main --config-name cifar10
+```
+
+To run MOON on CIFAR-100 (Table 1 of the paper), you should run:
+```bash
+python -m moon.main --config-name cifar100
+```
+
+
+You can also run FedProx on CIFAR-10:
+```bash
+python -m moon.main --config-name cifar10_fedprox
+```
+
+To run FedProx on CIFAR-100:
+```bash
+python -m moon.main --config-name cifar100_fedprox
+```
+
+## Expected Results
+
+You can find the output logs of a single run at this [link](https://drive.google.com/drive/folders/1YZEU2NcHWEHVyuJMlc1QvBSAvNMjH-aR?usp=share_link). After running the above commands, you can see the accuracy list at the end of the output, which is the test accuracy of the global model. For example, in one run, for CIFAR-10 with MOON, the accuracy after running 100 rounds is 0.7071.
+
+For CIFAR-10 with FedProx, the accuracy after running 100 rounds is 0.6852. For CIFAR-100 with MOON, the accuracy after running 100 rounds is 0.6636. For CIFAR-100 with FedProx, the accuracy after running 100 rounds is 0.6494. The results are summarized below:
+
+
+| | CIFAR-10 | CIFAR-100 |
+| ----------- | ----- | ----- |
+| MOON | 0.7071 | 0.6636 |
+| FedProx| 0.6852 | 0.6494 |
+
+### Figure 6
+You can find the curves comparing MOON and FedProx on CIFAR-10 and CIFAR-100 below.
+
+![CIFAR-10: MOON vs. FedProx](_static/cifar10_moon_fedprox.png)
+
+![CIFAR-100: MOON vs. FedProx](_static/cifar100_moon_fedprox.png)
+
+You can tune the hyperparameter `mu` for both MOON and FedProx by changing the configuration files in `conf` (or by appending a standard Hydra override such as `mu=10` to the commands above).
+
+### Figure 8(a)
+You can run the experiments in Figure 8 of the paper. To run MOON (`mu=10`) on CIFAR-100 with 50 clients (Figure 8(a) of the paper):
+```bash
+python -m moon.main --config-name cifar100_50clients
+```
+
+To run FedProx on CIFAR-100 with 50 clients (Figure 8(a) of the paper):
+```bash
+python -m moon.main --config-name cifar100_50clients_fedprox
+```
+
+
+You can find the curve comparing MOON and FedProx below.
+
+![CIFAR-100 with 50 clients: MOON vs. FedProx](_static/cifar100_50clients_moon_fedprox.png)
+
+You may also run MOON on CIFAR-100 with 100 clients (Figure 8(b) of the paper):
+```bash
+python -m moon.main --config-name cifar100_100clients
+```
\ No newline at end of file
diff --git a/baselines/moon/_static/cifar100_50clients_moon_fedprox.png b/baselines/moon/_static/cifar100_50clients_moon_fedprox.png
new file mode 100644
index 000000000000..ecc1c99de230
Binary files /dev/null and b/baselines/moon/_static/cifar100_50clients_moon_fedprox.png differ
diff --git a/baselines/moon/_static/cifar100_moon_fedprox.png b/baselines/moon/_static/cifar100_moon_fedprox.png
new file mode 100644
index 000000000000..798d778cd1cc
Binary files /dev/null and b/baselines/moon/_static/cifar100_moon_fedprox.png differ
diff --git a/baselines/moon/_static/cifar10_moon_fedprox.png b/baselines/moon/_static/cifar10_moon_fedprox.png
new file mode 100644
index 000000000000..f5f18f1c08e9
Binary files /dev/null and b/baselines/moon/_static/cifar10_moon_fedprox.png differ
diff --git a/baselines/moon/moon/__init__.py b/baselines/moon/moon/__init__.py
new file mode 100644
index 000000000000..a5e567b59135
--- /dev/null
+++ b/baselines/moon/moon/__init__.py
@@ -0,0 +1 @@
+"""MOON baseline package."""
diff --git a/baselines/moon/moon/client.py b/baselines/moon/moon/client.py
new file mode 100644
index 000000000000..4903140009b5
--- /dev/null
+++ b/baselines/moon/moon/client.py
@@ -0,0 +1,155 @@
+"""Define your client class and a function to construct such clients.
+
+Please subclass `flwr.client.NumPyClient` or `flwr.client.Client` and create a function
+to instantiate your client.
+"""
+
+import copy
+import os
+from collections import OrderedDict
+from typing import Callable, Dict, List, Tuple
+
+import flwr as fl
+import torch
+from flwr.common.typing import NDArrays, Scalar
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+from moon.models import init_net, train_fedprox, train_moon
+
+
+# pylint: disable=too-many-instance-attributes
+class FlowerClient(fl.client.NumPyClient):
+ """Standard Flower client for CNN training."""
+
+ def __init__(
+ self,
+ # net: torch.nn.Module,
+ net_id: int,
+ dataset: str,
+ model: str,
+ output_dim: int,
+ trainloader: DataLoader,
+ valloader: DataLoader,
+ device: torch.device,
+ num_epochs: int,
+ learning_rate: float,
+ mu: float,
+ temperature: float,
+ model_dir: str,
+ alg: str,
+ ): # pylint: disable=too-many-arguments
+ self.net = init_net(dataset, model, output_dim)
+ self.net_id = net_id
+ self.dataset = dataset
+ self.model = model
+ self.output_dim = output_dim
+ self.trainloader = trainloader
+ self.valloader = valloader
+ self.device = device
+ self.num_epochs = num_epochs
+ self.learning_rate = learning_rate
+ self.mu = mu # pylint: disable=invalid-name
+ self.temperature = temperature
+ self.model_dir = model_dir
+ self.alg = alg
+
+ def get_parameters(self, config: Dict[str, Scalar]) -> NDArrays:
+ """Return the parameters of the current net."""
+ return [val.cpu().numpy() for _, val in self.net.state_dict().items()]
+
+ def set_parameters(self, parameters: NDArrays) -> None:
+ """Change the parameters of the model using the given ones."""
+ params_dict = zip(self.net.state_dict().keys(), parameters)
+ state_dict = OrderedDict({k: torch.from_numpy(v) for k, v in params_dict})
+ self.net.load_state_dict(state_dict, strict=True)
+
+ def fit(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[NDArrays, int, Dict]:
+ """Implement distributed fit function for a given client."""
+ self.set_parameters(parameters)
+ prev_net = init_net(self.dataset, self.model, self.output_dim)
+ if not os.path.exists(os.path.join(self.model_dir, str(self.net_id))):
+ prev_net = copy.deepcopy(self.net)
+ else:
+ # load previous model from model_dir
+ prev_net.load_state_dict(
+ torch.load(
+ os.path.join(self.model_dir, str(self.net_id), "prev_net.pt")
+ )
+ )
+ global_net = init_net(self.dataset, self.model, self.output_dim)
+ global_net.load_state_dict(self.net.state_dict())
+ if self.alg == "moon":
+ train_moon(
+ self.net,
+ global_net,
+ prev_net,
+ self.trainloader,
+ self.num_epochs,
+ self.learning_rate,
+ self.mu,
+ self.temperature,
+ self.device,
+ )
+ elif self.alg == "fedprox":
+ train_fedprox(
+ self.net,
+ global_net,
+ self.trainloader,
+ self.num_epochs,
+ self.learning_rate,
+ self.mu,
+ self.device,
+ )
+ if not os.path.exists(os.path.join(self.model_dir, str(self.net_id))):
+ os.makedirs(os.path.join(self.model_dir, str(self.net_id)))
+ torch.save(
+ self.net.state_dict(),
+ os.path.join(self.model_dir, str(self.net_id), "prev_net.pt"),
+ )
+ return self.get_parameters({}), len(self.trainloader), {"is_straggler": False}
+
+ def evaluate(
+ self, parameters: NDArrays, config: Dict[str, Scalar]
+ ) -> Tuple[float, int, Dict]:
+ """Implement distributed evaluation for a given client."""
+ self.set_parameters(parameters)
+ # Skip evaluation on the client side; the server evaluates the global model
+ loss = 0.0
+ accuracy = 0.0
+ return float(loss), len(self.valloader), {"accuracy": float(accuracy)}
+
+
+def gen_client_fn(
+ trainloaders: List[DataLoader],
+ testloaders: List[DataLoader],
+ cfg: DictConfig,
+) -> Callable[[str], FlowerClient]:
+ """Generate the client function that creates the Flower Clients."""
+
+ def client_fn(cid: str) -> FlowerClient:
+ """Create a Flower client representing a single organization."""
+ device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
+
+ trainloader = trainloaders[int(cid)]
+ testloader = testloaders[int(cid)]
+
+ return FlowerClient(
+ int(cid),
+ cfg.dataset.name,
+ cfg.model.name,
+ cfg.model.output_dim,
+ trainloader,
+ testloader,
+ device,
+ cfg.num_epochs,
+ cfg.learning_rate,
+ cfg.mu,
+ cfg.temperature,
+ cfg.model.dir,
+ cfg.alg,
+ )
+
+ return client_fn
diff --git a/baselines/moon/moon/conf/base.yaml b/baselines/moon/moon/conf/base.yaml
new file mode 100644
index 000000000000..a2d3ddfb7bde
--- /dev/null
+++ b/baselines/moon/moon/conf/base.yaml
@@ -0,0 +1,33 @@
+---
+# this is the config that will be loaded as default by main.py
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 10
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 1
+temperature: 0.5
+alg: moon
+seed: 0
+server_device: cpu
+num_rounds: 100
+
+client_resources:
+ num_cpus: 2
+ num_gpus: 1
+
+dataset:
+ # dataset config
+ name: cifar10
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: simple-cnn
+ output_dim: 256
+ dir: ./models/moon/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar10.yaml b/baselines/moon/moon/conf/cifar10.yaml
new file mode 100644
index 000000000000..672427495dfe
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar10.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar10`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 10
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 5
+temperature: 0.5
+alg: moon
+seed: 0
+server_device: cpu
+num_rounds: 100
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.2
+
+dataset:
+ # dataset config
+ name: cifar10
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: simple-cnn
+ output_dim: 256
+ dir: ./client_states/moon/cifar10/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar100.yaml b/baselines/moon/moon/conf/cifar100.yaml
new file mode 100644
index 000000000000..33dc6d289456
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar100.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar100`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 10
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 1
+temperature: 0.5
+alg: moon
+seed: 0
+server_device: cpu
+num_rounds: 100
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.5
+
+dataset:
+ # dataset config
+ name: cifar100
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: resnet50
+ output_dim: 256
+ dir: ./client_states/moon/cifar100/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar100_100clients.yaml b/baselines/moon/moon/conf/cifar100_100clients.yaml
new file mode 100644
index 000000000000..b314497b9411
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar100_100clients.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar100_100clients`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 100
+num_epochs: 10
+fraction_fit: 0.2
+batch_size: 64
+learning_rate: 0.01
+mu: 10
+temperature: 0.5
+alg: moon
+seed: 0
+server_device: cpu
+num_rounds: 500
+
+client_resources:
+ num_cpus: 8
+ num_gpus: 0.5
+
+dataset:
+ # dataset config
+ name: cifar100
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: resnet50
+ output_dim: 256
+ dir: ./client_states/moon/cifar100_100c/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar100_50clients.yaml b/baselines/moon/moon/conf/cifar100_50clients.yaml
new file mode 100644
index 000000000000..d8c5877a1dcc
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar100_50clients.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar100_50clients`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 50
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 10
+temperature: 0.5
+alg: moon
+seed: 0
+server_device: cpu
+num_rounds: 200
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.5
+
+dataset:
+ # dataset config
+ name: cifar100
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: resnet50
+ output_dim: 256
+ dir: ./client_states/moon/cifar100_50clients/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar100_50clients_fedprox.yaml b/baselines/moon/moon/conf/cifar100_50clients_fedprox.yaml
new file mode 100644
index 000000000000..69691021438a
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar100_50clients_fedprox.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar100_50clients_fedprox`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 50
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 0.001
+temperature: 0.5
+alg: fedprox
+seed: 0
+server_device: cpu
+num_rounds: 200
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.5
+
+dataset:
+ # dataset config
+ name: cifar100
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: resnet50
+ output_dim: 256
+ dir: ./client_states/fedprox/cifar100_50clients/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar100_fedprox.yaml b/baselines/moon/moon/conf/cifar100_fedprox.yaml
new file mode 100644
index 000000000000..1544f8e3a348
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar100_fedprox.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar100_fedprox`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 10
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 0.001
+temperature: 0.5
+alg: fedprox
+seed: 0
+server_device: cpu
+num_rounds: 100
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.5
+
+dataset:
+ # dataset config
+ name: cifar100
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: resnet50
+ output_dim: 256
+ dir: ./client_states/moon/cifar100_fedprox/
\ No newline at end of file
diff --git a/baselines/moon/moon/conf/cifar10_fedprox.yaml b/baselines/moon/moon/conf/cifar10_fedprox.yaml
new file mode 100644
index 000000000000..d0f9c5e8e163
--- /dev/null
+++ b/baselines/moon/moon/conf/cifar10_fedprox.yaml
@@ -0,0 +1,33 @@
+---
+# config loaded when running `python -m moon.main --config-name cifar10_fedprox`
+# Please follow the provided structure (this will ensure all baselines follow
+# a similar configuration structure and hence be easy to customise)
+
+num_clients: 10
+num_epochs: 10
+fraction_fit: 1.0
+batch_size: 64
+learning_rate: 0.01
+mu: 0.001
+temperature: 0.5
+alg: fedprox
+seed: 0
+server_device: cpu
+num_rounds: 100
+
+client_resources:
+ num_cpus: 4
+ num_gpus: 0.2
+
+dataset:
+ # dataset config
+ name: cifar10
+ dir: ./data/moon/
+ partition: noniid
+ beta: 0.5
+
+model:
+ # model config
+ name: simple-cnn
+ output_dim: 256
+ dir: ./client_states/moon/cifar10_fedprox/
\ No newline at end of file
diff --git a/baselines/moon/moon/dataset.py b/baselines/moon/moon/dataset.py
new file mode 100644
index 000000000000..0ec5c6ae9e27
--- /dev/null
+++ b/baselines/moon/moon/dataset.py
@@ -0,0 +1,271 @@
+"""Handle basic dataset creation.
+
+In the case of PyTorch, it should return dataloaders for your dataset (for both the
+clients and the server). If you are using a custom dataset class, this module is the
+place to define it. If your dataset needs to be downloaded (and this is not done
+automatically -- e.g. as is the case for many datasets in TorchVision) and
+partitioned, please include all those functions and logic in the
+`dataset_preparation.py` module. Of course, you can call all those functions from
+the functions/methods defined here.
+"""
+
+# https://github.com/QinbinLi/MOON/blob/main/datasets.py
+
+import logging
+import os
+
+import numpy as np
+import torch.nn.functional as F
+import torch.utils.data as data
+import torchvision
+import torchvision.transforms as transforms
+from PIL import Image
+from torch.autograd import Variable
+from torchvision.datasets import CIFAR10, CIFAR100
+
+logging.basicConfig()
+logger = logging.getLogger()
+logger.setLevel(logging.INFO)
+
+IMG_EXTENSIONS = (
+ ".jpg",
+ ".jpeg",
+ ".png",
+ ".ppm",
+ ".bmp",
+ ".pgm",
+ ".tif",
+ ".tiff",
+ ".webp",
+)
+
+
+class CIFAR10Sub(data.Dataset):
+ """CIFAR-10 dataset with idxs."""
+
+ def __init__(
+ self,
+ root,
+ dataidxs=None,
+ train=True,
+ transform=None,
+ target_transform=None,
+ download=False,
+ ):
+ self.root = root
+ self.dataidxs = dataidxs
+ self.train = train
+ self.transform = transform
+ self.target_transform = target_transform
+ self.download = download
+
+ self.data, self.target = self.__build_sub_dataset__()
+
+ def __build_sub_dataset__(self):
+ """Build sub dataset given idxs."""
+ cifar_dataobj = CIFAR10(
+ self.root, self.train, self.transform, self.target_transform, self.download
+ )
+
+ if torchvision.__version__ == "0.2.1":
+ if self.train:
+ # pylint: disable=redefined-outer-name
+ data, target = cifar_dataobj.train_data, np.array(
+ cifar_dataobj.train_labels
+ )
+ else:
+ # pylint: disable=redefined-outer-name
+ data, target = cifar_dataobj.test_data, np.array(
+ cifar_dataobj.test_labels
+ )
+ else:
+ data = cifar_dataobj.data
+ target = np.array(cifar_dataobj.targets)
+
+ if self.dataidxs is not None:
+ data = data[self.dataidxs]
+ target = target[self.dataidxs]
+
+ return data, target
+
+ def __getitem__(self, index):
+ """Get item by index.
+
+ Args:
+ index (int): Index.
+
+ Returns
+ -------
+ tuple: (image, target) where target is index of the target class.
+ """
+ img, target = self.data[index], self.target[index]
+
+ if self.transform is not None:
+ img = self.transform(img)
+
+ if self.target_transform is not None:
+ target = self.target_transform(target)
+
+ return img, target
+
+ def __len__(self):
+ """Length.
+
+ Returns
+ -------
+ int: length of data
+ """
+ return len(self.data)
+
+
+class CIFAR100Sub(data.Dataset):
+ """CIFAR-100 dataset with idxs."""
+
+ def __init__(
+ self,
+ root,
+ dataidxs=None,
+ train=True,
+ transform=None,
+ target_transform=None,
+ download=False,
+ ):
+ self.root = root
+ self.dataidxs = dataidxs
+ self.train = train
+ self.transform = transform
+ self.target_transform = target_transform
+ self.download = download
+
+ self.data, self.target = self.__build_sub_dataset__()
+
+ def __build_sub_dataset__(self):
+ """Build sub dataset given idxs."""
+ cifar_dataobj = CIFAR100(
+ self.root, self.train, self.transform, self.target_transform, self.download
+ )
+
+ if torchvision.__version__ == "0.2.1":
+ if self.train:
+ # pylint: disable=redefined-outer-name
+ data, target = cifar_dataobj.train_data, np.array(
+ cifar_dataobj.train_labels
+ )
+ else:
+ data, target = cifar_dataobj.test_data, np.array(
+ cifar_dataobj.test_labels
+ ) # pylint: disable=redefined-outer-name
+ else:
+ data = cifar_dataobj.data
+ target = np.array(cifar_dataobj.targets)
+
+ if self.dataidxs is not None:
+ data = data[self.dataidxs]
+ target = target[self.dataidxs]
+
+ return data, target
+
+ def __getitem__(self, index):
+ """Get item by index.
+
+ Args:
+ index (int): Index.
+
+ Returns
+ -------
+ tuple: (image, target) where target is index of the target class.
+ """
+ img, target = self.data[index], self.target[index]
+ img = Image.fromarray(img)
+
+ if self.transform is not None:
+ img = self.transform(img)
+
+ if self.target_transform is not None:
+ target = self.target_transform(target)
+
+ return img, target
+
+ def __len__(self):
+ """Length.
+
+ Returns
+ -------
+ int: length of data
+ """
+ return len(self.data)
+
+
+def get_dataloader(dataset, datadir, train_bs, test_bs, dataidxs=None, noise_level=0):
+ """Get dataloader for a given dataset."""
+ if dataset == "cifar10":
+ dl_obj = CIFAR10Sub
+ normalize = transforms.Normalize(
+ mean=[x / 255.0 for x in [125.3, 123.0, 113.9]],
+ std=[x / 255.0 for x in [63.0, 62.1, 66.7]],
+ )
+ transform_train = transforms.Compose(
+ [
+ transforms.ToTensor(),
+ transforms.Lambda(
+ lambda x: F.pad(
+ Variable(x.unsqueeze(0), requires_grad=False),
+ (4, 4, 4, 4),
+ mode="reflect",
+ ).data.squeeze()
+ ),
+ transforms.ToPILImage(),
+ transforms.ColorJitter(brightness=noise_level),
+ transforms.RandomCrop(32),
+ transforms.RandomHorizontalFlip(),
+ transforms.ToTensor(),
+ normalize,
+ ]
+ )
+ # data prep for test set
+ transform_test = transforms.Compose([transforms.ToTensor(), normalize])
+
+ elif dataset == "cifar100":
+ dl_obj = CIFAR100Sub
+
+ normalize = transforms.Normalize(
+ mean=[0.5070751592371323, 0.48654887331495095, 0.4409178433670343],
+ std=[0.2673342858792401, 0.2564384629170883, 0.27615047132568404],
+ )
+
+ transform_train = transforms.Compose(
+ [
+ transforms.RandomCrop(32, padding=4),
+ transforms.RandomHorizontalFlip(),
+ transforms.RandomRotation(15),
+ transforms.ToTensor(),
+ normalize,
+ ]
+ )
+ # data prep for test set
+ transform_test = transforms.Compose([transforms.ToTensor(), normalize])
+ if dataset == "cifar10" and os.path.isdir(
+ os.path.join(datadir, "cifar-10-batches-py")
+ ):
+ download = False
+ elif dataset == "cifar100" and os.path.isdir(
+ os.path.join(datadir, "cifar-100-python")
+ ):
+ download = False
+ else:
+ download = True
+ train_ds = dl_obj(
+ datadir,
+ dataidxs=dataidxs,
+ train=True,
+ transform=transform_train,
+ download=download,
+ )
+ test_ds = dl_obj(datadir, train=False, transform=transform_test, download=download)
+
+ train_dl = data.DataLoader(
+ dataset=train_ds, batch_size=train_bs, drop_last=True, shuffle=True
+ )
+ test_dl = data.DataLoader(dataset=test_ds, batch_size=test_bs, shuffle=False)
+
+ return train_dl, test_dl, train_ds, test_ds
diff --git a/baselines/moon/moon/dataset_preparation.py b/baselines/moon/moon/dataset_preparation.py
new file mode 100644
index 000000000000..11103d37763b
--- /dev/null
+++ b/baselines/moon/moon/dataset_preparation.py
@@ -0,0 +1,100 @@
+"""Handle the dataset partitioning and (optionally) complex downloads.
+
+Please add here all the necessary logic to either download, uncompress, pre/post-process
+your dataset (or all of the above). If the desired way of running your baseline is to
+first download the dataset and partition it and then run the experiments, please
+uncomment the lines below and tell us in the README.md (see the "Running the Experiment"
+block) that this file should be executed first.
+"""
+
+import numpy as np
+import torchvision.transforms as transforms
+
+from moon.dataset import CIFAR10Sub, CIFAR100Sub
+
+
+def load_cifar10_data(datadir):
+ """Load CIFAR10 dataset."""
+ transform = transforms.Compose([transforms.ToTensor()])
+
+ cifar10_train_ds = CIFAR10Sub(
+ datadir, train=True, download=True, transform=transform
+ )
+ cifar10_test_ds = CIFAR10Sub(
+ datadir, train=False, download=True, transform=transform
+ )
+
+ X_train, y_train = cifar10_train_ds.data, cifar10_train_ds.target
+ X_test, y_test = cifar10_test_ds.data, cifar10_test_ds.target
+
+ return (X_train, y_train, X_test, y_test)
+
+
+def load_cifar100_data(datadir):
+ """Load CIFAR100 dataset."""
+ transform = transforms.Compose([transforms.ToTensor()])
+
+ cifar100_train_ds = CIFAR100Sub(
+ datadir, train=True, download=True, transform=transform
+ )
+ cifar100_test_ds = CIFAR100Sub(
+ datadir, train=False, download=True, transform=transform
+ )
+
+ X_train, y_train = cifar100_train_ds.data, cifar100_train_ds.target
+ X_test, y_test = cifar100_test_ds.data, cifar100_test_ds.target
+
+ return (X_train, y_train, X_test, y_test)
+
+
+# pylint: disable=too-many-locals
+def partition_data(dataset, datadir, partition, num_clients, beta):
+ """Partition data into train and test sets for IID and non-IID experiments."""
+ if dataset == "cifar10":
+ X_train, y_train, X_test, y_test = load_cifar10_data(datadir)
+ elif dataset == "cifar100":
+ X_train, y_train, X_test, y_test = load_cifar100_data(datadir)
+
+ n_train = y_train.shape[0]
+
+ if partition in ("homo", "iid"):
+ idxs = np.random.permutation(n_train)
+ batch_idxs = np.array_split(idxs, num_clients)
+ net_dataidx_map = {i: batch_idxs[i] for i in range(num_clients)}
+
+ elif partition in ("noniid-labeldir", "noniid"):
+ min_size = 0
+ min_require_size = 10
+ K = 10
+ if dataset == "cifar100":
+ K = 100
+ elif dataset == "tinyimagenet":
+ K = 200
+
+ N = y_train.shape[0]
+ net_dataidx_map = {}
+
+ while min_size < min_require_size:
+ idx_batch = [[] for _ in range(num_clients)]
+ for k in range(K):
+ idx_k = np.where(y_train == k)[0]
+ np.random.shuffle(idx_k)
+ proportions = np.random.dirichlet(np.repeat(beta, num_clients))
+ proportions = np.array(
+ [
+ p * (len(idx_j) < N / num_clients)
+ for p, idx_j in zip(proportions, idx_batch)
+ ]
+ )
+ proportions = proportions / proportions.sum()
+ proportions = (np.cumsum(proportions) * len(idx_k)).astype(int)[:-1]
+ idx_batch = [
+ idx_j + idx.tolist()
+ for idx_j, idx in zip(idx_batch, np.split(idx_k, proportions))
+ ]
+ min_size = min([len(idx_j) for idx_j in idx_batch])
+ for j in range(num_clients):
+ np.random.shuffle(idx_batch[j])
+ net_dataidx_map[j] = idx_batch[j]
+
+ return (X_train, y_train, X_test, y_test, net_dataidx_map)
diff --git a/baselines/moon/moon/main.py b/baselines/moon/moon/main.py
new file mode 100644
index 000000000000..902ccfa8395c
--- /dev/null
+++ b/baselines/moon/moon/main.py
@@ -0,0 +1,150 @@
+"""Create and connect the building blocks for your experiments; start the simulation.
+
+It includes preprocessing the dataset, instantiating the strategy, specifying how the
+global model is going to be evaluated, etc. At the end, this script saves the results.
+"""
+import os
+import random
+import shutil
+from pathlib import Path
+
+# these are the basic packages you'll need here
+# feel free to remove some if they aren't needed
+import flwr as fl
+import hydra
+import numpy as np
+import torch
+from hydra.core.hydra_config import HydraConfig
+from omegaconf import DictConfig, OmegaConf
+
+from moon import client, server
+from moon.dataset import get_dataloader
+from moon.dataset_preparation import partition_data
+from moon.utils import plot_metric_from_history
+
+
+@hydra.main(config_path="conf", config_name="base", version_base=None)
+def main(cfg: DictConfig) -> None:
+ """Run the baseline.
+
+ Parameters
+ ----------
+ cfg : DictConfig
+ An omegaconf object that stores the hydra config.
+ """
+ # Clean the model directory to save models for MOON
+ if cfg.alg == "moon":
+ if os.path.exists(cfg.model.dir):
+ shutil.rmtree(cfg.model.dir)
+ # 1. Print parsed config
+ print(OmegaConf.to_yaml(cfg))
+
+ # 2. Prepare your dataset
+ np.random.seed(cfg.seed)
+ torch.manual_seed(cfg.seed)
+ if torch.cuda.is_available():
+ torch.cuda.manual_seed(cfg.seed)
+ random.seed(cfg.seed)
+ (
+ _,
+ _,
+ _,
+ _,
+ net_dataidx_map,
+ ) = partition_data(
+ dataset=cfg.dataset.name,
+ datadir=cfg.dataset.dir,
+ partition=cfg.dataset.partition,
+ num_clients=cfg.num_clients,
+ beta=cfg.dataset.beta,
+ )
+
+ _, test_global_dl, _, _ = get_dataloader(
+ dataset=cfg.dataset.name,
+ datadir=cfg.dataset.dir,
+ train_bs=cfg.batch_size,
+ test_bs=32,
+ )
+
+ trainloaders = []
+ testloaders = []
+ for idx in range(cfg.num_clients):
+ train_dl, test_dl, _, _ = get_dataloader(
+ cfg.dataset.name, cfg.dataset.dir, cfg.batch_size, 32, net_dataidx_map[idx]
+ )
+
+ trainloaders.append(train_dl)
+ testloaders.append(test_dl)
+ # 3. Define your clients
+ # Define a function that returns another function that will be used during
+ # simulation to instantiate each individual client
+ client_fn = client.gen_client_fn(
+ trainloaders=trainloaders,
+ testloaders=testloaders,
+ cfg=cfg,
+ )
+
+    # Get the function that will be executed by the strategy's evaluate() method
+    # Set the server's device
+ device = (
+ torch.device("cuda:0")
+ if torch.cuda.is_available() and cfg.server_device == "cuda"
+ else "cpu"
+ )
+ evaluate_fn = server.gen_evaluate_fn(test_global_dl, device=device, cfg=cfg)
+
+ # 4. Define your strategy
+ strategy = fl.server.strategy.FedAvg(
+ # Clients in MOON do not perform federated evaluation
+ # (see the client's evaluate())
+ fraction_fit=cfg.fraction_fit,
+ fraction_evaluate=0.0,
+ evaluate_fn=evaluate_fn,
+ )
+ # 5. Start Simulation
+ history = fl.simulation.start_simulation(
+ client_fn=client_fn,
+ num_clients=cfg.num_clients,
+ config=fl.server.ServerConfig(num_rounds=cfg.num_rounds),
+ client_resources={
+ "num_cpus": cfg.client_resources.num_cpus,
+ "num_gpus": cfg.client_resources.num_gpus,
+ },
+ strategy=strategy,
+ )
+ # remove saved models
+ if cfg.alg == "moon":
+ shutil.rmtree(cfg.model.dir)
+
+ # 6. Save your results
+ # Experiment completed. Now we save the results and
+ # generate plots using the `history`
+ print("................")
+ print(history)
+
+ # Hydra automatically creates an output directory
+ # Let's retrieve it and save some results there
+ save_path = HydraConfig.get().runtime.output_dir
+
+ # plot results and include them in the readme
+ strategy_name = strategy.__class__.__name__
+ file_suffix: str = (
+ f"_{strategy_name}"
+        f"{'_' + cfg.dataset.name if cfg.dataset.name else ''}"
+ f"_C={cfg.num_clients}"
+ f"_B={cfg.batch_size}"
+ f"_E={cfg.num_epochs}"
+ f"_R={cfg.num_rounds}"
+ f"_mu={cfg.mu}"
+ )
+
+ plot_metric_from_history(
+ history,
+ Path(save_path),
+        file_suffix,
+ )
+
+
+if __name__ == "__main__":
+ main()
diff --git a/baselines/moon/moon/models.py b/baselines/moon/moon/models.py
new file mode 100644
index 000000000000..a323a8e74727
--- /dev/null
+++ b/baselines/moon/moon/models.py
@@ -0,0 +1,528 @@
+"""Define our models, and training and eval functions.
+
+If your model is 100% off-the-shelf (e.g. directly from torchvision without requiring
+modifications) you might be better off instantiating your model directly from the Hydra
+config. In this way, swapping your model for another one can be done without changing
+the Python code at all.
+"""
+
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+from moon.utils import compute_accuracy
+
+
+def conv3x3(in_planes, out_planes, stride=1, groups=1, dilation=1):
+ """3x3 convolution with padding."""
+ return nn.Conv2d(
+ in_planes,
+ out_planes,
+ kernel_size=3,
+ stride=stride,
+ padding=dilation,
+ groups=groups,
+ bias=False,
+ dilation=dilation,
+ )
+
+
+def conv1x1(in_planes, out_planes, stride=1):
+ """1x1 convolution."""
+ return nn.Conv2d(in_planes, out_planes, kernel_size=1, stride=stride, bias=False)
+
+
+class BasicBlock(nn.Module):
+ """Basic Block for resnet."""
+
+ expansion = 1
+
+ def __init__(
+ self,
+ inplanes,
+ planes,
+ stride=1,
+ downsample=None,
+ groups=1,
+ base_width=64,
+ dilation=1,
+ norm_layer=None,
+ ):
+ super().__init__()
+ if norm_layer is None:
+ norm_layer = nn.BatchNorm2d
+ if groups != 1 or base_width != 64:
+ raise ValueError("BasicBlock only supports groups=1 and base_width=64")
+ if dilation > 1:
+ raise NotImplementedError("Dilation > 1 not supported in BasicBlock")
+ self.conv1 = conv3x3(inplanes, planes, stride)
+ self.bn1 = norm_layer(planes)
+ self.relu = nn.ReLU(inplace=True)
+ self.conv2 = conv3x3(planes, planes)
+ self.bn2 = norm_layer(planes)
+ self.downsample = downsample
+ self.stride = stride
+
+ def forward(self, x):
+ """Forward."""
+ identity = x
+
+ out = self.conv1(x)
+ out = self.bn1(out)
+ out = self.relu(out)
+
+ out = self.conv2(out)
+ out = self.bn2(out)
+
+ if self.downsample is not None:
+ identity = self.downsample(x)
+
+ out += identity
+ out = self.relu(out)
+
+ return out
+
+
+class Bottleneck(nn.Module):
+    """Bottleneck block; as in torchvision, the downsampling stride is on conv2."""
+
+ expansion = 4
+
+ def __init__(
+ self,
+ inplanes,
+ planes,
+ stride=1,
+ downsample=None,
+ groups=1,
+ base_width=64,
+ dilation=1,
+ norm_layer=None,
+ ):
+ super().__init__()
+ if norm_layer is None:
+ norm_layer = nn.BatchNorm2d
+ width = int(planes * (base_width / 64.0)) * groups
+ self.conv1 = conv1x1(inplanes, width)
+ self.bn1 = norm_layer(width)
+ self.conv2 = conv3x3(width, width, stride, groups, dilation)
+ self.bn2 = norm_layer(width)
+ self.conv3 = conv1x1(width, planes * self.expansion)
+ self.bn3 = norm_layer(planes * self.expansion)
+ self.relu = nn.ReLU(inplace=True)
+ self.downsample = downsample
+ self.stride = stride
+
+ def forward(self, x):
+ """Forward."""
+ identity = x
+
+ out = self.conv1(x)
+ out = self.bn1(out)
+ out = self.relu(out)
+
+ out = self.conv2(out)
+ out = self.bn2(out)
+ out = self.relu(out)
+
+ out = self.conv3(out)
+ out = self.bn3(out)
+
+ if self.downsample is not None:
+ identity = self.downsample(x)
+
+ out += identity
+ out = self.relu(out)
+
+ return out
+
+
+class ResNetCifar10(nn.Module):
+ """ResNet model."""
+
+ def __init__(
+ self,
+ block,
+ layers,
+ num_classes=1000,
+ zero_init_residual=False,
+ groups=1,
+ width_per_group=64,
+ replace_stride_with_dilation=None,
+ norm_layer=None,
+ ):
+ super().__init__()
+ if norm_layer is None:
+ norm_layer = nn.BatchNorm2d
+ self._norm_layer = norm_layer
+
+ self.inplanes = 64
+ self.dilation = 1
+ if replace_stride_with_dilation is None:
+ # each element in the tuple indicates if we should replace
+ # the 2x2 stride with a dilated convolution instead
+ replace_stride_with_dilation = [False, False, False]
+ if len(replace_stride_with_dilation) != 3:
+ raise ValueError(
+ "replace_stride_with_dilation should be None "
+ "or a 3-element tuple, got {}".format(replace_stride_with_dilation)
+ )
+ self.groups = groups
+ self.base_width = width_per_group
+ self.conv1 = nn.Conv2d(
+ 3, self.inplanes, kernel_size=3, stride=1, padding=1, bias=False
+ )
+ self.bn1 = norm_layer(self.inplanes)
+ self.relu = nn.ReLU(inplace=True)
+ self.layer1 = self._make_layer(block, 64, layers[0])
+ self.layer2 = self._make_layer(
+ block, 128, layers[1], stride=2, dilate=replace_stride_with_dilation[0]
+ )
+ self.layer3 = self._make_layer(
+ block, 256, layers[2], stride=2, dilate=replace_stride_with_dilation[1]
+ )
+ self.layer4 = self._make_layer(
+ block, 512, layers[3], stride=2, dilate=replace_stride_with_dilation[2]
+ )
+ self.avgpool = nn.AdaptiveAvgPool2d((1, 1))
+ self.fc = nn.Linear(512 * block.expansion, num_classes)
+
+ for module in self.modules():
+ if isinstance(module, nn.Conv2d):
+ nn.init.kaiming_normal_(
+ module.weight, mode="fan_out", nonlinearity="relu"
+ )
+ elif isinstance(module, (nn.BatchNorm2d, nn.GroupNorm)):
+ nn.init.constant_(module.weight, 1)
+ nn.init.constant_(module.bias, 0)
+
+ if zero_init_residual:
+ for module in self.modules():
+ if isinstance(module, Bottleneck):
+ nn.init.constant_(module.bn3.weight, 0)
+ elif isinstance(module, BasicBlock):
+ nn.init.constant_(module.bn2.weight, 0)
+
+ def _make_layer(self, block, planes, blocks, stride=1, dilate=False):
+ norm_layer = self._norm_layer
+ downsample = None
+ previous_dilation = self.dilation
+ if dilate:
+ self.dilation *= stride
+ stride = 1
+ if stride != 1 or self.inplanes != planes * block.expansion:
+ downsample = nn.Sequential(
+ conv1x1(self.inplanes, planes * block.expansion, stride),
+ norm_layer(planes * block.expansion),
+ )
+
+ layers = []
+ layers.append(
+ block(
+ self.inplanes,
+ planes,
+ stride,
+ downsample,
+ self.groups,
+ self.base_width,
+ previous_dilation,
+ norm_layer,
+ )
+ )
+ self.inplanes = planes * block.expansion
+ for _ in range(1, blocks):
+ layers.append(
+ block(
+ self.inplanes,
+ planes,
+ groups=self.groups,
+ base_width=self.base_width,
+ dilation=self.dilation,
+ norm_layer=norm_layer,
+ )
+ )
+
+ return nn.Sequential(*layers)
+
+ def _forward_impl(self, x):
+ # See note [TorchScript super()]
+ x = self.conv1(x)
+ x = self.bn1(x)
+ x = self.relu(x)
+
+ x = self.layer1(x)
+ x = self.layer2(x)
+ x = self.layer3(x)
+ x = self.layer4(x)
+
+ x = self.avgpool(x)
+ x = torch.flatten(x, 1)
+ x = self.fc(x)
+
+ return x
+
+ def forward(self, x):
+ """Forward."""
+ return self._forward_impl(x)
+
+
+def resnet50_cifar10(**kwargs):
+    """ResNet-50 model from "Deep Residual Learning for Image Recognition".
+
+    Adapted for CIFAR-10: a 3x3 stem convolution with stride 1 and no initial
+    max-pooling. Keyword arguments are forwarded to ``ResNetCifar10``.
+    """
+ return ResNetCifar10(Bottleneck, [3, 4, 6, 3], **kwargs)
+
+
+class SimpleCNNHeader(nn.Module):
+ """Simple CNN model."""
+
+ def __init__(self, input_dim, hidden_dims):
+ super().__init__()
+ self.conv1 = nn.Conv2d(3, 6, 5)
+ self.relu = nn.ReLU()
+ self.pool = nn.MaxPool2d(2, 2)
+ self.conv2 = nn.Conv2d(6, 16, 5)
+
+ self.fc1 = nn.Linear(input_dim, hidden_dims[0])
+ self.fc2 = nn.Linear(hidden_dims[0], hidden_dims[1])
+
+ def forward(self, x):
+ """Forward."""
+ x = self.pool(self.relu(self.conv1(x)))
+ x = self.pool(self.relu(self.conv2(x)))
+ x = x.view(-1, 16 * 5 * 5)
+
+ x = self.relu(self.fc1(x))
+ x = self.relu(self.fc2(x))
+ return x
+
+
+class ModelMOON(nn.Module):
+ """Model for MOON."""
+
+ def __init__(self, base_model, out_dim, n_classes):
+ super().__init__()
+
+ if base_model in (
+ "resnet50-cifar10",
+ "resnet50-cifar100",
+ "resnet50-smallkernel",
+ "resnet50",
+ ):
+ basemodel = resnet50_cifar10()
+ self.features = nn.Sequential(*list(basemodel.children())[:-1])
+ num_ftrs = basemodel.fc.in_features
+ elif base_model == "simple-cnn":
+ self.features = SimpleCNNHeader(
+ input_dim=(16 * 5 * 5), hidden_dims=[120, 84]
+ )
+ num_ftrs = 84
+        else:
+            raise ValueError(f"Unknown base model: {base_model}")
+
+ # projection MLP
+ self.l1 = nn.Linear(num_ftrs, num_ftrs)
+ self.l2 = nn.Linear(num_ftrs, out_dim)
+
+ # last layer
+ self.l3 = nn.Linear(out_dim, n_classes)
+
+ def _get_basemodel(self, model_name):
+ try:
+ model = self.model_dict[model_name]
+ return model
+ except KeyError as err:
+ raise ValueError("Invalid model name.") from err
+
+ def forward(self, x):
+ """Forward."""
+ h = self.features(x)
+ h = h.squeeze()
+ x = self.l1(h)
+ x = F.relu(x)
+ x = self.l2(x)
+
+ y = self.l3(x)
+ return h, x, y
+
+
+def init_net(dataset, model, output_dim, device="cpu"):
+ """Initialize model."""
+    if dataset == "cifar10":
+        n_classes = 10
+    elif dataset == "cifar100":
+        n_classes = 100
+    elif dataset == "tinyimagenet":
+        n_classes = 200
+
+ net = ModelMOON(model, output_dim, n_classes)
+ if device == "cpu":
+ net.to(device)
+ else:
+ net = net.cuda()
+
+ return net
+
+
+def train_moon(
+ net,
+ global_net,
+ previous_net,
+ train_dataloader,
+ epochs,
+ lr,
+ mu,
+ temperature,
+ device="cpu",
+):
+ """Training function for MOON."""
+ net.to(device)
+ global_net.to(device)
+ previous_net.to(device)
+ train_acc, _ = compute_accuracy(net, train_dataloader, device=device)
+ optimizer = optim.SGD(
+ filter(lambda p: p.requires_grad, net.parameters()),
+ lr=lr,
+ momentum=0.9,
+ weight_decay=1e-5,
+ )
+
+    criterion = nn.CrossEntropyLoss().to(device)
+
+ previous_net.eval()
+ for param in previous_net.parameters():
+ param.requires_grad = False
+    previous_net.to(device)
+
+ cnt = 0
+ cos = torch.nn.CosineSimilarity(dim=-1)
+
+ for epoch in range(epochs):
+ epoch_loss_collector = []
+ epoch_loss1_collector = []
+ epoch_loss2_collector = []
+ for _, (x, target) in enumerate(train_dataloader):
+ x, target = x.to(device), target.to(device)
+
+ optimizer.zero_grad()
+ x.requires_grad = False
+ target.requires_grad = False
+ target = target.long()
+
+ # pro1 is the representation by the current model (Line 14 of Algorithm 1)
+ _, pro1, out = net(x)
+ # pro2 is the representation by the global model (Line 15 of Algorithm 1)
+ _, pro2, _ = global_net(x)
+ # posi is the positive pair
+ posi = cos(pro1, pro2)
+ logits = posi.reshape(-1, 1)
+
+ previous_net.to(device)
+ # pro 3 is the representation by the previous model (Line 16 of Algorithm 1)
+ _, pro3, _ = previous_net(x)
+ # nega is the negative pair
+ nega = cos(pro1, pro3)
+ logits = torch.cat((logits, nega.reshape(-1, 1)), dim=1)
+
+ previous_net.to("cpu")
+ logits /= temperature
+            labels = torch.zeros(x.size(0)).to(device).long()
+ # compute the model-contrastive loss (Line 17 of Algorithm 1)
+ loss2 = mu * criterion(logits, labels)
+ # compute the cross-entropy loss (Line 13 of Algorithm 1)
+ loss1 = criterion(out, target)
+ # compute the loss (Line 18 of Algorithm 1)
+ loss = loss1 + loss2
+
+ loss.backward()
+ optimizer.step()
+
+ cnt += 1
+ epoch_loss_collector.append(loss.item())
+ epoch_loss1_collector.append(loss1.item())
+ epoch_loss2_collector.append(loss2.item())
+
+ epoch_loss = sum(epoch_loss_collector) / len(epoch_loss_collector)
+ epoch_loss1 = sum(epoch_loss1_collector) / len(epoch_loss1_collector)
+ epoch_loss2 = sum(epoch_loss2_collector) / len(epoch_loss2_collector)
+ print(
+ "Epoch: %d Loss: %f Loss1: %f Loss2: %f"
+ % (epoch, epoch_loss, epoch_loss1, epoch_loss2)
+ )
+
+ previous_net.to("cpu")
+ train_acc, _ = compute_accuracy(net, train_dataloader, device=device)
+
+ print(">> Training accuracy: %f" % train_acc)
+ net.to("cpu")
+ global_net.to("cpu")
+ print(" ** Training complete **")
+ return net
+
+
+def train_fedprox(net, global_net, train_dataloader, epochs, lr, mu, device="cpu"):
+ """Training function for FedProx."""
+    net = nn.DataParallel(net)
+    net.to(device)
+
+ train_acc, _ = compute_accuracy(net, train_dataloader, device=device)
+
+    print(">> Pre-training accuracy: {}".format(train_acc))
+
+ optimizer = optim.SGD(
+ filter(lambda p: p.requires_grad, net.parameters()),
+ lr=lr,
+ momentum=0.9,
+ weight_decay=1e-5,
+ )
+
+    criterion = nn.CrossEntropyLoss().to(device)
+
+ cnt = 0
+    global_weight_collector = list(global_net.to(device).parameters())
+
+ for _epoch in range(epochs):
+ epoch_loss_collector = []
+ for _, (x, target) in enumerate(train_dataloader):
+            x, target = x.to(device), target.to(device)
+
+ optimizer.zero_grad()
+ x.requires_grad = False
+ target.requires_grad = False
+ target = target.long()
+
+ _, _, out = net(x)
+ loss = criterion(out, target)
+
+ fed_prox_reg = 0.0
+ for param_index, param in enumerate(net.parameters()):
+ fed_prox_reg += (mu / 2) * torch.norm(
+ (param - global_weight_collector[param_index])
+ ) ** 2
+ loss += fed_prox_reg
+
+ loss.backward()
+ optimizer.step()
+
+ cnt += 1
+ epoch_loss_collector.append(loss.item())
+
+ train_acc, _ = compute_accuracy(net, train_dataloader, device=device)
+
+ print(">> Training accuracy: %f" % train_acc)
+ net.to("cpu")
+ print(" ** Training complete **")
+ return net
+
+
+def test(net, test_dataloader, device="cpu"):
+ """Test function."""
+ net.to(device)
+ test_acc, loss = compute_accuracy(net, test_dataloader, device=device)
+ print(">> Test accuracy: %f" % test_acc)
+ net.to("cpu")
+ return test_acc, loss
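The per-batch objective in `train_moon` combines cross-entropy with a model-contrastive term (the commented Lines 13 to 18 of Algorithm 1): a 2-way softmax over cosine similarities, with the global model's representation as the positive pair and the previous local model's as the negative pair. A stdlib-only sketch for a single representation, with illustrative vectors (the real code computes this batched via `torch.nn.CosineSimilarity` and `CrossEntropyLoss`):

```python
import math

def moon_loss_single(z, z_glob, z_prev, tau=0.5, mu=1.0):
    """Model-contrastive loss for one sample: cross-entropy over the two
    similarity logits, with the (z, z_glob) pair as the target class."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    pos = cos(z, z_glob) / tau  # logit of the positive pair
    neg = cos(z, z_prev) / tau  # logit of the negative pair
    # Cross-entropy with label 0, i.e. the positive pair.
    return mu * -math.log(math.exp(pos) / (math.exp(pos) + math.exp(neg)))

# Agreeing with the global model costs little; agreeing with the stale
# previous model costs a lot, which is what drives the representation update.
pull = moon_loss_single([1.0, 0.0], [1.0, 0.0], [0.0, 1.0])
push = moon_loss_single([1.0, 0.0], [0.0, 1.0], [1.0, 0.0])
```

Lower `tau` sharpens the softmax, and `mu` weights this term against the plain cross-entropy loss.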
diff --git a/baselines/moon/moon/server.py b/baselines/moon/moon/server.py
new file mode 100644
index 000000000000..0cf812b88666
--- /dev/null
+++ b/baselines/moon/moon/server.py
@@ -0,0 +1,40 @@
+"""Create global evaluation function.
+
+Optionally, also define a new Server class (please note this is not needed in most
+settings).
+"""
+
+from collections import OrderedDict
+from typing import Callable, Dict, Optional, Tuple
+
+import torch
+from flwr.common.typing import NDArrays, Scalar
+from omegaconf import DictConfig
+from torch.utils.data import DataLoader
+
+from moon.models import init_net, test
+
+
+def gen_evaluate_fn(
+ testloader: DataLoader,
+ device: torch.device,
+ cfg: DictConfig,
+) -> Callable[
+ [int, NDArrays, Dict[str, Scalar]], Optional[Tuple[float, Dict[str, Scalar]]]
+]:
+ """Generate the function for centralized evaluation."""
+
+ def evaluate(
+ server_round: int, parameters_ndarrays: NDArrays, config: Dict[str, Scalar]
+ ) -> Optional[Tuple[float, Dict[str, Scalar]]]:
+ # pylint: disable=unused-argument
+ net = init_net(cfg.dataset.name, cfg.model.name, cfg.model.output_dim)
+ params_dict = zip(net.state_dict().keys(), parameters_ndarrays)
+ state_dict = OrderedDict({k: torch.from_numpy(v) for k, v in params_dict})
+ net.load_state_dict(state_dict, strict=True)
+ net.to(device)
+
+ accuracy, loss = test(net, testloader, device=device)
+ return loss, {"accuracy": accuracy}
+
+ return evaluate
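`gen_evaluate_fn` is a closure factory: it captures the test loader, device, and config once, and returns the `evaluate` callable that the strategy invokes each round with the current global parameters. A minimal stand-in of that pattern, where a hypothetical one-weight linear scorer replaces `init_net` plus `load_state_dict` plus `test`:

```python
from typing import Callable, List, Tuple

def gen_eval_fn_sketch(testset: List[Tuple[List[float], int]]) -> Callable:
    """Capture the evaluation data; return a per-round evaluate callable
    that rebuilds a model from the received parameters and reports
    (loss, metrics), mirroring the shape Flower strategies expect."""
    def evaluate(server_round: int, parameters: List[List[float]]):
        weights = parameters[0]  # single parameter "tensor" in this stand-in
        correct = 0
        for features, label in testset:
            score = sum(w * f for w, f in zip(weights, features))
            correct += int((score > 0) == bool(label))
        accuracy = correct / len(testset)
        return 1.0 - accuracy, {"accuracy": accuracy}

    return evaluate

eval_fn = gen_eval_fn_sketch([([1.0], 1), ([-1.0], 0)])
loss, metrics = eval_fn(1, [[1.0]])
```

The real version additionally zips the parameter arrays with `net.state_dict().keys()` into an `OrderedDict` before `load_state_dict`, relying on both sequences being in the same order.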
diff --git a/baselines/moon/moon/strategy.py b/baselines/moon/moon/strategy.py
new file mode 100644
index 000000000000..17436c401c30
--- /dev/null
+++ b/baselines/moon/moon/strategy.py
@@ -0,0 +1,5 @@
+"""Optionally define a custom strategy.
+
+Needed only when the strategy is not yet implemented in Flower or because you want to
+extend or modify the functionality of an existing strategy.
+"""
diff --git a/baselines/moon/moon/utils.py b/baselines/moon/moon/utils.py
new file mode 100644
index 000000000000..4b99a480f77b
--- /dev/null
+++ b/baselines/moon/moon/utils.py
@@ -0,0 +1,127 @@
+"""Define any utility function.
+
+They are not directly relevant to the other (more FL specific) python modules. For
+example, you may define here things like: loading a model from a checkpoint, saving
+results, plotting.
+"""
+from pathlib import Path
+from typing import Optional
+
+import matplotlib.pyplot as plt
+import numpy as np
+import torch
+import torch.nn as nn
+from flwr.server.history import History
+
+
+def compute_accuracy(model, dataloader, device="cpu", multiloader=False):
+ """Compute accuracy."""
+ was_training = False
+ if model.training:
+ model.eval()
+ was_training = True
+
+ true_labels_list, pred_labels_list = np.array([]), np.array([])
+
+ correct, total = 0, 0
+    criterion = nn.CrossEntropyLoss().to(device)
+ loss_collector = []
+ if multiloader:
+ for loader in dataloader:
+ with torch.no_grad():
+ for _, (x, target) in enumerate(loader):
+                    if device != "cpu":
+                        x, target = x.to(device), target.to(dtype=torch.int64).to(device)
+ _, _, out = model(x)
+                    loss = criterion(out, target)
+ _, pred_label = torch.max(out.data, 1)
+ loss_collector.append(loss.item())
+ total += x.data.size()[0]
+ correct += (pred_label == target.data).sum().item()
+
+ if device == "cpu":
+ pred_labels_list = np.append(
+ pred_labels_list, pred_label.numpy()
+ )
+ true_labels_list = np.append(
+ true_labels_list, target.data.numpy()
+ )
+ else:
+ pred_labels_list = np.append(
+ pred_labels_list, pred_label.cpu().numpy()
+ )
+ true_labels_list = np.append(
+ true_labels_list, target.data.cpu().numpy()
+ )
+ avg_loss = sum(loss_collector) / len(loss_collector)
+ else:
+ with torch.no_grad():
+ for _, (x, target) in enumerate(dataloader):
+                if device != "cpu":
+                    x, target = x.to(device), target.to(dtype=torch.int64).to(device)
+ _, _, out = model(x)
+ loss = criterion(out, target)
+ _, pred_label = torch.max(out.data, 1)
+ loss_collector.append(loss.item())
+ total += x.data.size()[0]
+ correct += (pred_label == target.data).sum().item()
+
+ if device == "cpu":
+ pred_labels_list = np.append(pred_labels_list, pred_label.numpy())
+ true_labels_list = np.append(true_labels_list, target.data.numpy())
+ else:
+ pred_labels_list = np.append(
+ pred_labels_list, pred_label.cpu().numpy()
+ )
+ true_labels_list = np.append(
+ true_labels_list, target.data.cpu().numpy()
+ )
+ avg_loss = sum(loss_collector) / len(loss_collector)
+
+ if was_training:
+ model.train()
+
+ return correct / float(total), avg_loss
+
+
+def plot_metric_from_history(
+ hist: History,
+ save_plot_path: Path,
+ suffix: Optional[str] = "",
+) -> None:
+ """Plot data from Flower server History.
+
+ Parameters
+ ----------
+ hist : History
+ Object containing evaluation for all rounds.
+ save_plot_path : Path
+ Folder to save the plot to.
+ suffix: Optional[str]
+ Optional string to add at the end of the filename for the plot.
+ """
+ metric_type = "centralized"
+ metric_dict = (
+ hist.metrics_centralized
+ if metric_type == "centralized"
+ else hist.metrics_distributed
+ )
+ rounds, values = zip(*metric_dict["accuracy"])
+
+ # Plot the curve
+ plt.figure(figsize=(10, 6))
+    plt.plot(rounds, values, label="test accuracy")
+    plt.xlabel("#round")
+    plt.ylabel("Test accuracy")
+    plt.legend()
+
+    plt.savefig(Path(save_plot_path) / Path(f"{metric_type}_metrics{suffix}.png"))
+ plt.close()
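`plot_metric_from_history` relies on the shape of `History.metrics_centralized`: a dict mapping each metric name to a list of `(round, value)` pairs, which `zip(*...)` transposes into parallel sequences for plotting. With purely illustrative numbers:

```python
# History.metrics_centralized maps metric name -> list of (round, value)
# pairs; zip(*...) splits them into an x-axis and a y-axis sequence.
metrics_centralized = {"accuracy": [(0, 0.10), (1, 0.43), (2, 0.61)]}

rounds, values = zip(*metrics_centralized["accuracy"])
```

Note that `zip(*...)` raises on an empty list, so the plot helper implicitly assumes centralized evaluation ran for at least one round.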
diff --git a/baselines/moon/pyproject.toml b/baselines/moon/pyproject.toml
new file mode 100644
index 000000000000..e9f826abb2ea
--- /dev/null
+++ b/baselines/moon/pyproject.toml
@@ -0,0 +1,146 @@
+[build-system]
+requires = ["poetry-core>=1.4.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "moon" # <----- Ensure it matches the name of your baseline directory containing all the source code
+version = "1.0.0"
+description = "Model-Contrastive Federated Learning"
+license = "Apache-2.0"
+authors = ["The Flower Authors ", "Qinbin Li "]
+readme = "README.md"
+homepage = "https://flower.dev"
+repository = "https://github.com/adap/flower"
+documentation = "https://flower.dev"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "License :: OSI Approved :: Apache Software License",
+ "Operating System :: MacOS :: MacOS X",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+ "Topic :: Scientific/Engineering :: Mathematics",
+ "Topic :: Software Development",
+ "Topic :: Software Development :: Libraries",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Typing :: Typed",
+]
+
+[tool.poetry.dependencies]
+python = ">=3.10.0, <3.12.0" # don't change this
+flwr = { extras = ["simulation"], version = "1.5.0" }
+hydra-core = "1.3.2" # don't change this
+scikit-learn = "1.3.0"
+matplotlib = "3.8.0"
+torch = { url = "https://download.pytorch.org/whl/cu116/torch-1.12.0%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+torchvision = { url = "https://download.pytorch.org/whl/cu116/torchvision-0.13.0%2Bcu116-cp310-cp310-linux_x86_64.whl"}
+
+[tool.poetry.dev-dependencies]
+isort = "==5.11.5"
+black = "==23.1.0"
+docformatter = "==1.5.1"
+mypy = "==1.4.1"
+pylint = "==2.8.2"
+flake8 = "==3.9.2"
+pytest = "==6.2.4"
+pytest-watch = "==4.2.0"
+ruff = "==0.0.272"
+types-requests = "==2.27.7"
+
+[tool.isort]
+line_length = 88
+indent = " "
+multi_line_output = 3
+include_trailing_comma = true
+force_grid_wrap = 0
+use_parentheses = true
+
+[tool.black]
+line-length = 88
+target-version = ["py38", "py39", "py310", "py311"]
+
+[tool.pytest.ini_options]
+minversion = "6.2"
+addopts = "-qq"
+testpaths = [
+ "flwr_baselines",
+]
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = false
+plugins = "numpy.typing.mypy_plugin"
+
+[tool.pylint."MESSAGES CONTROL"]
+disable = "bad-continuation,duplicate-code,too-few-public-methods,useless-import-alias"
+good-names = "i,j,k,_,x,y,X,Y,K,N,X_train,X_test,fc,l1,l2,l3,h,lr,mu"
+max-args = 10
+max-attributes = 15
+max-locals = 36
+max-branches = 20
+max-statements = 55
+signature-mutators="hydra.main.main"
+
+[tool.pylint.typecheck]
+generated-members="numpy.*, torch.*, tensorflow.*"
+
+[[tool.mypy.overrides]]
+module = [
+ "importlib.metadata.*",
+ "importlib_metadata.*",
+]
+follow_imports = "skip"
+follow_imports_for_stubs = true
+disallow_untyped_calls = false
+
+[[tool.mypy.overrides]]
+module = "torch.*"
+follow_imports = "skip"
+follow_imports_for_stubs = true
+
+[tool.docformatter]
+wrap-summaries = 88
+wrap-descriptions = 88
+
+[tool.ruff]
+target-version = "py38"
+line-length = 88
+select = ["D", "E", "F", "W", "B", "ISC", "C4"]
+fixable = ["D", "E", "F", "W", "B", "ISC", "C4"]
+ignore = ["B024", "B027"]
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".pytype",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+ "proto",
+]
+
+[tool.ruff.pydocstyle]
+convention = "numpy"
diff --git a/dev/aws-ami-bootstrap-tf.sh b/dev/aws-ami-bootstrap-tf.sh
index bece7d21f1a0..8799a254cbcc 100755
--- a/dev/aws-ami-bootstrap-tf.sh
+++ b/dev/aws-ami-bootstrap-tf.sh
@@ -27,7 +27,7 @@ sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev li
sudo apt install -y python3.7 python3-pip
# Install project dependencies
-python3.7 -m pip install -U pip==23.1.2 setuptools==68.0.0
+python3.7 -m pip install -U pip==23.3.1 setuptools==68.2.2
python3.7 -m pip install -U numpy==1.18.1 grpcio==1.27.2 google==2.0.3 protobuf==3.12.1 \
boto3==1.12.36 boto3_type_annotations==0.3.1 paramiko==2.7.1 docker==4.2.0 matplotlib==3.2.1 \
tensorflow-cpu==2.6.2
diff --git a/dev/aws-ami-bootstrap-torch.sh b/dev/aws-ami-bootstrap-torch.sh
index 1c44cb09673d..835a3994c28a 100755
--- a/dev/aws-ami-bootstrap-torch.sh
+++ b/dev/aws-ami-bootstrap-torch.sh
@@ -27,7 +27,7 @@ sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev li
sudo apt install -y python3.7 python3-pip
# Install project dependencies
-python3.7 -m pip install -U pip==23.1.2 setuptools==68.0.0
+python3.7 -m pip install -U pip==23.3.1 setuptools==68.2.2
python3.7 -m pip install -U numpy==1.18.1 grpcio==1.27.2 google==2.0.3 protobuf==3.12.1 \
boto3==1.12.36 boto3_type_annotations==0.3.1 paramiko==2.7.1 docker==4.2.0 matplotlib==3.2.1 \
tqdm==4.48.2 torch==1.6.0 torchvision==0.7.0
diff --git a/dev/bootstrap.sh b/dev/bootstrap.sh
index 4451115cc151..1700c3774767 100755
--- a/dev/bootstrap.sh
+++ b/dev/bootstrap.sh
@@ -9,8 +9,8 @@ cd "$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"/../
./dev/rm-caches.sh
# Upgrade/install spcific versions of `pip`, `setuptools`, and `poetry`
-python -m pip install -U pip==23.1.2
-python -m pip install -U setuptools==68.0.0
+python -m pip install -U pip==23.3.1
+python -m pip install -U setuptools==68.2.2
python -m pip install -U poetry==1.5.1
# Use `poetry` to install project dependencies
diff --git a/doc/source/ref-changelog.md b/doc/source/ref-changelog.md
index d2978bac0213..06e77fefedf0 100644
--- a/doc/source/ref-changelog.md
+++ b/doc/source/ref-changelog.md
@@ -2,11 +2,17 @@
## Unreleased
+### What's new?
+
+- **Add experimental support for Python 3.12** ([#2565](https://github.com/adap/flower/pull/2565))
+
- **Support custom** `ClientManager` **in** `start_driver()` ([#2292](https://github.com/adap/flower/pull/2292))
- **Update REST API to support create and delete nodes** ([#2283](https://github.com/adap/flower/pull/2283))
-### What's new?
+- **Update the C++ SDK** ([#2537](https://github.com/adap/flower/pull/2537), [#2528](https://github.com/adap/flower/pull/2528), [#2523](https://github.com/adap/flower/pull/2523), [#2522](https://github.com/adap/flower/pull/2522))
+
+ Add gRPC request-response capability to the C++ SDK.
- **Fix the incorrect return types of Strategy** ([#2432](https://github.com/adap/flower/pull/2432/files))
@@ -28,13 +34,23 @@
- FedMeta [#2438](https://github.com/adap/flower/pull/2438)
+ - FjORD [#2431](https://github.com/adap/flower/pull/2431)
+
+ - MOON [#2421](https://github.com/adap/flower/pull/2421)
+
+ - DepthFL [#2295](https://github.com/adap/flower/pull/2295)
+
+ - FedPer [#2266](https://github.com/adap/flower/pull/2266)
+
+ - FedWav2vec [#2551](https://github.com/adap/flower/pull/2551)
+
- **Update Flower Examples** ([#2384](https://github.com/adap/flower/pull/2384),[#2425](https://github.com/adap/flower/pull/2425), [#2526](https://github.com/adap/flower/pull/2526))
- **General updates to baselines** ([#2301](https://github.com/adap/flower/pull/2301), [#2305](https://github.com/adap/flower/pull/2305), [#2307](https://github.com/adap/flower/pull/2307), [#2327](https://github.com/adap/flower/pull/2327), [#2435](https://github.com/adap/flower/pull/2435))
- **General updates to the simulation engine** ([#2331](https://github.com/adap/flower/pull/2331), [#2447](https://github.com/adap/flower/pull/2447), [#2448](https://github.com/adap/flower/pull/2448))
-- **General improvements** ([#2309](https://github.com/adap/flower/pull/2309), [#2310](https://github.com/adap/flower/pull/2310), [2313](https://github.com/adap/flower/pull/2313), [#2316](https://github.com/adap/flower/pull/2316), [2317](https://github.com/adap/flower/pull/2317),[#2349](https://github.com/adap/flower/pull/2349), [#2360](https://github.com/adap/flower/pull/2360), [#2402](https://github.com/adap/flower/pull/2402), [#2446](https://github.com/adap/flower/pull/2446))
+- **General improvements** ([#2309](https://github.com/adap/flower/pull/2309), [#2310](https://github.com/adap/flower/pull/2310), [#2313](https://github.com/adap/flower/pull/2313), [#2316](https://github.com/adap/flower/pull/2316), [#2317](https://github.com/adap/flower/pull/2317), [#2349](https://github.com/adap/flower/pull/2349), [#2360](https://github.com/adap/flower/pull/2360), [#2402](https://github.com/adap/flower/pull/2402), [#2446](https://github.com/adap/flower/pull/2446), [#2561](https://github.com/adap/flower/pull/2561))
Flower received many improvements under the hood, too many to list here.
diff --git a/examples/quickstart-cpp/CMakeLists.txt b/examples/quickstart-cpp/CMakeLists.txt
index 79af6a0ef17e..552132b079c9 100644
--- a/examples/quickstart-cpp/CMakeLists.txt
+++ b/examples/quickstart-cpp/CMakeLists.txt
@@ -3,7 +3,6 @@ project(SimpleCppFlowerClient VERSION 0.10
DESCRIPTION "Creates a Simple C++ Flower client that trains a linear model on synthetic data."
LANGUAGES CXX)
set(CMAKE_CXX_STANDARD 17)
-set(ABSL_PROPAGATE_CXX_STD ON)
######################
### Download gRPC
@@ -27,62 +26,27 @@ else()
set(_GRPC_CPP_PLUGIN_EXECUTABLE $)
endif()
-
######################
-### FLWR_GRPC_PROTO
-
-get_filename_component(FLWR_PROTO "../../src/proto/flwr/proto/transport.proto" ABSOLUTE)
-get_filename_component(FLWR_PROTO_PATH "${FLWR_PROTO}" PATH)
-
-set(FLWR_PROTO_SRCS "${CMAKE_CURRENT_BINARY_DIR}/transport.pb.cc")
-set(FLWR_PROTO_HDRS "${CMAKE_CURRENT_BINARY_DIR}/transport.pb.h")
-set(FLWR_GRPC_SRCS "${CMAKE_CURRENT_BINARY_DIR}/transport.grpc.pb.cc")
-set(FLAR_GRPC_HDRS "${CMAKE_CURRENT_BINARY_DIR}/transport.grpc.pb.h")
+### FLWR_LIB
-# External building command to generate gRPC source files.
-add_custom_command(
- OUTPUT "${FLWR_PROTO_SRCS}" "${FLWR_PROTO_HDRS}" "${FLWR_GRPC_SRCS}" "${FLWR_GRPC_HDRS}"
- COMMAND ${_PROTOBUF_PROTOC}
- ARGS --grpc_out "${CMAKE_CURRENT_BINARY_DIR}"
- --cpp_out "${CMAKE_CURRENT_BINARY_DIR}"
- -I "${FLWR_PROTO_PATH}"
- --plugin=protoc-gen-grpc="${_GRPC_CPP_PLUGIN_EXECUTABLE}"
- "${FLWR_PROTO}"
- DEPENDS "${FLWR_PROTO}"
-)
+set(FLWR_SDK_PATH "../../src/cc/flwr")
-add_library(flwr_grpc_proto
- ${FLWR_GRPC_SRCS}
- ${FLWR_GRPC_HDRS}
- ${FLWR_PROTO_SRCS}
- ${FLWR_PROTO_HDRS}
-)
+file(GLOB FLWR_SRCS "${FLWR_SDK_PATH}/src/*.cc")
+file(GLOB FLWR_PROTO_SRCS "${FLWR_SDK_PATH}/include/flwr/proto/*.cc")
+set(FLWR_INCLUDE_DIR "${FLWR_SDK_PATH}/include")
-target_include_directories(flwr_grpc_proto PUBLIC ${CMAKE_CURRENT_BINARY_DIR})
+add_library(flwr ${FLWR_SRCS} ${FLWR_PROTO_SRCS})
-target_link_libraries(flwr_grpc_proto
+target_link_libraries(flwr
${_REFLECTION}
${_GRPC_GRPCPP}
${_PROTOBUF_LIBPROTOBUF}
)
-######################
-### FLWR_LIB
-
-file(GLOB FLWR_SRCS "../../src/cc/flwr/src/*.cc")
-set(FLWR_INCLUDE_DIR "../../src/cc/flwr/include")
-
-add_library(flwr ${FLWR_SRCS})
-
target_include_directories(flwr PUBLIC
- ${CMAKE_CURRENT_BINARY_DIR}
${FLWR_INCLUDE_DIR}
)
-target_link_libraries(flwr
- flwr_grpc_proto
-)
-
######################
### FLWR_CLIENT
file(GLOB FLWR_CLIENT_SRCS src/*.cc)
diff --git a/examples/quickstart-cpp/driver.py b/examples/quickstart-cpp/driver.py
new file mode 100644
index 000000000000..037623ee77cf
--- /dev/null
+++ b/examples/quickstart-cpp/driver.py
@@ -0,0 +1,10 @@
+import flwr as fl
+from fedavg_cpp import FedAvgCpp
+
+# Start Flower server for three rounds of federated learning
+if __name__ == "__main__":
+ fl.driver.start_driver(
+ server_address="0.0.0.0:9091",
+ config=fl.server.ServerConfig(num_rounds=3),
+ strategy=FedAvgCpp(),
+ )
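The driver script above hands a `FedAvgCpp` strategy to `fl.driver.start_driver`. What a FedAvg-style strategy ultimately computes is a weighted average of the clients' parameters, weighted by each client's number of local examples. A minimal standalone sketch of that aggregation step (pure Python, no `flwr` import; `fedavg` is a hypothetical helper, not Flower's API):

```python
# Sketch of FedAvg-style aggregation: average each parameter across clients,
# weighted by how many local examples each client trained on.

def fedavg(results):
    """results: list of (parameters, num_examples) tuples."""
    total = sum(n for _, n in results)
    num_params = len(results[0][0])
    return [
        sum(params[i] * n for params, n in results) / total
        for i in range(num_params)
    ]

# Two clients reporting weights for the same 2-parameter model
aggregated = fedavg([([1.0, 2.0], 100), ([3.0, 4.0], 300)])
print(aggregated)  # [2.5, 3.5]
```

The second client contributes three times the examples, so the result sits closer to its weights.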
diff --git a/examples/quickstart-cpp/include/simple_client.h b/examples/quickstart-cpp/include/simple_client.h
index ce598365f29c..894ecb267387 100644
--- a/examples/quickstart-cpp/include/simple_client.h
+++ b/examples/quickstart-cpp/include/simple_client.h
@@ -1,6 +1,6 @@
/***********************************************************************************************************
*
- * @file libtorch_client.h
+ * @file simple_client.h
*
* @brief Define an example flower client, train and test method
*
diff --git a/examples/quickstart-cpp/src/main.cc b/examples/quickstart-cpp/src/main.cc
index fb3c533a3841..f294f9d69473 100644
--- a/examples/quickstart-cpp/src/main.cc
+++ b/examples/quickstart-cpp/src/main.cc
@@ -2,44 +2,58 @@
#include "start.h"
int main(int argc, char **argv) {
- if (argc != 3) {
- std::cout << "Client takes three arguments as follows: " << std::endl;
- std::cout << "./client CLIENT_ID SERVER_URL" << std::endl;
- std::cout << "Example: ./flwr_client 0 '127.0.0.1:8080'" << std::endl;
- return 0;
- }
-
- // Parsing arguments
- const std::string CLIENT_ID = argv[1];
- const std::string SERVER_URL = argv[2];
-
- // Populate local datasets
- std::vector<double> ms{3.5, 9.3}; // b + m_0*x0 + m_1*x1
- double b = 1.7;
- std::cout <<"Training set:" << std::endl;
- SyntheticDataset local_training_data = SyntheticDataset(ms, b, 1000);
- std::cout << std::endl;
-
- std::cout <<"Validation set:" << std::endl;
- SyntheticDataset local_validation_data = SyntheticDataset(ms, b, 100);
- std::cout << std::endl;
-
- std::cout <<"Test set:" << std::endl;
- SyntheticDataset local_test_data = SyntheticDataset(ms, b, 500);
- std::cout << std::endl;
-
- // Define a model
- LineFitModel model = LineFitModel(500, 0.01, ms.size());
-
- // Initialize TorchClient
- SimpleFlwrClient client(CLIENT_ID, model, local_training_data, local_validation_data, local_test_data);
-
- // Define a server address
- std::string server_add = SERVER_URL;
-
- // Start client
+ if (argc != 3 && argc != 4) {
+ std::cout << "Client takes two mandatory arguments and one optional as "
+ "follows: "
+ << std::endl;
+ std::cout << "./client CLIENT_ID SERVER_URL [GRPC_MODE]" << std::endl;
+ std::cout
+ << "GRPC_MODE is optional and can be either 'bidi' (default) or 'rere'."
+ << std::endl;
+ std::cout << "Example: ./flwr_client 0 '127.0.0.1:8080' bidi" << std::endl;
+ std::cout << "This is the same as: ./flwr_client 0 '127.0.0.1:8080'"
+ << std::endl;
+ return 0;
+ }
+
+ // Parsing arguments
+ const std::string CLIENT_ID = argv[1];
+ const std::string SERVER_URL = argv[2];
+
+ // Populate local datasets
+ std::vector<double> ms{3.5, 9.3}; // b + m_0*x0 + m_1*x1
+ double b = 1.7;
+ std::cout << "Training set:" << std::endl;
+ SyntheticDataset local_training_data = SyntheticDataset(ms, b, 1000);
+ std::cout << std::endl;
+
+ std::cout << "Validation set:" << std::endl;
+ SyntheticDataset local_validation_data = SyntheticDataset(ms, b, 100);
+ std::cout << std::endl;
+
+ std::cout << "Test set:" << std::endl;
+ SyntheticDataset local_test_data = SyntheticDataset(ms, b, 500);
+ std::cout << std::endl;
+
+ // Define a model
+ LineFitModel model = LineFitModel(500, 0.01, ms.size());
+
+ // Initialize TorchClient
+ SimpleFlwrClient client(CLIENT_ID, model, local_training_data,
+ local_validation_data, local_test_data);
+
+ // Define a server address
+ std::string server_add = SERVER_URL;
+
+ if (argc == 4 && std::string(argv[3]) == "rere") {
+ std::cout << "Starting rere client" << std::endl;
+ // Start rere client
+ start::start_rere_client(server_add, &client);
+ } else {
+ std::cout << "Starting bidi client" << std::endl;
+ // Start bidi client
start::start_client(server_add, &client);
+ }
- return 0;
+ return 0;
}
-
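The client above trains a `LineFitModel` on `SyntheticDataset` samples drawn from `y = b + m_0*x0 + m_1*x1` with `m = {3.5, 9.3}` and `b = 1.7`. The underlying idea, fitting a line to noiseless synthetic data and recovering the generating coefficients, can be sketched in a few lines of standalone Python (one feature instead of two, closed-form least squares; `make_dataset` and `fit_line` are illustrative names, not part of the example):

```python
import random

# Generate synthetic points on y = b + m*x, then recover m and b via
# ordinary least squares -- the same round trip the C++ example performs.

def make_dataset(m, b, n, noise=0.0, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    ys = [b + m * x + rng.gauss(0, noise) for x in xs]
    return xs, ys

def fit_line(xs, ys):
    """Closed-form OLS for y = b + m*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    m = cov / var
    return m, mean_y - m * mean_x

xs, ys = make_dataset(m=3.5, b=1.7, n=1000)
m_hat, b_hat = fit_line(xs, ys)
print(m_hat, b_hat)  # ≈ 3.5 and 1.7 with zero noise
```

With `noise > 0` the recovered coefficients only approximate the true ones, which is what makes the federated training rounds in the example meaningful.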
diff --git a/pyproject.toml b/pyproject.toml
index b948c8d8b64d..261eacbf0c94 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -34,6 +34,7 @@ classifiers = [
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: Scientific/Engineering",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
@@ -81,14 +82,14 @@ rest = ["requests", "starlette", "uvicorn"]
types-dataclasses = "==0.6.6"
types-protobuf = "==3.19.18"
types-requests = "==2.31.0.2"
-types-setuptools = "==68.0.0.3"
+types-setuptools = "==68.2.0.0"
clang-format = "==16.0.6"
isort = "==5.11.5"
black = { version = "==23.3.0", extras = ["jupyter"] }
docformatter = "==1.7.5"
mypy = "==1.5.1"
pylint = "==2.13.9"
-flake8 = "==3.9.2"
+flake8 = "==5.0.4"
pytest = "==7.4.0"
pytest-cov = "==3.0.0"
pytest-watch = "==4.2.0"
diff --git a/src/cc/flwr/CMakeLists.txt b/src/cc/flwr/CMakeLists.txt
index 8ab7dc4c2964..c242f52b237b 100644
--- a/src/cc/flwr/CMakeLists.txt
+++ b/src/cc/flwr/CMakeLists.txt
@@ -4,7 +4,6 @@ project(flwr VERSION 1.0
LANGUAGES CXX)
set(CMAKE_CXX_STANDARD 17)
-set(ABSL_PROPAGATE_CXX_STD ON)
# Assume gRPC and other dependencies are necessary
include(FetchContent)
@@ -26,34 +25,56 @@ else()
set(_GRPC_CPP_PLUGIN_EXECUTABLE $<TARGET_FILE:grpc_cpp_plugin>)
endif()
-# FLWR_GRPC_PROTO
-
-get_filename_component(FLWR_PROTO "../../proto/flwr/proto/transport.proto" ABSOLUTE)
-get_filename_component(FLWR_PROTO_PATH "${FLWR_PROTO}" PATH)
-
-set(FLWR_PROTO_SRCS "${CMAKE_CURRENT_BINARY_DIR}/transport.pb.cc")
-set(FLWR_PROTO_HDRS "${CMAKE_CURRENT_BINARY_DIR}/transport.pb.h")
-set(FLWR_GRPC_SRCS "${CMAKE_CURRENT_BINARY_DIR}/transport.grpc.pb.cc")
-set(FLAR_GRPC_HDRS "${CMAKE_CURRENT_BINARY_DIR}/transport.grpc.pb.h")
-
-# External building command to generate gRPC source files.
-add_custom_command(
- OUTPUT "${FLWR_PROTO_SRCS}" "${FLWR_PROTO_HDRS}" "${FLWR_GRPC_SRCS}" "${FLWR_GRPC_HDRS}"
- COMMAND ${_PROTOBUF_PROTOC}
- ARGS --grpc_out "${CMAKE_CURRENT_BINARY_DIR}"
- --cpp_out "${CMAKE_CURRENT_BINARY_DIR}"
- -I "${FLWR_PROTO_PATH}"
- --plugin=protoc-gen-grpc="${_GRPC_CPP_PLUGIN_EXECUTABLE}"
- "${FLWR_PROTO}"
- DEPENDS "${FLWR_PROTO}"
-)
-
-add_library(flwr_grpc_proto STATIC
- ${FLWR_GRPC_SRCS}
- ${FLWR_GRPC_HDRS}
- ${FLWR_PROTO_SRCS}
- ${FLWR_PROTO_HDRS}
-)
+# Paths and output directories
+get_filename_component(FLWR_PROTO_BASE_PATH "../../proto/" ABSOLUTE)
+set(INCLUDE_FLWR_PROTO_DIR "${CMAKE_CURRENT_SOURCE_DIR}/include/flwr/proto")
+
+# Generate source files and copy them
+macro(GENERATE_AND_COPY PROTO_NAME)
+ set(OUT_PROTO_SRCS "${CMAKE_CURRENT_BINARY_DIR}/flwr/proto/${PROTO_NAME}.pb.cc")
+ set(OUT_PROTO_HDRS "${CMAKE_CURRENT_BINARY_DIR}/flwr/proto/${PROTO_NAME}.pb.h")
+ set(OUT_GRPC_SRCS "${CMAKE_CURRENT_BINARY_DIR}/flwr/proto/${PROTO_NAME}.grpc.pb.cc")
+ set(OUT_GRPC_HDRS "${CMAKE_CURRENT_BINARY_DIR}/flwr/proto/${PROTO_NAME}.grpc.pb.h")
+ set(SOURCE_PROTO "${FLWR_PROTO_BASE_PATH}/flwr/proto/${PROTO_NAME}.proto")
+
+ add_custom_command(
+ OUTPUT "${OUT_PROTO_SRCS}" "${OUT_PROTO_HDRS}" "${OUT_GRPC_SRCS}" "${OUT_GRPC_HDRS}"
+ COMMAND ${_PROTOBUF_PROTOC}
+ ARGS --grpc_out "${CMAKE_CURRENT_BINARY_DIR}"
+ --cpp_out "${CMAKE_CURRENT_BINARY_DIR}"
+ -I "${FLWR_PROTO_BASE_PATH}"
+ --plugin=protoc-gen-grpc="${_GRPC_CPP_PLUGIN_EXECUTABLE}"
+ "${SOURCE_PROTO}"
+ )
+
+ add_custom_command(
+ OUTPUT "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.pb.cc"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.pb.h"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.grpc.pb.cc"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.grpc.pb.h"
+ COMMAND ${CMAKE_COMMAND} -E copy_if_different
+ "${OUT_PROTO_SRCS}" "${OUT_PROTO_HDRS}" "${OUT_GRPC_SRCS}" "${OUT_GRPC_HDRS}"
+ "${INCLUDE_FLWR_PROTO_DIR}"
+ DEPENDS "${OUT_PROTO_SRCS}" "${OUT_PROTO_HDRS}" "${OUT_GRPC_SRCS}" "${OUT_GRPC_HDRS}"
+ )
+
+ set(ALL_PROTO_FILES
+ ${ALL_PROTO_FILES}
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.pb.cc"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.pb.h"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.grpc.pb.cc"
+ "${INCLUDE_FLWR_PROTO_DIR}/${PROTO_NAME}.grpc.pb.h"
+ CACHE INTERNAL "All generated proto files"
+ )
+endmacro()
+
+# Using the above macro for all proto files
+GENERATE_AND_COPY(transport)
+GENERATE_AND_COPY(node)
+GENERATE_AND_COPY(task)
+GENERATE_AND_COPY(fleet)
+
+add_library(flwr_grpc_proto STATIC ${ALL_PROTO_FILES})
target_include_directories(flwr_grpc_proto
PUBLIC
@@ -67,56 +88,14 @@ target_link_libraries(flwr_grpc_proto
${_GRPC_GRPCPP}
${_PROTOBUF_LIBPROTOBUF}
)
+
# For the internal use of flwr
file(GLOB FLWR_SRCS "src/*.cc")
-
add_library(flwr ${FLWR_SRCS})
target_include_directories(flwr PUBLIC
$
- $
)
# Link gRPC and other dependencies
target_link_libraries(flwr PRIVATE flwr_grpc_proto)
-
-# Merge the two libraries
-add_library(flwr_merged STATIC $ $)
-
-target_include_directories(flwr_merged PUBLIC
- $
- $
-)
-
-# This will create a 'flwrConfig.cmake' for users to find
-install(TARGETS flwr_merged EXPORT flwrTargets
- LIBRARY DESTINATION lib
- ARCHIVE DESTINATION lib
- RUNTIME DESTINATION bin
- PUBLIC_HEADER DESTINATION include
-)
-install(
- FILES
- ${CMAKE_CURRENT_BINARY_DIR}/transport.grpc.pb.h
- ${CMAKE_CURRENT_BINARY_DIR}/transport.pb.h
- DESTINATION include
-)
-install(DIRECTORY include/ DESTINATION include)
-
-install(EXPORT flwrTargets
- FILE flwrConfig.cmake
- NAMESPACE flwr::
- DESTINATION lib/cmake/flwr
-)
-
-# Optional: Generate and install package version file
-include(CMakePackageConfigHelpers)
-write_basic_package_version_file(
- "${CMAKE_CURRENT_BINARY_DIR}/flwrConfigVersion.cmake"
- VERSION ${PROJECT_VERSION}
- COMPATIBILITY AnyNewerVersion
-)
-install(FILES "${CMAKE_CURRENT_BINARY_DIR}/flwrConfigVersion.cmake"
- DESTINATION lib/cmake/flwr
-)
-
diff --git a/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.cc b/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.cc
new file mode 100644
index 000000000000..c71a6a3e1c45
--- /dev/null
+++ b/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.cc
@@ -0,0 +1,214 @@
+// Generated by the gRPC C++ plugin.
+// If you make any local change, they will be lost.
+// source: flwr/proto/fleet.proto
+
+#include "flwr/proto/fleet.pb.h"
+#include "flwr/proto/fleet.grpc.pb.h"
+
+#include <functional>
+#include <grpcpp/support/async_stream.h>
+#include <grpcpp/support/async_unary_call.h>
+#include <grpcpp/impl/channel_interface.h>
+#include <grpcpp/impl/client_unary_call.h>
+#include <grpcpp/support/client_callback.h>
+#include <grpcpp/support/message_allocator.h>
+#include <grpcpp/support/method_handler.h>
+#include <grpcpp/impl/rpc_service_method.h>
+#include <grpcpp/support/server_callback.h>
+#include <grpcpp/impl/server_callback_handlers.h>
+#include <grpcpp/server_context.h>
+#include <grpcpp/impl/service_type.h>
+#include <grpcpp/support/sync_stream.h>
+namespace flwr {
+namespace proto {
+
+static const char* Fleet_method_names[] = {
+ "/flwr.proto.Fleet/CreateNode",
+ "/flwr.proto.Fleet/DeleteNode",
+ "/flwr.proto.Fleet/PullTaskIns",
+ "/flwr.proto.Fleet/PushTaskRes",
+};
+
+std::unique_ptr< Fleet::Stub> Fleet::NewStub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options) {
+ (void)options;
+ std::unique_ptr< Fleet::Stub> stub(new Fleet::Stub(channel, options));
+ return stub;
+}
+
+Fleet::Stub::Stub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options)
+ : channel_(channel), rpcmethod_CreateNode_(Fleet_method_names[0], options.suffix_for_stats(),::grpc::internal::RpcMethod::NORMAL_RPC, channel)
+ , rpcmethod_DeleteNode_(Fleet_method_names[1], options.suffix_for_stats(),::grpc::internal::RpcMethod::NORMAL_RPC, channel)
+ , rpcmethod_PullTaskIns_(Fleet_method_names[2], options.suffix_for_stats(),::grpc::internal::RpcMethod::NORMAL_RPC, channel)
+ , rpcmethod_PushTaskRes_(Fleet_method_names[3], options.suffix_for_stats(),::grpc::internal::RpcMethod::NORMAL_RPC, channel)
+ {}
+
+::grpc::Status Fleet::Stub::CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::flwr::proto::CreateNodeResponse* response) {
+ return ::grpc::internal::BlockingUnaryCall< ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), rpcmethod_CreateNode_, context, request, response);
+}
+
+void Fleet::Stub::async::CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, std::function<void(::grpc::Status)> f) {
+ ::grpc::internal::CallbackUnaryCall< ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_CreateNode_, context, request, response, std::move(f));
+}
+
+void Fleet::Stub::async::CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) {
+ ::grpc::internal::ClientCallbackUnaryFactory::Create< ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_CreateNode_, context, request, response, reactor);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>* Fleet::Stub::PrepareAsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return ::grpc::internal::ClientAsyncResponseReaderHelper::Create< ::flwr::proto::CreateNodeResponse, ::flwr::proto::CreateNodeRequest, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), cq, rpcmethod_CreateNode_, context, request);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>* Fleet::Stub::AsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ auto* result =
+ this->PrepareAsyncCreateNodeRaw(context, request, cq);
+ result->StartCall();
+ return result;
+}
+
+::grpc::Status Fleet::Stub::DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::flwr::proto::DeleteNodeResponse* response) {
+ return ::grpc::internal::BlockingUnaryCall< ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), rpcmethod_DeleteNode_, context, request, response);
+}
+
+void Fleet::Stub::async::DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, std::function<void(::grpc::Status)> f) {
+ ::grpc::internal::CallbackUnaryCall< ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_DeleteNode_, context, request, response, std::move(f));
+}
+
+void Fleet::Stub::async::DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) {
+ ::grpc::internal::ClientCallbackUnaryFactory::Create< ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_DeleteNode_, context, request, response, reactor);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>* Fleet::Stub::PrepareAsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return ::grpc::internal::ClientAsyncResponseReaderHelper::Create< ::flwr::proto::DeleteNodeResponse, ::flwr::proto::DeleteNodeRequest, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), cq, rpcmethod_DeleteNode_, context, request);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>* Fleet::Stub::AsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ auto* result =
+ this->PrepareAsyncDeleteNodeRaw(context, request, cq);
+ result->StartCall();
+ return result;
+}
+
+::grpc::Status Fleet::Stub::PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::flwr::proto::PullTaskInsResponse* response) {
+ return ::grpc::internal::BlockingUnaryCall< ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), rpcmethod_PullTaskIns_, context, request, response);
+}
+
+void Fleet::Stub::async::PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, std::function<void(::grpc::Status)> f) {
+ ::grpc::internal::CallbackUnaryCall< ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_PullTaskIns_, context, request, response, std::move(f));
+}
+
+void Fleet::Stub::async::PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, ::grpc::ClientUnaryReactor* reactor) {
+ ::grpc::internal::ClientCallbackUnaryFactory::Create< ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_PullTaskIns_, context, request, response, reactor);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>* Fleet::Stub::PrepareAsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ return ::grpc::internal::ClientAsyncResponseReaderHelper::Create< ::flwr::proto::PullTaskInsResponse, ::flwr::proto::PullTaskInsRequest, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), cq, rpcmethod_PullTaskIns_, context, request);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>* Fleet::Stub::AsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ auto* result =
+ this->PrepareAsyncPullTaskInsRaw(context, request, cq);
+ result->StartCall();
+ return result;
+}
+
+::grpc::Status Fleet::Stub::PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::flwr::proto::PushTaskResResponse* response) {
+ return ::grpc::internal::BlockingUnaryCall< ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), rpcmethod_PushTaskRes_, context, request, response);
+}
+
+void Fleet::Stub::async::PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, std::function<void(::grpc::Status)> f) {
+ ::grpc::internal::CallbackUnaryCall< ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_PushTaskRes_, context, request, response, std::move(f));
+}
+
+void Fleet::Stub::async::PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, ::grpc::ClientUnaryReactor* reactor) {
+ ::grpc::internal::ClientCallbackUnaryFactory::Create< ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(stub_->channel_.get(), stub_->rpcmethod_PushTaskRes_, context, request, response, reactor);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>* Fleet::Stub::PrepareAsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ return ::grpc::internal::ClientAsyncResponseReaderHelper::Create< ::flwr::proto::PushTaskResResponse, ::flwr::proto::PushTaskResRequest, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(channel_.get(), cq, rpcmethod_PushTaskRes_, context, request);
+}
+
+::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>* Fleet::Stub::AsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ auto* result =
+ this->PrepareAsyncPushTaskResRaw(context, request, cq);
+ result->StartCall();
+ return result;
+}
+
+Fleet::Service::Service() {
+ AddMethod(new ::grpc::internal::RpcServiceMethod(
+ Fleet_method_names[0],
+ ::grpc::internal::RpcMethod::NORMAL_RPC,
+ new ::grpc::internal::RpcMethodHandler< Fleet::Service, ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(
+ [](Fleet::Service* service,
+ ::grpc::ServerContext* ctx,
+ const ::flwr::proto::CreateNodeRequest* req,
+ ::flwr::proto::CreateNodeResponse* resp) {
+ return service->CreateNode(ctx, req, resp);
+ }, this)));
+ AddMethod(new ::grpc::internal::RpcServiceMethod(
+ Fleet_method_names[1],
+ ::grpc::internal::RpcMethod::NORMAL_RPC,
+ new ::grpc::internal::RpcMethodHandler< Fleet::Service, ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(
+ [](Fleet::Service* service,
+ ::grpc::ServerContext* ctx,
+ const ::flwr::proto::DeleteNodeRequest* req,
+ ::flwr::proto::DeleteNodeResponse* resp) {
+ return service->DeleteNode(ctx, req, resp);
+ }, this)));
+ AddMethod(new ::grpc::internal::RpcServiceMethod(
+ Fleet_method_names[2],
+ ::grpc::internal::RpcMethod::NORMAL_RPC,
+ new ::grpc::internal::RpcMethodHandler< Fleet::Service, ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(
+ [](Fleet::Service* service,
+ ::grpc::ServerContext* ctx,
+ const ::flwr::proto::PullTaskInsRequest* req,
+ ::flwr::proto::PullTaskInsResponse* resp) {
+ return service->PullTaskIns(ctx, req, resp);
+ }, this)));
+ AddMethod(new ::grpc::internal::RpcServiceMethod(
+ Fleet_method_names[3],
+ ::grpc::internal::RpcMethod::NORMAL_RPC,
+ new ::grpc::internal::RpcMethodHandler< Fleet::Service, ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse, ::grpc::protobuf::MessageLite, ::grpc::protobuf::MessageLite>(
+ [](Fleet::Service* service,
+ ::grpc::ServerContext* ctx,
+ const ::flwr::proto::PushTaskResRequest* req,
+ ::flwr::proto::PushTaskResResponse* resp) {
+ return service->PushTaskRes(ctx, req, resp);
+ }, this)));
+}
+
+Fleet::Service::~Service() {
+}
+
+::grpc::Status Fleet::Service::CreateNode(::grpc::ServerContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response) {
+ (void) context;
+ (void) request;
+ (void) response;
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+}
+
+::grpc::Status Fleet::Service::DeleteNode(::grpc::ServerContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response) {
+ (void) context;
+ (void) request;
+ (void) response;
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+}
+
+::grpc::Status Fleet::Service::PullTaskIns(::grpc::ServerContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response) {
+ (void) context;
+ (void) request;
+ (void) response;
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+}
+
+::grpc::Status Fleet::Service::PushTaskRes(::grpc::ServerContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response) {
+ (void) context;
+ (void) request;
+ (void) response;
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+}
+
+
+} // namespace flwr
+} // namespace proto
+
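The generated `Fleet::Service` base above registers all four RPCs but returns `UNIMPLEMENTED` from each; a concrete server subclasses it and overrides only the methods it supports. The same default-unimplemented pattern in a minimal standalone sketch (class and status names are illustrative, not Flower's or gRPC's Python API):

```python
# Sketch of the generated-service pattern: the base class answers every
# Fleet RPC with UNIMPLEMENTED; a real server overrides what it supports.

UNIMPLEMENTED = "UNIMPLEMENTED"
OK = "OK"

class FleetServiceBase:
    def create_node(self, request):
        return UNIMPLEMENTED
    def delete_node(self, request):
        return UNIMPLEMENTED
    def pull_task_ins(self, request):
        return UNIMPLEMENTED
    def push_task_res(self, request):
        return UNIMPLEMENTED

class MyFleetService(FleetServiceBase):
    def create_node(self, request):
        return OK  # the only RPC this server implements

svc = MyFleetService()
print(svc.create_node(None), svc.delete_node(None))  # OK UNIMPLEMENTED
```

Callers hitting an un-overridden method get a well-defined failure instead of a crash, which is why generated stubs ship these defaults.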
diff --git a/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.h b/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.h
new file mode 100644
index 000000000000..03d445142c37
--- /dev/null
+++ b/src/cc/flwr/include/flwr/proto/fleet.grpc.pb.h
@@ -0,0 +1,747 @@
+// Generated by the gRPC C++ plugin.
+// If you make any local change, they will be lost.
+// source: flwr/proto/fleet.proto
+// Original file comments:
+// Copyright 2022 Flower Labs GmbH. All Rights Reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+// ==============================================================================
+//
+#ifndef GRPC_flwr_2fproto_2ffleet_2eproto__INCLUDED
+#define GRPC_flwr_2fproto_2ffleet_2eproto__INCLUDED
+
+#include "flwr/proto/fleet.pb.h"
+
+#include <functional>
+#include <grpcpp/generic/async_generic_service.h>
+#include <grpcpp/support/async_stream.h>
+#include <grpcpp/support/async_unary_call.h>
+#include <grpcpp/support/client_callback.h>
+#include <grpcpp/client_context.h>
+#include <grpcpp/completion_queue.h>
+#include <grpcpp/support/message_allocator.h>
+#include <grpcpp/support/method_handler.h>
+#include <grpcpp/impl/proto_utils.h>
+#include <grpcpp/impl/rpc_method.h>
+#include <grpcpp/support/server_callback.h>
+#include <grpcpp/impl/server_callback_handlers.h>
+#include <grpcpp/server_context.h>
+#include <grpcpp/impl/service_type.h>
+#include <grpcpp/impl/status.h>
+#include <grpcpp/support/stub_options.h>
+#include <grpcpp/support/sync_stream.h>
+
+namespace flwr {
+namespace proto {
+
+class Fleet final {
+ public:
+ static constexpr char const* service_full_name() {
+ return "flwr.proto.Fleet";
+ }
+ class StubInterface {
+ public:
+ virtual ~StubInterface() {}
+ virtual ::grpc::Status CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::flwr::proto::CreateNodeResponse* response) = 0;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>> AsyncCreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>>(AsyncCreateNodeRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>> PrepareAsyncCreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>>(PrepareAsyncCreateNodeRaw(context, request, cq));
+ }
+ virtual ::grpc::Status DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::flwr::proto::DeleteNodeResponse* response) = 0;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>> AsyncDeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>>(AsyncDeleteNodeRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>> PrepareAsyncDeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>>(PrepareAsyncDeleteNodeRaw(context, request, cq));
+ }
+ // Retrieve one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/pull-task-ins
+ virtual ::grpc::Status PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::flwr::proto::PullTaskInsResponse* response) = 0;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>> AsyncPullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>>(AsyncPullTaskInsRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>> PrepareAsyncPullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>>(PrepareAsyncPullTaskInsRaw(context, request, cq));
+ }
+ // Complete one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/push-task-res
+ virtual ::grpc::Status PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::flwr::proto::PushTaskResResponse* response) = 0;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>> AsyncPushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>>(AsyncPushTaskResRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>> PrepareAsyncPushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>>(PrepareAsyncPushTaskResRaw(context, request, cq));
+ }
+ class async_interface {
+ public:
+ virtual ~async_interface() {}
+ virtual void CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, std::function<void(::grpc::Status)>) = 0;
+ virtual void CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) = 0;
+ virtual void DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, std::function<void(::grpc::Status)>) = 0;
+ virtual void DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) = 0;
+ // Retrieve one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/pull-task-ins
+ virtual void PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, std::function<void(::grpc::Status)>) = 0;
+ virtual void PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, ::grpc::ClientUnaryReactor* reactor) = 0;
+ // Complete one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/push-task-res
+ virtual void PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, std::function<void(::grpc::Status)>) = 0;
+ virtual void PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, ::grpc::ClientUnaryReactor* reactor) = 0;
+ };
+ typedef class async_interface experimental_async_interface;
+ virtual class async_interface* async() { return nullptr; }
+ class async_interface* experimental_async() { return async(); }
+ private:
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>* AsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::CreateNodeResponse>* PrepareAsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>* AsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::DeleteNodeResponse>* PrepareAsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>* AsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PullTaskInsResponse>* PrepareAsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>* AsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ virtual ::grpc::ClientAsyncResponseReaderInterface< ::flwr::proto::PushTaskResResponse>* PrepareAsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) = 0;
+ };
+ class Stub final : public StubInterface {
+ public:
+ Stub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options = ::grpc::StubOptions());
+ ::grpc::Status CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::flwr::proto::CreateNodeResponse* response) override;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>> AsyncCreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>>(AsyncCreateNodeRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>> PrepareAsyncCreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>>(PrepareAsyncCreateNodeRaw(context, request, cq));
+ }
+ ::grpc::Status DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::flwr::proto::DeleteNodeResponse* response) override;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>> AsyncDeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>>(AsyncDeleteNodeRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>> PrepareAsyncDeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>>(PrepareAsyncDeleteNodeRaw(context, request, cq));
+ }
+ ::grpc::Status PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::flwr::proto::PullTaskInsResponse* response) override;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>> AsyncPullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>>(AsyncPullTaskInsRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>> PrepareAsyncPullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>>(PrepareAsyncPullTaskInsRaw(context, request, cq));
+ }
+ ::grpc::Status PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::flwr::proto::PushTaskResResponse* response) override;
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>> AsyncPushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>>(AsyncPushTaskResRaw(context, request, cq));
+ }
+ std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>> PrepareAsyncPushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) {
+ return std::unique_ptr< ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>>(PrepareAsyncPushTaskResRaw(context, request, cq));
+ }
+ class async final :
+ public StubInterface::async_interface {
+ public:
+ void CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, std::function<void(::grpc::Status)>) override;
+ void CreateNode(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) override;
+ void DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, std::function<void(::grpc::Status)>) override;
+ void DeleteNode(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response, ::grpc::ClientUnaryReactor* reactor) override;
+ void PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, std::function<void(::grpc::Status)>) override;
+ void PullTaskIns(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response, ::grpc::ClientUnaryReactor* reactor) override;
+ void PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, std::function<void(::grpc::Status)>) override;
+ void PushTaskRes(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response, ::grpc::ClientUnaryReactor* reactor) override;
+ private:
+ friend class Stub;
+ explicit async(Stub* stub): stub_(stub) { }
+ Stub* stub() { return stub_; }
+ Stub* stub_;
+ };
+ class async* async() override { return &async_stub_; }
+
+ private:
+ std::shared_ptr< ::grpc::ChannelInterface> channel_;
+ class async async_stub_{this};
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>* AsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::CreateNodeResponse>* PrepareAsyncCreateNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::CreateNodeRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>* AsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::DeleteNodeResponse>* PrepareAsyncDeleteNodeRaw(::grpc::ClientContext* context, const ::flwr::proto::DeleteNodeRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>* AsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::PullTaskInsResponse>* PrepareAsyncPullTaskInsRaw(::grpc::ClientContext* context, const ::flwr::proto::PullTaskInsRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>* AsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) override;
+ ::grpc::ClientAsyncResponseReader< ::flwr::proto::PushTaskResResponse>* PrepareAsyncPushTaskResRaw(::grpc::ClientContext* context, const ::flwr::proto::PushTaskResRequest& request, ::grpc::CompletionQueue* cq) override;
+ const ::grpc::internal::RpcMethod rpcmethod_CreateNode_;
+ const ::grpc::internal::RpcMethod rpcmethod_DeleteNode_;
+ const ::grpc::internal::RpcMethod rpcmethod_PullTaskIns_;
+ const ::grpc::internal::RpcMethod rpcmethod_PushTaskRes_;
+ };
+ static std::unique_ptr<Stub> NewStub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options = ::grpc::StubOptions());
+
+ class Service : public ::grpc::Service {
+ public:
+ Service();
+ virtual ~Service();
+ virtual ::grpc::Status CreateNode(::grpc::ServerContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response);
+ virtual ::grpc::Status DeleteNode(::grpc::ServerContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response);
+ // Retrieve one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/pull-task-ins
+ virtual ::grpc::Status PullTaskIns(::grpc::ServerContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response);
+ // Complete one or more tasks, if possible
+ //
+ // HTTP API path: /api/v1/fleet/push-task-res
+ virtual ::grpc::Status PushTaskRes(::grpc::ServerContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response);
+ };
+ template <class BaseClass>
+ class WithAsyncMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithAsyncMethod_CreateNode() {
+ ::grpc::Service::MarkMethodAsync(0);
+ }
+ ~WithAsyncMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestCreateNode(::grpc::ServerContext* context, ::flwr::proto::CreateNodeRequest* request, ::grpc::ServerAsyncResponseWriter< ::flwr::proto::CreateNodeResponse>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(0, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithAsyncMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithAsyncMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodAsync(1);
+ }
+ ~WithAsyncMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestDeleteNode(::grpc::ServerContext* context, ::flwr::proto::DeleteNodeRequest* request, ::grpc::ServerAsyncResponseWriter< ::flwr::proto::DeleteNodeResponse>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(1, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithAsyncMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithAsyncMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodAsync(2);
+ }
+ ~WithAsyncMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestPullTaskIns(::grpc::ServerContext* context, ::flwr::proto::PullTaskInsRequest* request, ::grpc::ServerAsyncResponseWriter< ::flwr::proto::PullTaskInsResponse>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(2, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithAsyncMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithAsyncMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodAsync(3);
+ }
+ ~WithAsyncMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestPushTaskRes(::grpc::ServerContext* context, ::flwr::proto::PushTaskResRequest* request, ::grpc::ServerAsyncResponseWriter< ::flwr::proto::PushTaskResResponse>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(3, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ typedef WithAsyncMethod_CreateNode<WithAsyncMethod_DeleteNode<WithAsyncMethod_PullTaskIns<WithAsyncMethod_PushTaskRes<Service > > > > AsyncService;
+ template <class BaseClass>
+ class WithCallbackMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithCallbackMethod_CreateNode() {
+ ::grpc::Service::MarkMethodCallback(0,
+ new ::grpc::internal::CallbackUnaryHandler< ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::flwr::proto::CreateNodeRequest* request, ::flwr::proto::CreateNodeResponse* response) { return this->CreateNode(context, request, response); }));}
+ void SetMessageAllocatorFor_CreateNode(
+ ::grpc::MessageAllocator< ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse>* allocator) {
+ ::grpc::internal::MethodHandler* const handler = ::grpc::Service::GetHandler(0);
+ static_cast<::grpc::internal::CallbackUnaryHandler< ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse>*>(handler)
+ ->SetMessageAllocator(allocator);
+ }
+ ~WithCallbackMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* CreateNode(
+ ::grpc::CallbackServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithCallbackMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithCallbackMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodCallback(1,
+ new ::grpc::internal::CallbackUnaryHandler< ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::flwr::proto::DeleteNodeRequest* request, ::flwr::proto::DeleteNodeResponse* response) { return this->DeleteNode(context, request, response); }));}
+ void SetMessageAllocatorFor_DeleteNode(
+ ::grpc::MessageAllocator< ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse>* allocator) {
+ ::grpc::internal::MethodHandler* const handler = ::grpc::Service::GetHandler(1);
+ static_cast<::grpc::internal::CallbackUnaryHandler< ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse>*>(handler)
+ ->SetMessageAllocator(allocator);
+ }
+ ~WithCallbackMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* DeleteNode(
+ ::grpc::CallbackServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithCallbackMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithCallbackMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodCallback(2,
+ new ::grpc::internal::CallbackUnaryHandler< ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::flwr::proto::PullTaskInsRequest* request, ::flwr::proto::PullTaskInsResponse* response) { return this->PullTaskIns(context, request, response); }));}
+ void SetMessageAllocatorFor_PullTaskIns(
+ ::grpc::MessageAllocator< ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse>* allocator) {
+ ::grpc::internal::MethodHandler* const handler = ::grpc::Service::GetHandler(2);
+ static_cast<::grpc::internal::CallbackUnaryHandler< ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse>*>(handler)
+ ->SetMessageAllocator(allocator);
+ }
+ ~WithCallbackMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* PullTaskIns(
+ ::grpc::CallbackServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithCallbackMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithCallbackMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodCallback(3,
+ new ::grpc::internal::CallbackUnaryHandler< ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::flwr::proto::PushTaskResRequest* request, ::flwr::proto::PushTaskResResponse* response) { return this->PushTaskRes(context, request, response); }));}
+ void SetMessageAllocatorFor_PushTaskRes(
+ ::grpc::MessageAllocator< ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse>* allocator) {
+ ::grpc::internal::MethodHandler* const handler = ::grpc::Service::GetHandler(3);
+ static_cast<::grpc::internal::CallbackUnaryHandler< ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse>*>(handler)
+ ->SetMessageAllocator(allocator);
+ }
+ ~WithCallbackMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* PushTaskRes(
+ ::grpc::CallbackServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) { return nullptr; }
+ };
+ typedef WithCallbackMethod_CreateNode<WithCallbackMethod_DeleteNode<WithCallbackMethod_PullTaskIns<WithCallbackMethod_PushTaskRes<Service > > > > CallbackService;
+ typedef CallbackService ExperimentalCallbackService;
+ template <class BaseClass>
+ class WithGenericMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithGenericMethod_CreateNode() {
+ ::grpc::Service::MarkMethodGeneric(0);
+ }
+ ~WithGenericMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ };
+ template <class BaseClass>
+ class WithGenericMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithGenericMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodGeneric(1);
+ }
+ ~WithGenericMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ };
+ template <class BaseClass>
+ class WithGenericMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithGenericMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodGeneric(2);
+ }
+ ~WithGenericMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ };
+ template <class BaseClass>
+ class WithGenericMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithGenericMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodGeneric(3);
+ }
+ ~WithGenericMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ };
+ template <class BaseClass>
+ class WithRawMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawMethod_CreateNode() {
+ ::grpc::Service::MarkMethodRaw(0);
+ }
+ ~WithRawMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestCreateNode(::grpc::ServerContext* context, ::grpc::ByteBuffer* request, ::grpc::ServerAsyncResponseWriter< ::grpc::ByteBuffer>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(0, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithRawMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodRaw(1);
+ }
+ ~WithRawMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestDeleteNode(::grpc::ServerContext* context, ::grpc::ByteBuffer* request, ::grpc::ServerAsyncResponseWriter< ::grpc::ByteBuffer>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(1, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithRawMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodRaw(2);
+ }
+ ~WithRawMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestPullTaskIns(::grpc::ServerContext* context, ::grpc::ByteBuffer* request, ::grpc::ServerAsyncResponseWriter< ::grpc::ByteBuffer>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(2, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithRawMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodRaw(3);
+ }
+ ~WithRawMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ void RequestPushTaskRes(::grpc::ServerContext* context, ::grpc::ByteBuffer* request, ::grpc::ServerAsyncResponseWriter< ::grpc::ByteBuffer>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) {
+ ::grpc::Service::RequestAsyncUnary(3, context, request, response, new_call_cq, notification_cq, tag);
+ }
+ };
+ template <class BaseClass>
+ class WithRawCallbackMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawCallbackMethod_CreateNode() {
+ ::grpc::Service::MarkMethodRawCallback(0,
+ new ::grpc::internal::CallbackUnaryHandler< ::grpc::ByteBuffer, ::grpc::ByteBuffer>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::grpc::ByteBuffer* request, ::grpc::ByteBuffer* response) { return this->CreateNode(context, request, response); }));
+ }
+ ~WithRawCallbackMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* CreateNode(
+ ::grpc::CallbackServerContext* /*context*/, const ::grpc::ByteBuffer* /*request*/, ::grpc::ByteBuffer* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithRawCallbackMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawCallbackMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodRawCallback(1,
+ new ::grpc::internal::CallbackUnaryHandler< ::grpc::ByteBuffer, ::grpc::ByteBuffer>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::grpc::ByteBuffer* request, ::grpc::ByteBuffer* response) { return this->DeleteNode(context, request, response); }));
+ }
+ ~WithRawCallbackMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* DeleteNode(
+ ::grpc::CallbackServerContext* /*context*/, const ::grpc::ByteBuffer* /*request*/, ::grpc::ByteBuffer* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithRawCallbackMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawCallbackMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodRawCallback(2,
+ new ::grpc::internal::CallbackUnaryHandler< ::grpc::ByteBuffer, ::grpc::ByteBuffer>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::grpc::ByteBuffer* request, ::grpc::ByteBuffer* response) { return this->PullTaskIns(context, request, response); }));
+ }
+ ~WithRawCallbackMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* PullTaskIns(
+ ::grpc::CallbackServerContext* /*context*/, const ::grpc::ByteBuffer* /*request*/, ::grpc::ByteBuffer* /*response*/) { return nullptr; }
+ };
+ template <class BaseClass>
+ class WithRawCallbackMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithRawCallbackMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodRawCallback(3,
+ new ::grpc::internal::CallbackUnaryHandler< ::grpc::ByteBuffer, ::grpc::ByteBuffer>(
+ [this](
+ ::grpc::CallbackServerContext* context, const ::grpc::ByteBuffer* request, ::grpc::ByteBuffer* response) { return this->PushTaskRes(context, request, response); }));
+ }
+ ~WithRawCallbackMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable synchronous version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ virtual ::grpc::ServerUnaryReactor* PushTaskRes(
+ ::grpc::CallbackServerContext* /*context*/, const ::grpc::ByteBuffer* /*request*/, ::grpc::ByteBuffer* /*response*/) { return nullptr; }
+ };
+  template <class BaseClass>
+ class WithStreamedUnaryMethod_CreateNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithStreamedUnaryMethod_CreateNode() {
+ ::grpc::Service::MarkMethodStreamed(0,
+ new ::grpc::internal::StreamedUnaryHandler<
+ ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse>(
+ [this](::grpc::ServerContext* context,
+ ::grpc::ServerUnaryStreamer<
+ ::flwr::proto::CreateNodeRequest, ::flwr::proto::CreateNodeResponse>* streamer) {
+ return this->StreamedCreateNode(context,
+ streamer);
+ }));
+ }
+ ~WithStreamedUnaryMethod_CreateNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable regular version of this method
+ ::grpc::Status CreateNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::CreateNodeRequest* /*request*/, ::flwr::proto::CreateNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ // replace default version of method with streamed unary
+ virtual ::grpc::Status StreamedCreateNode(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< ::flwr::proto::CreateNodeRequest,::flwr::proto::CreateNodeResponse>* server_unary_streamer) = 0;
+ };
+  template <class BaseClass>
+ class WithStreamedUnaryMethod_DeleteNode : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithStreamedUnaryMethod_DeleteNode() {
+ ::grpc::Service::MarkMethodStreamed(1,
+ new ::grpc::internal::StreamedUnaryHandler<
+ ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse>(
+ [this](::grpc::ServerContext* context,
+ ::grpc::ServerUnaryStreamer<
+ ::flwr::proto::DeleteNodeRequest, ::flwr::proto::DeleteNodeResponse>* streamer) {
+ return this->StreamedDeleteNode(context,
+ streamer);
+ }));
+ }
+ ~WithStreamedUnaryMethod_DeleteNode() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable regular version of this method
+ ::grpc::Status DeleteNode(::grpc::ServerContext* /*context*/, const ::flwr::proto::DeleteNodeRequest* /*request*/, ::flwr::proto::DeleteNodeResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ // replace default version of method with streamed unary
+ virtual ::grpc::Status StreamedDeleteNode(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< ::flwr::proto::DeleteNodeRequest,::flwr::proto::DeleteNodeResponse>* server_unary_streamer) = 0;
+ };
+  template <class BaseClass>
+ class WithStreamedUnaryMethod_PullTaskIns : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithStreamedUnaryMethod_PullTaskIns() {
+ ::grpc::Service::MarkMethodStreamed(2,
+ new ::grpc::internal::StreamedUnaryHandler<
+ ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse>(
+ [this](::grpc::ServerContext* context,
+ ::grpc::ServerUnaryStreamer<
+ ::flwr::proto::PullTaskInsRequest, ::flwr::proto::PullTaskInsResponse>* streamer) {
+ return this->StreamedPullTaskIns(context,
+ streamer);
+ }));
+ }
+ ~WithStreamedUnaryMethod_PullTaskIns() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable regular version of this method
+ ::grpc::Status PullTaskIns(::grpc::ServerContext* /*context*/, const ::flwr::proto::PullTaskInsRequest* /*request*/, ::flwr::proto::PullTaskInsResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ // replace default version of method with streamed unary
+ virtual ::grpc::Status StreamedPullTaskIns(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< ::flwr::proto::PullTaskInsRequest,::flwr::proto::PullTaskInsResponse>* server_unary_streamer) = 0;
+ };
+  template <class BaseClass>
+ class WithStreamedUnaryMethod_PushTaskRes : public BaseClass {
+ private:
+ void BaseClassMustBeDerivedFromService(const Service* /*service*/) {}
+ public:
+ WithStreamedUnaryMethod_PushTaskRes() {
+ ::grpc::Service::MarkMethodStreamed(3,
+ new ::grpc::internal::StreamedUnaryHandler<
+ ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse>(
+ [this](::grpc::ServerContext* context,
+ ::grpc::ServerUnaryStreamer<
+ ::flwr::proto::PushTaskResRequest, ::flwr::proto::PushTaskResResponse>* streamer) {
+ return this->StreamedPushTaskRes(context,
+ streamer);
+ }));
+ }
+ ~WithStreamedUnaryMethod_PushTaskRes() override {
+ BaseClassMustBeDerivedFromService(this);
+ }
+ // disable regular version of this method
+ ::grpc::Status PushTaskRes(::grpc::ServerContext* /*context*/, const ::flwr::proto::PushTaskResRequest* /*request*/, ::flwr::proto::PushTaskResResponse* /*response*/) override {
+ abort();
+ return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, "");
+ }
+ // replace default version of method with streamed unary
+ virtual ::grpc::Status StreamedPushTaskRes(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< ::flwr::proto::PushTaskResRequest,::flwr::proto::PushTaskResResponse>* server_unary_streamer) = 0;
+ };
+  typedef WithStreamedUnaryMethod_CreateNode<WithStreamedUnaryMethod_DeleteNode<WithStreamedUnaryMethod_PullTaskIns<WithStreamedUnaryMethod_PushTaskRes<Service > > > > StreamedUnaryService;
+ typedef Service SplitStreamedService;
+  typedef WithStreamedUnaryMethod_CreateNode<WithStreamedUnaryMethod_DeleteNode<WithStreamedUnaryMethod_PullTaskIns<WithStreamedUnaryMethod_PushTaskRes<Service > > > > StreamedService;
+};
+
+} // namespace proto
+} // namespace flwr
+
+
+#endif // GRPC_flwr_2fproto_2ffleet_2eproto__INCLUDED
diff --git a/src/cc/flwr/include/flwr/proto/fleet.pb.cc b/src/cc/flwr/include/flwr/proto/fleet.pb.cc
new file mode 100644
index 000000000000..302331374db1
--- /dev/null
+++ b/src/cc/flwr/include/flwr/proto/fleet.pb.cc
@@ -0,0 +1,1932 @@
+// Generated by the protocol buffer compiler. DO NOT EDIT!
+// source: flwr/proto/fleet.proto
+
+#include "flwr/proto/fleet.pb.h"
+
+#include <algorithm>
+
+#include <google/protobuf/io/coded_stream.h>
+#include <google/protobuf/extension_set.h>
+#include <google/protobuf/wire_format_lite.h>
+#include <google/protobuf/descriptor.h>
+#include <google/protobuf/generated_message_reflection.h>
+#include <google/protobuf/reflection_ops.h>
+#include <google/protobuf/wire_format.h>
+// @@protoc_insertion_point(includes)
+#include <google/protobuf/port_def.inc>
+
+PROTOBUF_PRAGMA_INIT_SEG
+namespace flwr {
+namespace proto {
+constexpr CreateNodeRequest::CreateNodeRequest(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized){}
+struct CreateNodeRequestDefaultTypeInternal {
+ constexpr CreateNodeRequestDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~CreateNodeRequestDefaultTypeInternal() {}
+ union {
+ CreateNodeRequest _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT CreateNodeRequestDefaultTypeInternal _CreateNodeRequest_default_instance_;
+constexpr CreateNodeResponse::CreateNodeResponse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : node_(nullptr){}
+struct CreateNodeResponseDefaultTypeInternal {
+ constexpr CreateNodeResponseDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~CreateNodeResponseDefaultTypeInternal() {}
+ union {
+ CreateNodeResponse _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT CreateNodeResponseDefaultTypeInternal _CreateNodeResponse_default_instance_;
+constexpr DeleteNodeRequest::DeleteNodeRequest(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : node_(nullptr){}
+struct DeleteNodeRequestDefaultTypeInternal {
+ constexpr DeleteNodeRequestDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~DeleteNodeRequestDefaultTypeInternal() {}
+ union {
+ DeleteNodeRequest _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT DeleteNodeRequestDefaultTypeInternal _DeleteNodeRequest_default_instance_;
+constexpr DeleteNodeResponse::DeleteNodeResponse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized){}
+struct DeleteNodeResponseDefaultTypeInternal {
+ constexpr DeleteNodeResponseDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~DeleteNodeResponseDefaultTypeInternal() {}
+ union {
+ DeleteNodeResponse _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT DeleteNodeResponseDefaultTypeInternal _DeleteNodeResponse_default_instance_;
+constexpr PullTaskInsRequest::PullTaskInsRequest(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : task_ids_()
+ , node_(nullptr){}
+struct PullTaskInsRequestDefaultTypeInternal {
+ constexpr PullTaskInsRequestDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~PullTaskInsRequestDefaultTypeInternal() {}
+ union {
+ PullTaskInsRequest _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT PullTaskInsRequestDefaultTypeInternal _PullTaskInsRequest_default_instance_;
+constexpr PullTaskInsResponse::PullTaskInsResponse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : task_ins_list_()
+ , reconnect_(nullptr){}
+struct PullTaskInsResponseDefaultTypeInternal {
+ constexpr PullTaskInsResponseDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~PullTaskInsResponseDefaultTypeInternal() {}
+ union {
+ PullTaskInsResponse _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT PullTaskInsResponseDefaultTypeInternal _PullTaskInsResponse_default_instance_;
+constexpr PushTaskResRequest::PushTaskResRequest(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : task_res_list_(){}
+struct PushTaskResRequestDefaultTypeInternal {
+ constexpr PushTaskResRequestDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~PushTaskResRequestDefaultTypeInternal() {}
+ union {
+ PushTaskResRequest _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT PushTaskResRequestDefaultTypeInternal _PushTaskResRequest_default_instance_;
+constexpr PushTaskResResponse_ResultsEntry_DoNotUse::PushTaskResResponse_ResultsEntry_DoNotUse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized){}
+struct PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal {
+ constexpr PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal() {}
+ union {
+ PushTaskResResponse_ResultsEntry_DoNotUse _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal _PushTaskResResponse_ResultsEntry_DoNotUse_default_instance_;
+constexpr PushTaskResResponse::PushTaskResResponse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : results_(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{})
+ , reconnect_(nullptr){}
+struct PushTaskResResponseDefaultTypeInternal {
+ constexpr PushTaskResResponseDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~PushTaskResResponseDefaultTypeInternal() {}
+ union {
+ PushTaskResResponse _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT PushTaskResResponseDefaultTypeInternal _PushTaskResResponse_default_instance_;
+constexpr Reconnect::Reconnect(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized)
+ : reconnect_(uint64_t{0u}){}
+struct ReconnectDefaultTypeInternal {
+ constexpr ReconnectDefaultTypeInternal()
+ : _instance(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized{}) {}
+ ~ReconnectDefaultTypeInternal() {}
+ union {
+ Reconnect _instance;
+ };
+};
+PROTOBUF_ATTRIBUTE_NO_DESTROY PROTOBUF_CONSTINIT ReconnectDefaultTypeInternal _Reconnect_default_instance_;
+} // namespace proto
+} // namespace flwr
+static ::PROTOBUF_NAMESPACE_ID::Metadata file_level_metadata_flwr_2fproto_2ffleet_2eproto[10];
+static constexpr ::PROTOBUF_NAMESPACE_ID::EnumDescriptor const** file_level_enum_descriptors_flwr_2fproto_2ffleet_2eproto = nullptr;
+static constexpr ::PROTOBUF_NAMESPACE_ID::ServiceDescriptor const** file_level_service_descriptors_flwr_2fproto_2ffleet_2eproto = nullptr;
+
+const ::PROTOBUF_NAMESPACE_ID::uint32 TableStruct_flwr_2fproto_2ffleet_2eproto::offsets[] PROTOBUF_SECTION_VARIABLE(protodesc_cold) = {
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::CreateNodeRequest, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::CreateNodeResponse, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::CreateNodeResponse, node_),
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::DeleteNodeRequest, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::DeleteNodeRequest, node_),
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::DeleteNodeResponse, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsRequest, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsRequest, node_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsRequest, task_ids_),
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsResponse, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsResponse, reconnect_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PullTaskInsResponse, task_ins_list_),
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResRequest, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResRequest, task_res_list_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse, _has_bits_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse, key_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse, value_),
+ 0,
+ 1,
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse, reconnect_),
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::PushTaskResResponse, results_),
+ ~0u, // no _has_bits_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::Reconnect, _internal_metadata_),
+ ~0u, // no _extensions_
+ ~0u, // no _oneof_case_
+ ~0u, // no _weak_field_map_
+ ~0u, // no _inlined_string_donated_
+ PROTOBUF_FIELD_OFFSET(::flwr::proto::Reconnect, reconnect_),
+};
+static const ::PROTOBUF_NAMESPACE_ID::internal::MigrationSchema schemas[] PROTOBUF_SECTION_VARIABLE(protodesc_cold) = {
+ { 0, -1, -1, sizeof(::flwr::proto::CreateNodeRequest)},
+ { 6, -1, -1, sizeof(::flwr::proto::CreateNodeResponse)},
+ { 13, -1, -1, sizeof(::flwr::proto::DeleteNodeRequest)},
+ { 20, -1, -1, sizeof(::flwr::proto::DeleteNodeResponse)},
+ { 26, -1, -1, sizeof(::flwr::proto::PullTaskInsRequest)},
+ { 34, -1, -1, sizeof(::flwr::proto::PullTaskInsResponse)},
+ { 42, -1, -1, sizeof(::flwr::proto::PushTaskResRequest)},
+ { 49, 57, -1, sizeof(::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse)},
+ { 59, -1, -1, sizeof(::flwr::proto::PushTaskResResponse)},
+ { 67, -1, -1, sizeof(::flwr::proto::Reconnect)},
+};
+
+static ::PROTOBUF_NAMESPACE_ID::Message const * const file_default_instances[] = {
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_CreateNodeRequest_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_CreateNodeResponse_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_DeleteNodeRequest_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_DeleteNodeResponse_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_PullTaskInsRequest_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_PullTaskInsResponse_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_PushTaskResRequest_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_PushTaskResResponse_ResultsEntry_DoNotUse_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_PushTaskResResponse_default_instance_),
+  reinterpret_cast<const ::PROTOBUF_NAMESPACE_ID::Message*>(&::flwr::proto::_Reconnect_default_instance_),
+};
+
+const char descriptor_table_protodef_flwr_2fproto_2ffleet_2eproto[] PROTOBUF_SECTION_VARIABLE(protodesc_cold) =
+ "\n\026flwr/proto/fleet.proto\022\nflwr.proto\032\025fl"
+ "wr/proto/node.proto\032\025flwr/proto/task.pro"
+ "to\"\023\n\021CreateNodeRequest\"4\n\022CreateNodeRes"
+ "ponse\022\036\n\004node\030\001 \001(\0132\020.flwr.proto.Node\"3\n"
+ "\021DeleteNodeRequest\022\036\n\004node\030\001 \001(\0132\020.flwr."
+ "proto.Node\"\024\n\022DeleteNodeResponse\"F\n\022Pull"
+ "TaskInsRequest\022\036\n\004node\030\001 \001(\0132\020.flwr.prot"
+ "o.Node\022\020\n\010task_ids\030\002 \003(\t\"k\n\023PullTaskInsR"
+ "esponse\022(\n\treconnect\030\001 \001(\0132\025.flwr.proto."
+ "Reconnect\022*\n\rtask_ins_list\030\002 \003(\0132\023.flwr."
+ "proto.TaskIns\"@\n\022PushTaskResRequest\022*\n\rt"
+ "ask_res_list\030\001 \003(\0132\023.flwr.proto.TaskRes\""
+ "\256\001\n\023PushTaskResResponse\022(\n\treconnect\030\001 \001"
+ "(\0132\025.flwr.proto.Reconnect\022=\n\007results\030\002 \003"
+ "(\0132,.flwr.proto.PushTaskResResponse.Resu"
+ "ltsEntry\032.\n\014ResultsEntry\022\013\n\003key\030\001 \001(\t\022\r\n"
+ "\005value\030\002 \001(\r:\0028\001\"\036\n\tReconnect\022\021\n\treconne"
+ "ct\030\001 \001(\0042\311\002\n\005Fleet\022M\n\nCreateNode\022\035.flwr."
+ "proto.CreateNodeRequest\032\036.flwr.proto.Cre"
+ "ateNodeResponse\"\000\022M\n\nDeleteNode\022\035.flwr.p"
+ "roto.DeleteNodeRequest\032\036.flwr.proto.Dele"
+ "teNodeResponse\"\000\022P\n\013PullTaskIns\022\036.flwr.p"
+ "roto.PullTaskInsRequest\032\037.flwr.proto.Pul"
+ "lTaskInsResponse\"\000\022P\n\013PushTaskRes\022\036.flwr"
+ ".proto.PushTaskResRequest\032\037.flwr.proto.P"
+ "ushTaskResResponse\"\000b\006proto3"
+ ;
+static const ::PROTOBUF_NAMESPACE_ID::internal::DescriptorTable*const descriptor_table_flwr_2fproto_2ffleet_2eproto_deps[2] = {
+ &::descriptor_table_flwr_2fproto_2fnode_2eproto,
+ &::descriptor_table_flwr_2fproto_2ftask_2eproto,
+};
+static ::PROTOBUF_NAMESPACE_ID::internal::once_flag descriptor_table_flwr_2fproto_2ffleet_2eproto_once;
+const ::PROTOBUF_NAMESPACE_ID::internal::DescriptorTable descriptor_table_flwr_2fproto_2ffleet_2eproto = {
+ false, false, 1028, descriptor_table_protodef_flwr_2fproto_2ffleet_2eproto, "flwr/proto/fleet.proto",
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_once, descriptor_table_flwr_2fproto_2ffleet_2eproto_deps, 2, 10,
+ schemas, file_default_instances, TableStruct_flwr_2fproto_2ffleet_2eproto::offsets,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto, file_level_enum_descriptors_flwr_2fproto_2ffleet_2eproto, file_level_service_descriptors_flwr_2fproto_2ffleet_2eproto,
+};
+PROTOBUF_ATTRIBUTE_WEAK const ::PROTOBUF_NAMESPACE_ID::internal::DescriptorTable* descriptor_table_flwr_2fproto_2ffleet_2eproto_getter() {
+ return &descriptor_table_flwr_2fproto_2ffleet_2eproto;
+}
+
+// Force running AddDescriptors() at dynamic initialization time.
+PROTOBUF_ATTRIBUTE_INIT_PRIORITY static ::PROTOBUF_NAMESPACE_ID::internal::AddDescriptorsRunner dynamic_init_dummy_flwr_2fproto_2ffleet_2eproto(&descriptor_table_flwr_2fproto_2ffleet_2eproto);
+namespace flwr {
+namespace proto {
+
+// ===================================================================
+
+class CreateNodeRequest::_Internal {
+ public:
+};
+
+CreateNodeRequest::CreateNodeRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase(arena, is_message_owned) {
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.CreateNodeRequest)
+}
+CreateNodeRequest::CreateNodeRequest(const CreateNodeRequest& from)
+ : ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.CreateNodeRequest)
+}
+
+
+
+
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData CreateNodeRequest::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyImpl,
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeImpl,
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*CreateNodeRequest::GetClassData() const { return &_class_data_; }
+
+
+
+
+
+
+
+::PROTOBUF_NAMESPACE_ID::Metadata CreateNodeRequest::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[0]);
+}
+
+// ===================================================================
+
+class CreateNodeResponse::_Internal {
+ public:
+ static const ::flwr::proto::Node& node(const CreateNodeResponse* msg);
+};
+
+const ::flwr::proto::Node&
+CreateNodeResponse::_Internal::node(const CreateNodeResponse* msg) {
+ return *msg->node_;
+}
+void CreateNodeResponse::clear_node() {
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+}
+CreateNodeResponse::CreateNodeResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.CreateNodeResponse)
+}
+CreateNodeResponse::CreateNodeResponse(const CreateNodeResponse& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ if (from._internal_has_node()) {
+ node_ = new ::flwr::proto::Node(*from.node_);
+ } else {
+ node_ = nullptr;
+ }
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.CreateNodeResponse)
+}
+
+void CreateNodeResponse::SharedCtor() {
+node_ = nullptr;
+}
+
+CreateNodeResponse::~CreateNodeResponse() {
+ // @@protoc_insertion_point(destructor:flwr.proto.CreateNodeResponse)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void CreateNodeResponse::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+ if (this != internal_default_instance()) delete node_;
+}
+
+void CreateNodeResponse::ArenaDtor(void* object) {
+ CreateNodeResponse* _this = reinterpret_cast< CreateNodeResponse* >(object);
+ (void)_this;
+}
+void CreateNodeResponse::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void CreateNodeResponse::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void CreateNodeResponse::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.CreateNodeResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* CreateNodeResponse::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // .flwr.proto.Node node = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr = ctx->ParseMessage(_internal_mutable_node(), ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* CreateNodeResponse::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.CreateNodeResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(
+ 1, _Internal::node(this), target, stream);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.CreateNodeResponse)
+ return target;
+}
+
+size_t CreateNodeResponse::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.CreateNodeResponse)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ total_size += 1 +
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(
+ *node_);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData CreateNodeResponse::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ CreateNodeResponse::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*CreateNodeResponse::GetClassData() const { return &_class_data_; }
+
+void CreateNodeResponse::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+  static_cast<CreateNodeResponse *>(to)->MergeFrom(
+      static_cast<const CreateNodeResponse &>(from));
+}
+
+
+void CreateNodeResponse::MergeFrom(const CreateNodeResponse& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.CreateNodeResponse)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ if (from._internal_has_node()) {
+ _internal_mutable_node()->::flwr::proto::Node::MergeFrom(from._internal_node());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void CreateNodeResponse::CopyFrom(const CreateNodeResponse& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.CreateNodeResponse)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool CreateNodeResponse::IsInitialized() const {
+ return true;
+}
+
+void CreateNodeResponse::InternalSwap(CreateNodeResponse* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ swap(node_, other->node_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata CreateNodeResponse::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[1]);
+}
+
+// ===================================================================
+
+class DeleteNodeRequest::_Internal {
+ public:
+ static const ::flwr::proto::Node& node(const DeleteNodeRequest* msg);
+};
+
+const ::flwr::proto::Node&
+DeleteNodeRequest::_Internal::node(const DeleteNodeRequest* msg) {
+ return *msg->node_;
+}
+void DeleteNodeRequest::clear_node() {
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+}
+DeleteNodeRequest::DeleteNodeRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.DeleteNodeRequest)
+}
+DeleteNodeRequest::DeleteNodeRequest(const DeleteNodeRequest& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ if (from._internal_has_node()) {
+ node_ = new ::flwr::proto::Node(*from.node_);
+ } else {
+ node_ = nullptr;
+ }
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.DeleteNodeRequest)
+}
+
+void DeleteNodeRequest::SharedCtor() {
+node_ = nullptr;
+}
+
+DeleteNodeRequest::~DeleteNodeRequest() {
+ // @@protoc_insertion_point(destructor:flwr.proto.DeleteNodeRequest)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void DeleteNodeRequest::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+ if (this != internal_default_instance()) delete node_;
+}
+
+void DeleteNodeRequest::ArenaDtor(void* object) {
+ DeleteNodeRequest* _this = reinterpret_cast< DeleteNodeRequest* >(object);
+ (void)_this;
+}
+void DeleteNodeRequest::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void DeleteNodeRequest::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void DeleteNodeRequest::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.DeleteNodeRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* DeleteNodeRequest::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // .flwr.proto.Node node = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr = ctx->ParseMessage(_internal_mutable_node(), ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* DeleteNodeRequest::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.DeleteNodeRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(
+ 1, _Internal::node(this), target, stream);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.DeleteNodeRequest)
+ return target;
+}
+
+size_t DeleteNodeRequest::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.DeleteNodeRequest)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ total_size += 1 +
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(
+ *node_);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData DeleteNodeRequest::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ DeleteNodeRequest::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*DeleteNodeRequest::GetClassData() const { return &_class_data_; }
+
+void DeleteNodeRequest::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+  static_cast<DeleteNodeRequest *>(to)->MergeFrom(
+      static_cast<const DeleteNodeRequest &>(from));
+}
+
+
+void DeleteNodeRequest::MergeFrom(const DeleteNodeRequest& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.DeleteNodeRequest)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ if (from._internal_has_node()) {
+ _internal_mutable_node()->::flwr::proto::Node::MergeFrom(from._internal_node());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void DeleteNodeRequest::CopyFrom(const DeleteNodeRequest& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.DeleteNodeRequest)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool DeleteNodeRequest::IsInitialized() const {
+ return true;
+}
+
+void DeleteNodeRequest::InternalSwap(DeleteNodeRequest* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ swap(node_, other->node_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata DeleteNodeRequest::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[2]);
+}
+
+// ===================================================================
+
+class DeleteNodeResponse::_Internal {
+ public:
+};
+
+DeleteNodeResponse::DeleteNodeResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase(arena, is_message_owned) {
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.DeleteNodeResponse)
+}
+DeleteNodeResponse::DeleteNodeResponse(const DeleteNodeResponse& from)
+ : ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.DeleteNodeResponse)
+}
+
+
+
+
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData DeleteNodeResponse::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyImpl,
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeImpl,
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*DeleteNodeResponse::GetClassData() const { return &_class_data_; }
+
+
+
+
+
+
+
+::PROTOBUF_NAMESPACE_ID::Metadata DeleteNodeResponse::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[3]);
+}
+
+// ===================================================================
+
+class PullTaskInsRequest::_Internal {
+ public:
+ static const ::flwr::proto::Node& node(const PullTaskInsRequest* msg);
+};
+
+const ::flwr::proto::Node&
+PullTaskInsRequest::_Internal::node(const PullTaskInsRequest* msg) {
+ return *msg->node_;
+}
+void PullTaskInsRequest::clear_node() {
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+}
+PullTaskInsRequest::PullTaskInsRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned),
+ task_ids_(arena) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.PullTaskInsRequest)
+}
+PullTaskInsRequest::PullTaskInsRequest(const PullTaskInsRequest& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message(),
+ task_ids_(from.task_ids_) {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ if (from._internal_has_node()) {
+ node_ = new ::flwr::proto::Node(*from.node_);
+ } else {
+ node_ = nullptr;
+ }
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.PullTaskInsRequest)
+}
+
+void PullTaskInsRequest::SharedCtor() {
+node_ = nullptr;
+}
+
+PullTaskInsRequest::~PullTaskInsRequest() {
+ // @@protoc_insertion_point(destructor:flwr.proto.PullTaskInsRequest)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void PullTaskInsRequest::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+ if (this != internal_default_instance()) delete node_;
+}
+
+void PullTaskInsRequest::ArenaDtor(void* object) {
+ PullTaskInsRequest* _this = reinterpret_cast< PullTaskInsRequest* >(object);
+ (void)_this;
+}
+void PullTaskInsRequest::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void PullTaskInsRequest::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void PullTaskInsRequest::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.PullTaskInsRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ task_ids_.Clear();
+ if (GetArenaForAllocation() == nullptr && node_ != nullptr) {
+ delete node_;
+ }
+ node_ = nullptr;
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* PullTaskInsRequest::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // .flwr.proto.Node node = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr = ctx->ParseMessage(_internal_mutable_node(), ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ // repeated string task_ids = 2;
+ case 2:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 18)) {
+ ptr -= 1;
+ do {
+ ptr += 1;
+ auto str = _internal_add_task_ids();
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::InlineGreedyStringParser(str, ptr, ctx);
+ CHK_(::PROTOBUF_NAMESPACE_ID::internal::VerifyUTF8(str, "flwr.proto.PullTaskInsRequest.task_ids"));
+ CHK_(ptr);
+ if (!ctx->DataAvailable(ptr)) break;
+ } while (::PROTOBUF_NAMESPACE_ID::internal::ExpectTag<18>(ptr));
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* PullTaskInsRequest::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.PullTaskInsRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(
+ 1, _Internal::node(this), target, stream);
+ }
+
+ // repeated string task_ids = 2;
+ for (int i = 0, n = this->_internal_task_ids_size(); i < n; i++) {
+ const auto& s = this->_internal_task_ids(i);
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::VerifyUtf8String(
+    s.data(), static_cast<int>(s.length()),
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::SERIALIZE,
+ "flwr.proto.PullTaskInsRequest.task_ids");
+ target = stream->WriteString(2, s, target);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.PullTaskInsRequest)
+ return target;
+}
+
+size_t PullTaskInsRequest::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.PullTaskInsRequest)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // repeated string task_ids = 2;
+ total_size += 1 *
+ ::PROTOBUF_NAMESPACE_ID::internal::FromIntSize(task_ids_.size());
+ for (int i = 0, n = task_ids_.size(); i < n; i++) {
+ total_size += ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::StringSize(
+ task_ids_.Get(i));
+ }
+
+ // .flwr.proto.Node node = 1;
+ if (this->_internal_has_node()) {
+ total_size += 1 +
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(
+ *node_);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData PullTaskInsRequest::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ PullTaskInsRequest::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*PullTaskInsRequest::GetClassData() const { return &_class_data_; }
+
+void PullTaskInsRequest::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+  static_cast<PullTaskInsRequest *>(to)->MergeFrom(
+      static_cast<const PullTaskInsRequest &>(from));
+}
+
+
+void PullTaskInsRequest::MergeFrom(const PullTaskInsRequest& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.PullTaskInsRequest)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ task_ids_.MergeFrom(from.task_ids_);
+ if (from._internal_has_node()) {
+ _internal_mutable_node()->::flwr::proto::Node::MergeFrom(from._internal_node());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void PullTaskInsRequest::CopyFrom(const PullTaskInsRequest& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.PullTaskInsRequest)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool PullTaskInsRequest::IsInitialized() const {
+ return true;
+}
+
+void PullTaskInsRequest::InternalSwap(PullTaskInsRequest* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ task_ids_.InternalSwap(&other->task_ids_);
+ swap(node_, other->node_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata PullTaskInsRequest::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[4]);
+}
+
+// ===================================================================
+
+class PullTaskInsResponse::_Internal {
+ public:
+ static const ::flwr::proto::Reconnect& reconnect(const PullTaskInsResponse* msg);
+};
+
+const ::flwr::proto::Reconnect&
+PullTaskInsResponse::_Internal::reconnect(const PullTaskInsResponse* msg) {
+ return *msg->reconnect_;
+}
+void PullTaskInsResponse::clear_task_ins_list() {
+ task_ins_list_.Clear();
+}
+PullTaskInsResponse::PullTaskInsResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned),
+ task_ins_list_(arena) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.PullTaskInsResponse)
+}
+PullTaskInsResponse::PullTaskInsResponse(const PullTaskInsResponse& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message(),
+ task_ins_list_(from.task_ins_list_) {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ if (from._internal_has_reconnect()) {
+ reconnect_ = new ::flwr::proto::Reconnect(*from.reconnect_);
+ } else {
+ reconnect_ = nullptr;
+ }
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.PullTaskInsResponse)
+}
+
+void PullTaskInsResponse::SharedCtor() {
+reconnect_ = nullptr;
+}
+
+PullTaskInsResponse::~PullTaskInsResponse() {
+ // @@protoc_insertion_point(destructor:flwr.proto.PullTaskInsResponse)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void PullTaskInsResponse::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+ if (this != internal_default_instance()) delete reconnect_;
+}
+
+void PullTaskInsResponse::ArenaDtor(void* object) {
+ PullTaskInsResponse* _this = reinterpret_cast< PullTaskInsResponse* >(object);
+ (void)_this;
+}
+void PullTaskInsResponse::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void PullTaskInsResponse::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void PullTaskInsResponse::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.PullTaskInsResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ task_ins_list_.Clear();
+ if (GetArenaForAllocation() == nullptr && reconnect_ != nullptr) {
+ delete reconnect_;
+ }
+ reconnect_ = nullptr;
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* PullTaskInsResponse::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // .flwr.proto.Reconnect reconnect = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr = ctx->ParseMessage(_internal_mutable_reconnect(), ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ // repeated .flwr.proto.TaskIns task_ins_list = 2;
+ case 2:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 18)) {
+ ptr -= 1;
+ do {
+ ptr += 1;
+ ptr = ctx->ParseMessage(_internal_add_task_ins_list(), ptr);
+ CHK_(ptr);
+ if (!ctx->DataAvailable(ptr)) break;
+ } while (::PROTOBUF_NAMESPACE_ID::internal::ExpectTag<18>(ptr));
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* PullTaskInsResponse::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.PullTaskInsResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ if (this->_internal_has_reconnect()) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(
+ 1, _Internal::reconnect(this), target, stream);
+ }
+
+ // repeated .flwr.proto.TaskIns task_ins_list = 2;
+  for (unsigned int i = 0,
+      n = static_cast<unsigned int>(this->_internal_task_ins_list_size()); i < n; i++) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(2, this->_internal_task_ins_list(i), target, stream);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.PullTaskInsResponse)
+ return target;
+}
+
+size_t PullTaskInsResponse::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.PullTaskInsResponse)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // repeated .flwr.proto.TaskIns task_ins_list = 2;
+ total_size += 1UL * this->_internal_task_ins_list_size();
+ for (const auto& msg : this->task_ins_list_) {
+ total_size +=
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(msg);
+ }
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ if (this->_internal_has_reconnect()) {
+ total_size += 1 +
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(
+ *reconnect_);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData PullTaskInsResponse::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ PullTaskInsResponse::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*PullTaskInsResponse::GetClassData() const { return &_class_data_; }
+
+void PullTaskInsResponse::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+  static_cast<PullTaskInsResponse *>(to)->MergeFrom(
+      static_cast<const PullTaskInsResponse &>(from));
+}
+
+
+void PullTaskInsResponse::MergeFrom(const PullTaskInsResponse& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.PullTaskInsResponse)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ task_ins_list_.MergeFrom(from.task_ins_list_);
+ if (from._internal_has_reconnect()) {
+ _internal_mutable_reconnect()->::flwr::proto::Reconnect::MergeFrom(from._internal_reconnect());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void PullTaskInsResponse::CopyFrom(const PullTaskInsResponse& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.PullTaskInsResponse)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool PullTaskInsResponse::IsInitialized() const {
+ return true;
+}
+
+void PullTaskInsResponse::InternalSwap(PullTaskInsResponse* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ task_ins_list_.InternalSwap(&other->task_ins_list_);
+ swap(reconnect_, other->reconnect_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata PullTaskInsResponse::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[5]);
+}
+
+// ===================================================================
+
+class PushTaskResRequest::_Internal {
+ public:
+};
+
+void PushTaskResRequest::clear_task_res_list() {
+ task_res_list_.Clear();
+}
+PushTaskResRequest::PushTaskResRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned),
+ task_res_list_(arena) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.PushTaskResRequest)
+}
+PushTaskResRequest::PushTaskResRequest(const PushTaskResRequest& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message(),
+ task_res_list_(from.task_res_list_) {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.PushTaskResRequest)
+}
+
+void PushTaskResRequest::SharedCtor() {
+}
+
+PushTaskResRequest::~PushTaskResRequest() {
+ // @@protoc_insertion_point(destructor:flwr.proto.PushTaskResRequest)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void PushTaskResRequest::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+}
+
+void PushTaskResRequest::ArenaDtor(void* object) {
+ PushTaskResRequest* _this = reinterpret_cast< PushTaskResRequest* >(object);
+ (void)_this;
+}
+void PushTaskResRequest::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void PushTaskResRequest::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void PushTaskResRequest::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.PushTaskResRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ task_res_list_.Clear();
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* PushTaskResRequest::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // repeated .flwr.proto.TaskRes task_res_list = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr -= 1;
+ do {
+ ptr += 1;
+ ptr = ctx->ParseMessage(_internal_add_task_res_list(), ptr);
+ CHK_(ptr);
+ if (!ctx->DataAvailable(ptr)) break;
+ } while (::PROTOBUF_NAMESPACE_ID::internal::ExpectTag<10>(ptr));
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* PushTaskResRequest::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.PushTaskResRequest)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // repeated .flwr.proto.TaskRes task_res_list = 1;
+  for (unsigned int i = 0,
+      n = static_cast<unsigned int>(this->_internal_task_res_list_size()); i < n; i++) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(1, this->_internal_task_res_list(i), target, stream);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.PushTaskResRequest)
+ return target;
+}
+
+size_t PushTaskResRequest::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.PushTaskResRequest)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // repeated .flwr.proto.TaskRes task_res_list = 1;
+ total_size += 1UL * this->_internal_task_res_list_size();
+ for (const auto& msg : this->task_res_list_) {
+ total_size +=
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(msg);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData PushTaskResRequest::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ PushTaskResRequest::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*PushTaskResRequest::GetClassData() const { return &_class_data_; }
+
+void PushTaskResRequest::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+  static_cast<PushTaskResRequest *>(to)->MergeFrom(
+      static_cast<const PushTaskResRequest &>(from));
+}
+
+
+void PushTaskResRequest::MergeFrom(const PushTaskResRequest& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.PushTaskResRequest)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ task_res_list_.MergeFrom(from.task_res_list_);
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void PushTaskResRequest::CopyFrom(const PushTaskResRequest& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.PushTaskResRequest)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool PushTaskResRequest::IsInitialized() const {
+ return true;
+}
+
+void PushTaskResRequest::InternalSwap(PushTaskResRequest* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ task_res_list_.InternalSwap(&other->task_res_list_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata PushTaskResRequest::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[6]);
+}
+
+// ===================================================================
+
+PushTaskResResponse_ResultsEntry_DoNotUse::PushTaskResResponse_ResultsEntry_DoNotUse() {}
+PushTaskResResponse_ResultsEntry_DoNotUse::PushTaskResResponse_ResultsEntry_DoNotUse(::PROTOBUF_NAMESPACE_ID::Arena* arena)
+ : SuperType(arena) {}
+void PushTaskResResponse_ResultsEntry_DoNotUse::MergeFrom(const PushTaskResResponse_ResultsEntry_DoNotUse& other) {
+ MergeFromInternal(other);
+}
+::PROTOBUF_NAMESPACE_ID::Metadata PushTaskResResponse_ResultsEntry_DoNotUse::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[7]);
+}
+
+// ===================================================================
+
+class PushTaskResResponse::_Internal {
+ public:
+ static const ::flwr::proto::Reconnect& reconnect(const PushTaskResResponse* msg);
+};
+
+const ::flwr::proto::Reconnect&
+PushTaskResResponse::_Internal::reconnect(const PushTaskResResponse* msg) {
+ return *msg->reconnect_;
+}
+PushTaskResResponse::PushTaskResResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned),
+ results_(arena) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.PushTaskResResponse)
+}
+PushTaskResResponse::PushTaskResResponse(const PushTaskResResponse& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ results_.MergeFrom(from.results_);
+ if (from._internal_has_reconnect()) {
+ reconnect_ = new ::flwr::proto::Reconnect(*from.reconnect_);
+ } else {
+ reconnect_ = nullptr;
+ }
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.PushTaskResResponse)
+}
+
+void PushTaskResResponse::SharedCtor() {
+reconnect_ = nullptr;
+}
+
+PushTaskResResponse::~PushTaskResResponse() {
+ // @@protoc_insertion_point(destructor:flwr.proto.PushTaskResResponse)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void PushTaskResResponse::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+ if (this != internal_default_instance()) delete reconnect_;
+}
+
+void PushTaskResResponse::ArenaDtor(void* object) {
+ PushTaskResResponse* _this = reinterpret_cast< PushTaskResResponse* >(object);
+ (void)_this;
+ _this->results_. ~MapField();
+}
+inline void PushTaskResResponse::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena) {
+ if (arena != nullptr) {
+ arena->OwnCustomDestructor(this, &PushTaskResResponse::ArenaDtor);
+ }
+}
+void PushTaskResResponse::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void PushTaskResResponse::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.PushTaskResResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ results_.Clear();
+ if (GetArenaForAllocation() == nullptr && reconnect_ != nullptr) {
+ delete reconnect_;
+ }
+ reconnect_ = nullptr;
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* PushTaskResResponse::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // .flwr.proto.Reconnect reconnect = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 10)) {
+ ptr = ctx->ParseMessage(_internal_mutable_reconnect(), ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ // map<string, uint32> results = 2;
+ case 2:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 18)) {
+ ptr -= 1;
+ do {
+ ptr += 1;
+ ptr = ctx->ParseMessage(&results_, ptr);
+ CHK_(ptr);
+ if (!ctx->DataAvailable(ptr)) break;
+ } while (::PROTOBUF_NAMESPACE_ID::internal::ExpectTag<18>(ptr));
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* PushTaskResResponse::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.PushTaskResResponse)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ if (this->_internal_has_reconnect()) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::
+ InternalWriteMessage(
+ 1, _Internal::reconnect(this), target, stream);
+ }
+
+ // map<string, uint32> results = 2;
+ if (!this->_internal_results().empty()) {
+ typedef ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >::const_pointer
+ ConstPtr;
+ typedef ConstPtr SortItem;
+ typedef ::PROTOBUF_NAMESPACE_ID::internal::CompareByDerefFirst Less;
+ struct Utf8Check {
+ static void Check(ConstPtr p) {
+ (void)p;
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::VerifyUtf8String(
+ p->first.data(), static_cast<int>(p->first.length()),
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::SERIALIZE,
+ "flwr.proto.PushTaskResResponse.ResultsEntry.key");
+ }
+ };
+
+ if (stream->IsSerializationDeterministic() &&
+ this->_internal_results().size() > 1) {
+ ::std::unique_ptr<SortItem[]> items(
+ new SortItem[this->_internal_results().size()]);
+ typedef ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >::size_type size_type;
+ size_type n = 0;
+ for (::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >::const_iterator
+ it = this->_internal_results().begin();
+ it != this->_internal_results().end(); ++it, ++n) {
+ items[static_cast<ptrdiff_t>(n)] = SortItem(&*it);
+ }
+ ::std::sort(&items[0], &items[static_cast<ptrdiff_t>(n)], Less());
+ for (size_type i = 0; i < n; i++) {
+ target = PushTaskResResponse_ResultsEntry_DoNotUse::Funcs::InternalSerialize(2, items[static_cast<ptrdiff_t>(i)]->first, items[static_cast<ptrdiff_t>(i)]->second, target, stream);
+ Utf8Check::Check(&(*items[static_cast<ptrdiff_t>(i)]));
+ }
+ } else {
+ for (::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >::const_iterator
+ it = this->_internal_results().begin();
+ it != this->_internal_results().end(); ++it) {
+ target = PushTaskResResponse_ResultsEntry_DoNotUse::Funcs::InternalSerialize(2, it->first, it->second, target, stream);
+ Utf8Check::Check(&(*it));
+ }
+ }
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.PushTaskResResponse)
+ return target;
+}
+
+size_t PushTaskResResponse::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.PushTaskResResponse)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // map<string, uint32> results = 2;
+ total_size += 1 *
+ ::PROTOBUF_NAMESPACE_ID::internal::FromIntSize(this->_internal_results_size());
+ for (::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >::const_iterator
+ it = this->_internal_results().begin();
+ it != this->_internal_results().end(); ++it) {
+ total_size += PushTaskResResponse_ResultsEntry_DoNotUse::Funcs::ByteSizeLong(it->first, it->second);
+ }
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ if (this->_internal_has_reconnect()) {
+ total_size += 1 +
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::MessageSize(
+ *reconnect_);
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData PushTaskResResponse::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ PushTaskResResponse::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*PushTaskResResponse::GetClassData() const { return &_class_data_; }
+
+void PushTaskResResponse::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+ static_cast<PushTaskResResponse *>(to)->MergeFrom(
+ static_cast<const PushTaskResResponse &>(from));
+}
+
+
+void PushTaskResResponse::MergeFrom(const PushTaskResResponse& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.PushTaskResResponse)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ results_.MergeFrom(from.results_);
+ if (from._internal_has_reconnect()) {
+ _internal_mutable_reconnect()->::flwr::proto::Reconnect::MergeFrom(from._internal_reconnect());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void PushTaskResResponse::CopyFrom(const PushTaskResResponse& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.PushTaskResResponse)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool PushTaskResResponse::IsInitialized() const {
+ return true;
+}
+
+void PushTaskResResponse::InternalSwap(PushTaskResResponse* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ results_.InternalSwap(&other->results_);
+ swap(reconnect_, other->reconnect_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata PushTaskResResponse::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[8]);
+}
+
+// ===================================================================
+
+class Reconnect::_Internal {
+ public:
+};
+
+Reconnect::Reconnect(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned)
+ : ::PROTOBUF_NAMESPACE_ID::Message(arena, is_message_owned) {
+ SharedCtor();
+ if (!is_message_owned) {
+ RegisterArenaDtor(arena);
+ }
+ // @@protoc_insertion_point(arena_constructor:flwr.proto.Reconnect)
+}
+Reconnect::Reconnect(const Reconnect& from)
+ : ::PROTOBUF_NAMESPACE_ID::Message() {
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+ reconnect_ = from.reconnect_;
+ // @@protoc_insertion_point(copy_constructor:flwr.proto.Reconnect)
+}
+
+void Reconnect::SharedCtor() {
+reconnect_ = uint64_t{0u};
+}
+
+Reconnect::~Reconnect() {
+ // @@protoc_insertion_point(destructor:flwr.proto.Reconnect)
+ if (GetArenaForAllocation() != nullptr) return;
+ SharedDtor();
+ _internal_metadata_.Delete<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+inline void Reconnect::SharedDtor() {
+ GOOGLE_DCHECK(GetArenaForAllocation() == nullptr);
+}
+
+void Reconnect::ArenaDtor(void* object) {
+ Reconnect* _this = reinterpret_cast< Reconnect* >(object);
+ (void)_this;
+}
+void Reconnect::RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena*) {
+}
+void Reconnect::SetCachedSize(int size) const {
+ _cached_size_.Set(size);
+}
+
+void Reconnect::Clear() {
+// @@protoc_insertion_point(message_clear_start:flwr.proto.Reconnect)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ reconnect_ = uint64_t{0u};
+ _internal_metadata_.Clear<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>();
+}
+
+const char* Reconnect::_InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) {
+#define CHK_(x) if (PROTOBUF_PREDICT_FALSE(!(x))) goto failure
+ while (!ctx->Done(&ptr)) {
+ ::PROTOBUF_NAMESPACE_ID::uint32 tag;
+ ptr = ::PROTOBUF_NAMESPACE_ID::internal::ReadTag(ptr, &tag);
+ switch (tag >> 3) {
+ // uint64 reconnect = 1;
+ case 1:
+ if (PROTOBUF_PREDICT_TRUE(static_cast<::PROTOBUF_NAMESPACE_ID::uint8>(tag) == 8)) {
+ reconnect_ = ::PROTOBUF_NAMESPACE_ID::internal::ReadVarint64(&ptr);
+ CHK_(ptr);
+ } else
+ goto handle_unusual;
+ continue;
+ default:
+ goto handle_unusual;
+ } // switch
+ handle_unusual:
+ if ((tag == 0) || ((tag & 7) == 4)) {
+ CHK_(ptr);
+ ctx->SetLastTag(tag);
+ goto message_done;
+ }
+ ptr = UnknownFieldParse(
+ tag,
+ _internal_metadata_.mutable_unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(),
+ ptr, ctx);
+ CHK_(ptr != nullptr);
+ } // while
+message_done:
+ return ptr;
+failure:
+ ptr = nullptr;
+ goto message_done;
+#undef CHK_
+}
+
+::PROTOBUF_NAMESPACE_ID::uint8* Reconnect::_InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const {
+ // @@protoc_insertion_point(serialize_to_array_start:flwr.proto.Reconnect)
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ // uint64 reconnect = 1;
+ if (this->_internal_reconnect() != 0) {
+ target = stream->EnsureSpace(target);
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::WriteUInt64ToArray(1, this->_internal_reconnect(), target);
+ }
+
+ if (PROTOBUF_PREDICT_FALSE(_internal_metadata_.have_unknown_fields())) {
+ target = ::PROTOBUF_NAMESPACE_ID::internal::WireFormat::InternalSerializeUnknownFieldsToArray(
+ _internal_metadata_.unknown_fields<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(::PROTOBUF_NAMESPACE_ID::UnknownFieldSet::default_instance), target, stream);
+ }
+ // @@protoc_insertion_point(serialize_to_array_end:flwr.proto.Reconnect)
+ return target;
+}
+
+size_t Reconnect::ByteSizeLong() const {
+// @@protoc_insertion_point(message_byte_size_start:flwr.proto.Reconnect)
+ size_t total_size = 0;
+
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ // Prevent compiler warnings about cached_has_bits being unused
+ (void) cached_has_bits;
+
+ // uint64 reconnect = 1;
+ if (this->_internal_reconnect() != 0) {
+ total_size += ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::UInt64SizePlusOne(this->_internal_reconnect());
+ }
+
+ return MaybeComputeUnknownFieldsSize(total_size, &_cached_size_);
+}
+
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData Reconnect::_class_data_ = {
+ ::PROTOBUF_NAMESPACE_ID::Message::CopyWithSizeCheck,
+ Reconnect::MergeImpl
+};
+const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*Reconnect::GetClassData() const { return &_class_data_; }
+
+void Reconnect::MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to,
+ const ::PROTOBUF_NAMESPACE_ID::Message& from) {
+ static_cast<Reconnect *>(to)->MergeFrom(
+ static_cast<const Reconnect &>(from));
+}
+
+
+void Reconnect::MergeFrom(const Reconnect& from) {
+// @@protoc_insertion_point(class_specific_merge_from_start:flwr.proto.Reconnect)
+ GOOGLE_DCHECK_NE(&from, this);
+ ::PROTOBUF_NAMESPACE_ID::uint32 cached_has_bits = 0;
+ (void) cached_has_bits;
+
+ if (from._internal_reconnect() != 0) {
+ _internal_set_reconnect(from._internal_reconnect());
+ }
+ _internal_metadata_.MergeFrom<::PROTOBUF_NAMESPACE_ID::UnknownFieldSet>(from._internal_metadata_);
+}
+
+void Reconnect::CopyFrom(const Reconnect& from) {
+// @@protoc_insertion_point(class_specific_copy_from_start:flwr.proto.Reconnect)
+ if (&from == this) return;
+ Clear();
+ MergeFrom(from);
+}
+
+bool Reconnect::IsInitialized() const {
+ return true;
+}
+
+void Reconnect::InternalSwap(Reconnect* other) {
+ using std::swap;
+ _internal_metadata_.InternalSwap(&other->_internal_metadata_);
+ swap(reconnect_, other->reconnect_);
+}
+
+::PROTOBUF_NAMESPACE_ID::Metadata Reconnect::GetMetadata() const {
+ return ::PROTOBUF_NAMESPACE_ID::internal::AssignDescriptors(
+ &descriptor_table_flwr_2fproto_2ffleet_2eproto_getter, &descriptor_table_flwr_2fproto_2ffleet_2eproto_once,
+ file_level_metadata_flwr_2fproto_2ffleet_2eproto[9]);
+}
+
+// @@protoc_insertion_point(namespace_scope)
+} // namespace proto
+} // namespace flwr
+PROTOBUF_NAMESPACE_OPEN
+template<> PROTOBUF_NOINLINE ::flwr::proto::CreateNodeRequest* Arena::CreateMaybeMessage< ::flwr::proto::CreateNodeRequest >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::CreateNodeRequest >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::CreateNodeResponse* Arena::CreateMaybeMessage< ::flwr::proto::CreateNodeResponse >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::CreateNodeResponse >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::DeleteNodeRequest* Arena::CreateMaybeMessage< ::flwr::proto::DeleteNodeRequest >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::DeleteNodeRequest >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::DeleteNodeResponse* Arena::CreateMaybeMessage< ::flwr::proto::DeleteNodeResponse >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::DeleteNodeResponse >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::PullTaskInsRequest* Arena::CreateMaybeMessage< ::flwr::proto::PullTaskInsRequest >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::PullTaskInsRequest >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::PullTaskInsResponse* Arena::CreateMaybeMessage< ::flwr::proto::PullTaskInsResponse >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::PullTaskInsResponse >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::PushTaskResRequest* Arena::CreateMaybeMessage< ::flwr::proto::PushTaskResRequest >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::PushTaskResRequest >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse* Arena::CreateMaybeMessage< ::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::PushTaskResResponse* Arena::CreateMaybeMessage< ::flwr::proto::PushTaskResResponse >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::PushTaskResResponse >(arena);
+}
+template<> PROTOBUF_NOINLINE ::flwr::proto::Reconnect* Arena::CreateMaybeMessage< ::flwr::proto::Reconnect >(Arena* arena) {
+ return Arena::CreateMessageInternal< ::flwr::proto::Reconnect >(arena);
+}
+PROTOBUF_NAMESPACE_CLOSE
+
+// @@protoc_insertion_point(global_scope)
+#include
diff --git a/src/cc/flwr/include/flwr/proto/fleet.pb.h b/src/cc/flwr/include/flwr/proto/fleet.pb.h
new file mode 100644
index 000000000000..842e800f5b1c
--- /dev/null
+++ b/src/cc/flwr/include/flwr/proto/fleet.pb.h
@@ -0,0 +1,2202 @@
+// Generated by the protocol buffer compiler. DO NOT EDIT!
+// source: flwr/proto/fleet.proto
+
+#ifndef GOOGLE_PROTOBUF_INCLUDED_flwr_2fproto_2ffleet_2eproto
+#define GOOGLE_PROTOBUF_INCLUDED_flwr_2fproto_2ffleet_2eproto
+
+#include
+#include
+
+#include
+#if PROTOBUF_VERSION < 3018000
+#error This file was generated by a newer version of protoc which is
+#error incompatible with your Protocol Buffer headers. Please update
+#error your headers.
+#endif
+#if 3018001 < PROTOBUF_MIN_PROTOC_VERSION
+#error This file was generated by an older version of protoc which is
+#error incompatible with your Protocol Buffer headers. Please
+#error regenerate this file with a newer version of protoc.
+#endif
+
+#include
+#include
+#include
+#include
+#include
+#include
+#include
+#include
+#include
+#include
+#include // IWYU pragma: export
+#include // IWYU pragma: export
+#include // IWYU pragma: export
+#include
+#include
+#include
+#include "flwr/proto/node.pb.h"
+#include "flwr/proto/task.pb.h"
+// @@protoc_insertion_point(includes)
+#include
+#define PROTOBUF_INTERNAL_EXPORT_flwr_2fproto_2ffleet_2eproto
+PROTOBUF_NAMESPACE_OPEN
+namespace internal {
+class AnyMetadata;
+} // namespace internal
+PROTOBUF_NAMESPACE_CLOSE
+
+// Internal implementation detail -- do not use these members.
+struct TableStruct_flwr_2fproto_2ffleet_2eproto {
+ static const ::PROTOBUF_NAMESPACE_ID::internal::ParseTableField entries[]
+ PROTOBUF_SECTION_VARIABLE(protodesc_cold);
+ static const ::PROTOBUF_NAMESPACE_ID::internal::AuxiliaryParseTableField aux[]
+ PROTOBUF_SECTION_VARIABLE(protodesc_cold);
+ static const ::PROTOBUF_NAMESPACE_ID::internal::ParseTable schema[10]
+ PROTOBUF_SECTION_VARIABLE(protodesc_cold);
+ static const ::PROTOBUF_NAMESPACE_ID::internal::FieldMetadata field_metadata[];
+ static const ::PROTOBUF_NAMESPACE_ID::internal::SerializationTable serialization_table[];
+ static const ::PROTOBUF_NAMESPACE_ID::uint32 offsets[];
+};
+extern const ::PROTOBUF_NAMESPACE_ID::internal::DescriptorTable descriptor_table_flwr_2fproto_2ffleet_2eproto;
+namespace flwr {
+namespace proto {
+class CreateNodeRequest;
+struct CreateNodeRequestDefaultTypeInternal;
+extern CreateNodeRequestDefaultTypeInternal _CreateNodeRequest_default_instance_;
+class CreateNodeResponse;
+struct CreateNodeResponseDefaultTypeInternal;
+extern CreateNodeResponseDefaultTypeInternal _CreateNodeResponse_default_instance_;
+class DeleteNodeRequest;
+struct DeleteNodeRequestDefaultTypeInternal;
+extern DeleteNodeRequestDefaultTypeInternal _DeleteNodeRequest_default_instance_;
+class DeleteNodeResponse;
+struct DeleteNodeResponseDefaultTypeInternal;
+extern DeleteNodeResponseDefaultTypeInternal _DeleteNodeResponse_default_instance_;
+class PullTaskInsRequest;
+struct PullTaskInsRequestDefaultTypeInternal;
+extern PullTaskInsRequestDefaultTypeInternal _PullTaskInsRequest_default_instance_;
+class PullTaskInsResponse;
+struct PullTaskInsResponseDefaultTypeInternal;
+extern PullTaskInsResponseDefaultTypeInternal _PullTaskInsResponse_default_instance_;
+class PushTaskResRequest;
+struct PushTaskResRequestDefaultTypeInternal;
+extern PushTaskResRequestDefaultTypeInternal _PushTaskResRequest_default_instance_;
+class PushTaskResResponse;
+struct PushTaskResResponseDefaultTypeInternal;
+extern PushTaskResResponseDefaultTypeInternal _PushTaskResResponse_default_instance_;
+class PushTaskResResponse_ResultsEntry_DoNotUse;
+struct PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal;
+extern PushTaskResResponse_ResultsEntry_DoNotUseDefaultTypeInternal _PushTaskResResponse_ResultsEntry_DoNotUse_default_instance_;
+class Reconnect;
+struct ReconnectDefaultTypeInternal;
+extern ReconnectDefaultTypeInternal _Reconnect_default_instance_;
+} // namespace proto
+} // namespace flwr
+PROTOBUF_NAMESPACE_OPEN
+template<> ::flwr::proto::CreateNodeRequest* Arena::CreateMaybeMessage<::flwr::proto::CreateNodeRequest>(Arena*);
+template<> ::flwr::proto::CreateNodeResponse* Arena::CreateMaybeMessage<::flwr::proto::CreateNodeResponse>(Arena*);
+template<> ::flwr::proto::DeleteNodeRequest* Arena::CreateMaybeMessage<::flwr::proto::DeleteNodeRequest>(Arena*);
+template<> ::flwr::proto::DeleteNodeResponse* Arena::CreateMaybeMessage<::flwr::proto::DeleteNodeResponse>(Arena*);
+template<> ::flwr::proto::PullTaskInsRequest* Arena::CreateMaybeMessage<::flwr::proto::PullTaskInsRequest>(Arena*);
+template<> ::flwr::proto::PullTaskInsResponse* Arena::CreateMaybeMessage<::flwr::proto::PullTaskInsResponse>(Arena*);
+template<> ::flwr::proto::PushTaskResRequest* Arena::CreateMaybeMessage<::flwr::proto::PushTaskResRequest>(Arena*);
+template<> ::flwr::proto::PushTaskResResponse* Arena::CreateMaybeMessage<::flwr::proto::PushTaskResResponse>(Arena*);
+template<> ::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse* Arena::CreateMaybeMessage<::flwr::proto::PushTaskResResponse_ResultsEntry_DoNotUse>(Arena*);
+template<> ::flwr::proto::Reconnect* Arena::CreateMaybeMessage<::flwr::proto::Reconnect>(Arena*);
+PROTOBUF_NAMESPACE_CLOSE
+namespace flwr {
+namespace proto {
+
+// ===================================================================
+
+class CreateNodeRequest final :
+ public ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase /* @@protoc_insertion_point(class_definition:flwr.proto.CreateNodeRequest) */ {
+ public:
+ inline CreateNodeRequest() : CreateNodeRequest(nullptr) {}
+ explicit constexpr CreateNodeRequest(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ CreateNodeRequest(const CreateNodeRequest& from);
+ CreateNodeRequest(CreateNodeRequest&& from) noexcept
+ : CreateNodeRequest() {
+ *this = ::std::move(from);
+ }
+
+ inline CreateNodeRequest& operator=(const CreateNodeRequest& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline CreateNodeRequest& operator=(CreateNodeRequest&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const CreateNodeRequest& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const CreateNodeRequest* internal_default_instance() {
+ return reinterpret_cast<const CreateNodeRequest*>(
+ &_CreateNodeRequest_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 0;
+
+ friend void swap(CreateNodeRequest& a, CreateNodeRequest& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(CreateNodeRequest* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(CreateNodeRequest* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline CreateNodeRequest* New() const final {
+ return new CreateNodeRequest();
+ }
+
+ CreateNodeRequest* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<CreateNodeRequest>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyFrom;
+ inline void CopyFrom(const CreateNodeRequest& from) {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyImpl(this, from);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeFrom;
+ void MergeFrom(const CreateNodeRequest& from) {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeImpl(this, from);
+ }
+ public:
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.CreateNodeRequest";
+ }
+ protected:
+ explicit CreateNodeRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.CreateNodeRequest)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class CreateNodeResponse final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.CreateNodeResponse) */ {
+ public:
+ inline CreateNodeResponse() : CreateNodeResponse(nullptr) {}
+ ~CreateNodeResponse() override;
+ explicit constexpr CreateNodeResponse(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ CreateNodeResponse(const CreateNodeResponse& from);
+ CreateNodeResponse(CreateNodeResponse&& from) noexcept
+ : CreateNodeResponse() {
+ *this = ::std::move(from);
+ }
+
+ inline CreateNodeResponse& operator=(const CreateNodeResponse& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline CreateNodeResponse& operator=(CreateNodeResponse&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const CreateNodeResponse& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const CreateNodeResponse* internal_default_instance() {
+ return reinterpret_cast<const CreateNodeResponse*>(
+ &_CreateNodeResponse_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 1;
+
+ friend void swap(CreateNodeResponse& a, CreateNodeResponse& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(CreateNodeResponse* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(CreateNodeResponse* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline CreateNodeResponse* New() const final {
+ return new CreateNodeResponse();
+ }
+
+ CreateNodeResponse* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<CreateNodeResponse>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const CreateNodeResponse& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const CreateNodeResponse& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(CreateNodeResponse* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.CreateNodeResponse";
+ }
+ protected:
+ explicit CreateNodeResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kNodeFieldNumber = 1,
+ };
+ // .flwr.proto.Node node = 1;
+ bool has_node() const;
+ private:
+ bool _internal_has_node() const;
+ public:
+ void clear_node();
+ const ::flwr::proto::Node& node() const;
+ PROTOBUF_MUST_USE_RESULT ::flwr::proto::Node* release_node();
+ ::flwr::proto::Node* mutable_node();
+ void set_allocated_node(::flwr::proto::Node* node);
+ private:
+ const ::flwr::proto::Node& _internal_node() const;
+ ::flwr::proto::Node* _internal_mutable_node();
+ public:
+ void unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node);
+ ::flwr::proto::Node* unsafe_arena_release_node();
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.CreateNodeResponse)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::flwr::proto::Node* node_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class DeleteNodeRequest final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.DeleteNodeRequest) */ {
+ public:
+ inline DeleteNodeRequest() : DeleteNodeRequest(nullptr) {}
+ ~DeleteNodeRequest() override;
+ explicit constexpr DeleteNodeRequest(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ DeleteNodeRequest(const DeleteNodeRequest& from);
+ DeleteNodeRequest(DeleteNodeRequest&& from) noexcept
+ : DeleteNodeRequest() {
+ *this = ::std::move(from);
+ }
+
+ inline DeleteNodeRequest& operator=(const DeleteNodeRequest& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline DeleteNodeRequest& operator=(DeleteNodeRequest&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const DeleteNodeRequest& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const DeleteNodeRequest* internal_default_instance() {
+ return reinterpret_cast<const DeleteNodeRequest*>(
+ &_DeleteNodeRequest_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 2;
+
+ friend void swap(DeleteNodeRequest& a, DeleteNodeRequest& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(DeleteNodeRequest* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(DeleteNodeRequest* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline DeleteNodeRequest* New() const final {
+ return new DeleteNodeRequest();
+ }
+
+ DeleteNodeRequest* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<DeleteNodeRequest>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const DeleteNodeRequest& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const DeleteNodeRequest& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(DeleteNodeRequest* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.DeleteNodeRequest";
+ }
+ protected:
+ explicit DeleteNodeRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kNodeFieldNumber = 1,
+ };
+ // .flwr.proto.Node node = 1;
+ bool has_node() const;
+ private:
+ bool _internal_has_node() const;
+ public:
+ void clear_node();
+ const ::flwr::proto::Node& node() const;
+ PROTOBUF_MUST_USE_RESULT ::flwr::proto::Node* release_node();
+ ::flwr::proto::Node* mutable_node();
+ void set_allocated_node(::flwr::proto::Node* node);
+ private:
+ const ::flwr::proto::Node& _internal_node() const;
+ ::flwr::proto::Node* _internal_mutable_node();
+ public:
+ void unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node);
+ ::flwr::proto::Node* unsafe_arena_release_node();
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.DeleteNodeRequest)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::flwr::proto::Node* node_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class DeleteNodeResponse final :
+ public ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase /* @@protoc_insertion_point(class_definition:flwr.proto.DeleteNodeResponse) */ {
+ public:
+ inline DeleteNodeResponse() : DeleteNodeResponse(nullptr) {}
+ explicit constexpr DeleteNodeResponse(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ DeleteNodeResponse(const DeleteNodeResponse& from);
+ DeleteNodeResponse(DeleteNodeResponse&& from) noexcept
+ : DeleteNodeResponse() {
+ *this = ::std::move(from);
+ }
+
+ inline DeleteNodeResponse& operator=(const DeleteNodeResponse& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline DeleteNodeResponse& operator=(DeleteNodeResponse&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const DeleteNodeResponse& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const DeleteNodeResponse* internal_default_instance() {
+ return reinterpret_cast<const DeleteNodeResponse*>(
+ &_DeleteNodeResponse_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 3;
+
+ friend void swap(DeleteNodeResponse& a, DeleteNodeResponse& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(DeleteNodeResponse* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(DeleteNodeResponse* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline DeleteNodeResponse* New() const final {
+ return new DeleteNodeResponse();
+ }
+
+ DeleteNodeResponse* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<DeleteNodeResponse>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyFrom;
+ inline void CopyFrom(const DeleteNodeResponse& from) {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::CopyImpl(this, from);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeFrom;
+ void MergeFrom(const DeleteNodeResponse& from) {
+ ::PROTOBUF_NAMESPACE_ID::internal::ZeroFieldsBase::MergeImpl(this, from);
+ }
+ public:
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.DeleteNodeResponse";
+ }
+ protected:
+ explicit DeleteNodeResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.DeleteNodeResponse)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class PullTaskInsRequest final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.PullTaskInsRequest) */ {
+ public:
+ inline PullTaskInsRequest() : PullTaskInsRequest(nullptr) {}
+ ~PullTaskInsRequest() override;
+ explicit constexpr PullTaskInsRequest(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ PullTaskInsRequest(const PullTaskInsRequest& from);
+ PullTaskInsRequest(PullTaskInsRequest&& from) noexcept
+ : PullTaskInsRequest() {
+ *this = ::std::move(from);
+ }
+
+ inline PullTaskInsRequest& operator=(const PullTaskInsRequest& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline PullTaskInsRequest& operator=(PullTaskInsRequest&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const PullTaskInsRequest& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const PullTaskInsRequest* internal_default_instance() {
+ return reinterpret_cast<const PullTaskInsRequest*>(
+ &_PullTaskInsRequest_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 4;
+
+ friend void swap(PullTaskInsRequest& a, PullTaskInsRequest& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(PullTaskInsRequest* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(PullTaskInsRequest* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline PullTaskInsRequest* New() const final {
+ return new PullTaskInsRequest();
+ }
+
+ PullTaskInsRequest* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<PullTaskInsRequest>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const PullTaskInsRequest& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const PullTaskInsRequest& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(PullTaskInsRequest* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.PullTaskInsRequest";
+ }
+ protected:
+ explicit PullTaskInsRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kTaskIdsFieldNumber = 2,
+ kNodeFieldNumber = 1,
+ };
+ // repeated string task_ids = 2;
+ int task_ids_size() const;
+ private:
+ int _internal_task_ids_size() const;
+ public:
+ void clear_task_ids();
+ const std::string& task_ids(int index) const;
+ std::string* mutable_task_ids(int index);
+ void set_task_ids(int index, const std::string& value);
+ void set_task_ids(int index, std::string&& value);
+ void set_task_ids(int index, const char* value);
+ void set_task_ids(int index, const char* value, size_t size);
+ std::string* add_task_ids();
+ void add_task_ids(const std::string& value);
+ void add_task_ids(std::string&& value);
+ void add_task_ids(const char* value);
+ void add_task_ids(const char* value, size_t size);
+ const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField<std::string>& task_ids() const;
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField<std::string>* mutable_task_ids();
+ private:
+ const std::string& _internal_task_ids(int index) const;
+ std::string* _internal_add_task_ids();
+ public:
+
+ // .flwr.proto.Node node = 1;
+ bool has_node() const;
+ private:
+ bool _internal_has_node() const;
+ public:
+ void clear_node();
+ const ::flwr::proto::Node& node() const;
+ PROTOBUF_MUST_USE_RESULT ::flwr::proto::Node* release_node();
+ ::flwr::proto::Node* mutable_node();
+ void set_allocated_node(::flwr::proto::Node* node);
+ private:
+ const ::flwr::proto::Node& _internal_node() const;
+ ::flwr::proto::Node* _internal_mutable_node();
+ public:
+ void unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node);
+ ::flwr::proto::Node* unsafe_arena_release_node();
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.PullTaskInsRequest)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField<std::string> task_ids_;
+ ::flwr::proto::Node* node_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class PullTaskInsResponse final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.PullTaskInsResponse) */ {
+ public:
+ inline PullTaskInsResponse() : PullTaskInsResponse(nullptr) {}
+ ~PullTaskInsResponse() override;
+ explicit constexpr PullTaskInsResponse(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ PullTaskInsResponse(const PullTaskInsResponse& from);
+ PullTaskInsResponse(PullTaskInsResponse&& from) noexcept
+ : PullTaskInsResponse() {
+ *this = ::std::move(from);
+ }
+
+ inline PullTaskInsResponse& operator=(const PullTaskInsResponse& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline PullTaskInsResponse& operator=(PullTaskInsResponse&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const PullTaskInsResponse& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const PullTaskInsResponse* internal_default_instance() {
+ return reinterpret_cast<const PullTaskInsResponse*>(
+ &_PullTaskInsResponse_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 5;
+
+ friend void swap(PullTaskInsResponse& a, PullTaskInsResponse& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(PullTaskInsResponse* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(PullTaskInsResponse* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline PullTaskInsResponse* New() const final {
+ return new PullTaskInsResponse();
+ }
+
+ PullTaskInsResponse* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<PullTaskInsResponse>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const PullTaskInsResponse& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const PullTaskInsResponse& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(PullTaskInsResponse* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.PullTaskInsResponse";
+ }
+ protected:
+ explicit PullTaskInsResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kTaskInsListFieldNumber = 2,
+ kReconnectFieldNumber = 1,
+ };
+ // repeated .flwr.proto.TaskIns task_ins_list = 2;
+ int task_ins_list_size() const;
+ private:
+ int _internal_task_ins_list_size() const;
+ public:
+ void clear_task_ins_list();
+ ::flwr::proto::TaskIns* mutable_task_ins_list(int index);
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskIns >*
+ mutable_task_ins_list();
+ private:
+ const ::flwr::proto::TaskIns& _internal_task_ins_list(int index) const;
+ ::flwr::proto::TaskIns* _internal_add_task_ins_list();
+ public:
+ const ::flwr::proto::TaskIns& task_ins_list(int index) const;
+ ::flwr::proto::TaskIns* add_task_ins_list();
+ const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskIns >&
+ task_ins_list() const;
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ bool has_reconnect() const;
+ private:
+ bool _internal_has_reconnect() const;
+ public:
+ void clear_reconnect();
+ const ::flwr::proto::Reconnect& reconnect() const;
+ PROTOBUF_MUST_USE_RESULT ::flwr::proto::Reconnect* release_reconnect();
+ ::flwr::proto::Reconnect* mutable_reconnect();
+ void set_allocated_reconnect(::flwr::proto::Reconnect* reconnect);
+ private:
+ const ::flwr::proto::Reconnect& _internal_reconnect() const;
+ ::flwr::proto::Reconnect* _internal_mutable_reconnect();
+ public:
+ void unsafe_arena_set_allocated_reconnect(
+ ::flwr::proto::Reconnect* reconnect);
+ ::flwr::proto::Reconnect* unsafe_arena_release_reconnect();
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.PullTaskInsResponse)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskIns > task_ins_list_;
+ ::flwr::proto::Reconnect* reconnect_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class PushTaskResRequest final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.PushTaskResRequest) */ {
+ public:
+ inline PushTaskResRequest() : PushTaskResRequest(nullptr) {}
+ ~PushTaskResRequest() override;
+ explicit constexpr PushTaskResRequest(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ PushTaskResRequest(const PushTaskResRequest& from);
+ PushTaskResRequest(PushTaskResRequest&& from) noexcept
+ : PushTaskResRequest() {
+ *this = ::std::move(from);
+ }
+
+ inline PushTaskResRequest& operator=(const PushTaskResRequest& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline PushTaskResRequest& operator=(PushTaskResRequest&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const PushTaskResRequest& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const PushTaskResRequest* internal_default_instance() {
+ return reinterpret_cast<const PushTaskResRequest*>(
+ &_PushTaskResRequest_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 6;
+
+ friend void swap(PushTaskResRequest& a, PushTaskResRequest& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(PushTaskResRequest* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(PushTaskResRequest* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline PushTaskResRequest* New() const final {
+ return new PushTaskResRequest();
+ }
+
+ PushTaskResRequest* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<PushTaskResRequest>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const PushTaskResRequest& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const PushTaskResRequest& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(PushTaskResRequest* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.PushTaskResRequest";
+ }
+ protected:
+ explicit PushTaskResRequest(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kTaskResListFieldNumber = 1,
+ };
+ // repeated .flwr.proto.TaskRes task_res_list = 1;
+ int task_res_list_size() const;
+ private:
+ int _internal_task_res_list_size() const;
+ public:
+ void clear_task_res_list();
+ ::flwr::proto::TaskRes* mutable_task_res_list(int index);
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskRes >*
+ mutable_task_res_list();
+ private:
+ const ::flwr::proto::TaskRes& _internal_task_res_list(int index) const;
+ ::flwr::proto::TaskRes* _internal_add_task_res_list();
+ public:
+ const ::flwr::proto::TaskRes& task_res_list(int index) const;
+ ::flwr::proto::TaskRes* add_task_res_list();
+ const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskRes >&
+ task_res_list() const;
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.PushTaskResRequest)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskRes > task_res_list_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class PushTaskResResponse_ResultsEntry_DoNotUse : public ::PROTOBUF_NAMESPACE_ID::internal::MapEntry<PushTaskResResponse_ResultsEntry_DoNotUse,
+    std::string, ::PROTOBUF_NAMESPACE_ID::uint32,
+    ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_STRING,
+    ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_UINT32> {
+public:
+ typedef ::PROTOBUF_NAMESPACE_ID::internal::MapEntry<PushTaskResResponse_ResultsEntry_DoNotUse,
+    std::string, ::PROTOBUF_NAMESPACE_ID::uint32,
+    ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_STRING,
+    ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_UINT32> SuperType;
+ PushTaskResResponse_ResultsEntry_DoNotUse();
+ explicit constexpr PushTaskResResponse_ResultsEntry_DoNotUse(
+ ::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+ explicit PushTaskResResponse_ResultsEntry_DoNotUse(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ void MergeFrom(const PushTaskResResponse_ResultsEntry_DoNotUse& other);
+ static const PushTaskResResponse_ResultsEntry_DoNotUse* internal_default_instance() { return reinterpret_cast<const PushTaskResResponse_ResultsEntry_DoNotUse*>(&_PushTaskResResponse_ResultsEntry_DoNotUse_default_instance_); }
+ static bool ValidateKey(std::string* s) {
+ return ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::VerifyUtf8String(s->data(), static_cast<int>(s->size()), ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::PARSE, "flwr.proto.PushTaskResResponse.ResultsEntry.key");
+ }
+ static bool ValidateValue(void*) { return true; }
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+};
+
+// -------------------------------------------------------------------
+
+class PushTaskResResponse final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.PushTaskResResponse) */ {
+ public:
+ inline PushTaskResResponse() : PushTaskResResponse(nullptr) {}
+ ~PushTaskResResponse() override;
+ explicit constexpr PushTaskResResponse(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ PushTaskResResponse(const PushTaskResResponse& from);
+ PushTaskResResponse(PushTaskResResponse&& from) noexcept
+ : PushTaskResResponse() {
+ *this = ::std::move(from);
+ }
+
+ inline PushTaskResResponse& operator=(const PushTaskResResponse& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline PushTaskResResponse& operator=(PushTaskResResponse&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const PushTaskResResponse& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const PushTaskResResponse* internal_default_instance() {
+ return reinterpret_cast<const PushTaskResResponse*>(
+ &_PushTaskResResponse_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 8;
+
+ friend void swap(PushTaskResResponse& a, PushTaskResResponse& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(PushTaskResResponse* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(PushTaskResResponse* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline PushTaskResResponse* New() const final {
+ return new PushTaskResResponse();
+ }
+
+ PushTaskResResponse* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+ return CreateMaybeMessage<PushTaskResResponse>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const PushTaskResResponse& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const PushTaskResResponse& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(PushTaskResResponse* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.PushTaskResResponse";
+ }
+ protected:
+ explicit PushTaskResResponse(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kResultsFieldNumber = 2,
+ kReconnectFieldNumber = 1,
+ };
+ // map<string, uint32> results = 2;
+ int results_size() const;
+ private:
+ int _internal_results_size() const;
+ public:
+ void clear_results();
+ private:
+ const ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >&
+ _internal_results() const;
+ ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >*
+ _internal_mutable_results();
+ public:
+ const ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >&
+ results() const;
+ ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >*
+ mutable_results();
+
+ // .flwr.proto.Reconnect reconnect = 1;
+ bool has_reconnect() const;
+ private:
+ bool _internal_has_reconnect() const;
+ public:
+ void clear_reconnect();
+ const ::flwr::proto::Reconnect& reconnect() const;
+ PROTOBUF_MUST_USE_RESULT ::flwr::proto::Reconnect* release_reconnect();
+ ::flwr::proto::Reconnect* mutable_reconnect();
+ void set_allocated_reconnect(::flwr::proto::Reconnect* reconnect);
+ private:
+ const ::flwr::proto::Reconnect& _internal_reconnect() const;
+ ::flwr::proto::Reconnect* _internal_mutable_reconnect();
+ public:
+ void unsafe_arena_set_allocated_reconnect(
+ ::flwr::proto::Reconnect* reconnect);
+ ::flwr::proto::Reconnect* unsafe_arena_release_reconnect();
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.PushTaskResResponse)
+ private:
+ class _Internal;
+
+ template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::PROTOBUF_NAMESPACE_ID::internal::MapField<
+ PushTaskResResponse_ResultsEntry_DoNotUse,
+ std::string, ::PROTOBUF_NAMESPACE_ID::uint32,
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_STRING,
+ ::PROTOBUF_NAMESPACE_ID::internal::WireFormatLite::TYPE_UINT32> results_;
+ ::flwr::proto::Reconnect* reconnect_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// -------------------------------------------------------------------
+
+class Reconnect final :
+ public ::PROTOBUF_NAMESPACE_ID::Message /* @@protoc_insertion_point(class_definition:flwr.proto.Reconnect) */ {
+ public:
+ inline Reconnect() : Reconnect(nullptr) {}
+ ~Reconnect() override;
+ explicit constexpr Reconnect(::PROTOBUF_NAMESPACE_ID::internal::ConstantInitialized);
+
+ Reconnect(const Reconnect& from);
+ Reconnect(Reconnect&& from) noexcept
+ : Reconnect() {
+ *this = ::std::move(from);
+ }
+
+ inline Reconnect& operator=(const Reconnect& from) {
+ CopyFrom(from);
+ return *this;
+ }
+ inline Reconnect& operator=(Reconnect&& from) noexcept {
+ if (this == &from) return *this;
+ if (GetOwningArena() == from.GetOwningArena()
+ #ifdef PROTOBUF_FORCE_COPY_IN_MOVE
+ && GetOwningArena() != nullptr
+ #endif // !PROTOBUF_FORCE_COPY_IN_MOVE
+ ) {
+ InternalSwap(&from);
+ } else {
+ CopyFrom(from);
+ }
+ return *this;
+ }
+
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* descriptor() {
+ return GetDescriptor();
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Descriptor* GetDescriptor() {
+ return default_instance().GetMetadata().descriptor;
+ }
+ static const ::PROTOBUF_NAMESPACE_ID::Reflection* GetReflection() {
+ return default_instance().GetMetadata().reflection;
+ }
+ static const Reconnect& default_instance() {
+ return *internal_default_instance();
+ }
+ static inline const Reconnect* internal_default_instance() {
+    return reinterpret_cast<const Reconnect*>(
+               &_Reconnect_default_instance_);
+ }
+ static constexpr int kIndexInFileMessages =
+ 9;
+
+ friend void swap(Reconnect& a, Reconnect& b) {
+ a.Swap(&b);
+ }
+ inline void Swap(Reconnect* other) {
+ if (other == this) return;
+ if (GetOwningArena() == other->GetOwningArena()) {
+ InternalSwap(other);
+ } else {
+ ::PROTOBUF_NAMESPACE_ID::internal::GenericSwap(this, other);
+ }
+ }
+ void UnsafeArenaSwap(Reconnect* other) {
+ if (other == this) return;
+ GOOGLE_DCHECK(GetOwningArena() == other->GetOwningArena());
+ InternalSwap(other);
+ }
+
+ // implements Message ----------------------------------------------
+
+ inline Reconnect* New() const final {
+ return new Reconnect();
+ }
+
+ Reconnect* New(::PROTOBUF_NAMESPACE_ID::Arena* arena) const final {
+    return CreateMaybeMessage<Reconnect>(arena);
+ }
+ using ::PROTOBUF_NAMESPACE_ID::Message::CopyFrom;
+ void CopyFrom(const Reconnect& from);
+ using ::PROTOBUF_NAMESPACE_ID::Message::MergeFrom;
+ void MergeFrom(const Reconnect& from);
+ private:
+ static void MergeImpl(::PROTOBUF_NAMESPACE_ID::Message* to, const ::PROTOBUF_NAMESPACE_ID::Message& from);
+ public:
+ PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
+ bool IsInitialized() const final;
+
+ size_t ByteSizeLong() const final;
+ const char* _InternalParse(const char* ptr, ::PROTOBUF_NAMESPACE_ID::internal::ParseContext* ctx) final;
+ ::PROTOBUF_NAMESPACE_ID::uint8* _InternalSerialize(
+ ::PROTOBUF_NAMESPACE_ID::uint8* target, ::PROTOBUF_NAMESPACE_ID::io::EpsCopyOutputStream* stream) const final;
+ int GetCachedSize() const final { return _cached_size_.Get(); }
+
+ private:
+ void SharedCtor();
+ void SharedDtor();
+ void SetCachedSize(int size) const final;
+ void InternalSwap(Reconnect* other);
+ friend class ::PROTOBUF_NAMESPACE_ID::internal::AnyMetadata;
+ static ::PROTOBUF_NAMESPACE_ID::StringPiece FullMessageName() {
+ return "flwr.proto.Reconnect";
+ }
+ protected:
+ explicit Reconnect(::PROTOBUF_NAMESPACE_ID::Arena* arena,
+ bool is_message_owned = false);
+ private:
+ static void ArenaDtor(void* object);
+ inline void RegisterArenaDtor(::PROTOBUF_NAMESPACE_ID::Arena* arena);
+ public:
+
+ static const ClassData _class_data_;
+ const ::PROTOBUF_NAMESPACE_ID::Message::ClassData*GetClassData() const final;
+
+ ::PROTOBUF_NAMESPACE_ID::Metadata GetMetadata() const final;
+
+ // nested types ----------------------------------------------------
+
+ // accessors -------------------------------------------------------
+
+ enum : int {
+ kReconnectFieldNumber = 1,
+ };
+ // uint64 reconnect = 1;
+ void clear_reconnect();
+ ::PROTOBUF_NAMESPACE_ID::uint64 reconnect() const;
+ void set_reconnect(::PROTOBUF_NAMESPACE_ID::uint64 value);
+ private:
+ ::PROTOBUF_NAMESPACE_ID::uint64 _internal_reconnect() const;
+ void _internal_set_reconnect(::PROTOBUF_NAMESPACE_ID::uint64 value);
+ public:
+
+ // @@protoc_insertion_point(class_scope:flwr.proto.Reconnect)
+ private:
+ class _Internal;
+
+  template <typename T> friend class ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper;
+ typedef void InternalArenaConstructable_;
+ typedef void DestructorSkippable_;
+ ::PROTOBUF_NAMESPACE_ID::uint64 reconnect_;
+ mutable ::PROTOBUF_NAMESPACE_ID::internal::CachedSize _cached_size_;
+ friend struct ::TableStruct_flwr_2fproto_2ffleet_2eproto;
+};
+// ===================================================================
+
+
+// ===================================================================
+
+#ifdef __GNUC__
+ #pragma GCC diagnostic push
+ #pragma GCC diagnostic ignored "-Wstrict-aliasing"
+#endif // __GNUC__
+// CreateNodeRequest
+
+// -------------------------------------------------------------------
+
+// CreateNodeResponse
+
+// .flwr.proto.Node node = 1;
+inline bool CreateNodeResponse::_internal_has_node() const {
+ return this != internal_default_instance() && node_ != nullptr;
+}
+inline bool CreateNodeResponse::has_node() const {
+ return _internal_has_node();
+}
+inline const ::flwr::proto::Node& CreateNodeResponse::_internal_node() const {
+ const ::flwr::proto::Node* p = node_;
+  return p != nullptr ? *p : reinterpret_cast<const ::flwr::proto::Node&>(
+      ::flwr::proto::_Node_default_instance_);
+}
+inline const ::flwr::proto::Node& CreateNodeResponse::node() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.CreateNodeResponse.node)
+ return _internal_node();
+}
+inline void CreateNodeResponse::unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node) {
+ if (GetArenaForAllocation() == nullptr) {
+ delete reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ node_ = node;
+ if (node) {
+
+ } else {
+
+ }
+ // @@protoc_insertion_point(field_unsafe_arena_set_allocated:flwr.proto.CreateNodeResponse.node)
+}
+inline ::flwr::proto::Node* CreateNodeResponse::release_node() {
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+#ifdef PROTOBUF_FORCE_COPY_IN_RELEASE
+ auto* old = reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(temp);
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ if (GetArenaForAllocation() == nullptr) { delete old; }
+#else // PROTOBUF_FORCE_COPY_IN_RELEASE
+ if (GetArenaForAllocation() != nullptr) {
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ }
+#endif // !PROTOBUF_FORCE_COPY_IN_RELEASE
+ return temp;
+}
+inline ::flwr::proto::Node* CreateNodeResponse::unsafe_arena_release_node() {
+ // @@protoc_insertion_point(field_release:flwr.proto.CreateNodeResponse.node)
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+ return temp;
+}
+inline ::flwr::proto::Node* CreateNodeResponse::_internal_mutable_node() {
+
+ if (node_ == nullptr) {
+ auto* p = CreateMaybeMessage<::flwr::proto::Node>(GetArenaForAllocation());
+ node_ = p;
+ }
+ return node_;
+}
+inline ::flwr::proto::Node* CreateNodeResponse::mutable_node() {
+ ::flwr::proto::Node* _msg = _internal_mutable_node();
+ // @@protoc_insertion_point(field_mutable:flwr.proto.CreateNodeResponse.node)
+ return _msg;
+}
+inline void CreateNodeResponse::set_allocated_node(::flwr::proto::Node* node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* message_arena = GetArenaForAllocation();
+ if (message_arena == nullptr) {
+ delete reinterpret_cast< ::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ if (node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* submessage_arena =
+ ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper<
+ ::PROTOBUF_NAMESPACE_ID::MessageLite>::GetOwningArena(
+ reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node));
+ if (message_arena != submessage_arena) {
+ node = ::PROTOBUF_NAMESPACE_ID::internal::GetOwnedMessage(
+ message_arena, node, submessage_arena);
+ }
+
+ } else {
+
+ }
+ node_ = node;
+ // @@protoc_insertion_point(field_set_allocated:flwr.proto.CreateNodeResponse.node)
+}
+
+// -------------------------------------------------------------------
+
+// DeleteNodeRequest
+
+// .flwr.proto.Node node = 1;
+inline bool DeleteNodeRequest::_internal_has_node() const {
+ return this != internal_default_instance() && node_ != nullptr;
+}
+inline bool DeleteNodeRequest::has_node() const {
+ return _internal_has_node();
+}
+inline const ::flwr::proto::Node& DeleteNodeRequest::_internal_node() const {
+ const ::flwr::proto::Node* p = node_;
+  return p != nullptr ? *p : reinterpret_cast<const ::flwr::proto::Node&>(
+      ::flwr::proto::_Node_default_instance_);
+}
+inline const ::flwr::proto::Node& DeleteNodeRequest::node() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.DeleteNodeRequest.node)
+ return _internal_node();
+}
+inline void DeleteNodeRequest::unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node) {
+ if (GetArenaForAllocation() == nullptr) {
+ delete reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ node_ = node;
+ if (node) {
+
+ } else {
+
+ }
+ // @@protoc_insertion_point(field_unsafe_arena_set_allocated:flwr.proto.DeleteNodeRequest.node)
+}
+inline ::flwr::proto::Node* DeleteNodeRequest::release_node() {
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+#ifdef PROTOBUF_FORCE_COPY_IN_RELEASE
+ auto* old = reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(temp);
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ if (GetArenaForAllocation() == nullptr) { delete old; }
+#else // PROTOBUF_FORCE_COPY_IN_RELEASE
+ if (GetArenaForAllocation() != nullptr) {
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ }
+#endif // !PROTOBUF_FORCE_COPY_IN_RELEASE
+ return temp;
+}
+inline ::flwr::proto::Node* DeleteNodeRequest::unsafe_arena_release_node() {
+ // @@protoc_insertion_point(field_release:flwr.proto.DeleteNodeRequest.node)
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+ return temp;
+}
+inline ::flwr::proto::Node* DeleteNodeRequest::_internal_mutable_node() {
+
+ if (node_ == nullptr) {
+ auto* p = CreateMaybeMessage<::flwr::proto::Node>(GetArenaForAllocation());
+ node_ = p;
+ }
+ return node_;
+}
+inline ::flwr::proto::Node* DeleteNodeRequest::mutable_node() {
+ ::flwr::proto::Node* _msg = _internal_mutable_node();
+ // @@protoc_insertion_point(field_mutable:flwr.proto.DeleteNodeRequest.node)
+ return _msg;
+}
+inline void DeleteNodeRequest::set_allocated_node(::flwr::proto::Node* node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* message_arena = GetArenaForAllocation();
+ if (message_arena == nullptr) {
+ delete reinterpret_cast< ::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ if (node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* submessage_arena =
+ ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper<
+ ::PROTOBUF_NAMESPACE_ID::MessageLite>::GetOwningArena(
+ reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node));
+ if (message_arena != submessage_arena) {
+ node = ::PROTOBUF_NAMESPACE_ID::internal::GetOwnedMessage(
+ message_arena, node, submessage_arena);
+ }
+
+ } else {
+
+ }
+ node_ = node;
+ // @@protoc_insertion_point(field_set_allocated:flwr.proto.DeleteNodeRequest.node)
+}
+
+// -------------------------------------------------------------------
+
+// DeleteNodeResponse
+
+// -------------------------------------------------------------------
+
+// PullTaskInsRequest
+
+// .flwr.proto.Node node = 1;
+inline bool PullTaskInsRequest::_internal_has_node() const {
+ return this != internal_default_instance() && node_ != nullptr;
+}
+inline bool PullTaskInsRequest::has_node() const {
+ return _internal_has_node();
+}
+inline const ::flwr::proto::Node& PullTaskInsRequest::_internal_node() const {
+ const ::flwr::proto::Node* p = node_;
+  return p != nullptr ? *p : reinterpret_cast<const ::flwr::proto::Node&>(
+      ::flwr::proto::_Node_default_instance_);
+}
+inline const ::flwr::proto::Node& PullTaskInsRequest::node() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PullTaskInsRequest.node)
+ return _internal_node();
+}
+inline void PullTaskInsRequest::unsafe_arena_set_allocated_node(
+ ::flwr::proto::Node* node) {
+ if (GetArenaForAllocation() == nullptr) {
+ delete reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ node_ = node;
+ if (node) {
+
+ } else {
+
+ }
+ // @@protoc_insertion_point(field_unsafe_arena_set_allocated:flwr.proto.PullTaskInsRequest.node)
+}
+inline ::flwr::proto::Node* PullTaskInsRequest::release_node() {
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+#ifdef PROTOBUF_FORCE_COPY_IN_RELEASE
+ auto* old = reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(temp);
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ if (GetArenaForAllocation() == nullptr) { delete old; }
+#else // PROTOBUF_FORCE_COPY_IN_RELEASE
+ if (GetArenaForAllocation() != nullptr) {
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ }
+#endif // !PROTOBUF_FORCE_COPY_IN_RELEASE
+ return temp;
+}
+inline ::flwr::proto::Node* PullTaskInsRequest::unsafe_arena_release_node() {
+ // @@protoc_insertion_point(field_release:flwr.proto.PullTaskInsRequest.node)
+
+ ::flwr::proto::Node* temp = node_;
+ node_ = nullptr;
+ return temp;
+}
+inline ::flwr::proto::Node* PullTaskInsRequest::_internal_mutable_node() {
+
+ if (node_ == nullptr) {
+ auto* p = CreateMaybeMessage<::flwr::proto::Node>(GetArenaForAllocation());
+ node_ = p;
+ }
+ return node_;
+}
+inline ::flwr::proto::Node* PullTaskInsRequest::mutable_node() {
+ ::flwr::proto::Node* _msg = _internal_mutable_node();
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PullTaskInsRequest.node)
+ return _msg;
+}
+inline void PullTaskInsRequest::set_allocated_node(::flwr::proto::Node* node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* message_arena = GetArenaForAllocation();
+ if (message_arena == nullptr) {
+ delete reinterpret_cast< ::PROTOBUF_NAMESPACE_ID::MessageLite*>(node_);
+ }
+ if (node) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* submessage_arena =
+ ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper<
+ ::PROTOBUF_NAMESPACE_ID::MessageLite>::GetOwningArena(
+ reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(node));
+ if (message_arena != submessage_arena) {
+ node = ::PROTOBUF_NAMESPACE_ID::internal::GetOwnedMessage(
+ message_arena, node, submessage_arena);
+ }
+
+ } else {
+
+ }
+ node_ = node;
+ // @@protoc_insertion_point(field_set_allocated:flwr.proto.PullTaskInsRequest.node)
+}
+
+// repeated string task_ids = 2;
+inline int PullTaskInsRequest::_internal_task_ids_size() const {
+ return task_ids_.size();
+}
+inline int PullTaskInsRequest::task_ids_size() const {
+ return _internal_task_ids_size();
+}
+inline void PullTaskInsRequest::clear_task_ids() {
+ task_ids_.Clear();
+}
+inline std::string* PullTaskInsRequest::add_task_ids() {
+ std::string* _s = _internal_add_task_ids();
+ // @@protoc_insertion_point(field_add_mutable:flwr.proto.PullTaskInsRequest.task_ids)
+ return _s;
+}
+inline const std::string& PullTaskInsRequest::_internal_task_ids(int index) const {
+ return task_ids_.Get(index);
+}
+inline const std::string& PullTaskInsRequest::task_ids(int index) const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PullTaskInsRequest.task_ids)
+ return _internal_task_ids(index);
+}
+inline std::string* PullTaskInsRequest::mutable_task_ids(int index) {
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PullTaskInsRequest.task_ids)
+ return task_ids_.Mutable(index);
+}
+inline void PullTaskInsRequest::set_task_ids(int index, const std::string& value) {
+ task_ids_.Mutable(index)->assign(value);
+ // @@protoc_insertion_point(field_set:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::set_task_ids(int index, std::string&& value) {
+ task_ids_.Mutable(index)->assign(std::move(value));
+ // @@protoc_insertion_point(field_set:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::set_task_ids(int index, const char* value) {
+ GOOGLE_DCHECK(value != nullptr);
+ task_ids_.Mutable(index)->assign(value);
+ // @@protoc_insertion_point(field_set_char:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::set_task_ids(int index, const char* value, size_t size) {
+  task_ids_.Mutable(index)->assign(
+    reinterpret_cast<const char*>(value), size);
+ // @@protoc_insertion_point(field_set_pointer:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline std::string* PullTaskInsRequest::_internal_add_task_ids() {
+ return task_ids_.Add();
+}
+inline void PullTaskInsRequest::add_task_ids(const std::string& value) {
+ task_ids_.Add()->assign(value);
+ // @@protoc_insertion_point(field_add:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::add_task_ids(std::string&& value) {
+ task_ids_.Add(std::move(value));
+ // @@protoc_insertion_point(field_add:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::add_task_ids(const char* value) {
+ GOOGLE_DCHECK(value != nullptr);
+ task_ids_.Add()->assign(value);
+ // @@protoc_insertion_point(field_add_char:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline void PullTaskInsRequest::add_task_ids(const char* value, size_t size) {
+  task_ids_.Add()->assign(reinterpret_cast<const char*>(value), size);
+ // @@protoc_insertion_point(field_add_pointer:flwr.proto.PullTaskInsRequest.task_ids)
+}
+inline const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField<std::string>&
+PullTaskInsRequest::task_ids() const {
+ // @@protoc_insertion_point(field_list:flwr.proto.PullTaskInsRequest.task_ids)
+ return task_ids_;
+}
+inline ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField<std::string>*
+PullTaskInsRequest::mutable_task_ids() {
+ // @@protoc_insertion_point(field_mutable_list:flwr.proto.PullTaskInsRequest.task_ids)
+ return &task_ids_;
+}
+
+// -------------------------------------------------------------------
+
+// PullTaskInsResponse
+
+// .flwr.proto.Reconnect reconnect = 1;
+inline bool PullTaskInsResponse::_internal_has_reconnect() const {
+ return this != internal_default_instance() && reconnect_ != nullptr;
+}
+inline bool PullTaskInsResponse::has_reconnect() const {
+ return _internal_has_reconnect();
+}
+inline void PullTaskInsResponse::clear_reconnect() {
+ if (GetArenaForAllocation() == nullptr && reconnect_ != nullptr) {
+ delete reconnect_;
+ }
+ reconnect_ = nullptr;
+}
+inline const ::flwr::proto::Reconnect& PullTaskInsResponse::_internal_reconnect() const {
+ const ::flwr::proto::Reconnect* p = reconnect_;
+  return p != nullptr ? *p : reinterpret_cast<const ::flwr::proto::Reconnect&>(
+      ::flwr::proto::_Reconnect_default_instance_);
+}
+inline const ::flwr::proto::Reconnect& PullTaskInsResponse::reconnect() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PullTaskInsResponse.reconnect)
+ return _internal_reconnect();
+}
+inline void PullTaskInsResponse::unsafe_arena_set_allocated_reconnect(
+ ::flwr::proto::Reconnect* reconnect) {
+ if (GetArenaForAllocation() == nullptr) {
+ delete reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(reconnect_);
+ }
+ reconnect_ = reconnect;
+ if (reconnect) {
+
+ } else {
+
+ }
+ // @@protoc_insertion_point(field_unsafe_arena_set_allocated:flwr.proto.PullTaskInsResponse.reconnect)
+}
+inline ::flwr::proto::Reconnect* PullTaskInsResponse::release_reconnect() {
+
+ ::flwr::proto::Reconnect* temp = reconnect_;
+ reconnect_ = nullptr;
+#ifdef PROTOBUF_FORCE_COPY_IN_RELEASE
+ auto* old = reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(temp);
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ if (GetArenaForAllocation() == nullptr) { delete old; }
+#else // PROTOBUF_FORCE_COPY_IN_RELEASE
+ if (GetArenaForAllocation() != nullptr) {
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ }
+#endif // !PROTOBUF_FORCE_COPY_IN_RELEASE
+ return temp;
+}
+inline ::flwr::proto::Reconnect* PullTaskInsResponse::unsafe_arena_release_reconnect() {
+ // @@protoc_insertion_point(field_release:flwr.proto.PullTaskInsResponse.reconnect)
+
+ ::flwr::proto::Reconnect* temp = reconnect_;
+ reconnect_ = nullptr;
+ return temp;
+}
+inline ::flwr::proto::Reconnect* PullTaskInsResponse::_internal_mutable_reconnect() {
+
+ if (reconnect_ == nullptr) {
+ auto* p = CreateMaybeMessage<::flwr::proto::Reconnect>(GetArenaForAllocation());
+ reconnect_ = p;
+ }
+ return reconnect_;
+}
+inline ::flwr::proto::Reconnect* PullTaskInsResponse::mutable_reconnect() {
+ ::flwr::proto::Reconnect* _msg = _internal_mutable_reconnect();
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PullTaskInsResponse.reconnect)
+ return _msg;
+}
+inline void PullTaskInsResponse::set_allocated_reconnect(::flwr::proto::Reconnect* reconnect) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* message_arena = GetArenaForAllocation();
+ if (message_arena == nullptr) {
+ delete reconnect_;
+ }
+ if (reconnect) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* submessage_arena =
+ ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper<::flwr::proto::Reconnect>::GetOwningArena(reconnect);
+ if (message_arena != submessage_arena) {
+ reconnect = ::PROTOBUF_NAMESPACE_ID::internal::GetOwnedMessage(
+ message_arena, reconnect, submessage_arena);
+ }
+
+ } else {
+
+ }
+ reconnect_ = reconnect;
+ // @@protoc_insertion_point(field_set_allocated:flwr.proto.PullTaskInsResponse.reconnect)
+}
+
+// repeated .flwr.proto.TaskIns task_ins_list = 2;
+inline int PullTaskInsResponse::_internal_task_ins_list_size() const {
+ return task_ins_list_.size();
+}
+inline int PullTaskInsResponse::task_ins_list_size() const {
+ return _internal_task_ins_list_size();
+}
+inline ::flwr::proto::TaskIns* PullTaskInsResponse::mutable_task_ins_list(int index) {
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PullTaskInsResponse.task_ins_list)
+ return task_ins_list_.Mutable(index);
+}
+inline ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskIns >*
+PullTaskInsResponse::mutable_task_ins_list() {
+ // @@protoc_insertion_point(field_mutable_list:flwr.proto.PullTaskInsResponse.task_ins_list)
+ return &task_ins_list_;
+}
+inline const ::flwr::proto::TaskIns& PullTaskInsResponse::_internal_task_ins_list(int index) const {
+ return task_ins_list_.Get(index);
+}
+inline const ::flwr::proto::TaskIns& PullTaskInsResponse::task_ins_list(int index) const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PullTaskInsResponse.task_ins_list)
+ return _internal_task_ins_list(index);
+}
+inline ::flwr::proto::TaskIns* PullTaskInsResponse::_internal_add_task_ins_list() {
+ return task_ins_list_.Add();
+}
+inline ::flwr::proto::TaskIns* PullTaskInsResponse::add_task_ins_list() {
+ ::flwr::proto::TaskIns* _add = _internal_add_task_ins_list();
+ // @@protoc_insertion_point(field_add:flwr.proto.PullTaskInsResponse.task_ins_list)
+ return _add;
+}
+inline const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskIns >&
+PullTaskInsResponse::task_ins_list() const {
+ // @@protoc_insertion_point(field_list:flwr.proto.PullTaskInsResponse.task_ins_list)
+ return task_ins_list_;
+}
+
+// -------------------------------------------------------------------
+
+// PushTaskResRequest
+
+// repeated .flwr.proto.TaskRes task_res_list = 1;
+inline int PushTaskResRequest::_internal_task_res_list_size() const {
+ return task_res_list_.size();
+}
+inline int PushTaskResRequest::task_res_list_size() const {
+ return _internal_task_res_list_size();
+}
+inline ::flwr::proto::TaskRes* PushTaskResRequest::mutable_task_res_list(int index) {
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PushTaskResRequest.task_res_list)
+ return task_res_list_.Mutable(index);
+}
+inline ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskRes >*
+PushTaskResRequest::mutable_task_res_list() {
+ // @@protoc_insertion_point(field_mutable_list:flwr.proto.PushTaskResRequest.task_res_list)
+ return &task_res_list_;
+}
+inline const ::flwr::proto::TaskRes& PushTaskResRequest::_internal_task_res_list(int index) const {
+ return task_res_list_.Get(index);
+}
+inline const ::flwr::proto::TaskRes& PushTaskResRequest::task_res_list(int index) const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PushTaskResRequest.task_res_list)
+ return _internal_task_res_list(index);
+}
+inline ::flwr::proto::TaskRes* PushTaskResRequest::_internal_add_task_res_list() {
+ return task_res_list_.Add();
+}
+inline ::flwr::proto::TaskRes* PushTaskResRequest::add_task_res_list() {
+ ::flwr::proto::TaskRes* _add = _internal_add_task_res_list();
+ // @@protoc_insertion_point(field_add:flwr.proto.PushTaskResRequest.task_res_list)
+ return _add;
+}
+inline const ::PROTOBUF_NAMESPACE_ID::RepeatedPtrField< ::flwr::proto::TaskRes >&
+PushTaskResRequest::task_res_list() const {
+ // @@protoc_insertion_point(field_list:flwr.proto.PushTaskResRequest.task_res_list)
+ return task_res_list_;
+}
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// PushTaskResResponse
+
+// .flwr.proto.Reconnect reconnect = 1;
+inline bool PushTaskResResponse::_internal_has_reconnect() const {
+ return this != internal_default_instance() && reconnect_ != nullptr;
+}
+inline bool PushTaskResResponse::has_reconnect() const {
+ return _internal_has_reconnect();
+}
+inline void PushTaskResResponse::clear_reconnect() {
+ if (GetArenaForAllocation() == nullptr && reconnect_ != nullptr) {
+ delete reconnect_;
+ }
+ reconnect_ = nullptr;
+}
+inline const ::flwr::proto::Reconnect& PushTaskResResponse::_internal_reconnect() const {
+ const ::flwr::proto::Reconnect* p = reconnect_;
+  return p != nullptr ? *p : reinterpret_cast<const ::flwr::proto::Reconnect&>(
+      ::flwr::proto::_Reconnect_default_instance_);
+}
+inline const ::flwr::proto::Reconnect& PushTaskResResponse::reconnect() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.PushTaskResResponse.reconnect)
+ return _internal_reconnect();
+}
+inline void PushTaskResResponse::unsafe_arena_set_allocated_reconnect(
+ ::flwr::proto::Reconnect* reconnect) {
+ if (GetArenaForAllocation() == nullptr) {
+ delete reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(reconnect_);
+ }
+ reconnect_ = reconnect;
+ if (reconnect) {
+
+ } else {
+
+ }
+ // @@protoc_insertion_point(field_unsafe_arena_set_allocated:flwr.proto.PushTaskResResponse.reconnect)
+}
+inline ::flwr::proto::Reconnect* PushTaskResResponse::release_reconnect() {
+
+ ::flwr::proto::Reconnect* temp = reconnect_;
+ reconnect_ = nullptr;
+#ifdef PROTOBUF_FORCE_COPY_IN_RELEASE
+ auto* old = reinterpret_cast<::PROTOBUF_NAMESPACE_ID::MessageLite*>(temp);
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ if (GetArenaForAllocation() == nullptr) { delete old; }
+#else // PROTOBUF_FORCE_COPY_IN_RELEASE
+ if (GetArenaForAllocation() != nullptr) {
+ temp = ::PROTOBUF_NAMESPACE_ID::internal::DuplicateIfNonNull(temp);
+ }
+#endif // !PROTOBUF_FORCE_COPY_IN_RELEASE
+ return temp;
+}
+inline ::flwr::proto::Reconnect* PushTaskResResponse::unsafe_arena_release_reconnect() {
+ // @@protoc_insertion_point(field_release:flwr.proto.PushTaskResResponse.reconnect)
+
+ ::flwr::proto::Reconnect* temp = reconnect_;
+ reconnect_ = nullptr;
+ return temp;
+}
+inline ::flwr::proto::Reconnect* PushTaskResResponse::_internal_mutable_reconnect() {
+
+ if (reconnect_ == nullptr) {
+ auto* p = CreateMaybeMessage<::flwr::proto::Reconnect>(GetArenaForAllocation());
+ reconnect_ = p;
+ }
+ return reconnect_;
+}
+inline ::flwr::proto::Reconnect* PushTaskResResponse::mutable_reconnect() {
+ ::flwr::proto::Reconnect* _msg = _internal_mutable_reconnect();
+ // @@protoc_insertion_point(field_mutable:flwr.proto.PushTaskResResponse.reconnect)
+ return _msg;
+}
+inline void PushTaskResResponse::set_allocated_reconnect(::flwr::proto::Reconnect* reconnect) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* message_arena = GetArenaForAllocation();
+ if (message_arena == nullptr) {
+ delete reconnect_;
+ }
+ if (reconnect) {
+ ::PROTOBUF_NAMESPACE_ID::Arena* submessage_arena =
+ ::PROTOBUF_NAMESPACE_ID::Arena::InternalHelper<::flwr::proto::Reconnect>::GetOwningArena(reconnect);
+ if (message_arena != submessage_arena) {
+ reconnect = ::PROTOBUF_NAMESPACE_ID::internal::GetOwnedMessage(
+ message_arena, reconnect, submessage_arena);
+ }
+
+ } else {
+
+ }
+ reconnect_ = reconnect;
+ // @@protoc_insertion_point(field_set_allocated:flwr.proto.PushTaskResResponse.reconnect)
+}
+
+// map<string, uint32> results = 2;
+inline int PushTaskResResponse::_internal_results_size() const {
+ return results_.size();
+}
+inline int PushTaskResResponse::results_size() const {
+ return _internal_results_size();
+}
+inline void PushTaskResResponse::clear_results() {
+ results_.Clear();
+}
+inline const ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >&
+PushTaskResResponse::_internal_results() const {
+ return results_.GetMap();
+}
+inline const ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >&
+PushTaskResResponse::results() const {
+ // @@protoc_insertion_point(field_map:flwr.proto.PushTaskResResponse.results)
+ return _internal_results();
+}
+inline ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >*
+PushTaskResResponse::_internal_mutable_results() {
+ return results_.MutableMap();
+}
+inline ::PROTOBUF_NAMESPACE_ID::Map< std::string, ::PROTOBUF_NAMESPACE_ID::uint32 >*
+PushTaskResResponse::mutable_results() {
+ // @@protoc_insertion_point(field_mutable_map:flwr.proto.PushTaskResResponse.results)
+ return _internal_mutable_results();
+}
+
+// -------------------------------------------------------------------
+
+// Reconnect
+
+// uint64 reconnect = 1;
+inline void Reconnect::clear_reconnect() {
+ reconnect_ = uint64_t{0u};
+}
+inline ::PROTOBUF_NAMESPACE_ID::uint64 Reconnect::_internal_reconnect() const {
+ return reconnect_;
+}
+inline ::PROTOBUF_NAMESPACE_ID::uint64 Reconnect::reconnect() const {
+ // @@protoc_insertion_point(field_get:flwr.proto.Reconnect.reconnect)
+ return _internal_reconnect();
+}
+inline void Reconnect::_internal_set_reconnect(::PROTOBUF_NAMESPACE_ID::uint64 value) {
+
+ reconnect_ = value;
+}
+inline void Reconnect::set_reconnect(::PROTOBUF_NAMESPACE_ID::uint64 value) {
+ _internal_set_reconnect(value);
+ // @@protoc_insertion_point(field_set:flwr.proto.Reconnect.reconnect)
+}
+
+#ifdef __GNUC__
+ #pragma GCC diagnostic pop
+#endif // __GNUC__
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+// -------------------------------------------------------------------
+
+
+// @@protoc_insertion_point(namespace_scope)
+
+} // namespace proto
+} // namespace flwr
+
+// @@protoc_insertion_point(global_scope)
+
+#include <google/protobuf/port_undef.inc>
+#endif // GOOGLE_PROTOBUF_INCLUDED_GOOGLE_PROTOBUF_INCLUDED_flwr_2fproto_2ffleet_2eproto
diff --git a/src/cc/flwr/include/flwr/proto/node.grpc.pb.cc b/src/cc/flwr/include/flwr/proto/node.grpc.pb.cc
new file mode 100644
index 000000000000..9bb46c7e16ca
--- /dev/null
+++ b/src/cc/flwr/include/flwr/proto/node.grpc.pb.cc
@@ -0,0 +1,27 @@
+// Generated by the gRPC C++ plugin.
+// If you make any local change, they will be lost.
+// source: flwr/proto/node.proto
+
+#include "flwr/proto/node.pb.h"
+#include "flwr/proto/node.grpc.pb.h"
+
+#include <functional>
+#include <grpcpp/support/async_stream.h>
+#include <grpcpp/support/async_unary_call.h>