Refactor PR - Remove global variable and get_writer() function #1207
base: develop
Conversation
* Typo fix (Signed-off-by: Chaurasiya, Payal)
* Add memory logs test (Signed-off-by: Chaurasiya, Payal)
* Add memory test in e2e (Signed-off-by: Chaurasiya, Payal)
* Add memory test in e2e (Signed-off-by: Chaurasiya, Payal)
* Simplify setup (securefederatedai#1187) (Signed-off-by: Shah, Karan; Chaurasiya, Payal)
  * Simplify setup.py
  * Remove tensorboardX and never-used log_metric code
  * Test reducing requirements
  * Revert "Remove tensorboardX and never-used log_metric code" and add fn calls
  * Revert tb removal
  * Disable tensorboard logging for gramine CI test
* Fix for TF (2.13) cnn histology workspace 'Adam' object has no attribute 'weights' issue (securefederatedai#1194) (Signed-off-by: yes; Chaurasiya, Payal)
* fix(deps): requests min version set to 2.32.0 (securefederatedai#1198) (Signed-off-by: Pant, Akshay; Chaurasiya, Payal)
* Pin GaNDLF version to 0.1.1 (securefederatedai#1179) (Signed-off-by: Pant, Akshay; Chaurasiya, Payal)
  * fix(gandlf ci): pinned torchaudio version
  * fix(gandlf ci): install torch without cache
  * fix(gandlf ci): upgrade torch to 2.5.0
  * fix(gandlf ci): pin gandlf to 0.1.1
* Migrate `shell/*` to `scripts/*` (securefederatedai#1193) (Signed-off-by: Shah, Karan; Chaurasiya, Payal)
  * Update distribution scripts
  * Migrate shell/ to scripts/
  * Remove lint test from ubuntu CI
* Set timeouts for all CI workflows (securefederatedai#1200) (Signed-off-by: Shah, Karan; Chaurasiya, Payal)
  * Set timeouts for all CI workflows
  * forgot to add this too
* Review comments (Signed-off-by: Chaurasiya, Payal)
* Add details in file for further use (Signed-off-by: Chaurasiya, Payal)
* Fix for tf_3dunet_barts workspace (securefederatedai#1197) (Signed-off-by: yes; Chaurasiya, Payal)
* Update task_runner_e2e.yml (Signed-off-by: Chaurasiya, Payal)
* OpenFL roadmap update (securefederatedai#1196) (Signed-off-by: Teodor Parvanov; Chaurasiya, Payal)
  * OpenFL 1.7 roadmap update
  * Addressing review comments
* Update log verbosity (securefederatedai#1202) (Signed-off-by: Shah, Karan; Chaurasiya, Payal)
* Restore `openfl-tutorials` as installable package (securefederatedai#1203) (Signed-off-by: Shah, Karan; Chaurasiya, Payal)
  * Add openfl-tutorials as package
  * Add __init__.py
  * Add nbformat pkg
  * Try localhost
  * Revert "Try localhost" (reverts commit 44b8304)
  * Try python3.10
  * Try localhost
* Fix for ubuntu (Signed-off-by: Chaurasiya, Payal)
* Writing memory details in json (Signed-off-by: Chaurasiya, Payal)
* Update task_runner_e2e.yml (Signed-off-by: Chaurasiya, Payal)
* E501 fix (Signed-off-by: Chaurasiya, Payal)
* Lint changes (Signed-off-by: Chaurasiya, Payal)

Signed-off-by: Chaurasiya, Payal <[email protected]>
Signed-off-by: Shah, Karan <[email protected]>
Signed-off-by: yes <[email protected]>
Signed-off-by: Pant, Akshay <[email protected]>
Signed-off-by: Teodor Parvanov <[email protected]>
Co-authored-by: Karan Shah <[email protected]>
Co-authored-by: Shailesh Tanwar <[email protected]>
Co-authored-by: Akshay Pant <[email protected]>
Co-authored-by: teoparvanov <[email protected]>
Signed-off-by: Chaurasiya, Payal <[email protected]>
Remove the global variable `writer` and initialize `writer` directly in the `write_metric` function across multiple files. Each of the following files receives the same three changes: remove the global variable `writer`, remove the `get_writer` function, and initialize `writer` directly in the `write_metric` function.

* **AggregatorBasedWorkflow/101_torch_cnn_mnist/src/utils.py**
* **AggregatorBasedWorkflow/102_aggregator_validation/src/utils.py**
* **AggregatorBasedWorkflow/104_keras_mnist/src/utils.py**
* **AggregatorBasedWorkflow/301_torch_cnn_mnist_watermarking/src/utils.py**
* **AggregatorBasedWorkflow/501_pytorch_tinyimagenet_transfer_learning/src/utils.py**
* **AggregatorBasedWorkflow/vertical_fl_two_party/src/utils.py**
* **torch_cnn_mnist/src/utils.py**
* **torch_cnn_mnist_eden_compression/src/mnist_utils.py**
* **torch_cnn_mnist_fed_eval/src/mnist_utils.py**
* **torch_cnn_mnist_straggler_check/src/mnist_utils.py**
* **torch_llm_horovod/src/emotion_utils.py**

A minimal sketch of the before/after pattern follows the file list.

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/payalcha/openfl?shareId=XXXX-XXXX-XXXX-XXXX).

Signed-off-by: Chaurasiya, Payal <[email protected]>
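For context, here is a minimal sketch of the pattern this PR applies to each of the utils.py files above. It assumes tensorboardX; the log directory and docstrings are illustrative and not taken from the actual diff.

from tensorboardX import SummaryWriter

# Before (pattern being removed): a module-level writer plus a get_writer()
# helper that lazily creates it.
#
#   writer = None
#
#   def get_writer():
#       global writer
#       if not writer:
#           writer = SummaryWriter("./logs/tensorboard", flush_secs=5)
#
#   def write_metric(node_name, task_name, metric_name, metric, round_number):
#       get_writer()
#       writer.add_scalar(f"{node_name}/{task_name}/{metric_name}", metric, round_number)


# After (the change described in this PR): no module-level state; the writer is
# created inside write_metric itself. The log directory here is illustrative.
def write_metric(node_name, task_name, metric_name, metric, round_number):
    """Write metric callback."""
    writer = SummaryWriter("./logs/tensorboard", flush_secs=5)
    writer.add_scalar(f"{node_name}/{task_name}/{metric_name}", metric, round_number)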
Signed-off-by: Chaurasiya, Payal <[email protected]>
Remove `log_metric_callback` and its usage from the `Aggregator` class in `openfl/component/aggregator/aggregator.py`.

* Remove the import of `write_metric`.
* Remove the `log_metric_callback` attribute from the `Aggregator` class.
* Remove the initialization and usage of `log_metric_callback` in the `Aggregator` class constructor.
* Remove the conditional logging logic that uses `log_metric_callback` in the `Aggregator` class methods.

Signed-off-by: Chaurasiya, Payal <[email protected]>
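Purely as a labeled-hypothetical illustration of the kind of optional metric-callback plumbing being removed here (this is not the actual OpenFL `Aggregator` code; the class shape, attribute name, and method signature are assumptions):

# Hypothetical sketch only; not the real openfl/component/aggregator/aggregator.py.
class Aggregator:
    def __init__(self, log_metric_callback=None):
        # Optional user-supplied callback for metric logging; the PR removes
        # this attribute and the branching around it.
        self.log_metric_callback = log_metric_callback

    def _report_metric(self, node_name, task_name, metric_name, metric, round_number):
        # Conditional logging logic of the kind described above.
        if self.log_metric_callback is not None:
            self.log_metric_callback(node_name, task_name, metric_name, metric, round_number)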
Signed-off-by: Chaurasiya, Payal <[email protected]>
Signed-off-by: Chaurasiya, Payal <[email protected]>
…function

Remove the `get_writer` function and initialize the `SummaryWriter` directly in the `write_metric` function in multiple files.

In the following files, remove the direct initialization of `SummaryWriter`, replace it with a `get_writer` function, and use `get_writer` to initialize the writer in the `write_metric` function:

* **openfl-tutorials/deprecated/native_api/Federated_Pytorch_MNIST_Tutorial.ipynb**
* **openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/102_aggregator_validation/src/utils.py**
* **openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/104_keras_mnist/src/utils.py**
* **openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/301_torch_cnn_mnist_watermarking/src/utils.py**
* **openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/501_pytorch_tinyimagenet_transfer_learning/src/utils.py**

In the following files, remove the `get_writer` function and initialize `SummaryWriter` directly in the `write_metric` function:

* **openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/vertical_fl/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_datastore_cli/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_include_exclude/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_internalloop/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_private_attributes_initialization_with_both_options/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_private_attributes_initialization_without_callable/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_private_attributes/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_reference_with_include_exclude/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_reference/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_subset_of_collaborators/src/utils.py**
* **tests/github/experimental/workflow/AggregatorBasedWorkflow/testcase_validate_particpant_names/src/utils.py**

Signed-off-by: Chaurasiya, Payal <[email protected]>
…function

Remove the `get_writer` function and initialize the writer variable directly in the calling function.

* Remove the `get_writer` function from `openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/102_aggregator_validation/src/utils.py` and initialize the writer variable directly in the `write_metric` function.
* Remove the `get_writer` function from `openfl-workspace/experimental/workflow/AggregatorBasedWorkflow/104_keras_mnist/src/utils.py` and initialize the writer variable directly in the `write_metric` function.
* Remove the `get_writer` function from `openfl-tutorials/deprecated/native_api/Federated_Pytorch_MNIST_Tutorial.ipynb` and initialize the writer variable directly in the `write_metric` function.

Signed-off-by: Chaurasiya, Payal <[email protected]>
8e7a9c3 to b95b63e
Signed-off-by: Chaurasiya, Payal <[email protected]>
openfl-tutorials/deprecated/native_api/Federated_Pytorch_MNIST_Tutorial.ipynb
def write_metric(node_name, task_name, metric_name, metric, round_number):
    """Write metric callback."""
    get_writer()
    writer = SummaryWriter('./logs/cnn_mnist', flush_secs=5)
    writer.add_scalar(f'{node_name}/{task_name}/{metric_name}', metric, round_number)
For all taskrunner example updates in this PR: this function will no longer be required, since a TensorBoard callback will internally maintain a writer object and the necessary functions to write scalars. In summary, this (including imports) can be removed too.
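For illustration only (this is not OpenFL's actual callback API; the class name, constructor arguments, and method names below are assumptions), a callback-style logger that owns a single writer might look like this:

from tensorboardX import SummaryWriter


class TensorBoardMetricLogger:
    """Illustrative sketch: one long-lived SummaryWriter owned by a
    callback-style object, instead of a new writer (and a new file handle)
    per metric write."""

    def __init__(self, log_dir="./logs/tensorboard", flush_secs=5):
        # Hypothetical arguments; the real callback may be configured differently.
        self.writer = SummaryWriter(log_dir, flush_secs=flush_secs)

    def log_metric(self, node_name, task_name, metric_name, metric, round_number):
        # Same tag layout as the write_metric helpers being removed in this PR.
        self.writer.add_scalar(f"{node_name}/{task_name}/{metric_name}", metric, round_number)

    def close(self):
        self.writer.close()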
from tensorboardX import SummaryWriter

writer = None


def get_writer():
    """Create global writer object.

    This function creates a global `SummaryWriter` object for logging to
    TensorBoard.
    """
    global writer
    if not writer:
        writer = SummaryWriter("./logs/tensorboard", flush_secs=5)


def write_metric(node_name, task_name, metric_name, metric, round_number):
    """Write metric callback.

    This function logs a metric to TensorBoard.

    Args:
        node_name (str): The name of the node.
        task_name (str): The name of the task.
        metric_name (str): The name of the metric.
        metric (float): The value of the metric.
        round_number (int): The current round number.
    """
    get_writer()
    writer = SummaryWriter("./logs/tensorboard", flush_secs=5)
    writer.add_scalar(f"{node_name}/{task_name}/{metric_name}", metric, round_number)
All this can be removed as well.

P.S.: Creating a writer inside write_metric will create a new file handle on each call, which is slow and wasteful. What I meant was to create a function that builds a writer once, closes over it in an inner function, and returns that inner function. Example:
from tensorboardX import SummaryWriter


def get_metric_writer(log_dir: str = "./logs"):
    writer = SummaryWriter(log_dir)

    def _write_scalar(node_name, task_name, metric_name, metric, round_number):
        writer.add_scalar(f"{node_name}/{task_name}/{metric_name}", metric, round_number)

    return _write_scalar
Anyway, this also needs to be removed, because it will be taken care of in the TensorBoard callback.
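If the closure above were kept, a usage sketch might look like this (the log directory, tags, and metric values are placeholders, not taken from the PR):

# Usage sketch for the closure-based writer above; values are placeholders.
write_metric = get_metric_writer("./logs/cnn_mnist")

# Each call reuses the single SummaryWriter captured by the closure, so no new
# file handle is opened per metric.
write_metric("aggregator", "train", "loss", 0.42, round_number=1)
write_metric("aggregator", "train", "accuracy", 0.91, round_number=1)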
Signed-off-by: Chaurasiya, Payal <[email protected]>
This PR is just a refactor of the code in the different utils.py files that use SummaryWriter.