
Commit

charlesbvll committed Nov 23, 2023
1 parent 54d8a68 commit 6c0ab4a
Showing 1 changed file with 9 additions and 0 deletions.
9 changes: 9 additions & 0 deletions examples/vertical-fl/README.md
@@ -393,6 +393,15 @@ guides each participant's model to adjust and improve, achieving optimization
not just based on its own data but also leveraging insights from the entire
network's data.

We do not need to return model parameters here because, in VFL, each model is
updated locally. However, the server still has to send the gradients back to
all clients so that they can continue backpropagation and update their local
models. In Flower, the parameters returned by `aggregate_fit` are stored by
the server and passed to `Client.evaluate` via `configure_evaluate`. We take
advantage of this by returning our gradients from `aggregate_fit`, so that
they reach `Client.evaluate` as `parameters`. That is also why we can read the
gradients from the `parameters` argument in `Client.evaluate` (see next
section).
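
To make the mechanics concrete, here is a minimal, self-contained sketch of
this round trip. The names `GradientReturningStrategy`,
`GradientReceivingClient`, and `_compute_gradients` are placeholders for
illustration, not the example's actual code; the dummy gradient computation
only stands in for the server-side forward/backward pass.

```python
from typing import Dict, List, Optional, Tuple

import numpy as np
from flwr.client import NumPyClient
from flwr.common import (
    FitRes,
    Parameters,
    Scalar,
    ndarrays_to_parameters,
    parameters_to_ndarrays,
)
from flwr.server.client_proxy import ClientProxy
from flwr.server.strategy import FedAvg


class GradientReturningStrategy(FedAvg):
    """Illustrative strategy that returns gradients instead of parameters."""

    def aggregate_fit(
        self,
        server_round: int,
        results: List[Tuple[ClientProxy, FitRes]],
        failures,
    ) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
        # Each client's `fit` returned its local embeddings packed as "parameters"
        embeddings = [parameters_to_ndarrays(res.parameters) for _, res in results]

        # Placeholder for the server-side forward/backward pass: produce one
        # gradient array per client
        gradients: List[np.ndarray] = self._compute_gradients(embeddings)

        # Returning the gradients here means Flower stores them in place of
        # the global parameters and delivers them to each client's `evaluate`
        # via `configure_evaluate`
        return ndarrays_to_parameters(gradients), {}

    def _compute_gradients(self, embeddings):
        # Dummy stand-in: one zero gradient per client, shaped like its embeddings
        return [np.zeros_like(emb[0]) for emb in embeddings]


class GradientReceivingClient(NumPyClient):
    def evaluate(self, parameters, config):
        # `parameters` now carries the gradients returned by `aggregate_fit`
        gradients = parameters
        # ... continue backpropagation through the local model with `gradients` ...
        return 0.0, 1, {}
```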

The last thing we need to do is redefine the `aggregate_evaluate` function to
disable distributed evaluation, as the clients do not hold any labels to test
their local models against.
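
As a rough sketch of what that override might look like (the example's own
implementation may differ), `aggregate_evaluate` can simply report that there
is nothing to aggregate:

```python
from typing import Dict, Optional, Tuple

from flwr.common import Scalar
from flwr.server.strategy import FedAvg


class NoDistributedEvalStrategy(FedAvg):  # placeholder name for this sketch
    def aggregate_evaluate(
        self, server_round, results, failures
    ) -> Tuple[Optional[float], Dict[str, Scalar]]:
        # Clients hold no labels, so there is no loss or metric to aggregate
        return None, {}
```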
