
Feature/SK-732 | Introduces FedYogi and FedAdaGrad aggregators #580

Merged: 32 commits merged into master on May 2, 2024

Conversation

ahellander (Member)

Description

Introduces new features related to model aggregation:

  1. Two new aggregation algorithms, FedYogi and FedAdaGrad (SK-738).
  2. Hyperparameters can be passed to aggregators via a new parameter to "start_session", aggregator_kwargs.
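The shape of such an aggregator_kwargs dictionary can be sketched as below. The parameter names (serveropt, learning_rate, beta1, beta2, tau) follow the FedOpt family of server optimizers, but the exact keys FEDn accepts are an assumption here, not confirmed by this thread:

```python
# Hypothetical hyperparameter set for the FedOpt family of aggregators.
# Key names are illustrative; consult the FEDn docs for the accepted schema.
aggregator_kwargs = {
    "serveropt": "adagrad",   # one of "adam", "yogi", "adagrad"
    "learning_rate": 1e-2,    # server-side learning rate
    "beta1": 0.9,             # first-moment decay (Adam/Yogi)
    "beta2": 0.99,            # second-moment decay (Adam/Yogi)
    "tau": 1e-4,              # adaptivity constant
}

# The kwargs would then be forwarded when starting a session, e.g.:
# client.start_session(aggregator="fedopt", aggregator_kwargs=aggregator_kwargs)
```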

@ahellander ahellander requested a review from niklastheman April 18, 2024 21:47
@ahellander ahellander removed the HOLD label Apr 19, 2024
@Wrede Wrede changed the title Feature/sk 732 Feature/SK-732 | Introduces FedYogi and FedAdaGrad aggregators Apr 22, 2024
docs/aggregators.rst (resolved)
docs/aggregators.rst (outdated, resolved)
fedn/fedn/network/combiner/aggregators/fedopt.py (outdated, resolved)
@@ -102,36 +114,128 @@ def combine_models(self, helper=None, delete_models=True):
"AGGREGATOR({}): Error encountered while processing model update {}, skipping this update.".format(self.name, e))
self.model_updates.task_done()

model = self.serveropt_adam(helper, pseudo_gradient, model_old)
if params['serveropt'] == 'adam':
Member

I would also catch KeyError if something unexpected changes, like default_params
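The suggestion above (guarding the serveropt dispatch against a missing or unexpected key) can be sketched as follows. The function and mapping names are illustrative, not the actual FEDn internals:

```python
def combine(params, serveropts):
    """Dispatch to a server-side optimizer, guarding against missing keys.

    `serveropts` maps optimizer names to callables. Catching KeyError covers
    both a missing "serveropt" entry in params and an unsupported optimizer
    name, as the review comment suggests.
    """
    try:
        opt = serveropts[params["serveropt"]]
    except KeyError as e:
        return None, "Unsupported or missing server optimizer: {}".format(e)
    return opt(), None

# Demo stand-ins for the real serveropt_* methods:
demo = {"adam": lambda: "adam-model", "yogi": lambda: "yogi-model"}
model, err = combine({"serveropt": "sgd"}, demo)  # unsupported -> (None, message)
```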

fedn/fedn/network/combiner/aggregators/fedopt.py (outdated, resolved)
@ahellander ahellander requested a review from Wrede April 29, 2024 13:41
@Wrede Wrede left a comment (Member)

minor changes requested

see compute package). The base class implements a default callback that checks that all metadata assumed by the aggregation algorithms FedAvg and FedOpt
is present in the metadata. However, the callback could also be used to implement custom preprocessing and additional checks including strategies
The ``on_model_update`` callback recieves the model update messages from clients (including all metadata) and can be used to perform validation and
potential transformation of the model update before it is places on the aggregation queue (see image above).
Member

typo, "placed"

The ``on_model_update`` callback recieves the model update messages from clients (including all metadata) and can be used to perform validation and
potential transformation of the model update before it is places on the aggregation queue (see image above).
The base class implements a default callback that checks that all metadata assumed by the aggregation algorithms FedAvg and FedOpt
is available. The callback could also be used to implement custom pre-processing and additional checks including strategies
Member

new line should not be here?
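The ``on_model_update`` callback contract described in the quoted documentation (validate a client update, optionally transform it, then place it on the aggregation queue) can be sketched as below. This is a schematic illustration, not the actual AggregatorBase code; the class, queue attribute, and the "num_examples" metadata key are assumptions:

```python
class MyAggregator:
    """Hypothetical aggregator illustrating the on_model_update contract."""

    def __init__(self):
        self.queue = []  # stands in for the aggregation queue

    def on_model_update(self, model_update):
        """Validate a client model update before it is placed on the queue."""
        required = ("num_examples",)  # metadata assumed needed by FedAvg/FedOpt
        meta = model_update.get("meta", {})
        if any(key not in meta for key in required):
            return False  # reject: do not place on the aggregation queue
        self.queue.append(model_update)  # "placed on the aggregation queue"
        return True
```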

docs/aggregators.rst (resolved)
:param params: Additional key-word arguments.
:type params: dict
:param parameters: Aggregator hyperparameters.
:type parameters: `fedn.utils.parmeters.Parameters`, optional
Member

Use :class: followed by a reference to the class.

@@ -10,8 +10,12 @@ class Aggregator(AggregatorBase):

Member

Class name should be something more related to FedOpt, like FedOptAggregator?

Member Author

No, this is how the plugin architecture is implemented, it has to be called Aggregator.

if key not in params:
params[key] = value
# Define parameter schema
parameter_schema = {
Member

I would have this a required attribute of AggregatorBase, even if it can be empty

Member Author

Let's think about this for a bit longer. I think we should discuss this whole Parameters concept in more depth and make sure we have something that we want to use across the whole code-base.
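The parameter_schema idea under discussion can be sketched as a simple name-to-type mapping with a validation helper. Both the schema keys and the validate function are illustrative assumptions, not FEDn code:

```python
# Hypothetical schema for FedOpt hyperparameters: maps parameter names to
# the type each value must have. An aggregator with no hyperparameters
# could declare an empty schema, matching the "even if it can be empty"
# suggestion above.
parameter_schema = {
    "serveropt": str,
    "learning_rate": float,
    "beta1": float,
    "beta2": float,
    "tau": float,
}

def validate(params, schema):
    """Return a list of schema violations (an empty list means valid)."""
    errors = []
    for key, value in params.items():
        if key not in schema:
            errors.append("unknown parameter: {}".format(key))
        elif not isinstance(value, schema[key]):
            errors.append("{} should be {}".format(key, schema[key].__name__))
    return errors
```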

elif parameters['serveropt'] == 'yogi':
model = self.serveropt_yogi(helper, pseudo_gradient, model_old, parameters)
elif parameters['serveropt'] == 'adagrad':
model = self.serveropt_adagrad(helper, pseudo_gradient, model_old, parameters)
else:
logger.error("Unsupported server optimizer passed to FedOpt.")
return
Member

should it not return None, data?

Member Author

Yes, good catch.
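The fix agreed on here is that the unsupported-optimizer branch should return the same (model, data) pair as the other exits, rather than a bare return. A minimal sketch of the corrected control flow, with illustrative names rather than the FEDn internals:

```python
def combine_models_tail(serveropt, data):
    """Sketch of the corrected dispatch tail in combine_models."""
    if serveropt == "adam":
        model = "adam-model"
    elif serveropt == "yogi":
        model = "yogi-model"
    elif serveropt == "adagrad":
        model = "adagrad-model"
    else:
        # A bare `return` here would yield None and break tuple unpacking
        # at the call site (`model, data = ...`), hence: return None, data
        return None, data
    return model, data
```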

@Wrede (Member)

Wrede commented Apr 30, 2024

I have activated the docs for this branch: https://fedn.readthedocs.io/en/feature-sk-732/aggregators.html

@ahellander ahellander requested a review from Wrede April 30, 2024 12:57
@Wrede Wrede merged commit 3fe22d9 into master May 2, 2024
11 checks passed
@Wrede Wrede deleted the feature/SK-732 branch May 2, 2024 08:45