Type
Feature
Description
I would like to contribute a new example demonstrating diffusion model training in a federated learning environment using Flower.
This contribution aims to showcase how generative diffusion models can be trained collaboratively across multiple clients while preserving data privacy and improving robustness against adversarial behavior.
The project will be developed in two phases:
Phase 1: Federated Diffusion Model Example (MNIST)
Phase 2: Security and Robustness Enhancements
Planned Implementation
Phase 1: Federated Diffusion Model Example (MNIST)
Goal:
Implement a simple example of training a diffusion model (e.g., DDPM) using the MNIST dataset in a federated setup with Flower.
Key components:
- Implement a lightweight diffusion model (e.g., DDPM or a denoising autoencoder–based variant).
- Distribute the MNIST data across multiple simulated clients using Flower's NumPyClient or ClientApp.
- Train locally on each client and aggregate model parameters with FedAvg (see the sketch after this list).
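To make the Phase 1 plan more concrete, below is a minimal sketch of what the per-client training code could look like, assuming PyTorch and Flower's classic `NumPyClient` interface (the same logic can be wrapped in a `ClientApp` for the newer Flower packaging). `TinyDenoiser`, `ddpm_loss`, and `DiffusionClient` are hypothetical names introduced here for illustration, not existing parts of Flower or of any published example; the denoiser is kept deliberately small so the example trains quickly on MNIST.

```python
from collections import OrderedDict

import torch
import torch.nn as nn
import torch.nn.functional as F
import flwr as fl


class TinyDenoiser(nn.Module):
    """Small conv net that predicts the noise added to a 28x28 MNIST image."""

    def __init__(self, time_dim: int = 32):
        super().__init__()
        self.time_embed = nn.Sequential(
            nn.Linear(1, time_dim), nn.SiLU(), nn.Linear(time_dim, time_dim)
        )
        self.net = nn.Sequential(
            nn.Conv2d(1 + time_dim, 32, 3, padding=1), nn.SiLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.SiLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x, t):
        # Broadcast the timestep embedding to a per-pixel feature map.
        emb = self.time_embed(t.float().unsqueeze(-1))
        emb = emb[:, :, None, None].expand(-1, -1, x.shape[2], x.shape[3])
        return self.net(torch.cat([x, emb], dim=1))


def ddpm_loss(model, x0, alphas_bar, device):
    """Standard DDPM objective: predict the Gaussian noise used to corrupt x0."""
    b = x0.shape[0]
    t = torch.randint(0, len(alphas_bar), (b,), device=device)
    noise = torch.randn_like(x0)
    a_bar = alphas_bar[t].view(b, 1, 1, 1)
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise  # x0 assumed in [-1, 1]
    return F.mse_loss(model(x_t, t), noise)


class DiffusionClient(fl.client.NumPyClient):
    """Flower client that runs local DDPM training on its MNIST shard."""

    def __init__(self, model, trainloader, device, local_epochs=1):
        self.model, self.trainloader, self.device = model, trainloader, device
        self.local_epochs = local_epochs
        betas = torch.linspace(1e-4, 0.02, 1000, device=device)
        self.alphas_bar = torch.cumprod(1.0 - betas, dim=0)

    def get_parameters(self, config):
        return [v.cpu().numpy() for v in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        keys = self.model.state_dict().keys()
        self.model.load_state_dict(
            OrderedDict(zip(keys, [torch.tensor(p) for p in parameters]))
        )

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        opt = torch.optim.Adam(self.model.parameters(), lr=1e-3)
        self.model.train()
        for _ in range(self.local_epochs):
            for x0, _ in self.trainloader:
                opt.zero_grad()
                loss = ddpm_loss(self.model, x0.to(self.device), self.alphas_bar, self.device)
                loss.backward()
                opt.step()
        return self.get_parameters(config), len(self.trainloader.dataset), {}

    def evaluate(self, parameters, config):
        # Report the local denoising loss as the evaluation metric.
        self.set_parameters(parameters)
        self.model.eval()
        total, n = 0.0, 0
        with torch.no_grad():
            for x0, _ in self.trainloader:
                total += ddpm_loss(self.model, x0.to(self.device), self.alphas_bar, self.device).item() * len(x0)
                n += len(x0)
        return total / max(n, 1), n, {}
```

On the server side, the standard FedAvg strategy can aggregate the returned parameters without any diffusion-specific changes, which is the point of the example: the generative objective lives entirely in each client's local training loop.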
Phase 2: Security and Robustness Enhancements
Goal:
Extend the MNIST federated diffusion example by integrating privacy-preserving and robustness-enhancing mechanisms that defend against attacks during training (e.g., model poisoning) and at inference time (e.g., membership inference).
Planned additions:
- Integrate Differential Privacy (DP) via Opacus or Gaussian noise mechanisms (see the sketch after this list).
- Explore secure aggregation or lightweight encryption/masking of model updates.
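As one possible starting point for the DP item above, the sketch below shows the basic Gaussian-mechanism idea applied directly to a client's model update: the update is clipped to a fixed L2 norm and perturbed with Gaussian noise before being returned to the server. `privatize_update` is a hypothetical helper, not an existing Flower or Opacus API; in practice the same effect can also be obtained with Opacus' DP-SGD during local training, or with Flower's built-in DP strategies where the targeted Flower version provides them, which additionally handle the privacy accounting.

```python
import numpy as np


def privatize_update(new_params, old_params, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client's model delta to an L2 norm bound and add Gaussian noise.

    Hypothetical helper: this implements only the mechanism itself; the
    (epsilon, delta) accounting would have to be tracked separately,
    e.g. with Opacus' accountant.
    """
    rng = rng if rng is not None else np.random.default_rng()
    deltas = [np.asarray(n) - np.asarray(o) for n, o in zip(new_params, old_params)]
    total_norm = np.sqrt(sum(float(np.sum(d ** 2)) for d in deltas))
    scale = min(1.0, clip_norm / (total_norm + 1e-12))  # global L2 clipping
    noisy_deltas = [
        d * scale + rng.normal(0.0, noise_multiplier * clip_norm, size=d.shape)
        for d in deltas
    ]
    # Hand back full parameters so the server-side FedAvg stays unchanged.
    return [np.asarray(o) + d for o, d in zip(old_params, noisy_deltas)]
```

In the Phase 1 client sketch, this would be applied inside `fit` just before the parameters are returned, and secure aggregation or masking would then operate on the already-noised updates.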
Additional Context
No response