Commit
highlight the developed part in the architecture
Signed-off-by: Marchons <[email protected]>
Yoda-wu committed Sep 9, 2024
1 parent a43d90b commit 3c90bdf
Showing 13 changed files with 10 additions and 18 deletions.
@@ -101,17 +101,13 @@ Federated incremental learning is a special kind of federated learning. However

Sedna is an edge-cloud collaborative AI training framework that supports federated learning. Based on Sedna, we can build a simple federated learning architecture:

![image-20240909194418085](FCI_SSL_image/federated_learning_architecture)

It is clear that this paradigm needs to start the server and the clients in a single node and then perform federated learning. The whole process is shown below:

![image-20240909194832334](FCI_SSL_image/paradigm_process_2)

The Federated Learning Paradigm involves the process above: `the yellow block` marks the steps the paradigm executes itself, `the green block` marks the steps where the paradigm invokes the estimator, and `the pink block` marks the steps where the paradigm invokes the aggregation algorithm.
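
To make this division of work concrete, below is a minimal sketch of such a paradigm loop. It is illustrative only: the class and method names (`FederatedLearningParadigm`, `estimator.train`, `aggregation.aggregate`) are assumptions for this sketch rather than the actual ianvs interfaces.

```python
# Minimal sketch (not the actual ianvs implementation) of the loop described
# above: the paradigm drives the rounds, the estimator does local training,
# and the aggregation algorithm merges the client updates.
class FederatedLearningParadigm:
    def __init__(self, estimators, aggregation, rounds):
        self.estimators = estimators    # one estimator per simulated client
        self.aggregation = aggregation  # user-provided aggregation algorithm
        self.rounds = rounds

    def run(self, train_datasets, test_dataset):
        global_weights = None
        for _ in range(self.rounds):
            # green blocks: each client trains locally through its estimator
            client_weights = [
                estimator.train(dataset, global_weights)
                for estimator, dataset in zip(self.estimators, train_datasets)
            ]
            # pink block: merge the client updates into a global model
            global_weights = self.aggregation.aggregate(client_weights)
        # yellow blocks: the paradigm itself evaluates the aggregated model
        return self.estimators[0].evaluate(test_dataset, global_weights)
```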

Based on the above paradigm, we can construct our federated class-incremental learning paradigm. The timing diagram of the entire benchmarking system is presented below:

@@ -135,22 +131,16 @@ We will leverage the existing *TestEnvManager*, *TestCaseController* and *StoryManager*

The overall architecture is shown as follows:



![image-20240909194437318](FCI_SSL_image/architecture_design)

We design a novel algorithm paradigm, namely the Federated-Class-Incremental-Learning Paradigm. We specify the process of this paradigm as follows:

![image-20240909194458903](FCI_SSL_image/paradigm_process)

The Federated Class Incremental Learning Paradigm involves the process above: `the orange block` marks the steps the paradigm executes itself, `the green block` marks the steps where the paradigm invokes the estimator, and `the pink block` marks the steps where the paradigm invokes the aggregation algorithm.

To provide extensibility and convenience to users, we have specified a process in which most of the functionality can be replaced by user-implemented components (blocks in yellow). In addition, we require users to implement the server and client modules (blocks in green and orange) to complete the whole algorithm process.
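
As a rough illustration of how these user-implemented pieces plug into the paradigm, the sketch below extends the loop with class-incremental tasks. All names here (`run_fci_paradigm`, `client.train`, `server.aggregate`, `task_streams`) are hypothetical placeholders for the user-supplied client and server modules, not the real interfaces.

```python
# Hypothetical sketch of a federated class-incremental run (names are
# placeholders, not the real ianvs/Sedna interfaces). The run visits a
# sequence of tasks; within each task it performs several federated rounds.
# Users supply the client (local training on the newly arrived classes) and
# the server (aggregation plus any anti-forgetting strategy).
def run_fci_paradigm(clients, server, task_streams, rounds_per_task):
    global_weights = None
    num_tasks = len(task_streams[0])        # same number of tasks per client
    for task_id in range(num_tasks):
        for _ in range(rounds_per_task):
            updates = []
            for client, tasks in zip(clients, task_streams):
                # Client side: train on the current task only, starting from
                # the latest global model (old classes are not revisited).
                updates.append(client.train(tasks[task_id], global_weights))
            # Server side: aggregate the updates; a concrete FCI method would
            # also apply its forgetting-compensation logic here.
            global_weights = server.aggregate(updates)
        # Evaluate after each task on all classes seen so far to measure
        # catastrophic forgetting.
        yield task_id, global_weights
```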



### 3.3 Benchmarking Design

**Datasets Setting**
@@ -393,9 +383,9 @@ ianvs -f ./example/cifar100/federated_learning/fedavg/benchmarkingjob.yaml

#### 3.5.2 Federated Class Incremental Learning

Based on the architecture above, if users want to implement a federated class incremental learning algorithm, they can follow the instructions below:

example path: `example/cifar100/federeated_class_incremental_learning/glfc`

1. Define an `Estimator`; you can also choose to implement a helper function
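
A possible shape for such an `Estimator` is sketched below. The method names (`train`, `predict`, `get_weights`, `set_weights`) and the simple Keras model are assumptions made for this sketch; the exact interface should follow the example code in the repository.

```python
# Hedged sketch of a user-defined Estimator (names and signatures are
# assumptions, not the exact ianvs interface). It wraps a simple Keras model
# and exposes the hooks the paradigm needs: local training, prediction, and
# getting/setting weights so they can be aggregated.
import numpy as np
from tensorflow import keras


class Estimator:
    def __init__(self, num_classes=100):
        self.model = keras.Sequential([
            keras.layers.Input(shape=(32, 32, 3)),   # CIFAR-100 images
            keras.layers.Flatten(),
            keras.layers.Dense(256, activation="relu"),
            keras.layers.Dense(num_classes, activation="softmax"),
        ])
        self.model.compile(optimizer="sgd",
                           loss="sparse_categorical_crossentropy")

    def train(self, x, y, epochs=1):
        # Local training on this client's data for the current round/task.
        self.model.fit(x, y, epochs=epochs, verbose=0)
        return self.get_weights()

    def predict(self, x):
        return np.argmax(self.model.predict(x, verbose=0), axis=1)

    def get_weights(self):
        return self.model.get_weights()

    def set_weights(self, weights):
        self.model.set_weights(weights)
```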

Expand Down Expand Up @@ -499,6 +489,8 @@ ianvs -f ./example/cifar100/federeated_class_incremental_learning/glfc/benchmark

To match the implementation of federated learning in Sedna, if users want to implement a federated learning algorithm, they can follow the instructions below:

The difference between this example and example 3.5.1 is that in this version the client and server communicate through WebSocket, while in example 3.5.1 they communicate in memory.
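
The following sketch only illustrates this transport difference and is not the Sedna API: a client serializes its weights and exchanges them with the server over a WebSocket connection (using the third-party `websockets` package) instead of passing Python objects in memory.

```python
# Illustrative sketch of weight exchange over WebSocket (not the Sedna API).
# In example 3.5.1 the same exchange happens by passing in-memory objects.
import asyncio
import json

import websockets  # assumed dependency: pip install websockets


async def aggregation_server(websocket, path=None):
    # Receive serialized client weights; a real server would wait for all
    # clients and aggregate, here it simply echoes the weights back.
    weights = json.loads(await websocket.recv())
    await websocket.send(json.dumps(weights))


async def client_round(uri, local_weights):
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps(local_weights))   # upload local update
        return json.loads(await ws.recv())         # receive global weights


async def main():
    async with websockets.serve(aggregation_server, "localhost", 8765):
        global_weights = await client_round("ws://localhost:8765",
                                            {"layer_0": [0.1, 0.2, 0.3]})
        print(global_weights)


if __name__ == "__main__":
    asyncio.run(main())
```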

example path: `example/cifar100/sedna_federeated_learning/fedavg`

1. Define an `Estimator` as in `3.5.1 Federated Learning Example`
