- Federated Learning without Full Labels: A Survey, https://scholar.google.com/scholar?hl=ko&as_sdt=0%2C5&q=-+Federated+Learning+without+Full+Labels%3A+A+Survey%2C+&btnG=
- A Survey on Secure and Private Federated Learning Using Blockchain: Theory and Application in Resource-constrained Computing, https://arxiv.org/abs/2303.13727
- Privacy-Enhancing Technologies in Federated Learning for the Internet of Healthcare Things: A Survey, https://arxiv.org/abs/2303.14544
- FedIL: Federated Incremental Learning from Decentralized Unlabeled Data with Convergence Analysis, https://arxiv.org/abs/2302.11823
- A reliable and fair federated learning mechanism for mobile edge computing, https://www.sciencedirect.com/science/article/abs/pii/S1389128623001238
- FederatedTrust: A Solution for Trustworthy Federated Learning, https://scholar.google.co.kr/scholar?hl=ko&as_sdt=0%2C5&q=FederatedTrust%3A+A+Solution+for+Trustworthy+Federated+Learning&btnG=
Thus, this work analyzes the existing requirements for trustworthiness evaluation in FL and proposes a comprehensive taxonomy of six pillars (privacy, robustness, fairness, explainability, accountability, and federation) with notions and more than 30 metrics for computing the trustworthiness of FL models. Then, an algorithm called FederatedTrust has been designed according to the pillars and metrics identified in the previous taxonomy to compute the trustworthiness score of FL models. A prototype of FederatedTrust has been implemented and deployed into the learning process of FederatedScope, a well-known FL framework.
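The exact metrics and aggregation used by FederatedTrust are not reproduced in this note; the snippet below is only a minimal sketch of the pillar-based scoring idea, assuming each pillar is summarized by normalized metric values in [0, 1] and combined with user-defined weights (all names and weights here are hypothetical).

```python
# Minimal sketch of a pillar-based trustworthiness score (hypothetical,
# not the FederatedTrust implementation): each pillar averages its
# normalized metrics, and the overall score is a weighted sum of pillars.
from statistics import mean

PILLAR_WEIGHTS = {  # assumed equal weights; the paper may weight pillars differently
    "privacy": 1 / 6, "robustness": 1 / 6, "fairness": 1 / 6,
    "explainability": 1 / 6, "accountability": 1 / 6, "federation": 1 / 6,
}

def trustworthiness_score(metrics: dict[str, list[float]]) -> float:
    """metrics maps each pillar to its normalized metric values in [0, 1]."""
    pillar_scores = {p: mean(vals) for p, vals in metrics.items() if vals}
    return sum(PILLAR_WEIGHTS[p] * s for p, s in pillar_scores.items())

# Example: two dummy metrics per pillar.
example = {p: [0.8, 0.6] for p in PILLAR_WEIGHTS}
print(round(trustworthiness_score(example), 3))  # 0.7
```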
- Welfare and Fairness Dynamics in Federated Learning: A Client Selection Perspective, https://scholar.google.co.kr/scholar?hl=ko&as_sdt=0%2C5&q=Welfare+and+Fairness+Dynamics+in+Federated+Learning%3A+A+Client+Selection+Perspective&btnG=
To address this problem, we designed a novel incentive mechanism that involves a client selection process to remove low-quality clients and a money transfer process to ensure a fair reward distribution. Our experimental results strongly demonstrate that the proposed incentive mechanism can effectively improve the duration and fairness of the federation.
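The paper's concrete selection rule and payment scheme are not captured in this note; as a rough, hypothetical sketch of the two-stage idea (filter out low-quality clients, then split the reward budget in proportion to contribution):

```python
# Hypothetical sketch of a client-selection + reward-transfer step
# (illustrative only; not the paper's actual incentive mechanism).
def select_and_reward(contributions: dict[str, float],
                      quality_threshold: float,
                      budget: float) -> dict[str, float]:
    # Stage 1: drop clients whose measured quality/contribution is too low.
    selected = {c: q for c, q in contributions.items() if q >= quality_threshold}
    if not selected:
        return {}
    # Stage 2: split the reward budget proportionally to contribution,
    # so higher-quality clients receive a fairer share.
    total = sum(selected.values())
    return {c: budget * q / total for c, q in selected.items()}

rewards = select_and_reward({"a": 0.9, "b": 0.2, "c": 0.6},
                            quality_threshold=0.5, budget=100.0)
print(rewards)  # {'a': 60.0, 'c': 40.0}
```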
- BlockLearning, https://github.com/hacdias/blocklearning
- Blockchain-based Federated Learning: A Systematic Survey, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Blockchain-based+Federated+Learning%3A+A+Systematic+Survey&btnG=
- FLSys: Toward an Open Ecosystem for Federated Learning Mobile Apps, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=FLSys%3A+Toward+an+Open+Ecosystem+for+Federated+Learning+Mobile+App&btnG=
- Blockchain and homomorphic encryption based privacy-preserving model aggregation for medical images, https://www.sciencedirect.com/science/article/pii/S0895611122001094
- Client Selection in Federated Learning: Principles, Challenges, and Opportunities, https://scholar.google.com/scholar_url?url=https://arxiv.org/pdf/2211.01549&hl=en&sa=X&d=2115249361318653297&ei=HBtqY46zLs6vywSCpZqQAQ&scisig=AAGBfm2a2m565SgYEYGYi_adsYlgcaIqkA&oi=scholaralrt&hist=V4mg3H8AAAAJ:5696132008450199958:AAGBfm1cdWgR_IqsnmWMWDhOWv33uveUSg&html=&pos=1&folt=cit
- International Workshop on Federated Learning: Recent Advances and New Challenges in Conjunction with NeurIPS 2022 (FL-NeurIPS'22), https://federated-learning.org/fl-neurips-2022/
- ICASSP 2022 Special Session: Frontiers of Federated Learning: Applications, Challenges, and Opportunities, https://2022.ieeeicassp.org/view_session.php?SessionID=1297
- λ is the testing accuracy of the global model on the validation set after t rounds
- Blockchain-Based Decentralized Federated Learning, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Blockchain-Based+Decentralized+Federated+Learning&btnG=
We design our solution to abstract functionalities enabling the framework to be applicable to many problems. We also integrate the proposed system with smart contracts to ensure fairness and full transparency of the FL process. The full source code is available at https://github.com/a-dirir/DFL.
- Intelligent contracts: Making smart contracts smart for blockchain intelligence, https://www.sciencedirect.com/science/article/abs/pii/S0045790622006383
It is clear that SCs designed for blockchain-based AI tasks have different functionalities than conventional SCs designed for business logic, which may lead to other characteristics and require a separate study. To make a distinction, this paper names this special type of SCs designed for blockchain-based AI tasks as Intelligent Contracts (ICs). Although ICs show a bright future, their current attempts are independent and fragmented, lacking systematic analysis. As a result, it remains unclear how to construct ICs from SCs, what their characteristics are, and how their applications contribute to AI-driven blockchain intelligence.
- FLock: Defending Malicious Behaviors in Federated Learning with Blockchain, https://arxiv.org/abs/2211.04344, https://flock.io/
- Blockchain-empowered Federated Learning: Challenges, Solutions, and Future Directions, https://dl.acm.org/doi/abs/10.1145/3570953
• Decoupled model. Each node works in either federated learning or blockchain; no node works in both systems.
• Coupled model. All nodes work in both federated learning and blockchain.
• Overlapped model. A portion of the nodes work in both federated learning and blockchain, and node roles can adjust dynamically.
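A small, hypothetical sketch of how these three coupling models could be distinguished from the set of roles each node plays (names are illustrative, not from the survey):

```python
# Classify a deployment as decoupled / coupled / overlapped from node roles.
# Illustrative only; role names are assumptions, not from the survey.
from enum import Flag, auto

class Role(Flag):
    FL = auto()          # participates in federated learning
    BLOCKCHAIN = auto()  # participates in the blockchain

def coupling_model(node_roles: list[Role]) -> str:
    both = sum(1 for r in node_roles if r == (Role.FL | Role.BLOCKCHAIN))
    if both == 0:
        return "decoupled"   # no node serves both systems
    if both == len(node_roles):
        return "coupled"     # every node serves both systems
    return "overlapped"      # only a portion of the nodes serve both

print(coupling_model([Role.FL, Role.BLOCKCHAIN]))            # decoupled
print(coupling_model([Role.FL | Role.BLOCKCHAIN] * 3))       # coupled
print(coupling_model([Role.FL, Role.FL | Role.BLOCKCHAIN]))  # overlapped
```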
- Decentralized Federated Learning: A Comprehensive Survey and a New Blockchain-based Data Evaluation Scheme, https://ieeexplore.ieee.org/abstract/document/9922390
- Making Smart Contracts Predict and Scale, https://ieeexplore.ieee.org/abstract/document/9922480
- Smarter Contracts to Predict using Deep-Learning Algorithms, https://ieeexplore.ieee.org/abstract/document/9922240
https://towardsdatascience.com/3-ai-marketplaces-everyone-has-to-know-one-will-define-the-century-a4295d4f0229; SingularityNET uses a blockchain platform for storing models but does not support on-chain prediction. https://singularitynet.io/
- Blockchain based Decentralised Model Aggregation for Cross-Silo Federated Learning in Industry 4.0, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Blockchain+based+Decentralised+Model+Aggregation+for+Cross-Silo+Federated+Learning+in+Industry+4.0&btnG=
- Blockchain based Federated Learning for Object Detection, https://dl.acm.org/doi/abs/10.1145/3548608.3559272
- Novel smart homecare IoT system with Edge-AI and Blockchain, https://www.researchgate.net/publication/364122174_Novel_smart_homecare_IoT_system_with_Edge-AI_and_Blockchain
- https://scholar.google.com/citations?hl=ko&user=8RfIQ9cAAAAJ&view_op=list_works&sortby=pubdate
- Towards a Remote Monitoring of Patient Vital Signs Based on IoT-Based Blockchain Integrity Management Platforms in Smart Hospitals, https://www.mdpi.com/1424-8220/20/8/2195
- Improving blockchain performance in clinical trials using intelligent optimal transaction traffic control mechanism in smart healthcare applications, https://www.sciencedirect.com/science/article/abs/pii/S0360835222003801
- HealthBlock: A secure blockchain-based healthcare data management system, https://www.sciencedirect.com/science/article/abs/pii/S1389128621004382
- Optimizing Server-side Aggregation For Robust Federated Learning via Subspace Training, https://arxiv.org/abs/2211.05554
We consider a practical scenario that the service provider itself can collect a small amount of clean training data for the current learning task, which raises an emerging line of research in prior FL studies [5, 9, 45, 47, 62].
Unfortunately, it is impractical for the service provider to collect lots of on-server proxy data.
To be precise, each time after local training, SmartFL updates the global model to be the optimal convex combination of the received client models’ parameters by fitting the on-server proxy data.
This makes SmartFL enjoy a much lower demand for on-server proxy data, better generalization, and higher aggregation efficiency. With an optimized global model, SmartFL successfully boosts performance under either or both challenging conditions. We also establish theoretical guarantees on the convergence and generalization of SmartFL.
It is worth mentioning that our setup is practical, which assumes the service provider itself collects a small clean labelled proxy dataset (around a hundred samples by default).
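SmartFL's actual optimization is described in the paper; the snippet below is only a minimal PyTorch sketch of the core idea, assuming a toy linear model whose global parameters are a softmax-weighted convex combination of the received client parameters, with the combination weights fitted on a tiny synthetic proxy set (all tensors and sizes here are made up for illustration).

```python
# Sketch: fit convex-combination weights over client models on proxy data.
# Toy linear model, synthetic data; not the SmartFL implementation.
import torch

torch.manual_seed(0)
d, n_clients, n_proxy = 5, 4, 100
client_params = torch.randn(n_clients, d)          # one parameter vector per client
X = torch.randn(n_proxy, d)                        # small clean proxy set on the server
w_true = torch.randn(d)
y = X @ w_true + 0.01 * torch.randn(n_proxy)

alpha = torch.zeros(n_clients, requires_grad=True)  # logits of the convex weights
opt = torch.optim.Adam([alpha], lr=0.1)

for _ in range(200):
    weights = torch.softmax(alpha, dim=0)           # stays on the probability simplex
    w_global = weights @ client_params              # convex combination of client models
    loss = torch.mean((X @ w_global - y) ** 2)      # fit the on-server proxy data
    opt.zero_grad()
    loss.backward()
    opt.step()

print(torch.softmax(alpha, dim=0).detach())         # learned aggregation weights
```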
- FedCos: A Scene-adaptive Enhancement for Federated Learning, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=FedCos%3A+A+Scene-adaptive+Enhancement+for+Federated+Learning&btnG=
- Data-Centric Client Selection for Federated Learning over Distributed Edge Networks, https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Data-Centric+Client+Selection+for+Federated+Learning+over+Distributed+Edge+Networks&btnG=
- HARMONY: Heterogeneity-Aware Hierarchical Management for Federated Learning System, https://ieeexplore.ieee.org/abstract/document/9923843
- A Snapshot of the Frontiers of Client Selection in Federated Learning, https://arxiv.org/abs/2210.04607
- A Multi-agent Reinforcement Learning Approach for Efficient Client Selection in Federated Learning, https://arxiv.org/abs/2201.02932
- FusionFedBlock: Fusion of blockchain and federated learning to preserve privacy in industry 5.0, https://www.sciencedirect.com/science/article/pii/S1566253522001658?casa_token=Auk-KEz5MVwAAAAA:b22Xfdso55xBMPassPW_VF4cHOda2h2vuujBJ41JKkIcsAR0LmIoQduiLa7V-UvnKNtXAh6zptc
- Fed-CBS: A Heterogeneity-Aware Client Sampling Mechanism for Federated Learning via Class-Imbalance Reduction, https://arxiv.org/abs/2209.15245
- Game of Gradients: Mitigating Irrelevant Clients in Federated Learning, https://ojs.aaai.org/index.php/AAAI/article/view/17093
- Secure Shapley Value for Cross-Silo Federated Learning, https://arxiv.org/abs/2209.04856
- Empirical Measurement of Client Contribution for Federated Learning with Data Size Diversification, https://ieeexplore.ieee.org/document/9906094
- FedCCEA: A Practical Approach of Client Contribution Evaluation for Federated Learning, https://arxiv.org/abs/2106.02310
- A Survey on Heterogeneous Federated Learning, https://arxiv.org/abs/2210.04505
- A Study on Blockchain-Based Asynchronous Federated Learning Framework, https://koreascience.kr/article/CFKO202220859352236.pdf
- BAFL: A Blockchain-Based Asynchronous Federated Learning Framework, https://ieeexplore.ieee.org/document/9399813
- PersA-FL: Personalized Asynchronous Federated Learning, https://arxiv.org/abs/2210.01176
- Analysis and Evaluation of Synchronous and Asynchronous FLchain, https://arxiv.org/abs/2112.07938
- Federated and Meta learning over Non-Wireless and Wireless Networks: A Tutorial, https://arxiv.org/abs/2210.13111
- NVIDIA FLARE: Federated Learning from Simulation to Real-World, https://arxiv.org/abs/2210.13291
- Incentive Mechanisms in Federated Learning, https://journal-home.s3.ap-northeast-2.amazonaws.com/site/ictc2022/abs/B8-5.pdf
- Bong Jun Choi, https://scholar.google.com/citations?hl=en&user=IlNI3yoAAAAJ&view_op=list_works&sortby=pubdate
- Hierarchical Federated Learning based Anomaly Detection using Digital Twins for Smart Healthcare, https://arxiv.org/abs/2111.12241
- Detecting Anomalous User Behavior in Remote Patient Monitoring, https://arxiv.org/abs/2106.11844
- REFL: Resource-Efficient Federated Learning, https://mcanini.github.io/papers/refl.eurosys23.pdf
- Github: https://github.com/ahmedcs/REFL
- FeLebrities: a user-centric assessment of Federated Learning frameworks, https://www.techrxiv.org/articles/preprint/FeLebrities_a_user-centric_assessment_of_Federated_Learning_frameworks/21263013
- A Study of Blockchain-Based Federated Learning, https://link.springer.com/chapter/10.1007/978-3-031-11748-0_7
- Securing federated learning with blockchain: a systematic literature review, https://link.springer.com/article/10.1007/s10462-022-10271-9
- Federated Learning Design and Functional Models: Survey, https://www.researchsquare.com/article/rs-2101865/latest.pdf
- Meta Knowledge Condensation for Federated Learning, https://arxiv.org/abs/2209.14851
- FedTA: Teacher Assistant Knowledge Distillation in non-IID Federated Learning, https://jeremyzhang1.github.io/assets/CS242_Report.pdf
- Self-supervised RGB-NIR Fusion Video Vision Transformer Framework for rPPG Estimation, https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9931758
- Video-based Remote Physiological Measurement via Self-supervised Learning, https://arxiv.org/abs/2210.15401
- [ECCV 2022] Augmentation of rPPG Benchmark Datasets: Learning to Remove and Embed rPPG Signals via Double Cycle Consistent Learning from Unpaired Facial Videos, https://www.ecva.net/papers/eccv_2022/papers_ECCV/papers/136760351.pdf
- Blood Pressure Measurement: From Cuff-Based to Contactless Monitoring, https://www.mdpi.com/2227-9032/10/10/2113
- Accurate and Reliable Assessment of Heart Rate in Real-Life Clinical Settings Using an Imaging Photoplethysmography, https://www.mdpi.com/2077-0383/11/20/6101
- Remote Photoplethysmography Is an Accurate Method to Remotely Measure Respiratory Rate: A Hospital-Based Trial, https://www.mdpi.com/2077-0383/11/13/3647
- Innovative measurement of routine physiological variables (heart rate, respiratory rate and oxygen saturation) using a remote photoplethysmography imaging system: a prospective comparative trial protocol, https://bmjopen.bmj.com/content/11/8/e047896.abstract
- Innovative measurement of routine physiological variables (heart rate, respiratory rate and oxygen saturation) using a remote photoplethysmography imaging system: a prospective comparative trial protocol, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8365801/
- Deep Physiological Sensing Toolbox, https://arxiv.org/abs/2210.00716
- Real-Time Monitoring of User Stress, Heart Rate, and Heart Rate Variability on Mobile Devices, https://arxiv.org/pdf/2210.01791
- Adaptive-Weight Network for imaging photoplethysmography signal extraction and heart rate estimation, https://ieeexplore.ieee.org/abstract/document/9912354
- Contactless Blood Pressure Estimation System Using a Computer Vision System, https://www.mdpi.com/2411-5134/7/3/84/htm
- FedFOR: Stateless Heterogeneous Federated Learning with First-Order Regularization, https://arxiv.org/abs/2209.10537
We derive a first-order gradient regularization to penalize inconsistent local updates due to local data heterogeneity. Specifically, to mitigate weight divergence, we introduce a first-order approximation of the global data distribution into local objectives, which intuitively penalizes updates in the opposite direction of the global update. The end result is a stateless FL algorithm that achieves 1) significantly faster convergence (i.e., fewer communication rounds) and 2) higher overall converged performance than SOTA methods under non-IID data distribution. Importantly, our approach does not impose unrealistic limits on the client size, enabling learning from a large number of clients as is typical in most FL applications.
Our code will be released at https://github.com/GT-RIPL/FedFOR.
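The paper's exact objective should be taken from the reference above; as one illustrative reading of "penalizing updates in the opposite direction of the global update", a client could add a linear (first-order) penalty built from an estimate of the global gradient to its local loss. The PyTorch sketch below is hypothetical: `g_global`, `w_server`, and `mu` are assumed inputs, not names from FedFOR.

```python
# Hypothetical first-order regularized local update (not FedFOR's exact objective):
# the local loss is augmented with a linear term <g_global, w - w_server>, whose
# gradient pulls the client along the estimated global update direction.
import torch

def local_update(model, loader, g_global, w_server, lr=0.01, mu=0.1, epochs=1):
    """g_global / w_server: flat tensors provided by the server (assumed given)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            flat_w = torch.cat([p.reshape(-1) for p in model.parameters()])
            # First-order penalty: discourages drifting against the global update.
            penalty = mu * torch.dot(g_global, flat_w - w_server)
            loss = loss_fn(model(x), y) + penalty
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```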
Most FL algorithms can be categorized by which of these components, ServerOpt or ClientOpt, they innovate on.
- ServerOpt methods improve convergence by adding server-side momentum or adaptive optimizers. FedAvgM (Hsu, Qi, and Brown 2019) and SlowMo (Wang et al. 2019) simulate imbalanced data distributions, one form of data heterogeneity, and propose server-side momentum to improve convergence. FedAdagrad/FedYogi/FedAdam (Reddi et al. 2020) extend FedAvg by including several adaptive optimizers on the server side to combat data heterogeneity.
- ClientOpt methods add client-side regularization to reduce the effects of data heterogeneity. FedProx (Li et al. 2020) proposes to add a proximal term (L2 regularization) to limit the impact of non-IID data. FedCurv (Shoham et al. 2019) adapts a second-order gradient regularization method, EWC (Kirkpatrick et al. 2017), from continual learning to FL. Recent works, FedPD (Zhang et al. 2020) and FedDyn (Acar et al. 2021) improve convergence on non-IID data by including a first-order regularization term, which seeks consensus among clients. ClientOpt is investigated more heavily than ServerOpt because poor performance due to data heterogeneity is a direct result of local optimization. Furthermore, ClientOpt is more difficult to design because of the stateless requirement for FL when the number of clients is large. For example, FedPD and FedDyn are stateful algorithms and can be difficult to apply to large-scale, real-world FL settings.
In this paper, we focus on the ClientOpt component to improve convergence under non-IID data distribution and compare analytically and empirically to all ClientOpt methods mentioned in this section.
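To make the ServerOpt/ClientOpt split above concrete, here is a minimal NumPy sketch contrasting server-side momentum in the style of FedAvgM with a FedProx-style proximal term on the client; it is illustrative pseudocode, not the authors' implementations, and the function and parameter names are made up.

```python
# Illustrative contrast of ServerOpt vs. ClientOpt (not the original implementations).
import numpy as np

def server_momentum_step(w_global, client_models, velocity, beta=0.9):
    """FedAvgM-style ServerOpt: apply momentum to the averaged client update."""
    delta = w_global - np.mean(client_models, axis=0)   # aggregate pseudo-gradient
    velocity = beta * velocity + delta
    return w_global - velocity, velocity

def fedprox_gradient(local_grad, w_local, w_global, mu=0.01):
    """FedProx-style ClientOpt: add the gradient of (mu/2)||w - w_global||^2."""
    return local_grad + mu * (w_local - w_global)

# Toy usage with 3 clients and a 4-dimensional model.
w = np.zeros(4)
clients = [w - 0.1 * np.random.randn(4) for _ in range(3)]
w, v = server_momentum_step(w, clients, velocity=np.zeros(4))
```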