As concerns about data privacy continue to grow, Federated Learning (FL) offers an effective solution by enabling collaborative machine learning without centralizing data, thereby helping to protect data confidentiality and security. However, FL faces challenges of its own, such as managing complex data dependencies in a distributed environment, which calls for more efficient model designs. The Mamba architecture, a hardware-aware selective state-space model for sequence modeling, captures long-range dependencies efficiently and offers a new perspective on these issues. This study integrates the Mamba architecture into the FL framework to improve its performance and scalability in privacy-preserving machine learning. Experiments on image classification tasks using the MNIST and CIFAR-10 datasets show that, even in a decentralized setting, the Mamba-based FL model achieves higher accuracy and more efficient training. The findings indicate that Mamba can strengthen the effectiveness of FL, making it a viable option for complex, distributed, and privacy-sensitive applications.
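The record itself carries no code, so the following is only a minimal sketch of how a Mamba-style sequence classifier could be trained under standard federated averaging (FedAvg), assuming PyTorch. The names StandInSequenceModel, local_update, and fed_avg are hypothetical, and a GRU layer merely stands in for the Mamba block, whose selective state-space layer is beyond the scope of a sketch.

import copy
import torch
import torch.nn as nn

class StandInSequenceModel(nn.Module):
    # Placeholder for the Mamba-based classifier; the GRU stands in for the
    # selective state-space block (hypothetical architecture, not the paper's).
    def __init__(self, input_dim=28, hidden_dim=64, num_classes=10):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):           # x: (batch, seq_len, input_dim), e.g. MNIST rows
        _, h = self.rnn(x)          # final hidden state summarizes the sequence
        return self.head(h[-1])

def local_update(global_state, loader, epochs=1, lr=1e-3):
    # One client's training pass on its private data; returns updated weights.
    model = StandInSequenceModel()
    model.load_state_dict(global_state)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fed_avg(client_states):
    # Server step: parameter-wise average of client weights (plain FedAvg).
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        for state in client_states[1:]:
            avg[key] += state[key]
        avg[key] /= len(client_states)
    return avg

# One communication round, given per-client DataLoaders (hypothetical):
#   client_states = [local_update(global_state, dl) for dl in client_loaders]
#   global_state = fed_avg(client_states)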
Mamba-Based Federated Learning Architecture for Privacy-Preserving Machine Learning
23.10.2024
680353 bytes
Conference paper
Electronic resource
English
Preserving Privacy with Federated Learning in Route Choice Behavior Modeling
Transportation Research Record | 2021
Efficient privacy-preserving federated learning method for Internet of Ships
DOAJ | 2022