Privacy-sensitive federated learning for cross-domain adaptation: The Mamba-MoE approach


Bibliographic Details
Main Authors: Muhammad Kashif Jabbar, Huang Jianjun, Ayesha Jabbar, Zaka Ur Rehman
Format: Article
Language: English
Published: Elsevier 2025-09-01
Series: Results in Engineering
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2590123025025010
Description
Summary: Domain adaptation in decentralized environments poses significant challenges, particularly in privacy-sensitive and resource-constrained scenarios. Conventional approaches, such as Domain-Adversarial Neural Networks (DANN) and Maximum Mean Discrepancy (MMD), rely on large datasets and centralized processing, making them impractical for federated learning due to privacy and computational constraints. This study introduces Federated Mamba-MoE, a novel framework that integrates federated learning (FL) with a Mixture of Experts (MoE) to enable efficient cross-domain adaptation without requiring data centralization. The proposed architecture leverages adaptive expert routing, selective expert activation, and adaptive feature fusion, ensuring improved domain generalization while preserving privacy. Comprehensive evaluations on Natural Language Processing (NLP) and image classification benchmarks demonstrate 91.6% accuracy, 85.4% F1-score, privacy loss ϵ<1.0, computational efficiency of 5 ms/epoch/client, and minimal communication overhead (2 MB/round). The results highlight the model's superiority in addressing domain heterogeneity while maintaining privacy, making it a robust solution for decentralized machine learning applications in privacy-sensitive domains such as healthcare and the Internet of Things (IoT). Notably, Federated Mamba-MoE uniquely integrates adaptive expert routing with multi-layer domain-specific feature fusion and dynamic privacy-aware optimization.
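The abstract names adaptive expert routing with selective expert activation as a core mechanism. A minimal sketch of top-k MoE gating illustrates the general idea; the expert count, dimensions, and `k` here are illustrative assumptions, not values taken from the paper:

```python
# Sketch of top-k mixture-of-experts routing: a gating network scores all
# experts, but only the k highest-scoring experts are evaluated
# ("selective expert activation"), and their outputs are fused with
# softmax weights. Shapes and hyperparameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts and fuse their outputs.

    gate_w:  (d, n_experts) gating weights
    experts: list of n_experts matrices, each (d, d_out)
    """
    logits = x @ gate_w                       # gating score per expert
    top = np.argsort(logits)[-k:]             # indices of the k best experts
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()                              # softmax over selected experts only
    # Only the selected experts are computed; the rest stay inactive.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

d, d_out, n_experts = 8, 4, 4
gate_w = rng.normal(size=(d, n_experts))
experts = [rng.normal(size=(d, d_out)) for _ in range(n_experts)]
y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
print(y.shape)
```

In a federated setting, such a gate can specialize experts to client domains while only the shared parameters travel in aggregation rounds, which is consistent with the small per-round payload the abstract reports.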
ISSN: 2590-1230