Federated learning evolutionary algorithm based on multi-objective optimization
Cite this article: Hu Zhiyong, Yu Qiancheng, Wang Zhici, Zhang Lisi. Federated learning evolutionary algorithm based on multi-objective optimization[J]. Application Research of Computers, 2024, 41(2)
Authors: Hu Zhiyong  Yu Qiancheng  Wang Zhici  Zhang Lisi
Affiliation: School of Computer Science and Engineering, North Minzu University
Funding: Ningxia Key Research and Development Program (Talent Introduction Special Project) (2022YCZX0013); Ningxia Key Research and Development Program (Key Project) (2023BDE02001); Yinchuan University-Enterprise Joint Innovation Project (2022XQZD009); North Minzu University 2022 University-Level Research Platform Project "Innovation Team for Digital Agriculture Empowering Ningxia Rural Revitalization" (2022PT_S10); National Ethnic Affairs Commission Innovation Team Funding Project "Image and Intelligent Information Processing Innovation Team"
Abstract: Traditional federated learning suffers from high communication costs, structural heterogeneity, and insufficient privacy protection. To address these issues, this paper proposes a federated learning evolutionary algorithm that applies a sparse evolutionary training algorithm to reduce communication costs and integrates local differential privacy to protect participants' privacy. It further uses the NSGA-III algorithm to optimize the network structure and sparsity of the global federated learning model and to adjust the trade-off between data availability and privacy protection, achieving a balance among the effectiveness, communication cost, and privacy of the global model. Experimental results in an unstable communication environment show that, on the MNIST and CIFAR-10 datasets, the solutions obtained by the proposed algorithm improve communication efficiency by 57.19% and 52.17%, respectively, compared with the lowest-error-rate solution of the FNSGA-III algorithm, while the participants achieve (3.46, 10^-4)- and (6.52, 10^-4)-local differential privacy. Without significantly affecting the accuracy of the global model, the algorithm effectively reduces the communication cost of federated learning and protects participant privacy.

Keywords: federated learning   multi-objective optimization   NSGA-III algorithm   local differential privacy   parameter optimization
Received: 2023-05-28
Revised: 2024-01-15
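As a concrete illustration of the optimization loop summarized in the abstract, the following minimal Python example (not the authors' code) drives NSGA-III over a global model's hidden-layer width, sparsity, and local-differential-privacy budget with three objectives to minimize: test error, number of transmitted parameters, and privacy loss. It assumes the pymoo library; evaluate_global_model and FLConfigProblem are hypothetical names, and the objective formulas are placeholders standing in for actual federated training with sparse evolutionary training and LDP-perturbed client updates.

# Hypothetical sketch, not the paper's implementation: NSGA-III search over a
# federated model's structure, sparsity, and LDP budget (assumes pymoo).
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.util.ref_dirs import get_reference_directions
from pymoo.optimize import minimize

def evaluate_global_model(hidden_units, sparsity, epsilon):
    # Placeholder for one federated training/evaluation cycle. A real
    # implementation would train a sparse global model (sparse evolutionary
    # training), perturb client updates with an (epsilon, delta)-LDP mechanism,
    # and return the measured test error and transmitted-parameter count.
    n_params = int(784 * hidden_units * (1.0 - sparsity))   # communication-cost proxy
    error = 0.02 + 0.5 * sparsity / (1.0 + epsilon)         # dummy error surrogate
    return error, n_params

class FLConfigProblem(ElementwiseProblem):
    # Decision variables: hidden units, sparsity, privacy budget epsilon.
    # Objectives (all minimized): error rate, communication cost, privacy loss.
    def __init__(self):
        super().__init__(n_var=3, n_obj=3,
                         xl=np.array([32.0, 0.5, 0.5]),
                         xu=np.array([512.0, 0.99, 8.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        hidden_units, sparsity, epsilon = x
        error, n_params = evaluate_global_model(int(hidden_units), sparsity, epsilon)
        out["F"] = [error, n_params, epsilon]

ref_dirs = get_reference_directions("das-dennis", 3, n_partitions=12)  # 91 reference points
algorithm = NSGA3(ref_dirs=ref_dirs, pop_size=92)
res = minimize(FLConfigProblem(), algorithm, ("n_gen", 20), seed=1, verbose=False)
print(res.F[:5])  # a few Pareto-optimal (error, cost, epsilon) trade-offs

Each point on the resulting Pareto front corresponds to one candidate global-model configuration, from which a trade-off among accuracy, communication cost, and privacy budget can be selected, mirroring the balance the paper aims for.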

