[1] |
MEHMOOD A, NATGUNANATHAN I, XIANG Y, et al. Protection of big data privacy[J]. IEEE access, 2016, 4: 1821-1834.
doi: 10.1109/ACCESS.2016.2558446
|
[2] |
VOIGT P, VON DEM BUSSCHE A. The EU General Data Protection Regulation (GDPR): A Practical Guide[M]. 1st ed. Cham: Springer International Publishing, 2017.
|
[3] |
MCMAHAN B, MOORE E, RAMAGE D, et al. Communication-efficient learning of deep networks from decentralized data[C]// Artificial intelligence and statistics, PMLR, 2017: 1273-1282.
|
[4] |
LI W, MILLETARÌ F, XU D, et al. Privacy-preserving federated brain tumour segmentation[C]// International workshop on machine learning in medical imaging. Springer, 2019: 133-141.
|
[5] |
LIU Y, HUANG A, LUO Y, et al. FedVision: An online visual object detection platform powered by federated learning[C]// Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 34, 2020: 13172-13179.
|
[6] |
DIMITRIADIS D, KUMATANI K, GMYR R, et al. A federated approach in training acoustic models[C]// Interspeech, 2020: 981-985.
|
[7] |
LI M, ANDERSEN D G, PARK J W, et al. Scaling distributed machine learning with the parameter server[C]// 11th USENIX Symposium on Operating Systems Design and Implementation (OSDI 14), 2014: 583-598.
|
[8] |
DAI W, KUMAR A, WEI J, et al. High-performance distributed ML at scale through parameter server consistency models[C]// Proceedings of the AAAI Conference on Artificial Intelligence, 2015: 79-87.
|
[9] |
NIU F, RECHT B, RE C, et al. HOGWILD!: A lock-free approach to parallelizing stochastic gradient descent[C]// Proceedings of the 24th International Conference on Neural Information Processing Systems, 2011: 693-701.
|
[10] |
HO Q, CIPAR J, CUI H, et al. More effective distributed ml via a stale synchronous parallel parameter server[J]. Advances in neural information processing systems, 2013, 26: 1223-1231.
|
[11] |
WU Q, HE K, CHEN X. Personalized federated learning for intelligent IoT applications: A cloud-edge based framework[J]. IEEE Open Journal of the Computer Society, 2020, 1: 35-44.
doi: 10.1109/OJCS
|
[12] |
LI T, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450.
|
[13] |
LI Q, DIAO Y, CHEN Q, et al. Federated learning on non-iid data silos: An experimental study[C]// 2022 IEEE 38th International Conference on Data Engineering (ICDE), IEEE, 2022: 965-978.
|
[14] |
KULKARNI V, KULKARNI M, PANT A. Survey of personalization techniques for federated learning[C]// 2020 Fourth World Conference on Smart Trends in Systems, Security and Sustainability (WorldS4), IEEE, 2020: 794-797.
|
[15] |
HUANG Y, CHU L, ZHOU Z, et al. Personalized Cross-Silo Federated Learning on Non-IID Data[C]// AAAI, 2021: 7865-7873.
|
[16] |
ZHU Z, HONG J, ZHOU J. Data-free knowledge distillation for heterogeneous federated learning[C]// International Conference on Machine Learning, PMLR, 2021: 12878-12889.
|
[17] |
GO A, BHAYANI R, HUANG L. Twitter sentiment classification using distant supervision[R]. CS224N Project Report, Stanford University, 2009.
|
[18] |
NETZER Y, WANG T, COATES A, et al. Reading digits in natural images with unsupervised feature learning[C]// NIPS Workshop on Deep Learning and Unsupervised Feature Learning, 2011: 1-9.
|
[19] |
CALDAS S, DUDDU S M K, WU P, et al. Leaf: A benchmark for federated settings[J]. arXiv preprint arXiv:1812.01097, 2018.
|
[20] |
KRIZHEVSKY A, SUTSKEVER I, HINTON G E. Imagenet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6): 84-90.
doi: 10.1145/3065386
|
[21] |
ZHANG X, ZHAO J, LECUN Y. Character-level convolutional networks for text classification[C]// Proceedings of the 28th International Conference on Neural Information Processing Systems-Volume 1, 2015: 649-657.
|
[22] |
SOCHER R, PERELYGIN A, WU J, et al. Recursive deep models for semantic compositionality over a sentiment treebank[C]// Proceedings of the 2013 conference on empirical methods in natural language processing, 2013: 1631-1642.
|
[23] |
LIU Z, LUO P, WANG X, et al. Deep learning face attributes in the wild[C]// Proceedings of the IEEE international conference on computer vision, 2015: 3730-3738.
|
[24] |
LECUN Y, BOTTOU L, BENGIO Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998, 86(11): 2278-2324.
doi: 10.1109/5.726791
|
[25] |
COHEN G, AFSHAR S, TAPSON J, et al. EMNIST: Extending MNIST to handwritten letters[C]// 2017 international joint conference on neural networks (IJCNN), IEEE, 2017: 2921-2926.
|
[26] |
CALDAS S, DUDDU S M K, WU P, et al. Leaf: A benchmark for federated settings[J]. arXiv preprint arXiv:1812.01097, 2018.
|
[27] |
POURANSARI H, GHILI S. Tiny ImageNet visual recognition challenge[Z]. CS231N Course Report, Stanford University, Stanford, CA, USA, 2014.
|
[28] |
LI T, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450.
|
[29] |
LI Q, HE B, SONG D. Model-Contrastive Federated Learning[C]// 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, 2021: 10708-10717.
|
[30] |
LI T, HU S, BEIRAMI A, et al. Ditto: Fair and robust federated learning through personalization[C]// International Conference on Machine Learning, PMLR, 2021: 6357-6368.
|
[31] |
KARIMIREDDY S P, KALE S, MOHRI M, et al. Scaffold: Stochastic controlled averaging for federated learning[C]// International Conference on Machine Learning, PMLR, 2020: 5132-5143.
|
[32] |
LI D, WANG J. FedMD: Heterogeneous federated learning via model distillation[J]. arXiv preprint arXiv:1910.03581, 2019.
|
[33] |
HAO W, EL-KHAMY M, LEE J, et al. Towards fair federated learning with zero-shot data augmentation[C]// Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021: 3310-3319.
|
[34] |
TANG Z, ZHANG Y, SHI S, et al. Virtual Homogeneity Learning: Defending against Data Heterogeneity in Federated Learning[C]// Proceedings of the 39th International Conference on Machine Learning, PMLR, 2022: 21111-21132.
|
[35] |
GONG M, ZHANG K, LIU T, et al. Domain adaptation with conditional transferable components[C]// International conference on machine learning, PMLR, 2016: 2839-2848.
|
[36] |
WU H, WANG P. Fast-convergent federated learning with adaptive weighting[J]. IEEE Transactions on Cognitive Communications and Networking, 2021, 7(4): 1078-1088.
doi: 10.1109/TCCN.2021.3084406
|
[37] |
CHEN H Y, CHAO W L. FedBE: Making Bayesian model ensemble applicable to federated learning[C]// International Conference on Learning Representations, 2021: 1-21.
|
[38] |
YUROCHKIN M, AGARWAL M, GHOSH S, et al. Bayesian nonparametric federated learning of neural networks[C]// International Conference on Machine Learning, PMLR, 2019: 7252-7261.
|
[39] |
THIBAUX R, JORDAN M I. Hierarchical beta processes and the Indian buffet process[C]// Artificial intelligence and statistics, PMLR, 2007: 564-571.
|
[40] |
FALLAH A, MOKHTARI A, OZDAGLAR A. Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach[J]. Advances in Neural Information Processing Systems, 2020, 33: 3557-3568.
|
[41] |
FINN C, ABBEEL P, LEVINE S. Model-agnostic meta-learning for fast adaptation of deep networks[C]// International conference on machine learning, PMLR, 2017: 1126-1135.
|
[42] |
ACAR D A E, ZHAO Y, ZHU R, et al. Debiasing model updates for improving personalized federated training[C]// International Conference on Machine Learning, PMLR, 2021: 21-31.
|
[43] |
LI T, SANJABI M, BEIRAMI A, et al. Fair resource allocation in federated learning[C]// International Conference on Learning Representations, 2020: 1-27.
|
[44] |
KHODAK M, BALCAN M F, TALWALKAR A. Adaptive gradient-based meta-learning methods[C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019: 5917-5928.
|
[45] |
SMITH V, CHIANG C K, SANJABI M, et al. Federated multi-task learning[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 4427-4437.
|
[46] |
GHOSH A, HONG J, YIN D, et al. Robust federated learning in a heterogeneous environment[J]. arXiv preprint arXiv:1906.06629, 2019.
|
[47] |
HARTIGAN J A, WONG M A. Algorithm AS 136: A K-means clustering algorithm[J]. Journal of the Royal Statistical Society, Series C (Applied Statistics), 1979, 28(1): 100-108.
|
[48] |
SATTLER F, MÜLLER K R, SAMEK W. Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints[J]. IEEE transactions on neural networks and learning systems, 2020, 32(8): 3710-3722.
doi: 10.1109/TNNLS.2020.3015958
|
[49] |
GHOSH A, CHUNG J, YIN D, et al. An efficient framework for clustered federated learning[J]. Advances in Neural Information Processing Systems, 2020, 33: 19586-19597.
|
[50] |
JEONG E, OH S, KIM H, et al. Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data[J]. arXiv preprint arXiv:1811.11479, 2018.
|
[51] |
LIN T, KONG L, STICH S U, et al. Ensemble distillation for robust model fusion in federated learning[J]. Advances in Neural Information Processing Systems, 2020, 33: 2351-2363.
|
[52] |
LAI F, DAI Y, SINGAPURAM S, et al. Fedscale: Benchmarking model and system performance of federated learning at scale[C]// International Conference on Machine Learning, PMLR, 2022: 11814-11827.
|