
A comprehensive experimental comparison between federated and centralized learning

2023/7/29

PaperPlayer biorxiv bioinformatics


Shownotes

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2023.07.26.550615v1?rss=1

Authors: Garst, S., Dekker, J., Reinders, M.

Abstract: Purpose: Federated learning is an emerging machine learning paradigm that allows data from multiple sources to be used for training classifiers without the data leaving the source where it originally resides. This can be highly valuable for use cases such as medical research, where gathering data at a central location can be complicated by privacy and legal concerns. In such cases, federated learning has the potential to vastly speed up the research cycle. Although federated and central learning have been compared from a theoretical perspective, an extensive experimental comparison of performance and learning behavior is still lacking. Methods: We performed a comprehensive experimental comparison between federated and centralized learning. We evaluated various classifiers on various datasets, exploring the influence of different sample distributions as well as different class distributions across the clients. Results: The results show similar performance between the federated and central learning strategies under a wide variety of settings. Federated learning is able to deal with various imbalances in the data distributions. It is sensitive to batch effects between different datasets when they coincide with location, just as central learning is, but this setting may more easily go unobserved. Conclusion: Federated learning appears robust to various challenges such as skewed data distributions, high data dimensionality, multiclass problems, and complex models. Taken together, the insights from our comparison give much promise for applying federated learning as an alternative to sharing data.
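To make the comparison concrete, here is a minimal sketch (Python with NumPy, not the authors' code) of the two strategies the abstract contrasts: a centralized baseline that pools all client data, and a FedAvg-style federated loop in which each client trains locally and only model weights are averaged by a server. All names and parameters (make_client_data, local_train, n_clients, rounds, and so on) are illustrative assumptions.

```python
# Minimal sketch: centralized vs. FedAvg-style federated training of a
# logistic-regression classifier on synthetic data. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def make_client_data(n_clients=3, n_per_client=200, n_features=5):
    """Generate synthetic binary-classification data, split across clients."""
    true_w = rng.normal(size=n_features)
    clients = []
    for _ in range(n_clients):
        X = rng.normal(size=(n_per_client, n_features))
        y = ((X @ true_w + rng.normal(scale=0.5, size=n_per_client)) > 0).astype(float)
        clients.append((X, y))
    return clients

def local_train(w, X, y, lr=0.1, epochs=5):
    """A few epochs of full-batch gradient descent on one client's data."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # sigmoid predictions
        grad = X.T @ (p - y) / len(y)        # logistic-loss gradient
        w = w - lr * grad
    return w

def federated_avg(clients, rounds=20, n_features=5):
    """Each round, clients train locally and the server averages their weights."""
    w_global = np.zeros(n_features)
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    for _ in range(rounds):
        local_ws = [local_train(w_global.copy(), X, y) for X, y in clients]
        # Weighted average by client sample count; raw data never leaves a client.
        w_global = np.average(local_ws, axis=0, weights=sizes)
    return w_global

def centralized(clients, n_features=5):
    """Centralized baseline: pool all data and train a single model."""
    X = np.vstack([X for X, _ in clients])
    y = np.concatenate([y for _, y in clients])
    return local_train(np.zeros(n_features), X, y, epochs=100)

def accuracy(w, clients):
    X = np.vstack([X for X, _ in clients])
    y = np.concatenate([y for _, y in clients])
    return float((((X @ w) > 0).astype(float) == y).mean())

clients = make_client_data()
print(f"federated accuracy:   {accuracy(federated_avg(clients), clients):.3f}")
print(f"centralized accuracy: {accuracy(centralized(clients), clients):.3f}")
```

On synthetic, identically distributed client data such as this, the two strategies reach similar accuracy, which mirrors the paper's central finding; the skewed sample and class distributions the paper studies would be modeled by changing how make_client_data partitions the data across clients.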

Copyright belongs to the original authors. Visit the link for more info.

Podcast created by Paper Player, LLC