Author: Kamp, Michael; Fischer, Jonas; Vreeken, Jilles (CISPA Helmholtz Center for Information Security; Max Planck Institute for Informatics)
Title: Federated Learning from Small Datasets
Cord-id: as8bizwj
Document date: 2021-10-07
ID: as8bizwj
Snippet: Federated learning allows multiple parties to collaboratively train a joint model without sharing local data. This enables applications of machine learning in settings of inherently distributed, undisclosable data, such as in the medical domain. In practice, joint training is usually achieved by aggregating local models, for which the local training objectives have to be in expectation similar to the joint (global) objective. Often, however, local datasets are so small that local objectives differ greatly from the global objective…
Document: Federated learning allows multiple parties to collaboratively train a joint model without sharing local data. This enables applications of machine learning in settings of inherently distributed, undisclosable data, such as in the medical domain. In practice, joint training is usually achieved by aggregating local models, for which the local training objectives have to be in expectation similar to the joint (global) objective. Often, however, local datasets are so small that local objectives differ greatly from the global objective, causing federated learning to fail. We propose a novel approach that intertwines model aggregations with permutations of local models. The permutations expose each local model to a daisy chain of local datasets, resulting in more efficient training in data-sparse domains. This enables training on extremely small local datasets, such as patient data across hospitals, while retaining the training efficiency and privacy benefits of federated learning.
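The scheme described in the abstract (interleaving periodic model aggregation with periodic permutation of local models across clients) can be sketched as follows. This is a simplified illustration, not the paper's actual algorithm: the least-squares local objective, the function names, and the period hyperparameters `d_period` and `agg_period` are all assumptions made for the sake of a runnable example.

```python
import random
import numpy as np

def local_step(model, data, lr=0.1):
    # One gradient step on a least-squares objective,
    # standing in for real local training on a client.
    X, y = data
    grad = X.T @ (X @ model - y) / len(y)
    return model - lr * grad

def daisy_chain_federated(datasets, rounds=200, d_period=5,
                          agg_period=20, dim=3, seed=0):
    """Hypothetical sketch of federated daisy-chaining.

    Each client trains locally every round; every `d_period` rounds
    the local models are permuted among clients (daisy-chaining),
    and every `agg_period` rounds they are averaged (aggregation).
    """
    rng = random.Random(seed)
    models = [np.zeros(dim) for _ in datasets]
    for t in range(1, rounds + 1):
        # Local training: each model trains on its current client's data.
        models = [local_step(m, data) for m, data in zip(models, datasets)]
        if t % agg_period == 0:
            # Aggregation: replace every local model by the average model.
            avg = np.mean(models, axis=0)
            models = [avg.copy() for _ in models]
        elif t % d_period == 0:
            # Daisy-chaining: a random permutation sends each model
            # to another client, exposing it to a new small dataset.
            rng.shuffle(models)
    return np.mean(models, axis=0)
```

Because each model visits many small local datasets between aggregations, the effective training data per model approaches the pooled dataset, which is the intuition behind why this helps when each client alone holds too few samples.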