Selected article for: "classification performance and model train"

Authors: Gawali, Manish; ArvindC, S; Suryavanshi, Shriya; Madaan, Harshit; Gaikwad, Ashrika; BhanuPrakash, KN; Kulkarni, Viraj; Pant, Aniruddha
Title: Comparison of Privacy-Preserving Distributed Deep Learning Methods in Healthcare
  • Cord-id: c5ov3b1x
  • Document date: 2020-12-23
  • ID: c5ov3b1x
    Snippet: In this paper, we compare three privacy-preserving distributed learning techniques: federated learning, split learning, and SplitFed. We use these techniques to develop binary classification models for detecting tuberculosis from chest X-rays and compare them in terms of classification performance, communication and computational costs, and training time. We propose a novel distributed learning architecture called SplitFedv3, which performs better than split learning and SplitFedv2 in our experiments.
    Document: In this paper, we compare three privacy-preserving distributed learning techniques: federated learning, split learning, and SplitFed. We use these techniques to develop binary classification models for detecting tuberculosis from chest X-rays and compare them in terms of classification performance, communication and computational costs, and training time. We propose a novel distributed learning architecture called SplitFedv3, which performs better than split learning and SplitFedv2 in our experiments. We also propose alternate mini-batch training, a new training technique for split learning, that performs better than alternate client training, where clients take turns to train a model.
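    The abstract contrasts "alternate client training" (clients take turns training the model, one client's full data at a time) with the proposed "alternate mini-batch training" (clients switch after every mini-batch). A minimal sketch of the two scheduling orders, assuming a round-robin reading of the abstract's description; the function and variable names are illustrative and not from the paper, which actually splits a neural network between client and server:

    ```python
    def alternate_client_training(clients, batches_per_client):
        """Each client processes ALL of its mini-batches before the next client starts."""
        schedule = []
        for client in clients:
            for batch in range(batches_per_client):
                schedule.append((client, batch))
        return schedule

    def alternate_minibatch_training(clients, batches_per_client):
        """Clients take turns after EVERY mini-batch (round-robin over clients)."""
        schedule = []
        for batch in range(batches_per_client):
            for client in clients:
                schedule.append((client, batch))
        return schedule

    # With two clients and two batches each, the orderings differ:
    # alternate client:     A0, A1, B0, B1
    # alternate mini-batch: A0, B0, A1, B1
    ```

    The mini-batch variant interleaves updates from all clients within each pass, which is one plausible reason the paper reports it outperforming per-client alternation.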

    Search related documents:
    Co-phrase search for related documents
    • accountability health insurance portability and machine learning: 1
    • adam optimizer and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
    • adam optimizer and machine learning: 1, 2
    • additional parameter and lymph node: 1
    • local client and machine reside: 1