
Split learning for health: Distributed deep learning without sharing raw patient data

About

Can health entities collaboratively train deep learning models without sharing sensitive raw data? This paper proposes several configurations of a distributed deep learning method called SplitNN to facilitate such collaborations. SplitNN shares neither raw data nor model details with collaborating institutions. The proposed configurations of SplitNN cater to practical settings of i) entities holding different modalities of patient data, ii) centralized and local health entities collaborating on multiple tasks, and iii) learning without sharing labels. We compare the performance and resource-efficiency trade-offs of SplitNN against other distributed deep learning methods such as federated learning and large-batch synchronous stochastic gradient descent, and show highly encouraging results for SplitNN.
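The core idea can be sketched in code. In the simplest SplitNN configuration, the network is cut at an intermediate layer: the client (a health entity) holds the layers up to the cut and transmits only cut-layer activations; the server holds the remaining layers, computes the loss, and sends back only the gradient at the cut layer. The toy two-layer network, class names, and training task below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """Holds the first network segment; raw data never leaves this class."""
    def __init__(self, n_in, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.1, (n_in, n_hidden))
        self.lr = lr

    def forward(self, x):
        self.x = x                             # raw patient data stays local
        self.a = np.maximum(0, x @ self.W)     # ReLU cut-layer activations
        return self.a                          # only activations are sent out

    def backward(self, grad_a):
        # grad_a is the cut-layer gradient received from the server
        grad_z = grad_a * (self.a > 0)
        self.W -= self.lr * self.x.T @ grad_z

class Server:
    """Holds the remaining segment; never sees raw data or client weights."""
    def __init__(self, n_hidden, n_out, lr=0.1):
        self.W = rng.normal(0, 0.1, (n_hidden, n_out))
        self.lr = lr

    def step(self, a, y):
        logits = a @ self.W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        loss = -np.log(p[np.arange(len(y)), y] + 1e-12).mean()
        grad_logits = p.copy()
        grad_logits[np.arange(len(y)), y] -= 1
        grad_logits /= len(y)
        grad_a = grad_logits @ self.W.T        # sent back across the cut
        self.W -= self.lr * a.T @ grad_logits
        return loss, grad_a

# Toy task: classify which half of each input vector has the larger mean.
X = rng.normal(size=(256, 10))
y = (X[:, :5].mean(axis=1) > X[:, 5:].mean(axis=1)).astype(int)

client, server = Client(10, 32), Server(32, 2)
for epoch in range(500):
    acts = client.forward(X)           # client -> server: activations only
    loss, grad_a = server.step(acts, y)  # server -> client: gradients only
    client.backward(grad_a)
```

Note that the two messages exchanged per step (activations forward, cut-layer gradients back) are the entire communication; this is what lets SplitNN's per-client bandwidth stay low relative to federated learning, which ships full weight updates.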

Praneeth Vepakomma, Otkrist Gupta, Tristan Swedish, Ramesh Raskar · 2018

Related benchmarks

| Task                          | Dataset                   | Metric    | Result | Rank |
|-------------------------------|---------------------------|-----------|--------|------|
| Image Classification          | CIFAR-100 5-shot (test)   | Top-1 Acc | 79.63  | 19   |
| Image Classification          | CIFAR-100 10-shot (test)  | Top-1 Acc | 83.66  | 19   |
| Image Classification          | Places365 3-shot          | Accuracy  | 26.84  | 18   |
| Few-shot Image Classification | DomainNet Clipart         | Accuracy  | 63.95  | 18   |
| Image Reconstruction          | CIFAR-10 (test)           | SSIM      | 0.949  | 15   |
| Image Reconstruction          | SVHN (test)               | MSE       | 0.046  | 12   |
| Image Classification          | CIFAR-100 3-shot (test)   | Accuracy  | 74.05  | 12   |
| Image Reconstruction          | EMNIST (test)             | MSE       | 0.344  | 12   |
| Image Reconstruction          | FashionMNIST (test)       | MSE       | 0.494  | 12   |
| Image Classification          | Places365 5-shot          | Accuracy  | 32.19  | 12   |

Showing 10 of 27 rows.
