Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/2751
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Le, Trieu Phong | - |
dc.contributor.author | Tran, Thi Phuong | - |
dc.date.accessioned | 2023-09-26T02:33:49Z | - |
dc.date.available | 2023-09-26T02:33:49Z | - |
dc.date.issued | 2023-07 | - |
dc.identifier.isbn | 978-3-031-36886-8 | - |
dc.identifier.uri | https://link.springer.com/chapter/10.1007/978-3-031-36886-8_2 | - |
dc.identifier.uri | http://elib.vku.udn.vn/handle/123456789/2751 | - |
dc.description | Lecture Notes in Networks and Systems (LNNS, volume 734); CITA: Conference on Information Technology and its Applications; pp: 15-24. | vi_VN |
dc.description.abstract | In distributed machine learning, multiple machines or workers collaborate to train a model. However, prior research in cross-silo distributed learning with differential privacy has the drawback of requiring all workers to participate in each training iteration, hindering flexibility and efficiency. To overcome these limitations, we introduce a new algorithm that allows partial worker attendance in the training process, reducing communication costs by over 50% while preserving accuracy on benchmark data. The privacy of the workers is also improved because less data are exchanged between workers. | vi_VN |
dc.language.iso | en | vi_VN |
dc.publisher | Springer Nature | vi_VN |
dc.subject | Differential privacy | vi_VN |
dc.subject | Partial attendance | vi_VN |
dc.subject | Communication efficiency | vi_VN |
dc.subject | Distributed machine learning | vi_VN |
dc.title | Differentially-Private Distributed Machine Learning with Partial Worker Attendance: A Flexible and Efficient Approach | vi_VN |
dc.type | Working Paper | vi_VN |
Appears in Collections: CITA 2023 (International)
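
The abstract describes training in which only a subset of workers needs to attend each round, with differential privacy protecting the exchanged updates. The following is a minimal illustrative sketch of that general idea, not the authors' algorithm: it assumes a simple DP-SGD-style setup in which each attending worker clips its local gradient, adds Gaussian noise, and the server averages the noisy updates. All names and hyperparameters (worker count, attendance rate, clipping bound, noise scale) are placeholders chosen for illustration and are not taken from the paper.

```python
# Illustrative sketch only: one distributed training loop with partial worker
# attendance and Gaussian-noise differential privacy on clipped gradients.
# Every constant below is an assumption for illustration, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

NUM_WORKERS = 10          # assumed number of silos/workers
ATTENDANCE_RATE = 0.5     # assumed fraction of workers sampled per round
CLIP_NORM = 1.0           # assumed per-worker gradient clipping bound
NOISE_STD = 0.5           # assumed Gaussian noise scale for DP
DIM = 4                   # assumed model dimensionality
LEARNING_RATE = 0.1

# Synthetic per-worker data: each worker holds a local least-squares problem.
true_w = rng.normal(size=DIM)
worker_data = []
for _ in range(NUM_WORKERS):
    X = rng.normal(size=(20, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=20)
    worker_data.append((X, y))

def local_noisy_gradient(w, X, y):
    """Clipped, noised local gradient of the squared loss (illustrative)."""
    grad = X.T @ (X @ w - y) / len(y)
    norm = np.linalg.norm(grad)
    if norm > CLIP_NORM:
        grad = grad * (CLIP_NORM / norm)     # clip to bound sensitivity
    return grad + rng.normal(scale=NOISE_STD, size=grad.shape)  # add DP noise

w = np.zeros(DIM)
for round_idx in range(50):
    # Partial attendance: only a random subset of workers joins this round,
    # so the rest send nothing and incur no communication cost.
    k = max(1, int(ATTENDANCE_RATE * NUM_WORKERS))
    attending = rng.choice(NUM_WORKERS, size=k, replace=False)
    updates = [local_noisy_gradient(w, *worker_data[i]) for i in attending]
    w -= LEARNING_RATE * np.mean(updates, axis=0)  # server averages attendees

print("final parameter estimate:", np.round(w, 3))
```

In this sketch, per-round communication scales with the attendance rate rather than the total number of workers, which is one way the communication savings claimed in the abstract could arise; the paper itself should be consulted for the actual algorithm and privacy analysis.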