Transcriptions

Note: this content has been automatically generated.
00:00:07
Well, I would like to talk about training machine learning models in the decentralised setting.
00:00:13
The main feature of the decentralised setting is that during
00:00:17
training, the data of every user stays local and never leaves the user's device.
00:00:23
Moreover, there is no central authority, and every device communicates only with a small
00:00:27
number of neighbours: for example, in this picture, only the connected devices are allowed to communicate.
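The neighbour-only communication just described can be sketched as a gossip averaging step on a graph. This is a minimal illustration, not from the talk: the ring of four devices and the uniform mixing weights are assumptions chosen for the example.

```python
import numpy as np

# A minimal sketch of neighbour-only ("gossip") averaging on a ring of
# 4 devices: each device mixes its value only with its two neighbours.
# The uniform mixing matrix W is a hypothetical choice for illustration.
n = 4
W = np.zeros((n, n))
for i in range(n):
    for j in (i, (i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0  # equal weight on self and the two neighbours

x = np.array([1.0, 2.0, 3.0, 4.0])  # one local value per device
for _ in range(50):
    x = W @ x  # each device averages with its neighbours only

# After enough rounds, all devices agree on the global average 2.5,
# even though no device ever talked to a non-neighbour.
print(np.round(x, 3))
```

Because `W` is symmetric and its rows sum to one, repeated mixing drives every device to the network-wide average without any central coordinator.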
00:00:33
This setting is very useful, for example in medicine: it can help
00:00:38
two hospitals to train a model
00:00:41
together without taking the sensitive data out of the hospitals.
00:00:48
State-of-the-art decentralised learning algorithms consist of two steps: optimisation and communication.
00:00:55
During the optimisation step, every device computes local updates to all the trainable parameters.
00:01:02
During the communication step, we want to share these updates with the other devices.
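The two-step loop just described can be sketched as follows. This is a hypothetical toy instance, not the talk's actual setup: each device holds a simple quadratic loss, and the ring mixing matrix and step size are assumptions.

```python
import numpy as np

# A minimal sketch of decentralised training, assuming a toy quadratic
# loss f_i(x) = (x - t_i)^2 per device (hypothetical). Each round has the
# two steps from the talk: local optimisation, then neighbour averaging.
targets = np.array([0.0, 2.0, 4.0, 6.0])   # each device's local data
x = np.zeros(4)                            # model parameter on each device
W = np.array([[.50, .25, .00, .25],
              [.25, .50, .25, .00],
              [.00, .25, .50, .25],
              [.25, .00, .25, .50]])       # ring mixing matrix (assumed)
lr = 0.1

for _ in range(200):
    grad = 2 * (x - targets)   # step 1: each device computes its local gradient
    x = x - lr * grad          # ...and updates its local parameters
    x = W @ x                  # step 2: communicate/average with neighbours

# With a constant step size the devices settle near (not exactly at) the
# minimiser of the average loss, x* = 3; a decaying step size would
# remove the remaining bias.
print(np.round(x, 2))
```

The point of the sketch is the structure of each round, not the convergence rate: optimisation touches only local data, and communication touches only neighbours.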
00:01:08
However, communication between machines is often slow,
00:01:13
and the number of trainable parameters in machine learning models is huge.
00:01:18
For example, neural networks have several millions of weights, so it is very slow to
00:01:23
communicate all of them between devices, and this is the major bottleneck in decentralised training.
00:01:31
In our work, we address this problem and propose a novel
00:01:34
decentralised algorithm, which we call Choco-SGD.
00:01:39
It can drastically reduce the communication time by using communication compression.
00:01:45
The main idea of our algorithm is that at every
00:01:47
step we do not communicate all of the parameters, but only a part of them,
00:01:53
and we keep the information about the rest locally, to correct the updates at the next iterations.
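The "send only a part, remember the rest" idea can be sketched with a local error memory. This is a minimal illustration under assumptions not stated in the talk: a top-k sparsifier is used as the compressor, and the sizes and the number of rounds are arbitrary.

```python
import numpy as np

# A minimal sketch of compressed communication with a local memory:
# at every step a device sends only the k largest-magnitude entries of
# its update (top-k is a hypothetical choice of compressor) and keeps
# the unsent remainder locally, adding it back before the next round.
def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

rng = np.random.default_rng(0)
update = rng.standard_normal(10)   # the full update we would like to send
original = update.copy()
memory = np.zeros(10)              # locally stored, not-yet-sent information
sent_total = np.zeros(10)

for _ in range(20):
    to_send = top_k(update + memory, k=2)   # communicate only 2 of 10 entries
    memory = (update + memory) - to_send    # keep the rest for later rounds
    sent_total += to_send
    update = np.zeros(10)                   # (no new update in this sketch)

# The memory drains over the rounds: nothing is lost, everything that
# was withheld eventually gets communicated.
print(np.round(memory, 6))
```

The local memory is what makes aggressive compression safe: information that is dropped in one round is not discarded but merely delayed.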
00:01:58
To the best of our knowledge, this is the first decentralised algorithm which can converge under arbitrarily high compression,
00:02:05
and moreover, its convergence speed is asymptotically the same as without compression.


Conference Program

Welcome address
Martin Vetterli, President of EPFL
6 June 2019 · 9:48 a.m.
Introduction
James Larus, Dean of IC School, EPFL
6 June 2019 · 9:58 a.m.
Introduction
Jean-Pierre Hubaux, IC Research Day co-chair
6 June 2019 · 10:07 a.m.
Adventures in electronic voting research
Dan Wallach, Professor at Rice University, Houston, USA
6 June 2019 · 10:14 a.m.
When foes are friends: adversarial examples as protective technologies
Carmela Troncoso, Assistant Professor at EPFL
6 June 2019 · 11:09 a.m.
Low-Latency Metadata Protection for Organizational Networks
Ludovic Barman, LCA1|DeDiS, EPFL
6 June 2019 · noon
Interactive comparison-based search, and who-is-th.at
Daniyar Chumbalov, INDY 1, EPFL
6 June 2019 · 12:06 p.m.
Decentralized, Secure and Verifiable Data Sharing
David Froelicher, LCA1|DeDiS, EPFL
6 June 2019 · 12:09 p.m.
Communication Efficient Decentralised Machine Learning
Anastasia Koloskova, MLO, EPFL
6 June 2019 · 12:11 p.m.
Detecting the Unexpected via Image Resynthesis
Krzysztof Lis, CVLab, EPFL
6 June 2019 · 12:14 p.m.
Sublinear Algorithms for Graph Processing
Aida Mousavifar, THL4, EPFL
6 June 2019 · 12:16 p.m.
Protecting the Metadata of Your Secret Messages
Kirill Nikitin, DEDIS, EPFL
6 June 2019 · 12:18 p.m.
Teaching a machine learning algorithm faster
Farnood Salehi, INDY 2, EPFL
6 June 2019 · 12:21 p.m.
Secure Microarchitectural Design
Atri Bhattacharyya, PARSA/HexHive, EPFL
6 June 2019 · 12:23 p.m.
Security testing hard to reach code
Mathias Payer, Assistant Professor at EPFL
6 June 2019 · 1:50 p.m.
Best Research Presentation Award Ceremony
Bryan Ford, Jean-Pierre Hubaux, Deirdre Rochat, EPFL
6 June 2019 · 3:54 p.m.