Note: this content has been automatically generated.
In this work we focus on training machine learning models in the decentralized setting. The main feature of this setting is that during training the data of every user stays local and never leaves the user's device. Moreover, there is no central authority, and every device communicates with only a small number of neighbours; for example, in this picture only the connected devices are allowed to communicate. I think this is very useful, for example in medicine: it can help two hospitals to train something together without taking any sensitive data out of the hospitals.
Decentralized training algorithms consist of two steps: optimization and communication. During the optimization step, every device computes updates to all the trainable parameters, and during the communication step we want to share these updates with the other devices.
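The two steps described here can be sketched as follows. This is a minimal illustration, not the speaker's exact algorithm: the function name, the plain gossip averaging, and the uniform weights are all assumptions made for the example.

```python
import numpy as np

def decentralized_sgd_round(params, grads, neighbors, lr=0.1):
    """One round of decentralized training for all devices.

    params:    list of parameter vectors, one copy per device
    grads:     list of gradients computed from each device's local data
    neighbors: adjacency list; neighbors[i] are the devices i may talk to
    """
    n = len(params)
    # Optimization step: every device updates its own parameter copy locally.
    local = [params[i] - lr * grads[i] for i in range(n)]
    # Communication step: every device averages only with its neighbours
    # (gossip averaging); no central server is involved.
    new_params = []
    for i in range(n):
        group = [local[i]] + [local[j] for j in neighbors[i]]
        new_params.append(np.mean(group, axis=0))
    return new_params
```

Repeating such rounds drives the devices toward a consensus model while each device only ever sees its own data and its neighbours' parameters.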
Because communication between machines is often slow, and the number of trainable parameters in modern models is huge (for example, neural networks have several millions of weights), it is very slow to communicate all of them between devices, and this is the major bottleneck in decentralized training. In our work we tackle this problem and propose a new algorithm which can drastically reduce the communication time using communication compression.
The main idea of our algorithm is that at every step we do not communicate all the parameters but only a part of them, and we keep information about the rest locally to correct for it at the next iterations.
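A minimal sketch of this idea, with assumed names and with top-k sparsification chosen as the example compressor (the speaker's actual compression scheme may differ): each device sends only part of its update and folds the unsent remainder into a local memory that is added back in later rounds.

```python
import numpy as np

def top_k(x, k):
    """Keep only the k largest-magnitude entries of x; zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def compress_with_memory(update, memory, k):
    """Communicate only part of an update, remembering the rest locally.

    update: the full update a device would like to share this round
    memory: the information this device did not manage to send earlier
    Returns (message actually sent, new local memory).
    """
    corrected = update + memory        # add back what was left out before
    message = top_k(corrected, k)      # transmit only k coordinates
    new_memory = corrected - message   # keep the remainder on the device
    return message, new_memory
```

Because the memory re-enters the update at the next iterations, no information is permanently discarded, which is what allows convergence even when the compression is very aggressive.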
To the best of my knowledge, this is the first algorithm which converges under arbitrarily high compression, and moreover its convergence speed is asymptotically the same as without compression.


Conference Program

Welcome address
Martin Vetterli, President of EPFL
6 June 2019 · 9:48 a.m.
James Larus, Dean of IC School, EPFL
6 June 2019 · 9:58 a.m.
Jean-Pierre Hubaux, IC Research Day co-chair
6 June 2019 · 10:07 a.m.
Adventures in electronic voting research
Dan Wallach, Professor at Rice University, Houston, USA
6 June 2019 · 10:14 a.m.
When foes are friends: adversarial examples as protective technologies
Carmela Troncoso, Assistant Professor at EPFL
6 June 2019 · 11:09 a.m.
Low-Latency Metadata Protection for Organizational Networks
Ludovic Barman, LCA1|DeDiS, EPFL
6 June 2019 · noon
Interactive comparison-based search, and
Daniyar Chumbalov, INDY 1, EPFL
6 June 2019 · 12:06 p.m.
Decentralized, Secure and Verifiable Data Sharing
David Froelicher, LCA1|DeDiS, EPFL
6 June 2019 · 12:09 p.m.
Communication Efficient Decentralised Machine Learning
Anastasia Koloskova, MLO, EPFL
6 June 2019 · 12:11 p.m.
Detecting the Unexpected via Image Resynthesis
Krzysztof Lis, CVLab, EPFL
6 June 2019 · 12:14 p.m.
Sublinear Algorithms for Graph Processing
Aida Mousavifar, THL4, EPFL
6 June 2019 · 12:16 p.m.
Protecting the Metadata of Your Secret Messages
Kirill Nikitin, DEDIS, EPFL
6 June 2019 · 12:18 p.m.
Teaching a machine learning algorithm faster
Farnood Salehi, INDY 2, EPFL
6 June 2019 · 12:21 p.m.
Secure Microarchitectural Design
Atri Bhattacharyya, PARSA/HexHive, EPFL
6 June 2019 · 12:23 p.m.
Security testing hard to reach code
Mathias Payer, Assistant Professor at EPFL
6 June 2019 · 1:50 p.m.
Best Research Presentation Award Ceremony
Bryan Ford, Jean-Pierre Hubaux, Deirdre Rochat, EPFL
6 June 2019 · 3:54 p.m.