
Transcriptions

Note: this content has been automatically generated.
00:00:00
Yeah, I see a raised hand over there. Yes, I think it
00:00:06
would be a really interesting tool, so I have a question about secure multiparty computation.
00:00:11
I may be wrong, but the way I understand it works is that it's based on the assumption that the different parties
00:00:18
will never collude, because if all the parties do collude,
00:00:21
they can, they can, uh, decrypt everything in that case.
00:00:25
So, in practice, how do you decide what is
00:00:28
an appropriate number of parties such that they won't collude?
00:00:34
Yeah, so, in the approach we designed, we
00:00:37
use the so-called any-trust model, where
00:00:42
all parties but one can collude; if you have one party that doesn't collude, the system is secure.
00:00:49
So the more parties you have, the more secure the system is.
00:00:53
But how to get to the exact number, in fact, it's, like, hard to say:
00:00:58
you know, in any particular case, you just need, you just need N minus one,
00:01:06
so N minus one can collude and one stays honest, you see what I mean?
00:01:11
So the threshold on N, what is the minimum number of parties that don't have to collude?
00:01:16
That would be one. Okay, I understand mathematically, but, um, what I
00:01:20
don't understand is, in practice, how do you go about deciding on a number
00:01:25
N, and not a number, just, um, like, um,
00:01:29
I mean, even if it's N minus one, right? Yeah, I mean, there is still the situation where this assumption doesn't hold in practice.
00:01:37
Right.
00:01:38
Yeah, I think in the end there's always some legal protection around
00:01:43
that. So that's what happens in reality: we have to make contracts where people
00:01:50
agree to not collude somehow, right? Even if you're not sure whether
00:01:53
they really don't. Okay, so technically the system
00:01:59
preserves security as long as there's one party that doesn't collude with the others,
00:02:04
even if all the others are compromised, but practically the way we ensure this is through legal contracts.
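To make the any-trust argument above concrete, here is a minimal sketch in Python (a toy illustration, not the speaker's actual system) of additive secret sharing: a value is split into N shares, and only the sum of all N shares reveals it, so any N minus one colluding parties learn nothing.

# Toy sketch of the any-trust idea (illustration only, not the system described in the talk):
# additive secret sharing, where a value stays hidden unless ALL N parties combine their
# shares, so even N-1 colluding parties learn nothing about it.
import secrets

MODULUS = 2**61 - 1  # any public modulus larger than the secret values works for this sketch

def share(secret: int, n_parties: int) -> list[int]:
    # split `secret` into n additive shares modulo MODULUS
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    # only the sum of *all* shares gives back the secret
    return sum(shares) % MODULUS

if __name__ == "__main__":
    value = 42
    shares = share(value, n_parties=5)
    print(reconstruct(shares))         # 42: all five parties cooperate
    print(sum(shares[:4]) % MODULUS)   # any four shares alone look uniformly random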
00:02:11
Okay, and then in your applications with the hospitals, how many parties would you use typically?
00:02:16
Uh, in the application I showed you, it's the five university hospitals in
00:02:20
Switzerland, so Lausanne, Geneva and the others, so a five-party setting. Yeah, five, okay, yeah.
00:02:26
But this is, I mean, theoretically, it's unlimited, right? Really? Okay.
00:02:29
As many parties as you want. Yeah, okay, thank you.
00:02:37
Um, but I must say, yes?
00:02:44
Thank you, beautiful presentation. I have just one question, um, about the
00:02:49
slide you showed, where, um, most of the
00:02:54
data protection discussion concerns Switzerland. Uh, is the solution, the one that you use,
00:03:01
a solution accepted at the, uh, EU level as well,
00:03:04
and how did, uh, the ethical review of
00:03:09
these solutions go? Yes, I think it is; it goes back to the kind of feedback
00:03:14
we got on the approach, right? Essentially, the whole trick was to
00:03:19
never have the decryption key pooled in one place, uh-huh, so
00:03:23
this was basically guaranteeing that there's no way of decrypting
00:03:27
unless all the institutions involved in the process agree to
00:03:31
do that, right? So as long as these institutions don't collude,
00:03:34
this can be considered anonymous, so that's what it comes back to. Uh-huh. And this was instrumental also to
00:03:41
move forward also with the cantonal, um, data
00:03:45
protection authorities and ethics
00:03:47
committees. Thank you.
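As a rough illustration of the collective-decryption guarantee just described, the snippet below is a textbook multiplicative ElGamal sketch with toy parameters (an assumption for illustration, not the production protocol used across the hospitals): the secret key is split across institutions, so a ciphertext can only be opened if every institution contributes its key share.

# Toy sketch only: ElGamal-style encryption under a collective key, where the secret
# exponent is split across institutions. Decryption requires every institution's share;
# if even one refuses, the ciphertext stays opaque. Parameters are illustrative, not
# the vetted group parameters a real deployment would use.
import secrets

P = 2**127 - 1   # toy prime modulus
G = 3            # toy generator

def keygen(n_parties: int):
    shares = [secrets.randbelow(P - 1) for _ in range(n_parties)]
    pk = 1
    for x in shares:
        pk = (pk * pow(G, x, P)) % P          # collective key h = g^(x1 + ... + xn)
    return shares, pk

def encrypt(m: int, pk: int):
    r = secrets.randbelow(P - 1)
    return pow(G, r, P), (m * pow(pk, r, P)) % P   # ciphertext (g^r, m * h^r)

def decrypt(c1: int, c2: int, shares):
    combined = 1
    for x in shares:                           # each institution supplies c1^x_i
        combined = (combined * pow(c1, x, P)) % P
    return (c2 * pow(combined, -1, P)) % P     # combined equals h^r, so this recovers m

shares, pk = keygen(5)
c1, c2 = encrypt(123456, pk)
assert decrypt(c1, c2, shares) == 123456       # works only when all five shares are used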
00:03:55
Any other questions? Yes? I have one question still. Yes. I'm new
00:04:01
here, so... We have a lot of people working on, ah,
00:04:05
machine learning, so I often hear people around saying, oh,
00:04:09
yes, machine learning, but there's not enough data to use it. Yes, it's
00:04:13
a problem, but I was wondering: is privacy, in your field,
00:04:16
one of the main issues, or are there also other factors that, uh,
00:04:20
lead to data not being shared as much? Yeah, I mean, as I mentioned before, I think there is
00:04:27
more of a challenge in that we have to change, like, this data ownership feeling, right? So, you know,
00:04:33
there's a lot of, let's say, academic competition, right? So whoever owns the
00:04:37
data, everyone just wants to exhaust any possible question before they give it out.
00:04:42
And, uh, so this is a problem, right? So, uh, I think,
00:04:48
uh, and that doesn't have to do with privacy. So I
00:04:52
think this is one problem, and also, because of the heterogeneity of the data, especially
00:04:57
as seen, uh, in healthcare, as I mentioned before, like, the
00:05:02
real-world data is collected through proprietary systems, and
00:05:07
every proprietary system has its own way of storing and modelling, uh, the semantics of the data.
00:05:13
So there's a lot of effort that has to be put in before you can put any federated system in place, right, on mapping the
00:05:20
data content. Yeah, and there are huge initiatives in Europe, uh,
00:05:25
to map these to so-called common data models,
00:05:28
um, to, you know, really try to sort out the
00:05:32
problem, right? Uh-huh. But there's a lot of manual work that goes into that.
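To give a flavour of the mapping work mentioned here, the sketch below (with hypothetical field names, concept labels and conversion rules, purely illustrative) shows the kind of per-site translation into a shared representation that has to happen before records from proprietary systems can be loaded into a common data model.

# Illustrative sketch (hypothetical local field names and conversion rules): translating
# one hospital's proprietary export into a shared, harmonised representation.
SITE_A_TO_COMMON = {
    "gluc_mgdl": {"concept": "blood_glucose", "unit": "mmol/L", "convert": lambda v: v * 0.0555},
    "hr_bpm":    {"concept": "heart_rate",    "unit": "bpm",    "convert": lambda v: v},
}

def harmonize(site_record: dict, mapping: dict) -> dict:
    # translate one site-specific record, field by field, into common concepts and units
    out = {}
    for local_field, value in site_record.items():
        rule = mapping.get(local_field)
        if rule is None:
            continue  # unmapped fields need manual curation, which is the costly part
        out[rule["concept"]] = {"value": rule["convert"](value), "unit": rule["unit"]}
    return out

print(harmonize({"gluc_mgdl": 99, "hr_bpm": 72}, SITE_A_TO_COMMON))
# prints the harmonised record, e.g. blood glucose converted to roughly 5.49 mmol/L

Each site needs its own mapping table of this kind, and building and validating those tables against a common data model is the manual effort the speaker refers to.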
00:05:38
And there's a lot of data out there that we're not using, either, because of exactly that. Exactly,
00:05:42
exactly, yes. Especially if you also go outside, if you go, I guess, to other countries,
00:05:49
less developed countries also, I think. Yeah, even more so, I think.
00:05:52
That's where, like, language models could have a great impact, like in standardisation,
00:05:58
uh, I think, for example, structuring, yeah, unstructured data
00:06:03
and trying to map it up to semantic standards, this is something, uh-huh,
00:06:08
useful and, uh, interesting, yeah.
00:06:15
Ah, yes, okay. Thank you very much for covering many interesting topics,
00:06:22
and of course we could delve into the details of the different
00:06:25
technologies, but I have a more, let's say, pragmatic
00:06:30
question. Uh, yeah, say the hospitals in Switzerland, okay, they can definitely be
00:06:34
in agreement; we can find a third trusted party, and we
00:06:37
can provide data, and these guys can train in a common place,
00:06:40
unless you think about some, I would say, data ownership or
00:06:45
whatever commercial aspects, okay, so let's just leave this discussion aside. But
00:06:51
I'm more concerned about the, the following aspect. We keep talking about
00:06:54
performing data analysis at the different hospitals, like, well, with different equipment,
00:06:58
and different equipment is characterised by, like, different resolution, like, different sensitivity, yeah,
00:07:04
by different parameters, and this concerns
00:07:06
everything from the microscope to whatever imaging device.
00:07:11
And here you might think, okay, we protect the data and so on, um, but you can clearly understand that,
00:07:16
training the models on the different data originating from different sensors, let's call it like that,
00:07:22
you might have much more serious concerns and a need to harmonise this data
00:07:28
rather than just protecting privacy, and simple standardisation of, say, the format of the
00:07:32
data will not help much if the data is of a very different physical nature.
00:07:37
Yeah, and here is my personal doubt, when I started to think about all these applications:
00:07:42
it's great, it's cool, it's useful, but basically, even if, for the hospitals, you provide the privacy
00:07:48
protection protocols, but whether it is then useful and practical, and whether, pragmatically,
00:07:53
uh, if we use it, the training will converge on the data
00:07:57
we get from different sensors. Yeah, no, I think I agree with you, so
00:08:03
I think, uh, this doesn't solve that problem, so, uh, it's, uh,
00:08:09
I think what it will be is a complementary solution, if you want, to
00:08:14
centralisation. I think centralisation will not disappear, maybe it will become less common, but not disappear, because,
00:08:20
as you mentioned, there are situations in which federated
00:08:24
learning would cause more harm in terms of introducing bias, or,
00:08:28
uh, things that you cannot even monitor, because you don't have access to the original data.
00:08:34
Uh, what I think is that it could be
00:08:37
a way to facilitate some initial exploration of the data before
00:08:43
centralisation, or for carrying out, uh, you know, simple
00:08:48
analyses, or training mature models where you are pretty sure about standardisation.
00:08:53
Um, but of course it's not, it's not the one solution that will,
00:08:58
um, you know, substitute traditional data sharing. Sure. I think the two approaches will
00:09:04
coexist. It's just that, if we want to drive towards centralised data sharing,
00:09:10
especially across borders, it takes really, really long. Uh, in the, eh,
00:09:14
Swiss network that I showed shortly before,
00:09:18
which is currently adopting more centralisation on trusted research environments, which have a lot of,
00:09:25
uh, access control and security: even with that, it takes one year for people to agree,
00:09:30
uh, to sign contracts and so on. So imagine this at the international scale.
00:09:35
So I think federated learning and federated analytics can play a role in sort
00:09:40
of accelerating the first access to the data, but of course it is not the final answer.
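As a closing illustration of this point about federated learning as a complement rather than a replacement, the snippet below is a toy numpy sketch (an assumption for illustration, not the systems mentioned in the discussion) that runs a few rounds of federated averaging on a linear model across three simulated sites, one of which has differently scaled "sensor" data, echoing the heterogeneity concern raised earlier.

# Toy sketch (illustration only): federated averaging of a linear model across three
# simulated sites. Each site trains locally on its own data; only model parameters are
# shared and averaged, weighted by sample count. One site's features are scaled
# differently to stand in for the sensor-heterogeneity concern raised above.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def local_update(w, X, y, lr=0.05, epochs=100):
    # plain local gradient descent on squared error; raw data never leaves the site
    w = w.copy()
    for _ in range(epochs):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

# three sites with different sample sizes and a different "sensor scale" at the last one
sites = []
for n, scale in [(200, 1.0), (80, 1.0), (50, 3.0)]:
    X = rng.normal(size=(n, 2)) * scale
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    sites.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # federated rounds
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    counts = np.array([len(y) for _, y in sites], dtype=float)
    global_w = np.average(local_ws, axis=0, weights=counts)  # FedAvg-style aggregation

print("federated estimate:", global_w, "target:", true_w)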


Conference Program

(Keynote Talk) Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 7:55 a.m.
2509 views
5 minutes Q&A - Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 8:40 a.m.
Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:44 a.m.
5 minutes Q&A - Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:58 a.m.
Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:01 a.m.
138 views
5 minutes Q&A - Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:20 a.m.
Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:26 a.m.
5 minutes Q&A - Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:41 a.m.
(Keynote Talk) Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:06 a.m.
5 minutes Q&A - Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:42 a.m.
AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 11:48 a.m.
5 minutes Q&A - AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 12:06 p.m.
Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:12 p.m.
5 minutes Q&A - Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:30 p.m.
AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:32 p.m.
5 minutes Q&A - AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:54 p.m.
(Keynote Talk) Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:22 p.m.
5 minutes Q&A - Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:50 p.m.
Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:01 p.m.
5 minutes Q&A - Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:18 p.m.
How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:22 p.m.
5 minutes Q&A - How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:38 p.m.
