
Transcriptions

Note: this content has been automatically generated.
00:00:00
Any questions?
00:00:07
Hello, thank you for the talk.
00:00:10
I think the question is: in your legal bubble, are you discussing
00:00:16
whether a machine learning model is considered personal data or not?
00:00:22
What about the models you mentioned? Yes, because in some way, and I'm
00:00:26
going to show it later in the talk, a model is a sort of
00:00:30
compressed version of your training data, right? So it might also be considered personal data.
00:00:38
Yeah. When we start sharing models instead of the data,
00:00:43
the question is: are we under the GDPR or not?
00:00:46
Yeah, and the easy answer to that is that, you know, whenever
00:00:51
you could indirectly identify individuals within your datasets, which
00:00:55
will soon be possible in a lot of cases,
00:00:58
your machine learning model will contain personal data.
00:01:04
And that's the answer you will get from lawyers and, more importantly, from courts.
00:01:09
Basically, that is what they would say; they will not really focus on the technical aspects beyond that.
00:01:24
A question related to that: is the definition of personally
00:01:31
identifiable information a static definition, or does it evolve
00:01:35
with what is technically capable or possible at any time?
00:01:38
For example, maybe nowadays you could almost identify someone by the style in which they write.
00:01:44
Maybe this wasn't possible before. So how do we see the definition of these
00:01:50
things evolve with the capabilities? Yeah, okay.
00:01:55
So it's definitely not static, and first of all it depends on the context.
00:02:02
What is personal data for you may not be personal data for me.
00:02:06
Direct identifiers would be personal data for everyone,
00:02:11
but for indirect identifiers, maybe you have the additional
00:02:14
information that is required to identify the individuals, and I do not.
00:02:20
For instance, you pseudonymize your dataset and share it with me:
00:02:25
suddenly that data is still personal data for you, because you have what you need
00:02:31
to link the data back to the individuals, but I don't have it.
00:02:38
So for you it's personal data and for me it's not, which has a lot
00:02:41
of impact on whether we must both abide by data protection laws.
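The pseudonymization scenario described here can be sketched in a few lines, as a minimal illustration (not from the talk; all names and values are hypothetical): the data holder keeps a key table linking pseudonyms to identities, so the shared records remain indirectly identifiable for them, while the recipient, lacking the key, cannot link the records back.

```python
# Hypothetical records held by the data holder.
records = [
    {"name": "Alice", "diagnosis": "flu"},
    {"name": "Bob", "diagnosis": "asthma"},
]

# The holder replaces direct identifiers with pseudonyms and keeps the key.
key_table = {}   # pseudonym -> real identity (stays with the data holder)
shared = []      # what the recipient receives
for i, rec in enumerate(records):
    pseudonym = f"P{i:04d}"
    key_table[pseudonym] = rec["name"]
    shared.append({"id": pseudonym, "diagnosis": rec["diagnosis"]})

# The holder can still link records back to individuals...
assert key_table[shared[0]["id"]] == "Alice"
# ...but the shared records alone contain no direct identifier.
assert all("name" not in rec for rec in shared)
```

The legal point maps onto the two variables: whoever holds `key_table` holds personal data; whoever holds only `shared` may not, depending on what other means of re-identification are reasonably available to them.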
00:02:47
And the question for me is whether I could still identify
00:02:51
the individuals with the information that I have in my surroundings.
00:02:56
And for that, you have to take into account the reasonable
00:02:59
means that I could leverage to identify the individuals. And
00:03:07
I don't think there is a lot of research from the legal field
00:03:10
right now that goes into how far you need to look,
00:03:15
so we would rather consider basically the current capabilities, and that is why I would say that
00:03:20
maybe, you know, you really have to be strict about
00:03:25
proving that what you have is really not personal data.
00:03:29
Does that answer your question? Sure, yeah.
00:03:38
Okay, so the question is: what is, from a legal perspective, the difference between, let's
00:03:43
say, Google and ChatGPT? Both systems take text and give you some output
00:03:49
that they deem relevant. What is the difference from the legal
00:03:53
standpoint? Yeah, I think it's a very sound question.
00:03:58
I guess the view in the case I mentioned was
00:04:03
that, while it is kind of the same, we don't want to reopen the door, you know,
00:04:08
creating a situation where a search engine would not be lawful under EU
00:04:12
law because it is doing kind of the same thing. But are they really doing the same thing?
00:04:18
I'm not entirely sure about that, because, you know, the impact
00:04:23
that ChatGPT can have on an individual is very different
00:04:26
from the one that a search engine has.
00:04:36
Yeah, well, one of the differences is that with Google, the ruling
00:04:45
of the Court of Justice actually came also with the right to be forgotten.
00:04:51
So they say: okay, you are legitimate in collecting the data,
00:04:55
scraping the data online and creating this index, whatever,
00:05:01
but in exchange you will have quite cumbersome obligations, which will
00:05:05
be to make a person disappear from the results,
00:05:10
which was very challenging at the beginning. And as far as I'm concerned,
00:05:14
I'm not sure that ChatGPT is currently able to do that.



Conference Program

(Keynote Talk) Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 7:55 a.m.
5 minutes Q&A - Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 8:40 a.m.
Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:44 a.m.
5 minutes Q&A - Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:58 a.m.
Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:01 a.m.
5 minutes Q&A - Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:20 a.m.
Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:26 a.m.
5 minutes Q&A - Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:41 a.m.
(Keynote Talk) Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:06 a.m.
5 minutes Q&A - Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:42 a.m.
AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 11:48 a.m.
5 minutes Q&A - AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 12:06 p.m.
Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:12 p.m.
5 minutes Q&A - Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:30 p.m.
AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:32 p.m.
5 minutes Q&A - AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:54 p.m.
(Keynote Talk) Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:22 p.m.
5 minutes Q&A - Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:50 p.m.
Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:01 p.m.
5 minutes Q&A - Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:18 p.m.
How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:22 p.m.
5 minutes Q&A - How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:38 p.m.
