Transcriptions

Note: this content has been automatically generated.
00:00:01
Now, on to AI and privacy; we'll discuss that a bit.
00:00:06
A short presentation about me: maybe the only important aspect
00:00:10
is that I have a law degree, a legal background, not
00:00:13
a technical one, which impacts the bubble I'm working
00:00:17
in, which is not the same as the one most of you are in.
00:00:20
I'm also a member of a Swiss data protection association; I'm just mentioning it
00:00:25
because the name will come back again during the presentation, as
00:00:29
it is an important player in privacy regulation.
00:00:37
And I decided to take a helicopter
00:00:42
view of this issue and present my perspective
00:00:46
on this topic. So I have three topics that I want to discuss, and
00:00:51
I hope my presentation will be sufficiently brief that you
00:00:57
hear it to the end. So let's jump right in. The first point is that
00:01:02
privacy is not the core problem of artificial
00:01:06
intelligence, and maybe for you it's something entirely logical,
00:01:10
and again that might be a question of bubble, because in
00:01:14
the ecosystem where I work, when we see artificial intelligence, the
00:01:19
first things that come to our mind are: how does it impact
00:01:23
privacy, all the problems that I have with data protection. But actually,
00:01:28
as we can see in these images, private information is just a
00:01:32
small topic, and all the others, I think, are much more important than
00:01:36
privacy. So maybe you're going to say that I'm
00:01:40
pushing at an open door and that it's all very logical, and it probably is,
00:01:45
but the question that it raises for me is:
00:01:48
who should be responsible for supervising or managing artificial intelligence?
00:01:53
And when I look, again, at my bubble, which may be different from yours,
00:01:59
from my perspective, in companies, the persons that are in charge
00:02:04
of deciding whether artificial intelligence can be used or not
00:02:09
are the data protection officer or the legal officer,
00:02:12
or someone that has this regulatory background.
00:02:16
Yet actually there are a lot of other questions that
00:02:19
are very important, ethical ones, that have been discussed just before,
00:02:23
and maybe the DPO is not the best person to answer them.
00:02:27
The same goes for technical issues. Maybe you would say, well, it should be, you know,
00:02:32
the tech people, the data scientists, who should be responsible,
00:02:37
but generally that's not the way it goes: you don't ask, you know, a trader
00:02:42
to define the rules for trading commodities and securities.
00:02:47
But the question is who should be responsible in an organisation, and we have, I think, the same
00:02:52
question regarding government agencies: who
00:02:55
should be responsible for supervising the use of artificial intelligence?
00:03:00
And again, if we look at how it has been handled so far
00:03:04
in Europe, it has clearly been a task of the data protection supervisory authorities,
00:03:11
especially the Italian one; we will come back to that. It may be different in the
00:03:15
US, where, for instance, it is the FTC that has had a say about
00:03:22
what artificial intelligence should do and how we
00:03:26
can market our products that use artificial intelligence.
00:03:32
And here is a news item from the CNIL, which is the French
00:03:37
data protection supervisory authority, from twenty
00:03:42
twenty-two, basically stating that the CNIL
00:03:46
would be the authority that is responsible for
00:03:51
managing the new regulation that you just
00:03:54
heard about in Europe regarding artificial intelligence.
00:03:59
And for me it really raises a question: is it the right authority to do that? Should we not create a special
00:04:05
authority to do so, like Spain has done
00:04:08
now? And of course the risk is that if you
00:04:12
choose someone whose primary view is privacy, then they will only focus on that, and maybe
00:04:18
they will not focus enough on the other aspects. So if there are
00:04:23
policymakers in this room, I urge you to consider
00:04:27
this aspect, because I think that it's an important one.
00:04:31
Second aspect: the privacy-related risks of
00:04:35
AI are not new, they are just more intense.
00:04:38
And here's an image that came out just a few weeks after ChatGPT was released,
00:04:44
and I think it's the perfect view of, you know, what happened
00:04:47
in my bubble of privacy just after these events:
00:04:52
you see Microsoft and OpenAI very happy, showing off ChatGPT;
00:04:58
ChatGPT carrying away IP, privacy and trade secrets;
00:05:01
Google, which had not released Bard at that time, just looking on
00:05:06
a bit stupidly; the average data protection authority turning its back on the problem;
00:05:12
and very few authorities, especially the Garante, which is the Italian supervisory authority,
00:05:18
being very alert and making some moves directly.
00:05:25
And if we really want to see how privacy interacts with
00:05:30
artificial intelligence, we have to see the big picture again,
00:05:34
and I will take the helicopter view. Maybe we will oversimplify technical problems, and that's a
00:05:41
good thing; that is what lawyers do: they
00:05:43
oversimplify technical problems and they overcomplicate legal ones,
00:05:48
and so I will at least do the first part; I will not do the second one.
00:05:51
But if we look at how an artificial
00:05:55
intelligence model interacts with data, I see three ways.
00:05:59
We have here the training data, which we have spoken quite a lot about already; I will
00:06:03
speak about that again. Then we have the input data: the user of the model will
00:06:09
input data in order to obtain something, to get predictions or decisions. And then
00:06:15
the model will generate some data; that is the output data.
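To make those three touchpoints concrete, here is a deliberately minimal sketch of my own (not from the talk's slides): a toy nearest-mean classifier, chosen only to show where training data, input data and output data each enter the picture.

```python
# Toy illustration of the three places where an AI model touches data:
# (1) training data, (2) input data supplied by the user, (3) output data
# generated by the model. The "model" is a trivial nearest-mean classifier,
# kept this simple only so the sketch is self-contained.

def train(training_data):
    """(1) Training data: records used to fit the model's parameters."""
    by_label = {}
    for features, label in training_data:
        by_label.setdefault(label, []).append(features)
    # The model's parameters are per-label feature means.
    return {label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in by_label.items()}

def predict(model, user_input):
    """(2) Input data goes in; (3) output data comes out."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], user_input))

model = train([([1.0, 1.0], "low"), ([1.2, 0.8], "low"),
               ([5.0, 5.2], "high"), ([4.8, 5.1], "high")])
print(predict(model, [5.0, 5.0]))  # output data derived from the input
```

Each of the three data categories can carry personal data, which is why the talk examines them separately.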
00:06:20
So if we move directly to the first one, input
00:06:23
data, and we try to consider the privacy impact:
00:06:29
the first question that we have is, are we actually processing any personal data? It can be the case,
00:06:34
but often it won't be, and in that case there won't be any data protection issue,
00:06:40
which also raises the question: should regulators really be the ones dealing with that?
00:06:45
An open question. And if we do process personal data, because we often will,
00:06:50
then we have very, very traditional data protection issues. I'm not saying that they're simple to answer,
00:06:55
but they are actually very classic. So if you use, you know, the tools
00:06:58
from the two people who spoke this morning,
00:07:05
then it's basically about managing them: you want to have a vendor that is trustworthy;
00:07:11
you want to ensure that data is not transferred over the border, specifically to the US,
00:07:16
without some safeguards in place; you want to know whether the data that you
00:07:21
input would be reused by ChatGPT or by the tools that we heard about,
00:07:25
and whether it is processed, based on consent, for further training of the model;
00:07:30
and there is this question about collecting metadata, information about
00:07:33
the use of the service. It's complex, but it's known; it's
00:07:38
what we have done as lawyers in data protection for
00:07:40
years, and it's not something that is very specific to artificial intelligence.
00:07:47
But first I wanted to introduce one maybe different topic. Does anyone know what this is?
00:07:55
Yes, exactly: it's called the Rewind pendant,
00:08:02
and it's basically an AI device that will record all your conversations,
00:08:09
with the view of making, you know, a permanent record of anything that you've talked about,
00:08:15
allowing you to go back and say, okay, I forgot what my wife (that's the example they use)
00:08:21
asked me for, so I can come back to that; or
00:08:25
maybe I want to hear again the first words from my newborn baby.
00:08:30
Yeah. It raises a lot of questions, you
00:08:32
know; there were these comments on X
00:08:37
about what a very creepy device it is; it really looks like
00:08:42
Black Mirror. Actually, there was an episode with exactly the same thing.
00:08:48
And so apparently it will exist; there is
00:08:51
already a price tag attached to it. So it's not quite there yet, but
00:08:54
it will be here. And of course it raises a great many new questions regarding
00:08:59
the input data, because the data that you would input is not yours,
00:09:02
and that raises extra questions regarding consent and authorisation. And
00:09:07
they are currently studying some privacy-preserving features
00:09:11
with voice fingerprints, where basically those who have consented to being recorded
00:09:16
register a fingerprint of their voice, and if you don't, then it will not record your voice.
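The opt-in mechanism described here could look roughly like the following sketch; the threshold value, the embedding format and the function names are all assumptions for illustration, not Rewind's actual implementation.

```python
import math

# Hypothetical sketch of consent-gated recording via voice fingerprints:
# only speakers whose voice "fingerprint" (an embedding vector) was registered
# together with their consent are kept; everyone else's audio is dropped.
# The embedding extraction itself is out of scope; we assume fixed-size
# vectors produced by some speaker-recognition model.

CONSENT_THRESHOLD = 0.85  # assumed similarity cutoff, not a real product value

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def filter_segments(segments, consented_fingerprints):
    """Keep only segments whose speaker embedding matches a consented voice."""
    kept = []
    for embedding, audio in segments:
        if any(cosine_similarity(embedding, fp) >= CONSENT_THRESHOLD
               for fp in consented_fingerprints):
            kept.append(audio)
    return kept

# Example: one registered (consenting) speaker; the second segment's
# embedding does not match any fingerprint, so its audio is discarded.
registered = [[1.0, 0.0, 0.2]]
segments = [([0.98, 0.05, 0.21], "segment-A"),
            ([0.0, 1.0, 0.0], "segment-B")]
print(filter_segments(segments, registered))
```

The design choice worth noting is that the filter is applied before anything is stored, which is what would make it a privacy-preserving default rather than an after-the-fact deletion.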
00:09:24
So that's it for the input data. Now the training data: that's something
00:09:27
we already talked about this morning, and here again there's nothing very new.
00:09:34
So basically the only difference is the new scale, so, you know,
00:09:38
the same issues as before, not new, but with more questions.
00:09:44
And there was this question that was actually just asked before: can I
00:09:47
scrape data from the internet? And it is a very interesting question indeed,
00:09:52
and the answer is quite expensive, because it depends on whether or not
00:09:58
you or your product or your company will be fined. The US have,
00:10:03
in one case, decided that an algorithm that was
00:10:07
created based on data that was collected without
00:10:10
a legal basis would need to be erased, so-called
00:10:14
algorithmic disgorgement, which was a case in the US;
00:10:17
very interesting, because we don't have this sort of penalty in Europe or Switzerland regarding the legal basis. Well then,
00:10:24
can I scrape data from the internet? The answer is: it
00:10:28
depends, you know, on where you are, who you are, what you do and whom you ask.
00:10:32
So, if you are the Google search engine: there was actually a case in
00:10:38
two thousand fourteen from the highest court of justice in the EU,
00:10:44
and they basically said that Google can scrape the internet to
00:10:48
create a search engine, based on their legitimate interest in the content that they index.
00:10:55
Then there was another case regarding Clearview, you know, the company that
00:11:00
took pictures of all the faces available on the web and then created
00:11:05
a facial recognition software. Then, if you remember the Clearview case,
00:11:10
the CNIL fined Clearview twenty million euros in twenty twenty-two.
00:11:16
And then, if you ask the Garante, so the other,
00:11:19
the Italian supervisory authority, about ChatGPT:
00:11:23
here we have an interesting case, because, I don't know if you remember,
00:11:27
but when ChatGPT was released, the Italian Garante said: now,
00:11:33
I am not sure that you can comply with the GDPR, so I order you to stop
00:11:39
providing the service in Italy. And if you
00:11:43
had been in Italy, you could not have accessed
00:11:48
ChatGPT at all for something like a month. And one of the questions that was raised was that there was a lack of legal
00:11:53
basis, and there was a lot of discussion in my small bubble about that, because there was really a question: what would
00:12:00
the Garante do after the thirty days? Because it
00:12:03
was really clear that ChatGPT would not change in thirty days.
00:12:09
And after the thirty days, well, basically
00:12:12
the Garante said: okay. And here, I think, it again pushed at open doors:
00:12:16
OpenAI had now provided, you know, a mechanism for users to complain,
00:12:21
and OpenAI had said
00:12:25
that they would make an effort at correcting inaccurate data, although they cannot fully do that,
00:12:30
and that was deemed sufficient. So there are, you know, various views
00:12:33
on whether or not you can scrape the internet and train based on your
00:12:37
legitimate interest; it is more, I would say, a confirmation that you can. How much
00:12:43
time do we have left? Okay. And, as was mentioned, regarding what you can do: if you
00:12:49
ask me, then there is the question about the secondary use of personal data. It
00:12:54
links to scraping data from the internet, but it also arises if you have your own private data set
00:13:00
and you want to reuse the data for
00:13:04
training your AI models. Then the question
00:13:07
is, do I need to get consent? And here in
00:13:11
Switzerland, at least, we have this general authorisation to process
00:13:14
personal data for what we call non-personal purposes:
00:13:18
as long as you are processing the data for results
00:13:21
that are not meant to identify any specific individual,
00:13:25
then you can reuse it without the need for consent. That is the data protection view; there are specific
00:13:32
rules, as stated here, in the Human Research Act. I guess we are going to have some more information
00:13:38
on that just afterwards, so I will not
00:13:42
dwell further on that topic here.
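The idea of reusing personal data only for results that do not relate to any specific individual can be sketched like this; the grouping key, the suppression threshold and the field names are illustrative assumptions of mine, not a statement of what Swiss law requires in a given case.

```python
# Illustrative sketch of the "non-personal purposes" idea: personal records
# are reduced to aggregate statistics before reuse, so the result no longer
# identifies any specific individual. Toy example, not legal advice.

from statistics import mean

def aggregate_for_training(records, min_group_size=5):
    """Return per-group averages, suppressing groups too small to hide
    individuals (a crude small-cell suppression rule; threshold assumed)."""
    groups = {}
    for rec in records:
        groups.setdefault(rec["region"], []).append(rec["amount"])
    return {region: round(mean(vals), 2)
            for region, vals in groups.items()
            if len(vals) >= min_group_size}

records = ([{"region": "GE", "amount": 100 + i} for i in range(6)]
           + [{"region": "VS", "amount": 50}])  # a single record: suppressed
print(aggregate_for_training(records))
```

The small-cell suppression matters: an "aggregate" computed over one person is still, in substance, data about that person.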
00:13:45
And then, very quickly, on output data, so
00:13:51
whatever, you know, comes out of the AI models.
00:13:56
And here I think we really have privacy risks, quite a lot,
00:14:01
and a lot of other issues that, you know, you have already discussed by now. But it's
00:14:07
where we can have, I think, the most risks from a personal point of view. And
00:14:13
of course the entry point for privacy laws is that
00:14:17
there needs to be personal data in the output data.
00:14:21
And then you can have inaccurate results about people, which is the problem with ChatGPT, and here the problem is that
00:14:28
it's one of the key principles of data protection laws that you have to ensure the accuracy of the personal data you process,
00:14:34
and it's often not doable, but apparently no one cares. And then you
00:14:38
can have too accurate results about individuals; those are the questions about the capabilities
00:14:44
of facial recognition or the various voice recognition capabilities, and it's
00:14:51
a real, real problem. And then there are all these, you know, deepfake
00:14:55
issues, and there is, I think, a very interesting question: is it your personal data?
00:15:00
If you want to go there, there is the case, you know, of the AI-generated track
00:15:07
that actually recorded a song with the voice of
00:15:11
Eminem, but not by Eminem himself;
00:15:15
they created that song, so it is the voice of Eminem and it is the style of Eminem,
00:15:21
but it's not a song from Eminem, and the question is, well, is it the privacy right of
00:15:26
Eminem or not? Because it's not really him speaking. And I think these are very, very interesting questions.
00:15:32
And the last topic that's relevant is the topic of automated decisions:
00:15:37
the risk that, you know, computers will take decisions that impact people without a human
00:15:43
in the middle. I'm not sure it's actually a privacy issue, but it's an issue that
00:15:48
has been put into privacy laws, so we have to deal with it, and clearly it's
00:15:53
an open problem. Last point:
00:15:58
solutions exist, but flexibility is needed, and
00:16:02
when I say flexibility, what I mean is that you should be ready not
00:16:07
to be fully compliant. I have heard a few times, you know, about GDPR compliance
00:16:11
this morning; I don't think that's something that can be achieved generally, and
00:16:14
I really don't think that it can be achieved in the AI
00:16:18
world. So there are steps that you can take to reduce your
00:16:21
risks. First, you could avoid working with personal data whenever it's possible:
00:16:27
there are techniques to anonymise the data, and they don't always work as intended,
00:16:30
but it's still a good thing to do.
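A minimal sketch of the kind of de-identification step meant here, with assumed field names and generalisation rules; as the caveat above says, this reduces identifiability rather than guaranteeing anonymity, since combinations of attributes can still re-identify people.

```python
# Direct identifiers are dropped and quasi-identifiers generalised.
# This alone does not guarantee anonymity; it reduces, not removes, risk.

def de_identify(record):
    decade = (record["age"] // 10) * 10
    return {
        "age_band": f"{decade}-{decade + 9}",      # coarsen age to a decade
        "zip_prefix": record["zip"][:2] + "**",    # coarsen location
        "diagnosis": record["diagnosis"],          # keep the useful payload
        # name, exact birth date, full address are simply not emitted
    }

patient = {"name": "A. Muster", "age": 47, "zip": "1207", "diagnosis": "flu"}
print(de_identify(patient))
# {'age_band': '40-49', 'zip_prefix': '12**', 'diagnosis': 'flu'}
```

Dropping fields at the point of collection, rather than masking them later, is what makes this a data-minimisation measure in the privacy-by-design sense discussed next.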
00:16:34
Then you can adopt all these privacy-by-design and by-default
00:16:37
techniques; it's a requirement of the new Federal Act on Data Protection here in Switzerland,
00:16:43
and so all the techniques that we have spoken about this morning to
00:16:46
reduce identifiability, et cetera, et cetera. And then there is the DPIA:
00:16:52
it's important, under the new data protection act, to conduct data protection
00:16:56
impact assessments, that is, a systematic review of the privacy and security risks
00:17:01
of your processing activity. It will be required in most AI
00:17:05
projects, so you definitely need to do that, and it will already help you
00:17:09
take certain actions and mitigations. And last
00:17:13
but not least, you will have to accept that whatever you
00:17:16
do, you will not be entirely compliant with the regulations.



Conference Program

(Keynote Talk) Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 7:55 a.m.
5 minutes Q&A - Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 8:40 a.m.
Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:44 a.m.
5 minutes Q&A - Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:58 a.m.
Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:01 a.m.
5 minutes Q&A - Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:20 a.m.
Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:26 a.m.
5 minutes Q&A - Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:41 a.m.
(Keynote Talk) Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:06 a.m.
5 minutes Q&A - Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:42 a.m.
AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 11:48 a.m.
5 minutes Q&A - AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 12:06 p.m.
Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:12 p.m.
5 minutes Q&A - Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partners at Verve Ventures
Oct. 11, 2023 · 12:30 p.m.
AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:32 p.m.
5 minutes Q&A - AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:54 p.m.
(Keynote Talk) Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:22 p.m.
5 minutes Q&A - Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:50 p.m.
Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:01 p.m.
5 minutes Q&A - Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:18 p.m.
How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:22 p.m.
5 minutes Q&A - How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:38 p.m.