Transcriptions

Note: this content has been automatically generated.
00:00:00
...a perspective on the ethical implications of GenAI. I said "preliminary" because obviously it's early, we don't know yet, and ethical questions last a long time, so I don't want to be captured on this. So, Ethix is basically two things: it is a think tank and it is a digital ethics advisory firm, so we try to advise organisations on the responsibilities they have to take on amid digital change. Two words about myself: I trained as a lawyer and also as a philosopher of science, so I came to ethics through my training in philosophy, but today I mostly wear the Ethix hat. I'm also a venture partner at Verve Ventures, a venture capital firm; we invest in technology. Why do I have these two hats? Because if I fund something, I hope it will not mess up society in a couple of years, so I try to ask myself the questions before doing so. So, what will I talk about today?
00:01:08
First, I personally worked on a project at Ethix, a very small project, not an academic one, about ChatGPT. Then I'll talk about ChatGPT and the human: we talked in the first talk about design and the importance of design, and I'll try to give some thoughts about that. Then ChatGPT and workers: how they are using it today and what its potential social impact is. I'm not going to quantify it like an economist, but I'll raise some ethical questions about it. Then I'll go to narratives on AI and try to show you that these narratives tend to frame the questions we ask ourselves, and to see whether we are aware of how these frames work. And I'll finish, because it's topical today, with what I think their impact is on security and privacy.
00:02:00
So, the project we did at Ethix on ChatGPT: it was a very practical project, nothing academic. We wanted to know where we actually are right now in the adoption of ChatGPT and what its impact is on different sectors. We picked sectors where we thought there had been some impact, or where impact is going on: journalism, the creative economy, people, you know, playing with pictures and designing stuff, and also people from education. So we did interviews with these people, practitioners. The output was six fundamental philosophical questions that cut horizontally across all categories, and three use cases, one for each category, saying: this is an ethical question that you will face using this tool in your work. And then we gave them a canvas to answer, some reflections they might have with their colleagues, and workshops around it. All this material is free for you to use if you're interested. What I will do now is select one of the six questions and then go to the impact on workers.
00:03:09
But before that: obviously, like any philosopher, and I think to some extent also lawyers, I tend to like definitions, so I have to take one with me on ethics. Ethics is a systematic reflection on values and their implementation in concrete decisions and choices. And ethics is normative versus descriptive: it tells us how things ought to be, not how they are. This morning we had a lot of descriptive material, how things are; now we are going to discuss how things could be or should be from an ethical perspective. So there are three typical kinds of questions we ask in ethics. Some of them are about ethics in the technology itself; there, we'll talk about design. There are social justice questions: how do we deal with the impact on social justice when a technology goes into society? And there are questions about how the frames and narratives shape how we think about AI, or other technologies for that matter. More concretely, if I try to create baskets just to orient ourselves: questions about data ethics, algorithmic ethics and design ethics; on the other side, impact on labour, on people's social interactions, impact on economic inequality. I'm not trying to take the role of an economist, but just asking who should pay for what and why, the "why" being: what exactly is the moral argument for this person to bear such or such a cost? And the last one is how we make sense of the world and how we frame the questions that are ethically relevant. So I hope that's clear, because that will be the framework that will help me go through this.
00:04:55
Well, I was very happy that the first talk spoke about design, because most of the time it is said that technology is neutral, as we've heard: it's just the way we use it, like a knife. I mean, I challenge you to commit a crime with a bread-cutting knife. I guess you could do it, but with the second kind of knife there was probably already an idea about what you would do with it, so I guess the design question comes a little bit more into play. So I completely agree with that part of the first talk, and I disagree with the European Commission if they don't see it that way: technology is not neutral. The way you design it has an impact; the way you intend it to be used matters. An autonomous drone built to kill people is probably not a neutral thing, because it is built for that. I'm just giving a very clear example to show that design matters. And it matters as well when we look at AI tools like chatbots and others. And if I say something very stupid or very unclear, please stop me.
00:06:14
So, ChatGPT and the human. Here I'm going to talk about the design of ChatGPT, and I'm asking the question: are we anthropomorphizing the tool, or is the tool anthropomorphizing itself by the way it is designed? I'm not saying it's bad or wrong per se; I'm just trying to see what design choices we make. So I asked ChatGPT: why did OpenAI actually design you the way it did? OK, it's because natural language is the best way to communicate; it helps adoption; people know written language, it comes naturally to us as humans, so it's also the easiest way to communicate. That might make sense: it is frictionless and we engage with it. Now the question I have is: why also add these three dots? I don't know if you've seen that, but when ChatGPT is composing you get these three dots, like when someone is composing something, thinking about something. But is it thinking? Because an alternative could have been: show nothing, and once it's done, output the passage in one go. But instead you have this scrolling, and this small animation as if it were composing. It's small, but it gives the impression that someone is composing, that someone is thinking, although it's an algorithm that is actually making probability calculations, which you could explain better than me, and certainly not composing the best sentence, not thinking, not pondering how to do it best. So this is a design question, and it seems small, but it obviously has effects; we've seen that it has effects, and there was an engineer who thought the thing was sentient. And this sensation gets reinforced by small details like that, when we anthropomorphize things where maybe we shouldn't. That is the ethical question.
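To make the "probability calculations, not composing" point concrete, here is a minimal, hypothetical sketch in Python. The vocabulary and the probabilities are invented for illustration and have nothing to do with any real model; the point is only that the text appears word by word because it is sampled token by token, and that the "composing" indicator is a separate presentation choice layered on top.

import random
import time

# Toy next-token table: hand-written probabilities over a tiny vocabulary.
# A real LLM computes such a distribution with a neural network; this table
# only illustrates the kind of computation involved. None marks start/end.
NEXT_TOKEN_PROBS = {
    None:   {"the": 0.6, "a": 0.4},
    "the":  {"cat": 0.5, "dog": 0.5},
    "a":    {"cat": 0.5, "dog": 0.5},
    "cat":  {"sat.": 0.7, "ran.": 0.3},
    "dog":  {"sat.": 0.4, "ran.": 0.6},
    "sat.": {None: 1.0},
    "ran.": {None: 1.0},
}

def generate(show_typing_indicator: bool) -> None:
    """Sample one token at a time from the probability table."""
    token, output = None, []
    while True:
        probs = NEXT_TOKEN_PROBS[token]
        token = random.choices(list(probs), weights=list(probs.values()))[0]
        if token is None:
            break
        output.append(token)
        if show_typing_indicator:
            # Purely cosmetic: nothing here "thinks"; we merely choose
            # to reveal the already-sampled token with a small delay.
            print(token, end=" ", flush=True)
            time.sleep(0.3)
    if show_typing_indicator:
        print()
    else:
        print(" ".join(output))  # same text, delivered in one go

generate(show_typing_indicator=True)   # feels like someone composing
generate(show_typing_indicator=False)  # same algorithm, no illusion

The same sampling loop produces both behaviours; whether the user sees a "composing" animation is entirely a design decision, which is exactly where the anthropomorphizing happens.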
00:08:12
Now I want to move to the workers. Sorry, I've changed the order a bit, I'll come to narratives later, but this is also linked to the social question. An AI tool like ChatGPT can either enhance people or replace people, and that sounds bad: if you get replaced and you like your job, you don't want to be replaced; or you are enhanced and you do your job better. But these are narratives that we tend to put in place around things. Now, when you think about workers: are they going to be replaced or enhanced? What is happening? I think the only thing we noticed in our project on ChatGPT is that there will be a transition, there is already a transition, and there are people adopting the tool. So it's not a question of whether it will happen; it will happen. The real question is who will take responsibility for that transition. This is the ethical question.
00:09:08
So, opportunities: obviously I'm not going to tell you everything that is possible with this tool, but for teachers it seems to decrease the amount of time they need to prepare; they can do the scoring and personalize the teaching much faster. Great, they work less as a whole. For the creative workers it's pretty clear as well: they get more input in less time. And for the people in marketing, great: hyper-personalization, mass content, you're going to buy more stuff thanks to their marketing. So it's also kind of a good thing if you are a marketer these days. Now some concerns. Teachers obviously had to deal with a whole class using ChatGPT. Some creative workers say: I thought I was the creative person here, but there is a computer with much more input, much better, much faster; they obviously feel a little bit disempowered. And the last thing, and I'll stop on that: we had an interaction with a representative of the association of writers in Switzerland, not the one striking in the US, but the Swiss one, who said: OK, this seems like an existential risk for us. Because even if the output is not as good as what they could do spending a whole day on it, producing it in, like, half a second is much, much better, so they might just be out of a job tomorrow. And the question is not whether that is a good or a bad thing; the question is, if they are out of work, who pays for that? Is it OpenAI, which needs to pay because they created the problem in the first place and are actually going to collect the dollars? Or is it the state, which says: OK, we have to reskill all these people, we have to figure it out? Or the people themselves, who just happened to be in the path of a technological change that is happening right now? This is not in any regulation yet; at some point it will be a political question, but obviously the answer depends on how we frame things and whether we take responsibility as a society to define who pays for this. I'm not going to give an answer; by the way, that's the good thing about doing ethics: you ask the questions but you never answer any of them.
00:11:15
So, narratives on AI. I'm just going to show how they've been framed. I said this one, enhancement versus replacement, is a very clear narrative that we have today; in most situations it's a mix of enhancement and replacement. But the very fact that we say that machines can be workers and can do work is already kind of anthropomorphizing: it implies the machine is an agent doing work, and before, you didn't have anyone other than humans doing work; it's a property we ascribed to ourselves, and now we're applying it to machines. This matters, because if we frame it like that, we frame a competition between machines and ourselves. Or even, if you look at this picture of a thinking robot, you think: OK, robots are possibly the future. If you look at, I think it was Tesla, that has this robot that looks human and is going to be doing tasks, the fact that we humanize it in that way actually sets up a very frontal competition. There are alternatives: we can make the machine look like, well, a machine, and when it doesn't look very much like something we know, we think: OK, it's obviously going to assist us, not replace us. And when we frame things differently, in a less anthropomorphizing way, it also changes the debate, changes opinions, changes the questions we ask ourselves.
00:12:47
And then there is this one; I've been thinking about it a lot. You probably know the very famous paper by Alan Turing, "Computing Machinery and Intelligence", where the paper asks: can machines think? And he takes one paragraph to say that it is basically a useless question, because he cannot define what thinking is, and then he goes to the imitation game and the whole Turing test. But the single fact that we say that a machine can think, or that a machine is intelligent, is actually saying: OK, a property that we thought of as deeply human, we now define as possible for a machine. So if I define intelligence as, let's say, information processing, sure, you could say information processing, that fits the machine. But if intelligence also includes the capability of making moral judgements, then obviously it becomes much more difficult for machines to have it, and then we would not speak about AI in the first place, because there would be no artificial intelligence if the definition of intelligence we give ourselves as a society could not fit machines. So what I'm saying here is: it's not unusual, concepts change, and we define what concepts are. People always tell philosophers: stop playing with words, the only thing you do is play with words. But these words matter in the end, when we actually frame the questions; this is an example of that.
00:14:14
Now, all that to say: what are the impacts on privacy and security? Here again I take the three kinds of questions and link them to privacy. On the AI narratives side, the question is what it is proportionate to accept for a given amount of benefit. The narrative goes: we get more economic benefit if you give out more data; it's easier to do anything, it's easier to actually buy something; so why not give more data for more economic benefit? We grow the economy, it is good, so just restrict your liberty a little bit, it's fine. So where do we say that there is a limit, and where exactly is the limit? This is the discussion we have, and the narrative we frame around AI does make a difference there. On design impact: obviously more data gets collected as AI tools improve; obviously they get, and mark my words, "smart". When I say "smart", I mean they got better because they collected more data, and the more data they have, the better they get, and the more they know about us. These are things that we have to be aware of when we say: OK, if we design it that way, it will just collect more data. Collecting more data has a benefit, but there is also a cost.
00:15:37
Now the same thing on security. One thing is the AI race: to say, OK, we are competing on values; let's say this country, I'm not going to name anyone, has this kind of values and we have that kind of values, so it's important for us to be ahead in that race, because then our values prevail. So basically there is a risk: if the only thing we care about is speed, we cannot even take the time to design things in a more secure way or to ensure security. So the faster we go, the more risk there is, when speed, not values, is the name of the game. And I challenge anyone to actually point out which values we are competing over; that is obviously a tricky question.
00:16:24
On social justice: concentration of wealth. If it ends with a very tiny number of corporations actually getting most of the AI cake, and it's actually the case, I guess, for the moment at least, that maybe four or five companies are competing to be the platform for this kind of thing, and they actually get all the money out of it, we will probably have a lot of people not happy with that. And if a lot of people are not happy, we have the experience in France a couple of centuries ago: it is not really conducive to building a society, and some heads ended up rolling. I'm not saying that is happening, but social cohesion is definitely in question when there is a disproportionate distribution of wealth.
00:17:08
The last one, and I'm not going to say much because I think it's for the next perspective, the AI and media perspective: it's about democracy. Mass information creation, mass content creation, could lead to epistemological relativism: which facts do we still agree on? And if there is a postmodern epistemological relativism, how do we build a society, given that we don't agree on the facts?

Conference Program

(Keynote Talk) Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 7:55 a.m.
5 minutes Q&A - Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 8:40 a.m.
Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:44 a.m.
5 minutes Q&A - Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:58 a.m.
Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:01 a.m.
5 minutes Q&A - Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:20 a.m.
Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:26 a.m.
5 minutes Q&A - Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:41 a.m.
(Keynote Talk) Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:06 a.m.
5 minutes Q&A - Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:42 a.m.
AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 11:48 a.m.
5 minutes Q&A - AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 12:06 p.m.
Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:12 p.m.
5 minutes Q&A - Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:30 p.m.
AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:32 p.m.
5 minutes Q&A - AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:54 p.m.
(Keynote Talk) Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:22 p.m.
5 minutes Q&A - Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:50 p.m.
Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:01 p.m.
5 minutes Q&A - Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:18 p.m.
How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:22 p.m.
5 minutes Q&A - How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:38 p.m.
