
Transcriptions

Note: this content has been automatically generated.
00:00:02
Well, I will begin with media right now.
00:00:08
I'm coming from the EPFL side, and I am the
00:00:10
head of the Initiative for Media Innovation, which was created by the
00:00:14
founding members that you can see at the top: academic institutions and media partners.
00:00:21
The IMI is basically a platform
00:00:23
bringing together academia and media companies to advance
00:00:28
research and also to transfer technology and knowledge back to the industry.
00:00:35
With media it's a very multidisciplinary effort:
00:00:41
you have information technology, with EPFL and universities,
00:00:45
but you also have social sciences and journalism, so it's
00:00:48
a very broad mix of fields that comes into play when it comes to
00:00:55
media innovation, as you can see on the right there.
00:00:59
When I talk about media, I am talking mostly about traditional media here,
00:01:05
but it could also be media companies distributing content on
00:01:10
social media. It's quite broad, but within the initiative it is mostly
00:01:17
traditional media outlets. And one
00:01:22
of the application fields where
00:01:25
we work is artificial intelligence, or I should say machine learning:
00:01:29
applications of machine learning for media. We will talk about this because right now
00:01:35
generative AI is changing the game for the media industry quite a bit.
00:01:41
Now, I wanted to show you a project that we initiated and carried out with a Swiss broadcaster.
00:01:48
The challenge came from them:
00:01:51
their broadcast sources are in Swiss German,
00:01:56
and as the programs are broadcast they need subtitles in
00:02:02
Standard German, at least for some passages. You can do this manually, but it is
00:02:13
expensive and takes a lot of time. They
00:02:17
already had some sort of workflow to transcribe, but Swiss German
00:02:24
has no standardized written form,
00:02:29
so really, the words
00:02:34
are not written in proper Standard German. So this
00:02:37
project really trained models, and that is where the data from the broadcaster came in.
00:02:42
I would just say that this is something that enables, as you will see,
00:02:48
a whole set of use cases, and we will get to some examples. These large
00:02:55
language models are trained on data that comes from many sources, and in media, for the whole
00:03:00
industry, they can take on many roles: generating text and translations, but also
00:03:07
images, with hallucination always possible. So
00:03:10
fakes are a risk with generative AI too.
00:03:15
All these changes have been in the making for
00:03:20
quite a long time, so right now, what does all
00:03:24
this mean? This is also being displayed just behind me here, and it asks an
00:03:31
important question. It is the second time you see the iceberg today, but
00:03:35
I think it is a great illustration of
00:03:38
how powerful these generative technologies are.
00:03:44
For media, you can generate text, you can
00:03:47
generate images, you can do all kinds of things, and this is true for everyone. And then you have this darker side.
00:03:53
And if you think about information in general, one of the most pressing
00:04:00
issues here is misinformation, propaganda, disinformation, because these tools can of course
00:04:06
be used at scale to create all sorts of
00:04:12
content that could be false, to push
00:04:17
ideas or agendas. There are also, and we
00:04:22
already talked about this today, ethical issues. Just to
00:04:26
come back to the ethics: I think it starts with the data. We talked about scraping the internet.
00:04:32
This is an illustration of this that I
00:04:35
like; it is from the company Domo, which calls it Data Never Sleeps.
00:04:40
It gives you a rough idea of the amount of data that we all collectively
00:04:46
create every minute; these are the numbers for 2022, every minute on the internet.
00:04:52
And of course these models are trained with huge amounts of data,
00:04:56
and a lot of it is personal data. And it is not personal data
00:05:00
only: media companies, being news websites, are
00:05:04
also being scraped, in this case for their data.
00:05:10
And media is interesting because it is quality data:
00:05:13
it is verified by journalists, and
00:05:17
it is usually well written, so it is quite interesting for these companies too.
00:05:22
And then, as we discussed, you have
00:05:25
biases and so on; the list goes on.
00:05:30
So what does this mean for media? I will go back to my
00:05:34
automatic subtitling of Swiss German into Standard German.
00:05:40
This project was interesting because it is typical of the pre-ChatGPT era:
00:05:44
it lasted two years, twenty-four months, to actually build
00:05:49
task-specific models to do translation from dialects to Standard German.
00:05:54
And at the end of the project, at the beginning of this year, ChatGPT came up, and that is
00:06:00
interesting because it is a brute-force approach to NLP:
00:06:05
ChatGPT can do many tasks. Translation? Yes. Why not summarisation? Yes.
00:06:11
And it was interesting to have the researchers, one team from the University of Geneva here,
00:06:17
do a comparison of the output of their
00:06:21
best model versus ChatGPT.
00:06:25
They did a summative evaluation, with evaluators,
00:06:29
to look and try to see which one was best.
00:06:34
And they actually found that ChatGPT was quite good at producing
00:06:39
output in Standard German, starting from the
00:06:44
raw transcription and a very rough translation; it
00:06:49
was quite good at that. But you can also see the weaknesses. Typically,
00:06:54
it tends to add stuff. If you have used ChatGPT, you know it
00:07:00
likes long sentences, so the output is usually much too long, which does not
00:07:04
work for subtitles; those have to stay quite compact. And
00:07:11
the big word here, of course, is hallucination, one of the big
00:07:15
weaknesses of these models; I say weakness, but it is part of how they are built, of course.
00:07:22
So you also sometimes have paraphrases, and you have
00:07:25
changes, and things that are actually completely invented
00:07:32
and wrong. So it is extremely powerful, but it also has a whole
00:07:37
lot of weaknesses. And despite these kinds of limits,
00:07:44
media companies today, like any industry, cannot just
00:07:48
look away and say, well, this is not good enough, I will just continue
00:07:52
working the way I do now. It is just impossible to do that, because you
00:07:57
will simply lose the game. It is a competitive game for media, and it is quite
00:08:02
hard, so they also have to look into these technologies
00:08:07
and see how they can use them in an ethical way, in a way that basically
00:08:12
keeps the trust they have with their audiences. So I will show you an example of what that can mean.
00:08:19
This is an example from a startup, and it is
00:08:23
like the perfect little toolbox for a journalist or editor. Here
00:08:29
they have access to all kinds of small apps: they can do transcription
00:08:33
of, say, a recorded interview; they could generate an image, why not;
00:08:38
or do some fact checking or a summary. And what is important is that,
00:08:43
of course, they will always verify the information; that is how journalists work.
00:08:48
So even if they use such a tool, they will verify the information and cross-check it,
00:08:54
and the output will still be published by a human, so it is still quality content, why not. And then,
00:09:02
even so, it does raise ethical issues. These
00:09:07
are examples from the media and
00:09:11
from politics as well. On the left here you have an article
00:09:15
from Blick that was published this summer. It talks about a
00:09:19
scam by youngsters who stole millions of
00:09:24
dollars, or Swiss francs, whatever. Okay, a classic article. But the image that is used
00:09:32
here is actually made using Midjourney, so it is a synthetic image: you see
00:09:43
four young people having fun, having a drink in, I guess, a private jet.
00:09:49
So it is an illustration for this article, but it is not a real image. It looks real, but
00:09:49
it is not real. And this really raises
00:09:57
ethical questions about media, about information: can
00:10:01
you use an image like this to illustrate
00:10:06
an article? And it is the same thing here for a political party:
00:10:06
this is the campaign for the federal elections, and you can
00:10:09
see in the image activists blocking the street, with an ambulance,
00:10:14
lights on. It is quite a powerful image, but again,
00:10:19
it is made by one of
00:10:22
these AI tools; it is synthetic, it is not real. Is it ethical in this case?
00:10:29
It is the same kind of question. So now, in Switzerland, Heidi.news was one of the first
00:10:34
media news portals to actually have an AI charter and
00:10:41
say to their audiences, they published it earlier this year: okay, this is
00:10:46
how we would use this type of technology, and these are the red lines.
00:10:51
For example: we will never use
00:10:55
an image that looks like a real image to illustrate an article.
00:10:59
So they have their charter; the SSR is also writing a charter; and there is
00:11:03
also a consortium in Europe working on a common charter that can
00:11:08
be shared by media companies. It is always nice to innovate and try new tools, but then you
00:11:14
have to be careful. And same thing: political parties are also writing
00:11:18
charters right now. Now, of course, we
00:11:24
read, or at least some of you, I hope, still read
00:11:28
a newspaper or go to news sites, but the reality is that
00:11:32
most people now do not get their news from traditional media; they get it on social media,
00:11:38
where they connect with people. So this is typically what you see on social media, right?
00:11:42
And probably all of you have seen these pictures:
00:11:45
the fake arrest of Trump, or Pope Francis in his puffer coat.
00:11:51
Okay, it is hilarious, it is for a good laugh, but these are,
00:11:57
again, photorealistic images that you can easily make with the tools that are available now,
00:12:03
and they look quite like reality. You can
00:12:07
have fun with them, but some people get
00:12:10
deceived, and of course it can also
00:12:14
be used for propaganda, so it is quite dangerous.
00:12:20
Back to media: text is easy, and the same goes for images, you can do all that.
00:12:26
And now we also see deepfake technologies, which we talked about earlier today,
00:12:32
for audio, and also deepfake video, and it has its uses in media, I think.
00:12:38
The question was asked at some point: can deepfakes really be used
00:12:44
in a way that makes sense, for useful applications? There
00:12:49
seem to be some, but then again there are ethical questions and trust questions in
00:12:54
using that. So this is an example from the United States: a radio station called Live 95.5
00:13:01
that is using a virtual radio
00:13:05
host. And not just virtual: it is a
00:13:09
digital twin of their star, one
00:13:13
of their radio hosts, and she is called Ashley Elzinga.
00:13:16
Here they have the AI take over her voice;
00:13:21
that is the first video that we will show in a second, hopefully it works.
00:13:25
And then they really have a virtual copy of Ashley Elzinga, called AI Ashley,
00:13:32
who is actually on air. So when she is on vacation,
00:13:34
typically, they use her digital twin
00:13:40
on air to converse with the audience. We will see that now in this extract.
00:13:51
[video plays]
00:14:35
Quite impressive. So they use a tool called RadioGPT;
00:14:40
the name says it all: it is like ChatGPT, but
00:14:43
you actually have a voice conversation. It works quite well. And
00:14:49
again, on the radio the AI will say that it is AI Ashley, not Ashley Elzinga,
00:14:56
when Ashley Elzinga is on vacation. But still, it is
00:15:02
always difficult to really know who you are actually talking to.
00:15:07
And same thing, I did not mention it, but for the examples from Blick,
00:15:12
the article and the political parties, you do have a mention, a
00:15:16
text saying it was done with Midjourney, with
00:15:21
an AI tool. But again, you see the picture, you remember the picture,
00:15:26
and few people read the small text. So it is
00:15:30
always difficult. Now, what you can do with deepfaked
00:15:35
voice you can of course now do with video, with an avatar.
00:15:40
And this is an example from 2020 already, in South Korea. MBN
00:15:46
is a news channel, so they have breaking news; you can have news
00:15:51
at any time of the day, during the week. And they wanted to
00:15:54
have a copy of their star anchor, again to keep this relationship of trust. So
00:16:00
when she is not in the studio, it is not just news
00:16:04
written on screen, which is something a bit cold; they wanted to have
00:16:07
like a replica of their host, and that is what they
00:16:11
did. So I will just show you a short extract.
00:16:23
[video plays; the anchor speaks in Korean] So she
00:16:31
says she is presenting here, and as you can see, it really
00:16:36
looks like her: you really have the face,
00:16:41
the voice as well, and you also have the gestures. So they really
00:16:45
tried to reproduce her as closely as possible. And it actually surprised some of
00:16:52
the audience in this case, because at some point the digital twin appeared
00:16:58
on screen and it was not really clear that it was one. So people
00:17:02
then wrote about it, and it started a whole conversation. Now, what can be used
00:17:08
for a TV channel like this can of course also be used for propaganda. So you have seen,
00:17:14
I am sure, many examples; this is one from last
00:17:18
year, with the deepfake of Volodymyr Zelensky,
00:17:21
the Ukrainian president, in which he is basically telling his troops to drop
00:17:26
their weapons and surrender. I will play a few seconds here.
00:17:38
[video plays] Just to illustrate: we could
00:17:47
call this a cheap fake; it is not as good as the example
00:17:51
from South Korea and other examples, but it is still,
00:17:55
it is still quite dangerous. This was
00:17:59
spread on social media, of course, but the people who did it also hacked
00:18:04
a Ukrainian news website and posted the video on that news website, to try to
00:18:11
show it as something that was true and
00:18:15
verified. So it is extremely dangerous, even though the example is not
00:18:20
the greatest in the way it was made. And now
00:18:25
the future is already moving towards
00:18:28
text-to-video
00:18:31
models. This is an example, just to illustrate, from a Google tool
00:18:35
called Phenaki. Here, as with images, you describe the video sequence you want,
00:18:42
and you are even your own director, because you can say, well, now zoom, now travel.
00:18:48
So you could really describe the video exactly as you want it to be. And
00:18:54
again, it is fantastic; it will surely change the way the creative industry, the media industry,
00:19:01
will approach content. Think of cinema: I think everything
00:19:05
will change when these technologies are mature enough.
00:19:09
For now, it is not there
00:19:13
yet, but it is moving extremely fast. And we
00:19:20
talked about these issues. So now, looking at actions:
00:19:24
what can be done? We already covered some of this
00:19:27
today. We talked about regulation and standardisation; that is also part of
00:19:33
the solutions now being implemented, like content provenance: the way you
00:19:38
can add information into a piece of content, with all the
00:19:43
provenance, to know exactly whether it was manipulated or changed, and to have this
00:19:48
chain of where the content comes from, basically.
00:19:55
Then you have technical solutions: there are people working today at
00:19:59
Idiap or at EPFL on deepfakes, for example,
00:20:03
trying to improve detection. The generation technology and the detection technology work
00:20:08
hand in hand: yes, of course, the same advances that help
00:20:13
make this content
00:20:17
more realistic also help build the detectors. But it is a cat-
00:20:22
and-mouse game, because detectors do not really work, or they work for a while, and then the technology improves.
00:20:28
So regulation, standardisation and technical solutions are all kind of slow, and they
00:20:33
do not always work across the board. So I
00:20:37
personally think that media and technology literacy is one of the biggest,
00:20:41
you know, one of the biggest steps we can take,
00:20:44
and this starts with kids, but it is a continuous education effort.
00:20:51
So I will end on this. I was asking the question at the beginning: can
00:20:57
you still trust information? I really think you can, if
00:21:00
you use and choose your sources wisely.
00:21:04
Some information on the internet is still verified, by journalists
00:21:10
or by Wikipedia, for example, to take a few examples.
00:21:14
But you need to keep your critical sense,
00:21:18
and the proper education, to be more and more careful as things evolve.



Conference Program

(Keynote Talk) Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 7:55 a.m.
5 minutes Q&A - Privacy-Preserving Machine Learning : theoretical and practical considerations
Prof. Slava Voloshynovskiy, University of Geneva, professor with the Department of Computer Science and head of the Stochastic Information Processing group
Oct. 11, 2023 · 8:40 a.m.
Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:44 a.m.
5 minutes Q&A - Enabling Digital Sovereignty With ML
Vladimir Vujovic, Senior Digital Innovation Manager, SICPA
Oct. 11, 2023 · 8:58 a.m.
Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:01 a.m.
5 minutes Q&A - Privacy-Enhanced Computation in the Age of AI
Dr. Dimitar Jechev, Co-founder and CTO of Inpher
Oct. 11, 2023 · 9:20 a.m.
Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:26 a.m.
5 minutes Q&A - Privacy by Design Age Verification & Online Child Safety
Dr. Onur Yürüten, Head of Age Assurance Solutions and Senior ML Engineer in Privately
Oct. 11, 2023 · 9:41 a.m.
(Keynote Talk) Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:06 a.m.
5 minutes Q&A - Biometrics in the era of AI: From utopia to dystopia?
Dr. Catherine Jasserand, KU Leuven (Belgium), Marie Skłodowska-Curie fellow at Biometric Law Lab
Oct. 11, 2023 · 11:42 a.m.
AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 11:48 a.m.
5 minutes Q&A - AI and Privacy
Alexandre Jotterand, CIPP/E, CIPM, attorney-at-law, partner at id est avocats
Oct. 11, 2023 · 12:06 p.m.
Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, A Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:12 p.m.
5 minutes Q&A - Preliminary Perspectives on the Ethical Implications of GenAI
Julien Pache, A Partner at Ethix and Venture Partner at Verve Ventures
Oct. 11, 2023 · 12:30 p.m.
AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:32 p.m.
5 minutes Q&A - AI & Media: Can You Still Trust Information
Mounir Krichane, Director of the EPFL Media Center
Oct. 11, 2023 · 12:54 p.m.
(Keynote Talk) Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:22 p.m.
5 minutes Q&A - Unlocking the Power of Artificial Intelligence for Precision Medicine with Privacy-Enhancing Technologies
Prof. Jean Louis Raisaro, CHUV-UNIL, assistant professor of Biomedical Informatics and Data Science at the Faculty of Biology and Medicine and the head of the Clinical Data Science Group at the Biomedical Data Science Center
Oct. 11, 2023 · 1:50 p.m.
Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:01 p.m.
5 minutes Q&A - Genomics, AI and Privacy
Julien Duc, Co-Founder and Co-CEO of Nexco Analytics
Oct. 11, 2023 · 2:18 p.m.
How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:22 p.m.
5 minutes Q&A - How trust & transparency lead the success of an Idiap student's Master's project in fraud detection
Raphaël Lüthi, Machine Learning and AI Lead at Groupe Mutuel
Oct. 11, 2023 · 2:38 p.m.
