Transcriptions

Note: this content has been automatically generated.
00:00:00
hi everyone, I hope you can hear me. This is quite a broad topic to
00:00:05
address in fifteen minutes, a legal perspective on AI and democracy, but I will try at least
00:00:10
to do my best to give you a snapshot on certain points. Maybe some preliminary remarks:
00:00:17
there is no doubt that the internet has democratised access
00:00:21
to content and vastly expanded individuals' ability to create,
00:00:26
share and receive all kinds of content, transforming
00:00:30
our relationship with communication and with each other.
00:00:34
Alongside those benefits, however, there is no doubt either
00:00:38
that the internet has endangered societies and individuals, to their detriment.
00:00:43
The internet has been, and remains, a channel
00:00:47
to disseminate harmful online content, serving as a vehicle
00:00:51
for all kinds of disinformation and
00:00:56
fake information, undermining the safety and the dignity of individuals,
00:01:03
dividing societies and thus dividing democracies.
00:01:07
Examples in that regard
00:01:11
are legion: think of the Cambridge Analytica scandal, which everybody remembers,
00:01:15
as well as, basically, what was mentioned earlier, the report by
00:01:21
Robert Mueller about the Russian interference during the 2016
00:01:25
presidential election. The fact that algorithms are basically
00:01:31
designed to cater online content to each user's
00:01:34
preferences has led to polarisation and fragmentation
00:01:40
of our societies, cultivating more extremist
00:01:43
ideas and eroding cohesion to a large extent.
00:01:48
Coupled with that, and closely related, is the advent of social media
00:01:53
and the concentration of power in a handful of big
00:01:57
tech companies that have remained largely unregulated, which
00:02:03
has basically created the society of surveillance capitalism, as Zuboff puts it in her
00:02:09
seminal work, where private actors compete with governmental institutions. So what about AI?
00:02:19
Well, at some point AI has only amplified these concerns, due to the
00:02:25
rapid advancement of the technology. So what are these risks? We have already touched upon them quickly,
00:02:31
but typically you can take an example already in the US a couple of years ago, where there was
00:02:37
a case that went up to the Wisconsin Supreme Court related to the use of a
00:02:44
closed-source risk assessment tool to assess offenders and the risk of
00:02:50
reoffending of convicted people, and where the question was: can we basically
00:02:56
trust these tools? How does a court assess,
00:03:00
basically, the transparency of the algorithm behind them, et cetera,
00:03:04
from a due process standpoint? So that's a key issue.
00:03:09
Just for you to know, the Wisconsin Supreme Court finally ruled
00:03:12
that it was OK, but that there should be improvements in the future
00:03:16
as to the information provided, to try and understand the algorithms.
00:03:21
Biases and discrimination are everywhere, typically in
00:03:24
tools for hiring decisions. Fundamental rights: you
00:03:29
know that in China there are seven hundred million
00:03:31
cameras everywhere, basically, which obviously is a serious
00:03:37
fundamental rights issue, with a chilling effect on democratic participation. And fakes, we have already mentioned that,
00:03:44
I would say deepfakes, not only for disinformation, that's one thing, but also for privacy-related issues that can have
00:03:51
a serious impact; I'm thinking about revenge porn for instance, and you know that in the US there is now a bill
00:03:57
which is being discussed; while we see that the US are quite reluctant to rule on anything,
00:04:03
on that front they are thinking of putting a ban on this type of deepfake.
00:04:08
Rightfully so. That said, there are certain coalitions which exist, and probably
00:04:15
you know of the C2PA, which is
00:04:18
a partnership, a consortium of certain companies, to
00:04:21
try and enable the authentication of content. But so
00:04:28
far it sounds, from what I have read (I'm not an engineer and I'm not in a position to assess that),
00:04:33
that the outcome and the efficiency of these
00:04:36
tools, of this partnership, is limited. So
00:04:42
ultimately it sounds that when technology is the
00:04:46
problem, technology is not always the solution, or at least
00:04:50
it is not always sufficient. And
00:04:55
as people have basically become increasingly aware of the risks
00:04:59
and potentially harmful effects associated with digital tools, and with
00:05:04
these big tech companies' vast associated economic powers, it is not a surprise that there is
00:05:10
a groundswell calling for regulation. But then
00:05:14
the question is: what is the regulatory model?
00:05:19
So basically, if we look at a high
00:05:23
level, we have three types of regulatory models that exist worldwide.
00:05:27
One is the US approach; the US approach
00:05:32
is what I would call techno-libertarian,
00:05:35
you know, a market-driven approach. Then you have
00:05:38
China, which is clearly a state-driven approach,
00:05:43
a techno-autocracy I would say. And then you have the EU, which is more of a human rights-driven approach. So
00:05:51
Basically, in the US there is one fundamental right which trumps everything:
00:05:56
free speech. So: free speech, free internet, incentives to innovate,
00:06:01
and government intervention is considered to compromise the efficiency of the market and to
00:06:14
undermine, ultimately, individual liberty; that's at least the general perspective.
00:06:14
Whereas in China it's very different: the goal is to maximise the country's technological dominance
00:06:19
while maintaining social harmony, as they call it, and control over
00:06:23
communications; so it's rather a tool in service of power, of course.
00:06:29
And the EU, you know, is very different, a human rights-centric approach;
00:06:34
unlike the US, of course free speech does exist, but it's not
00:06:38
the only fundamental right to focus on, and we try to have a balance
00:06:42
between different fundamental rights and a host
00:06:46
of other considerations. The outcome of this is the
00:06:50
following. So in the US, typically, you have so far (though it is currently under review, potentially)
00:06:56
Section 230 of the Communications Decency
00:07:00
Act, which pretty much provides for a wide exemption of interactive service providers,
00:07:07
to say: well, in an action you are not
00:07:10
liable for whatever happens on your platforms. And this
00:07:15
is basically meant to create an incentive for innovation, but with
00:07:19
significant consequences, because basically they are totally exempted from liability;
00:07:24
that is now being put into question. You have heard about the executive order released
00:07:31
last autumn, I believe, by the Biden
00:07:34
administration, which is not a regulation actually, it's more guidelines, to
00:07:39
try and make some recommendations before regulating, to
00:07:44
design trust and governance around AI models. So
00:07:49
the US perspective heavily relies on the private sector,
00:07:53
on guidelines and codes of conduct. In China, the Cyberspace Administration of China
00:08:00
has actually issued a certain number of regulations as well, that are
00:08:05
very specific, related to algorithmic recommendations for instance,
00:08:09
or more recently with the so-called interim measures (I
00:08:12
don't know why "interim", but interim measures) on generative AI.
00:08:16
And what's quite interesting when you read them translated (I don't read Mandarin,
00:08:22
so translated into English or into French): you have certain
00:08:26
things that echo common concerns, like explainability,
00:08:30
transparency, sources, et cetera, but it's always in relation
00:08:35
(it's maybe a bit small there, it's a French translation)
00:08:38
to saying: well, you need to adhere to the social values; it tries
00:08:44
to protect, basically, the policy standpoint of the
00:08:49
Chinese Communist Party. So it's always driven in that spirit.
00:08:54
And in the EU, we know, of course, there is now the so-called
00:09:00
EU AI Act, whose final version has just been approved.
00:09:06
So what is the regulatory model to follow? Obviously, as expected, the
00:09:10
Chinese model is not acceptable in our culture, and it sounds
00:09:15
more and more like relying on the private sector, the American model, proves
00:09:21
to gradually be a failure. The global presence of big tech companies, associated with what is considered
00:09:27
an excessive power they wield globally,
00:09:31
triggers backlash now on a worldwide basis, and internet
00:09:36
users are getting more concerned about harmful online content, including
00:09:40
disinformation, propaganda and foreign interference with elections.
00:09:44
And as the number of high-profile cases increases,
00:09:48
the tech industry is no longer convincing when saying that where
00:09:51
technology is the problem, technology is the solution. And maybe the attack
00:09:55
on the US Capitol in January 2021
00:09:59
has been a turning point, basically, in the US approach towards regulating online content.
00:10:06
So here we are with the EU governance model. And what does it mean?
00:10:11
Well, first of all, one should not forget that there are
00:10:16
international instruments which have already existed for a very long time;
00:10:21
one can mention the European Convention on Human Rights, the European Social Charter, et cetera, et cetera; I mean,
00:10:27
they all apply, no matter whether it's AI
00:10:31
or other channels. So all these exist and
00:10:35
apply. Then there are existing national regulations: you have
00:10:40
the GDPR; here in Switzerland, the Federal Data Protection Act; plus criminal provisions which can
00:10:47
regulate issues such as fake news, civil liability, IP issues, copyright, et cetera. So
00:10:54
all that is the framework that is applicable to AI. The question being: is it sufficient or not? That's
00:11:00
another story. And all these existing tools operate,
00:11:04
I would say, more from an ex post standpoint:
00:11:08
they are not there to prevent the harm (well, the GDPR up to
00:11:11
a point) but more meant to cure it. The current EU
00:11:16
AI Act is here to put in place a governance framework, a compliance
00:11:19
framework, which is an ex ante approach, to say:
00:11:24
we try and address the risks already before the harm occurs.
00:11:31
So what does it look like, the EU AI Act, in a very
00:11:34
short nutshell? You have certain underlying principles which have been
00:11:38
inspired by the Ethics Guidelines for Trustworthy AI, and which are
00:11:41
pretty much in line with the guidelines and principles set forth by the OECD.
00:11:47
So I'm not going to go into the details, you can read
00:11:50
that, but it's basically all geared towards transparency, I would say, in a way,
00:11:55
and the robustness of the system.
00:12:01
Now, what is interesting is that the EU regulatory approach,
00:12:08
the human rights approach, tries to have extraterritorial reach;
00:12:11
we've seen that with the GDPR already. It tries to say: well, any
00:12:15
developer of an AI system put on the market in the EU, wherever they are based,
00:12:19
will be subject to our regulation, and any user of the output of such a system,
00:12:26
which is used in the EU, will also be subject to our regulation. So it doesn't apply only in the
00:12:32
EU, but as soon as you have a touching point within the EU, you are subject to it, which means pretty much,
00:12:39
if not everyone, at least
00:12:42
a much larger variety of entities are captured.
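That extraterritorial trigger can be sketched as a small predicate. This is a toy illustration only, with invented type and field names, not the Act's actual legal test:

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    provider_location: str      # where the developer is established
    placed_on_eu_market: bool   # is the system put on the market in the EU?
    output_used_in_eu: bool     # is the system's output used in the EU?

def in_scope_of_eu_ai_act(system: AISystem) -> bool:
    """Toy version of the extraterritorial trigger: any EU
    touching point brings the system in scope, wherever the
    provider is based."""
    return system.placed_on_eu_market or system.output_used_in_eu

# A US-based developer whose system's output is used in the EU is caught:
us_tool = AISystem("US", placed_on_eu_market=False, output_used_in_eu=True)
```

Note that the provider's location never appears in the decision; only the EU touching points do, which is exactly the point being made above.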
00:12:48
As for the structure, you've all read a lot about that: it's a risk-based
00:12:50
approach, which is meant to address these fundamental rights issues
00:12:55
and to try to safeguard as well what I would call
00:13:00
democracy. So you have a certain number of areas that are prohibited.
00:13:05
It's important to note that military, defence and national security purposes are kept out of
00:13:11
the regulation, because it is considered that this is a matter that remains within national sovereignty.
00:13:19
But we know that just the term "national security" (and you remember the US PATRIOT Act) as a term
00:13:26
is already very broad, and it gives a lot of leeway,
00:13:30
I would say, to consider that something serves a national security purpose without
00:13:35
falling under and having to comply with the AI Act and its prohibited practices.
00:13:41
So among those prohibited practices, you have manipulative techniques
00:13:47
(that should, however, lead to significant harm, whatever that
00:13:51
means), social scoring (read: in reference to China),
00:13:55
and real-time remote identification systems, which have
00:13:59
been a debated area for a long time.
00:14:03
Finally, the compromise was that it would be up to member states to assess whether they want
00:14:08
to allow real-time remote biometric identification systems
00:14:13
or not, but if they do, it will be only for
00:14:16
limited circumstances, subject to prior authorisation by a judicial
00:14:21
authority, et cetera, such as, for instance, missing children,
00:14:25
which I think does make sense. Next in the pack come the high-risk
00:14:33
AI systems: they are at the core of the regulation. Basically, you have two types of
00:14:39
AI systems in that space: the ones that are safety
00:14:42
components of a product already subject to compliance regulations, such
00:14:47
as machinery for instance, or medical devices; and other systems,
00:14:52
listed, you know, in an annex, which are
00:14:56
related either to, I would say, a risk of biases
00:15:02
and discrimination, for employment, education, or for instance access
00:15:07
to public services, or that pose
00:15:11
a threat to security, like critical infrastructure for instance. And then, what
00:15:19
has come up in the latest, finally adopted versions of the EU AI Act is an exemption
00:15:24
for all these systems when there is no risk to health, safety or fundamental rights.
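The tiered structure just described (prohibited practices, the two families of high-risk systems, and the late-stage exemption) can be sketched as a toy classifier. The tier names and fields below are simplifications invented for illustration, not the Act's legal categories or tests:

```python
from dataclasses import dataclass

@dataclass
class System:
    prohibited_practice: bool  # e.g. harmful manipulation, social scoring
    safety_component: bool     # safety component of an already-regulated product
    annex_listed: bool         # listed area: employment, education, public services...
    risk_to_rights: bool       # risk to health, safety or fundamental rights

def risk_tier(s: System) -> str:
    """Toy classification into the tiers discussed in the talk."""
    if s.prohibited_practice:
        return "prohibited"
    if (s.safety_component or s.annex_listed) and s.risk_to_rights:
        return "high-risk"
    # the exemption: even a listed system escapes the high-risk
    # regime when it poses no relevant risk
    return "limited-or-minimal"

hiring_tool = System(False, False, annex_listed=True, risk_to_rights=True)
```

So a hypothetical CV-screening tool lands in the high-risk tier, while the same annex-listed system without a risk to rights would fall under the exemption.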
00:15:32
GPAI: as opposed to high-risk systems, I would say the keyword to remember here (and I will let you read
00:15:37
the slides if you are interested) is transparency. Basically, you
00:15:40
need to be transparent as to the generated output
00:15:47
made from such a system, and also have a copyright policy (I would be curious to see, I mean,
00:15:51
how to come up with a copyright policy for this type of model). And you will have additional
00:15:58
obligations if the GPAI shows
00:16:01
high-impact capabilities, which is currently defined as ten to the power of twenty-five FLOPs
00:16:06
(you know better than me what that actually means). And all this comes with
00:16:12
quite significant fines if they do not comply.
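To give a sense of scale for that threshold: a common back-of-the-envelope estimate (an approximation, not anything defined in the Act) puts the training compute of a dense transformer at roughly 6 FLOPs per parameter per training token:

```python
def training_flops(params: float, tokens: float) -> float:
    """Rough training-compute estimate for a dense transformer:
    about 6 FLOPs per parameter per training token."""
    return 6 * params * tokens

THRESHOLD = 1e25  # the EU AI Act's high-impact-capability presumption

# A hypothetical 70-billion-parameter model trained on 2 trillion tokens:
flops = training_flops(70e9, 2e12)  # 8.4e23, well below the threshold
```

On this estimate, only models trained with more than an order of magnitude more compute than that hypothetical one would cross the 10^25 line.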
00:16:18
But a key issue (and it was actually touched upon earlier as well)
00:16:23
is the question whether this heavy regulation will
00:16:28
hinder innovation in the EU. You know, some people say: well, it
00:16:31
is because you are so heavily regulated that basically you have
00:16:34
no big tech, unlike the US and China. It may not be true, because we are also in
00:16:40
a fragmented digital single market: we have a lot of barriers, we have a lot of languages, we have a
00:16:44
lot of applicable laws, and the financing is underdeveloped. If you want to raise, as a startup,
00:16:51
more than ten million Swiss francs, you pretty much need to go to
00:16:54
the West Coast, because in Switzerland you don't get that type of investment.
00:16:59
We would have to amend bankruptcy law because, unlike Chapter 11 in
00:17:02
the US, you cannot carry on your activities once you have gone bankrupt.
00:17:07
And we also need the ability to attract the world's best innovative talent with proactive migration policies.
00:17:14
So it's not only the question of the regulation that comes into
00:17:18
play; there are a lot of other factors that should maybe be put on the table.
00:17:23
And just to wrap it up: to win this battle over the US and
00:17:28
China, the EU needs to be able to demonstrate that its model can be enforced,
00:17:33
and so far that remains a challenge. To win that, they
00:17:37
need to test it against the big tech companies,
00:17:41
to basically try and apply it, and they may be helped by people even in the US, like
00:17:46
Lina Khan, who is the head of the FTC and
00:17:51
was the author of one of the first, you know, seminal papers, already published quite some time ago,
00:17:56
saying that these big companies, from an antitrust standpoint, should be dismantled.
00:18:01
But in any case, there is a lot of work ahead of us,
00:18:03
and I will close on these words. And if you are interested,
00:18:08
but only if you are interested, I release every week a paper
00:18:11
on the EU AI Act in which I try to dissect it.


Conference Program

Opening and introduction
Prof. Lonneke van der Plas, Group Leader at Idiap, Computation, Cognition & Language
Feb. 21, 2024 · 9 a.m.
Democracy in the Time of AI: The Duty of the Media to Illuminate, Not Obscure
Sara Ibrahim, Online Editor & Journalist for the public service SWI swissinfo.ch, the international unit of the Swiss Broadcasting Corporation
Feb. 21, 2024 · 9:15 a.m.
AI in the federal administration and public trust: the role of the Competence Network for AI
Dr Kerstin Johansson Baker, Head of CNAI Unit, Swiss Federal Statistical Office
Feb. 21, 2024 · 9:30 a.m.
Automated Fact-checking: an NLP perspective
Prof. Andreas Vlachos, University Cambridge
Feb. 21, 2024 · 9:45 a.m.
DemoSquare: Democratize democracy with AI
Dr. Victor Kristof, Co-founder & CEO of DemoSquare
Feb. 21, 2024 · 10 a.m.
Claim verification from visual language on the web
Julian Eisenschlos, AI Research @ Google DeepMind
Feb. 21, 2024 · 11:45 a.m.
Generative AI and Threats to Democracy: What Political Psychology Can Tell Us
Dr Ashley Thornton, Geneva Graduate Institute
Feb. 21, 2024 · noon
Morning panel
Feb. 21, 2024 · 12:15 p.m.
AI and democracy: a legal perspective
Philippe Gilliéron, Attorney-at-Law, Wilhelm Gilliéron avocats
Feb. 21, 2024 · 2:30 p.m.
Smartvote: the present and future of democracy-supporting tools
Dr. Daniel Schwarz, co-founder Smartvote and leader of Digital Democracy research group at IPST, Bern University of Applied Sciences (BFH)
Feb. 21, 2024 · 2:45 p.m.
Is Democracy ready for the Age of AI?
Dr. Georges Kotrotsios, Technology advisor, and former VP of CSEM
Feb. 21, 2024 · 3 p.m.
Fantastic hallucinations and how to find them
Dr Andreas Marfurt, Lucerne University of Applied Sciences and Arts (HSLU)
Feb. 21, 2024 · 3:15 p.m.
LOCO and DONALD: topic-matched corpora for studying misinformation language
Dr Alessandro Miani, University of Bristol
Feb. 21, 2024 · 3:30 p.m.
Afternoon panel
Feb. 21, 2024 · 3:45 p.m.