
Transcriptions

Note: this content has been automatically generated.
00:00:00
We have a panel, and it's already Saturday evening, so I would just say let's start.
00:00:16
Okay, so we're almost
00:00:21
at the end of this exciting day.
00:00:24
I think I can say for all of us here that the afternoon was a bit more positive
00:00:30
than the morning, so it's a nice way to end.
00:00:35
But before starting our panel, I want to
00:00:38
introduce a special guest from AlgorithmWatch.
00:00:43
Hello. You are Policy and Advocacy Manager at
00:00:50
AlgorithmWatch, and you have experience in public policy, where you manage their line of
00:00:56
work on voting advice, which means
00:00:59
you could basically talk about everything that we touched upon today.
00:01:03
And I'd like to start by asking you to tell us a bit more
00:01:07
about the work that you do at AlgorithmWatch,
00:01:11
how you spot misinformation,
00:01:16
and how your work relates to democracy.
00:01:19
Yes, thank you, and thanks for the invitation. I will
00:01:22
try not to stretch the discussion, and keep it extremely short.
00:01:26
So AlgorithmWatch is a human rights organisation; we are based in Berlin and in Zurich,
00:01:32
and we focus on the effects of AI and algorithms on people and on society.
00:01:37
What we do depends on the case: we do investigative research.
00:01:42
As mentioned this morning, one example would be the research
00:01:46
before the Swiss elections on Bing Chat. On the basis of that
00:01:51
research we write policy papers that we then defend
00:01:55
in our advocacy work and also in the campaigns that we run.
00:01:59
And maybe what's interesting to know is that we try to go beyond just
00:02:04
artificial intelligence: we were doing this work before the hype and continue to do it.
00:02:10
What really interests us is actually
00:02:14
automated decision-making, decisions that are taken via algorithms.
00:02:19
They can be fully automated, or they can be just recommendations
00:02:24
that are proposed, anything that has an impact on
00:02:29
people. And we try to see what conditions we need so that
00:02:33
those technologies benefit the majority
00:02:37
of people. That is our focus.
00:02:41
Our main topics: as mentioned, today we talked a lot about
00:02:46
public opinion, about what the role of algorithms is in the
00:02:51
shaping of public opinion. But it's also the public sector, where we had some insights this morning:
00:02:57
we also try to define what conditions we need for the use of these technologies in
00:03:02
the public sector, but also in the workplace. And there are also discussions on topics like discrimination:
00:03:10
how can we actually strengthen the legal tools we have, so that
00:03:13
we are also better protected against discrimination.
00:03:17
Thank you for that. And I want to ask
00:03:20
another thing: in your perception, in your work,
00:03:24
how much do you think people in Switzerland are really interested
00:03:28
in knowing more about what's happening behind the black boxes?
00:03:34
Is there really a movement behind that?
00:03:39
Well, I think it's becoming more and more; that's probably the one chance of this
00:03:45
kind of hype. We know that since ChatGPT was deployed, people are more confronted
00:03:51
with the technology, and it kind of draws attention to it. I must say that these technologies
00:03:57
also make visible what we already knew before: we already had technologies that could have a serious impact,
00:04:03
and there was already a lot of automation in
00:04:07
the public sector, but often we don't really know when
00:04:11
a decision is actually taken on us by an algorithm.
00:04:16
But yes, I think interest is growing more and more,
00:04:18
though there's a lot of literacy work to do as well. And when it comes to
00:04:23
big tech companies, for instance: the investigation you mentioned today,
00:04:27
in Switzerland and in Germany,
00:04:30
concerning Bing Chat from Microsoft:
00:04:34
what was their reaction? Did they put in
00:04:38
place any kind of measures in response to your investigation?
00:04:43
Well, it's always very different from one case to another,
00:04:46
but in the case of that investigation we published first
00:04:49
results, and we were in touch with Microsoft, and
00:04:53
as far as we could see, they didn't make
00:04:56
a lot of changes. We continued to conduct the research, and we saw that
00:05:02
the inaccuracies were still there, or even worse. So I think there is still a lot to do.
00:05:08
So now I want to open the panel to the other
00:05:13
speakers, and I think we have a few questions for you actually.
00:05:18
I think your perspective is really interesting,
00:05:24
because there are a lot of question marks, as I think you also highlighted.
00:05:29
I think you gave a good overview of the geopolitical tensions
00:05:35
behind all of that, which is also very important, between the main blocs,
00:05:41
the EU, China and the US, which are now the main players.
00:05:47
So my first question, and then I will connect it with the questions that come from the public, is:
00:05:52
what do you think will be the tipping point of
00:05:55
AI regulation at the world level, and when will it come?
00:06:00
And then the question from the public: what would be
00:06:03
the way for Switzerland? Can we imagine a different way,
00:06:08
or will it have to comply as well? Okay,
00:06:13
so I don't think there will be one model
00:06:19
that will be imposed globally, because the models are too different,
00:06:24
and here I'm talking about China.
00:06:27
What I would call the EU and US models are closer
00:06:31
to one another, because they share a notion of democracy and a willingness to
00:06:36
improve well-being, even if in different ways, whereas
00:06:40
China is more aggressive. So the question is whether we will
00:06:45
actually move towards an autocratic regime; that is a path
00:06:49
that cannot be excluded. We see that in the EU, increasingly,
00:06:54
the democracy indexes are going down, so
00:07:00
it's not a perspective that we should exclude as of now.
00:07:06
However, I would hope that the human-
00:07:09
centred approach that we have will
00:07:13
prevail, and that there will be a kind of wake-up call
00:07:18
for companies. We also see how big tech at some point,
00:07:23
with the EU pushback,
00:07:27
well, the US self-regulation model is kind of a failure, in that at some point
00:07:31
the companies themselves are not in a position to
00:07:34
self-regulate. And the alternative is either you take
00:07:40
a sectoral approach, or you take a transversal
00:07:44
approach. Some people argue that the European approach,
00:07:49
which wants to be transversal, with for example the AI Act, which is kind of the trend in the EU,
00:07:54
is too optimistic, and that it's too complex to govern and enforce,
00:08:00
because there are too many specificities, and it would be too difficult.
00:08:06
I don't know. I think that putting some basic ground rules based on a risk approach,
00:08:12
and you can question whether it has been rightfully done
00:08:16
or not, but that approach I think makes sense. And then you can
00:08:19
get down to the details, maybe with a sectoral approach at the
00:08:25
national level, taking into account some more specificities.
00:08:32
So what should the model for Switzerland be?
00:08:39
Yeah, we know the Swiss specificities.
00:08:45
I mean, the Federal Council is always reluctant to
00:08:48
legislate, so the perspective in Switzerland is always
00:08:52
the same: at some point we say
00:08:57
we don't move, we see how it goes,
00:09:00
the existing framework is enough. Usually that's the answer:
00:09:04
the existing legal framework is enough to deal
00:09:06
with the problems, and so we don't need anything specific.
00:09:12
And then we saw, for instance in copyright-related areas, that
00:09:17
ten years later we had to set new provisions about
00:09:20
platforms and hosting content providers.
00:09:26
So I think there is a need at some point to put some governance in place,
00:09:35
and that is something, maybe I'm wrong, but
00:09:39
that will at some point get traction politically in Switzerland as well.
00:09:47
I don't think it will be at the level of the EU AI Act, which indeed,
00:09:53
from a compliance standpoint, is kind of a monster at some point.
00:09:59
But I think this can be done in a
00:10:02
more pragmatic way, probably, to really address the risks without
00:10:09
maybe impeding innovation, also for
00:10:14
the SMEs that are actually the major players in
00:10:18
our ecosystem. Because the risk, it's true, is that
00:10:22
one fears that only the big companies will be able to comply with such heavy regulation, and that
00:10:29
at some point we would again concentrate powers that are already concentrated. So
00:10:33
that's the question. But I think to simply say
00:10:36
"our existing legal environment is sufficient to deal with the problem", I think that falls short.
00:10:42
That's my perspective. But however, there was a race
00:10:47
recently, after the launch of ChatGPT, a race
00:10:52
for regulation around the world. Do you think that
00:10:56
we could find at one point
00:11:00
a global regulation, a global AI governance,
00:11:05
given the fact that even the US is going in
00:11:08
that direction, and Switzerland is also exploring this opportunity?
00:11:16
I think the common denominator is minimal.
00:11:22
Even if you read the Chinese pieces of legislation, you find minimal features
00:11:28
that we also have, regarding for instance monitoring and control, but of course for different reasons.
00:11:35
So these minimal features, I would say, tend to exist globally.
00:11:43
Now, does it mean that we would be in
00:11:46
a position to shape a uniform global approach
00:11:51
at the public and governmental level? I have some doubts. Would that be the case maybe for
00:11:58
the private sector, some kind of harmonisation, a charter or whatever?
00:12:06
That could be, but I have to say I have my doubts about
00:12:11
the private sector's proposals, because when one regulates oneself,
00:12:14
it's always kind of biased, and you have your own
00:12:17
agenda. So I have some doubts about
00:12:21
the impact of such private sector efforts, although
00:12:25
I understand, on the other hand, that institutions may have a problem with the asymmetry of information,
00:12:29
because nobody really knows what's behind. Yes, and from the public came this other
00:12:35
remark: technology has no borders, so we can put
00:12:40
in place regulation with borders, but at the end
00:12:44
we still have to face this problem, right? Yes, but that's why
00:12:48
the EU tries to have an extraterritorial effect, to say: no matter where you are,
00:12:52
if you want to trade in the EU, you need to be subject to the regulation.
00:12:58
But they need to be able to enforce it. Now I would like to address
00:13:04
Daniel and ask:
00:13:09
do you think that a regulatory framework would also be useful
00:13:15
for a solution like smartvote, to
00:13:19
make it even more convincing for people,
00:13:24
going in the direction of smart voting?
00:13:31
Well, for the moment it actually works quite well, I think.
00:13:37
It works at the moment: we have, to my knowledge, no problems all
00:13:42
over Europe with voting advice applications like smartvote, and many of them,
00:13:48
most of them of course, are affiliated with universities and research
00:13:52
institutions. And we have seen with the trust levels
00:13:55
that science and universities are highly trusted. And
00:14:01
of course you can always argue that there could be other
00:14:05
organisations, lobby groups maybe, and they could set up
00:14:09
their kind of smartvote systems, biased systems, so that
00:14:16
the algorithms just favour one side.
00:14:24
Yes, of course, I mean,
00:14:27
you probably have some legal framework already.
00:14:32
I mean, we are regulated in a way, because
00:14:39
we do of course have to comply with
00:14:42
data protection laws, for example, and all that stuff,
00:14:47
but there are no regulations regarding the algorithms we use.
00:14:54
And of course it's not allowed to be deceptive:
00:15:01
you have to be sincere about what you
00:15:04
do with your algorithms and the recommendations. But
00:15:11
I'm a bit sceptical if we really should go the
00:15:15
way of regulation before the problem arises, and
00:15:20
at the moment we don't have that much of a problem with it.
00:15:27
And we had different questions from the public regarding the potential use
00:15:34
of AI tools in smartvote. Do you think that
00:15:39
your trust level would drop if you used AI tools,
00:15:42
for instance to speed up the survey people fill out?
00:15:49
Yeah, there are different ways how we could
00:15:52
incorporate, integrate AI, yes, of course.
00:15:59
The thing
00:16:02
my talk was about
00:16:05
was how do we recommend
00:16:09
candidates, and what's the mechanism behind the
00:16:13
recommendation. And I would say
00:16:18
that explainability and trust, that's really a core
00:16:22
thing, and trust levels would drop
00:16:27
if we cannot explain anymore how the result
00:16:32
is calculated.
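[Editor's note: as an illustration of the kind of explainable recommendation mechanism being discussed, the following is a minimal sketch. The answer scale, questions and candidates are invented for the example; this is not smartvote's actual algorithm.]

```python
# Sketch of an explainable candidate-matching step, in the spirit of a
# voting advice application. Answers are encoded on a 0..1 scale
# (e.g. no = 0, rather no = 0.25, ..., yes = 1). Hypothetical data.

def match_score(voter: list[float], candidate: list[float]) -> float:
    """Agreement in [0, 1]: 1 = identical answers, 0 = maximally opposed."""
    assert len(voter) == len(candidate)
    max_dist = len(voter)  # per-question distance is at most 1
    dist = sum(abs(v - c) for v, c in zip(voter, candidate))
    return 1 - dist / max_dist

voter = [1.0, 0.0, 0.75]  # answers to three hypothetical questions
candidates = {"A": [1.0, 0.25, 0.5], "B": [0.0, 1.0, 0.0]}
ranking = sorted(candidates,
                 key=lambda c: match_score(voter, candidates[c]),
                 reverse=True)
```

Because the score is a plain per-question distance, a recommendation computed this way can be broken down question by question, which is exactly what keeps it explainable.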
00:16:39
Of course, there are always maybe more subtle forms of incorporating AI, like, as you mentioned,
00:16:47
there are possibilities to speed up
00:16:52
the answering of the questionnaire. Maybe
00:16:56
when I don't want to answer all the seventy questions, for example,
00:17:00
there are mechanisms for selecting the questions that are best suited to
00:17:06
get a recommendation, and that
00:17:10
could be one form
00:17:15
of helping: selecting, out of the seventy questions, the twenty questions that fit best.
00:17:22
And this then depends on the path you are taking, maybe, because
00:17:28
depending on how you answer the questions, the system selects the next question that is
00:17:34
posed to you, to find the best solution.
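[Editor's note: one common way to implement this kind of adaptive selection is to ask next the question on which the candidates disagree most, since it separates them best. The sketch below assumes that heuristic; it is not smartvote's actual mechanism, and the data is invented.]

```python
# Hypothetical adaptive question selection: pick the unasked question with
# the highest spread (population variance) of candidate positions.
from statistics import pvariance

def next_question(candidate_answers: dict[str, list[float]],
                  asked: set[int]) -> int:
    """Index of the unasked question that best separates the candidates."""
    n_questions = len(next(iter(candidate_answers.values())))
    remaining = [q for q in range(n_questions) if q not in asked]
    return max(remaining,
               key=lambda q: pvariance([a[q] for a in candidate_answers.values()]))

answers = {"A": [1.0, 0.5, 0.0], "B": [1.0, 0.5, 1.0], "C": [1.0, 0.0, 0.5]}
# Question 0 is useless (all candidates agree); question 2 spreads them most.
```

A design note: the selection order depends on nothing but the stored candidate answers, so the downstream recommendation can still be recomputed and explained from the answered subset, which matches the explainability concern raised above.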
00:17:42
Yeah, that's possible. I mean, we are always reluctant to do this,
00:17:48
but that's the discussion that's going
00:17:51
on at the moment: exactly how could
00:17:56
we use AI in a way that is helpful
00:17:59
without being used, in a way,
00:18:04
yeah, in a way such that the result
00:18:07
is not explainable anymore. When I have selected questions, then
00:18:13
in the end I can still, with the current system,
00:18:19
calculate the result out of
00:18:24
this selection of questions. Of course you can
00:18:27
say that the selection of the questions then could be biased.
00:18:33
This is the balance we have, and
00:18:36
there's no simple yes or no answer.
00:18:40
In the end it's about transparency: people need to know what they
00:18:45
do, what they see, and what the system does to them,
00:18:50
and we have to find ways to inform
00:18:53
them in a way that they understand what
00:18:57
the system does. And I think one interesting point about smartvote is something
00:19:03
that you also mentioned: it requires,
00:19:07
to some extent, some literacy
00:19:10
to understand the questions, to reply, and then to deal
00:19:15
with the answers with awareness. So could AI maybe
00:19:20
help to take on board more
00:19:23
people, also people with
00:19:28
not such a high education level, or who are less interested?
00:19:33
Do you think that it could help them to
00:19:38
be more interested in politics and to reply to the questions?
00:19:45
Yes, I can imagine that this could help, in
00:19:49
a way, explaining the questions, although of course there are other
00:19:54
ways of introducing a bias. But
00:20:00
this could be an element of how to use AI. And
00:20:05
I mean, we use AI in a way already, because we have some
00:20:09
automated translations,
00:20:13
because the candidates fill in some
00:20:17
voluntary information about their hobbies and such
00:20:21
things, and this additional information is
00:20:25
in the original language of the candidate, and there is a DeepL
00:20:31
translation of their entries, and this of course
00:20:35
is already AI-supported. Of course,
00:20:39
all these things, it's possible that we introduce them,
00:20:46
but that's still not the
00:20:50
core of the system, not the recommendation; it's
00:20:56
kind of information around it. And yeah, I mean, I can imagine that we introduce more
00:21:04
of this stuff, yes. Have any of you ever tried it?
00:21:10
Yes, me too, I tried it, and funnily enough,
00:21:15
the first candidate recommended to me was my husband, so I think
00:21:21
it worked pretty well for me.
00:21:27
So I don't know what the other experiences
00:21:29
were, but I have nothing to complain about.
00:21:37
Okay, I think I would like to
00:21:41
ask you, since you were so positive in your view about
00:21:46
the solutions that AI could bring, so
00:21:49
really looking at it more as an opportunity:
00:21:54
you talked about the complexity of digitalisation, but if we
00:22:01
want to use AI to solve the problems, aren't we using a sort
00:22:06
of the same technology to cure problems that were also caused by it?
00:22:13
I would put it differently. I think we are going in this direction whatever
00:22:18
we do, and this direction will also have to be, let's say, one of risks and opportunities.
00:22:25
You can call it deterministic, and I am very much open about that,
00:22:32
but actually we are going in a direction we cannot avoid,
00:22:37
so we have to be aware of it; for people, that's the point:
00:22:43
we really need to keep people involved, people being open,
00:22:46
to keep the politics involved too, but also
00:22:52
people far away from here, so that they can be part of the change,
00:22:55
this kind of thing. So yes, we will go in this direction, and we need education.
00:23:01
We heard this morning about completely new projects for digital education,
00:23:06
and not only in this country: education for people to be aware of the risks,
00:23:14
to keep our eyes open, not to be naive, but to try to use it for the good of citizens.
00:23:22
And a question from the public: why
00:23:28
did you exactly focus on digital technologies so broadly?
00:23:33
Isn't it more useful to look at the specific impact on democracy
00:23:39
exhibited by the single technologies? That's a good point.
00:23:44
I would say that the single technologies all contribute to digitalisation,
00:23:49
and this in turn drives complexification: bits of it
00:23:52
are due to digitalisation itself; you see it in smart commerce,
00:23:57
smart processes and everything beyond. These were the drivers of complexification, so
00:24:04
when things get more complex, AI can be one
00:24:09
of the solutions, and we will probably go in that direction, towards
00:24:13
helping people understand the impact of this complexification. And
00:24:18
back to the legislation: we need to do something there,
00:24:21
because we don't have it; it is worth trying to put
00:24:23
a mark, and peer pressure might help too.
00:24:30
But more than that, and it will come up later perhaps, think about the
00:24:34
ownership of data, of personal data, these kinds of things.
00:24:39
The previous speaker spoke about the taxation of
00:24:43
robot work, about data that is not accessible, the question of who needs to know.
00:24:47
And AI-assisted things can help
00:24:51
to understand the difference between accepting solution A or solution B.
00:24:56
But digitalisation is a much more global phenomenon.
00:25:00
Can you make maybe a concrete example
00:25:05
about how AI can help to navigate this
00:25:08
complexity? For instance, could we imagine a future where
00:25:15
the human vote is replaced by an AI? It's exactly what I said, and here I was a bit
00:25:22
more negative than you, because the risk is there. What happens when,
00:25:28
in front of digitalisation, a technology, a technology that does not exist just yet,
00:25:36
I know that the technology for a digital copy of a person does not exist, but within ten years it is going to happen, what happens when
00:25:43
a software can perhaps take my data and tell me: I can see that Jules,
00:25:50
taking into account what
00:25:53
she believes and what her background knowledge is,
00:25:59
in this particular case, would firmly vote for this; she will vote for this.
00:26:06
Is this something we would accept? Perhaps yes,
00:26:11
perhaps not, but it's a serious question that we need to
00:26:14
ask ourselves. And one important point: do you think
00:26:19
that proposing AI as a solution, isn't it
00:26:23
misleading, in a way, to make people think that
00:26:27
it is programmed to be objective, that it's neutral?
00:26:33
Absolutely, the question of bias is essential to me.
00:26:36
If we don't have that in the discussion on digitalisation, we will clearly see the
00:26:41
risks. To be understood: I am not proposing this,
00:26:45
it's not my proposal; we can think about it, and if it comes,
00:26:50
how would society control it, in a way that it's not biased?
00:26:56
Would it be possible, and if it's possible, how can we make
00:26:59
sure that it's not controlled by one side? Why not think about systems
00:27:04
with checks and balances, so that the citizens decide what it is and what it does?
00:27:09
A system like that, unlike one you don't control, you mean? You don't worry?
00:27:14
Of course bias might still exist, but at least it
00:27:16
would be mutually controlled by the citizens; this kind of thing could be
00:27:21
more fair. But doing something like this is one among the big risks I see, yeah.
00:27:29
And another really scary term: hallucinations, right?
00:27:33
I find the term actually very funny,
00:27:37
but I think it's interesting: an investigation from the New York Times
00:27:44
highlighted that ChatGPT makes things up about three percent of the time, and
00:27:50
a Google system up to twenty-seven percent. So how is such a divergence possible,
00:27:58
and isn't there an interest on the side of the tech companies
00:28:03
in banning these hallucinations? What do you think? Oh, is the question for me?
00:28:13
Yeah, as I said at the end of
00:28:17
my talk, it's a bit application-dependent, right? For some applications you
00:28:22
are completely fine with hallucinations: if you want to brainstorm ideas or so,
00:28:26
the model doesn't need to reference some existing knowledge source or something like that,
00:28:30
or if you need to come up with a new story or invent a children's book
00:28:33
or something like that, then you don't need to reduce hallucinations. And then for other,
00:28:38
like medical or legal, applications you wouldn't want that; you
00:28:43
would want the model to stick as close as possible to the input.
00:28:47
So then I guess yes, for some applications probably big tech and
00:28:53
everyone else, all the users, would want to lose hallucinations, and for others they don't.
00:28:57
The key ingredient of course is to be able to switch it on and off whenever
00:29:02
you want to do that, right, or to have it for some applications and not for others.
00:29:06
And yeah, we're definitely not there yet in controlling it completely, or in not having it at all.
00:29:12
Right, I also think this discrepancy is rather large,
00:29:16
three to twenty-seven percent.
00:29:22
And there are tons of ways already now in which you can kind of steer
00:29:26
the way you prompt it, right, and that makes a big difference as well, so it might even be that if
00:29:32
someone used a different prompting template, then the numbers would change again. And,
00:29:38
yeah, I think it's a very complex issue at the moment, and
00:29:43
still a very active research area, I think, as we have seen today as well,
00:29:49
and there's lots more to be done. We definitely don't have a silver bullet right now for how one
00:29:55
should solve this. Could labeling be,
00:30:00
how do you say, a way against the hallucinations, could it be a
00:30:03
response to the hallucinations, and why has it not been done yet? So,
00:30:09
there are multiple ways in which we could use labeling, right. One is to create datasets we then
00:30:14
train on, and the other would be to kind of label outputs. The problem with labeling outputs is that
00:30:21
too many outputs are generated right now for all of them to be checked by humans.
00:30:25
And so you'd have to use some at least semi-automatic system that flags parts
00:30:32
that might need to undergo review or something like that.
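[Editor's note: the sketch below illustrates the flag-then-review workflow mentioned here with a deliberately naive heuristic. Real hallucination detectors use trained models (NLI, QA-based checks); the capitalised-term rule and the texts are invented for the example.]

```python
# Naive semi-automatic flagger: mark summary sentences containing
# capitalised terms (a crude proxy for named entities) that never appear
# in the source, and send only those to a human reviewer.
import re

def flag_for_review(source: str, summary: str) -> list[str]:
    source_terms = set(re.findall(r"\b[A-Z][a-z]+\b", source))
    flagged = []
    for sent in re.split(r"(?<=[.!?])\s+", summary.strip()):
        novel = set(re.findall(r"\b[A-Z][a-z]+\b", sent)) - source_terms
        if novel:
            flagged.append(sent)  # candidate hallucination, needs review
    return flagged

src = "United won the match yesterday. The coach praised the defence."
summ = "United won the match. The game took place at Wembley."
```

The point of such a system is triage: humans only review the flagged fraction of outputs, which is what makes labeling feasible at the volumes discussed above.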
00:30:37
And for safety-critical applications, that may be necessary today, right,
00:30:42
to do something like that. So definitely. Right now,
00:30:46
also for hallucination detection research, the gold standard is that we
00:30:49
ask humans "is this a hallucination or not?", and then we try to detect what the humans have labelled.
00:30:56
We actually did a small study in this paper that I presented, with
00:31:02
colleagues here and friends at home, so also non-computer-scientists, and we asked them
00:31:08
to detect hallucinations in summaries that some
00:31:11
of the sequence-to-sequence models generated, and
00:31:15
there can be disagreements between the humans as well.
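[Editor's note: disagreement between annotators is commonly quantified with chance-corrected agreement, such as Cohen's kappa. A minimal sketch for two annotators' binary hallucination labels; the label lists are made up for illustration.]

```python
# Cohen's kappa for two annotators' binary labels (1 = hallucination).
def cohens_kappa(a: list[int], b: list[int]) -> float:
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    # chance agreement from each annotator's marginal label frequencies
    p_yes = (sum(a) / n) * (sum(b) / n)
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)
    p_chance = p_yes + p_no
    return (p_obs - p_chance) / (1 - p_chance)

ann1 = [1, 1, 0, 0, 1, 0, 0, 0]  # invented labels
ann2 = [1, 0, 0, 0, 1, 0, 0, 1]
```

Kappa near 1 means the "is this a hallucination?" judgment is well-defined enough to serve as a gold standard; low kappa signals the kind of genuine human disagreement mentioned here.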
00:31:20
And then there are the more harmful hallucinations, where it completely makes stuff up, versus the kind that
00:31:26
includes background knowledge, like in which stadium Manchester United plays or things like
00:31:31
that, right. The model knows that because it has seen it many times, it knows these things,
00:31:36
and it will just include it in the summary even though it's not mentioned,
00:31:40
maybe stating whether it's an away game or a home game or something.
00:31:43
So yeah, definitely labeling, and I think
00:31:49
labeling in itself is an interesting research topic:
00:31:52
how can we make it more effective and,
00:31:55
yeah, make better use of the human labels. Right now a big
00:32:01
direction is to use ChatGPT or GPT-4 to automatically create
00:32:05
labels, and this is under discussion in the community as well, right:
00:32:10
then you don't have to pay workers anymore, you
00:32:14
can distribute it, and it can create those labels in parallel.
00:32:18
So that's one direction that tech companies have gone
00:32:22
towards. I've seen researchers from Anthropic give an interview
00:32:26
like two or three months ago where they say: when we crowdsource
00:32:32
humans, then we may as well ask the chatbot, it's the same quality, is what they said.
00:32:39
But for the extremely tricky things, we still ask
00:32:43
the highly skilled people, people with degrees.
00:32:48
So that's one direction that such companies go, for example,
00:32:54
and the question is also whether people will in fact cross-check the labels in the future still.
00:33:00
So we don't know. Right. And another question
00:33:04
from the public was about the "I don't know"
00:33:08
response, because you said that in many cases it's not acceptable
00:33:12
to the tech companies; but why not, when the model has insufficient data?
00:33:17
Yeah, no, that's right: if it has insufficient data, then we would like it to say "I don't know".
00:33:24
Or are you asking why it doesn't do that right now? Yeah, the question was: why not require an "I
00:33:30
don't know" response when the model has insufficient data? Yeah, the
00:33:34
tricky thing is to know when the model has insufficient data, right.
00:33:39
If we knew we were in this case, then it should answer "I don't know"; that's what we would want to have,
00:33:45
that would be the setting we would want to have. The tricky thing is to know when we are in this setting;
00:33:51
it's an unknown-unknowns setting. So,
00:33:58
yeah, the tech companies would like the models to say "I don't know" if they are
00:34:02
otherwise going to hallucinate, right, but
00:34:07
the tricky thing, and that slide was about this, is that this is tricky to
00:34:10
train, because during training you don't know whether the model right now knows
00:34:15
the answer. If you train it to say "I don't know"
00:34:19
when it actually knows the answer, then you teach it to withhold information,
00:34:23
and that's also not what they want to do. They also don't want it to always
00:34:29
answer "I don't know", because they want it to be a helpful chatbot, as otherwise people are not going to use it.
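[Editor's note: a crude way to approximate "I don't know" at inference time is to abstain when the model's own confidence in its top answer falls below a threshold. The sketch below assumes that setup; production chatbots do not work this simply, calibration is hard, and the candidate answers and probabilities are invented.]

```python
# Inference-time abstention sketch: answer only when the top candidate's
# probability clears a threshold, otherwise say "I don't know".
def answer_or_abstain(scored_answers: dict[str, float],
                      threshold: float = 0.6) -> str:
    best, prob = max(scored_answers.items(), key=lambda kv: kv[1])
    return best if prob >= threshold else "I don't know"

# Hypothetical probability distributions over candidate answers:
confident = {"Old Trafford": 0.92, "Wembley": 0.05, "Anfield": 0.03}
uncertain = {"1998": 0.34, "1999": 0.33, "2000": 0.33}
```

The threshold makes the trade-off from the discussion explicit: raising it suppresses more hallucinations, but also withholds more answers the model actually knows, i.e. it costs helpfulness.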
00:34:33
So there is really a bit of a tricky trade-off there. Yeah, thank you. And
00:34:42
Alessandro, I think you had a lot of fun putting together your corpora, right?
00:34:50
And can you tell us a bit
00:34:52
the story of how it got started, and
00:34:57
also these names that you
00:35:01
decided on? Ah, okay, I see, yes.
00:35:05
Well, LOCO actually was an idea from my supervisor, but I will
00:35:11
let you in on the story. So he came up with it; we were looking for a name for the corpus,
00:35:16
we had some candidates, and obviously
00:35:19
LOCO won; that's where LOCO came from.
00:35:24
Actually, while I was playing around, I mean, following the
00:35:28
idea, while I was searching for so long,
00:35:34
I just saw that there is something similar on Donald Trump: there is actually a corpus, I mean, a corpus of
00:35:40
political bullshit, some thirty thousand of them, this is the way it is. I mean,
00:35:49
there's nothing against that, but this person could be president of the United States for the second time, so
00:35:55
yeah, I mean, that's a problem; that's why we need more psychologists.
00:36:04
And if you should help an
00:36:10
audience understand the linguistic markers of
00:36:13
misinformation and conspiracy theories:
00:36:17
how could they identify them by themselves?
00:36:22
Go on Facebook. Just spend every day on Facebook.
00:36:25
No, seriously: what we found, and other people studying social media too,
00:36:30
by analysing language, or just by counting words related to a concept,
00:36:37
like sentiment, which is something we can do,
00:36:42
is that you see the language of these texts, the language
00:36:45
they use: swear words, for example.
00:36:52
They generally use longer texts, you see this in LOCO, because they want to argue for what
00:36:59
these theories want you to believe,
00:37:04
because some people really believe conspiracy theories, of course.
00:37:08
So on the language side you can see a lot of exclamation marks,
00:37:14
rhetorical questions, these markers that really say, you know, this kind of stuff.
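The surface markers mentioned here, longer texts, frequent exclamation marks and rhetorical questions, are simple enough to count mechanically. A toy Python sketch; the example sentences and the choice of markers are illustrative assumptions, not taken from the LOCO corpus:

```python
# Toy sketch: count surface markers the speaker mentions as typical of
# conspiracy language: text length, exclamation marks, and question marks
# (a rough proxy for rhetorical questions).

def surface_markers(text: str) -> dict:
    """Return naive per-text counts of a few surface-level markers."""
    words = text.split()
    return {
        "n_words": len(words),
        "exclamations": text.count("!"),
        "questions": text.count("?"),
    }

calm = "The report was published yesterday."
heated = "Wake up!! Who do you think controls the media?? They never tell you the truth!"

print(surface_markers(calm))    # -> {'n_words': 5, 'exclamations': 0, 'questions': 0}
print(surface_markers(heated))  # many more exclamation and question marks
```

Real corpus work uses far richer features (sentiment lexicons, topic matching, and so on), but even counts this crude make the contrast between the two example sentences visible.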
00:37:22
Yeah. And a question from someone
00:37:26
in the audience: is a conspiracy theory always misinformation?
00:37:32
No, and I think that's interesting, because
00:37:36
misinformation means something that is wrong or false, and
00:37:40
conspiracy theories are mostly false, but the majority are
00:37:46
unfalsifiable, so you cannot
00:37:51
test them. It's like Freud, right? The point is not just that it's
00:37:57
wrong; it's that it cannot even be shown to be wrong.
00:38:02
"The Earth is flat" is obviously wrong, but lots
00:38:06
of the others, say reptilians dominating the world elite, can never
00:38:11
be settled: if you say it's not like that, they say,
00:38:14
okay, it's a conspiracy, so it's hidden. There's always this escape,
00:38:20
this circularity of conspiracy. So I think anything can be a conspiracy theory,
00:38:25
and not necessarily false. Okay. And we
00:38:30
still have a few minutes, and I think
00:38:33
it would be nice to finish by
00:38:36
looking a bit into the future, right?
00:38:41
Because I think that in the future of democracy there will be AI, hopefully in a positive way.
00:38:47
So can we imagine AI and democracy going hand in hand
00:38:52
in the future, with regulations, standards,
00:38:56
codes of ethics, good research? Being able to
00:39:01
keep control of AI and not be
00:39:05
overtaken by it, by this
00:39:08
revolutionary technology? I would like to make a round with the time we have now.
00:39:18
Yes, and I really hope it will be
00:39:25
the case, but I think there are a lot of steps before we get there, and,
00:39:29
as was said, there is regulation, there are interests, there is really a lot of
00:39:33
things to go through. We spoke earlier about business models,
00:39:39
and I think there are a lot
00:39:41
of power relations behind these technologies, and
00:39:45
what I think we don't do enough at the moment is really looking at the whole supply chain
00:39:51
of the technology, looking at what is actually used to build
00:39:55
that technology: the humans labelling the data,
00:40:00
sometimes in really precarious situations, but also the
00:40:03
natural resources and energy that are used, of which we rarely
00:40:07
have a complete overview. We don't fully see what it
00:40:11
is, actually, that is behind the technology. And we should
00:40:14
also ask when we really need to use it, not only "when is it nice to have a bit of AI", but
00:40:20
"I have a problem, how can I solve it?" Maybe AI
00:40:26
is the solution, but maybe another technology can also do that work. We should also
00:40:31
run processes like fundamental rights impact assessments, so that we can really identify
00:40:37
risks early in the process and mitigate them all along the way,
00:40:41
to really have a good framework for using the technology.
00:40:47
Yeah, you mentioned the natural resources and the impact on our environment,
00:40:52
the non-human basis, and that is very important. What do you think?
00:40:57
I think we have to be optimistic,
00:41:03
because AI is not going to go away and we have to deal with it.
00:41:09
I mean, if we look back to, I don't
00:41:12
know, 1995, at the dawn of the web, a lot of
00:41:16
the technology utopians were saying the internet is a great space, it's
00:41:20
free, there are no rules, it's wild. And we saw that it
00:41:25
ultimately, very quickly, became an area that is regulated, if not over-regulated, with
00:41:32
a lot of controls, and it is not at all an unruly space anymore. So
00:41:39
I would tend to... I mean, we missed the train with social media, and we see the consequences, but I
00:41:44
would hope that we are not going to make the same mistake a second time, and that there will be rules
00:41:51
to deal with the problem in a way that works for
00:41:56
everyone. So I want to be optimistic. There's a lot of work ahead,
00:41:59
as was said, but I want to be optimistic. The same with
00:42:04
copyright: it's tricky now, but in a couple of years
00:42:08
there will be remuneration where it is due. So
00:42:14
it's just a matter of time right now. I don't know exactly how it will turn out, but I want to be optimistic.
00:42:22
Thank you. And yes, I agree, and
00:42:29
probably there are so many tools, that we know of and
00:42:34
that we don't know of, where AI is already doing great things.
00:42:38
And of course we are discussing here
00:42:41
politics and the more problematic
00:42:45
areas, but that is probably not the main
00:42:53
spot where AI plays a role.
00:42:57
I mentioned all these translation things;
00:43:02
it's great to have AI tools there.
00:43:05
So I'm quite optimistic. And I also
00:43:12
think that it's quite hard to ban ideas. It's quite hard to ban
00:43:17
software, ideas, which are not really tangible
00:43:22
goods, which are actually knowledge. And
00:43:27
so we have to find ways, with organisations like AlgorithmWatch,
00:43:32
with the state and its regulations, yeah, to find ways
00:43:39
to have an optimistic path for these
00:43:45
AI mechanisms. Mm-hm. To add another positive note: because I think
00:43:50
this technology is coming anyway and we have to deal with it,
00:44:01
trustworthy information is important, and there is also a role for each of us,
00:44:07
people who know what this means, to work all together:
00:44:14
sources, the people in the field, and you journalists know
00:44:20
where we are going, so you can educate. So civil society should take this
00:44:27
space, even if that significantly increases the difficulty; we cannot just delegate it.
00:44:32
Civil society, yes, exactly, it
00:44:37
is the basis, absolutely, and then it should take the place,
00:44:41
the place it should take. It's not up to the companies,
00:44:48
because it seems like they pass the buck. It's our responsibility
00:44:53
as people to know in which direction we are going.
00:44:53
Thank you. Yeah, I'm also, well, I'm an optimist by nature, and
00:45:02
I think, you know, what choice do we have?
00:45:06
But yeah, I think the main goal should be
00:45:11
to get the incentives right, and I think a big part will be the legal framework here.
00:45:18
And I'm positively surprised that countries around the world have recognised
00:45:23
that this is an important topic now and will no longer sleep on it,
00:45:27
at least it seems so. That already is one positive sign for me.
00:45:31
And now we just need to get it right, which is the
00:45:35
hard part, exactly. But yeah, already some positive signs.
00:45:42
And another positive sign that I can share,
00:45:45
because now I'm not only a researcher anymore, I'm also a teacher:
00:45:48
I have students graduating, and when I
00:45:52
talk to them about the future work that they want to do,
00:45:54
when they're looking for theses, they put a lot
00:45:59
of focus on doing something good in the world.
00:46:02
They say: we don't want to work on face-detection software, we would like to do medical image analysis, or we want to
00:46:08
find new viruses, or things like that. So
00:46:13
at least that also gives me a very positive outlook on the future, that I
00:46:19
see people really say: yeah, I want to use AI for something good.
00:46:24
That's true. Alessandro, at least one thing... You see, I don't
00:46:33
have a strong opinion on artificial intelligence. For me,
00:46:37
it's a tool, like our cars and stuff like that:
00:46:40
we make the tools, but it's the use people make of them. So I'm mostly on the positive side,
00:46:49
and I also use it a lot for my job as well.
00:46:54
Okay, great. I think that's a perfect way to finish. Thank you very much for
00:46:59
your attention, for being here, and enjoy the apéro. Thank you to everyone.
00:47:10
Before we go to the apéro, would you like to close the session, please? Thank you.
00:47:22
I'll keep it short so you can leave soon. First of all, thank you, thank you so
00:47:29
much for making it. Thank you so much to the organisers and the speakers here, and yeah,
00:47:35
because I think this is a very important topic. I was not here in
00:47:42
the morning, unfortunately, but what I want to say is that if we,
00:47:46
the society, you, everyone, really engage in the development of this technology,
00:47:53
not only can we really rebuild democracy at some point,
00:48:00
but we can make a democratic AI, and I think that's an important goal.
00:48:03
I'm not an expert here; I'm
00:48:07
coming from another domain, I work more with medical data, and when I talk
00:48:13
to clinicians, I tell them: okay, this is the technology, I'm trying to show what it is,
00:48:18
but please, please don't stay outside. When we develop these technologies, we need to develop them together, and I think
00:48:24
for this topic in particular it's very important, you
00:48:27
know, to really ensure that democracy remains:
00:48:31
we are all in this. It's very difficult to stay aware, these technologies are evolving so fast, but I think
00:48:37
of the people who are willing to promote awareness, to teach, and I mean teaching
00:48:44
the young, which makes me so happy,
00:48:49
and I would just add: if only we could also bring along these big companies,
00:48:54
I mean, okay, they have some people who really like money, but not
00:48:57
all of them. So thank you to the speakers and the organisers, but also, very importantly,
00:49:04
to the sponsors, because without them we could do nothing. Thank you for that, and thank you
00:49:09
for this venue that hosted so many people.
00:49:14
And I hope you enjoyed your day, and I hope you will come back as well, because
00:49:21
we think this type of event is important to engage with society and to bring lots of different
00:49:27
people together, to plant seeds of thinking, as was said, to germinate ideas of AI
00:49:34
for democracy. Thanks a lot, and enjoy. I hope it's a very good evening.


Conference Program

Opening and introduction
Prof. Lonneke van der Plas, Group Leader at Idiap, Computation, Cognition & Language
Feb. 21, 2024 · 9 a.m.
Democracy in the Time of AI: The Duty of the Media to Illuminate, Not Obscure
Sara Ibrahim, Online Editor & Journalist for the public service SWI swissinfo.ch, the international unit of the Swiss Broadcasting Corporation
Feb. 21, 2024 · 9:15 a.m.
AI in the federal administration and public trust: the role of the Competence Network for AI
Dr Kerstin Johansson Baker, Head of CNAI Unit, Swiss Federal Statistical Office
Feb. 21, 2024 · 9:30 a.m.
Automated Fact-checking: an NLP perspective
Prof. Andreas Vlachos, University of Cambridge
Feb. 21, 2024 · 9:45 a.m.
DemoSquare: Democratize democracy with AI
Dr. Victor Kristof, Co-founder & CEO of DemoSquare
Feb. 21, 2024 · 10 a.m.
Claim verification from visual language on the web
Julian Eisenschlos, AI Research @ Google DeepMind
Feb. 21, 2024 · 11:45 a.m.
Generative AI and Threats to Democracy: What Political Psychology Can Tell Us
Dr Ashley Thornton, Geneva Graduate Institute
Feb. 21, 2024 · noon
Morning panel
Feb. 21, 2024 · 12:15 p.m.
AI and democracy: a legal perspective
Philippe Gilliéron, Attorney-at-Law, Wilhelm Gilliéron avocats
Feb. 21, 2024 · 2:30 p.m.
Smartvote: the present and future of democracy-supporting tools
Dr. Daniel Schwarz, co-founder Smartvote and leader of Digital Democracy research group at IPST, Bern University of Applied Sciences (BFH)
Feb. 21, 2024 · 2:45 p.m.
Is Democracy ready for the Age of AI?
Dr. Georges Kotrotsios, Technology advisor, and former VP of CSEM
Feb. 21, 2024 · 3 p.m.
Fantastic hallucinations and how to find them
Dr Andreas Marfurt, Lucerne University of Applied Sciences and Arts (HSLU)
Feb. 21, 2024 · 3:15 p.m.
LOCO and DONALD: topic-matched corpora for studying misinformation language
Dr Alessandro Miani, University of Bristol
Feb. 21, 2024 · 3:30 p.m.
Afternoon panel
Feb. 21, 2024 · 3:45 p.m.