Transcriptions
Note: this content has been automatically generated.
00:00:02
Hello everyone, welcome back from the lunch break.
00:00:09
We are ready to continue our session on AI and ethics,
00:00:15
with our speakers' session. Our first speaker is Dr. Catherine Jasserand,
00:00:22
from the university where she is carrying out her research under a Marie Skłodowska-Curie fellowship
00:00:30
in biometric law. So please go ahead; the remote control is yours.
00:00:40
So, thank you very much for the kind introduction and the invitation.
00:00:46
It's nice for me to be in the middle
00:00:50
of experts, because I'm not a biometrics expert but a legal expert.
00:00:54
And when I was thinking about which topic to address, of course I thought we could talk
00:01:00
about the proposal for the
00:01:04
AI Act and explain where the negotiations stand,
00:01:08
but I was also thinking about how we see biometric technologies in our societies, and that's
00:01:14
why I chose this title, which talks about utopia and dystopia.
00:01:19
I hesitated, but I decided to choose this title because it's very important
00:01:24
that we understand why there is maybe so much resistance, and why it is still so difficult
00:01:29
to find legitimate uses of these technologies in our democracies and societies.
00:01:36
So it's about giving you some context. Here is the overview of the talk:
00:01:42
I'm going to talk a bit about utopia and dystopia; maybe you know that
00:01:46
this is not only linked
00:01:50
to technologies, but I will look at it as it pertains to technologies.
00:01:54
Then I want to address a topic which I find very important for researchers
00:02:00
and for industry: technology neutrality and what it means.
00:02:05
And I will just point to some researchers
00:02:10
and authors whose theses challenge this technology neutrality.
00:02:16
And then I will discuss the regulation of biometrics in
00:02:21
the future AI Act, and explain where the issues currently are
00:02:27
and the challenges that we have to agree on this
00:02:30
topic. So, I have worked mainly on facial recognition technologies;
00:02:36
I'm not going to address all the biometric technologies that are covered by the future AI Act,
00:02:43
meaning emotion recognition and biometric categorisation systems,
00:02:47
but I will just briefly talk about them as well.
00:02:51
So, concerning utopias and dystopias:
00:02:56
I assume that many of you know science fiction;
00:03:01
if you work in this field you might know science fiction,
00:03:04
and you know as well that we always talk about these two opposite imagined worlds.
00:03:10
That's the famous utopia: at first, the idea was
00:03:13
to describe an ideal society where no laws are necessary
00:03:17
because people behave in an ethical way, and where we don't need
00:03:22
any form of government; such a perfect society doesn't exist, of course.
00:03:27
And then, with the science fiction literature,
00:03:31
the opposite appeared:
00:03:34
the term 'dystopia' came out, describing the
00:03:40
worst form of society; it's a nightmare to live in such a society.
00:03:44
What is interesting is that these two terms would have been coined,
00:03:49
I'm pretty sure, by Thomas More for 'utopia'
00:03:54
and by John Stuart Mill for 'dystopia'.
00:03:59
And they are critical representations of society as well.
00:04:03
I'm sure that you all know Nineteen Eighty-Four by George Orwell:
00:04:08
it's not so much about the technology; it's about the government trying to control every aspect
00:04:12
of people's lives through technologies; technologies are helping to maintain Big Brother there.
00:04:19
I'm sure as well that some of you have heard of The Handmaid's Tale by
00:04:25
Margaret Atwood; or, maybe more recently, a book
00:04:28
that you can see everywhere, Klara and the Sun.
00:04:32
It's an interesting book because it's about a humanoid
00:04:37
robot that becomes a friend to a sick child,
00:04:42
and it's by Kazuo Ishiguro; you can read it if you're interested.
00:04:49
So all these books are representations of how technologies
00:04:54
will be, or might be, part of our society.
00:04:58
But I think the reality sits in between, and there you have concepts like 'ustopia' or 'protopia'.
00:05:05
I don't know if you have heard about these concepts, which combine utopia and dystopia.
00:05:11
'Ustopia' is a concept that Margaret Atwood put forward, because she said that
00:05:17
we have elements of dystopia in our world; it's not a utopia,
00:05:20
of course, that we are currently living in, but we have elements of dystopia.
00:05:25
And it's also something that Ruha Benjamin says;
00:05:28
she is a sociologist in African American studies in the United States,
00:05:34
and she shows actually that these technologies, AI technologies, not necessarily biometric technologies,
00:05:40
are also used to further
00:05:44
discrimination against people and minorities.
00:05:49
So it's very interesting to have these dystopic views; you also have 'protopia', which is like
00:05:56
considering that society is not static and there is always hope
00:05:59
for better; so you have the hope of a better society.
00:06:04
Then I wanted to show you what I found, in science fiction and
00:06:08
in the real world, about this relation to utopia and dystopia.
00:06:14
I don't know if you can see this; I'm not sure you can read it.
00:06:16
Do you know this Biometric Mirror? It's an experiment set up by
00:06:22
an artist, and she installed it at different locations,
00:06:29
like a museum in Amsterdam. I went there with my son; it was very interesting,
00:06:34
because you've got a mirror, and the mirror scans your face: it analyses it,
00:06:40
and according to this scan it will tell you if you are
00:06:43
self-confident, with a percentage of self-confidence, and so on.
00:06:48
And then the idea is that it projects an image of one hundred percent
00:06:52
of each of these characteristics: self-confidence, relaxation, beauty.
00:06:58
You can select the characteristic and see the corresponding face; you can also find it online, on YouTube.
00:07:04
There is an online interface too. It is very interesting to see how an AI
00:07:08
interprets these traits; of course, it's what we feed it, it's not that it really
00:07:12
can interpret beauty. So I thought that
00:07:16
was a different perspective on these
00:07:21
imaginary societies and how we could use biometrics. But then you
00:07:25
have another one, and this is Black Mirror, the episode 'Be Right Back'.
00:07:29
We could cite, I think, most episodes of
00:07:32
Black Mirror, because most of them relate also to biometric technologies.
00:07:37
This one is quite intriguing because it's about bringing
00:07:42
a deceased person back to life by creating a digital avatar.
00:07:48
The thing is that the avatar looks like the person, speaks like the person, but of course
00:07:52
it's not them, because the brain has not been scanned; it's the behaviour which is scanned.
00:07:57
And that raises issues, because some people don't accept
00:08:00
losing their loved ones, of course; and what if we could do that,
00:08:04
why wouldn't we? Because there are limits as well that we should put on this kind of experiment. I'm
00:08:09
not only talking about legality, but about what the moral boundaries should be as well:
00:08:16
how far should we go, and when shouldn't we go ahead? That's the whole question as well with biometric technologies, because
00:08:22
they can provide so many means to achieve
00:08:26
whatever we want, and in the future even more.
00:08:30
So then I wanted to go back to the world that we currently have,
00:08:35
and it's like two sides of a coin;
00:08:38
I have a very simplistic representation of it, for sure. On one side of biometric technologies,
00:08:44
I was thinking of Disney; maybe you know that, in particular, to enter a Disney
00:08:51
park they just take your fingerprints; they say it's convenience, security.
00:08:56
They don't really inform us, but that's another issue.
00:08:58
For them this is an ideal situation: you don't have to use any other payment means,
00:09:05
because you can easily authenticate yourself with your fingerprints, and you've got
00:09:10
carte blanche with it. And then you also have your fingerprint itself,
00:09:17
which unlocks a lot of applications: you can pay with your fingerprints,
00:09:23
you can unlock your phone, you can do
00:09:26
your shopping online, you can have access to all these services.
00:09:29
So for me these uses are seen as convenience, and
00:09:33
I think it's easy to illustrate that with verification and
00:09:37
authentication processes, because these
00:09:42
use cases will have fewer implications in terms of fundamental rights.
00:09:47
But then, the other side. It's a selection, it's not
00:09:50
completely impartial, but it's what I'm thinking of:
00:09:55
this one really troubles me. It's about a lie detector, and I'm sure that you've heard
00:10:01
about it. We could discuss whether I'm presenting a biometric technology per se, because these systems
00:10:08
do not perform biometric recognition, but they do process biometric characteristics.
00:10:14
I'm thinking of iBorderCtrl, a project that was financed by the European Commission to try
00:10:20
to manage the flow
00:10:24
of migrants into the EU. But the problem with that
00:10:28
is the assumption that you can read emotions
00:10:34
on people's faces when they cross the border, whereas when people cross
00:10:39
the border, especially when they are in a stressful situation, they can show different types of emotions.
00:10:44
And we know that the expression of emotions is not universal;
00:10:50
it depends on many things. So we find, actually, that these emotion recognition systems are really problematic,
00:10:56
and we find them in the draft Artificial Intelligence Act. So this was a very bad
00:11:02
project, which received a lot of criticism.
00:11:06
There were some trials in place that had been started:
00:11:11
they had planned to trial it in Greece, Hungary and Latvia, and after a lot of
00:11:15
criticism there was a legal challenge, and all these experiments have stopped.
00:11:21
But I was also thinking of this one; you might have seen it last week or two weeks ago:
00:11:27
a deepfake concerning Tom Hanks, but also MrBeast;
00:11:32
MrBeast might be less known, I don't know, but Tom Hanks is immensely famous.
00:11:37
What happened is that there was a deepfake of
00:11:40
him promoting a dental plan; he never
00:11:44
agreed to that, and he posted a statement to say that he didn't consent to this.
00:11:50
This is to show how easy it is now to generate this kind of content.
00:11:56
In that context it's not too harmful, I would say; of course,
00:11:59
there is harm, because you mislead the followers,
00:12:04
but it could be much more harmful when you use a deep
00:12:07
fake in the context of elections, for electoral purposes for example.
00:12:12
And then I found a paper; I don't know if you are aware of this paper; I'm very intrigued by
00:12:18
it. It's about how humans are represented;
00:12:23
it's a survey that looked at twenty thousand papers and patents
00:12:27
in the computer vision area, and they say that the problem is
00:12:32
the way these studies describe humans: they describe humans as objects.
00:12:38
So this article is maybe
00:12:42
not completely, how
00:12:45
to say it, neutral; it might be read as being against their research,
00:12:50
but you can read it, and I think it's very important as researchers,
00:12:53
and that's why I also talk about technology neutrality; the point is
00:12:58
that everything you communicate will have an impact on the way your research will be perceived.
00:13:04
And it's the same when you select the data. But in that case, the point was to show
00:13:09
that most of the research that was done on human beings was done to develop
00:13:15
surveillance projects. So that's also very important to know. And then there is
00:13:20
the call for more interdisciplinary
00:13:24
teams when you do a project, also to be aware
00:13:27
of the impacts in a broader context, because otherwise you only get the researcher's perspective, and that is not enough.
00:13:33
But I wanted to give you a bit more background on what
00:13:37
biometric data are as well, from a legal perspective,
00:13:41
because all these technologies process personal data, and depending on the processing
00:13:46
these can be qualified as biometric data:
00:13:50
they are physical, physiological or behavioural data that can be linked to an identifiable individual.
00:13:57
That's how we define them, together with a specific function; I can come back to that later. But
00:14:04
what is also so problematic about biometric data and biometric technologies
00:14:09
is that they capture information about our identity, not only
00:14:13
our civil identity, but who we are and how we behave.
00:14:17
So it's about something that can be very personal, something
00:14:21
which we might not want to disclose as well.
00:14:26
And that's why we have all this discussion about whether we should rely more on what the
00:14:31
biometric data say rather than on what people say, on their speech,
00:14:35
in the context of immigration for example, where the
00:14:38
claimed identity is checked against the body, as if the body formalises the identity.
00:14:43
And we know that not all people are at ease with
00:14:46
the technology. So asking whether this is dystopia or utopia doesn't make sense, because
00:14:52
we already use these technologies; the question is more how we should use
00:14:56
them responsibly, and which kinds of uses of these technologies should, or should not, be allowed.
00:15:03
At first it was law enforcement, but it's more and more for commercial purposes as well, and that's why we have
00:15:09
to keep that in mind, because the purposes are not the same: commercial purposes come with profit in mind;
00:15:15
it's not law enforcement, which can of course serve the public interest,
00:15:21
but these are two uses that, I think, are completely different.
00:15:26
And then we have some benefits that have to be put in
00:15:28
the balance: the claims about security. And I was thinking as well
00:15:32
of the case of body identification: you could think that sometimes,
00:15:37
if someone is registered somewhere, if their fingerprints are registered somewhere,
00:15:42
it could be very useful to use this information to identify someone who doesn't have an identity.
00:15:48
Usually DNA is used for that, but we can add fingerprint identification; it may not
00:15:54
be the only technique, but it's one of them, and just because we see that there are some fundamental rights issues,
00:16:01
we cannot say that we should never use it; in some cases, such as body identification, it is very important,
00:16:08
as long as the data are stored somewhere in a database; it can be passport databases or that type of
00:16:13
database. And of course there is the risk that you will lose control over your own identity:
00:16:20
you can be profiled, you can be subject to inferences, incorrect inferences
00:16:24
as well, or something that you don't want to be inferred
00:16:27
from your body, from your behaviour; and you are not necessarily
00:16:30
aware of it, or you will not have to cooperate for it to happen. So
00:16:35
I'm talking about this to give you some background on the issues, because
00:16:39
I think there is a lot, actually, to say about this topic.
00:16:42
But there is still one point that I want to raise: technology neutrality. And
00:16:47
the reason is that, from a legal perspective, when we talk about technology neutrality,
00:16:53
it's very easy: we say we draft legislation, like
00:16:57
the GDPR, with no specific technologies in mind;
00:17:01
that is, you should not write legislation for specific technologies, because they answer a specific technical problem.
00:17:08
But this is not what I wanted to raise here. Here it's about whether
00:17:12
technologies are neutral or not when you design them, when you develop them.
00:17:17
And what is interesting is the position of the Commission, so I'm showing this to you:
00:17:23
in the White Paper, but also in the European Strategy for
00:17:27
Data; these two papers are from 2020. The Commission considers that
00:17:33
this regulation should be technologically
00:17:38
neutral, because we should not attach any value to technologies:
00:17:43
technology is not good, it's not bad; it's just the use of it that matters.
00:17:48
And there are some authors who have challenged that, and I agree with them.
00:17:53
I wanted to mention one of them: Lucas Introna. He
00:18:00
wrote something very nice almost twenty years ago about facial recognition; for me that was
00:18:08
an eye-opener, because he said that the way this technology was designed at that time
00:18:14
would exclude some people: people would not be able to be recognised because you didn't include them in
00:18:20
the training data; there was no data about specific
00:18:26
groups of people, for example. And it's very
00:18:30
important to disclose the technical choices that you make, because they can
00:18:34
be political choices; so it's necessary to inform, and it's
00:18:38
part of fairness as well, part of transparency.
00:18:42
And it will avoid biases as well. So Lucas Introna, for me, is one of the authors
00:18:49
who wrote something visionary about what is happening now. We could
00:18:55
call it disclosive ethics: in your design you will make some choices, and you should explain them.
00:19:01
You find other authors who have a more political stance;
00:19:05
for example Joseph Pugliese, whom you can read: he wrote about biometrics
00:19:10
and biopolitics, because he explained that these technologies were developed
00:19:17
with specific purposes in mind: in the nineties, biometric identification technologies
00:19:22
were used in the context of dominating some populations, and
00:19:28
because of that they were not trained on the right data as
00:19:32
well. So he talks about the whiteness of biometric templates;
00:19:36
that was in the two thousands.
00:19:39
I found that an interesting perspective; it doesn't mean that I agree with him, it's just about
00:19:44
putting things in context and perspective, and noting that sometimes researchers and designers
00:19:49
design on their side and don't think about how their technologies can be used.
00:19:54
So that's another point. And of course there is the issue of
00:19:58
biases. The easiest one is technical bias; you say,
00:20:01
'Oh, you just introduce diversity in your training dataset and you are safe,' but that's not the case. We know that
00:20:08
these biases are also ingrained in our society, in the way we
00:20:12
may see some people, or
00:20:16
treat disability, I don't know; you can think of whatever cultural or historic
00:20:22
biases we have, and you can find them in the systems. It doesn't mean that
00:20:27
the designers or the researchers are biased; rather, the systems represent society as
00:20:31
well, and you are not necessarily aware of these types of biases.
00:20:36
So there is important work to be done on that, about making
00:20:41
visible the biases that you can have, and this is a very difficult exercise.
00:20:47
And last but not least, the vision of the philosophy of technology, of the philosophers.
00:20:55
You also have this thesis, which is interesting; it's about the
00:21:01
valence of technologies. What he explains is that technologies, when
00:21:07
they are developed, can be either good or bad; it's like
00:21:11
a slider, and depending on which type of technology is developed,
00:21:16
it will lean either to the good side or to the bad side.
00:21:20
For example, think of a coffee machine. You would say a coffee machine
00:21:24
is mainly designed for good purposes: you will not use a coffee
00:21:28
machine to harm someone; of course someone could misuse the object, but that's not its purpose. But
00:21:33
if you think of a sword, how can you say that it was not
00:21:37
designed to harm someone, to kill someone? So I like this idea of the
00:21:42
valence, of the value slider, because it shows that
00:21:47
whatever you do, it's always good to think about how the
00:21:52
technology could be misused. It doesn't mean that it imposes more obligations, of
00:21:57
course, on researchers, because you are not here to solve all the problems,
00:22:01
but it's important to question and to be critical about
00:22:05
what you do. And then I just wanted to
00:22:12
show you this; I'm not sure you can read it, but it's a quote
00:22:15
from a paper: technology is never in a neutral stance,
00:22:21
not least because every technology is always designed according to some implicit and explicit values,
00:22:27
by some people or for some people, with certain contexts in mind,
00:22:34
for some uses, with certain affordances and constraints, and so forth.
00:22:38
And I think this is, for me, a good summary of the issue.
00:22:44
And then there's an open question, to which I don't have the answer, about what the role
00:22:49
of researchers and the role of designers is; we should not only focus on the use,
00:22:54
saying the problem is only the use, because some technologies will always be designed
00:23:00
to harm or to surveil, and to affect society in general. So after
00:23:06
this, I would like to jump to the
00:23:11
discussion on the AI Act, because I think that's what you would maybe like to hear.
00:23:15
Just a few words about it, to tell you why it is so difficult to reach
00:23:19
an agreement on it. So we currently have the GDPR, which provides
00:23:26
for the regulation of biometric data as sensitive data,
00:23:30
but the GDPR does not regulate the technologies themselves.
00:23:34
And there was a gap: there was no regulation of the technologies, of
00:23:38
the development, design or use of these AI systems.
00:23:42
And the Commission found a way to regulate them;
00:23:50
we could discuss the way they did it, but the first focus was
00:23:53
on riskier issues, like liability issues and product safety; that is what they
00:23:59
had in mind when they prepared the proposal,
00:24:04
before the official proposal, sorry, when they made the White Paper.
00:24:09
And then you have a lot of provisions that are more safety-oriented, and you have another part,
00:24:16
which is interesting for biometric technologies, about the impact
00:24:20
on fundamental rights, and there the main focus is on these technologies.
00:24:25
So the proposal was published in 2021,
00:24:30
and, in case you don't know, the Commission set out
00:24:34
a risk-based approach with four categories of AI systems. You had unacceptable risk,
00:24:40
under which falls the use of remote biometric identification systems in publicly accessible spaces for law enforcement in real
00:24:46
time; but I will come back to that. Then you have high risk, where most of the systems fall;
00:24:53
the list is very long, but I have a slide to show you. And then you have
00:24:57
limited risk, where the way it was worded is a bit strange, and
00:25:01
I am critical here, because for example the Commission considered that deepfakes were just limited risk,
00:25:07
only subject to transparency: you should warn users that they are interacting with a deepfake.
00:25:15
But we know that deepfakes rely on a diversion of technologies, of techniques that were not developed
00:25:21
with harm in mind, because they were developed for a very
00:25:25
specific purpose at the origin; it doesn't mean that we cannot find other,
00:25:29
beneficial, uses of these techniques and technologies. But anyway,
00:25:34
the Commission didn't understand the risk that deepfakes pose to society;
00:25:40
this has changed in the negotiations. And there is minimal risk as well: actually, biometric technologies
00:25:47
used for verification purposes don't pose any particular risk, according to the Commission,
00:25:52
so they are not subject to any rules, which means that they can be
00:25:57
developed; for the moment, remember, this is draft legislation; they are
00:26:02
not subject to specific rules regarding their development or when they are sold on
00:26:06
the market, because the AI Act is also internal market legislation: it applies
00:26:13
to AI systems that will be designed in the EU and put on the EU market, or
00:26:19
designed for the EU market, of course. So yes, I wanted to
00:26:23
show you the three types of biometric systems that
00:26:27
are regulated: emotion recognition, biometric categorisation
00:26:33
systems, so, according to the Commission, these two
00:26:36
systems are also limited risk, except in some
00:26:39
few cases, in particular law enforcement,
00:26:44
and remote biometric identification systems. This last one is actually
00:26:50
a broad category that covers facial recognition technology, voice or gait recognition,
00:26:57
any type of technology that you could deploy at a distance in public spaces. This definition
00:27:03
is far from perfect; it has been highly criticised, because from a technical perspective
00:27:09
identification is always 'remote' in a sense; I think what is meant is
00:27:14
remote capture: there is no cooperation, there is no awareness, it's at a distance.
00:27:20
This will probably not be the final definition, but it's just to
00:27:25
show how far, sometimes, from a legal perspective, we
00:27:29
are from accurately describing the techniques or the technologies.
00:27:33
It's highly difficult: which kind of definition should you adopt? But besides that, what
00:27:40
I want to say, and afterwards I will tell you more about remote biometric identification systems, is about the calendar.
00:27:47
It's important, because this AI Act is not
00:27:51
yet legislation; it may never be adopted. It was proposed
00:27:55
in 2021 by the Commission; the Council,
00:27:58
which represents the governments, adopted its position in November 2022,
00:28:03
and the European Parliament, which represents the citizens of the
00:28:07
EU, adopted its position in June 2023.
00:28:11
Based on that position, a lot of media reported that the AI Act
00:28:15
was adopted; this is not true. So currently we don't even know if
00:28:19
we will have an AI Act. We have entered the
00:28:23
interinstitutional negotiations, the so-called trilogues, between the three institutions; they have started,
00:28:31
and for the moment they haven't been able to agree on remote biometric
00:28:36
identification systems, emotion recognition or categorisation systems. The Parliament has asked
00:28:43
as well to ban the use of emotion recognition for certain
00:28:48
purposes, because of the research relating to
00:28:51
reading emotions and interpreting emotions.
00:28:55
So for the moment we don't know where remote biometric identification stands, and I apologise
00:29:02
for that, because I have some slides but I really need to rush to explain the differences.
00:29:08
So the Commission proposed a ban. When we saw that the Commission proposed
00:29:12
a ban on the use in real time, so live use,
00:29:17
of this technology in public spaces by police, everybody thought, 'oh, it's a good idea,'
00:29:23
but then you have three broad exceptions, and when I say 'broad'
00:29:27
exceptions, it's because the first one is about finding potential
00:29:31
victims of crime, including missing children. So you would
00:29:35
deploy this technology in real time, which seems to be
00:29:40
disproportionate. The second is the prevention of an imminent threat to life and safety, or of a terrorist
00:29:47
attack; and there is national security as well, I think;
00:29:51
maybe they are already using this technology for these purposes and we don't know it.
00:29:57
But the worst is the last one: it's about detecting, localising and prosecuting criminals or suspects
00:30:03
based on a list of offences; there is a list of thirty-two
00:30:06
offences that can be subject to the European Arrest Warrant.
00:30:10
But this list of offences has been interpreted differently
00:30:14
by Member States, which means that minor offences
00:30:18
can lead to a European Arrest Warrant, and on that basis you could use this technology.
00:30:25
Okay, so I'm trying to go a bit faster. Other
00:30:30
uses of this technology will be high risk, and this includes the case where a
00:30:34
private company is doing security on behalf of law enforcement; in the same
00:30:39
conditions it wouldn't be banned; according to the Commission it would be high risk.
00:30:45
So the Council, which represents the governments, went
00:30:48
further regarding the exceptions, and it's not surprising, because the Council
00:30:52
wants to protect its competences; the Member States want to ensure
00:30:55
more surveillance, so using this technology for more purposes,
00:30:59
like national security, for example, or the prevention of attacks on critical
00:31:03
infrastructure. So it shows that as soon as you start opening
00:31:07
this box of exceptions, you can go very far.
00:31:11
And then the Parliament: the Parliament represents the
00:31:15
citizens, and the Parliament does not want any use of this
00:31:18
technology in real time or after the event, so no
00:31:23
retrospective use, by any actor. There is still one exception:
00:31:28
in case of the prosecution of specific serious crimes,
00:31:32
after judicial authorisation; this is the only exception.
00:31:37
But the Parliament had difficulties reaching its own political
00:31:43
consensus on these provisions; just before
00:31:49
it adopted its position, a political party said that they
00:31:53
wanted more exceptions on remote biometric identification systems. And the problem here, and that's why I say it's a delicate
00:32:00
topic, is that we have to find a balance
00:32:03
between the substantial public interest in security,
00:32:09
the security, I mean, of citizens, and their fundamental rights in these public spaces.
00:32:17
And another problem that I see is that the Commission has to justify these exceptions, because
00:32:23
they impact our fundamental rights, and there is no evidence that these technologies
00:32:29
are efficient, that they are necessary in our democratic societies; there is no evidence.
00:32:36
And you have several NGOs that are against any form of use of these technologies.
00:32:42
But since some Member States are already using biometric technologies,
00:32:47
since they are already using facial recognition technologies retrospectively for investigation purposes, I don't think
00:32:53
that the Council, so the governments, will want to ban it; I can't
00:32:59
imagine they would. And so now we are really at a crossroads:
00:33:05
what will happen to this AI Act? Because as it stands
00:33:10
now we don't have an agreement on that, and for the European Parliament we have elections next year, in June,
00:33:16
and if we don't have an agreement by the end of the year, it's very difficult
00:33:20
to ensure that we would get an agreement in the first months of 2024.
00:33:25
So it's a very difficult balance to find and to put
00:33:30
into the Act; it's not going to be fundamental rights versus public security,
00:33:35
it's going to be the different provisions of the Act that
00:33:39
will be a compromise, that will be balanced against each other.
00:33:44
And then, to finish, I just wanted to give you one update.
00:33:47
It's about the European Court of Human Rights, concerning the right to privacy:
00:33:53
a very interesting case that was published in July, where a
00:33:58
demonstrator was arrested because he was basically demonstrating in the underground,
00:34:04
and after a few days, I think, the police found his trace again in the underground:
00:34:11
they used facial recognition, retrospectively and live, because they could identify him
00:34:16
through social media, through his features. The Court said it's not necessary
00:34:21
in a democratic society, this kind of use; they said
00:34:24
that for that context, but they didn't take a position on whether
00:34:30
deploying facial recognition technology in public spaces is
00:34:33
in itself necessary in a democratic society;
00:34:38
they leave that to the institutions, because that's a political debate,
00:34:43
and that's a debate as well that we should all have as citizens,
00:34:46
and not abdicate just because we delegate it to the institutions.
00:34:51
I know that in Sweden you might have the
00:34:54
right to a referendum, so you can maybe participate more in
00:35:00
this type of debate. But it's also about thinking of which kind of
00:35:04
society we would like to live in, as citizens and as researchers.