Transcriptions

Note: this content has been automatically generated.
00:00:00
— so we can skip a couple of slides.
00:00:04
Ah yes, thank you very much for getting the recording working.
00:00:09
And so I have come here today not to present solutions but to ask questions,
00:00:16
and I hope to get lots of inspiring input from all of you. Normally
00:00:23
people are supposed to ask questions at the end, but if you have any insight, especially in the last third
00:00:30
of the talk, please raise your voice and let's make this an interactive thing.
00:00:35
So I will go through these four stages of thinking: who we are,
00:00:42
why I'm here — why we want to talk about airworthy
00:00:46
AI at all; what airworthiness actually means,
00:00:49
which might be quite new to
00:00:53
some of you; and then the how, and that was
00:00:56
the big question mark, so that's specifically where all your input would be most welcome.
00:01:02
So we are these twenty-two
00:01:07
people in Zurich.
00:01:11
We started the company two years ago, and our aim is to make an autopilot
00:01:18
that can convincingly outperform the human pilot when flying the electric
00:01:23
personal transport aircraft of the near future.
00:01:31
Why, you ask? So this all started when we saw companies
00:01:39
like this and projects like this take off. There are now more than a hundred
00:01:43
companies and projects — startups, and also some bigger companies — in
00:01:47
the game, trying to build electric aircraft. The direct enabler
00:01:51
is that the energy density of batteries has come to the point
00:01:55
where it is now possible. That's of course driven
00:01:58
by the mobile computing revolution, but now transport is beginning to reap
00:02:02
the benefits of that. And where is my team?
00:02:07
Ah, there they are. I forgot to say that I'm not here alone; I'm actually just the figurehead
00:02:13
of this company. The people that know about deep learning and AI are over there, and
00:02:18
the person that knows about actual flying is now pointing at the microphone.
00:02:24
So — I nearly forgot that part:
00:02:29
I'm not here alone, I'm really the figurehead. So we saw
00:02:34
lots of people starting this because of the batteries, because it is now possible — which is all great,
00:02:39
except if you want to do anything interesting, you're going to have to get rid of the pilots.
00:02:45
Two weeks after we started, Airbus came out with a report
00:02:49
saying as much: electric flying for personal transport is fantastic,
00:02:54
but we're going to have to get to full autonomy as soon as possible.
00:02:58
They have actually started two projects in this space, and even UBS —
00:03:03
not known for wild, crazy opinions — said that this is technically feasible and could bring
00:03:09
lots and lots of benefits. And when they say benefits, they mean money.
00:03:15
The reasons for that — we could go
00:03:19
into the details, but I don't have the time. They
00:03:22
commissioned a report that said, you know, we could really do with more autonomy in flight.
00:03:27
These green things: the more green a bar is, the more they think
00:03:32
that automation, autonomy and computer science can help.
00:03:35
The things where they think humans will still be needed —
00:03:40
those are the things that they left grey. It turns out this was
00:03:44
after we had already started on our product road map. We saw this, and the things where
00:03:50
they say "yeah, you're going to need humans there" — that's exactly where we're working to replace them.
00:03:54
So that includes talking to air traffic control and emergency procedures, but for now
00:04:01
it means not flying into other people and only landing where you really want
00:04:05
to. Those are the early things we're working on.
00:04:11
So, our product road map. Our co-founder actually can fly helicopters —
00:04:16
she went through the whole commercial pilot license for helicopters skill test,
00:04:21
and there's a set of things, about fifty of them, that you have to show you can do as a pilot.
00:04:27
And what we want to do is do all those things, only clearly better — the philosophy being
00:04:33
that that's the easiest way to get to adoption in the market and in society, because
00:04:39
airspace exists, it has this huge body of regulations; it's like a legacy system that
00:04:45
you have to interface with. And if you start with lots of new instruments,
00:04:48
new ground infrastructure and new rules, that's a much larger overhaul
00:04:52
than being able to fly where humans can fly nowadays.
00:04:55
It's like if your fancy
00:04:59
fintech startup with wonderful trading algorithms
00:05:02
can't talk to the PDP-11 that's in the basement of the bank — you know you have a problem. So this is our
00:05:09
view of the market in terms of the legacy systems that you have to overcome.
00:05:15
And then, if you go through this pilot license skill test, you realise that a massive chunk of it is
00:05:21
flying VFR — visual flight rules, i.e. using your eyes — so all the instruments are more or less optional
00:05:28
as long as you have one eye open and one hand on the stick. You know,
00:05:32
in many small aeroplanes there's actually a mechanical connection between your muscles and the flight control surfaces.
00:05:39
As long as that works, you should be able to get the
00:05:42
plane on the ground safely. That's a pretty high bar for reliability,
00:05:46
and it means that you use your eyes as the first instrument and also
00:05:49
as the last one — the last resort when everything else fails.
00:05:52
So if we want to build these systems, we have to build vision-based
00:05:55
systems to replicate where humans can fly and where they can land.
00:06:00
So this is one of our potential customers' aircraft, with our camera system set up.
00:06:10
One of the advantages we obviously have is that we can look in all directions at the same time,
00:06:15
which humans typically can't. We can have our emergency plan ready when it's needed, instead of
00:06:21
after it's needed. And this is an example of the kind of system we're building:
00:06:29
the problem is represented over there. We need
00:06:33
to be able to spot things in the sky.
00:06:37
We've trained a neural network to recognise drones; it's very good at recognising drones.
00:06:42
There are other systems that we build, like a visual SLAM
00:06:46
system, and there's some other logic on top of this. But
00:06:50
this is the class of systems I want to talk about, because this is the kind
00:06:53
of system that competes with a human cognitive function as a whole —
00:07:02
the one that we need to address. So to summarise: if we want full
00:07:06
autonomy in the air, we need to build systems that outperform the human
00:07:10
on every skill they are currently tested on, and that
00:07:14
means visual flight rules first, and that means we have to process
00:07:18
visual information at a semantic level. So we should not just see a cloud
00:07:22
of points and our own position in an abstract three-dimensional space;
00:07:25
we should understand that that's a thing that will not be friends with us when it gets closer,
00:07:30
and that's a place where I would be happy if I landed.
00:07:32
And that's currently only possible with deep neural networks.
00:07:38
It's only recently become feasible to reliably tell the difference between a cat and
00:07:42
a dog in an image, and we need to extend that and bring it to market:
00:07:46
tell the difference between something that you could safely land on versus
00:07:50
something that has lots of cats and dogs, right? So,
00:07:56
that is why we talk about airworthiness of deep learning
00:08:00
and AI. Now what, you might ask, is airworthiness?
00:08:06
This is what it looks like: an airworthy computer
00:08:13
looks like this. It is insanely expensive, and comes
00:08:18
with a pack of documentation with which you can
00:08:21
go to EASA or the FAA to prove that it will probably not
00:08:26
die in mid-air. It also means that it runs about
00:08:31
five to seven years behind what's in your laptop
00:08:36
or phone. So we have to run tomorrow's
00:08:38
algorithms on yesterday's computers. That's with
00:08:43
respect to the hardware. The system we're building is an autopilot, and this is Part 23,
00:08:51
paragraph 1329 — the first rule that applies to airworthiness of autopilots — and you can see
00:08:57
the first rule is that you should be able to switch it off and have a
00:09:00
human work without it. I'm sorry, I have to quickly kill
00:09:05
the person that's trying to adjust the sound — somebody's trying to reach me.
00:09:10
So, there's a good reason for this, namely that flying is actually very dangerous,
00:09:15
so your systems have to be very safe and you have to demonstrate they're very safe; as long as they're not, you have rules like this to overcome.
00:09:22
So what does that look like in practice? If you make software — make
00:09:25
a system — that's trying to be airworthy, you have to follow processes.
00:09:30
There's a whole industry of people that will come and tell you what documents to write, what
00:09:35
the table of contents of the documents should be, and how they should be formatted.
00:09:38
These documents have inspiring names, like ARP4761
00:09:42
or DO-178C —
00:09:46
the old one was B, but we're now at C — and they govern the sort of processes you have to have in place
00:09:53
before you go to EASA or the FAA to say: I programmed
00:09:57
this, is it okay if I put it in an aeroplane with real people?
00:10:02
One of the crucial concepts you have to deal with is the
00:10:07
development assurance level, or DAL. The DAL
00:10:14
levels — the design/development assurance levels — are
00:10:21
related to the safety levels in cars,
00:10:24
the ASIL levels, but anyway, they run the other way around in aerospace.
00:10:32
So level E means the in-cabin entertainment system, the
00:10:36
cabin lights, your toilet — things that,
00:10:39
if they break, do not affect the safe function of the aircraft. To comply,
00:10:44
you basically have to prove that it doesn't catch fire, and you're good.
00:10:48
Levels D and C cover instruments that help the pilots:
00:10:52
you're at level D if a breakdown causes a nuisance to the people
00:10:58
trying to fly the aircraft. Level C is if you lose your primary flight display
00:11:02
and you have no idea where you are any more — you're seriously freaked
00:11:06
out if that happens. So communications to air traffic control is level C.
00:11:11
Levels B and A are for systems that bypass the pilots:
00:11:14
your remote control system for a jet engine, that's level A — you
00:11:18
know that should not break. These numbers here that you see
00:11:22
are the tolerated failure rates per hour of flight.
00:11:26
So if every thousand hours of flight your level D system breaks — meh.
00:11:31
If your level A system breaks more often than once per billion hours of flight,
00:11:38
people are going to be upset — well, basically if it breaks at all they will be
00:11:43
upset; these are the things that kill lots of people if they break.
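As a rough sketch of the ladder just described — the numeric budgets follow the talk's examples (level D around 10⁻³ failures per flight hour, level A around 10⁻⁹; the exact targets come from the certification guidance) — the levels can be written as a small lookup:

```python
# Hypothetical sketch of the DAL ladder described above.
# Failure-rate budgets are per flight hour, following the talk's
# examples (level A: once per 1e9 hours; level D: once per 1e3 hours).
DAL_MAX_FAILURE_RATE = {
    "A": 1e-9,  # catastrophic if it fails: e.g. jet engine control
    "B": 1e-7,  # hazardous: other systems that bypass the pilot
    "C": 1e-5,  # major: primary flight display, ATC communications
    "D": 1e-3,  # minor: a nuisance to the pilot
    "E": None,  # no safety effect: cabin entertainment, toilet light
}

def acceptable(level: str, observed_failures: int, flight_hours: float) -> bool:
    """Crude check: is the observed failure rate within the DAL budget?"""
    budget = DAL_MAX_FAILURE_RATE[level]
    if budget is None:          # level E: no failure-rate requirement
        return True
    return observed_failures / flight_hours <= budget

print(acceptable("D", 1, 2000.0))   # one failure in 2000 h is fine for level D
print(acceptable("A", 1, 1e6))      # one failure in a million h is not level A
```

Of course, certification does not work by observing failures after the fact — the point of the process is to argue these rates in advance — but the budgets themselves look like this.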
00:11:50
So that puts in place a lot of things they want to see from your system when you make it.
00:11:57
To give you an example of what this means:
00:12:01
when you make software at the highest level,
00:12:05
basically the person who designs it cannot be the person who implements it, cannot be the person that tests it, cannot
00:12:10
be the person that checks that everybody is following the plan. So you need easily four people to,
00:12:16
you know, change a light bulb. You need to have
00:12:19
a very detailed hierarchy of requirements: the high-level requirements —
00:12:23
what the aircraft is supposed to do, what your system is supposed to do in the aircraft,
00:12:28
what the hardware and software are supposed to do. The high-level requirements for the software are
00:12:33
what you think of as what the thing has to output; the lower levels are what you'd call the design.
00:12:39
You have to chase that down to the code and the tests and show that it's all there and you didn't forget anything.
00:12:45
So the admin is the burden, but it's also how well you sleep at night, because remember: if your system
00:12:52
messes up, people will die, and typically people don't cope with that very
00:12:57
well. Every line of code
00:13:01
needs a sign-off that it actually is correct. So if you're used to writing apps
00:13:06
for phones, you should not be allowed within twenty metres of a keyboard
00:13:10
connected to a computer used for safety-critical programming. Every
00:13:15
condition and every if statement needs
00:13:19
multiple test cases, and this — I forgot what the abbreviation
00:13:23
stands for, but if you have a boolean formula
00:13:26
that decides which way your if statement goes, it
00:13:29
shows that every variable in there can independently affect the outcome.
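The coverage criterion being described is MC/DC (modified condition/decision coverage) from DO-178C. A minimal sketch of what it demands — the decision function here is a hypothetical stand-in, not flight code:

```python
from itertools import product

def decision(a: bool, b: bool, c: bool) -> bool:
    # Hypothetical branch condition from some piece of flight software.
    return a and (b or c)

# MC/DC asks: for each condition, does the test suite contain a pair of
# cases that differ ONLY in that condition and produce different
# decision outcomes?  Here we search for such a pair exhaustively.
def mcdc_pairs(fn, n_vars):
    pairs = {}
    cases = list(product([False, True], repeat=n_vars))
    for i in range(n_vars):
        for case in cases:
            flipped = list(case)
            flipped[i] = not flipped[i]
            if fn(*case) != fn(*tuple(flipped)):
                pairs[i] = (case, tuple(flipped))
                break
    return pairs

pairs = mcdc_pairs(decision, 3)
# Each of the three variables can independently flip the outcome, so a
# compliant test suite must contain such a pair for every variable.
print(len(pairs))  # 3
```

For level A software, every decision in the code needs a test suite that exhibits such pairs; that is why "every if statement needs multiple test cases".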
00:13:33
At some point you have to prove that your compiler actually
00:13:38
compiles your language correctly to the byte code. So
00:13:42
we actually have someone who's modifying the compiler that's used to
00:13:46
generate the GPU code, because it turns out
00:13:50
you cannot assume that your compiler is friends with you. This one —
00:13:57
"requirements correlated to target processor" — sounds kind of abstract, but it includes things like:
00:14:01
so you're programming a motor controller for a jet engine, and you thought it's great that we can now
00:14:07
have multiple cores rummaging around in shared memory, as you do nowadays — yeah, that's not going to happen,
00:14:12
because those cores might contend for the same shared memory at different times,
00:14:17
and you might have microseconds of fluctuation in your run time, and that's not
00:14:21
going to be level A. So the level A motor controller
00:14:26
computers nowadays are hundred-megahertz, eight-bit computers that
00:14:31
will not break. They will also have problems dealing with megapixels of
00:14:37
imagery at high frame rates. Code reviews and
00:14:45
software quality assurance — there are even more of these things to cover — but now we have a question.
00:14:56
[Inaudible audience question.]
00:15:09
Well, that's actually an interesting question. The problem with software is you can't actually prove it's —
00:15:15
you can't prove the code is correct. So what they do is they assume it will fail,
00:15:20
and it goes into the fault tree as something that will definitely fail, and everything else has to deal with that.
00:15:26
And then, even though it will definitely fail, you have to go through this list of things to
00:15:32
prove that it probably won't fail that often. So it's actually —
00:15:39
it's actually even more messed up than I was explaining.
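The "assume it will fail" stance can be made concrete with a toy fault-tree calculation. All numbers here are hypothetical, as is the architecture (an independent monitor plus a dissimilar backup):

```python
# Toy fault-tree sketch of the argument above: the software item is
# assumed failed (probability 1.0), so the hazard budget must be met
# by the independent mitigations AND-ed with it.
P_SOFTWARE_FAILS = 1.0   # certification stance: assume it will fail
P_MONITOR_MISSES = 1e-5  # hypothetical independent hardware monitor
P_BACKUP_FAILS = 1e-5    # hypothetical dissimilar backup channel

# AND gate: the hazard occurs only if all three fail together
# (assuming independence, which itself has to be argued).
p_top = P_SOFTWARE_FAILS * P_MONITOR_MISSES * P_BACKUP_FAILS
print(p_top < 1e-9)  # True: well inside a level-A budget of 1e-9 per hour
```

The independence claims are carried by the separate components' own qualification evidence; the process steps are then what make "probably won't fail that often" defensible for the software on top.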
00:15:50
[Inaudible audience question.]
00:15:57
Yes, for those things it is, you know: we measured in the lab
00:16:01
that this thing can turn around this many times before breaking,
00:16:04
or we have this switch that we switched on a billion times
00:16:08
before it breaks, or we have these artificially aged surfaces and
00:16:11
we show that the ablation rate is twelve microns per year, so we should be able to fly for fifty years.
00:16:17
With software there's none of that. So for software it is, you know: all of this — except
00:16:24
we can't prove that, so why don't you just follow these processes, and then
00:16:28
at least we have some indication that it will probably not fail.
00:16:39
But for materials, yes. [Inaudible audience discussion.]
00:16:55
Yes.
00:16:59
Yes, that is — that is the case.
00:17:04
So ultimately — it all sounds very, you know, hermetically
00:17:09
closed, but in the end it's just a story,
00:17:12
right? And it's a story where at each of the steps someone may come in and say "I don't trust this",
00:17:17
and then you say "well, but I have this all documented, they actually saw it". And yeah, that's a lot of work.
00:17:22
This is how much work it is: you first make a plan for
00:17:27
all the stuff you're going to do, and you have a plan
00:17:31
readiness review — this is the review that says the plan is ready —
00:17:36
and then you write the system specification, and this is Stage of Involvement one, when the FAA
00:17:40
comes by and sees that you have a plan, and they sign off on the plan.
00:17:44
Then you go and write high-level requirements and you have a preliminary design review,
00:17:49
and then you review the design —
00:17:53
and then you write some more specifications, and then you write some code, and then
00:17:56
they come by a second time to see if you're doing everything correctly.
00:18:00
Then you have validation and verification at the low level, and then you put it
00:18:05
all in a box — does the box work? — and they come and look at
00:18:08
Stage of Involvement three to see if your box works as planned.
00:18:12
And then there's a fourth, when you put the box in the plane, and they
00:18:17
say okay, you can go fly. Because they come by at all these stages, this is
00:18:21
very hard to fake; it's also hard to reverse-engineer. That's basically:
00:18:25
if you show up with ready-done things and go "okay, can I have a certificate,
00:18:31
here is the thing, it really works" — you basically start over and go through all these phases.
00:18:41
[Inaudible audience question.]
00:18:44
Yes — and how you planned the whole thing.
00:18:58
In medical research this is also happening now.
00:19:06
I think you just made a political statement, but we can get to that
00:19:11
later. So, these slides were borrowed — we hired two teachers of a
00:19:16
wonderful course on DO-178C, which is all about this pile of
00:19:20
planning you have to have in place before you can write software.
00:19:26
For many people active in the market, especially people coming in from internet cowboy land,
00:19:30
all these parts are, you know, a barrier to progress.
00:19:35
But if you have to do it, you might as well embrace it and make it make your life better, because you will
00:19:40
lose a lot of sleep if somebody dies because you had a
00:19:43
bug in your thing. And also,
00:19:49
all these things exist for good reasons, so you should not just wave this away and say, you know,
00:19:55
"we're just going to bulldoze away the regulations and the world will be a better
00:19:59
place", because (a) it's probably not true, and (b) good luck with that.
00:20:06
So to recap: we have a problem. I told you in the
00:20:11
first part that we need to make deep learning systems that can deal with
00:20:15
processing visual information, and in the second part I told you that to get any sort of
00:20:20
certificate of airworthiness you need to do all these things. Now, for
00:20:25
what you call the low-level requirements — this is where you put in "I have
00:20:28
this function computing this", or "I have this filter with these constants" —
00:20:34
now all of a sudden we show up with six million weights in our ResNet-35,
00:20:38
or fifty million, or ResNet-34 — I forget which ResNet we use, thirty-four
00:20:45
or all of the ResNets — and so there's this ocean
00:20:49
of constants, and they're going to ask not only "why is
00:20:53
this good enough?" — because you could say "well, I have all these tests" — they'll also say "why is this 3.55?"
00:20:59
And that's just the first question; there are another fifty
00:21:03
million questions to go. So if you want to have
00:21:08
a story of why it's safe enough, it's not enough to just have black-box tests — if they were exhaustive, fine,
00:21:15
but they're not. So you need to open the box and make sure there's no funny stuff in there,
00:21:20
so we need a way to show there's no funny stuff in there. So that is the rest
00:21:24
of the talk: how are we going to get neural networks
00:21:28
approved for application in safety-critical scenarios when there is
00:21:33
no established way for the authorities to think about this?
00:21:36
That's something we have to develop: we not only have to develop systems that can
00:21:41
do this, we also have to develop the story to show that they're safe enough.
00:21:46
In other words, we need a development assurance process
00:21:50
for neural networks. Incidentally, there's been lots of literature on applying neural
00:21:56
nets in flight. In 2006 already, Andrew Ng at Stanford
00:22:02
made model helicopters do stunt flying, and they vastly outperformed human pilots.
00:22:07
I talked to him in 2011 or 2012,
00:22:10
when he was at Google, and he wasn't wild about it:
00:22:14
"you're already flying helicopters, if I understand correctly —
00:22:18
why wouldn't you do something with real helicopters, lifting real people?" But with real helicopters,
00:22:23
nobody's going to trust a neural network. Airbus had a project
00:22:30
with a university somewhere in the UK in which they applied a
00:22:34
number of neural nets to control certain aspects of a big
00:22:37
A320 or A340 — vastly outperforming the human — and the article ends with, you
00:22:43
know, "unfortunately this is never going to be certified, because it's just a bunch of random numbers".
00:22:46
Then there was a study for the FAA on the application of adaptive systems.
00:22:52
They massively confused the matter, because they conflated "adaptive" and "neural network":
00:22:59
if you have something that changes its configuration in flight, you have a fundamental problem
00:23:04
proving that it is anything; but a neural network itself needn't be
00:23:11
modified in flight. Also, they concentrated on the case where this is
00:23:16
in the control loop, which affects the provability, and a crucial part of that is that you have
00:23:19
a model of reality — you're controlling something — and that makes it much more complicated. So what we're going to propose later
00:23:24
is that we do something slightly different. If you go through this
00:23:28
hierarchy of requirements: for example, we have an instrument that is going to
00:23:32
tell the central flight computer to abort the landing because it doesn't look safe to land.
00:23:38
Everything is exactly the same as a normal system, except this one function in
00:23:42
the middle that takes one image in and produces one classification out —
00:23:47
is this a landing site, yes or no — and just that function, which
00:23:51
has the fifty million numbers,
00:23:56
is what we have to argue about: where they come from, and why they're correct, if they are correct.
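The boundary being described — one image in, one yes/no decision out, with all the trained weights hidden inside — can be sketched as follows. Everything here (layer sizes, names, the two-layer network) is a toy stand-in for illustration, not Daedalean's actual model:

```python
import numpy as np

# Hypothetical stand-in for the certified-function boundary described in
# the talk: one image in, one yes/no "safe landing site" decision out.
# The trained weights live inside `weights`; certification would have to
# account for every one of them.
rng = np.random.default_rng(0)
weights = {
    "w1": rng.standard_normal((64, 16)),  # toy sizes, not a real ResNet
    "w2": rng.standard_normal(16),
}

def is_safe_landing_site(image: np.ndarray) -> bool:
    x = image.reshape(-1)[:64]            # flatten and truncate: toy only
    h = np.maximum(x @ weights["w1"], 0)  # ReLU layer
    logit = float(h @ weights["w2"])
    return bool(logit > 0.0)              # single yes/no classification

decision_out = is_safe_landing_site(rng.standard_normal((8, 8)))
print(decision_out)
```

From the certification point of view the interface is as simple as an ordinary instrument; the difficulty is entirely in justifying the contents of `weights`.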
00:24:02
So that in itself is not that different from
00:24:07
the rest of artificial intelligence. One definition of artificial intelligence is: using
00:24:13
an understanding of the world to automatically effectuate some effect — something like that.
00:24:19
This is the PID controller — invented by James Watt —
00:24:23
that kicked off the industrial revolution; this is the modern representation of it. It has three random numbers, and
00:24:30
nobody has any idea where the P, the I and the D come from, but that's no problem, because there's an
00:24:36
accepted process to get to these three random numbers,
00:24:41
and there's an accepted way to analyse that this is a stable system. This is actually harder than the
00:24:46
example I gave before, because this process requires a model of reality that you have to trust.
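For reference, a discrete PID loop looks like this — the gains `kp`, `ki`, `kd` are the "three random numbers"; the values below are arbitrary tuning picks for a toy first-order plant:

```python
# Minimal discrete PID controller: the "three random numbers" are the
# gains kp, ki, kd, traditionally found by an accepted tuning procedure.
def make_pid(kp: float, ki: float, kd: float, dt: float):
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint: float, measurement: float) -> float:
        err = setpoint - measurement
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Toy plant: a first-order lag driven by the controller output.
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1)  # hypothetical gains
y = 0.0
for _ in range(200):
    u = pid(setpoint=1.0, measurement=y)
    y += 0.1 * (u - y)  # crude plant model
print(round(y, 2))      # settles near the setpoint 1.0
```

The stability analysis the talk refers to lives outside this code: it is an argument about the closed loop of controller plus plant model, which is exactly the trusted model of reality the speaker wants to avoid needing.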
00:24:51
So if we can do away with that, maybe we have a case. So
00:24:55
we went to talk to EASA, and they're like —
00:25:01
well, we're going to talk to them some more. So now we
00:25:06
get to the interesting part, where you all get to be involved.
00:25:12
I remember that someone told me at this point I should ask a question — oh yes, I should ask the obvious question.



Conference program

Airworthy AI; challenges of certification, part one
Dr. Luuk van Dijk, Founder and CEO of Daedalean
12 Oct. 2018 · 2:05 p.m.
Airworthy AI; challenges of certification, part two
Dr. Luuk van Dijk, Founder and CEO of Daedalean
12 Oct. 2018 · 2:30 p.m.
Airworthy AI; challenges of certification, Q&A
Dr. Luuk van Dijk, Founder and CEO of Daedalean
12 Oct. 2018 · 2:56 p.m.
