
Transcriptions

Note: this content has been automatically generated.
00:00:01
Yes, so as you know, this is the Robot Learning and Interaction group.
00:00:07
When we think of robotics, we might have this type of application in mind,
00:00:12
but there are many upcoming applications that look
00:00:15
very different. This includes robots assisting us,
00:00:20
it includes robots being part of our body, and it includes
00:00:25
robots working far away from us, teleoperated by people.
00:00:30
The common point between these applications is that these robots need to
00:00:35
be reprogrammed more frequently, often by the users themselves.
00:00:40
This is why learning is essential for this kind of robot, and in our group
00:00:45
we are particularly interested in the learning from
00:00:48
demonstration aspect. There are several underlying challenges.
00:00:53
First, we face a meta-learning problem: knowing how to orchestrate different learning modalities.
00:01:00
We can learn by exploration, we can learn by observation,
00:01:05
and we can also learn with the user physically guiding the robot through the different steps of the task.
00:01:12
Then we have a series of correspondence problems, meaning that the task
00:01:18
will need to be transferred from one environment to another,
00:01:22
and it will also need to be transferred from one robot to
00:01:25
another robot, so several correspondence problems that we need to face.
00:01:31
And maybe one of the most important challenges here is to find a representation of
00:01:36
these movements and skills that the robot needs to acquire.
00:01:42
The challenge is that we would like the robot to learn with only a few demonstrations, and we would like the robot to
00:01:48
train in the real world with only a few trials to acquire some skills.
00:01:54
So this boils down to finding model priors that are expressive
00:01:59
enough to be used in a wide range of tasks.
00:02:03
So my presentation will be articulated around three model priors
00:02:07
that also reflect our research activities in the group.
00:02:10
The first one relates to movement primitives: what are the building blocks that
00:02:15
we need to have the robot learn efficiently from few demonstrations?
00:02:21
The second concerns the fact that these movements will often be associated to objects or
00:02:26
landmarks in the environment. The third relates to geometric aspects.
00:02:33
Indeed, we have a lot of structure in the data that we get in robotics, and sometimes it is
00:02:38
under-exploited, so I will show how we could build
00:02:41
tools to better exploit these structures and geometries.
00:02:47
So for the first model prior, we borrow tools from optimal control.
00:02:53
Here we get this typical objective function, which consists of finding a sequence
00:02:58
of control commands that try to track a target or a path,
00:03:04
and that try to do it by using small control commands. The typical problem here would
00:03:10
be to say that we want to reach a point, depicted here, with a desired precision:
00:03:16
we start somewhere else, and we want our system to compute a series of control commands
00:03:21
to reach this point. This is just the simplest illustration of this kind of formulation.
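A minimal sketch of this kind of objective, assuming simple 1-D double-integrator dynamics and illustrative cost weights (not necessarily the exact formulation used in the talk): penalize the distance to the target state and the magnitude of the control commands, solve a backward Riccati recursion for the feedback gains, and roll the system forward.

```python
# Minimal sketch of the reaching objective above, assuming 1-D double-integrator
# dynamics and illustrative cost weights (not the exact formulation of the talk):
# reach a target state with a desired precision while keeping control commands small.
import numpy as np

dt, T = 0.01, 200                      # time step and horizon (illustrative values)
A = np.array([[1.0, dt], [0.0, 1.0]])  # state: [position, velocity]
B = np.array([[0.0], [dt]])
Q = np.diag([1e3, 1e0])                # penalty on deviation from the target state
R = np.array([[1e-2]])                 # penalty on large control commands

# Backward Riccati recursion for the time-varying feedback gains
P, gains = Q.copy(), []
for _ in range(T):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains = gains[::-1]

# Forward rollout: start away from the target and converge to it
x, x_target = np.array([[-1.0], [0.0]]), np.array([[0.5], [0.0]])
for K in gains:
    u = -K @ (x - x_target)            # feedback on the tracking error
    x = A @ x + B @ u
print("final position:", float(x[0, 0]))
```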
00:03:27
Now we will use a second set of tools, this time borrowed
00:03:31
from the field of speech processing, by using hidden semi-Markov models,
00:03:36
which are used here to encode our demonstrations of movements.
00:03:41
You can think of these grey paths as trajectories being demonstrated by the user,
00:03:48
which are represented here by a series of Gaussian distributions. The model
00:03:52
will adapt these Gaussian distributions, meaning that we will
00:03:56
know locally what are the variations and what are
00:03:59
the correlations between the different variables.
00:04:04
We will also represent how we transit from one of these Gaussian states to another,
00:04:09
and we will also learn how much time we spend in each of these Gaussians.
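A small sketch of this encoding step, assuming toy 2-D demonstrations and an off-the-shelf hidden Markov model from hmmlearn; the talk uses a hidden semi-Markov variant, which additionally models the state durations explicitly.

```python
# Sketch: encode a few demonstrated trajectories as a sequence of Gaussian states
# with an HMM (hmmlearn).  The demonstrations, the number of states and the use of
# hmmlearn are illustrative assumptions; a hidden semi-Markov model would in
# addition learn how long we stay in each state.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def demo(n=100):
    """Toy 2-D demonstration: a noisy sine-shaped stroke."""
    t = np.linspace(0, 1, n)
    return np.c_[t, np.sin(2 * np.pi * t)] + 0.01 * rng.standard_normal((n, 2))

demos = [demo() for _ in range(4)]
X, lengths = np.vstack(demos), [len(d) for d in demos]

model = hmm.GaussianHMM(n_components=5, covariance_type="full", n_iter=100)
model.fit(X, lengths)

print("Gaussian centers:\n", model.means_)      # where the states sit along the stroke
print("transition matrix:\n", model.transmat_)  # how we move from one state to the next
```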
00:04:15
If we use this kind of generative model, what we can
00:04:18
do with it is to generate a sequence of states,
00:04:23
corresponding to a stepwise reference trajectory for the robot. So we could
00:04:28
not only use that to generate the movement on the robot,
00:04:32
but if we combine this with what we have seen before,
00:04:36
this type of objective function from optimal control,
00:04:40
then we get a way of learning the controller instead of learning the trajectory,
00:04:46
meaning that if we start from a point that is outside of the demonstrations,
00:04:51
we can find a controller that will pass through these different
00:04:55
Gaussian states, and the robot will still reproduce the task.
00:04:59
What is interesting here is that the system will exploit the variations that it has observed,
00:05:05
meaning that it will also compute the gains that it needs to use to track the task.
00:05:13
So we say here that we are following the minimal intervention principle,
00:05:17
in the sense that the robot will correct for perturbations only
00:05:21
if they have an impact on the task to reproduce.
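A rough sketch of this minimal intervention behaviour, assuming single-integrator dynamics, two hand-picked Gaussian states and a step-wise reference (all illustrative, not the talk's exact solver): the tracking precision is set to the inverse of each Gaussian covariance, so the gains are high only along the directions where the demonstrations showed little variation.

```python
# Sketch of the minimal intervention idea: track a step-wise reference given by
# Gaussian centers, with time-varying precision Q_t = inv(Sigma_t).  Dynamics,
# covariances and weights below are illustrative assumptions.
import numpy as np

dt = 0.01
A, B = np.eye(2), dt * np.eye(2)        # 2-D single-integrator dynamics
R = 1e-2 * np.eye(2)                    # penalty on control commands

mus    = [np.array([0.3, 0.0]), np.array([0.6, 0.4])]      # Gaussian centers
sigmas = [np.diag([1e-1, 1e-3]), np.diag([1e-3, 1e-1])]    # loose in x, then loose in y
steps  = [150, 150]                                         # time spent in each state

refs = np.vstack([np.tile(m, (s, 1)) for m, s in zip(mus, steps)])
Qs   = [np.linalg.inv(S) for S, s in zip(sigmas, steps) for _ in range(s)]

# Backward pass: time-varying gains, stiff only where the task requires precision
P, gains = Qs[-1].copy(), []
for Q in reversed(Qs):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains = gains[::-1]

x = np.zeros(2)                          # start outside the demonstrations
for K, ref in zip(gains, refs):
    x = A @ x + B @ (-K @ (x - ref))     # errors along "loose" directions are
print("final point:", x)                 # corrected only weakly
```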
00:05:27
To illustrate this mechanism in various applications, the first one I would like to show
00:05:32
is a collaboration with researchers from another university.
00:05:36
Here we are interested in applying this concept to editing movements
00:05:42
with your finger. For robotics applications we often have this kind of
00:05:46
interface where the user defines keypoints to generate the path.
00:05:51
With this technique we can remain compatible with this kind of
00:05:55
representation, but the user will also define variations around these keypoints,
00:06:02
meaning that the user can define the underlying dynamics: the user is
00:06:08
editing these Gaussians, but there is an underlying representation of the movement,
00:06:13
and this can be used to edit movements with variations. Here
00:06:19
we can see an example of how this works, with the robot writing
00:06:23
stylized alphabet letters, where you can see
00:06:28
the variations arising from this representation of the movement.
00:06:34
We are currently extending this approach to smartphones. Here the idea is to
00:06:40
have a user-friendly interface for end users to program the robot.
00:06:44
One of the simplest things we want the robot to do is to
00:06:49
reach a point in space that is set by the user.
00:06:53
So instead of using programming by demonstration or coding, the idea is then
00:06:57
to define this point by using a smartphone.
00:07:01
What happens here is that we know where the smartphone is located
00:07:06
in space, thanks to its sensors and cameras,
00:07:10
and we can use the display as an augmented reality interface.
00:07:15
What the user needs to do is just click on this interface
00:07:18
to define where the end-effector of the robot should go,
00:07:22
and the advantage of having a mobile interface is that the user can
00:07:25
select several viewpoints and click several times to refine this point.
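One standard way to combine several such clicks, assuming the phone pose is known for each of them (a sketch of one possible option, not necessarily what the application uses): each click defines a viewing ray in space, and the refined point is the least-squares intersection of all the rays.

```python
# Sketch: refine a 3-D point from several clicks taken from known phone poses.
# Each click gives a viewing ray; the point is the least-squares ray intersection.
# The poses and rays below are made-up values for illustration.
import numpy as np

def closest_point_to_rays(origins, directions):
    """Point minimizing the sum of squared distances to all rays."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)    # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

target = np.array([0.4, 0.1, 0.3])                      # point the user clicks on
origins = [np.zeros(3), np.array([1.0, 0.0, 0.0])]      # two phone positions
directions = [target - o for o in origins]              # ideal (noise-free) click rays
print(closest_point_to_rays(origins, directions))       # recovers the clicked point
```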
00:07:33
Now, the movements that I have just described will need to be associated
00:07:38
in some way to objects or landmarks in the environment.
00:07:43
Typically, if I am learning a bimanual skill, what I would like my robot to do is that if I introduce a
00:07:50
perturbation on one of the arms, I would like the other arm to
00:07:53
follow this perturbation in a fast way.
00:07:58
So this is how we treat this problem in our group:
00:08:01
we reuse the objective function that we have seen before, but this
00:08:06
time we encode movements in multiple coordinate systems.
00:08:10
What happens here is that we have several observers, and we
00:08:15
collect data from their own perspectives, and these observers can move from one demonstration to the next.
00:08:22
So each of them will have its own representation of the movement, and during the
00:08:26
reproduction they will need to agree about what the movement should be.
00:08:31
This is illustrated here with this animation, where the robot needs to reach a shoe.
00:08:37
Here in red you have the reference frame of the robot, and in blue
00:08:42
you have the reference frame of the shoe, and we are collecting several demonstrations.
00:08:48
We learn these two models, which are then fused to find a new model
00:08:53
that will allow the robot to generate the task in new situations.
00:08:59
So here, the fusion of these two models will be this yellow model that
00:09:04
we see appearing now, and this model can be adapted in a very fast manner.
00:09:10
With this kind of cost function, the advantage of keeping this parametric form
00:09:14
is that it is very fast to compute; we actually have an analytical solution here,
00:09:19
and it means that if the object is moving, we can adapt online to this perturbation.
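A compact sketch of this fusion step, with made-up Gaussians and object pose (not the project's actual model): the same state is encoded as a Gaussian in the robot frame and in the object frame; in a new situation each Gaussian is mapped to the world frame and the two are fused by a product of Gaussians, whose closed-form solution is what makes the adaptation fast.

```python
# Sketch: fuse Gaussians expressed in two coordinate systems (robot frame, object
# frame) by mapping them to the world frame and taking their product, which has a
# closed-form (precision-weighted) solution.  All numbers are illustrative.
import numpy as np

def to_world(mu, sigma, R, t):
    """Map a Gaussian from a local frame (rotation R, origin t) to the world frame."""
    return R @ mu + t, R @ sigma @ R.T

def product_of_gaussians(gaussians):
    """Closed-form fusion: precision-weighted combination of the Gaussians."""
    lams = [np.linalg.inv(S) for _, S in gaussians]
    sigma = np.linalg.inv(sum(lams))
    mu = sigma @ sum(L @ m for (m, _), L in zip(gaussians, lams))
    return mu, sigma

g_robot  = (np.array([0.3, 0.0]), np.diag([1e-2, 1e-4]))   # precise in y (robot frame)
g_object = (np.array([0.0, 0.1]), np.diag([1e-4, 1e-2]))   # precise in x (object frame)

theta = np.deg2rad(30)                                      # new object pose in the world
R_obj = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
t_obj = np.array([0.5, 0.2])

fused_mu, fused_sigma = product_of_gaussians([
    to_world(*g_robot, np.eye(2), np.zeros(2)),
    to_world(*g_object, R_obj, t_obj),
])
print("fused center:", fused_mu)   # recomputing this when the object moves is immediate
```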
00:09:27
So Emmanuel is exploiting this within the collaborative I-DRESS project.
00:09:32
Here the idea is to assist people with getting dressed, by helping
00:09:39
people to put on a shoe, or to help elderly people to put on a jacket.
00:09:43
The main challenges are that we need to consider these multiple coordinate systems,
00:09:48
since the robot operates close to the body of the user.
00:09:54
We actually need to have adaptation in force space as well as in position
00:09:59
space, so it is not only about movement, it is really about
00:10:02
learning a skill, and this skill can have
00:10:06
several components, including components in the force domain.
00:10:12
Ajay is also exploiting this technique within the DexROV project, another collaborative project,
00:10:18
where the goal is to teleoperate an underwater robot.
00:10:24
So in this project we have a teleoperation setting: we have the teleoperator
00:10:29
wearing an exoskeleton and interacting with objects in a virtual reality environment,
00:10:35
where this user can provide demonstrations to the robot of how to perform the task.
00:10:41
And what we are doing in this project is that
00:10:45
the model is used to learn the movement, and then what we do is that we pass the model parameters
00:10:51
to the other side of the system, which is the underwater robot.
00:10:57
Now, between the two there is a satellite communication, so the communication is very slow.
00:11:03
It would not be possible to use a standard teleoperation technique, because the user would not have
00:11:08
the feedback on time. For example, if the robot starts to deviate,
00:11:13
the user might not know what is happening, and the feedback would be too slow to correct it properly.
00:11:20
With our technique, what happens is that we have a
00:11:24
copy of the skill on the two sides, encoded as this model,
00:11:29
and there might be perturbations that are different on one side or the other.
00:11:35
What you can see here with this kind of tube is an adaptation of the
00:11:40
model which is done online, so there is a loop being run locally,
00:11:46
being able to adapt when there is a change: if the robot is moving and there is a change of
00:11:53
position of the end-effector with respect to the object, there will be an adaptation that does not need to go through the slow communication channel.
00:12:01
What we actually transmit is only information like where we are in this
00:12:08
movement, so which is the current Gaussian state we are in.
00:12:13
With this we have some form of semi-autonomous teleoperation, and we believe
00:12:17
that this principle can be used in a wide range of teleoperation applications.
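A toy sketch of this communication pattern, with entirely made-up numbers and a trivial controller (only to show the structure, not the project's implementation): both sides keep a copy of the learned Gaussian states, only the index of the current state crosses the slow link, and the remote side closes its own fast loop against locally sensed perturbations.

```python
# Toy sketch of the communication pattern: the model is duplicated on both sides,
# only the current state index travels over the slow link, and the remote side
# runs a local feedback loop with its own sensing.  Everything here is illustrative.
from dataclasses import dataclass
import numpy as np

@dataclass
class SkillModel:
    means: np.ndarray                    # (K, 2) Gaussian centers, shared by both sides

model = SkillModel(means=np.array([[0.2, 0.0], [0.5, 0.3], [0.8, 0.1]]))

def operator_side(t):
    """Operator side: only decide which state we are in; a tiny, latency-tolerant message."""
    return min(t // 100, len(model.means) - 1)

def remote_side(state_idx, x, local_offset):
    """Remote side: track the current Gaussian center, corrected by local sensing."""
    target = model.means[state_idx] + local_offset
    return x + 0.1 * (target - x)        # fast local loop, no round trip needed

x = np.zeros(2)
for t in range(300):
    idx = operator_side(t)                              # the only data crossing the link
    x = remote_side(idx, x, np.array([0.05, -0.02]))    # local perturbation of the object
print("remote end-effector near:", x)
```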
00:12:26
Now, the last model prior that I would like to discuss relates to the
00:12:31
geometries of the data that we collect in robotics.
00:12:36
This question arose when we were doing this project,
00:12:41
where we brought the robot in front of a coffee machine and we wanted to teach the robot
00:12:46
to use this machine, with the different
00:12:49
steps of choosing the coffee,
00:12:53
taking the coffee capsule, bringing it to the machine, bringing the cup, and so on.
00:12:59
We were wondering in which coordinate system we should encode this information,
00:13:05
because you have several choices.
00:13:09
If we stay at a low level, we have joint angle trajectories,
00:13:14
meaning that here we have an articulated structure which forms a specific geometry.
00:13:22
But then, for manipulation tasks, it is often useful to also encode the position of
00:13:27
the end-effector of the robot in Cartesian space, x, y, z.
00:13:34
Most often it is also relevant to encode the orientation of this end-effector,
00:13:40
so it is not only x, y, z: it is actually a combination
00:13:43
of position and orientation that can be viewed in different ways.
00:13:47
We can talk about rigid body motions, for example.
00:13:52
And then there are also manifolds that are maybe more specialized, but that are under-exploited so far in
00:13:58
robotics. I am thinking here of inertia matrices, of manipulability ellipsoids,
00:14:06
and of stiffness gains in general, which all
00:14:10
lie on the symmetric positive definite manifold.
00:14:15
So in our group, the tool that we exploit to treat these different
00:14:20
problems and these different manifolds comes from Riemannian geometry.
00:14:26
Here the idea is to
00:14:29
have points on the manifold; here the manifold will be represented by this sphere.
00:14:34
The idea of Riemannian manifolds is to define a tangent space for each of
00:14:39
these points on the manifold, and ways to connect these different tangent spaces.
00:14:44
The interesting point for us is that these tangent spaces are Euclidean,
00:14:49
meaning that if we have designed an algorithm that is working on, or that
00:14:54
has been designed for, Euclidean data, there are only a few
00:14:59
changes to make to this algorithm for it to work with other types of manifolds.
00:15:05
The additional tool that we need to make this work is to
00:15:09
have some kind of mapping between the tangent space and the manifold.
00:15:13
Here, in this case, if I have a vector in this
00:15:17
tangent space, so we have this arrow here,
00:15:21
I will have a corresponding geodesic that follows the shape of the manifold, and the two will be of the same length.
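A small sketch of these two mappings on the sphere used in the illustration (standard formulas, written here for the unit sphere S²): the exponential map sends a tangent vector to the point reached by following the geodesic of the same length, and the logarithmic map does the inverse.

```python
# Sketch: exponential and logarithmic maps on the unit sphere S^2, the mapping
# between a tangent space and the manifold described above.
import numpy as np

def exp_map(x, v, eps=1e-12):
    """Follow the geodesic from x (on the sphere) along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < eps:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def log_map(x, y, eps=1e-12):
    """Tangent vector at x pointing to y, with norm equal to the geodesic distance."""
    d = y - np.dot(x, y) * x                     # part of y orthogonal to x
    nd = np.linalg.norm(d)
    if nd < eps:
        return np.zeros_like(x)
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0)) * d / nd

x = np.array([0.0, 0.0, 1.0])                    # base point (north pole)
y = np.array([1.0, 0.0, 0.0])                    # another point on the sphere
v = log_map(x, y)
print(np.linalg.norm(v))                         # pi/2: geodesic length of the arrow
print(exp_map(x, v))                             # maps back to y, following the sphere
```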
00:15:28
By using these simple mapping functions, the exponential and
00:15:33
logarithmic maps, we keep a lot of the structure
00:15:36
of the data, and since we have these maps in
00:15:39
analytical form, we can then think of clustering problems,
00:15:44
of fusion problems, or of optimal control problems that
00:15:49
are adapted to these different manifolds.
00:15:54
We have done this, for example, for the case of interpolation, this
00:15:58
time with different types of manifolds: for rigid body
00:16:02
motions, for stiffness matrices, for manipulability ellipsoids;
00:16:07
it is even possible to interpolate between distributions.
00:16:13
The advantage of this framework is that it is
00:16:17
the exact same process independently of the manifold
00:16:21
that we use, so it is very easy to change the algorithm from one manifold to another.
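As one concrete instance of such interpolation (a sketch with made-up stiffness matrices, not the figures of the talk), here is geodesic interpolation between two symmetric positive definite matrices under the affine-invariant metric, compared with a naive linear blend that ignores the geometry.

```python
# Sketch: geodesic interpolation between two SPD matrices (e.g. stiffness gains)
# under the affine-invariant metric, versus a naive Euclidean average.
import numpy as np

def spd_power(M, p):
    """M**p for a symmetric positive definite matrix, via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * w**p) @ V.T

def spd_interp(A, B, t):
    """Point at fraction t along the geodesic from A to B on the SPD manifold."""
    A_half, A_ihalf = spd_power(A, 0.5), spd_power(A, -0.5)
    return A_half @ spd_power(A_ihalf @ B @ A_ihalf, t) @ A_half

A = np.diag([100.0, 1.0])                        # stiff in x, compliant in y
B = np.diag([1.0, 100.0])                        # the opposite
print(np.linalg.det(spd_interp(A, B, 0.5)))      # 100: the blend stays "SPD-shaped"
print(np.linalg.det(0.5 * (A + B)))              # ~2550: linear averaging inflates the ellipsoid
```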
00:16:29
Here are again other examples, this time for clustering problems, working in different data spaces.
00:16:36
So Noémie is exploiting this principle within the TACT-HAND
00:16:40
collaborative project. This project is about controlling a prosthetic hand,
00:16:46
and we are using for this surface electromyography (EMG) signals.
00:16:54
One of the standard ways to treat this information is to take a
00:16:59
sliding time window and to extract spatial covariance features from it,
00:17:06
and in this case we really need to work on this type of manifold to process the data.
00:17:12
So we get this kind of regression problem, where we have input
00:17:17
data on the symmetric positive definite manifold, with output data being the
00:17:23
different poses that we would like the prosthesis to reach.
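A sketch of this feature pipeline with a toy signal (the channel count, window sizes and the tangent-space vectorization are assumptions; the project's regression model on the manifold is not reproduced here): slide a window over the EMG channels, take each window's spatial covariance, which is a symmetric positive definite matrix, and map it to tangent-space coordinates before regressing hand poses.

```python
# Sketch: turn multichannel EMG into SPD covariance features.  Each sliding window
# gives a spatial covariance matrix; its matrix logarithm provides tangent-space
# coordinates that a standard pose regressor can consume.  Toy signal and sizes.
import numpy as np

def matrix_log_spd(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def upper_vec(M):
    """Vectorize the upper triangle of a symmetric matrix."""
    i, j = np.triu_indices(M.shape[0])
    return M[i, j]

rng = np.random.default_rng(0)
emg = rng.standard_normal((8, 2000))              # 8 channels, toy recording
win, hop = 200, 50

features = []
for start in range(0, emg.shape[1] - win + 1, hop):
    C = np.cov(emg[:, start:start + win])         # spatial covariance: SPD matrix
    C += 1e-6 * np.eye(C.shape[0])                # regularize before the logarithm
    features.append(upper_vec(matrix_log_spd(C)))
X = np.array(features)
print(X.shape)                                    # (windows, 36) inputs for a pose regressor
```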
00:17:30
Here I am just giving an example of a switch
00:17:35
from one pose to another; here the reference is shown in black.
00:17:39
The first thing we looked at is what happens if we ignore the geometry, where we just vectorize all the data:
00:17:45
we would get what is shown in green, a very poor approximation of what we wanted to do.
00:17:51
Now, if we consider the same regression problem on the symmetric positive definite
00:17:56
manifold, then we get a much better result, shown with the blue curve here.
00:18:04
So my take-away message is that we can combine optimal control techniques
00:18:10
together with statistical modelling techniques in an efficient way, and this is what I think is required in
00:18:16
robotics to cope with the learning of a controller instead of the learning of a trajectory.
00:18:24
Then, I have highlighted the importance of considering multiple coordinate systems to learn skills,
00:18:32
and then I emphasized the role of Riemannian geometry in these learning applications,
00:18:38
which is particularly important if we want to learn a task with only
00:18:42
a small set of demonstrations or a small number of interactions with the environment.
00:18:51
All of our codes are available online, and here is my contact information.

Conference Program

Introduction by Hervé Bourlard
BOURLARD, Hervé, Idiap Director, EPFL Full Professor
Aug. 29, 2018 · 9:03 a.m.
916 views
Presentation of the «Speech & Audio Processing» research group
MAGIMAI DOSS, Mathew, Idiap Senior Researcher
Aug. 29, 2018 · 9:22 a.m.
16949 views
Presentation of the «Robot Learning & Interaction» research group
CALINON, Sylvain, Idiap Senior Researcher
Aug. 29, 2018 · 9:43 a.m.
9721 views
Presentation of the «Machine Learning» research group
FLEURET, François, Idiap Senior Researcher, EPFL Maître d'enseignement et de recherche
Aug. 29, 2018 · 10:04 a.m.
14111 views
Presentation of the «Uncertainty Quantification and Optimal Design» research group
GINSBOURGER, David, Idiap Senior Researcher, Bern Titular Professor
Aug. 29, 2018 · 11:05 a.m.
3210 views
Presentation of the «Perception and Activity Understanding» research group
ODOBEZ, Jean-Marc, Idiap Senior Researcher, EPFL Maître d'enseignement et de recherche
Aug. 29, 2018 · 11:24 a.m.
5620 views
Presentation of the «Computational Bioimaging» research group
LIEBLING, Michael, Idiap Senior Researcher, UC Santa Barbara Adjunct Professor
Aug. 29, 2018 · 11:45 a.m.
4130 views
Presentation of the «Natural Language Understanding» research group
HENDERSON, James, Idiap Senior Researcher
Aug. 29, 2018 · 2:03 p.m.
8976 views
Presentation of the «Biometrics Security and Privacy» research group
MARCEL, Sébastien, Idiap Senior Researcher
Aug. 29, 2018 · 2:19 p.m.
6512 views
Presentation of the «Biosignal Processing» research group
RABELLO DOS ANJOS, André, Idiap Researcher
Aug. 29, 2018 · 2:43 p.m.
4027 views
Presentation of the «Social Computing» research group
GATICA-PEREZ, Daniel, Idiap Senior Researcher, EPFL Adjunct Professor
Aug. 29, 2018 · 2:59 p.m.
7158 views
