Romain Edelmann, EPFL (LARA)

Thursday, 7 June 2018 · 12:22 p.m. · 02m 50s


Hi everyone, today I'm going to talk about using a neural network to guide expression transformation.

So imagine you are given some kind of object, such as a Rubik's cube, on which you can apply transformations, and you are interested in finding paths, that is, sequences of transformations, which go from one configuration to another.

If you do that blindly, the problem is that there are so many states that it's impossible to visit them all.

So what I do in my work is use a neural network to guide the search, and the idea is very simple: you take two objects, you feed them to the network, and it outputs an estimation of the number of transformations that must be applied in order to reach the target. Using this information, we can devise an algorithm that greedily approaches the solution.
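The guided search just described can be sketched as a best-first search over states, where the network's prediction serves as the priority. This is an illustrative sketch, not the project's actual code; `estimate` stands in for the neural network, and the toy usage replaces it with the true distance on integers.

```python
import heapq

def guided_search(start, goal, neighbors, estimate, max_steps=10_000):
    """Best-first search: always expand the state whose estimated
    distance to the goal (here, a stand-in for the network's
    prediction) is smallest."""
    frontier = [(estimate(start, goal), start, [start])]
    seen = {start}
    for _ in range(max_steps):
        if not frontier:
            break
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        for nxt in neighbors(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(
                    frontier, (estimate(nxt, goal), nxt, path + [nxt])
                )
    return None

# Toy usage: states are integers, transformations are +1/-1, and the
# "network" is replaced by the exact distance |a - b|.
path = guided_search(0, 4, lambda s: [s - 1, s + 1], lambda a, b: abs(a - b))
# path is the shortest sequence of states from 0 to 4
```

With a perfect estimate the search walks straight to the goal; with a learned, approximate estimate it degrades gracefully rather than reverting to blind enumeration.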

Now, in my work I'm not using Rubik's cubes as my objects, but mathematical expressions, and they look something like this: you have variables, addition and multiplication. In addition, the expressions have a focus,

which is where we can apply transformations. The transformations we consider are commutativity, associativity and distributivity, and also operations which move the focus over the expression, to modify different parts of the expression.
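A minimal sketch of such expressions and rewrite rules, under assumed names (`Var`, `Add`, `Mul`, and the rule functions are illustrative, not the project's API; the focus mechanism is simplified away, with each rule applied directly at the root):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Add:
    left: object
    right: object

@dataclass(frozen=True)
class Mul:
    left: object
    right: object

def commute(e):
    """Commutativity: a + b -> b + a, a * b -> b * a."""
    if isinstance(e, Add):
        return Add(e.right, e.left)
    if isinstance(e, Mul):
        return Mul(e.right, e.left)
    return e

def distribute(e):
    """Distributivity: a * (b + c) -> a*b + a*c."""
    if isinstance(e, Mul) and isinstance(e.right, Add):
        return Add(Mul(e.left, e.right.left), Mul(e.left, e.right.right))
    return e

# x * (y + z) rewrites to x*y + x*z
expr = Mul(Var("x"), Add(Var("y"), Var("z")))
result = distribute(expr)
```

In the real system a focus marks the subtree being rewritten, and moving the focus is itself a transformation, so the search decides both where and which rule to apply.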

This is an overview of the system, and it's very simple: we take the two expressions with a focus, we feed them to a recursive neural network, which embeds them as points in a very high-dimensional space, and then we simply measure the distance between those two points.

We trained this network on six million pairs of expressions, with distances ranging from one to ten, and in the end we obtained a system that is able to approximate the distance between expressions with a mean error of less than one.
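One plausible way such labeled pairs can be generated, sketched under assumptions (the transcript does not detail the procedure): start from an expression, apply k random transformations with k drawn from 1 to 10, and label the pair with k. Note that k is only an upper bound on the true rewrite distance, since a shorter path may exist; the `transforms` below are hypothetical string rewrites standing in for the real rules.

```python
import random

def make_pair(expr, transforms, rng, max_k=10):
    """Produce a training example (start, result, label): `result`
    is `expr` after k random transformations, labeled with k."""
    k = rng.randint(1, max_k)
    result = expr
    for _ in range(k):
        result = rng.choice(transforms)(result)
    return expr, result, k

rng = random.Random(0)
pair = make_pair("x+y", [lambda e: e + "+0", lambda e: "(" + e + ")"], rng)
```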

Here we see the performance of our whole system compared to breadth-first search, and we can see that our system is able to find paths between expressions, in both cases, in less than one second, compared to around two minutes for breadth-first search.

This project is available online on GitHub at this address, and was built using Python and PyTorch. Thank you very much for your attention.
