Transcriptions

Note: this content has been automatically generated.
00:00:00
Yeah, I like that analogy. [inaudible]
00:00:07
Let's travel back in time to 1776. This is one of the first demonstrations of a projector.
00:00:15
And if we look at what we have today, things have not changed so much, in the sense that the technology is better,
00:00:22
but the main concept of the projector remains the same: something
00:00:27
very static that we place somewhere and don't really interact with.
00:00:34
So we would like to see projectors as more dynamic and more interactive.
00:00:40
With LaternaMagica, we are transforming static projectors into interactive robots.
00:00:48
This robot can understand its environment and its surroundings.
00:00:52
So first we need a platform on which the device is mounted;
00:00:56
then we have an actuated mirror in order to explore
00:01:00
the environment, so that the projection space can be displaced;
00:01:04
and then we have a camera, which is used both to understand the surroundings but also
00:01:09
to track the user, in order to interact with the user and provide new communication capabilities.
00:01:16
The overall cost of these new components
00:01:21
is very low compared to the price of a beamer, meaning that
00:01:25
it is not so difficult to upgrade existing projectors with these capabilities.
00:01:31
So what do we need to do? Behind this, we need to understand the environment
00:01:35
in which the device is placed. For this we are using a single camera. The device is first
00:01:42
brought by the user, and the user starts scanning the environment with the camera on the device,
00:01:47
and with this scan the device can understand where
00:01:51
the planar surfaces are that it can use for the projections.
00:01:56
Then the system is automatically segmenting planes, finding these projection surfaces,
00:02:02
and by tracking the user we can then see where the user is
00:02:06
pointing among these different surfaces and decide which surface should be used for the projection.
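The plane-segmentation step mentioned here can be illustrated with a standard RANSAC plane fit. This is a minimal sketch, not the actual LaternaMagica implementation, and it assumes the scan is already available as an N×3 point cloud:

```python
import numpy as np

def segment_plane(points, n_iters=200, tol=0.01, rng=None):
    """Find the dominant plane in an Nx3 point cloud with RANSAC.

    Returns (normal, d, inlier_mask) for the plane n.x + d = 0.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample 3 distinct points and fit a candidate plane through them.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample, skip
            continue
        n /= norm
        d = -n.dot(p0)
        # Inliers are points within `tol` of the candidate plane.
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane[0], best_plane[1], best_inliers

# Toy scan: a noisy horizontal "table" plus scattered clutter.
gen = np.random.default_rng(0)
table = np.column_stack([gen.uniform(0, 1, 500), gen.uniform(0, 1, 500),
                         gen.normal(0.0, 0.002, 500)])
clutter = gen.uniform(0, 1, (100, 3))
n, d, mask = segment_plane(np.vstack([table, clutter]), tol=0.01, rng=1)
print(abs(n[2]))          # close to 1: the recovered plane is horizontal
print(mask[:500].mean())  # nearly all table points are inliers
```

In practice this is run repeatedly, removing inliers each time, so each flat surface in the room becomes one candidate projection surface.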
00:02:18
Then we have this actuation capability, so it means that if I am showing
00:02:23
different surfaces to the system, it can project different parts of the presentation on each of them.
00:02:29
Then we have the device automatically moving from one projection surface to another:
00:02:33
it can be the table, it can be the wall, it can be the whole surrounding.
00:02:39
You will see a live demo of this just after; this is the backup plan.
00:02:48
So we believe that this technology can be used by many user groups.
00:02:53
First, the technology can be used on the shop floor in order to project information onto objects,
00:03:00
to aid the workers, especially if the device is placed
00:03:04
on a mobile platform, projecting on multiple surfaces and updating them dynamically.
00:03:11
Then, since we know where the user is, we can also have 3D effects being
00:03:14
projected, in the sense that we can render 3D objects that are viewed
00:03:19
as truly 3D while they are flat: if we know where
00:03:22
the user is, we get this kind of anamorphic effect that we can exploit.
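The anamorphic effect described here comes down to a ray-plane intersection: each virtual 3D point is drawn where the line from the tracked eye through that point hits the projection surface. A minimal sketch under that assumption (not the talk's actual renderer):

```python
import numpy as np

def draw_position(eye, point, plane_n, plane_d):
    """Where on the plane (n.x + d = 0) to draw `point` so that, seen from
    `eye`, it lines up with the virtual 3D location (anamorphic projection)."""
    ray = point - eye
    t = -(plane_n @ eye + plane_d) / (plane_n @ ray)  # ray-plane intersection
    return eye + t * ray

# Tabletop plane z = 0; a virtual cube corner floating 0.2 m above it.
eye = np.array([0.0, -1.0, 0.5])    # tracked user eye position (assumed known)
corner = np.array([0.3, 0.2, 0.2])
spot = draw_position(eye, corner, np.array([0.0, 0.0, 1.0]), 0.0)
print(spot)  # prints [0.5 1.  0. ]: on the table, pushed away from the viewer
```

Because the drawn position depends on `eye`, the image must be re-rendered as the tracked user moves, which is why user tracking is a prerequisite for this effect.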
00:03:27
So we can offer a series of experiences by using this augmented reality capability
00:03:33
without wearing special glasses or special headsets, and without having to hold another device, because
00:03:39
it is projected; it means that the user can continue the activity without taking up another device.
00:03:48
So we believe it can be used in public spaces: for exhibition halls, school events, or museums.
00:03:55
It can also be used at home, if we want to replace multiple screens:
00:04:00
instead of a single projector, it can project on different surfaces.
00:04:06
Or think of the smart meeting rooms where we display presentations; this is the use case
00:04:12
where we all have to fumble with devices, and we would like to avoid that.
00:04:18
And we can do this: there is the presentation content, but there is also the medium, and it is like
00:04:23
a communication medium, so let's see what we can do with this
00:04:26
communication medium. Today, to do that, we have this clicker to interact with the device.
00:04:32
If you think of what it is, it is a pointer, but I cannot point with it at the screen,
00:04:38
and it is two buttons: one to go forward, one to go back. And if I need to go ten slides forward, well, I press ten
00:04:44
times. I think we can do better, by using the space to have something more tangible, to exploit the space where we are.
00:04:54
So we require expertise to do the scene understanding, user
00:04:57
tracking, gesture recognition, and robotics for the mechatronic design.
00:05:02
We need full projection mapping in order to pre-distort the images being projected,
00:05:08
and we need media design to have new content for this new type of device.
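The pre-distortion step of projection mapping is typically a planar homography: four known correspondences between image corners and the observed projection quad determine a 3×3 warp. A minimal direct-linear-transform sketch (the corner coordinates below are made up for illustration):

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: the 3x3 H mapping 4 src points to 4 dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)       # null-space vector = homography entries
    return H / H[2, 2]

def warp(H, pt):
    """Apply a homography to a 2D point (homogeneous divide)."""
    x, y, w = H @ [pt[0], pt[1], 1.0]
    return np.array([x / w, y / w])

# Image corners (what the viewer should see as a rectangle) ...
src = [(0, 0), (640, 0), (640, 480), (0, 480)]
# ... mapped to where the camera observed the projection quad on the wall.
dst = [(12, 30), (600, 8), (630, 470), (25, 455)]
H = homography(src, dst)
print(warp(H, (0, 0)))   # lands exactly on the first observed corner, (12, 30)
```

In a real pipeline the inverse of this warp is applied to the whole framebuffer (e.g. with OpenCV's `warpPerspective`) so the projected image appears undistorted on the oblique surface.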
00:05:13
And here in our group we have all of these expertise
00:05:17
areas as a team, covering very distinct expert profiles.
00:05:23
So there is the human-computer interaction group, with [inaudible] for optimisation;
00:05:30
[inaudible],
00:05:34
who did the scene understanding and user tracking;
00:05:37
[inaudible], who did the mechanical design and 3D printing;
00:05:44
and [inaudible],
00:05:46
with expertise in digital visualisation and multimedia.
00:05:51
So we believe we are the right team to build this
00:05:54
kind of multi-competence device.
00:05:59
Now if we look at the market, it looks like an interesting market, because if
00:06:02
we take a single technology, for example the DLP chip
00:06:06
from Texas Instruments, it already represents five billion units sold
00:06:12
by 2020, and with an annual five percent expected growth.
00:06:20
And if you look at the competitors: first,
00:06:23
there are projection mappings, which are quite popular, but projection mapping is only fixed.
00:06:30
It is a one-off realisation, so it is something that is
00:06:34
done completely offline; it is more like making a movie.
00:06:37
So the business model is very different compared to what we want to do, and it is not interactive.
00:06:44
Then there are [inaudible] systems, but these are mostly done for lights:
00:06:49
not projecting rich content, but just projecting some light on
00:06:53
the scene, and also not interactive. The closest would be [inaudible];
00:06:59
they are also using movable projectors,
00:07:04
but they are only aiming the projector at predefined spots, meaning there is no interaction.
00:07:09
It is also a business model where everything is owned
00:07:13
by the company and then they rent the device, while we
00:07:15
would like the users to create their own content; we would like users to use this kind of device the way they want.
00:07:24
We have several things to do next. We finished a prototype during [inaudible],
00:07:29
but we still have things that we would like to finish: there is the continuous interaction aspect,
00:07:35
there is the gaze tracking functionality that we would like to add, and we have a new prototype ongoing with a
00:07:41
moving camera, using exactly the same mechanism but turned upside down, so that the camera is also able to move.
00:07:49
Finally, we need to think about the user software and about how to train users to use these new devices.
00:08:00
Since we have this new movement capability, we also
00:08:04
need to find new ways of training people to use it.
00:08:04
And if we go forward with this, we will have to find what market we want to target
00:08:09
first, as there are these multiple user groups, and
00:08:13
ask end users what they would like this device to do.
00:08:18
So with LaternaMagica, we are transforming static projectors into interactive robots.
00:08:23
Well, now we will try to show you the prototype in action. [inaudible]
00:08:40
[Live demo of the prototype — mostly inaudible exchanges while the presenters adjust the device; they mention falling back to "the backup" video and manually flashing the device.]

Conference Program

LightAI4Comfort
Aug. 24, 2022 · 12:13 p.m.
119 views
MHTI
Aug. 24, 2022 · 12:32 p.m.
BISS
Aug. 24, 2022 · 12:49 p.m.
LaternaMagica
Aug. 24, 2022 · 1:08 p.m.
118 views
