Transcriptions

Note: this content has been automatically generated.
00:00:00
Welcome — thank you all for joining this afternoon session. I'm going to talk a little bit about big data. My name is David; I'm responsible for global strategic sales at Pentaho. I've been a Pentaho lifer — I've been with the company for about eight years, which is sort of the equivalent of 40 years at Hitachi, and I feel that way as well. So, to start: we obviously are very proud and very excited about the whole acquisition by Hitachi, and we continue to be excited when we hear all the other speakers mention being part of the whole ecosystem, and how we're going to be part of this big movement that Hitachi Data Systems and the Hitachi group companies are going to make — from the company they are today into the Internet of Things space and the big data space. What I thought would be good to do today is share what we've been up to the past few years.
00:01:08
I'll talk a little bit about some of the things we've seen happening at our customers around big data, at a high level — I'm not going to get too technical today; we can take that offline if you want — and then also share some customer case studies, because I think a lot of people understand what it's generally about, so we can talk in a bit more detail about some of the actual use cases we've done with customers. Time permitting, I may also give you a little bit of advice, in terms of the five things you've got to think about — the things we always mention when we talk to our customers and engage with them, that you sort of have to consider. Right — it's a small group, we have half an hour and too many slides, so feel free to ask questions during the presentation; you don't have to wait until the end.
00:02:03
For all purposes: I absolutely will not feel offended if in the middle of the session you think "oh my god, this guy is so boring" and walk out. OK? Feel free — I will not be offended. So, we all know this, right? We all know about the revolution of data. I can give you tons of stats. My most favorite, which we learned about this morning, is the Hitachi trains in the UK — the whole project that was shared in the keynote; Pentaho is part of that as well — which will generate something like 25 gigabytes of data per day. A 787 flying across the Atlantic: just the engines create huge amounts of data that need to be processed. The average American creates about — was it 5 gigabytes of data? Us Europeans, probably — you know, we're all about quality; I'm sure we do less. But my most favorite one is that there are cows in the Netherlands that actually create 200 megabytes of data per year, per cow, because there's a company called Sparked that put a sensor into a cow to measure its temperature, to identify whether the cow is pregnant or ill. Two things really come to mind: one, it could give a whole new meaning to mad cow disease if somebody performs a denial-of-service attack on a cow — anyway, that sensor thing went through. Just to say: there's lots of data being generated, people want to leverage all that data, and we're part of that whole ecosystem. OK — but moreover, what we care most about is not the data itself; it's more about the questions people are asking around that data, and those aren't the same old questions anymore, if you think about it.
00:03:51
Business intelligence: Larry Ellison and the relational database, about forty-four years ago, in '71 — we were using that technology to answer questions about things that happened in the past. How much did we sell in a particular country, by a particular person, in this particular product category? We were looking back; it was counting backwards. Now people are starting to think: yes, those are all the questions we know we have, and we'll continue to have those questions, but there are other things that you want to start answering. A lot of these questions aren't new, but we can now put them into lots more scenarios because there's so much more data. For instance, websites could say: hey, I don't want to track just whatever people put in shopping baskets before they push the buy button, the credit card transaction happens, and I capture the whole transaction. No — I want to understand how likely people are to buy a certain product when they've started viewing it or are showing certain behavior. What is the likelihood that somebody will buy something if they've bought something else, or viewed certain things? We all know the Amazon experience: Amazon changed the whole idea of how you shop. On their site it's impossible to go product by product by product — unless you have no life, and I guess you could do that — but otherwise you use the recommendation engine, the product reviews, and so on. All that information now goes into the back end, and you can start utilizing all those activities you're doing — the search as you type it in — to come up with ideas and probabilities, and act based on what you've done in the past.
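The "how likely are they to buy X if they bought Y" idea can be illustrated with simple co-occurrence counting — a toy sketch with made-up baskets, nothing like a production recommendation engine:

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping baskets; a real engine mines millions of these.
baskets = [
    {"book", "lamp"},
    {"book", "lamp", "desk"},
    {"book", "pen"},
    {"lamp", "desk"},
]

bought = Counter()      # how often each item was bought
together = Counter()    # how often each pair was bought together
for basket in baskets:
    bought.update(basket)
    together.update(frozenset(p) for p in combinations(sorted(basket), 2))

def p_buy_given(item, given):
    """Estimate P(buys `item` | bought `given`) from co-occurrence counts."""
    return together[frozenset((item, given))] / bought[given]

print(round(p_buy_given("lamp", "book"), 2))  # 0.67: 2 of 3 book buyers also took a lamp
```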
00:05:34
The same thing applies to devices. One of the things we do, for instance: we do work on Hitachi trains and try to predict when these trains are going to break down — I want to talk about a use case around preventive maintenance, predictive maintenance. It's all about pattern detection, shift detection: if something goes into failure, can I look back and see certain patterns that I can identify? You can go very far with this. For instance, we can identify when metal is starting to break down or starting to deteriorate, based on data. Does anyone have an idea how you do that? You have a metal bearing — something metal — and I can, through data, identify when that metal is starting to deteriorate. Sound waves — precisely. You expose a certain extent of metal to sound waves, and once you start to see shifts in the sound waves, something has happened to that metal. Maybe you have to clean it because there's a lot of dirt and debris on it, or there may be cracks, or maybe oxidation — is that the English word? My Belgian English isn't perfect. But that's one thing you can do with devices as well.
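The "shift in the sound waves" test he describes can be sketched with a Fourier transform: record the metal's acoustic response, track its dominant resonant frequency, and flag the part when that peak drifts. A minimal illustration with synthetic signals (the 440/452 Hz numbers and the tolerance are made up, not from any real monitoring system):

```python
import numpy as np

def dominant_freq(signal, sample_rate):
    """Dominant frequency (Hz) of a 1-D acoustic/vibration signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def spectrum_shifted(reference, current, sample_rate, tolerance_hz=5.0):
    """Flag a part when its resonant peak drifts beyond a tolerance."""
    shift = abs(dominant_freq(reference, sample_rate) - dominant_freq(current, sample_rate))
    return bool(shift > tolerance_hz)

# Synthetic demo: a healthy part rings at 440 Hz; dirt or cracks shift it to 452 Hz.
rate = 8000
t = np.arange(rate) / rate                        # one second of samples
healthy = np.sin(2 * np.pi * 440 * t)
degraded = np.sin(2 * np.pi * 452 * t)
print(spectrum_shifted(healthy, degraded, rate))  # True -> inspect the part
```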
00:06:48
Fraud and security — same idea. Instead of analyzing what happened in the past, we can start asking: is somebody testing our network, is somebody doing certain things with our network? We can identify patterns and trends and so forth, so we can prevent certain security breaches. And lastly, what's happening more and more: once we get all these data sources — beyond the transactional data of what people are buying and doing, we also have the interaction data, the opinions people voice, the behavior people showcase — we can bring it all together and start building 360-degree views, whether it's of a customer, a partner, a product, a supply chain, a factory, a taxpayer, a criminal, or whatever it may be.
00:07:37
So the challenge, though, is doing all those things to answer those sorts of questions. For the people who deal with this, here's a data flow. It could be a simple data format, comma separated; or it could be XML, again pretty straightforward; or it could be something industry specific — this here is how hospitals communicate with each other, an HL7 message; or it could be one of the newer things, JSON, which is an unstructured — really semi-structured — context for describing certain attributes about entities; or just plain simple text that people leave behind in comments. These are just five or six examples, but we go into companies that have four hundred, nine hundred of these, and they need to bring those together. And they don't want to look at this — they want to go from this to looking at pretty pictures, preferably 3D with shadows, especially if you're an executive, following a certain color scheme. But eventually you need to bring all that data together to showcase this sort of information out of the mess I've shown before — and what I showed was pretty small. In terms of that mess, this is the data journey we help customers with: bring all your data together; we'll shape, transform, clean, and enrich it, and eventually you can start building views like this. This particular one is something we did for a telco: quality of service around video streaming. Not only do they want to understand the monetization aspect — if somebody buys a certain movie, how likely are they to buy another movie — but also: what was the viewing experience? Were there any buffering delays? Were the buffering delays related to certain devices, ISPs and so forth, or locations, potentially?
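Joining the two worlds in that telco example — per-session QoS events against billing records — is, at its core, a key join. A tiny sketch (field names and the 2-second threshold are invented, not the telco's actual schema):

```python
# Operations world: streaming QoS events. Business world: billing records.
qos_events = [
    {"subscriber": "A17", "movie": "m42", "buffering_ms": 9200, "device": "settop-v2"},
    {"subscriber": "B03", "movie": "m42", "buffering_ms": 310, "device": "phone"},
]
billing = {
    "A17": {"rentals_this_month": 1},
    "B03": {"rentals_this_month": 6},
}

def blend(events, accounts, threshold_ms=2000):
    """Attach business data to each operations event and grade the session."""
    return [
        {**ev, **accounts[ev["subscriber"]],
         "poor_experience": ev["buffering_ms"] > threshold_ms}
        for ev in events
    ]

for row in blend(qos_events, billing):
    print(row["subscriber"], row["poor_experience"], row["rentals_this_month"])
```

Once the join exists, questions like "do subscribers with buffering problems rent fewer movies?" become simple filters instead of cross-department projects.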
00:09:22
In telcos especially — this may surprise you — telcos typically have an operations side and a business side, and they don't often match, because one side is all about "I just want to be able to invoice, I don't care how the service was" — money — and the other side is all about making the network operate as efficiently as possible. Now, by bringing all that data together, they have the capability to start joining those two worlds. So what have we learned from all that? We know what people want to do, and we know there's lots of data. We've seen three things that people have had to deal with in the technology landscape, looking at what they already had. One: people want to really connect all these things — like I said with the telcos, it's not about silos anymore; it's really "I want to connect my transactional world with my behavioral world and my opinion world, my interaction world; I want to connect my operational world with my business world." Second: to be able to do that, you have to create new architectures, because you want to blend data you don't necessarily have all the control over anymore, so you need more flexible structures. And lastly, people also just want to consume information differently. I'm probably from the point-and-click generation; the people I meet are probably from the swipe-and-pinch generation. They don't think in massive enterprise solutions anymore; they think: hey, if I want to know something about something, I go to the killer app, and it's part of that. You see the same thing everywhere: our online banking starts to include certain analytics; our fitness and activity trackers and so forth have their whole analytical suite where we can analyze information. You don't say, "let me upload my data to a separate system and then analyze the data and get insights." We're starting to see the same thing happen in the business environment. People say: I don't want a standalone solution anymore where I can visualize my data; I want to see it visualized and analyzed in the application — the business application where we're making the decisions — and ideally it may even suggest some decisions for me. So the whole embedded world — embedding analytics into other applications — is a big challenge our customers are seeing as well.
00:11:45
And then there are all the emerging use cases. We saw particular use cases emerge — and again, we didn't invent these to try to sell something, though you may feel that way: a 360-degree view of customers or of any type of entity, streaming data into the lake, security intelligence — we see these use cases across all of our customers, and we've started to derive design patterns from them. What we recognized across all these different use cases was what we call a next-generation architecture — really a reference architecture. Why? Because we saw the same thing happening again and again: people hit the same challenges over and over and over again. On one side you have the traditional transactional world of all your operational systems that generate the data — where people buy and sell things. On the other side there's somebody on the business side with certain questions: how much have we sold, how many employees do we have, what's our margin, whatever it may be. To do all that, historically, we created enterprise data warehouses, because there we structure the data: the first column is always a number, the second column is a date formatted this way, the third value is a text of 25 characters — no, it's 26 if you're lucky. Then we create all these functional security models that let you see the data in a certain way, so you can answer very quickly the questions the business may have. But if the business at some point comes up with a new question that nobody anticipated, then you have to re-architect the whole thing.
00:13:28
You rework the whole flow: you make some changes here, you make some changes there, and eventually you re-implement the views on top of that data. That's how we've done things historically. It's all about answering the questions we know we always have — and we know they change, and when they change we're not too happy about it, because it's a lot of work. That's a challenge. Any idea what the average number of tools is that an enterprise organization uses to address this problem? Anyone want to take a shot at a number? One hundred — a little high. Twelve to seventeen: up to seventeen vendors are typically involved in that little flow alone, just to get those things running on the storage. Now, we know the challenge today: we have all this other data that people want to leverage. They say: hey, we want to leverage search data, we want to leverage network location data — geolocation — the things we pick up from applications, social data, social media, web logs, and so forth and so forth. But we know we can't store it over here, because this guy is a little hard in terms of how we need to structure data, and we don't have full control over the data coming in. Secondly, this guy is also a little bit expensive to store data in: it can cost forty to two hundred thousand dollars per terabyte to store data in an enterprise data warehouse, everything included. That's too much if you don't really know what kind of value you're going to get out of the data. And this is really where Hadoop and the NoSQL technologies came into play: people push data into those stores — hey, I don't care about columns 1, 2, and 3, I'm just dumping it in there. Now, obviously, you do want to optimize some of the analytics on top of those stores, because they're not necessarily built to analyze very quickly, so you leverage other technology that's a little faster, so you can still answer the questions the business has.
00:15:35
Now, I could talk literally an hour about this one slide. There are a couple of challenges with it, and this is the best way I can picture it and tell you what Pentaho really does. First of all: getting data in here, and dealing with all the little silos. There's Flume, Sqoop, Oozie, Pig, Hive, Parquet formats, Avro formats — what else do we have — ZooKeeper, Impala, and new ones coming up all the time; I close my eyes and I'll stop with this: when there are so many animals in the zoo that you need a ZooKeeper, something is wrong. Not everybody knows all those tools, so the people who have the skills to deal with all these technologies are fairly rare. They're called data scientists, and data scientists like doing all the cool stuff; they don't necessarily like doing the pure no-brainer other stuff — if you can find one at all. So there's a lot of complexity in terms of how quickly you can actually implement a solution to drive the analytics: can you find people with that skill base, given that people don't want to do manual coding and scripting for all those things? And you also see there's a big gap between the two worlds — they're still not connected. That's where we come into play. Simply put, we've made delivering this architecture fairly easy: you've got one user interface that allows you to integrate all the data, bring it all together, and talk to all these different silos of information, and through a visual interface it lets people build all these flows, so you can take any data and shape, transform, clean, and enrich it until it's ready for analytics at the back end. We also do some visualization on top of that. That's, very simply put, what we do, and it helps people overcome some of the stiff challenges
00:17:36
that come with implementing this sort of architecture. Because if you go out and start hiring people — data scientists — they will immediately jump into Python or Scala or Java and start manually creating scripts to build solutions. Now, I don't know what you guys are doing, but hard-coded solutions aren't necessarily flexible or easy to maintain, and they're not very productive: you just get the same problem all over again, because if you want to change something, you have to re-code. And that, to us, is wrong, because it breaks the whole premise of big data — the whole promise around big data was to leverage all the data you have, quickly and in a flexible way, to drive more business. OK, so that's where we help. Those are the key things we do — some people like to call it filling the data lake, so that you can stream data into it and funnel all your data inside. Now, the reason there's this complication in dealing with the data is that some people really underestimate the process you have to go through — from the raw data sets and the different data types the other speakers have shown you, which you typically have to deal with, all the way through to being able to consume the data at the end. There are lots of organizations out there in the ecosystem, some of them our competitors, that tackle some of these problems. But once you start thinking about big data solutions, you really have to think about three main activities. One is everything around ingestion: I'm going to bring it all together, and I'm going to clean, transform, and shape it — and how am I going to prevent throwing garbage together?
00:19:32
Garbage in, garbage out — so make sure you have a certain quality process. The next level is: how do I then further refine that? Enrich the data, and potentially bring data sets together into rich content, so that it ultimately is ready for analytics. And the third step is really building the whole analytical pipeline so that people can consume the information at the back end. If you do this very well, the rest becomes fairly easy. We like to think that the data engineering — the data preparation — is 80% of the problem. If you gave us a flat file with data and asked us to build a visualization demo, our sales engineers would spend 80% of the time fiddling with the data; creating a pretty round pie chart is not that hard once the data is right. That's really where the problem sits. Like I said, there are different players — hopefully I'll get the question of how we compare with our competitors — all of them, from Informatica to Talend to Teradata and so forth. And from our side, what we do to help is provide an easy way to build that refinery in the lake, to be able to drive the analytics. OK — that was the serious slide I had to go through, the "this is what we do"; but now I want to talk about some customer examples. Any questions before I jump into some stories?
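The ingest, cleanse, and refine/enrich activities described above can be compressed into a few lines — a deliberately tiny sketch (the CSV sample and the country lookup are invented):

```python
import csv, io

RAW = "id,amount,country\n1, 19.99 ,nl\n2,notanumber,uk\n3,5.00,\n"

def ingest(text):
    """Step 1: bring the raw data together."""
    return list(csv.DictReader(io.StringIO(text)))

def cleanse(rows):
    """Step 2: shape and clean - garbage in, garbage out."""
    good = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])  # quarantine unparseable amounts
        except ValueError:
            continue
        if row["country"]:                        # drop records missing a country
            good.append(row)
    return good

def enrich(rows, countries):
    """Step 3: refine/enrich so the data is ready for analytics."""
    return [{**row, "country_name": countries.get(row["country"], "unknown")}
            for row in rows]

published = enrich(cleanse(ingest(RAW)), {"nl": "Netherlands", "uk": "United Kingdom"})
print(published)  # only record 1 survives both quality gates
```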
00:21:06
yes
00:21:12
So, very simply put: this framework can store data across multiple nodes, and the second part of Hadoop is a distributed, parallel computing framework — once you have the data stored, you can compute on it in parallel across that cluster, across all the nodes, in a distributed fashion. So Hadoop serves both purposes, to your point. And the way you put data into Hadoop is kind of important as well.
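The storage-plus-compute split he describes is the heart of MapReduce. Here is a toy, single-process imitation of the map, shuffle, and reduce phases (an illustration of the model, not Hadoop's actual API):

```python
from collections import defaultdict

def map_phase(block):
    """Each node emits (key, 1) pairs for its local block of data."""
    return [(word, 1) for word in block.split()]

def shuffle(pairs_per_node):
    """Group every emitted value by key, as the framework does between phases."""
    groups = defaultdict(list)
    for pairs in pairs_per_node:
        for key, value in pairs:
            groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Each reducer aggregates all values for its keys."""
    return {key: sum(values) for key, values in groups.items()}

blocks = ["big data big trains", "data data everywhere"]  # one block per "node"
counts = reduce_phase(shuffle([map_phase(b) for b in blocks]))
print(counts)  # {'big': 2, 'data': 3, 'trains': 1, 'everywhere': 1}
```

In real Hadoop, the map calls run on whichever nodes hold the blocks, and the shuffle moves data over the network; the logic, though, is this simple.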
00:21:59
Hadoop is free, but one thing: it doesn't like a lot of small files. If you put a lot of small files into Hadoop it can deal with it, but it doesn't like it; it likes bigger files a lot more. So one thing you could already consider: instead of dumping a hundred thousand small files into Hadoop, put them into one big file, because Hadoop handles that a lot better — just a tip on the storage side. The second part, the compute part of Hadoop, is obviously everything you then do once you have the data in there: you want to do things like MapReduce, you want to leverage, you know, maybe data mining through some R or Python, and some of the output you may want to push into Hive or Impala or some of the other frameworks. OK — but to come back to your question: that whole flow is definitely part of how you make the data ready for consumption. So, two quick examples.
00:23:01
The first is probably the biggest-scale solution we've done at Pentaho. When we talk big data, this really is big data — big data isn't only about volume, but this really is a volume solution. I don't think there are many organizations that match this particular one in terms of the volume of data they have. Certainly there's one, for sure — I don't think anyone can top CERN, who produce the most data of anyone on the globe; they created their own processors to deal with the data, because there we're talking terabytes per second. And the other one is a really good example of how you can use data not only to have a business impact and save a lot of money, but also to have a social impact — which is what Hitachi stands for in terms of social innovation — around predictive maintenance for naval ships. So, first up: FINRA. FINRA oversees all stock trades — buys, sells, and so forth — in the US. They monitor some 4,400 agencies that trade stocks in the US. At the time we wrote this presentation they did, on average, twenty billion transactions a day — well, that was a good day; today it's fifty billion. They presented two weeks ago that they're now averaging fifty billion a day. That's a lot of structure, a lot of transactions, and they need to find out if somebody is doing anything that he or she should not be doing — insider trading or other improper transactions. Now, the way they framed their problem: we don't have to look for a needle in a haystack, because that's actually quite a simple problem. How do you find a needle in a haystack? That's always a good one — what's your way of finding a needle in a haystack?
00:25:06
Most people in the room go with "burn it" — the destructive route — or you go get a laser-focused magnet. The way they describe their problem, though, is: we have an ocean of needles, and we're looking for the bent ones, because that's what we've got to find. And the problem is twofold. First, they have to find the smoke: across all of those billions of transactions that happen on a daily basis, they have to find a smoking gun. That's obviously a big, high-capacity processing problem, with complex algorithms they run across everything. But then, once they've found a problem, they have to home in on it down to the lowest level of detail, because they have to analyze every transaction: they recreate trading days to understand the whole sequence of trades and whoever was involved, down to the level of every single transaction — because if one of these goes to court, they have to provide a full case of evidence, to say: here is exactly what happened, and here is the bogus pattern in this series of trades. And it's never about one trade — nobody is that stupid, hopefully; if you want to do something improper, you obviously spread it around in all sorts of carousels, so there's a whole network of transactions related to some of those cases. We're basically helping with those two problems: we helped them massively speed up the process of doing that high-volume compute to find the smoke, and the second piece was helping them quickly go through the cases to map out the fire, if you want. So those are the two problems we helped with.
00:26:51
Just for fun, this is what twenty billion looks like set against Facebook — people think Facebook has lots of data, and they're often used as the poster child for having lots of data, but compare it with what they do on a daily basis. So what we do is basically grab all the data from all the different systems — and it isn't always consistent, because there are 4,400 trade organizations involved, so they may have somewhat different file formats and so forth. We push it all into a cloud, into Hive, which is one of the stores on top of Hadoop. There we run the algorithms to help identify, based on some of the criteria the audit-and-risk analysts give us, and we basically fine-tune some of the data sets they have available for high-performance analysis. Because think about it: this could be a petabyte — not a few kilobytes — of data on a cluster. You can't ask the same sort of questions of a few petabytes of data that you would ask of something more nimble, something smaller; it's a lot trickier. And this is the world where you want to identify and map out the fire. So what you do here is use all sorts of algorithms to identify some of the hot spots, and then you home in and use — again — Pentaho Data Integration to take subsets of that data and push them into a higher-performance engine, where you can analyze it in a more slice-and-dice, drag-and-drop, point-and-click type of way. That way we've sped up the whole process from two weeks to two hours. And one of the things we do as well: we generate the data integration on the fly, based on the criteria the analysts give us. So there are no pre-built, hard-coded data integration processes running; based on what the user selects, we run that query — that compute process — on the Hadoop cluster, and then, which is also very important, we automatically model the data and publish it so it can be analyzed. Normally, something like this involves a lot of manual processing, a lot of manual data modeling and hand-cranked transformations; here it all happens automatically. And that's how we bring it down from a couple of weeks — where you'd need to file a request to a data service bureau who would, on demand, eventually grab the data and make it available.
00:29:36
Now the whole process is automated, and you can quickly go and explore the relationships — which also allows them to fine-tune some of their models: they can play around with the models interactively, and that helps fine-tune the models to identify the smoke and then, eventually, find the fire. So that's one example where the complexity is pure volume: being able to grab data on the fly from a big Hadoop cluster, scoop subsets into a faster engine here, and analyze it more quickly.
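The two-phase shape of that solution — a cheap full scan to find the smoke, then pulling full detail on the flagged cases — can be caricatured in a few lines (records and the volume threshold are invented; FINRA's real algorithms are far more involved):

```python
# Hypothetical trade records - nothing like FINRA's actual schema or rules.
trades = [
    {"account": "X", "symbol": "ACME", "qty": 5000},
    {"account": "Y", "symbol": "ACME", "qty": 100},
    {"account": "X", "symbol": "BOLT", "qty": 7000},
]

def find_smoke(trades, threshold):
    """Phase 1: coarse scan over everything to flag hot spots."""
    volume = {}
    for t in trades:
        volume[t["account"]] = volume.get(t["account"], 0) + t["qty"]
    return {acct for acct, vol in volume.items() if vol > threshold}

def pull_case_detail(trades, suspects):
    """Phase 2: extract every transaction for the flagged accounts,
    so a faster engine can replay the trading day in full detail."""
    return [t for t in trades if t["account"] in suspects]

suspects = find_smoke(trades, threshold=10_000)
print(suspects, pull_case_detail(trades, suspects))
```

Phase 1 is where the petabyte-scale cluster earns its keep; phase 2 touches only the small flagged subset, which is why it can live in a nimbler engine.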
00:30:07
the second case is even cooler as far as I get school and sexy so forgive me for
00:30:18
my lack of English here but this was originally done outs caterpillar but
00:30:23
actually killer killer killer required to software company that was our
00:30:27
customer which was yes our G and they basically created models around what you
00:30:34
can do it up with a ship in terms of how frequently you take it out of the water
00:30:39
to clean the whole because if you have a shipping whether all sorts of stuff
00:30:42
attaches itself to go with create which create additional drag on the ship in
00:30:48
terms of fuel consumption all-pro pressure some of the other compression
00:30:52
systems even oil pumps in the fuel pumps get additional has to work harder which
00:31:00
then makes them less efficient in terms of fuel consumption and typically a
00:31:04
owner of a ship would probably take to ship out of the water sick every six to
00:31:08
nine months because it costs about $15,000 per day to have ship out of the
00:31:13
water so they don't like to do that but then you start you start to use all
00:31:18
these criteria and all these sensor data at a half from the oil pump the fuel
00:31:23
pumps the consumption and support to optimize how frequently people had to
00:31:29
take a ship out and even repair fuel pumps in the same go because he could
00:31:34
see once a fuel pump starts to degrade in terms of performance it consumes more
00:31:39
fuel in addition then you have to schedule maintenance and its own
00:31:43
scheduled maintenance for schedule has a bigger impact in support and they were
00:31:48
able to drive millions in cost savings in terms of fuel savings alone, so one
00:31:55
large naval vessel, and the US Navy is one of their customers, one large naval
00:32:01
vessel, they could save one million dollars a year on fuel alone just
00:32:07
because they optimized it, and what happens is some of the ships are now
00:32:10
taken out of the water every six weeks or so, every six to nine
00:32:15
weeks, and that's a lot better because the cleaning goes faster as well, as less
00:32:19
is stuck in there, but also, you know, there's
00:32:23
no additional drag, and so forth. One of the larger shipowners, one of the
00:32:29
transportation companies, saved about ten million dollars in costs just by
00:32:35
optimizing their hull maintenance schedule and cleaning schedule, and again,
00:32:41
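The economics in this passage (about $15,000 per day in dry dock, cleaning intervals shrinking from six-to-nine months) reduce to a trade-off between dry-dock fees and fuel lost to hull fouling. As a rough back-of-envelope sketch, where only the $15,000/day figure comes from the talk and the days-in-dock and linear fouling penalty are invented for illustration:

```python
# Back-of-envelope sketch of the hull-cleaning trade-off described above.
# The $15,000/day dry-dock cost comes from the talk; DRYDOCK_DAYS and the
# fouling penalty (extra fuel $ per day, per day since the last cleaning)
# are hypothetical illustration values.

DRYDOCK_COST_PER_DAY = 15_000
DRYDOCK_DAYS = 3        # assumed days out of the water per cleaning
FOULING_RATE = 10.0     # assumed extra fuel $/day per day of fouling growth

def annual_cost(interval_days):
    """Yearly cost of a given cleaning interval: dry-dock fees plus the
    extra fuel burned as drag builds up linearly between cleanings."""
    drydock = (365 / interval_days) * DRYDOCK_DAYS * DRYDOCK_COST_PER_DAY
    # average fouling age over an interval is interval/2
    fuel_penalty = 365 * FOULING_RATE * interval_days / 2
    return drydock + fuel_penalty

best = min(range(30, 366), key=annual_cost)
print(best, round(annual_cost(best)))   # optimum sits well under 6 months
print(round(annual_cost(270)))          # cost of the old ~9-month habit
```

With these made-up coefficients the minimum lands around three months, which at least matches the direction of the talk: cleaning more often than the old six-to-nine-month habit pays for itself in fuel.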
what we have here in terms of data is not that big, it's a little bit of sensor data,
00:32:46
you know, people don't have 10,000 ships, they have a few hundred maybe, there is data,
00:32:51
and what you see here is just a traditional database, an Oracle database,
00:32:57
with lots of predictive mining and predictive modeling going on, with Weka
00:33:01
and R, which I showed are the two most popular predictive languages, and they fed the
00:33:07
whole thing with data streamed through satellites from local machine servers which
00:33:11
are on the ships, and then some additional operations data, blended all together,
00:33:16
including all these models that they run, so they can optimize the whole schedule.
00:33:19
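The modeling the speaker describes (Weka and R over blended pump sensor data) can be loosely illustrated with a trend check on a pump's consumption series; the least-squares slope test, readings, and threshold below are invented for illustration, not ESRG's actual models:

```python
# Minimal sketch (hypothetical data and threshold): fit a least-squares
# trend to a pump's daily fuel consumption and flag it for maintenance
# when the upward slope suggests performance degradation.

def fit_slope(readings):
    """Ordinary least-squares slope of readings over their day index."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def needs_maintenance(readings, slope_threshold=0.5):
    """Flag a pump whose consumption trends up faster than the threshold
    (litres/day per day, an illustrative unit, not a real ESRG metric)."""
    return fit_slope(readings) > slope_threshold

healthy = [100.0, 101.0, 99.5, 100.5, 100.2, 99.8]      # flat trend
degrading = [100.0, 102.0, 104.5, 106.0, 109.0, 111.5]  # rising trend

print(needs_maintenance(healthy))    # False
print(needs_maintenance(degrading))  # True
```

In the talk the equivalent scoring ran against ship data streamed over satellite and was blended with operations data before the schedule was optimized.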
Right, now that's all I had, and I'm sort of out of time as well, but I did promise you
00:33:27
maybe five things you could do, and I'll give you an extra one. I would say, if
00:33:33
you start planning around big data, really think about what I would call a
00:33:38
general data acquisition strategy, or "catch the wave": find out how you can
00:33:42
capture all these data streams, even if you haven't figured out yet how you can
00:33:46
utilize them; just think about a general strategy on how to capture all the
00:33:50
different data streams, have a process for it, have a strategy and a plan for where
00:33:54
you're gonna put it all, into a data lake, an enterprise data hub, whatever you
00:33:58
wanna call it, but have a strategy around that. The second piece is: think about "more is
00:34:05
more". It's not less is more, it's more is more: the more data you
00:34:10
can bring together, the more data you can blend, the more context you can create around
00:34:14
certain data points, the more insights you will drive and the more value you can drive from
00:34:18
that data. So don't limit yourself, and don't do data analysis in isolation, in silos, like a
00:34:24
lot of people who say "oh, I'll do weather analysis, or social media analysis"; well,
00:34:27
if you don't tie it back to
00:34:28
your real business, it's not going to tell you anything.
00:34:31
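As a toy illustration of "more is more", the sketch below (all numbers invented) blends an external weather feed with daily sales by date; the warm-day lift only becomes visible once the two datasets are joined, which is the speaker's point about tying external data back to the business:

```python
# Tiny sketch of "more is more": blending an external dataset (weather)
# with a business dataset (daily sales) by date. All numbers are invented.

weather = {  # date -> max temperature, degrees C
    "2015-10-26": 8, "2015-10-27": 19, "2015-10-28": 21, "2015-10-29": 9,
}
sales = {  # date -> iced-drink units sold
    "2015-10-26": 120, "2015-10-27": 340, "2015-10-28": 385, "2015-10-29": 131,
}

# Blend: one row per date carrying both attributes, the shape a BI tool
# or predictive model would consume.
blended = [
    {"date": d, "temp": weather[d], "units": sales[d]}
    for d in sorted(weather) if d in sales
]

warm = [r["units"] for r in blended if r["temp"] >= 15]
cold = [r["units"] for r in blended if r["temp"] < 15]
lift = (sum(warm) / len(warm)) / (sum(cold) / len(cold))
print(f"warm-day sales lift: {lift:.1f}x")  # the insight exists only in the blend
```

Weather analysis on its own says nothing about the business; sales on their own don't explain the swings; the value appears in the joined view.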
Third is: police things. Don't let marketing people or data scientists go
00:34:37
rogue on your data with stand-alone solutions; make sure somebody
00:34:41
understands the legal context in which you're doing things, and also what
00:34:46
kind of organization you wanna be and what strategies you wanna drive, because you do
00:34:49
not want to be that organization that in the end comes into the news because
00:34:53
you're creepy, because you tracked your customers in too much detail. If you
00:34:58
govern things, you can provide transparency to your customers, and
00:35:02
transparency will lead to trust. Number four: tool up; don't rely on what people used to
00:35:11
work with. There are lots of technologies; Pentaho is one, and I like to think we're the
00:35:14
best, but there are others out there: go try them out and then come back to us when
00:35:18
you believe we're the best. But also, don't rely on data scientists alone to drive
00:35:25
the whole process. They do provide a lot of value; it's the sexiest profession, per
00:35:29
Harvard Business Review, but they didn't call them miracle workers: the
00:35:34
sexiest profession, not miracle workers. So they can do a lot of
00:35:39
things, but you have to empower them, you have to give them the tools to make it
00:35:43
work, and again, you know, certain things that we deliver
00:35:46
can help you drive that. The fifth one: show me the money,
00:35:53
famous words from Jerry Maguire; I don't know if that movie translates in this
00:35:57
area, but show me the money: don't do proofs of concept around big data where your business
00:36:03
people, your finance people, say "why are we doing this, are we just doing it because it's
00:36:07
a hype?" No: find one particular business use case and provide
00:36:12
an end-to-end solution, where we say, hey, if we do all these things, bring all the data
00:36:15
together and drive these insights, then we can show this value to the business.
00:36:19
We've often, from our side, been pulled into technical showcases just to
00:36:25
begin a project; we know our technology works, we know Hadoop and all these
00:36:29
other things work, but what you need to find out is how it's going to work in your
00:36:32
business, how to drive value, and if you don't do that you're gonna do a nice proof
00:36:36
of concept, and there have been many of those.
00:36:40
Lastly: call an expert, that's the plus one. Call Pentaho, call Hitachi Data
00:36:47
Systems consulting, we're here to help.
00:36:50
It's a great partnership; Hitachi made a very wise decision to acquire us, and we can

Conference program

Begrüssung & Eröffnung des Hitachi Information Forums 2015
Martin Schnider, Country General Manager, Hitachi Data Systems
29 Oct. 2015 · 8:58 a.m.
199 views
Changing role of IT in society: Social Innovation & Internet of Things
Dieter Schöne, Chief of Strategy Social Innovation, Business EMEA, Hitachi Data Systems
29 Oct. 2015 · 9:09 a.m.
108 views
R&D Activity in Europe: Social Innovation for Matured Society - Part One
Kazuyoshi Torii, CTO & General Manager of CSI Europe, Hitachi Europe Ltd.
29 Oct. 2015 · 9:44 a.m.
197 views
R&D Activity in Europe: Social Innovation for Matured Society - Part Two
Vincent Franceschini, Chief Research Officer, Hitachi Data Systems
29 Oct. 2015 · 10 a.m.
700 views
Kundenerfahrungsbericht SFS Group: Wir wollen Veränderung!
Patrick Bichler, Mickael Leitner, Martin Schnider
29 Oct. 2015 · 10:45 a.m.
264 views
The Network Matters: Dedicated Network Resources for HDS Solutions
Joseph O’Connor, Senior Manager – Brocade Hitachi Partnership – EMEA , Brocade
29 Oct. 2015 · 11:21 a.m.
121 views
Application Solutions - Vorteile der Converged Data Protection Lösungen von Veritas und HDS an Kundenbeispielen
Pascal Brunner, Princ. Presales Consultant, Veritas
29 Oct. 2015 · 12:54 p.m.
Big Data Analytics + Insights - Common Advanced Analytics Platform
Bruno Kocher, Presales Manager , Hitachi Data Systems
29 Oct. 2015 · 12:59 p.m.
Business Track - Transform IT-as a Service
Valentin Hamburger, EMEA Solution Specialist for VMware, Solutions and Products Group, Hitachi Data Systems
29 Oct. 2015 · 1 p.m.
Infrastructure Platforms (Private Cloud + Storage) - Mit voller Power in die Zukunft – eine Erfolgsstory der BKW
Chris Stecher, Senior tech. Presales/Sales, LC Systems Engineering AG
29 Oct. 2015 · 1:01 p.m.
Business Track - SAP Cloud Services
Jérôme Valat, VP Global Business Development Oxya, Oxya, A Hitachi Data Systems Company
29 Oct. 2015 · 1:38 p.m.
Big Data Analytics + Insights - The Power of Big Data at Work
Davy Nys, VP of Global Strategic Sales , Pentaho
29 Oct. 2015 · 1:39 p.m.
Application Solutions - Fit für die Digitalisierung und bereit für die Zukunft mit SAP S/4 HANA
Günter Plahl, Business Development Manager Analytics&Finance Solutions, SAP Schweiz
29 Oct. 2015 · 1:45 p.m.
Infrastructure Platforms (Private Cloud + Storage) - IT Transformation in der Praxis
Falko Herbstreuth, Director LANexpert, LANexpert SA
29 Oct. 2015 · 1:45 p.m.
130 views
Application Solutions - Mit Netzwerkvirtualisierung und Hybrid Cloud die Grenzen des Datacenters überwinden
Danny Stettler, Senior Solution Architect, VMware Switzerland GmbH
29 Oct. 2015 · 2:30 p.m.
Business Track - Big Data im Dialog mit den Konsumenten
Robert Schumacher, Customer Intelligence Solutions Manager, SAS Institute AG
29 Oct. 2015 · 2:30 p.m.
Drive the Future - The Hitachi Solution
Bruno Kocher & Henry Schmidt , Presales Manager & Consultant, Hitachi Data Systems
29 Oct. 2015 · 3:17 p.m.
155 views
Die Erfolgsgeschichte von Tesla Motors
Robin Höfler
29 Oct. 2015 · 3:54 p.m.
1,708 views
Wrap-up & Wettbewerbsverlosung
Martin Schnider, Country General Manager, Hitachi Data Systems
29 Oct. 2015 · 3:54 p.m.
