IBM: How a Saloon Piano Gave Birth to Your Computer

When was the last time you thought about IBM? Probably a long time ago, and that’s understandable. As far as companies go, IBM could easily win
the award for most boring business. Because what do they actually do? Well, they make business machines. And that’s it. Boring, right? But what if I told you that IBM can be interesting? They are, for example, older than sliced bread,
Band-Aid, and the State of Arizona. You know those funny self-playing pianos they
always show in old Western saloons? It’s those pianos, specifically their paper
rolls, that gave birth to IBM. This was during the 1890s, a time when everything
was done using pen and paper. As you can imagine, administrative tasks for
the government were painfully slow. No one suffered more from this than the US Census
Bureau. They had to count the population every ten
years, but doing everything by hand was so slow that the 1880 census took more than seven
years to complete. The whole thing was a nightmare, but where
most saw disaster, one man saw opportunity. That man was Herman Hollerith. He worked at the Census Bureau, and he was
so fed up with how slow it was that he spent the next ten years developing a machine to
speed it up. He ended up inventing the tabulator, the first
ever electromechanical counting machine. This is where our saloon pianos come in. Hollerith needed a way to store information,
and paper rolls were his first solution. They ended up being too fragile though, so
instead he settled for punch cards. Those might look ancient, but punch cards
actually remained the standard for data storage until well into the 1970s. Hollerith’s tabulator was the beginning
of modern information technology, and he knew it, so in 1896 he established his own company. Census bureaus around the world rejoiced,
and Hollerith ended up becoming a very wealthy man. By the time he sold his business in 1911,
it was worth more than 2.3 million dollars. That might seem cheap, but remember that this
was during a time when the average worker barely earned two dollars a day. So, who bought Hollerith’s company? A financier named Charles Flint. In 1911 he merged it with three other companies
that made clocks and scales. This was the birth of IBM. Well, it wasn’t called IBM until 1924, but
you get the idea. Hollerith’s tabulators were at the heart
of IBM’s business, but as technology improved they would greatly
expand their product range. During World War II, they even started to build
rifles and equipment for the US military. After the war was over and everyone went back
to business, IBM was well on its way to developing the
first real computers: you know, the ones that weighed 4 tons and
could fill a room. During the 60s they collaborated with NASA
for the Apollo missions. You know, the ones that put Neil Armstrong
and friends on the Moon. The PC revolution of the 80s, however, was
their biggest challenge. By that point personal computers had already
been around for 5 years. The most popular one was the Apple II, courtesy
of Steve Jobs and Steve Wozniak. The market for personal computers wasn’t
too big yet, but it was growing fast, and many pioneering companies had already
settled in. IBM was late to the party, but they didn’t
give up, and on August 12th 1981, after a single short year of development, IBM revealed
their trump card: the IBM Personal Computer, or PC for short. Nobody saw it coming. IBM spent a fortune on their marketing campaign
and unlike previous projects they actually supported third-party developers. This was a huge step in a time when everyone
was guarding their trade secrets. The IBM PC was such a success that in less
than two years they would be selling one PC every minute. At this point, you’re probably wondering
what happened. After all, nobody uses an IBM computer
in their home anymore. So where did it all go wrong? Well, part of the answer lies in IBM’s decision
to borrow components from other companies. They didn’t actually develop their own operating
system or microprocessor, but instead sourced them from Microsoft and Intel. That’s why they were so eager to give free
access to third-party developers. By doing so, IBM unwittingly surrendered the
keys to the PC industry. What they really accomplished was to make
MS-DOS and Intel processors the industry standard. When other brands started copying this model,
IBM had a hard time catching up. The flood of cheap “clones” essentially
drove IBM out of the market. They had a pretty rough time in the years
before the internet, but they actually recovered surprisingly well, despite losing the PC war
of the 80s. Their core products, after all, had always
been their mainframes: the big, heavy computers that could power
large businesses. Small personal computers could never replace
mainframes, so IBM was back in the black after only a couple of bad years. The dot-com crash in the year 2000 actually
helped IBM, since many of their tech competitors went bankrupt. The internet turned out to be a great opportunity
for them. Now they could sell not only their mainframes,
but entire packages of software and services. What they became was, essentially, the do-it-all
IT guy of the business world. I actually lied when I said that nobody uses an IBM computer
anymore. Lots of people do because IBM sold their PC
division to Lenovo in 2005. Their defeat during the 80s taught them just
how important it is to stay on top of technological progress. They’ve actually had a huge role in the
development of modern technology. Take AI for example:
In 1997 their Deep Blue program managed to defeat world chess champion Garry Kasparov. More recently, their Watson AI won a million
dollars playing Jeopardy! in 2011. IBM’s most recent struggle has been with
cloud computing. It’s a very hot topic in enterprise technology
right now, but the idea behind it is actually very simple. Imagine a giant network of computing resources:
think big servers, powerful processors, and lots and lots of storage space. You, or your business can use this network,
and you can rent its vast resources for whatever you need. You don’t need to watch over it, you don’t
need to maintain it, you won’t even know it’s there. That’s the beauty of it:
a single network can provide service to thousands of companies, which only have to pay a small
fee for using it. That might sound fine and dandy, but cloud
computing is actually a huge threat to IBM. Unlike the 80s, this time they’re not up
against small tech startups. Now they’re facing the big guys: Amazon,
Microsoft, and Google. And to be honest, things are not looking too
bright for IBM. Cloud computing has the potential to replace
their main source of income. Mainframes and business solutions have always
been IBM’s bread and butter. They do have a presence in the cloud market,
but all their competitors have other cash cows to rely on. Compared to them, IBM doesn’t have much
to fall back on. In a world where you can rent all of your
IT needs for cheap, nobody would really bother to buy IBM’s mainframes. Now, does that mean IBM will die out in a
few years? Probably not. With the amount of money they can throw at
people, I’m sure they have the best minds working on their next move right now. Will they survive the age of cloud computing? Who knows. I guess eventually we’ll find out. Until then, stay smart. Thank you so much for watching this video. As the first one in this series, I hope you’ll
tell me what you think in the comments below. If you enjoyed the video, please like and
share it with your friends, it’ll really help me get the ball rolling. Feel free to subscribe, I try to post a new
video every two weeks or so, so expect more soon! Again, thank you for watching, really, and
I hope you have a great day!

Building a blockchain for business with the Hyperledger Project

Hyperledger is one of the fastest-growing open-source blockchain communities. Anyone can help lead it, and dozens of companies are working together building a blockchain fabric that can support production business networks. The work started last year with a simple framework to test the interaction between applications and secure blockchain networks, and it’s allowed us to test use cases in supply chain, capital markets, manufacturing, and healthcare. Here’s what we learned.

First, we learned that permissioned blockchain networks that require every peer to execute every transaction, maintain a ledger, and run consensus can’t scale very well, and they can’t support true private transactions and confidential contracts. So the Hyperledger community designed Fabric v1 to deliver a truly modular, scalable, and secure foundation for industrial blockchain solutions. The most notable change is that peers are now decoupled into two separate runtimes with three distinct roles: endorser, committer, and orderer.

Here’s how it works. Say you run an organic market in California, and I grow radishes on my farm in Chile. You and I are in a blockchain network that supports transactions between various markets, growers, shippers, banks, and others. Say I agree to sell you my radishes at a special low price, but I need the other markets that buy from me to continue buying at the standard price. They shouldn’t be able to execute our confidential agreement and find out the details of our deal. In fact, if they aren’t part of the deal, the transaction shouldn’t appear on the ledger. Fabric v1 handles all this.

My app looks up your identity from a membership service and then sends the transaction only to our peers. Both of our peers will generate a result. In this two-party agreement the transaction requires both of us to render the same result, but in transactions with more parties other rules can apply. Then the peers send the validated transaction back to the application, which sends it to a consensus service for ordering, and the ordered transactions are sent back to the peers and committed to the ledger.

But to get my radishes to your market, there are many other parties involved. Some need to know that my radishes have been verified and checked into a shipping container; others need to handle bills of lading, customs inspections, financing, insurance. But most of these parties don’t need to know about our special price.

Now think about our transaction running on our network, handling all the markets, all the farms, shippers, facilitators, the whole supply chain. This is the same pattern needed by many industries: anywhere we need to manage confidential obligations to each other without passing everything through a central authority. Fabric v1 delivers one network with everyone working together while ensuring confidentiality, scalability, and security. That’s what we’re building right now, and we could use your help. So how do we get there? Go to hyperledger.org to find out.
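The endorse/order/commit flow described above can be sketched as a small simulation. This is an illustrative sketch only: the Peer class, the submit function, and the list standing in for the ordering service are all invented for the example and are not Fabric's actual API.

```python
# Illustrative simulation of the Fabric v1 flow described above.
# All names here are invented for the sketch; this is not Fabric's API.

class Peer:
    def __init__(self, name):
        self.name = name
        self.ledger = []  # transactions committed on this peer

    def endorse(self, proposal):
        # Stand-in for chaincode execution: deterministic for the same input,
        # so honest peers produce identical results.
        return ("result", proposal["seller"], proposal["buyer"], proposal["price"])

    def commit(self, block):
        self.ledger.extend(block)

def submit(proposal, endorsing_peers, ordering_service):
    # 1. The app sends the proposal only to the peers party to the deal.
    endorsements = [p.endorse(proposal) for p in endorsing_peers]
    # 2. Two-party rule: both endorsers must render the same result.
    if len(set(endorsements)) != 1:
        raise ValueError("endorsement mismatch")
    # 3. The endorsed transaction goes to the ordering service,
    #    which batches it into a block...
    ordering_service.append((proposal, endorsements[0]))
    block = [ordering_service[-1]]
    # 4. ...and the ordered block is committed by the involved peers only.
    for p in endorsing_peers:
        p.commit(block)

farm = Peer("farm-chile")
market = Peer("market-california")
other = Peer("other-market")  # not a party to the deal: never sees it
orderer = []
deal = {"seller": "farm-chile", "buyer": "market-california", "price": "special"}
submit(deal, [farm, market], orderer)
```

Note how the uninvolved peer never receives the transaction, which mirrors the confidentiality point made above: parties outside the deal don't see it on their ledger.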

IBM Spectrum Virtualize: Essential Software for Cloud Deployments

We’re seeing clients really gravitating toward hybrid cloud. Analysts have said
that by 2017, 80% of organizations are going to be
using hybrid clouds, and by 2018, 65% of IT organizations will have their
assets deployed off premises, either co-located or hosted in service
provider organizations. With the phenomenon of hybrid clouds, the number
one use case right now for clients moving to the cloud is backup or
disaster recovery implementations. IBM now offers Spectrum Virtualize software
as a software-only product which means that clients who are interested in
deploying software-defined infrastructures can now purchase the
software from IBM as a downloadable image that they can put on standard x86
Intel hardware. Spectrum Virtualize has been in the market for 13 years shipping
on our appliances on our Storwize platforms, SAN Volume Controller,
VersaStack, and V9000. And so now we’re really extending that capability to offer it as
a software-only release. The benefit of building DR as a service on Spectrum
Virtualize is that it doesn’t matter what the end
client or the subscriber has in their data center. They can simply purchase the
software either through an IBM reseller or IBM directly and they can deploy this
on-site on an Intel server and the service provider can deploy it on their
premises to put together a complete DR as a service solution with very low cost
cloud storage on the backend at the service provider location. One of the
key benefits for service providers with Spectrum Virtualize software only is
for the service provider that’s offering disaster recovery as a service. Having
that availability to the data is of the utmost importance and with the five 9s
availability and the hardened nature of the solutions that have been in the market for
so long, service providers–cloud service providers–offering disaster recovery as
a service can count on us. When it comes to disaster recovery, reliability
is king and reputation is king, so they can bank their business on this software.
Our ISV ecosystem for Spectrum Virtualize and really all the storage
portfolio is of the utmost importance for us to be able to provide complete solutions
to our clients. VMware is one example of a very important ISV: we integrate with
all of VMware’s key APIs around storage. In addition to VMware,
ISVs like CommVault, Symantec, Riverbed and other infrastructure
ISVs are very critical. If you take a look at industry applications, solutions
such as EPIC in the healthcare space or SAP Hana for instance, these solutions
are well integrated with our Storwize family and Spectrum Virtualize solution.
IBM is excited about the future of the software. We’re continuing to optimize it
and add additional features and capabilities and we’re very excited
about helping our clients on their journey to cloud.

MSP Databalance Case Study: Business decisions in real time based on fast data-driven Analytics

In our modern datacenters, we prefer IBM infrastructure. For our environment we have a mixed platform of Intel, Linux, Power i and AIX.
For this we use various IBM Power p, Power i and Intel servers. All these systems are linked using the capabilities of the IBM SAN Volume Controller, FlashSystem V840 and V7000. As central storage solutions we use the V3700, V7000 and V840 storage systems, because of their excellent speed, reliability and low operating costs. The SAN Volume Controller is used for Easy Tier, Real-time Compression and mirroring. With these standard techniques present in our systems we are redundant and thus highly available. To ensure continuity, we use Tivoli Storage Manager software on most platforms, fully integrated with our SVC solutions. Together with IBM we can offer all possible cloud solutions for our customers: IaaS, PaaS, SaaS.

The SAP platform used by Beeztees is hosted by Databalance Services. Databalance advised Beeztees to put the database servers on IBM flash storage. This IBM V840 flash storage delivers more than 400,000 IOPS. The other servers have been placed on Easy Tier storage, resulting in an optimal mix of speed and capacity. In practice, the generation of reports, lookup jobs and batches is processed much faster. Databalance is a key partner of Beeztees in the field of automation. Throughout the whole migration Databalance has been involved and has advised and supported us. The result of the last months is a very modern, state-of-the-art ERP platform based on SAP software and IBM hardware, which enables Beeztees to stay a few steps ahead of the competition.

IBM Spectrum is based on software-defined storage and enables users to obtain increased business benefits from their current storage products, whether from IBM or another vendor. IBM has pioneered this field since 2003 and supports more than 265 storage systems from several brands. This gives you more value from earlier storage investments. Databalance is making use of the IBM Spectrum family in serving its clients.
IBM Spectrum Virtualize gives maximum flexibility and reliability by virtualizing the storage. You can also get more benefits by using features like Real-time Compression and Easy Tier. And of course you can create a disaster recovery environment by implementing remote mirroring. With IBM Spectrum Protect you enable reliable, efficient data protection and resiliency for software-defined, virtual, physical and cloud environments.

Ubiquity to enable IBM Spectrum Storage in Containers (Docker & Kubernetes) by Robert Haas

As the CTO for Storage Europe, I mentioned in my previous update that we intended to deliver a way to integrate our storage in container environments such as Docker Swarm and Kubernetes. Well, this is now a reality, and it is called Ubiquity, thanks to the hard work of a team involving our Research and Development labs across the world. Ubiquity is available as open source, in experimental status at this time. Let me briefly explain here where we see the adoption of containers, and what this Ubiquity technology enables, in a bit more detail. Many surveys are showing that the adoption of containers, and more specifically Docker, is accelerating, also in enterprise environments. You may have noticed the announcements by many large companies intending to adopt containers for most of their infrastructure. This covers many use-cases, such as traditional applications, HPC, cloud, and devops, for instance. In HPC, the portability of containers ensures that a workload can go from the testing laptop of a scientist to the big supercomputer without changes, that is, from quality assurance, to staging, to production, with the same code. In a cloud environment, whether on-premise or not, containers are attractive because they deliver the best resource utilization and scalability, with the smallest footprint and the highest agility. Finally, for devops, containers simplify and accelerate application deployment through the reuse of components specified as dependencies, encouraging a micro-service architecture strategy. In summary, containers are a standard way to package an application and all its dependencies; they are portable between environments without changes; they isolate unique elements to enable a standardized infrastructure; all of that in a fast and lightweight fashion.
Now, with the adoption of containers increasing beyond just stateless things such as a load balancer or a web application server, there is a need to provide support for persistent storage, that is, storage that remains after containers stop, so that data sets can be shared, so that the output of analysis can be retrieved by other processes, and so on and so forth. For many adopters of container technology, persistent storage and data management are seen as the top pain points, hence storage vendors have started to support ways to enable their products in the Docker and Kubernetes container environments using what are called plug-ins. With the technology we call Ubiquity, because it is targeted to support all of IBM Storage in all of the types of container environments, we have now released this ability as well. As I said, it is available at the moment in experimental status, so we’re welcoming feedback, and you can download it as open source from the public GitHub. In a nutshell, Ubiquity is the universal plug-in for all of IBM Storage. With this plug-in, and the underlying framework, storage can be provisioned and mounted directly by the containerized applications, without manual intervention. This is key to enabling agility in an end-to-end fashion. This allows you to take advantage, for instance, of our Data Ocean technology such as Spectrum Scale in container environments. This way, you can also take advantage of the unique capabilities of Scale in terms of performance, scalability, and information lifecycle management. And you can also seamlessly integrate our block storage such as Storwize. We are convinced that containers are going to play a role as important as VMs, if not more. Containers are already the norm in the IBM Bluemix offerings, and have been adopted by our Power and Z products. With Ubiquity we’re now able to close the loop with Storage.
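For a concrete sense of what such a plug-in does, the sketch below imitates the JSON endpoints of Docker's volume plugin protocol (/Plugin.Activate, /VolumeDriver.Create, /VolumeDriver.Mount, and so on), which is the kind of interface a plug-in like Ubiquity implements. The in-memory dictionary and the /mnt/demo mountpoint are hypothetical stand-ins for the calls a real plug-in would make to a storage backend such as Spectrum Scale or Storwize.

```python
import json

# Toy dispatcher for the Docker volume plugin endpoints. The endpoints are
# part of Docker's documented plugin protocol; the "driver" state here is
# purely illustrative and not Ubiquity's actual implementation.

volumes = {}  # volume name -> mountpoint (stand-in for real storage state)

def handle(endpoint, payload):
    req = json.loads(payload)
    if endpoint == "/Plugin.Activate":
        return {"Implements": ["VolumeDriver"]}
    if endpoint == "/VolumeDriver.Create":
        volumes[req["Name"]] = None          # a real plugin provisions storage here
        return {"Err": ""}
    if endpoint == "/VolumeDriver.Mount":
        path = "/mnt/demo/" + req["Name"]    # a real plugin attaches and mounts here
        volumes[req["Name"]] = path
        return {"Mountpoint": path, "Err": ""}
    if endpoint == "/VolumeDriver.Unmount":
        volumes[req["Name"]] = None
        return {"Err": ""}
    if endpoint == "/VolumeDriver.Remove":
        volumes.pop(req["Name"], None)
        return {"Err": ""}
    return {"Err": "unsupported endpoint"}

handle("/VolumeDriver.Create", json.dumps({"Name": "data-vol"}))
resp = handle("/VolumeDriver.Mount", json.dumps({"Name": "data-vol", "ID": "c1"}))
```

The point of the protocol is exactly what the text describes: once the plug-in answers these calls, containers get storage provisioned and mounted for them with no manual intervention.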
We’re collaborating with a number of clients who are already testing Ubiquity, so that we can develop this technology to match our clients’ needs. Among many other things, we intend to adapt Ubiquity to the rapid changes occurring in container frameworks, such as CSI (the Container Storage Interface) currently being worked on by the Cloud Native Computing Foundation’s (CNCF) storage working group. To conclude, with this you will get the best of new-generation applications with the performance and enterprise support of IBM Storage.

Comprehensive hybrid cloud data solutions for any business, any data type

Your business is unique. Your data is unique. But there’s one universal truth: your data needs are growing faster than your IT budget. Like other forward-thinking enterprises, you’re seeking hybrid cloud solutions to efficiently and flexibly store the data needed for new-generation applications. With the hybrid cloud capabilities of IBM Spectrum Storage, a cost-effective solution has arrived. Systems built with IBM Spectrum Virtualize modernize your unique data infrastructure, smoothly integrating enterprise-class capabilities into nearly any existing storage infrastructure without new hardware investment. Now with IBM Spectrum Virtualize you can use cloud storage to store snapshots or even archive copies of on-premise data, freeing capacity for new applications. Combined with existing hybrid cloud functions in the IBM Spectrum Storage family and IBM Cloud Object Storage, you can create a comprehensive data and storage solution for all your data, everywhere, so you can build your business’s cognitive future around the core of your evolving data. IBM Spectrum Storage with hybrid cloud capabilities: taking what makes you unique and making it better on the cloud.

IBM Spectrum Storage Suite at IBM InterConnect Part 1

How is it possible to intelligently monitor a storage environment from a single pane of glass? Well, for this problem we have Spectrum Control, which provides effective management of a storage subsystem environment, from IBM and not from IBM, through a single point of control. It includes an interface optimized for storage specialists, integration with VMware vCenter, IBM cloud environments and OpenStack APIs, allowing for closer alignment with IT and agile processes. This offering can be purchased on-premises and off-premises, in the IBM cloud.
IBM Spectrum Control. How can you simplify and save on data protection management? Do you need to make multiple copies of the same data so that they are safe? IBM Spectrum Protect simplifies this and has the smallest Total Cost of Ownership (TCO) compared to other tools in the market, because it uses a more efficient architecture, without reliance on media servers, appliances or other equipment, and because it uses native data reduction techniques such as progressive incremental backup, in which only recently altered files or blocks are continuously protected, without relying on synthetic backups or equivalent routines. It also includes online deduplication, incorporated in the product at no additional cost, and online compression, on both the client and server side. With less data: less cost, and less effort to manage the environment. As for infrastructure costs, to protect against media failures or disasters it is recommended to have at least two copies of your data, which can be replicated electronically by the tool at no additional cost, or through a media rotation process. With that, IBM Spectrum Protect eliminates the need for multiple backup routines storing the same data, for example weekly, monthly, yearly, without compromising data availability. IBM Spectrum Protect.
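The progressive incremental idea described above (after the first pass, only files that changed are sent) can be illustrated with a toy catalog of content hashes. This is a hypothetical sketch of the concept only, not Spectrum Protect's actual implementation; the function and variable names are invented for the example.

```python
import hashlib

# Catalog of what has already been protected: path -> hash of the last
# backed-up version. An empty catalog means the first run is a full backup.
catalog = {}

def backup_run(files):
    """files: dict of path -> bytes. Returns the paths sent this run."""
    sent = []
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        if catalog.get(path) != digest:   # new or modified since the last run
            catalog[path] = digest        # record it as protected
            sent.append(path)             # only changed data goes over the wire
    return sent

first = backup_run({"a.txt": b"v1", "b.txt": b"v1"})   # first pass: both sent
second = backup_run({"a.txt": b"v2", "b.txt": b"v1"})  # incremental: only a.txt
```

Because every run only transmits changed files, no weekly/monthly/yearly full backups of the same unchanged data are ever repeated, which is the cost saving the text describes.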

Replay: IBM Watson Group Launch Event Jan. 9 in New York

ROMETTY: I am so happy to see everyone here. This is a wonderful day, not only for our
company but for our clients, for this industry. And what I am here to tell
you about is the formation of something called the IBM Watson Group. Now, for those of you that watch us,
we don’t create new units very often. But when we do, it’s because we see something
that is a major, major shift that we believe in. It happened in the 1960s. If you study IBM, we had a unit all around something then called the
System/360, later known as the mainframe. Then again in the eighties, it was the IBM PC. The nineties, we started IBM Global Services. Today is another such moment. Today is an important moment
in our company’s history, and it is also an important moment
in the history of technology. And you’ll hear all about this from Mike Rhodin,
our Senior Vice President of the Watson Group. He, along with clients, launch
partners who are incredibly excited, will all be up here to tell you about Watson. As well, I mean this journey is only beginning; you will hear from Research
on everything yet to come. But what I want to do with you
is I want to put this in context. To date, to date there have only
been two prior eras of computing. The first was called tabulating. It was machines that did just
what it says, they counted. This was, as you would guess, the
late 19th century, Herman Hollerith. This was punch cards, this is when IBM did
things like the census, Social Security systems. It was the foundation for finance,
control, inventory control. Then, the second era, programmable era. Just what it sounds like, if, then. If, then. You had to program
it, tell it what to do. And it is everything that you know to this day. We did it first with the mainframes
in the fifties and sixties. Then it was PCs, tablets, smartphones, anything
that’s out there today, it is programmed. But 2011, 2011 we introduced a new era to you. And it is a third era, it is cognitive. It is systems that learn. They are not programmed; they learn, and
we debuted this in the video you saw, with something called
Watson, and it played Jeopardy. Now, many remember it defeated
the two all-time human champions. But I don’t think everyone understood what
was happening behind the icon as it ran. It was a new species, if I can call it that. It is taught, it’s not programmed. By design, it learns by experience,
and it learns from interaction. And by design it gets smarter over
time and better judgments over time. But I think what’s most important for right
now, for all of us, why we took Watson on, it’s built for a world of big data: 2.5 billion
gigabytes per day gets created, and it has — and I underscore the word — “potential” to
transform industries and professions everywhere. And to be unleashed, I want you
to think about this, though. To unleash all the insights of
all this data and I know many of you work on this, you need this new era. In my view, Watson is just in time. A cognitive era is just in time. And this is not just data that the world
thinks of as structured; the data you and I can picture in rows and columns. But 80 percent of the world’s data is
unstructured, tweets, blogs, pictures. And then there’s all this other data about data. Right? So my location, about
an object, about a test. And in fact, I think to understand this, what
Watson does because you don’t program it, it’s thousands and thousands of algorithms
that run, and they improve and they get better and then more algorithms are created. In fact, I was talking to many of you
out there, those of you on Watson now, you have to experience it to see the difference
because it is not a super search engine. It can find a needle in a haystack,
but it understands the haystack. It is about relations, correlations
that you will never see. And this is why we called it a grand
challenge when we undertook it. And you interact with data in a new way. Natural language, and it understands
the implications of your questions. And in fact, soon you’re going to hear from
Guru, it will ask you clarifying questions back. So, today is about Watson to a new level. We started with what we call in
Research a grand challenge — something that we don’t think the
world has yet solved or could solve. It starts as a grand challenge. We then did the work to be sure
it could be commercially viable. And it has already begun
transforming industries. You are going to hear from
many of these partners and you will see and experience it outside. But, as well a growing ecosystem. So, the world will experience Watson four
ways that you will get a taste of today: transformational solutions; enterprise
solutions; I said, a huge ecosystem; and then, something called Watson Foundations. So, let me give you just
a real fast word on each. Transformation solutions. Look, this is about transforming
industries, professions, like I said. We made the decision to tackle the
world’s most difficult problems first. We started with health care,
we started with oncology. We have had partnerships
with world-renowned experts. They are “the” best in the world. To me, the greatest testament to Watson
is they have dedicated their time, their life for years here working with us;
they only do that when they see a breakthrough in science, that this will
change the face of health care. And I know every one of our
partners on this agrees, that we will change the face of health care. So, you will see Memorial Sloan-Kettering, the
cancer center, two years, oncologists working on how to predict best treatment,
evidence, confidence, behind that. Cleveland Clinic, how to
use Watson to do teaching, students, to pass the U.S. medical exam. You will see Wellpoint, how is an insurer
to approve, but approve based on evidence, evidence, fact-based and to have it done fast. And then, MD Anderson. Those that have ever tried bridging the
gap between researchers and clinicians, and that’s what they’re working on. Then you have the enterprise solutions. So, one was transformative; enterprise. Great projects, great problems, but I
consider these more scalable, repeatable. Higher volume, quicker deployment. And we’ve already had in market something
called the Watson Engagement Adviser, how to give you relevant
answers to lots of questions. Today, you will meet new forms
of advisers, and more ahead. Then, the Watson ecosystem. We want, by design, partners,
entrepreneurs, venture capitalists to all build their solutions around Watson. So we are announcing a Watson Developer Cloud. So, think of this, for those of you in
technology, the API world ahead of you. And this is going to be APIs, content,
talent, in the first clouds up, retail, travel, consumer health care. Now, we did a really sort of
quiet launch of this in November, and overnight 750-plus applicants
to build businesses. And that’s with hardly telling anyone here. And then, I said Watson Foundation. What that is, a portfolio of
information analytics, capabilities, because that’s what does
underpin this cognitive era. So, everyone here today, I can’t be prouder to announce this group on
behalf of the IBM company. Another billion of investment
over the next several years. Over 2,000 researchers, developers,
business experts. We’re going to go ahead and
put another 100 million in to fund an equity fund for the ecosystem. And very symbolically, this group is going
to be headquartered in New York City, in Silicon Alley, 51 Astor Place. Hundreds of people, incubator, design
center, solution center, all there. And Mike and all our partners here today are
going to talk to you about what they’re doing. So, you’ll meet on stage, Dr. Craig Thompson, the CEO of Memorial Sloan-Kettering,
the cancer center. Dr. Tom Graham, right in front of me, the
Chief Innovation Officer for Cleveland Clinic. Jay Katzen, President,
Elsevier Clinical Solutions. Kent Deverell, CEO, Fluid Retail. And Terry Jones, who is, as many of you know,
the co-founder of Travelocity and Kayak. Now, I said we don’t form
a business unit very often. And when we do, it’s because we believe
we can help our company, our clients and our partners establish
leadership in a new era. I can’t be more excited. You can tell that. I can’t be more excited about the impact that
Watson will make not only on IBM as a company, but our clients, their companies,
institutions and society at large. And a moment about society at large. You know, for the past 18 months to two years at Memorial Sloan-Kettering, Dr. Kris has been working with a whole staff of other physicians to train Watson. Think about this: infusing Watson with the world’s best knowledge and experience. I want you to think about what it means to share that kind of expertise from this institution and many more, reaching more physicians and more patients than they could ever physically see. Now, imagine that
same thought for any profession. And then, those people get
the chance to dialogue with the best collaborator
they could ever have, Watson. Unprecedented learning, constantly
getting better, and making sense out of all
of this world’s data. So, you are going to hear many examples today. They are enormously promising. They are a down payment on Watson’s potential to transform industries and enterprises, and to bring new levels of knowledge to everyone, whether citizens or the masses. You just have to talk to anyone
here who’s experienced it. The interest, the excitement
from clients, it is unending. They view this as the very
beginning of a journey. So, when an earlier Watson, Thomas Watson, Jr., son of our founder, announced the System/360 half a century ago, I will share with you, computer science at that time was an arcane thing. It was not experienced by many people. Now, today, this is a new era. It’s an era of machine-human
collaboration, and it is dawning now. You will see the Watson Cognitive Cloud Services, and as you look at them, you will see how Watson will understand me, engage me, learn and get better, and help me discover. It will build trust, and it has
an endless capacity for insight. This is a new era, and I can’t be
prouder of the IBMers and the clients that brought us to this day today. So, it is with my great pleasure that
I introduce you to the new leader, the Senior Vice President of
the Watson Group, Mike Rhodin. [ APPLAUSE ] RHODIN: Good morning. Good morning. And welcome to a very deep crowd
of standing people at the back. This is great. This is an exciting point. I couldn’t be prouder, I couldn’t be happier,
and I couldn’t be more honored to be asked to work with 2,000 of our best and brightest
colleagues on how we can take this forward. Working with our partners, great companies,
great institutions, that see the same vision that we see on where we’re going to
take this technology as it evolves. The formation of a new group is a big deal. But it’s part of a journey, it all
starts with the germ of an idea. Someone had the grand challenge idea of, could
we really answer the world’s hardest questions? Right? IBM Research did an
incredible piece of work, culminating in a pretty daring
display on live television. So, you start to look at television and you
start to realize if it had gone the other way, right, it might not be so much fun. [ LAUGHTER ] But an incredible piece of work: 27 core
researchers dedicated four years, building on the shoulders of decades of research and technology. What we did next: we took
that team and we built a team to start looking at, how
would you commercialize it? We built a team under Manoj Saxena, whom many of you have met, that embodied the essence of what it means to be an
innovator, to be a startup. We intentionally hid them, right? A tiny group, protected. Based them out of Austin, let them play,
let them experiment, let them learn. Learn in the market, working
with many of you, our clients. Now, any startup goes through
many phases, right? They had to learn how to fix bugs and
make it better and improve it every day. And over the last two years, working in the
market, cocreating, collaborating with clients, with partners, we believe we’ve
created something that is ready to go. Ready to go mainstream, and
mainstream is where we’re headed. With the creation of the new group, we’re
going to take this from those 27 researchers, to the few hundred people that have been working
in startup mode for the last two to three years, and we’re going to move on to
the next phase: 2,000 people. That’s a lot, right? You’re going to see today examples of
technology that are going to come out. New products, new capabilities that are
going to really improve what we mean by cognitive systems, what Watson really is. I think you’ll see that what you knew
Watson as was merely the tip of the iceberg. The depths of our IBM researchers
that have been working in parallel to the commercialization team have
built a whole new wave of technology that now today is moving over to the new group. That technology is going to be rapidly
commercialized and put in market and you’ll hear about some of those advances
as we go through the morning. We’re going to take some technology from
our world leading software business, stuff that will help us move this
along much faster and join the group. Right, so today we’ve gone from a few hundred to
several hundred and over the course of the year that will continue to expand up to 2,000. Rapid growth environment. Now, many of us have, you know, heard
about Watson, we’ve read about Watson, we see articles about Watson, we
see people speak about Watson. There are YouTube videos on Watson, right? So, what is Watson? If you take it at
its essence, at its core, it’s a system that understands natural language. You don’t have to write programs, you don’t
have to learn things like Fortran or Java. You just ask it questions. It reads. Think about it as reading, right? When it reads a lot, it adapts and it learns. It gets smarter. When it gets smarter, you can
start to ask it questions. When you ask it questions, it will
generate and evaluate hypotheses, potential answers with a level of confidence. When you think about that, that’s
how we work: we read, we learn. We start to answer questions. That’s how we know whether
we’ve learned that or not. Watson learns like our children do. How do you know when your children
are learning a new subject? How do you know they’ve actually learned it? You ask them a question. You see whether they get the right answer. And when they don’t get the right answer, you help them discover the right answer, and they learn. They get smarter. And the next time, they get that right answer and build upon it. Watson works the same way. But it doesn’t just learn
from what it knows today. You can add more data to it. It reads new books, every day. And as it reads new books, it learns. It connects the dots with what it
just read with what it already knew. Sometimes the new reading
contradicts what it already knew. It has to sort that out. The same way we do. Right? It has to understand new information
in the context of its relevance — the connection — to the
old information that it had, and how important is this
new piece of information. So Watson has come a long way. But this is really, think of this really
as just an engine in a cognitive system. It’s not the end state; it’s
the beginning state. So as we start to move forward,
Watson is getting smarter, we’re adding new capabilities to it. It’s learning to reason,
to think through things. To help people using it move
along a journey to come up with the right answer, the right diagnosis. It’s using that first engine I talked about
as a subroutine, as something that it calls, that it asks questions to. That is new technology from IBM Research called Watson Paths. Watson is learning to explore, with visual
exploration technology, to help you wander through massive amounts of big data to find that
relevance, find that hidden jewel in a haystack. It’s learning how to visualize
answers, not just speak back to you. Draw pictures for you. The art of human communication is not just text. Right? It’s pictures, it’s text, it’s images. All of those things have to join
into the system we call Watson. So, today we’re going to talk about
this next wave of technologies that are adding to the cognitive system family. We’re going to go through some examples
with some world-leading experts. We’re going to talk about some of the new
products that we’re going to announce today. And then, we’re going to peek off into
the future as to where this may go next, because I think there needs to
be more circles on that chart, and Guru is going to talk
about those a little bit later. So, Ginni mentioned the new headquarters. Whenever you have a growing family,
sometimes you have to get a bigger house. Right? And when you’re getting a bigger
house sometimes you want to focus on location, because location matters. Right? When you walk through the areas of
Silicon Valley, or Silicon Alley and you look at the real estate, you look at the
buildings, the history of that area, the edge of the East Village,
historic Manhattan property. In the center of it, there’s this
brand-new iconic beautiful building. On a block all its own, standing out
from everything else in the area. This is our new home. This will be the headquarters for IBM Watson. It will be a place where our people, our
best and brightest, work with our partners, with our clients to imagine the
future, to help create the future. It’s going to have an incubator that
helps businesses get started with Watson. We’re going to have some of our best and
brightest designers, graphical experts, help those products become
really what they should be. And it’s right in the middle of the
trendiest, hippest area of town. Not exactly where you would have thought
IBM would open a new headquarters. So, we’re pretty excited. We’ll be in our environment
a little bit later this year. The building is done, as you can see. That’s the live picture of it. But the inside of the building
looks pretty much like this. [ LAUGHTER ] So we have a little bit of work to do. Even though, I think this is pretty cool, right? So I don’t think we’re going to
do too much work as we go forward. So, we’re putting the right
people, as Ginni said, we’re investing a billion dollars
in this over the next few years. We’re creating an incredible
environment to put this together. And we’re going to share Watson with the world. Right? Eras are not ours alone; we just
happen to have a history of shepherding them and bringing them to life
for the rest of the world. We make markets, we create entire industries,
and that’s what we’re going to do with this. So, what we’ve learned as we’ve worked with
Watson, as we’ve worked with many of you. As Ginni said, there are three classes of things that we see happening, and there’s some homework that
you have to do to get ready. And that’s what we’re going to talk about now. First, we truly believe based
on the work we’re doing, that this is going to transform
entire industries. It’s going to make people rethink how business
gets done, how their organization works, how we treat patients, how
we sell things to clients. It may in fact start to rehumanize the Internet. As we’ve worked on those kinds of solutions,
we’ve recognized that there’s a set of repeating patterns that we’re
seeing over and over again. And in our industry, when we see
repeating patterns, it’s a clear indicator that you can productize something,
so we’ve started to do that. We’ve launched products for the enterprise that can deliver value faster,
repeatability is important. And normally you would view
transformational solutions and enterprise solutions
as right up IBM’s alley. We recognize that the power of this technology
is really about what it can do for everyone. And to get to everyone, we need help. We need an ecosystem, we need partners. Right? And we’re opening Watson up to
the world and we’re asking for that help, because we think everybody that decides
to help, that decides to join us, is going to change the world and
we’re going to make it better. That’s what the ecosystem is all about. So let’s start talking about transformation. Now, when the Jeopardy match occurred and
we were all holding our breath as it came down to Final Jeopardy, we
weren’t the only ones watching. It turned out that many of
our clients were watching. And when the match was over and finally
aired on TV, our phones started ringing. And it wasn’t who we would have
normally expected: it was doctors; health care was the first to call. They saw something that could be
the light at the end of the tunnel. They’re faced with an enormous
sea of information. Not just the medical reference material you
have to learn in order to become a doctor; it’s the enormous amounts of information
that are published every single day, from the researchers around the world. And then you intersect that with
everything that’s being published, by the pharmaceutical companies on new drugs and
drug trials, and you start to see the magnitude of the information they’re
having to keep up with. And at the same time, these
doctors are having to see X number of patients per hour, eight to 10 hours a day. And they don’t have all day to read
all that information and learn. They need help. Now, the tunnel gets longer. The tunnel gets longer because this idea of
DNA sequencing and genomic medicine is going to bring on a flood of information
that has to be looked at in conjunction with all of that other information. And then, add in electronic medical records, add
in family histories, add in the demographics, add in what’s going on in the city around
you, what bugs are flowing through. That becomes a real problem. And the doctors said, you know,
this could help me sort that out. And they didn’t say, install it so I can use
it; they said, come with us on a journey. It will be worth it, because with
this we can actually change the world. Now, as we get ready here to kick off this next
section, we have a short video, and then I’ll bring up my first guest. [ MUSIC ] I thought Jeopardy was just a great
example of what this system could do. Watson can probe every nook and cranny
of their record and try to learn more about them than any one doctor can. It truly is personalizing the care of that
patient in a way that was never before possible. We’re developing a set of solutions that will
bring Watson’s cognitive computing capabilities to decision support around medical technology. The cognitive computing capabilities
are truly unique, and unparalleled. Done correctly and applied correctly they
can democratize the use of clinical evidence in a way that’s never been done before. The Watson technology is a
leapfrog in computing technology. And to me, it affords the
capability to learn the art of medicine not just science of medicine. [ MUSIC ] RHODIN: So, as we think about health care
and the implications of Watson on health care, we think it’s not just helping a doctor
see a patient and diagnose a patient. We think it’s far reaching. We think it’s everything from how medicine
is taught, how medicine is practiced. How medicine is paid for. The end-to-end ecosystem, cognitive
computing can actually start to pull together a massive sea of information
to streamline that and make it work better. And that’s really what we’re trying
to do, is make it all better. So, with that I’d like to
invite my first guest up. Dr. Craig Thompson, CEO of Memorial
Sloan-Kettering Cancer Center. [ APPLAUSE ] THOMPSON: Well, I’m incredibly
pleased to be here. Mike’s just laid out the
challenge for you that goes on in the health care community,
and we read about it every day. Memorial Sloan-Kettering Cancer Center is
incredibly pleased to have a partnership with IBM Watson’s team to develop a decision support tool for medical professionals as they
help patients and their families deal with the most complicated and difficult
challenging decision that patients face in their lifetime, and that’s
a diagnosis of cancer. I want to give you just a
little overview flavor. You heard a little about it at the very
highest level in the video we just saw, but how we’ve gone about it in
partnership with the Watson Research team. So, as Ginni just said, this is a machine
approach that provides a cognitive way of understanding complex problems. So, how do we go through the cognitive training? That started with a collaboration
between the two research teams, two years ago, as Ginni said. We put our best physicians, of the
thousands of physicians we have taking care of 125,000 patients with cancer every year, all the experience built into those individuals, and went through a process of training Watson’s cognitive abilities with the research team. We have refined that over two years; you met the physician leader, Mark Kris, in that video. Out of that two-year process, we have reached
a point where we have developed a partner for the health care professional in making the
best and most informed decisions for patients and their families about
the diagnosis of cancer. How does Watson actually do this? In the largest overview, there are
really three parts of the process. Watson has learned how to retrieve all
the relevant information that’s necessary for a personal decision of that patient and their specific circumstances
relative to their cancer. All the medical information
that exists in the world as well as all the personal information that’s
in that person’s health care record. Second, once it’s retrieved, it
can use its cognitive ability to integrate all that information. And in the way of the world, often patients
worry that someone else is making the decision. Watson is a collaborative partner
with the health care professionals, because from that information and that
integration of cognitive abilities, it provides a prioritized list of what are the
best possible choices among the full range of choices for the best diagnosis and
treatment of that patient’s illness. And it works in partnership, as a
collaborator, with the health care professionals that are dealing with that patient and
their family about their treatment. That’s the overview. I’m very pleased now to introduce our physician
in chief, who is in charge of all one thousand of those health care professionals
that have been training Watson from the medical perspective
to give the specific overview. Dr. Jose Baselga. [ APPLAUSE ] BASELGA: So, good morning, everybody. And this is standing room only; gee, you are all the way in the back, so I hope you can see the slides well. I’d like to present some practical aspects
of the challenges that we are facing today in taking care of patients with cancer,
and then it all will click together. And I hope I can demonstrate clearly that
Watson will become a critical partner in the way we deliver care to thousands of
patients not only at Memorial Sloan-Kettering but all around the country and
hopefully around the world. This is our traditional oncology
decision making process. This is what Dr. Thompson learned when he
was a junior faculty; that is what I learned when I was a junior faculty; and, this is the
tradition of centuries of practicing medicine. Our process to make decisions in cancer care was
based on looking at the chart of the patient, and it was a paper chart
with limited information. Looking at some x-rays, very basic x-rays. You know, things got complicated
and then we had CAT scans, but we had very few CAT scans,
and that was all we had. Some laboratory data. We had a few drugs, not too many. When I trained in breast cancer
we had one [hormonal] therapy and three types of chemotherapy, and that was it. So, our decision making process
was in this way simplified. And then of course, we had books at that time. Actually, in oncology, we had “the”
book that we had in our clinics and that we would consult
often, but it was just a book. And the number of papers that were
published every year was very minimal and there was no Internet. So, that was something that
we felt comfortable in doing. We had this data that was being
analyzed in the clinic and then we offered to our patients our best decision
and our best proposal for therapy. Fast forward, this is not any
longer what’s happening today. The field of medicine has changed in a
way that we could never have predicted. And there’s a sea of data —
like Mike has mentioned to you — that we have to deal with in every single
aspect of our patient care delivery. To begin with, who is reading
books any longer today? We have thousands of articles
that come up every day to address every single issue of our patients. Every single need, every single question, it’s
written in thousands of articles that come out. We have electronic medical records with
a tremendous amount of data about the patients: their prior therapy, family history, response
to therapy, et cetera, allergies, interactions. An imaging revolution: we are talking about
CAT scans, we are talking about PET scans, we are talking about sonograms, we are
talking about multiple imaging techniques that on their own are extremely
complex to interpret and to understand and to integrate into our clinic space. Therapies. Well, in breast cancer now, we have today
80 therapies, but we have 800 therapies that are being studied in clinical trials. Eight hundred therapies —
from four to 800 in 20 years. And then of course, what was being
referred to, we are just not looking at the pathology slides any longer;
we are sequencing our tumors, and we are analyzing 20,000 genes and we
will analyze even more complex parameters in every tumor. And our medical knowledge at Memorial
Sloan-Kettering, from a few hundred physicians to a thousand physicians that we have right now. So, a tremendous amount of
knowledge that is in our system. So, this is the rise in medical
publications, exponential, and this is not going to get easier. The need to develop physician-based
medicine based on genomic data — every patient’s tumor is
different, every tumor is different. We don’t have just two types of breast cancer;
we have now at least 50 types of breast cancer, and this happens to every tumor type. And we will need to have a matching
of the patient’s tumor characteristics with the specific therapies and be
able to monitor this very carefully. And just to give you an example,
this is what’s happening today at Memorial Sloan-Kettering Cancer Center. We are routinely sequencing the tumors of
thousands of patients by these new platforms that I will not go into in detail. But just to give you the sense of
complexity, we can genotype genes in full. And this is our list. And you cannot read the list, and this is done on purpose. You’re not supposed to read the list, because even if you could read it, you would not understand what it means, and nor would a physician. And every one of these genes
can have multiple mutations. And these mutations, among them, interact. You cannot deal with this even if you
have an hour per patient in the clinic, which is not the case, by the way. So, the traditional process is not
working any longer, and that is where Watson comes into play,
and that’s why we are so terribly excited. Watson becomes then our partner, it becomes our
colleague, it becomes the source for integration of all the data and the source for advice. And then, one more important component to
all this: the patient today plays a role, and the decision is not just in one direction. Patients are engaged in expressing their
preferences, in expressing their desires, in expressing their priorities with us,
and Watson takes this into account as well. So, Mark Kris has led a terrific team
of our best physicians in this process in downloading thousands of clinical charts from Memorial Sloan-Kettering,
downloading our guidelines, taking guidelines from all over the world, teaching Watson to think medically, with medical logic. And Watson begins to make really good decisions and choices that it then proposes to us. And the system is self-healing and self-improving on a continuous basis. So, I’ll show you a very brief
example of how this is playing out. This is the case of a patient, Ms. Yamato. This is a young patient being
diagnosed with lung cancer. The data is downloaded from the electronic medical record, and this is a patient with a diagnosis of lung cancer who has a number of key points by [cascading] the imaging. And Watson does its first round of analysis with
the data that the medical record is providing. And it’s telling us, what you have today from this patient is insufficient
to make the right decision. And it’s telling this on the left side. But if you have to make a decision — because sometimes in medicine you need
to make a decision with what you have — these are the options, and this is our level of
confidence based on our physicians’ experience that the machine has been learning. But the machine is telling us, you
should consider doing this molecular analysis, because this is a lung cancer in a nonsmoker of Asian descent, and these have mutations that are important for the
right choice of therapy. So, we do that. We follow the advice of Watson. We don’t have to, but we do. And sure enough, Watson had
a feeling about that, because we find a mutation on the EGF receptor. So, here we go. We have a mutation that could direct therapy. So, we go to our guidelines and the
ones that we use across the country are from the network of cancer centers called NCCN. And in the guidelines, it’s very clear. It says if you have a tumor with an EGFR
mutation, you should consider a therapy with a very specific and exciting therapy
called erlotinib, and that’s what we would do. I would do that. I would propose that. Yet there is a paper — and this is a true
story — that has been published recently, that shows that of all the
mutations in EGF receptor, there is one that does not
respond to this therapy. One. There are only about 10 physicians in the
world, maybe less, that would know this offhand. I’m telling you, we have a thousand
physicians at Memorial Sloan-Kettering, and at most two or three know this. And I’m boasting about this because that means we have 30
percent of the world knowledge. [ LAUGHTER ] But only two or three would
have that information. So, based on this, Watson is telling
us, do not give this patient erlotinib; offer this patient conventional tumor therapy. And this is what was done. Do you think this was a difficult case? You probably think it’s a difficult case, right? This is easy. We are dealing with much more complex decisions
to make every day in our clinical practice. So, this is why we are so terribly
excited about this partnership and that’s why we’re devoting so much time. We believe in it. And we feel strongly that this will change
the way health care is being delivered. It’s about integrating data, it’s
about integrating knowledge and it’s about creating wisdom together
between IBM Watson and our physicians. Thank you very much. [ APPLAUSE ] RHODIN: So, outstanding example. That is what this is all about. That’s about making the world better. Right? That’s about making all of us better. Right? Now, while we were working
with Memorial Sloan-Kettering on this, John Kelly’s Research team was
off working on what’s next, and we were lovingly calling
it Watson 2.0 for a while. But they were working on a
set of technology that starts to transform how Watson itself behaves — where Watson just becomes an
element of a bigger system. We’ve been calling this Watson Paths. And what Watson Paths is, is a capability
that enables the system to start to reason. And with that, we started to work in partnership
with Cleveland Clinic to think about, how could this type of reasoning, this type of
capability, change the way medicine is taught? Could we work with students? And in turn, could the students
work with Watson, and could it be a mutually
beneficial arrangement? I’m really pleased to welcome Dr. Tom
Graham up, the chief innovation officer and head of orthopedics at Cleveland Clinic. Thank you. [ APPLAUSE ] GRAHAM: Thank you, Michael. Humbling to be sharing the stage with these
luminaries and on such a propitious day. Congratulations to the entire IBM team. The center of the medical universe is
where the patient and doctor get together. Don’t make any mistake about that. And my previous colleagues and I
took an oath, written by this fella, that said we have to bring our A game
to every one of those interactions. And I’d have to say, and I’m not
one to try to avoid difficulty, but I think it may have been easier then,
because it was all right in front of you. All the tools you needed: your eyes,
your hands and the patient were there. Well, we know what’s happened
in the last few centuries. This explosion of data, medical
information, doubling every five years. What a challenge to keep up with that and bring
our patients the most expert inferences that we can bring to improve
diagnostic and therapeutic outcomes. We have to be accurate. We not only have to have the right diagnosis,
patients want to know what’s going to happen, and at all the different milestones
that they’re going to face. This is a timed event. It’s like sports; there’s an immediacy to it. And obviously today, we don’t want to
just say, geez, in general this happens. This patient wants to know what
their problem is, their solution is, and their journey through their
medical engagement will be. And we have to make it understandable for them. We have to be able to converse
with them in a way they understand, so they can go home and tell their families. The genius of debuting Watson on Jeopardy
wasn’t just its ability to sort through, what, 200 million pages of data in a couple of seconds. It was to bring it out and introduce
us all to an amazing capability. And I think, Mike, why you got
those calls from health care is because we didn’t see it as a competition. It wasn’t man versus machine. Medicine is a team sport. We said, hey, this is now somebody with
whom we can think together, learn together and build a team because medicine needs to be
delivered with multiple different perspectives, knowledge capabilities and different tools. So, health care is the natural application,
we believe, for Watson technology. Huge data sets. Again, not just in the library,
but down in patient records. And this is how I started my career,
sorting through paper records. Now, in the big data explosion
in health care informatics, we have all the information parcels to go through. We’ve already talked about how it expands —
Jose did a great job of talking about that. I’ve already told you how it’s
important to have timely answers. And you know, medicine is still an art. We think algorithmically, and
I think that that’s important, to have a partner who does so also. When we started working with the Watson team,
we understood, I think, what they sought. They literally wanted to know
more about the art of medicine. They wanted that humanistic approach to medical
decision making, the clinical perspective. We at the Cleveland Clinic needed to have a
different kind of conversation with our data. It needed to be more intuitive,
it needed to be a back and forth. And like I said, it needed
to be quickly performed. It’s no surprise that the great
advances in art, music, philosophy, science, happen in the port cities. That’s where ideas had discourse. Innovation happens at the
intersection of knowledge domains. So, if you have a partner that
has these competencies to discover and evidence all the passive information
you may need and can understand you, and here we sit at the Cleveland Clinic
with a patient-centric approach, centered around the actual patient, it’s a case-based learning model. We’re the highest-acuity hospital in the world. The sickest patients are our laboratory. And we were early in the pool on the adoption
of electronic medical record keeping. These are two natural partners. So, I call this the virtuous cycle. Who is the student and who is the teacher? Well, both. And it’s constant and consistent
and escalating all the time. What a great paradigm — that these
two groups, professionals in medicine, can be having a conversation with
somebody scanning the world’s data and an individual patient’s data to elevate
quality, access and fiscal responsibility. So, we had two projects: one, the Watson
Paths as Mike has been discussing; and one, focused on electronic medical
record applications. Let me just talk a little bit
about those for just one second. So, on the Watson Paths side,
this case-based learning and care delivery model was a great foundation. Watson assisted us in researching
the world’s data and justifying particular diagnoses
in order of probability. The data that could be accessed
by our students was real time. And then, they interacted with Watson. They described what they were seeing and what
they thought; Watson would come back to that. It would just refine the data exchange further,
honing it in with greater confidence. So, the concept of powering the decision
support with this inferential model is right up the alley of an art like
medical care delivery. And then again, to go out
and find that one paper by that one group is exactly what we
all need because if it’s your loved one or you in that room, that’s 100
percent of your experience that day. And maybe the most important thing you’ll
do that day, week, maybe in your life. And so, it has to be at that focused level. And then to be presented with, on the
physician side, with likely/unlikely, possible, probable opportunities, that’s when we
get to exercise all the great things that we’ve learned from our mentors. And so as you probably know, the concept of
a simulation center where you can get hands-on opportunities with mannequins, et
cetera, has really exploded and been a real paradigm shift in
American medical education. To practice on a mannequin, hey, heck of a lot
better than practicing on your loved one, right? So, take that three-dimensional
and physical concept that the simulation center has been doing,
now we have a cognitive simulation center. That’s why our students are so
enthusiastic and they spent the kind of time to…they put this together. So, essentially, when you have an opportunity
to not have a deterministic outcome — “this is the answer” — well, that’s sometimes
too unidimensional and sometimes not right. But how about a probabilistic answer? Hey, these are the things it could be. Here’s the level of confidence. Use your brain, interact with your
partner, come up with the right solutions. On the other side, when we’re
talking about an individual patient and scanning the electronic medical record, we know that all we have done heretofore is
just archive a tremendous amount of data. We have to know how to extract from it,
navigate through it and look for those gems of hidden insight that might be the
answer to the problem that’s facing us. And so, if we’re going to improve diagnostic
accuracy, essentially having somebody over your shoulder — in this case it’s a
dashboard, which we’re all used to looking at — is one of the greatest tools you can have. And that’s what Watson does. It organizes it, it highlights things in an
individual’s electronic medical record, stuff we might have glossed over. Those pages might have been stuck together
or the chart got dropped on the way. That’s the old model. But even now when we have access to electronic
medical records, it’s really hard to look through pages and pages and pages. I’d love somebody to organize
that and present that to me, and that’s what Watson has done for us. I love the power of the partnership. We have to solve the biggest
problems facing us in health care. We all know where health care stands on
our nation’s and the world’s priority list. Taking care of the individual,
large populations, managing the expanding information
explosion that we’ve been discussing. And then frankly, there’s just not enough of
folks with a MD after their name going forward. And we have to all do this in a way
that’s extremely fiscally responsible with your resources. So, it’s an extreme thrill to be here to talk
about, I think, a paradigm shift from a computer at worst being a paperweight and at best
being an archive to now being a partner. It is our partner. And we’re very proud to say that Cleveland
Clinic and IBM are in this partnership for our patients, converting
knowledge to the wisdom we need to improve the world’s health care. Thank you very much for allowing
me to have the stage. [ APPLAUSE ] Thank you, my friend. I hope that was okay. Ginni, I hope that was okay. RHODIN: So, we’ve talked
a lot about health care. Health care is not the only industry
that Watson can help transform. Watson is particularly good at
understanding large amounts of information — information that doesn’t change
instantaneously but changes regularly and starts to correlate information. Banking, wealth management, client
experience is another such domain. A few short hours ago, one of my
colleagues, Bridget Van Kralingen, who runs Global Business Services in IBM,
was in Singapore signing a Watson deal with the largest development bank in Asia. DBS has signed on to start to transform and
reshape what they believe will be the future of banking in Asia, perhaps the world. So we’re starting to expand not just
within industries but also around the world as the promise of Watson
continues to spread globally. Now, we’ve talked about transformation. As we work with clients, we’ve started to
recognize that there are many things that we do in business that are somewhat repetitive,
that we have large numbers of people doing, and they don’t really use our brain power
in the way that we would like to use it. Right? Call centers are a great example, where
people are answering questions all day long, and it’s a high turnover business,
people are coming and going. Training costs are one of the key drivers
of how efficiently a call center operates. So, we started to study that space, and we
said, well, what if there’s a better way to engage clients than to operate a call center? What if there’s a way to actually put
an adviser on the front of the company that allows an interaction and also allows
a seamless escalation between interacting with information and interacting
with a human in the call center, understanding what it knows
but also what it doesn’t know. And changing the entire experience
of interaction between clients and the organization they’re working with. We’ve recognized this plays out in a lot of
different industries: insurance, banking, anybody that runs a call center, which is
just about every business that we know. So, we started to recognize repeating patterns. Repeating patterns are things
we build products out of. And once you build a product out of it, you
can install it faster, you can get it up and running faster, and the
time to value is faster. That’s what this is all about. We’re going to bring a series of solutions to
the enterprise that we believe can deliver value in months, not years, and will continue to have
the same capabilities, the same opportunities that you just heard about
in transformative solutions but on a smaller scale and faster delivery. So, we call these cognitive
solutions across the enterprise. A little video to get us started,
and then we’ll move along. In 15 seconds, IBM Watson
can analyze terabytes of data to help operators quickly find
answers to a caller’s question. Soon, Watson will help transform
customer service. Let’s build a Smarter Planet. In 2011, an IBM cognitive system won Jeopardy. Now it’s tackling bigger challenges in
health care, finance and customer service. IBM Watson is ready for work. Let’s build a Smarter Planet. RHODIN: Okay, so how do we make this work? How do we make this come to life? The first product that we
came up with was this idea of helping organizations improve how
they interact with their customers. Making it easier, more natural. Using natural language; not forms,
not, you know, keys that you have to type in in order to use a system. You talk to it, you type in
questions, it gives you answers. It helps you think through things. It has a conversation. It was mentioned earlier that these
conversations can’t just be one question and one answer; they have to continue
to evolve so they become dialogues. They remember context from the
first question to the next question. One answer feeds into just the next question. We built a technology called Watson
Engagement Advisor that we’ve been working with a couple dozen clients now around the world
to start to explore how they take their data — their data — put it in Watson
and allow people to ask questions so that they can deliver
value faster to their clients. To help us take a look at what Watson
Engagement can do we’re going to have one of our newest partners, Scott McKinley,
an EVP of Innovation at Nielsen. Run the video, please. Nielsen is a global market research company. In fact, we’re the largest market
research company in the world. We’ve organized our business
into what we call watch and buy. In our watch business, we measure
everything that consumers watch, all the ways they spend time with media. That includes television, radio, digital,
mobile and any new device that comes along. In our buy business, we ingest and
measure trillions of data points from major retailers such as Wal-Mart. We look at the data and analyze it
for indications of buying patterns, we look at demographics, we help the
retailers understand who is buying the products and where those products are being bought. And that’s happening on a global basis. What we’re doing more and more now is bringing
together what people watch with what people buy, and that allows us to comment on the impact of
advertising expenditures on sales, which is, of course, what every marketer wants to know. Our clients are advertisers themselves,
the agencies who service them and the media who deliver media to consumer audiences. We produce lots of data across our watch and
buy functions, and that data feeds the ecosystem and allows marketers to understand
what’s working and what’s not across the media landscape. Our existing tools are good,
but we need to innovate to find better and better tools to do this. We need to do it faster, more efficiently. When we look at Watson, we see this is
a tremendous opportunity, in two ways. First of all, we’ve got decades of
structured data, as I said before. We have syndicated data sets around
television or around consumer behavior, around buying behavior at the retail level. This stuff is stable, but we need
better and faster access to that. More interesting is the unstructured side — again, these firehoses of data
coming out of Twitter and Facebook and other social media and
digital media channels. We don’t have tools to very quickly paw through
that amount of information and uncover insights that a media planner at an agency can use to do
a better job of understanding who a person is and where to reach them in the media to
deliver a message about a product. So, that’s where we think the opportunity is. We envision a world where a
media planner aiming a product at millennials can quickly query mountains
of data in a way that they can’t today and then uncover an insight about where
to most efficiently reach those folks. What do they care about? Where do they spend their time? We today don’t have tools that can do that, and we think that Watson provides
that opportunity. We partner with IBM and Watson to uncover
new ways to use this sophisticated technology to help the marketer make better decisions. We envision a world where a marketer
can use natural language queries of enormous data sets — the types
of which I’ve described earlier — to uncover on-the-spot insights
about how to most efficiently and effectively reach their audiences; to help
understand why audiences care about a product or a product category; and, how to better
create and deliver advertising to folks so it’s more relevant and more effective. RHODIN: Scott wanted to be here with us
today, but they’re having an all-day conference down in Florida with the entire senior leadership of their
team talking about using Watson, and about this time I’m on a video down there. [ LAUGHTER ] So, we ended up swapping video stories across
the two events to make this come together. But Watson Engagement Advisor has been about
helping organizations find answers to questions. One of the other patterns we uncovered is
that it’s not just about finding answers; sometimes it’s about finding the right question. It’s about moving through a sea of
data and looking for the white space, something that hasn’t been explored before. Think about pharmaceutical companies. Think about all of the chemical compounds
that have ever been tried and patented. You don’t care about the ones that have
been tried as much as you do the ones that have never been tried,
if you’re doing research. So, it’s all about discovery,
finding new things. So, today we’re announcing the Watson
Discovery Advisor, which is a companion product to the Watson Engagement Advisor, our
second major Watson enterprise solution. And here to help us understand what
it might do for a particular industry, I’d like to bring Jay Katzen up from Elsevier. [ APPLAUSE ] Jay, thank you. KATZEN: Thank you. Again, my name is Jay Katzen, President
of Elsevier Clinical Solutions. And what I want to talk about here a little
bit is just give an introduction to Elsevier, talk about clinical solutions, talk a little
bit about the health care market and some of the challenges and why we believe
a partnership or collaboration with IBM can really help transform this
market and solve some of the problems today. Elsevier is part of Reed Elsevier. Reed Elsevier is a $10 billion company;
Elsevier is about a $3 billion business unit. We’re a world leader in providing
scientific, technical and medical information to clinicians and students across the world. We have customers in 180 countries. We serve more than 30 million
scientists, students, health and information professionals worldwide. We drive innovation by delivering
authoritative evidence based content with cutting edge technology, allowing our
customers to find the answers more quickly. From a clinical solutions perspective, we cut
across the entire health care spectrum both from an academic standpoint as well as
pharma companies, retail pharmacies. But our focus is really on the provider setting:
on the hospitals, the experience of physicians, nurses, pharmacists and extended care team. Our mission is to lead the way in
providing health care practitioners access to the most relevant evidence based information
and tools wherever and how ever they need it, to empower them to make better
informed decisions, to save lives, to improve care and reduce cost. This next point is really
about, how can Watson help us, delivering information at the point of decision. How can we ensure that we take all
the context about a patient into play and make sure we give the clinician the
right information to make a better decision? Health care is changing. To be honest with you, health care is a mess. It’s in a state of disarray, and it’s
been in a state of disarray for decades. We talk about the sea of
information, information explosion. If you look at this, 5,000
medical articles are published every month. I’d have to read 164-and-change
articles every single day to keep up. We have a ton of information,
an explosion of information, and yet now we have more information,
with images, with genomic databases. Do we want more information? Absolutely. We need more information, but we need
to tailor that to the specific patient, to the individual encounter, and we
need tools to help us do that better. There’s incredible waste in health
care today, about $700 billion of waste in the health care system. A lot of that is due to unnecessary
care, lack of care coordination. Can information help us improve
the overall care? Reimbursement. The government is finally starting to get
involved from a health care standpoint. They’re demanding change, regulatory changes, ensuring that hospitals have
to improve outcomes. They have to reduce readmits and
improve patient satisfaction. At the end of the day, it all comes down to,
how do we improve the quality of care delivered, and what can we provide our
clinicians to help them in those roles? Focus on increasing quality and reducing cost. We’re going from what used to be volume-based
metrics; now they’re encounter-based, it’s about the individual patient. I’m going to give you an example. There’s a 55-year-old woman who had
metastatic breast cancer for six years. She was battling cancer. She had a double breast reduction. She had three surgeries. A year and a half ago, she wasn’t feeling well. She went to the hospital over Thanksgiving — not the ideal time to go to the
hospital over the major holidays. And she was having trouble breathing. So, they put her on some antibiotics. They didn’t look at her history, per se. They intubated her because she
wasn’t breathing very well. A few days later, they looked at her,
they gave her some other antibiotics. She really wasn’t doing better. A week afterwards, a new doctor
came on rounds, said, you know what? There’s a protocol out there that says after
seven days of intubation we have to do a trach. They didn’t call in a specialist; they
called in a general surgeon, who put the trach in. They didn’t look at her history,
and they didn’t realize that actually her airway was compromised
based on a couple of the surgeries. A couple days went by, she was
still having trouble breathing. She went into respiratory
arrest, she developed sepsis. She died 23 days after entering the hospital — not from cancer; from something that could
have been avoided if they’d had the right information, if care had been coordinated the right way,
if they had looked at her patient history to bring all this information in. And it’s not the physicians’ fault per se. They’re challenged. We’ve heard about this. They have limited time between patients. They don’t have access to the information
they need to make the right decisions. Now, is this an anomaly in
the health care system? So, if this is one-off, maybe it’s okay. Maybe this is leading into
the next bullet point here. What would you guess — it used to be the sixth
— what would you guess the third leading cause of death is in the United States? Preventable medical errors. There’s a study that came out back in September
that said the number actually used to be around 100,000; it’s about 400,000 people who die every
year from preventable medical errors. This has to stop. This is real. Okay. We can help prevent this stuff. So, why are we looking at Watson? Why do we think Watson can help us? We’ve invested tens of millions of
dollars in our content, in our technology, to develop comprehensive, broad and deep
information to provide to our clinicians to help them make better decisions. We’ve tried to integrate it at every step
of the workflow, whether referential, whether you’re looking it up on your iPad, on
the Internet, whether there’s a mobile device, whether it’s integrated into the CPOE
system, your electronic medical record. But it’s still not taking into account
everything it can to help make better decisions. We have trusted evidence-based information. We don’t look at information from
an individual hospital: guidelines, procedures that they have, genomic databases. There’s an ability for us to transform
the way information is delivered and how questions are answered by working
with the Watson technology: provide fast, clinically-tuned search,
speed to a relevant answer. There was an article in December about a
woman that was going to have a hysterectomy. She selected a noninvasive procedure and
was basically going in, small incision, doing a morcellation, which is
basically breaking up her uterus. Very safe, very effective, heals pretty quickly. Unfortunately what they didn’t
realize was she actually had cancer. So, by doing this, they spread the cancer
throughout her entire stomach. She went into stage four cancer. So the question is, using Watson — and
actually there were some examples earlier today, and there was one in Austin about
this — can we look at the patient, look at the person’s history,
understand what’s happening out there and prevent something like this? I don’t know if they could
have prevented that case, but I guarantee you it can prevent
some of these things in the future. The combination of what we’re doing with the
Watson technology can improve health care. It can transform the type of
information we provide and save lives. This is why I think Watson is important,
why we’re working with the IBM folks and why we think we can transform health care. Thank you. [ APPLAUSE ] RHODIN: All right. Now, Watson reads. It knows how to answer questions
in natural language. My good friend Terry pointed out that
it also needs to learn math, right? And we have an answer for that. Today we’re going to launch Watson Analytics. We previewed this technology in
fourth quarter under the codename Neo as a natural language way to
start to interact with data sets. Simply ask a question like you would of
Watson, but instead of getting a sentence back, you might get a suggested data
set or set of data sets to choose from that might be relevant to your question. Once you select the data set, the system
interprets the data, understands it, figures out what the best
visualization of that data should be and creates an interactive visual graph for you
to interact with, play with and explore your data. In fact, it’s just another answer to a
question, but it may be mathematical as opposed to in natural language — another important
component of the system as we go forward. This capability is going out in beta
next week, it will be live on our cloud. Another capability that we’ve
added is Watson Explore. Part of this art of discovery is exploration,
being able to use tools that allow you to graphically start to explore, to
navigate, to look at large sets of data, to look for that needle in a
haystack so that you can find it and then maybe do something with it. It’s another key, important
product in what’s becoming a system. It’s not just a question
and answer thing anymore. It does questions and answers, it reasons,
it explores, it visualizes, it analyzes. We’re seeing the build-out of the platform
for cognitive computing of the future. We’ve seen the tip of the iceberg. It’s getting a little bigger as it comes into
focus, but there’s still a lot more behind this. Now, one of the biggest learnings of the last
two years as we’ve worked with many of you and with clients around the world is
that everybody wants to work with Watson but not everybody is ready for Watson. One of the key problems is,
how do you sort through all of your information, all of your data? How do you get rid of the duplicates,
get rid of the conflicting information? How do you actually find it all? We built a portfolio of capabilities
across our information management business that helps organizations start to sort
through all the different silos of information from SharePoint sites to enterprise
content management sites, to social sites, to pull together that information
and to make it ready for Watson. These are the foundational technologies
that help speed up the time to value, help you get through the data,
sort it out, clean it up, so that it can be ingested by Watson. So, Watson Foundations becomes an important
part of how we help customers get ready and drive the road to the next
platform, next era of computing. Now, transformations, products,
where you expect us, ecosystem. Why an ecosystem? Because we can’t come up with all the ideas. In fact, when we started this idea,
Manoj and Steve started working with me saying, we need an ecosystem. I said, well, what kind of
ecosystem are you going to build? And we started going through it,
we started thinking through it. And we said, well, what ideas
are they going to come up with? And they said, we have no idea. Right? We have no idea what they’re going to
come up with, because that’s why we need them. We need entrepreneurs with bright ideas that
can imagine the future that we can’t see. So, we picked a handful, a few, really
close partners, to experiment with us, to play with us, help us figure out how
to build an SDK, how to build the tools that are needed to build an ecosystem. How do you build a content store? How do you get a developer cloud up and running? How do you test your application? How do you make sure this makes sense? That’s how you get started in an ecosystem. So, let’s run a little quick
video and then we’ll get started. [ MUSIC ] So, if you can take the best practice of a
lawyer and make it replicable through Watson, or the best practice of the best oncologists
in the world and help people in hospitals around the world to diagnose
patients, that’s really, I think, the true beauty of a Watson-based system. I think the biggest challenge facing
retailers and brands today is the fact that consumers are now digitally
connected at all times. Consumers are empowered like they never
have been before with information. If we are able to put this content into an
ecosystem, we’ll be able to make that content that we have usable in ways
that we haven’t even thought of. It’s interesting when you
show the ideas and concepts and the vision to retailers, they get it. There’s an instantaneous aha! moment when they understand that they
will…they can have this great experience, they can interact with their consumers in a
way that they’ve never been able to before, and that most importantly, the content
that they spent a lot of time developing, the content that’s out there, can really be
used in a way that it hasn’t been used before. The ecosystem itself is an environment
in which more innovation will occur and will help us understand even better how
we should be producing content in the future. Very few organizations, I think, have the
scale to really do something meaningful, and I think the fact that the ecosystem approach
allows and enables companies like Fluid to
tap into Watson is a great thing. We have no doubt that it will have profound
implications on how companies are organized, how customers are served, how people will work. And the possibilities are truly endless. [ MUSIC ] RHODIN: All right. So, what is an ecosystem? When we worked on this throughout the last part
of last year, we said the first thing you need to have is a developer cloud, some place
that people can log on, use the technology, develop their applications and test them out. The second thing you need is to
be able to get access to content. The fuel of a cognitive system. How does that content get into the system? They have to get added together. So, the environment makes it so that you
can pick the content, pick the system, put them together, start to
work on your application. But it’s not simply enough to put it
up on the Web and say, play with it. Making a pool of talent available
to the ecosystem so that we can really accelerate
how the ecosystem will build out, the speed it will build out, is another
key element of our ecosystem initiative. As Ginni said earlier, when we announced the
ecosystem environment, we had 750 entrepreneurs and companies approach us very, very quickly
saying, I want in, how do I start, what do I do? Then we had to start asking the usual
questions: are you ready? Do you have your data? You know, we start to go through the process. And we’re working through that with the 750,
but since we printed the charts it’s now 890. So, it is continuing to climb. And with the news of today, I have a feeling
we’re going to have a lot more very soon. Once you put up a number like $100
million in ecosystem investment, you get a lot of people’s attention. So, $100 million in equity investment as
Ginni talked about in startups and through VCs to help drive the ecosystem around it, and
500-plus technical experts in our talent hubs to help work and incubate those new startups. Now, the three partners that we worked with
initially, Welltok, MD Buyline, Fluid Retail, they all came up with ideas
that I wouldn’t have thought of, which is exactly why you do an ecosystem. Welltok in the area of consumer health
care, a personal health care concierge. MD Buyline starting to use Watson to help
make procurement decisions better in the B-to-B health care environment; and Fluid
Retail putting the expert personal shopper on the back end of e-commerce. Let’s hear from Kent, who was in the
video a minute ago, to talk a little bit about what they’re going to do at Fluid. [ APPLAUSE ] DEVERELL: Saving lives, improving
health care is a tough act to follow. [ LAUGHTER ] But I’ll talk a little bit about
how we’re leveraging Watson to really transform the digital
retail experience. Quick background on Fluid, who
we are, what we do as a company. We live at the heart of digital shopping. We’re all about creating great
digital shopping experiences. We do that by fusing technology, strategy,
design to turn shoppers into buyers, to try and get people to that
moment of conversion, that aha! moment when people say, yes,
I want to buy that product, have a great experience,
buy it, and they get it. We’re fortunate to work with some
great brands, leading national brands like Levi’s, The North Face, ULTA. They’re great innovative partners. Everyone is very excited
about what we’re doing here. I’m going to share some examples, show you
how we’ve developed some initial prototypes where we think it’s really
going to be game changing. First let’s step back and talk a little bit
about what’s going on in digital retail today. We all know e-commerce is massive, right? Eighty percent of consumers research and buy
products online today, pretty much everybody. Digital influences more than 60
percent of all retail transactions, and that’s really important, because
it’s not just about buying online; it’s about the influence
of digital, it’s pervasive. Sixty percent of $4.8 trillion in transactions
are influenced by the digital experience, and of course, it’s growing
at a phenomenal rate. But today, the digital retail experience
is really driven by three key things: price, convenience and selection. Quick show of hands: how many people
out here are Amazon Prime members? Wow, more than half. So, probably 75 percent of you are
Amazon Prime members. Amazon currently has over 20 million people paying
them for convenience, paying an annual fee for
the factor of convenience. So, that’s amazing when you
think about it, right? It’s really just changing: what do I want, and am I getting convenience out of this? Is there more to it? What is more to it? How do I get better satisfaction
out of this process? There’s problems. Convenience is great, but at the end of the day, most sites have a two percent
average conversion rate and a 60 to 70 percent purchase abandonment rate. So, why is that? We think it’s because consumers want real
advice, but it’s not available online. Can you find great prices? Sure. Is there great selection? Sure. Can I ship it to my house tomorrow? Yes. But am I really confident
in what I’m buying? We know that 60 percent of consumers
abandon their purchase process because they can’t get information. We know that 80 percent of consumers want
advice, want help during the shopping process. And we know that fully two-thirds of
consumers start their digital shopping journey with a specific product in mind,
which means they’re not using it as a discovery tool, it’s
not an advice tool yet. So, we look at that and we know technology
can really help change this equation. And there’s been an explosion, certainly. Big data, social media, personalization,
content proliferation. All those things are having a big impact. But it’s all sort of incremental;
nothing is really a game changer yet. They’re all moving the needle
just a small amount. We think that making sense of it for consumers
is the next big thing in digital retail: making it a more personal, relevant, intuitive
experience; moving from keyword-based queries to conversations; making it visual;
and again, providing real advice. So, when we think about the dynamic
of what makes a great retail experience, I
think we’ve all had the experience of going into a store and working with that great
sales associate, that great salesperson that really helps you find the product
versus the e-commerce experience. The great sales associate, they’re
personal, they’re proactive. They’re conversational. E-commerce, it’s very impersonal
and user driven. The great sales associate asks intelligent
questions and clarifies dynamically; e-commerce, very structured, linear,
form and keyword driven. The great sales associate educates
broadly and effectively; e-commerce, there’s a ton of information out there
but consumers have to dig for it. The great sales associate creates intuitive and
relevant upsells; e-commerce, very data-driven. That’s a good thing, it’s important to leverage
the data, but it’s just one tiny element. You’re not really understanding the
person when the e-commerce site upsells. The great sales associate interprets and
recommends in context; e-commerce, very limited recommendations usually
limited only to the site content. And finally and most importantly, the great sales associate makes
you feel good about your purchase. When you walk out of a store you’ve
had a great sales experience, you feel good about what you buy. You feel confident you got the right product. E-commerce, you’re on your own. You click buy, you put in your credit
card information and process the transaction. Is that really the right product? Is that really what I wanted? There’s no real way to know for sure. So, we see a world where you
get the best of both worlds. We see the opportunity to deliver a natural conversation; understand and learn about a consumer’s specific needs; provide access to product information in the context of the shopping experience; quickly assimilate multiple content
sources, go as deep as you need to, and then quickly resurface; and ultimately,
provide rich and relevant recommendations. So, our goal, teaming up with the Watson team, is what we call the Expert Personal Shopper. The vision is great: how do we bring that sales associate experience online? How do we make a more human
experience in the shopping process? The cognitive cloud can bring this to life. Through the Watson ecosystem, we’re able to leverage technology we would never be able to create on our own. Being able to put that to work for consumers in a meaningful way during the shopping experience is, we think, incredibly powerful. We call it Fluid XPS — Expert Personal Shopper. It’s the ultimate shopper GPS. I’ll show you a quick example. Before I get into the example, though, I’ll
talk a little bit about sort of my aha! moment when we started working
with the Watson team. And it was the beginning of the
summer, so just six months ago. We’ve made incredible progress in six months. But I was shopping, it was the beginning of
the backpacking season, summer camping season, shopping for a new sleeping bag
for my son, my 10-year-old son. He had outgrown his other sleeping
bag, I had to get him something. I was willing to spend some money to
get something that was a good product, was going to last, was going to be appropriate
for all the various uses where we need it. Went online, probably went
to five, six different sites. Filled out the forms. Put in some data: how much we wanted to pay, how much I wanted it to weigh, where we were going, what the temperatures would be. Got back, I don’t know, 15, 20
different suitable products. Started digging through the reviews. Tried to figure out, okay, well,
which one is the right one? What am I going to buy? Three hours later, got in my car, drove to REI. Walked in the store, talked to a
sales rep, told him what I needed. Five minutes later, I left with a product that
we were extremely happy with, great experience. Right? And that is the problem
we’re trying to solve. We want to make that happen. We want to be able to bring that to life online. So, we went to one of our customers,
The North Face, and we said, we’ve got this great opportunity,
this Watson technology. It’s really cool. We think it would really help
you with some of your purchases. We really think that there’s an opportunity to help change how people
want to shop for gear online. So what I’m going to show you are just some
examples of what we created with The North Face. There’s a full demo in the back; I
encourage you all to go take a look. It’s very exciting. We call it the Compass Gear Guide. Our goal here is really, again, to create
that expert personal shopping experience. How does somebody shop for a
product when they go into a store? Not when I go and type into the search box whatever keywords I’m trying to guess to game the system, filling out all the forms to get information back. It starts with a question: how can I help you? Very simple, straightforward, single screen. Not a lot of clutter. And I ask it a question, just like
I’d ask my friends or the sales rep. I’m gearing up for a 14-day backpacking
trip, what equipment do I need? Watson comes back, takes that information,
says, okay, 14-day trip, backpacking, what’s the kind of equipment we need? It parses through all the information and says, here’s a list of gear suited for a 14-day backpacking trip. What would you like to start with? I can then ask it a question. Say, well, I want to learn more
about technical packs for my trip. Please tell me about it. And so on and so forth. You see it becomes more of a conversation,
it’s much more natural, it’s engaging, and I can quickly go deep into products
and I can quickly surface back up. And it’s providing me relevant,
contextual information to what I’m doing. So, we think that that kind of shopping
experience is really changing the way people are going to shop. And ultimately, it’s about two things: natural, responsive and personal meets intelligent, intuitive and limitless. And we think that that is going to
be, you know, three years from now, we hope you’ll be shopping this way. Thank you. [ APPLAUSE ] RHODIN: Okay. So, three entrepreneurs, three ideas,
three examples, up and running, starting down the productization curve
and getting them out into the market. Exciting stuff. So, what other things can
get done in an ecosystem? What other ideas do people have? We’ve invited Terry Jones, co-founder
of Travelocity and Kayak, to talk about, what if we could actually
change the way we travel? You personally invented many
of the e-commerce travel sites. What could we do differently? Terry. [ APPLAUSE ] JONES: Thanks, Mike. So, I’m here to talk about travel,
specifically leisure travel. What is leisure travel, anyway? Well, I like this definition: travel is
the sherbet between courses of reality. Right? [ LAUGHTER ] It’s fun. It’s impactful. But travel planning, well, maybe not so much. Right? I mean, we moved from 19th century
travel planning where you went to see the agent to 20th century travel planning where we
do it all ourselves, and that was great. I mean, I had a lot to do with that. And it’s been very impactful. But 20th century travel planning
isn’t always easy. In fact, the average consumer
visits over 20 sites when they’re trying to plan a leisure trip. And that’s just not the right answer;
it’s kind of like the sleeping bag. Right? Now, I’ve seen a lot of change in travel. I started my career as a travel agent in Chicago
40 years ago, actually, as a receptionist. And believe it or not, the very first
reservations I made were extremely high tech. I sent telegrams. That’s how old I am. So, technology’s moved a bit since then. I mean, we’ve gone, first we went to
connecting computers, and that’s what I did at American Airlines with Sabre when
we automated all those travel agents with reservation systems and ticketing systems. And that was a great advancement,
but we still had to call or visit to get that information that we wanted. Then information found its freedom. It escaped. And we were able to do it
ourselves as travel went online. And that was powerful for a lot of reasons;
one is that prices became transparent. When Orbitz put up this price
grid, all of you said, I’m never going to pay a thousand
dollars for a leisure ticket again. You probably haven’t in the
U.S. So, a very powerful change. And we moved on with Kayak to connecting
pages so that you could search in one place and then go buy direct, and that’s meta search. And again, a pretty big change in travel. We moved to connecting people with mobile. At Kayak alone we have over 30
million people using our mobile app. So, travel’s come a pretty long way. We have all kinds of different bidding, auction and mobile portals, all kinds of different sites. And you may not be aware that travel
is the biggest part of e-commerce. The blue graph is e-commerce; this
is travel outside of e-commerce. Travel is larger than the next four
categories of e-commerce combined. Why is travel so big? Well, I think there are a number of reasons. One is, prices change quickly, seats
fill quickly, so, we need this immediacy. And plus, on the Web, of course, video can promote the emotion
to sell travel-related products. So, I think that’s why it’s
grown to be the biggest part of e-commerce, but something’s missing. And what’s missing is expert advice. You can’t get expert travel advice on the Web. You can get reviews, but you can’t get advice. So, here’s a secret, never revealed before. The guy who started the online travel
revolution, me, I use travel agents. [ LAUGHTER ] Now, I didn’t use them to
get here; I used Kayak. Okay? I don’t need that for a trip to New York. But I want an experience when I go on
vacation, and using an expert allowed me to cook with a one-star Michelin chef in France,
allowed me to stay in a boma in Botswana, allowed me to hire a guy who could read
hieroglyphics to my son when we went down the Nile in Egypt, allowed me to
make noodles in Beijing with my daughter. So, expert advice is necessary. Today, travel advice comes from
books, comes from newspapers, comes from TripAdvisor, comes from Facebook. Sometimes it comes from data culled
from past trips; not very often. And of course, it comes from friends,
and from conversations, and from agents. But online, everything has to fit in a box. You know, you have to know where you’re going, you have to know the dates,
you have to know the time. So, what do we do to change? Why can’t discovering my perfect
trip be as easy as a conversation? Well, maybe it can be. A simple entry to answer a difficult question. I think one thing you need to take away from this Watson meeting today is that we’re asking deceptively simple questions. But actually, we’re making very simple entries and asking really hard questions. It looks easy but it’s not. So, could you ask this question
today of any travel site? I’d like to go to a four-star beach resort
in January with my wife and two kids; it needs to have a great spa, kids’ activities and good restaurants. You could ask it, but you
wouldn’t get an answer, would you? [ LAUGHTER ] So we had a Watson demo — I don’t
think we have it here today — that said I want to go on an adventurous but
nice vacation with my husband and children. And the answer, with cognitive
computing, actually can be done. We did that, Watson analyzed 64 million
reviews, 15 million blogs, 7,000 guides. It understood adventurous
vacation, husband, children, so on. And it put it all together and it said, you
should go to Bali, with a very high level of confidence, 97 percent, that Bali is the place for you. But maybe that isn’t right. And because it’s about a conversation, how about
the ability to go back and say, well, yeah, we love the beach, but we want some time on dry land? So, then Watson came back and said, well,
you really want to go to Punta Cana. That’s something you can’t
ask any travel site today. If we do that, we can move
travel outside of the box. And frankly, if people are always
telling you to think outside of the box, maybe something’s wrong with the box. [ LAUGHTER ] And there is something wrong with the box. We have to move travel outside the box. Now, are there other things
we could do with travel? Sure. What about off-schedule operations? How many people here were disrupted last
week trying to move in and out of New York? And what were you doing? You’re on the phone. Your flight has been canceled, press two. I didn’t understand your response. We can book Dallas tomorrow. What were you saying? Agent, agent, agent, agent. Right? [ LAUGHTER ] Because these machines can understand what
you’re saying; they just don’t know what to do. So, what if you could say, I need to
be in Fargo tomorrow at eight a.m., and what if the system understood “have to,” “Fargo,” “tomorrow,” and “eight a.m.”? A revolutionary change in customer
service if we could do that. That’s something Watson can do. How about an easier way to use those miles? This is the Jon Iwata program. He asked me for this. How about the ability to simply say, I
want to use my miles to go first class with my wife to a romantic resort in January. Can’t do that today. So, I think if we take cognitive computing
and run it against these huge data problems that we have in travel, we will pick the lock
and turn data into advice and make travel advice as easy as the kind of conversations
we all have today. And maybe, just maybe, we can
revolutionize travel once again. Thank you. [ APPLAUSE ] RHODIN: I think that’s a great
lead-in to our next speaker. We’re going to think about what Terry
just said, which is, imagine the future. It’s limitless. The ideas here are limitless. The idea of being able to have an idea, a
cloud, some content, pull it all together, get some investment, get some expert help
to get an application up and running. Let’s talk a little bit about what the
venture community looks like today, how they think about what
we’re doing with Watson. Jean Sullivan. General Partner at StarVest
Partners here in New York. Come join me on stage and
we’ll have a conversation. [ APPLAUSE ] RHODIN: Hi, Jean. First, why don’t you introduce
yourself for everybody? SULLIVAN: Hello, everyone. I am Jean Sullivan, a co-founder and
general partner with StarVest Partners. We are a venture capital
firm here in New York City. We’ve been in business as a firm
with $400 million under management, and we invest solely in B-to-B enterprise software. We all have long experience in that. Even when people gave it
up, we said we’re staying with it all the way, and
as you can see it’s back. RHODIN: Thank you. So, as you look at what we’ve been talking
about today with an ecosystem around Watson and all the investment that’s going in from
IBM and actually in the world around big data and analytics, what do you see in the
venture community right now about ideas, and how can this idea of cognitive computing
really start to revolutionize your business as you think about where to put your money? SULLIVAN: Well, Mike, did you know
that many sectors are in slumberland? And guess what? Watson can wake them up. That’s what we believe, and it is so exciting. Let me give you three quick examples. One is the whole human capital management world. They’ve been sleeping, steeped
in paperwork and data. It’s really been quite confusing. But they have broken out just recently,
and Watson can just take them further. Huge amounts of data. And you know, I believe that Watson could
also solve one of our biggest problems — joblessness — because so many people have no
idea of the array of opportunities that are out there, and think of companies that
want the best people with the best skills. Talk about sifting through
large amounts of data. That’s the perfect role for Watson. I love that use case, don’t you? RHODIN: It’s a great use case. SULLIVAN: A second would be,
certainly you have heard today about some of the innovations in retail and retail infrastructure. And we really believe in that, as well as
incredible innovations in health care. We think that’s a great area. Thirdly, a huge number of actionable trends could be discovered from sifting through data in fintech. Certainly, what about insurance? And you have many great insurance customers. Imagine their problems being solved by
being able to normalize what’s going on with 51 state regulators and the issues of normalizing opportunity and regulation there. Those are three quick examples. RHODIN: So, one of the things that you have a
passion for is this idea of incubators, right? And how incubators can play
a role in the VC community. We talked this morning about the new home
for Watson in the middle of Silicon Alley where a lot of the startups in the area reside. How important is it for us to
have an incubator as part of that? SULLIVAN: Well, welcome to putting that
in New York City, because we are proud that we’ve been able to invest multiple billions just since 2008 right here in New York, making this one of the great tech centers. And we believe that IBM can continue to help these tech centers prosper. But we as a firm did participate in a
B-to-B incubator sponsored by the City of New York just these past six months. We saw companies enter as projects and
leave the program as high-growth businesses. That’s what incubators can do. We are strong believers in them. And that’s the kind of job creation
that incubators and accelerators and certainly this Watson incubator can create. This is very exciting. We are so proud you picked New York for this. This is great. RHODIN: That’s great. In fact, we just actually bought
a company, Xtify, a couple of months ago out of one of those incubators. It’s an interesting experience, because usually when we do acquisitions, we buy companies that
are a little bit later stage that actually have their
own office by that point. And the Xtify team was in shared office
space in that kind of environment and now they’re actually going to come
over and join us in Astor Place as well. SULLIVAN: Well, all this is great, because that’s certainly what a
venture capital investor wants. It’s a win for the company, it’s a win for
IBM, and certainly a win for the investors. This is what promotes innovation and jobs
for the U.S., and it is very exciting. But I’ve got one more idea related to New York. You know, thanks to our wonderful
just-outgoing Mayor Bloomberg, who created the Cornell-Technion center,
and I would love to challenge IBM to not only continue the work — which
I know you’re already doing with Watson and universities — but why not have
an East Coast/West Coast bake off and get that Cornell Center really cooking? I’d love to see that one. RHODIN: Well, we may take you up on that. That might be a lot of fun. I’m a big fan of fun. As you can tell with the Jeopardy
match, we do like to compete. So, how important is it to the VC community
when a company like IBM not only talks about building an ecosystem and putting tools together, but puts its money where its mouth is and frees
up capital to invest? SULLIVAN: Hey, this is critical. And you know, some of the
issues we didn’t talk about. If incredible scientists and innovators
can feed a million people a day in India, why can’t we go further and fix world hunger? I don’t think that’s so far out
of reach for Watson to work on. Certainly joblessness, I
believe in that to the max. And you heard today about potential
proactive cures around cancer. I mean, that’s what it’s all about. I see great, great, profound
changes that Watson can make. RHODIN: In the investment community
that you’re participating in, how are you seeing the influence
of these kinds of converging trends between social, mobile, big data and cloud? And how does that relate to what cognitive is going to be all about? SULLIVAN: Certainly solving
local problems with partnerships. I know that’s what IBM’s all about,
really innovating around partnerships. In many cities, as I said a few seconds ago, creating global centers of technology has a lot to do with creating,
again, jobs and wealth around the world. I think that’s exciting. And because IBM’s so intent on
making the world a Smarter Planet, what about innovations around energy? So, there’s so many different
areas that we can fix. I heard words today like profound and innovation;
these kinds of things are critical for success. And wow, how impressive that IBM’s
right there putting intellectual capital and capital to work. This is very exciting. RHODIN: All right, Jean. Thank you very much for joining us today. SULLIVAN: Thank you so much. [ APPLAUSE ] RHODIN: So, this morning we’ve
talked about transforming industries, we’ve talked about products,
we’ve talked about ecosystems. We’ve talked about imagining
the future with new ideas that ecosystem partners could start to build. But what happens next? What could Watson become next? As we think about it, we start to see
the circles populating around Watson. Watson needs to learn to do more than read. It needs to see. What do I mean by see? Images, video. The analytics around those will feed Watson. They’ll help us understand how to deal with
contextual information in a different way. We need to hear. And like most things, we’ll
also need to learn to listen. We also need to find a way to help the world
experience cognitive technology in a new way. So, I’d like to take a quick
look at a video from IBM Research and then invite Guru up to talk about this. [ MUSIC ] The problems that we’re facing now are too
complicated for a single human to figure out all by themselves, and so, the entity that’s
going to solve the problem is going to be a combination of humans
and machines working together to make a kind of integrated intelligence. One way of framing the core research question
is, how can people and computers be connected so that collectively they act more
intelligently than any person, group or computer has ever done before? At the highest level, you could
imagine a truly intelligent computer that would understand situations, would be
able to figure out explanations of events and be more inventive with respect
to scenarios, overcoming the limits of imagination and creativity. So, there is really no limit to
what computers ultimately could do. We have big problems to deal with as a
country, cost overruns, rampant inefficiency. And now we have tools that can help us deal
with that, where I think 10 years ago we didn’t; we had relatively primitive tools. So, I’m excited for the next decade. I think that we’ll be much better doctors,
we’ll hopefully get much better health care as patients, and some of these
tools will help us get there. I think when people imagine
machines and people working together, sometimes it’s a little frightening, this idea of having a computer help you think. But I imagine it being kind of like a
violin, that if you look at a violinist and violin together, it’s really
the violin that’s making the noise. But the violin and the violinist
are able to do something much more than either of them could do separately. I think the future is going to be humans
and machines working together like that, like the violinist and the violin. RHODIN: A nice little thought there at the
end of how these systems are going to continue to evolve to become collaborators,
as we talked about earlier. So, as we think about the future, IBM
Research will continue its groundbreaking work in the area of cognitive computing. We’ll continue to get the next
wave of things ready for my teams. And Guru Banavar is going to come
help us understand what the other side of the looking glass looks like. Thank you. Guru. [ APPLAUSE ] BANAVAR: So, as a computer science researcher,
a career computer science researcher, this is actually an inspiring day for me,
because many of the things we’ve been working on for decades are now mainstream, as you say, Mike. So, I’d like to give you a sense of all of
those, actually step back a little bit and talk about what’s been going on in this field for
quite some time and what is likely to happen. Now, we’re talking about decades scales here. So, Watson winning Jeopardy may have seemed like
it came out of nowhere, but really there’s so much foundational computer science underneath it. There’s a lot of work that’s gone
on in academia, but I’m proud to say that IBM Research has actually
invested for over a decade — actually, I would say multiple decades —
on the foundations of cognitive computing. That’s how Watson’s win on
Jeopardy actually happened. And if you look at all the fundamentals here,
like machine learning and question answering, knowledge representation, which is at
the foundation of cognitive computing, even experiential and interaction
modalities and all of those things, those happen through the great work
of a whole community of researchers. And I’m happy to tell you that I have some
incredibly bright and accomplished researchers who work with me to make that happen. And that’s the team that created the
Watson system that beat Jeopardy. I’m also thrilled to say that we are going to be
not only accelerating all of these innovations that go into the new unit that’s announced
here today, but we are going to be expanding, and we’re going to be focusing our investment. Almost a third of IBM Research is going
to be focused on cognitive computing. And we’re going to be generating the next generation of all of the foundations, applications and technologies that will keep this going
set of applications that we’ve heard about today from the previous speakers. So, I want to give you a few examples
of the kinds of things we’re doing. And in order to appreciate the examples, I
think it would be good to, again, just step back and think about how humans do cognition. When humans do cognition, you
first have to sense the environment around you, understand what’s going on. You have to get very good at recognizing
patterns of what’s going on around you. And you have to then be able to reason about
things that you have seen patterns for, and then you have to be able to
go into the art of it, which is the creative exploration
and discovery of it. Now, I’m going to give you examples of all
of those four things that I just mentioned. There’s a number of other things that
are foundational to cognitive computing and as we’ve explored in
neuroscience and other areas, but I’m going to use those
four fundamental faculties to tell you what we’re doing in IBM Research. So, first, the ability to give Watson
the power to see is that of learning from a very large collection of
images and multimedia information — videos, audio, animations, if you will —
all kinds of information that is not textual. And we’re not just talking about
understanding the metadata that’s associated with these images; it’s understanding the content. And it’s not only understanding, it’s learning from the content over time. So when you look at an application like looking for anomalies in an X-ray or an MRI or any of the other image
heard about, it really takes a huge amount of expertise on the part of
humans to be able to do that. And the accuracy can be greatly improved when
you adopt a tool that has learned the anomalies over time through a large number of data
sets and through training from human experts. When you adopt those as assistants, you
can improve accuracy, improve productivity, and you can in fact get into much more
real-time analysis and diagnosis of many kinds of conditions that we cannot do today. That’s going to be the power of “see.” Next, if you look at how we perceive patterns
— again, going back to neuroscience — there are these fundamental blocks of structures
in our neocortex which actually hardwire things that we see as patterns around us and other
kinds of knowledge that we’ve gained over time. Now, when you get to the scale of data and
when you get to the amount of these patterns that we need to learn and
we can learn going forward, our traditional computing
architectures do not work anymore. We call those traditional computing
architectures von Neumann machines; these have been around for half
a century now, and we’ve all sort of grown up on those architectures. But we believe and we are proving that in
order to get to this new world of huge data and cognitive capabilities,
you need a new architecture; we call those architectures
neurosynaptic systems. And this is a fundamental
rethinking of how computing happens. It actually mimics what happens in the brain
through neurons and synapses, and these patterns that I’m talking about are
actually the fundamental way in which you specify what these systems do:
they learn, they interact, but it all happens because they understand how
to work with patterns. And that’s this ability. This is actually a very long-term
project that we are doing, and it’s a breakthrough in computer science. And this, we believe, will also become mainstream in the future, and we look forward to enabling many more applications
as we go forward. You’ve heard a little bit
about the power to reason. But I would like to maybe dig a little bit
under the covers here. When Watson answers a question, it tells you the answer; it tells you what the answer is. But the question of why that is the correct
answer is fundamental to many professions. Like the medical profession, a practicing
physician needs to understand the logic, the reasons why anyone, whether it’s a human
or machine, is giving a particular answer. And they need to analyze it. They need to make a judgment about
whether that sequence of logical steps and the evidence underlying that
sequence of logical steps are appropriate for the particular situation, because there’s
a huge amount of judgment involved here. So, when you look at the technologies like
Watson Paths that was discussed earlier, it gives you the ability to formulate hypotheses
or maybe even specify arguments that says, well, what if I did this; would that be
supported by any existing literature? Or what if the patient does not want to do something? Or what if the patient’s condition requires that we explore something brand new that we did not know about? What would be the potential options, and what would be the consequence
of each one of those options? Those are the reasoning methods that are
actually pushing the boundary not only in our lab but in academia and in the broader
scientific community, because those are very, very difficult questions and it really
requires deep science to think about. And when you look at the extreme or the
ultimate way of doing this reasoning, it becomes interactive reasoning,
because you are having a discussion, even a brainstorming session with somebody,
and you’re saying, what if I did this? Would that work? And somebody says, oh, you know, these
are the reasons why it might not work. These are the reasons it might work. And you pick up on one of those reasons, and
then you start digging deeper and so forth and it becomes an interactive dialogue
between a human and a cognitive system that can assist the person to
really do what they want to do. And we’re building systems
like that in the lab today. You know, we will have other use cases
where we can demonstrate this capability. And I want to get now to the creative
portion of what cognitive systems can do. Right? It's not about
knowing precisely what the question is; it's about exploring what the
possibilities are, what the adjacent ideas are, what the things are that I haven't
even thought about. What can I discover from things that I
know may be plausible, but where I'm not sure about the consequences
and the evidence? And the examples I
use here are those of, let's say, chemistry, biology, and materials science. You may end up needing a combination, a cocktail
of chemicals for a particular medical condition. And we may not be able to imagine what
those are without the help of a system that can pull various different pieces
of information across the entire space of information and put it all together and
suggest something new that we never considered. Same thing with metals. For specific applications, you may need a
combination of elements from the Periodic Table that you may not have considered
before, and those kinds of alloys or materials can be discovered through
this exploratory brainstorming technique that cognitive computing can enable
even human experts to do. That actually brings me to
the idea of innovation. Right? Creativity, innovation, we’ve mentioned
many, many times that we are going to be opening up this whole platform and the APIs
for the broader community to engage and build new cognitive applications. We call them COGs, by the way; that's just
new terminology that we're beginning to use. But in order to be innovative, we absolutely
need those innovators, those humans, who will be working with these cognitive
systems to build all sorts of applications. You know, we of course proved the
technology out with our clients. That’s our step one. We’ve been doing that, we
will continue to absolutely do that in partnership with our clients. We'd also like to encourage
the broader community of partners and academics to work with us. And we especially encourage the millennial
generation, who have so many great new ideas, different ways of doing things, new kinds of
behaviors, all the different kinds of ideas that have sprung up in the last few years;
they can come in and imagine brand-new
ways of interacting with cognitive systems and making their own lives
and lifestyles better. You know, wellness is a great example of this. So, we can think about how all of those
kinds of applications can be built quickly, iteratively and at Internet scale. I look forward to collaborating
with all of you to do that. Thank you very much. [ APPLAUSE ] RHODIN: Some great ideas there. This idea of Watson Digital Life is
something that we think is really cool. It’s not just enough to open up to an ecosystem
of people that are going to build applications; we want to let some of our best and brightest
researchers put ideas out into the wild, if you will, to let people log
on, play with them, and try things. One of the examples out in the back
is our Chef Watson that can generate recipes for you based upon your dietary
restrictions, novel and unique recipes. And I will have to admit, I’ve eaten
many different recipes from Watson. They’re all edible. [ LAUGHTER ] Some of them are very good. [ LAUGHTER ] But we even fed them to our investors one day. So, it was a safety test. But it worked out pretty well. So, our Chef Watson will be one of our
premium capabilities in Watson Digital Life as we make that available later this year. So, it’s going to be a place for the general
public to log on, to play, to experience and understand the possibilities of
cognitive computing as we go forward. There are many new research activities going on. Guru Banavar talked about a very brief
piece of what we're doing here. A third of IBM Research (IBM
Research is a 3,000-person organization) is dedicated to the next generation,
Watson 3.0 and beyond. A great set of capabilities that
complements what we’re talking about here, complements the investments we’re making. So, today we’ve heard a lot. It’s been a long morning. I appreciate everybody’s patience
as we’ve gone through this. I think everybody understands
the level of excitement we have for what this next generation
of computing is going to bring. I think you've seen it from all
of our speakers as they've talked about what they believe the future holds. And we're really, really
excited to work with all of you on how we make the world a better place. Thank you. [ APPLAUSE ]
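The combinatorial discovery idea in the talk above, finding combinations of elements from the Periodic Table that human experts may not have considered, can be illustrated as a brute-force exploration over candidate mixtures. This is only a toy sketch of the exploratory idea, not IBM's actual method: the element list, the property and cost numbers, and the scoring function below are all invented for the example.

```python
from itertools import combinations

# Hypothetical per-element scores; a real discovery system would derive
# these from literature evidence and physical models, not a fixed table.
ELEMENTS = {
    # element: (desirable-property score, cost score), both invented
    "Fe": (0.9, 0.2), "Ni": (0.8, 0.5), "Cr": (0.7, 0.4),
    "Ti": (0.6, 0.8), "Al": (0.5, 0.3), "Cu": (0.4, 0.35),
}

def score(combo, weight=0.7):
    """Score a candidate alloy: reward the property, penalize the cost."""
    prop = sum(ELEMENTS[e][0] for e in combo) / len(combo)
    cost = sum(ELEMENTS[e][1] for e in combo) / len(combo)
    return weight * prop - (1 - weight) * cost

def explore(k=3, top=3):
    """Enumerate every k-element combination and return the top candidates."""
    candidates = combinations(sorted(ELEMENTS), k)
    return sorted(candidates, key=score, reverse=True)[:top]

for combo in explore():
    print(combo, round(score(combo), 3))
```

The point is only that exhaustive or guided search over a combinatorial space can surface mixtures a person would not think to try; a cognitive system would additionally attach evidence and consequences to each candidate.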

Paul Homan, CTO Industrial, IBM UK – discusses Industry 4.0 – From Exploration to Adoption

[Applause] So yes, I'm Paul Homan, and I was very glad to hear you again refer to the Northern Powerhouse, because I've come over from Sheffield this morning. I must just say, though, even though I have come over from Sheffield, I am really cold up here; I don't know what it's like down there.

What I wanted to talk about is a slight change of tack, actually. As a CTO, my role in IBM isn't to kind of big up the possibility, but actually to worry about the delivery and the sustainability of the solutions that we provide. So I want to talk a little bit about what that means for me and what I think the big challenge is; hence I've titled it "From Exploration to Adoption". I want to talk about how we move from those first tentative experiments through to scaling.

So, just a little bit about me. As a child there were three things I was really interested in: one was nature, one was machines (especially excavators, as with all kids), and one was explorers. And whilst I could talk about biomimicry, or robots and collaborative robots, to capture some of those intersections, there's a little area I wanted to reference, which is the interplay between industrial machinery and explorers. There's a book written 90 years ago that I'm going to use as a bit of a theme and reference for some of the lessons when we look back at what we're looking at today, and that book, written by Maurice Holland, is called Industrial Explorers; hopefully you'll see why I use it as a reference point. There's also a small quote there from a Robert Browning poem, part of which goes: "a man's reach should exceed his grasp, or what's a heaven for?" The idea behind that particular quote, and the way it's often used, is that you should be reaching slightly further than you're comfortable with in order to achieve what may seem impossible today but will be tomorrow's reality.

So, Industry 4.0: what currently exceeds, in this room, our grasp? What is slightly beyond what we're trying to reach? I've put a few terms up, and I could have put many, many more, but the idea was just to put some things up there. I'm sure some of these are seen as enablers and some are seen as issues. Is it some of those that we struggle with, or is it actually the totality of them, the sum of them, if you will? And the next question is: do we believe that's beyond everybody's grasp? Well, from some of the stories we're hearing, clearly not; some people are starting to be able to take advantage of some of these things.

What I've put up here is a very, very simple value stream for manufacturing: you think of an idea, you design it, fabricate it, assemble it, and then test and commission it. We've heard the term "digital twin" quite a few times, and one of the challenges I want to put to you is thinking about what the digital twin does for us in manufacturing, a challenge which is double-sided, and I'll come back round to this at the end. The point I'm making is: we get the idea of a digital twin, a digital copy of the physical product that we make, going end-to-end through the entire product lifecycle, but it also allows other people to cherry-pick parts of that value stream. That's the challenge I want you to have at the back of your mind as we go through this particular presentation.

Now, I wasn't going to major on this, but you'll hear it in other areas, I'm sure. These are some of the common lessons and common ideas that people talk about when they're talking about Industry 4.0, Made Smarter, or whatever: yes, you should start now; you should start experimenting; don't be afraid to fail; there are things that everybody at all levels can do to start off the journey; and you should start with small, discrete projects, preferably things that give you some form of data point, help you with your datasets, and improve them going forward. However, the kinds of problems that I deal with are not about experiments and not to do with how we try little things out. It's actually: once you're convinced you want to do something, how do you do it so that you can bet your whole business on it, where you can't afford to fail?

So, coming back to that book I mentioned by Maurice Holland, Industrial Explorers: you can take great heart from the fact that of the 19 organizations listed in that book from 90 years ago, 78 percent are still in existence. Now contrast that with only 60 years ago: only 12 percent of Fortune 500 companies listed 60 years ago still exist. So clearly you could infer that those companies are doing something right. Now, what were they doing; why were they in that book? They were researchers; they had lots of areas where they were trying to innovate, and they were trying to bring new practices and processes, in the way they made things and in what they made, into their organizations. Now, was it about the technologies? Well, actually, I've read what's in the book, and I'm not going to show you; there are some rubbish ideas in there. It was the fact that what they were doing was a cultural change; they were willing to invest in looking forward, most definitely. But there was also something else in there, which is where I'm coming from, and this will be no surprise to anyone who knows me: they had an underpinning architecture that allowed them to take those ideas from just an invention in the research lab and scale them into their business.

Now, I've spent 30 years (so not much different from Lukas) in this space, and I have condensed those entire 30 years of kicking architectural tyres, trying to make things work, and sorting out problems at scale into just three words. This is my gift to anybody; I tell every architect I ever mentor that these are the three things, and the only three things, they need to worry about: viability, integrity, and extensibility. If you take nothing else, take those back, and make sure that anyone who gives you your solutions understands that those are the three things. They are a recipe for success for scaling.

So, what do I mean by that? Viability: simply, will it work; will it do what it needs to do? That may sound simple, but when you look at it from an Industry 4.0, Made Smarter point of view, two areas really jump to mind. Integration: integration is not about plugging together technology; we can do that, that's not an issue. Integration is about making sure that the data within the things that you put together plays nice. Think about it like different paints: if you start mixing them up, you don't just want a big brown muddy puddle, like you used to create in kindergarten if you mixed everything together. It has to be compatible; it has to work together. So integration is absolutely vital. And location: this is often forgotten, it is very, very relevant in a manufacturing environment, and it is wonderfully non-trivial. We all like the idea that we could potentially know exactly where somebody is on a shop floor; we could know where an autonomous vehicle is in relation to them; we can do all this kind of mapping. Wonderful. And we think about that on a phone, and you could walk outside here, up and down the street, and it's easy to make that assumption. However, your phone assumes you're at ground level, not somewhere flying in the air. It also assumes you're near somewhere you're allowed to go, a pathway or a road and such, so it's approximating, and of course it's outside. Now, most manufacturing occurs inside. There are large lumps of metal around, large lumps of concrete; sometimes it's underground, in confined spaces, and so on. And the fidelity you need to make sure that a person and a forklift truck don't cross over obviously has to be very, very high. So location, and sensors, and the ability to worry about location in your viability is a very, very key point, and often overlooked.

Integrity: integrity in any solution really means, will it do any harm; will it do any damage to the stuff that's already in there? So when you put something new in, can you ensure the integrity of the processes; can you ensure the integrity of the data? As well as that, within Industry 4.0 it's also about the security of it: when you open stuff up, you need to know who's accessing it and how. One of the key points on top of that is that when you do this you will create a lot of data, a lot more than you can normally handle. So how do you actually make sure you can make use of all of that? That's part of the integrity as well: to make sure you can exploit it correctly means you have to have layers on top that can do that kind of analytics.

And the third part, around extensibility (and I'm going to use this as a framework as we go through some examples), is really around platform. For something to be extensible, you have to have a common platform that doesn't close off doors, doesn't close off avenues, and can cope with future change. If you make any complex product, you'll understand that concept. And the other part, and we've heard this a few times already today as it's become a pervasive term, is ecosystems. Now, what I mean by ecosystem isn't just a random collection of people coming together and having to transact; it's actually a deliberately organized unit that is able to work going forward, and an ecosystem has to be able to swap different partners in and out as it goes along. That's not something that just happens by accident.

So, I've got three examples, and I've got a challenge I wanted to build on. I have to say, I really wanted to put some UK stories in there; it would be great if next year the stories all had to be UK-only stories. I think that would be a big, big challenge, and I actually think it's something we should consider very keenly in this room. The reason for picking the three stories across the piece is another three-pronged mechanism I use, around product, enterprise, and ecosystem. I look at industrial manufacturing and basically divide it into three things: the product that we make; the enterprise, the place where we make it; and the ecosystem, the place where the product operates. And so I've got one story for each of those spaces, just to show how I think Industry 4.0 affects them.

So, the first one: KONE, an elevator and lift manufacturer, had a huge install base; they maintain and look after a large number of lifts. Using the viability, integrity, and extensibility assessment, a couple of things I'd draw out. Viability: could they make it work; could they connect up the one-million-plus lifts that they've got? Yes; that was a test we had to go out and do, absolutely; that was the integration element, within a six-month period. One of the key things behind this, though, around the integrity, was around the data, and the fact that, frankly, putting that data into an IBM cloud could in theory mean that IBM has got access to and understands people flow, the movement of people. That data is KONE-owned, so they entirely own that data, and that's really, really important to them. And I urge you to be aware, when you have that insight, that it's highly valuable: understanding your information, and the insights from that information, is key to these services going forward. And talking of services, the last part, as an example around extensibility: one of the things KONE were keen to do was to build a developer ecosystem so other people could use their platform, and there's a cab-hailing firm that has built an API service so that when, in your apartment building in the US, you press for a lift to go down to the ground floor, at the same time a cab is hailed and is waiting for you as you get outside at the ground floor, or street level, or whatever they call it in the US.

The second one is anonymous, I'm afraid, so I can't refer to the individuals, but this is an automotive company that was taking 150,000 photographs a day in order to look at paint scratches and blemishes. Using the viability, integrity, and extensibility points here: the key thing about the integrity we wanted to look at was, could you repurpose that same data and use it for something else? In this case, doing visual inspection on a task, to check whether or not certain things were performed in the right way and certain tasks were completed. In testing out the viability, the question was, could we get to greater than 95 percent accuracy on that visual inspection? That was tested in the small and then scaled, and the point about the scaling is that it took three and a half days to train up that capability for that use case. Now, that's really important, because it's normally at least 14 days for a new use case; three and a half days per use case means it suddenly becomes viable to add more and more use cases on the same data and extend it across the entire shop floor and more operations.

And the last case I'd like to reference was Maersk, where there were a lot of different parties involved in the chain. I think this is quite relevant because it was about having some form of neutral access that everybody could get to. They wanted to produce a blockchain proof of concept based on an open and neutral platform that had trusted data in it, based on this idea of smart contracts. That has now been produced and, subject to regulatory review, is a joint venture that we've gone into with Maersk, which has a whole series of parties signed up, and that provides that element of extensibility.

So, just to recap on the viability, integrity, and extensibility part: these are the highlights of what my testing framework is used for, to ask, how do you take these from ideas and actually do something with them; how do you scale these and do something serious? I won't read through the whole thing all over again, but just to pick a couple: connectivity is key. Connectivity was key in aftermarket situations, in other words with a pre-existing install base, and then there's the ability to develop new services and reuse existing data, because obviously most of these are not starting from a greenfield situation.

So, one thing I said I'd come back to was that point about the unicorns and the digital twin. This is a chart I've taken from CB Insights. What it is, over the last nine years or so, is a chart of what are called unicorns, and apologies if you're not familiar with that term: essentially these are startups that have gone to a billion dollars or more in valuation. Nine years ago there were virtually none, and you can see that nowadays (and it only goes up until last year) there are a lot. You'll recognize the Airbnbs and Spotifys and Ubers early on, but there are many, many more there. One of the things these organizations have in common is that they have been able to find something digital that has challenged their individual markets. So if I take an Airbnb or an Uber or whatever else, we can look at the platform they've built and the digital equivalent they have created. That digital twinning concept, for me, is not a million miles away: there is a potential unicorn sitting somewhere who can own just the digital twin copy and never touch metal. And if you think that's a little bit too far-fetched, you only have to look at the construction industry to see how close it is.

And so, with wonderful timing that I wish I could claim, just this week IBM has released its C-suite study, where we've interviewed twelve thousand or so C-suite executives. If anybody wants to know more about it, then please see me or someone on the stand and we can get you access to it. The key thing about it, one of the key messages, is that the report is called "Incumbents Strike Back", because basically it's recognizing that people see this potential threat as real, in all sectors, and it will come in manufacturing as well. It's about how they are gearing up and what they are doing in response, to be able to scale solutions and to ensure they don't get left behind and become one of those 7 percent that I referred to earlier. So, thank you very much.
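The KONE extensibility story above (press for a lift, and a cab is hailed to meet you at street level) boils down to a small event-driven integration. The sketch below is hypothetical: the types, function names, the four-seconds-per-floor estimate, and the stubbed dispatch call are all invented for illustration, and do not reflect KONE's or any ride-hailing firm's actual APIs.

```python
import time
from dataclasses import dataclass

# Hypothetical event emitted by a connected lift platform.
@dataclass
class LiftCall:
    building_id: str
    floor: int
    timestamp: float

def estimated_descent_seconds(floor, secs_per_floor=4.0):
    """Rough time for the lift to reach street level (floor 0); invented figure."""
    return max(floor, 0) * secs_per_floor

def on_lift_call(call, dispatch):
    """Connected-lift event handler: time the cab to the rider's descent."""
    eta = estimated_descent_seconds(call.floor)
    # `dispatch` stands in for the third-party ride-hailing API client.
    return dispatch(building_id=call.building_id, eta=eta)

# Stub standing in for the real dispatch service.
def fake_dispatch(building_id, eta):
    return {"building": building_id, "pickup_eta_s": eta, "status": "booked"}

booking = on_lift_call(LiftCall("tower-1", floor=12, timestamp=time.time()),
                       fake_dispatch)
print(booking)
```

Passing the dispatch function in, rather than hard-coding it, mirrors the platform point in the talk: an extensible platform lets the ecosystem swap partners in and out without changing the lift-side logic.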

Transform Insight into Action with IBM’s Industry Analytics Solutions

ladies and gentlemen please welcome alistair Remy thank you everybody for joining us for what I think is going to be some very very exciting content as we take a next step in the journey on analytics the value of an idea lies in the using of it and that's the heart of the issue we find with analytics and the data market today we need to make insights actionable there's really no point and more importantly there's absolutely no value in discoveries that never see the light of day that never change your business every day we hear more and more examples of the profound impacts of analytics that get applied to nearly every class of business problem yet by our estimate less than 10% of organizations have made systemic progress in applying predictive analytics to their problems we know in survey after survey of organizations that they believe analytics leads to increase performance in fact more than 70% of people that have made this journey believe that analytics is not only helpful but that it leads to competitive advantage the adoption of advanced analytics should be much higher clearly there's something holding us back and as the leading analytics vendor we think we have a pretty deep understanding of what these challenges are comes in a number of ways getting started can be too complicated finding the right experts finding the right data finding the right approach thinking about which problems to solve feels very custom very specialized for many of our clients we know that when people get into these projects they require tremendous skills data scientists and more and that no matter what we do to progress the education programs and career development programs that we're going to shop we're going to suffer a chronic lack of of this skilled resource if we don't change something many of these projects feel too customer I feel looks like everybody is starting some of their analytic endeavors from whole cloth and then once they've gotten to their first set of insights they 
realized they now have to deal with an ongoing maintenance effort to keep these systems that are very much active in living systems current finding data internally is clearly a challenge but that's only the beginning we need to help people with all data how do they take the data that they've got get it into the right structure to deal with productive and predictive analytics but even more importantly than that how do we start to bring more data to the party how do we start to look at external data data that might be something not existing easily within the organization things like weather or social data and help make that part of the calculation a part of the prediction because it can be game-changing and then lastly the analytic insight is the catalyst for change but taking those insights and connecting them to systems of record to change the way supply chain or maintenance or customer care or merchandising happens is the ball game in terms of economic benefit and we have to make those insights more actionable more quickly now at IBM we think we've been leading and solving these problems we've delivered the market leading analytics platform and created the Best of Breed technology in Big Data data management information governance information integration enterprise content management business intelligence predictive analytics and of course cognitive computing and after more than 50,000 analytics engagements around the globe in many industries we've learned how our clients typically focus around key projects in select and very understandable domains almost always around what they do around customer understanding assets and operations finance and risk management and of course safety which we put under a safer planet environment are the key use cases so we integrated aspects of our analytics platform to address these challenges and we combined a set of initial capability around these domains and had some incredible success but today today is the day that we want to 
take another very big step forward and I'm thrilled to share with you today that we're announcing a major commitment to the development and delivery of the most comprehensive set of industry specific analytic solutions ever these solutions are designed to deliver transformative insights to line of business decision makers they're a combination of our very unique blend of industry knowledge and technology leadership across analytics and data now before we share the details of these solutions we thought it would be valuable to have an outside perspective and we're delighted to have Rebecca Wedeman here from nucleus research to share her perspective on the need and the demand for this kind of a transformative approach to delivering analytic value I'd like to welcome her back up to the stage thank you so I'm delighted to be here today to share a few minutes and a few insights and what we've seen that nucleus research as an analyst firm focused on the return on investment from technology we started about 15 years ago yes when I was 12 looking at the return on investment customers achieved from technology and over those past 15 years we've published more than 500 return on investment case studies it had a lot of opportunity to look at what drives value how value has evolved in the analytic space and also what are those barriers to adoption what are those barriers to increasing the value of analytics now the good news is what we've seen is that the returns have gotten better if we look at technology from both IBM and its competitors over the past few years returns have grown from $10 and 66 cents for every dollar spent in 2011 to 13 dollars in one cent for every dollar spent in 2014 obviously this is because of lower initial cost and faster time to value but also greater usability and as we dug further into the deep data we were really looking at what are those less obvious elements driving an increase in returns if we look at the benefits of being analytic what we find 
is over time the nature of deployments have evolved as well as companies move from automated to tactical to strategic to extended analytics deployments a couple of things happen usability drives broader adoption from automated one project to tactical one team to strategic and even extending beyond the enterprise and beyond internal data sources as companies move from tactical strategic they also move away from just back we're looking to predictive to forward-looking predictive insights new opportunities for business growth so as we were looking at this data last year we said what's next what can vendors do to deliver more value to help customers move to that next stage as an analytics Enterprise and the answer for us with vertical expertise in fact in our predictions we published in the fall we suggested that commoditization in the analytic space increasing competition would drive vendors to deliver more value to drive more prepackaged vertical solutions to help their customers now if we talk about the actual data and looking at the value of pre-built industry analytic solutions what we found is that it could accelerate time to value by an average of 15 57% reduce initial consulting cost by 65% shift those consulting costs to much more strategic thinking and change management initiatives and reduce ongoing support cost and maintenance cost by one-third now that's really important because with 70% of IT departments today reporting themselves as resource constrained and finding and retaining diet negative scientists as we know continues to be a growing challenge we needed to do something to drive greater value if we think about the traditional analytics deployment model it really kept companies from advancing to the next stage of the analytic Enterprise because of the cost and complexity associated with the deployment companies really took a stair step approach to upgrading expanding adding new dayss data sources adding incremental new projects because of the 
imparity of those interdependent solutions custom coding and integration has to be retested debugged QA with each new initiative constrained resources risk of disruption and just the cost of managing the ongoing application becomes a challenge for companies effectively losing application value because they're not able to totally take advantage of the innovations that vendors are delivering what we find with prepackaged solutions is they not just accelerate initial time to value but enable more frequent upgrades reduce the data tactical management and let the vendor focus on the complexity so our data scientists can focus on innovation driven customizations prepackaged solutions reduce risk they accelerate time to value and create more predictable time to value because more of the complexity is managed by the vendor we find it instead data scientists can focus on the next opportunity for innovation so as we look at industry solutions moving forward we really see an opportunity to both accelerate time to value maintain that time to value and enable customers to maximize the value they get from their analytics investments over time by closing the analytics application value gap and that's why we're excited to be here today to share this data but also to look forward at IBM's industry vertical solutions thank you very much now I certainly agree with Rebecca's take on the market and I think we're very much uniquely positioned to lead in what is going to be a new era of analytic solutions and I think what's going to really make a difference about these is a combination of unique things deep industry knowledge you are our ability to truly understand the problems and the complexities and the ecosystems going on in a number of different industries it is an important in a critical ingredient market leading predictive analytics you know many many firms and assessment have over and over again put IBM at the very top of vendors delivering predictive analytics packages and we 
think that's a core technology to make this possible the nature of moving from descriptive predict to predictive can't be underestimated scalable real-time analytics more and more we're going to hear about not only big data but fast data finding the insight in time to make an actionable difference key differentiator support for not only multiple data types moving from structured to inclusion including the mass of unstructured data is critical but also starting to build in access to third-party data sources this is why we've created landmark partnerships with companies like Twitter to bring in social media a real-time voice of the marketplace and The Weather Channel which has turned out to be an incredible resource to help improve analytic models but making all of that powerful data consumable in the context of solutions global cloud capability we know that when we're delivering capability to business people often one of their preferred ways to consume is as a service through the cloud and our very unique approach to cloud where we not only have you know one of the most complete and powerful cloud infrastructures but we also have one that exists in more than 40 different locations around the world allows us to deal with complexities of locale and regulation while delivering all the speed and ability of cloud and then lastly the ability to understand and be able to execute on delivering security around what is inevitably sensitive data so whether it's security in the context of people delivering in their own environment or security in the cloud we know that this is a critical critical aspect so some pretty unique things that we're building on to deliver this new generation of solutions now I'd like to introduce mark Anders mark is the vice president of analytic solutions member of IBM's industry Academy and has been helping lead the team to build these solutions and he's going to take us through some detail in the announcement mark thanks aster so I am very excited 
to be here with you today to talk about IBM's plans to deliver a new breed of industry-specific analytic solutions. These solutions are focused on delivering predictive insights to help companies answer their most critical business questions. Now, why are we calling these a new breed of industry-specific analytic solutions? Well, let's talk about what we're doing here. The first thing is that we're including pre-built end-to-end capabilities. This includes out-of-the-box analytic models focused on very specific use cases within individual industries. We're also including predefined data models and connectors to typical industry-oriented line-of-business systems, and we're providing out-of-the-box interactive apps and dashboards, specific to individual roles within an organization, to deliver those insights. On top of that, IBM is making an unparalleled investment: we have aligned over 5,000 people to develop and deliver these industry-specific analytics solutions, and we're introducing twenty new solutions today, with more to come. Beyond that, what we're providing clients with is codified expertise. We're embedding the experience gained from those 50,000-plus analytics applications that Alistair talked about, and we're working with some of the leading innovators in the use of data and analytics in their respective industries as signature design partners. These signature design partners are working with us to ensure that we're addressing the most critical business requirements, and are even working with us to optimize our predictive models based on real-world data. IBM is working with innovators in every industry as signature design partners to build a new breed of solutions. In retail, merchandising decisions impact profitability; retailers must understand which products influence the sale of other products. We are working with retailers such as Urban Outfitters, an innovative specialty retailer. They offer lifestyle merchandise to highly
defined customer niches, and it is critical that they understand their customers' buying behavior for merchandising strategy. Every bank says they are customer-centric, but most don't use all available information to better understand customers. Working with signature design partners like Bendigo and Adelaide Bank, which is aiming to be Australia's most customer-connected bank, we are gaining increased insight into customers based on their behaviors, including their spending and interaction patterns. So I am extremely excited to introduce to you now the first 20 industry analytics solutions that we'll be making available. As you can see, these solutions span 12 different industries and address all of the key domains that you heard us talk about earlier. Now, we're going to go into a little more detail on some of these, and you can find out more about each of these solutions on our website right after this broadcast. But before we take a closer look at some of them, let's talk about exactly what we're providing in this new breed of solutions. To begin with, we've created a predefined set of industry-specific data models relevant to each of these solutions, along with connectors to industry-relevant source systems. This includes interfaces to SCADA systems to capture machine-generated data, integration to core banking systems so that you can automatically extract spending and transaction data, and even to T-log files to take transaction data directly out of retail point-of-sale systems. This will enable clients to streamline the collection and preparation of data for analytics. And of course, we've already pre-built the analytic models, each very specific to its industry and use case, and all of them focused on generating predictive insights. We've even predefined the key business metrics for those use cases, and to help you deliver these insights directly to line-of-business users, we've included out-of-the-box dashboards and interactive applications,
some of which you'll get a chance to see today. We've also provided a set of APIs and services so that you can integrate these insights directly into your core business processes, and we're providing pre-built integration into some very popular marketing and asset management systems. As an example, we're providing an out-of-the-box integration into IBM's ExperienceOne platform, so as you generate insights about customers, those can be used to determine who should be targeted for a campaign, automatically generate an offer, or even define what a customer sees when they go to your website. There's also pre-built integration into IBM's Maximo asset management solution, so as you identify a potential maintenance issue, you can automatically generate a work order to get that equipment fixed. This set of pre-built capabilities is what differentiates these solutions from anything that's been delivered to the market before, and it will accelerate companies' time to value in delivering analytic capabilities across the organization. Now, as we embarked on creating these solutions, we noticed that every company is faced with a paradox: as you heard from Nucleus Research, they want to be fast and flexible at the same time. While packaged applications often provide out-of-the-box capabilities, they're typically not as easy to integrate into your existing systems and architectures. We believe we've found the right balance: an out-of-the-box solution that delivers immediate value but can be adapted to your existing enterprise architecture and standards. So now let's take a look at some of these solutions in action. The first one we're going to show you today is lift analytics for retail. Our lift analytics solution will help retailers answer critical questions like: how important is it to ensure that certain products are always in stock, and what products should they be promoting together? While retailers can easily get a view
of the revenue for any individual product, that doesn't necessarily represent the total value of carrying that product in their stores. Oftentimes an individual product can drive significantly more lift for the business based on the other products that are typically purchased with it. So to take us through this lift analytics for retail solution, I'd like to welcome Matt McNaughton. Thanks, Mark. Thank you, Matt. So, as someone who spends most of their time working with retailers and consumer products companies to help them get more value from their data, can you tell us how a retailer might use this solution? Sure, Mark. First, I'm really excited to share the lift analytics for retail solution with everyone today. I'll answer your question with an example, and in our example we're going to follow a merchant named Christina, who's responsible for the makeup and accessories merchandise categories. She's experiencing an out-of-stock issue in the makeup brush category and is also seeing declining sales in the same category. Because of this situation, Christina is considering a range of merchandising and assortment changes related to makeup brushes. So how would she get started? Well, first Christina is going to want to better understand the selling patterns for makeup brushes, including important insights about what other items are most likely to sell with them. First, Christina is going to review the sales trend for makeup brushes. She will immediately see a steep period of decline in makeup brush sales, as shown on the screen. Next, she's likely going to want to look at the overall makeup and accessories category, and she's going to note another downward trend in the eye and lip makeup categories. She's also noticing a steep decrease in the green line at the top of the page: foundation makeup sales, a top seller for her. So this is great; she can now see the trends and how they compare across different product lines. What's next? Well, she's going to want to understand
how the sales for each merchandise line relate to one another; she basically wants to understand the patterns before she considers making any changes to makeup brushes. By looking at predicted affinities, Christina can see which merchandise lines are purchased most often with makeup brushes. For example, the green line at the top of the chart is foundation makeup, and the chart tells her that foundation makeup is purchased more than 70% of the time that a customer purchases a makeup brush. She'll also notice, in the orange and blue lines, that there's a high affinity between makeup brushes and both eye makeup and lip makeup. And finally, down towards the bottom of the screen, she'll notice that there are other merchandise lines, some that she doesn't actually manage, such as moisturizers, lotions, and treatments, that have a weaker affinity with makeup brushes. So what about these intermittent lines that I see at the bottom here? That's a good question, Mark. Those are relationships that only show up at certain periods of time. An important distinction of this solution is the ability to look at those affinity relationships over time, rather than at one point in time. In this case, there are three or four instances where perfume sold at the same time, at a reasonably high rate, with makeup brushes. Christina reminds herself that those time periods were when there was a sale across the entire health and beauty department; she makes a note of that for herself. So this is great: now I can see what other products oftentimes sell with makeup brushes. What would Christina do next? Well, next Christina is going to be curious about what this means in the context of how she looks at her business. She's going to ask herself: how does this translate into sales dollars? Here she'll see immediately that across that time period she sold one hundred and twenty-one thousand dollars' worth of makeup brushes. Now, this
isn't a particularly new kind of information; these are the sales metrics that retailers typically have available to them all the time. However, if you look at the bottom right of the screen, the affinity selector is set to off. She has an opportunity here to change it, and when she changes it to see the high-affinity relationships, she can now see both the total sales dollars and the affinity sales dollars. On the left side she can view the total sales dollars for makeup brushes and related merchandise lines, which provides important context as she looks at the affinity sales dollars on the right side. Those affinity sales dollars represent dollars where both makeup brushes and the affinity lines, which are eye makeup, foundation makeup, and lip makeup, were purchased together. If she looks at these overall results, she sees an interesting fact: about four hundred and twelve thousand out of nine hundred and forty-six thousand dollars in sales in her category happened when a makeup brush was sold. So now we're seeing what the total revenue is for products that are very likely to be sold with brushes. But you mentioned before that there are other products customers often buy when they go in to buy a makeup brush? Yes, so she has the ability to take that affinity strength indicator and toggle it down from high to medium, which layers in additional merchandise lines, effectively showing a similar story when you compare the affinity purchases to the total purchases. I think it's also interesting to note, if you look at the right bar chart, that that one hundred and twenty-one thousand dollars in makeup brush sales has an effect on about seven hundred thousand dollars in total sales across other merchandise lines. That's a 7x lift that she's going to be very careful to note as she looks at making changes to that line. So this is amazing: you just quickly were able to identify that these
makeup brushes, even though the retailer is only driving one hundred and twenty-one thousand dollars from them directly, are likely having an impact on the business of over seven hundred thousand dollars. So now that Christina has learned this, how would she put these insights into action? What would she do with this new insight that we've just provided her with? Well, it's certainly clear now how important makeup brush sales are to her category and to some other merchandise lines. She'll obviously use this information to quantify for her store managers the value and importance of keeping the merchandise displays of makeup brushes full. Additionally, she may consider some other merchandising and marketing actions. She may decide to increase the promotion of makeup brushes generally, to benefit from that 7x lift. She may think about co-promoting makeup brushes with other affinity merchandise lines. She may go to the merchant who sells perfume and say, we saw that intermittent relationship, and it may just take a little bit of promotion to get that affinity to show up more often, on trend. And she may think about adding new makeup brushes to try to influence how affinity purchases happen within different customer groups. So, I talked a little bit earlier about how we're providing a bunch of pre-built capabilities in these solutions. What's in the box when you get lift analytics for retail? Very similar to what you talked about earlier, Mark: there is a retail-specific data model with connectors into T-log, or point-of-sale transaction log, data. There are predictive analytic models, in this case an affinity analytic model with the ability to process the high volume of transactions in retail, which can extend into the millions and billions of transactions. And finally, as we've seen today, there's an insight layer and an interactive interface for the line-of-business user.
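The affinity numbers in the walkthrough above (foundation makeup co-purchased more than 70% of the time a brush is sold; a roughly 7x revenue lift) can be sketched from raw basket data. This is a minimal illustration of the general market-basket technique, not IBM's actual model; the product names and the `baskets` data are hypothetical.

```python
# Hypothetical point-of-sale baskets: each set is one transaction.
baskets = [
    {"brush", "foundation", "eye_makeup"},
    {"brush", "foundation", "lip_makeup"},
    {"brush", "foundation"},
    {"foundation", "lip_makeup"},
    {"brush", "eye_makeup"},
    {"moisturizer", "lotion"},
]

def affinity(baskets, anchor, other):
    """P(other in basket | anchor in basket): how often `other`
    is co-purchased when `anchor` is purchased."""
    with_anchor = [b for b in baskets if anchor in b]
    if not with_anchor:
        return 0.0
    return sum(other in b for b in with_anchor) / len(with_anchor)

def lift(baskets, anchor, other):
    """Classic market-basket lift: the co-purchase rate relative to
    what independent purchasing would predict (>1 means affinity)."""
    n = len(baskets)
    p_anchor = sum(anchor in b for b in baskets) / n
    p_other = sum(other in b for b in baskets) / n
    p_both = sum(anchor in b and other in b for b in baskets) / n
    return p_both / (p_anchor * p_other)

print(affinity(baskets, "brush", "foundation"))  # 0.75: bought 75% of the time with a brush
print(lift(baskets, "brush", "foundation"))
```

In the demo, the "7x lift" itself is simpler arithmetic on top of numbers like these: roughly $700,000 of affinity-line sales set against $121,000 of direct brush sales.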
Great. Well, thank you very much. So that's just one of the new industry-specific analytic solutions that we are announcing today, but we wanted to give you some insight into a few others. The next one we wanted to talk about is for banks. Now, banks have traditionally segmented their customers based on very high-level information: demographics, household income, what accounts they hold, maybe what their balances are in those accounts. However, there's a tremendous opportunity to gain deeper insights into customers based on their spending and transactions. In fact, these behaviors can be used to help banks predict financial and life events and answer critical questions like: are people who spend a lot dining out after 9 p.m. more likely to overdraft than others? So what we've done here is generate predefined segments based on what people spend their money on. Are they spending more money on dining out and entertainment, or on groceries? More on clothing, or on electronics? More on mortgages, or on cars? We also look at where they spend their money: is it very localized near their home; do we identify two distinct areas of spend and classify them as a commuter; or is it fairly spread out, like a traveler? And we even look at what time of day they spend their money: during the day, in the evenings, on weekends, and so on. We can then use these insights to identify which customers are more likely to run into an overdraft situation, which customers are more likely to pay their credit card late, and, by looking at what they're buying and when, we might even be able to identify that someone is out house hunting and may be in the market for a mortgage, based on the fact that they've just made a bunch of purchases at gas and convenience stores in different zip codes and maybe are spending money at home improvement stores to fix up their house. So here at the bottom you can even see different
segments and their propensity to go into overdraft, along with these other financial-related events, or drill into a particular activity and see which types of customers are more likely, and more correlated, to engage in one of these activities. So this is just one of many different solutions we're providing that take a look at customer behavior and use it to generate better insights, and each of these is unique. For example, in wealth management, companies are more interested in how clients are investing: what asset classes they're investing in, how frequently they're investing, and how they're interacting with their advisers, so they can use those insights to drive more trading activity and improve the profitability of their customer relationships. In insurance, they're more focused on churn, and on a different set of interactions with their customers. In telecommunications, companies want to understand how people are using their phones: where they're making calls from, even what applications they're using. These are the types of industry-specific insights that we're delivering in each of these solutions. So I'd like to highlight one more solution for you. If you think about oil and gas companies, they face a challenge in that their oil wells are widely distributed geographically, and the pumps are underground, making maintenance very costly and difficult. One of the big questions they're trying to answer is: when should we stop production of an oil well for pump maintenance? If they can predict potential issues before they happen, there's an opportunity to optimize service and maintenance costs while improving the throughput of these wells, which goes directly to the bottom line for these oil and gas companies. So let's see how our solution can help with this. Paula logs onto the executive dashboard to scan for issues that need attention. Seeing a variety of pumps that are in a critical state, she
selects them from the map, where the pump summary panel dynamically updates with each check-in. She clicks ESP-2 and sees it's at an 82% probability of failure. She clicks Settings and Profile to see how it may impact the process stream, sees that it's a critical pump in that stream, and so investigates further, deciding to consult with her power analyst before taking any drastic measures. Power analyst Jason has been analyzing the forecasting indicators and continues collecting data in preparation for his next actions. After conferring with Paula, he initiates a maintenance request by clicking the maintenance button. He fills out the form, adds some notes, and provides an annotated attachment to help the field technician; after completing the form, he clicks submit. The system notifies Jack, the field technician, that there's a pump about to fail and he needs to do an assessment. Jack is able to further research the pump by looking at the document library, which has all the installation guides and schematics for this device. Jack proceeds to assess the pump and files the site report that will help the executives and analysts on their next steps to remedy the situation. Make your next decision the right one. So this is just one of a new set of solutions we're delivering around asset analytics, but it applies to multiple different industries, each with their own unique needs. We're also introducing asset analytics for transmission and distribution in energy and utilities, to analyze the equipment used in the transmission and distribution of power, so that you can identify potential maintenance issues earlier and predict downtime and power outages, which I'm sure no one likes to incur. And we're developing an asset analytics solution for robotics equipment in automotive, so that automotive companies can identify potential issues with welding robots and paint shop equipment before they stop production of the automobiles. So I've taken you on a whirlwind tour of 20
solutions across 12 different industries. I'm sure you can now see the value of having pre-built, industry-specific solutions like the ones we've demonstrated today. Our pre-built capabilities will enable companies to get started and go faster with fewer resources, so they can use those scarce data scientist resources on minor customizations and tweaks, and on the innovation that we talked about earlier, as opposed to spending all their time building out the core underlying building blocks. And you can leverage our proven expertise from the 50,000 engagements that we've done in analytics, and the insights provided by our signature design partners. So with that, I'd like to bring back Alistair to wrap up today's session. Wow, thank you, Mark. Seeing these solutions in action really brings the point home. It's incredible to see how very complicated systems, and decisions and information people never had, can now be brought to their fingertips in a new and much faster way. So let's go right back to where we started. There's absolutely a gap; Rebecca helped us quantify that gap in terms of the economic value lost in getting on an analytics journey and becoming an analytics-driven company and culture. It results in our customers having to build that last piece of track, which is difficult and expensive and requires maintenance. And aside from the cost, there is a bigger issue: lost opportunity, which, at the pace of change and innovation in every industry, is something that none of us can really tolerate. So we think what we're doing today with these industry solutions is a big step in closing that gap and helping people meet their goals and really rise to the opportunity. With our industry analytic solutions, we think the value proposition is very simple and very compelling: start faster, use fewer and less specialized resources, leverage proven expertise to catalyze your efforts, and
it's really now time to think about where the next step in your journey is. We think the reasons to wait have become much weaker; it's absolutely time for everyone, in every industry, to start turning insights into actions. Thank you very much for joining us today. Don't give it