Big Data Analysis Helps Improve Customer Satisfaction

Efficient big data capabilities help Cerner drive needed improvements into healthcare outcomes

The next edition of the HP Discover Podcast Series delves into how a healthcare solutions provider leverages big-data capabilities. We’ll see how Cerner has deployed the HP Vertica Analytics platform to help their customers better understand healthcare trends, as well as to help them better run their own systems.

To learn more about how high-performing and cost-effective big data processing forms a foundational element of improving healthcare quality and efficiency, join Dan Woicke, Director of Enterprise Systems Management at Cerner Corp., based in Kansas City, Missouri.

The discussion, which took place at the recent HP Vertica Big Data Conference in Boston, is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: We're going through some major transitions in how healthcare payments are going to be made -- and how good care is defined. We're moving from pay for procedures to more pay for outcomes. So tell me about Cerner, and why big data is such a big deal.

Woicke: The key element here is that the payment structure is changing to more of an outcome model. In order for that to happen, we need to gather data from many, many disparate systems, bring it in, and let our analysts identify the right trends and predict quality outcomes, so that you can repeat those outcomes and stay profitable in the new system.

My direct responsibility is to bring in massive amounts of performance data -- data on how our Cerner Millennium systems are running.

We have hundreds of clients, both those hosted in our data centers and those that manage their own systems with their own database administrators (DBAs). The challenge is keeping a system that large running smoothly with tens of thousands of clinicians on it.

We need to make sure that we have the right data in place in order to measure how systems are running and then be able to predict how those systems will run in the future. If trends start to turn negative, how can we take the massive amounts of data coming into our new analytical platform, correlate those parameters, predict what's going to happen, and then take action before there is a negative impact?

Effect change

We want to be able to predict what’s happening, so that we can effect change before there is a negative impact on the system.
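As a rough sketch of that idea (not Cerner's actual implementation), one simple way to act before users feel an impact is to watch a rolling average of response times and raise an alert when it drifts past a threshold. The window size and threshold below are hypothetical.

```python
from collections import deque

def rolling_alerts(samples_ms, window=12, threshold_ms=800):
    """Yield (index, rolling_average) whenever the rolling average of the
    last `window` response-time samples exceeds the alert threshold."""
    recent = deque(maxlen=window)
    for i, value in enumerate(samples_ms):
        recent.append(value)
        if len(recent) == window and sum(recent) / window > threshold_ms:
            yield i, sum(recent) / window

# A slow upward drift trips the alert before any single sample looks alarming.
samples = [500 + 5 * i for i in range(120)]
for index, avg in rolling_alerts(samples):
    print(f"sample {index}: rolling average {avg:.0f} ms exceeds 800 ms")
    break
```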

Gardner: How do big data and the ability to manage big data get you closer to the real-time and, ultimately, proactive results your clients need?

Woicke: Since January we've begun to bring in what we call Response Time Measurement System (RTMS) records. For example, when a doctor or a nurse is in our electronic medical record (EMR) system signing an order, I can tell you how long that took. I can tell you how long it took to log into the system. I can tell you how long you were in the charting module.


All those transactions produce 10 billion timers per month across all of our clients. We bring those all into our HP Vertica data warehouse. Right now, it's about a two-hour turnaround before that data is available, but my goal, within the next 12 months, is to get it down to 10 minutes.

I can see in real time when trends are happening, either positive or negative, and be able to take action before there is an issue.
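To make the shape of those timer records concrete, here is a minimal sketch of the kind of hourly roll-up such a warehouse might serve. It uses SQLite from the Python standard library purely as a stand-in; the table layout and column names are assumptions for illustration, not Cerner's actual schema.

```python
import sqlite3

# In-memory stand-in for the warehouse; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rtms_timer (
        client      TEXT,     -- which client/domain produced the timer
        timer_name  TEXT,     -- e.g. 'login', 'order_sign', 'charting'
        hour        TEXT,     -- truncated timestamp, e.g. '2014-08-12 09'
        elapsed_ms  INTEGER   -- measured response time in milliseconds
    )
""")
conn.executemany(
    "INSERT INTO rtms_timer VALUES (?, ?, ?, ?)",
    [
        ("client_a", "login",      "2014-08-12 09", 1200),
        ("client_a", "login",      "2014-08-12 09",  900),
        ("client_a", "order_sign", "2014-08-12 09",  300),
        ("client_b", "login",      "2014-08-12 09", 2500),
    ],
)

# Hourly roll-up per client and timer: volume, average, and worst case.
query = """
    SELECT client, timer_name, hour,
           COUNT(*)        AS n,
           AVG(elapsed_ms) AS avg_ms,
           MAX(elapsed_ms) AS max_ms
    FROM rtms_timer
    GROUP BY client, timer_name, hour
    ORDER BY avg_ms DESC
"""
for row in conn.execute(query):
    print(row)
```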

Gardner: Tell us more about Cerner -- what you do in IT.

Woicke: We run the largest EMR in the world. We have well over 400 domains to manage -- that's what we call them -- and each domain can connect multiple facilities. With multiple facilities connected to those domains, there are tens of thousands of clinicians on the system at any given time.

We have two data centers in Kansas City, Missouri, and we host more than half of our clients in those data centers. The trend is moving toward being remote-hosted and managed like that. We still have a couple of hundred clients that manage their own Millennium domains. As I said before, we need to make sure that we provide the same quality of service to both sets of clients.

Single database

Cerner Millennium is a suite of products or solutions. Millennium is a platform where the EMR is placed into a single database. Then, we have about 55 different solutions that go on top of that platform, starting with ambulatory solutions. This year was really neat. We were able to launch our first ambulatory iPad application.

That number is growing all the time, with solutions such as surgery and lab fitting into the Cerner Millennium system. So we do have a cohesive set of data, all within one database, which makes us unique.

Gardner: Where does the data come from primarily, and how much data are we talking about?

Woicke: We're talking about quite a bit of data, and that's why we had to move away from a traditional online transaction processing (OLTP) database to a massively parallel processing (MPP) database, because of all the systems that are now sending data to Cerner.

We have claims data and HL7 messages. We're going to get all our continuity of care records from Millennium. We also have other EMRs, so this is pretty much the first time that we're bringing in other EMR records.
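HL7 messages of the v2 variety are pipe-delimited text, so even a deliberately naive parser shows how such feeds can be broken into loadable fields. The message below is contrived, and this sketch ignores the component separators, repetition, and escaping that a real HL7 library would handle.

```python
def parse_hl7_v2(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [fields, ...]}.

    Deliberately naive: only splits on segment and field separators.
    """
    segments = {}
    for line in message.strip().splitlines():
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

# Contrived two-segment message (not real patient data).
sample = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "20140812093000||ADT^A01|MSG00001|P|2.3\n"
    "PID|1||123456^^^HOSP||DOE^JANE||19700101|F"
)
parsed = parse_hl7_v2(sample)
print(parsed["MSH"][0][7])   # message type field, here 'ADT^A01'
print(parsed["PID"][0][4])   # patient name field, here 'DOE^JANE'
```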

You'll have claims data coming in from multiple sources and multiple EMRs, but the whole goal of population health is to get a population to manage its own health. That means we need to put the tools in their hands, and those tools need to be accurate, so that people can make the right decisions in the future. What that's going to do is bring the total cost of your healthcare down, which is really the goal.


We have health-plan enrollments, and then of course, within Millennium, we're going to drill down into outcomes, readmissions, diagnoses, and allergies. That's the data we need to be able to predict what kind of care we are going to have in the future.

Gardner: It seems to me that we talk about the "Internet of Things," but we're also moving toward an "Internet of people." More information about their health comes back and benefits you and the healthcare providers. But ultimately, it can also provide great insights to the patients themselves.

Do you see, in the not too distant future, applications where certain data -- well-protected and governed of course -- is made into services and insights that allow for a better proactive approach to health?

Proactive approach

Woicke: Without a doubt. We're actually endorsing this internally within the company by launching our own weight-loss challenges, where we're taking our medical records and putting them on the web, so that we have access to them from home.

I can go on the site right now and manage my own health. I can track the number of steps I'm doing. Those are the types of tools that we need to launch to the population, so that they endorse that good behavior, which will ultimately change their quality of life.

Right now, we're in production with the operations side that we talked about a little bit earlier. We're also in production with what we call Health Facts, a huge set of blinded data. We have a team of analysts and scientists who go through this data and look for trends.


It's something we haven't been able to do until recently, until we got HP Vertica. I'll give you a good example. Our analysts would write a SQL query to do an exploratory type of analysis on the data. They would submit it at 5 p.m. and hope that, by the time they came back at 8 a.m. the next day, the query would be done.

In Vertica, we've timed those same queries at between two and five seconds. You can see what that does for the amount of analysis we can do on the same data. It's game changing.

When it came to choosing a platform, there were a lot of competing products that would have worked, but we had a set of criteria that we drilled down on. We tried to make the evaluation as scientific and thorough as possible. So we built a score sheet, and each of us from the operations side and the Health Facts side graded and weighted each of the categories we were going to judge during the proof of concept (POC). We ended up doing six POCs.

We got down to two, and it was a hard choice. But with the throughput we got from Vertica, its performance, and the number of simultaneous users it could support at a given time, it was the right choice for us.
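A weighted score sheet like the one described comes down to simple arithmetic. The categories, weights, grades, and vendor labels below are hypothetical, purely to show the mechanics of grading and weighting POC results.

```python
# Hypothetical categories and weights (the real score sheet's contents
# were not published); weights sum to 1.0.
weights = {
    "query_throughput":  0.30,
    "concurrent_users":  0.20,
    "load_performance":  0.20,
    "cost":              0.20,
    "ease_of_operation": 0.10,
}

# Hypothetical 1-10 grades from the proof-of-concept runs.
grades = {
    "vendor_a": {"query_throughput": 9, "concurrent_users": 8,
                 "load_performance": 8, "cost": 7, "ease_of_operation": 8},
    "vendor_b": {"query_throughput": 7, "concurrent_users": 6,
                 "load_performance": 8, "cost": 9, "ease_of_operation": 7},
}

def weighted_score(grade: dict) -> float:
    """Combine per-category grades into one weighted score."""
    return sum(weights[cat] * grade[cat] for cat in weights)

for vendor in sorted(grades, key=lambda v: weighted_score(grades[v]), reverse=True):
    print(f"{vendor}: {weighted_score(grades[vendor]):.2f}")
```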

Gardner: And because we're talking about healthcare, costs are super important. Was there a return on investment (ROI) or cost benefit involved as well?

Extremely competitive

Woicke: Absolutely. As you can imagine, cost was one of the top one or two weighted categories on our score sheet, and HP Vertica is extremely competitive compared to some of the others we looked at.

Gardner: Dan, looking to the future, what do you expect your requirements to be, say, two years from now? Is there a trajectory that you need to take as an organization, and how does that compare to where you see Vertica going?

Woicke: Having Vertica as a partner, we navigate that together. They invited me here to Boston to sit on the user board. It was really neat to sit right there with [HP Vertica General Manager] Colin Mahony at the same table and be able to say, "This is what we need. These are our needs coming around the corner," and have him listen and be able to take action on that. That was pretty impressive.

To answer your question, though, it's more and more data. I was describing the operations side, where we bring in 10 billion RTMS records. There are going to be another 10 billion records coming in from other sources -- CPU, memory, disk I/O; everything can be measured.

We want to bring it all into Vertica, because then I can do some correlation, as we were just discussing. If the RTMS records show that a performance problem is going to happen within the next 10-15 minutes, I can figure out which of those operational parameters is most affecting that performance and then send an analyst directly in to mitigate the problem.
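One simple way to ask which operational parameter most tracks a response-time degradation is to rank the metrics by correlation with the RTMS series. The sketch below uses made-up numbers and a plain Pearson correlation; it illustrates the idea rather than Cerner's actual model.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic 10-minute samples: response time drifts up with CPU,
# while memory and disk stay roughly flat.
response_ms = [500, 520, 560, 610, 680, 760, 850, 950]
metrics = {
    "cpu_pct":    [40, 43, 48, 55, 63, 71, 80, 90],
    "memory_pct": [62, 61, 63, 62, 61, 63, 62, 61],
    "disk_iops":  [900, 880, 910, 870, 905, 890, 915, 885],
}

# Rank metrics by how strongly they track the response-time series.
ranked = sorted(metrics, key=lambda m: abs(pearson(metrics[m], response_ms)), reverse=True)
for name in ranked:
    print(f"{name}: r = {pearson(metrics[name], response_ms):+.2f}")
```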


On the EMR side, it's more data as well. On the operations side, we're going to apply this to other enterprises, bringing in more data to connect people with the experts -- there is always an expert out there somewhere. What we're going to do is connect the provider with the payers and the patient to complete that triangle in population health. That's where we're going in the next few months.

Gardner: I certainly think that managing data effectively is a huge component of our healthcare challenge here in the United States, and of course, you're operating in about 19 countries. So this is something that will benefit almost any market where efficiency, productivity, and quality of care come to bear.

Woicke: At Cerner Corp., we're really big on transparency. We have a system right now called the Lights On Network, where we take these parameters and bring them into a website. We show everything to the client: how they're performing and how the system is doing. By bringing in more and more data and being able to correlate it, we're going to show all the clients, as well as the providers, how their system is doing.
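As a loose illustration of surfacing those parameters to a web view (not the actual Lights On Network), the summary a status page consumes can be as simple as one JSON document per client. Everything here -- the client identifier, timer names, and numbers -- is hypothetical.

```python
import json
from statistics import mean

# Hypothetical raw timers collected for one client over the last hour.
timers = {
    "login":      [1200, 900, 1100, 950],
    "order_sign": [300, 280, 320, 310],
}

# Roll the raw numbers up into the kind of summary a status page could render.
summary = {
    "client": "client_a",        # hypothetical identifier
    "window": "last_hour",
    "transactions": {
        name: {"count": len(vals), "avg_ms": round(mean(vals)), "max_ms": max(vals)}
        for name, vals in timers.items()
    },
}
print(json.dumps(summary, indent=2))
```

Keeping this roll-up separate from the raw timer records is what lets the same data feed both client-facing transparency and the internal correlation work described above.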

