By Karl Van den Bergh
January 11, 2014 12:00 PM EST
Slide Deck from Karl Van den Bergh's Cloud Expo Presentation: The Intelligence Inside: How Developers of Cloud Apps Will Change the World of Analytics
We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.
The Increased Focus on Analytics
With the emphasis on data-driven decision-making, it is perhaps no surprise that the focus on analytics continues to mount. According to IDC's Dan Vesset, 2013 was poised to be the first year the market for data-driven decision-making enabled by business analytics broke through the $100 billion mark. IT executives are also doubling down on analytics, a fact highlighted by Gartner's annual CIO survey, which has ranked analytics the number one technology priority in three of the last five years. So, given the importance of analytics and the spend on it, everyone should have access to the insight they need, right?
Most Business People Still Don't Use Analytics
Amazingly, in spite of spending growth and focus, most information workers today do not have access to business intelligence. In fact, Cindi Howson of BI Scorecard has found that end-user adoption of BI seems to have stagnated at about 25%. This stagnation is difficult to reconcile. How is it possible that, at best, one quarter of information workers have access to what is arguably most critical to their success in a world that runs on data?
There are a variety of reasons for stagnant end-user adoption, including the high costs associated with BI projects and an overall lack of usability. The biggest impediment to BI adoption, however, has nothing to do with the technology. The reality is that the vast majority of business decision makers do not spend their day working in a BI tool - nor do they want to. Users already have their preferred tool or application: sales representatives use a CRM service; marketers use a campaign management or marketing automation platform; back-office workers spend much of their day in an ERP application; executives typically work in their preferred productivity suite; and the list goes on. Unless you are a data analyst, you are not going to want to spend much of your day in a BI tool. But just because business people prefer not to use a BI tool does not mean they don't want access to pertinent data to bolster better decision-making.
The Need for More Intelligence Inside Applications
What's the solution? Simply put, bring the data TO users inside their preferred applications instead of expecting them to go to a separate BI system to find the report, dashboard or visualization that's relevant to the question at hand. If we want to reach the other 75% of business people who don't have access to a standalone BI product, we have to inject intelligence inside the applications and services they use every day. It is only through more intelligent applications that organizations can benefit from broader data-driven decision-making. In fact, according to Gartner, BI will only become pervasive when it essentially becomes "invisible" to business people as part of the applications they use daily. In a 2013 report highlighting key emerging tech trends, Gartner concludes that in order "to make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users." How? The report explains this will happen "through embedded analytic applications at the point of decision or action."
If the solution to pervasive BI is to deliver greater intelligence inside applications, why don't more applications embed analytics? The reality is that only a small fraction of applications built today have embedded intelligence. Sure, they might have a table or a chart but there is no intelligent engine; users typically can't personalize a report or dashboard or self-serve to generate new visualizations on an ad-hoc basis. The culprit here is that business intelligence was originally intended as a standalone activity, not one that was designed to be embeddable. Specifically, the reasons driving developers to ignore BI platforms boil down to cost and complexity.
Cost and Complexity Are Barriers to Embedded BI
Traditionally, BI tools have carried a user-based licensing model, with licenses typically running from tens of thousands to millions of dollars. Such high per-user costs might be justified for a relatively small, predictably sized population that includes a large percentage of power users who will spend a good amount of time working with the BI tool. This user-based model, however, is unsuitable for the embedded use case, which is geared toward business users who access BI features less frequently and likely have less analytics experience than the traditional power user - in this scenario, high per-user costs simply can't be justified.
BI products are complex on a number of levels. First, they are complex to deploy, often requiring months if not years to roll out to any reasonable number of users. Second, they are complex to use, both for the developers building the reports and dashboards and for the business people interacting with the tool. Third, they are complex to embed: designed as standalone products, BI tools are not architected to plug into another application.
Given the cost and complexity of traditional standalone BI offerings, it is no surprise that developers often turn to charting libraries to deliver the visualizations within their application. The cost is low, and charting libraries are relatively simple for a developer to embed. In the short term, a charting library is a reasonable solution, but it falls flat over time. The demands for more charts, dashboards and reports quickly grow, and end users begin looking for the ability to self-serve and create their own visualizations. As a result of these mounting demands, many application developers find themselves essentially building a BI tool, taking them outside their core competency and stealing precious time away from advancing their own application.
Could a New Generation of Embedded BI Provide the Solution?
Utility Pricing Dramatically Reduces Cost
To address the challenge of cost, a new generation of embedded analytics platforms employs a utility-based licensing model in which the software is available on a per-core, per-hour or per-gigabyte basis. From a developer's perspective this is a much fairer model, as one pays only for what is used. At the beginning of the application lifecycle, when usage is sporadic, developers can limit their costs; as the application becomes successful, usage can easily be scaled up. A recent report by Nucleus Research concluded that utility pricing for analytics can save organizations up to 70% of what they would pay for a traditional BI solution. I've written previously about how utility pricing will dramatically increase the availability of analytics, reaching a much broader set of organizations. The rapid adoption of Amazon's Redshift data warehousing service and of Jaspersoft's reporting and analytics service on the AWS Marketplace provides rich testimony to the benefits of this model.
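To see why utility pricing changes the economics of the embedded use case, consider a simple back-of-the-envelope comparison. All figures below are hypothetical, chosen only to illustrate the two models; they are not actual vendor prices:

```typescript
// Hypothetical comparison of per-user licensing vs. utility (per-core-hour)
// pricing for embedded analytics. All prices are illustrative only.

// Traditional model: a fixed annual fee per named user.
function perUserAnnualCost(users: number, pricePerUser: number): number {
  return users * pricePerUser;
}

// Utility model: pay only for the compute actually consumed.
function utilityAnnualCost(
  cores: number,
  hoursPerDay: number,
  daysPerYear: number,
  pricePerCoreHour: number
): number {
  return cores * hoursPerDay * daysPerYear * pricePerCoreHour;
}

// An embedded scenario with 5,000 occasional users is prohibitive
// under per-user pricing...
const traditional = perUserAnnualCost(5000, 300); // $1,500,000

// ...but the same workload might run on 4 cores for 12 hours a day.
const utility = utilityAnnualCost(4, 12, 365, 1.5); // $26,280

console.log(`Per-user: $${traditional}, utility: $${utility}`);
```

The point is not the specific numbers but the shape of the curve: per-user cost scales with the size of the audience, while utility cost scales with actual consumption, which for occasional embedded users is far lower.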
Cloud and Web-Standard APIs Reduce Complexity
A cloud-based BI platform significantly simplifies deployment, as there is no BI server to install or configure. The Nucleus Research report found that utility-priced cloud BI solutions could be deployed in weeks or even days, as opposed to the months commonly required for a traditional BI product. Web-standard APIs address the complexity of embedding itself: rather than wrestling with a proprietary stack, developers can integrate reports and visualizations using the same web technologies they already use to build their applications.
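As a minimal sketch of what web-standard embedding can look like, the snippet below builds an authenticated URL that a host application could drop into an iframe. The endpoint, parameter names, and token scheme are invented for illustration; real BI services each define their own:

```typescript
// Hypothetical sketch of embedding a cloud-hosted report via a
// web-standard URL. The /embed endpoint and parameter names are
// invented for illustration, not any particular vendor's API.

interface EmbedOptions {
  reportId: string;  // which report to render
  token: string;     // short-lived auth token issued by the BI service
  theme?: string;    // optionally match the host application's look
}

// Build the iframe src for an embedded report.
function buildEmbedUrl(baseUrl: string, opts: EmbedOptions): string {
  const params = new URLSearchParams({
    report: opts.reportId,
    token: opts.token,
  });
  if (opts.theme) params.set("theme", opts.theme);
  return `${baseUrl}/embed?${params.toString()}`;
}

// The host app places this URL in an <iframe> and the report renders
// inline: no BI server to install, no plugin to maintain.
const src = buildEmbedUrl("https://bi.example.com", {
  reportId: "sales-by-region",
  token: "abc123",
  theme: "dark",
});
console.log(src);
```

Because everything here is plain HTTP, HTML and JavaScript, the integration skills required are ones the application developer already has, which is precisely how these platforms reduce the third form of complexity described above.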
The Benefits of Embedded Intelligence
Intuitively, it would seem that, by providing analytics within the applications business people use every day, an organization should experience the benefits of more data-driven decision-making. But is there any proof?
A recent report by the Aberdeen Group, based on data from over 130 organizations, has helped shed light on some of the benefits of embedded analytics. First, as might be expected, those companies using embedded analytics saw 76% of users actively engaged in analytics versus only 11% for those with the lowest embedded BI adoption. As a result, 89% of the business people in these best-in-class companies were satisfied with their access to data versus only 21% in the industry laggards. The bottom line? Companies leading embedded BI adoption saw an average 19% increase in operating profit versus only 9% for the other companies.
Andre Gayle, who helps manage a voicemail service at British Telecom, illustrates the difference embedded analytics can make. "We had reports [before] but they had to be emailed to users, who had to wait for them, then dig through them as needed. It was inefficient and wasteful." Now, thanks to embedded analytics, British Telecom has seen huge savings in time and cost. As Gayle explains, capacity planning for the voicemail service used to be a "laborious exercise, involving several days of effort to dig up the numbers," but now can be done "on demand, in a fact-based manner, in just a few minutes."
The evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits. However, the protagonist here, unlike in the traditional world of analytics, must be the developer, not the analyst. A new generation of embedded BI platforms is making it easier and more cost effective for developers to deliver the analytical capabilities needed inside the Cloud applications they are building. As developers increasingly avail themselves of these new platforms, we can hope that BI will finally become pervasive as an information service that informs day-to-day operations. As Wayne Eckerson puts it, "In many ways, embedded BI represents the fulfillment of BI's promise." Now it's up to Cloud developers to help us realize that promise.