
Big Data Journal: Article

The Non-Analytics Company Is History

The fundamental mechanics of business have changed and the non-data-centric company will ultimately be history

The fundamental mechanics of business have changed.

Well, they haven't quite.

The basic laws of supply and demand still govern the economics within which firms across all industries bring goods and services to market, under a common monetary system and on an international scale.

But a change has occurred and it is an information-driven shift.

Our core accounting systems used to represent the motherlode of all company information. Then, somewhere around the end of the last millennium, we added so-called Customer Relationship Management (CRM) to the corporate information arsenal and started to build up the commercial data bank.

Fast forward into the first decade of the new millennium and we found ourselves deeply entrenched (and enamored with) the world of Enterprise Resource Planning (ERP). In the ERP-enabled world we started to define Key Performance Indicators (KPIs) and use business metrics in a more mathematically sensitive way than ever before.

What makes a truly data-centric firm?
Today we take ERP as a given element of a wider corporate data stack. The modern firm captures data from accounts, from customers and from business units (in the ERP sense), of course, but that's just the start. A truly data-centric firm also captures information from employees, from external competitors and from business equipment (in both the capital expenditure (CapEx) and operational expenditure (OpEx) sense), often via Internet of Things-style sensors, and more besides.

To take our argument one crucial step further: the firm described above is not yet a truly data-centric firm; it is only a data-aware firm. A truly data-centric firm not only captures these multi-level information streams but is also able to analyze them for operational (and therefore commercial) advantage.

Future investment brokers won't just ask to see profit and loss statements; they will ask "how good is your Big Data information capture and analytics procedure?" or some such. Okay, yes, they will still ask for the P&L too, but you get the point.

This practice of analytics is defining the modern 21st century business. Knowing what customer movements mean is important, but knowing how to analyze what connected ancillary factors will influence customer behavior before it happens is what really makes the difference.
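As a minimal, hypothetical illustration of connecting an ancillary factor to customer behavior, the sketch below computes the Pearson correlation between two made-up daily series (outdoor temperature and cold-drink sales); a strong correlation is the first hint that one factor can help predict the other. The data and variable names are invented for the example.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily figures: outdoor temperature vs. cold-drink sales.
temperature = [12, 15, 19, 22, 25, 28, 30]
sales = [80, 95, 120, 150, 170, 210, 240]

r = pearson(temperature, sales)
print(f"correlation: {r:.2f}")  # a value near 1.0 suggests a strong link
```

Correlation is not causation, of course; in practice the analyst would test many candidate factors and validate any predictive model on held-out data.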

This overall trend toward analytics has certain effects. Firms still need the same mix of salespeople, IT, finance, admin and other staff; but now they also need a defined specialist to serve as a Data Scientist (caps intended to denote a job title) -- or, at the very least, they need to be able to outsource the consultancy services that supply that analytics intelligence.

Patterns and anomalies
Companies that "get" the analytics challenge are using a variety of tools to meet the Big Data challenge. The data scientist (lower case from here on) is using elements such as the Apache Hadoop open source software framework for the storage and large-scale processing of datasets on clusters of commodity hardware. On top of Hadoop, the data scientist is using in-database and in-memory analytics to uncover patterns and anomalies, gain new insights and make decisions based on the facts discovered.
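To make the Hadoop processing model concrete, here is a single-process sketch of the MapReduce pattern that Hadoop distributes across a cluster: a map step emits (key, value) pairs, the framework groups them by key (the "shuffle"), and a reduce step aggregates each group. The word-count job below is the classic teaching example, not a real Hadoop program.

```python
from collections import defaultdict

def map_phase(record):
    # Emit (word, 1) for every word in one input record (a line of text).
    for word in record.lower().split():
        yield word, 1

def reduce_phase(key, values):
    # Aggregate all counts emitted for one key.
    return key, sum(values)

def run_job(records):
    groups = defaultdict(list)  # stand-in for Hadoop's shuffle/sort stage
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, vs) for k, vs in groups.items())

counts = run_job(["big data big insight", "big analytics"])
print(counts)  # {'big': 3, 'data': 1, 'insight': 1, 'analytics': 1}
```

On a real cluster the map and reduce functions run in parallel on many machines, with the framework handling data locality and fault tolerance; the logic, however, is exactly this shape.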

The data scientist uses data visualization tools to (in theory, if he or she does it right) begin to uncover patterns in both internal and external data, and to perceive and act upon the resulting analytics. These same analytics tools can, of course, be turned inward, so to speak, and focused on the firm's own operations to uncover trends and to perceive and predict actions that could and should be taken to maximize profitability and the welfare of employees and customers.
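Anomaly detection is one of the simplest forms of this pattern-finding. As a hedged sketch (invented data, and a basic z-score rule rather than any production method), the snippet below flags any data point that sits more than two standard deviations from the mean of a series of daily order volumes:

```python
from statistics import mean, stdev

def find_anomalies(series, threshold=2.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(series), stdev(series)
    return [x for x in series if abs(x - mu) > threshold * sigma]

# Hypothetical daily order volumes; one day spikes far above the rest.
orders = [102, 98, 105, 99, 101, 250, 97, 103]
print(find_anomalies(orders))  # [250]
```

A real deployment would use more robust techniques (rolling windows, seasonality-aware models), but the idea is the same: separate the signal worth investigating from the ordinary noise.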

Analytics used at its most effective level becomes a tool for firms to drive their ability to compete and innovate.

Nobody's perfect (with data analytics) yet
This is the pure (as opposed to applied) theory of data analytics, where it sits in perfect post-deployment harmony inside a Hadoop (or other) managed Big Data framework. Not all of this theorizing is easy to pull off overnight, and we know that Hadoop installations are complex by their very nature. But taking the purist view is a good exercise at this comparatively early stage for cloud, Big Data and analytics (and let's not forget mobile too). We need to discuss what is possible and then see how close we can get to perfect.

In this new world of business is it now fair to table our opening gambit again? Have the fundamental mechanics of business changed? For many real-world businesses today there is now an open admission and acceptance that data is the greatest commercial asset that they have. Not every firm has complete control of its data asset base, but this is precisely why we are having this discussion.

The fundamental mechanics of business have changed and the non-data-centric company will ultimately be history. Soon after that, the non-analytics company will also be a distant memory. Senior management (largely) agrees with the need to shift resources toward data-driven decision making, and wider line-of-business strategies are also falling into line.

A mindset for the future
The sophisticated data analysis innovator has fine-grained business control and a stable strategic growth path planned out that is capable of constant and continuous dynamic change. We're not re-inventing the light bulb or the wheel today, but we are re-inventing our core operational business ethos and mindset. Nonbelievers in the data revolution will be historical figures sooner than they think.

This post is brought to you by SAS.

SAS is a leader in business analytics software and services and the largest independent vendor in the business intelligence market.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole.


