Big Compute Gives Life to New Data By @Nimbix | @CloudExpo [#BigData]

Deriving value through computation

What's Big Data without Big Compute? Basically just a large collection of unstructured information with little purpose and value. It's not enough for data simply to exist; we must derive value from it through computation - something commonly referred to as analytics.

The Quantum Nature of Big Data
With traditional data, we simply query it to derive results; everything we need is already stored within the data set itself. For instance, for a customer database with dates of birth, we may just fetch the list of customers who were born after a certain date. That is a simple query, not a computation, and therefore cannot be considered analytics.
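
For illustration, here's a minimal sketch of such a query in Python with pandas; the table and column names are hypothetical:

    # A plain query, not analytics: everything needed to answer it is already
    # stored in the (hypothetical) customer table itself.
    import pandas as pd

    customers = pd.DataFrame({
        "name": ["Ada", "Grace", "Alan"],
        "date_of_birth": pd.to_datetime(["1985-03-02", "1972-11-19", "1990-07-30"]),
    })

    # Fetch the customers born after a certain date - a lookup, not a computation.
    born_after_1980 = customers[customers["date_of_birth"] > "1980-01-01"]
    print(born_after_1980)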

However, with Big Data, we can distribute the information so that we can run complex analytics on it at scale. Unlike traditional data, the information itself has little meaning until we process it. The reason we distribute the data sets is not because they are large, but because we want to leverage clusters of computers to run more than just simple queries. That is why, in the Big Data model, the data itself doesn't hold the answer - to unlock it we must compute it. Think of this as a virtual "Schrödinger's Cat": it can mean anything until we actually look "in the box." The difference is that we're not asking a simple question such as "is it dead or alive?" but rather a more complex inquiry such as "assuming it's alive, what might its future behavior be?" Analytics, especially predictive analytics, rely on patterns and their associations. Because the data sets tend to change (or grow) over time, the results of these complex computations will most definitely vary as well.
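
As a rough sketch of that idea, here's how a distributed aggregation might look in PySpark (my choice of framework - the article doesn't name one); the file path and fields are hypothetical, and the same code that runs locally here would spread the work across a cluster:

    # A minimal PySpark sketch (assumes the pyspark package; runs in local mode
    # here, but the identical code distributes across a cluster's executors).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .master("local[*]")
             .appName("pattern-analytics")
             .getOrCreate())

    # Hypothetical clickstream events; the "answer" isn't stored anywhere in them.
    events = spark.read.json("events/*.json")

    # Not a lookup: an aggregation over every record, computed in parallel.
    top_pairs = (events
                 .groupBy("customer_id", "product_id")
                 .agg(F.count("*").alias("views"))
                 .orderBy(F.desc("views"))
                 .limit(10))
    top_pairs.show()
    spark.stop()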

Knowing this, associating the term "Big Data" with the data alone perhaps sells the concept short, since Big Data cannot really exist without Big Compute to make sense of all the information.

What's So Special About Big Compute?
Big Compute implies one of two things:

  1. Ordinary computing scaled across a massive parallel cluster
  2. High-Performance Computing (HPC)

The problem with the former is that it can only scale so far before performance drops off.  Furthermore, for it to really succeed, the data itself must be scaled just as wide.  This also brings with it practical challenges of systems management, complexity and infrastructure constraints such as networking and power.

High-Performance Computing is a more natural form of "Big Compute" because it scales and packs a powerful per-unit punch.  What does this mean? Simply that we can realize higher computation density with far fewer "moving parts."  A good example is using Graphics Processing Units (GPUs) for vector calculations.  Sure, you can do this with Central Processing Unit (CPU) cores alone, but a typical server node has only around 8-16 of them, while each GPU can have hundreds or even thousands of cores.  If you vectorize your calculations to take advantage of this, you can do far more work with far less power and management complexity (at scale) than if you had to spread it across dozens or even hundreds of CPUs (and the servers they live in).
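
As a rough sketch of that contrast, the snippet below runs the same element-wise (vector) calculation with NumPy on CPU cores and with CuPy on a GPU; it assumes an NVIDIA GPU with the cupy package installed, which is my assumption rather than anything the article specifies:

    # Same vector calculation on CPU cores (NumPy) and on a GPU (CuPy).
    import numpy as np
    import cupy as cp

    n = 10_000_000
    x_cpu = np.random.rand(n).astype(np.float32)

    # CPU version: spread across at most a handful of cores per server node.
    y_cpu = np.sqrt(x_cpu) * 2.0 + 1.0

    # GPU version: the same element-wise operation fanned out across thousands
    # of GPU cores at once.
    x_gpu = cp.asarray(x_cpu)
    y_gpu = cp.sqrt(x_gpu) * 2.0 + 1.0

    # The results agree; only the hardware doing the arithmetic differs.
    assert np.allclose(y_cpu, cp.asnumpy(y_gpu), atol=1e-5)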

So this raises the question: Why does Big Compute really matter?  Can't simple algorithms on commodity compute already do predictive analytics?

The answer is of course yes, but there are two problems - one immediate and one future.

The immediate problem is that in many cases, the speed at which you get results matters just as much as the results themselves.  For example, if you are planning on using analytics to improve e-commerce, the best time to do this is while the customer is engaged in a transaction. Sure, there's still value in following up with the customer later, after you've crunched the data, but why not take advantage of the moment, while he or she has credit card in hand, to present insights that may increase spend?

When you combine this with the fact that there may be thousands of concurrent transactions at any given time, over-subscribing commodity compute to perform complex analytics won't get you the results you need in time to maximize the value of Big Data.

This is where Big Compute can perform the same operations thousands of times faster.  In many cases, the value of the data is sensitive to the amount of time needed to compute it.  There are many examples of this - e-commerce is just a popular one.  In other cases, the data set itself is changing (generally growing) rapidly.  If analytics take too long, the results may already be obsolete or irrelevant once delivered.
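
To make that time-sensitivity concrete, here's a minimal sketch of in-transaction scoring against a latency budget; the model, budget, and session fields are hypothetical stand-ins, not anything the article prescribes:

    # Hypothetical in-transaction scoring: a result that arrives after the
    # customer has checked out is worth far less than one that arrives in time.
    import time

    LATENCY_BUDGET_MS = 150  # illustrative per-transaction budget

    def score_offer(session):
        """Stand-in for a predictive model scoring the live session."""
        return {"offer": "accessory-bundle", "confidence": 0.72}

    def recommend_during_checkout(session):
        start = time.perf_counter()
        result = score_offer(session)
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > LATENCY_BUDGET_MS:
            # Too slow to influence this purchase; fall back to a follow-up later.
            return None
        return result

    print(recommend_during_checkout({"customer_id": 42, "cart": ["laptop"]}))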

Simply put, Big Compute powered by HPC is the fastest, most efficient way to derive value from data at scale - at the exact moment needed.

This brings us to the future problem with commodity compute.

Innovation in Algorithms
How do you derive future value from the same data you have (or are collecting) today?  If we look at Big Data as a two-part problem - storing the data and analyzing the data - we quickly realize where the greatest potential for innovation is.  It's not in storage because, although challenging, we've seen densities increase dramatically since the dawn of computing.  As a (crude) point of reference, a consumer could buy a three-terabyte hard disk in 2014 for less than the cost of a 200-gigabyte one just 10 years prior.  Higher storage densities mean less infrastructure to manage, and thus make storing large data sets more practical over time as well (not just cheaper).  So we can rest easy knowing that, all things being equal, as the data sets grow, so will the storage to hold them, in a relatively cost-effective way.

Obviously, the most room for innovation is in the analytics algorithms themselves.  We will see both the speed and the quality of the computations increase dramatically over time.  Commodity compute is a non-starter for algorithms that are too complex to run quickly against large data sets; thanks to Big Compute, there's no need to compromise.

Just imagine the opportunities we'd miss if we avoided problems that are seemingly too hard to solve.  Big Compute makes it possible to run the most complex algorithms quickly, and the sky's the limit when it comes to the types of analytics we'll see as a result.

Big Compute will help Big Data evolve to not just be "bigger," but to be far more meaningful than we can ever imagine.

More Stories By Leo Reiter

Leo Reiter is CTO of Nimbix, a provider of cloud-based High Performance Computing and Big Data platforms and applications that help organizations solve their most complex problems faster and more easily.
