What Will It Really Take to Provide an Agile Big Data Infrastructure?

Meet CodeFutures Corporation at Cloud Expo

"An agile approach to data is really a requirement of just about any application, but even more so with Big Data," stated Cory Isaacson, CEO/CTO of CodeFutures Corporation, in this exclusive Q&A with @ThingsExpo conference chair Roger Strukhoff. "What the enterprise needs is a data platform that can adapt to changing requirements in a flexible and rapid manner. This has not been the case with existing databases."

Cloud Computing Journal: With the MapDB announcement, it sounds like to some degree you're bringing the Java programming language into the 21st-century enterprise. To what degree do you agree with that statement?

Cory Isaacson: This is a very good assessment. While Java has always been a capable language, there have been many barriers to delivering full-featured database technology. For example, APIs like JDBC have worked for many years, but fast performance, convenience, and tight integration between Java applications and the database were not there. With MapDB, Java developers now have the convenience and agility of the native Java Collections API, backed by the power of a very fast database engine.
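To illustrate that Collections-style approach, here is a minimal sketch assuming the current MapDB 3.x API (`org.mapdb.DBMaker`, `Serializer`); earlier MapDB versions used slightly different factory names, and the map name and values below are purely illustrative:

```java
import org.mapdb.DB;
import org.mapdb.DBMaker;
import org.mapdb.Serializer;

import java.util.concurrent.ConcurrentMap;

public class MapDbSketch {
    public static void main(String[] args) {
        // An in-memory database; the same code works against a file-backed one.
        DB db = DBMaker.memoryDB().make();

        // The map behaves like an ordinary java.util.concurrent.ConcurrentMap,
        // but its entries live in the MapDB storage engine rather than the heap.
        ConcurrentMap<String, Long> hits = db
                .hashMap("hits", Serializer.STRING, Serializer.LONG)
                .createOrOpen();

        hits.put("home", 1L);
        hits.merge("home", 1L, Long::sum); // standard Map API, no SQL or JDBC

        System.out.println(hits.get("home")); // 2
        db.close();
    }
}
```

The point of the sketch is that the application code is plain `ConcurrentMap` usage; only the two builder lines at the top know a database engine is involved.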

Cloud Computing Journal: You emphasize a lightweight, agile approach. To what degree is this simply a requirement of Big Data applications and to what degree do you think agility is required in general for the modern enterprise? Does the real-time (or almost real-time) nature of a lot of Big Data also drive a need for agility?

Isaacson: An agile approach to data is really a requirement of just about any application, but even more so with Big Data. What the enterprise needs is a data platform that can adapt to changing requirements in a flexible and rapid manner. This has not been the case with existing databases. For example, the NoSQL engines move in this direction, but with a Big Data store it is really tough to achieve agility.

Added to this need are the burgeoning real-time data requirements. With a real-time data flow, it is critical to know what is happening now; it is not enough to get results from a typical historical time window (such as days or weeks after something has occurred). The reason an agile approach is so critical is because the results needed from real-time requirements are also likely to change at a rapid pace, more than with traditional enterprise or Big Data applications.

Cloud Computing Journal: What benefits will your customers and your product receive from the open-source approach?

Isaacson: MapDB is freely available under the Apache 2.0 license, which allows customers (or anyone) to use the product as they see fit. Being open source also means that we receive major feedback from users, enabling extremely fast support and stability improvements in the product. We often hear within days or hours if there is an issue with a pre-release version, so it can be addressed before it makes its way into a final release. The support of the community is vital.

Cloud Computing Journal: You mention support of databases up to 100GB in size - are there "typical" databases of this size that you encounter? In other words, what sorts of applications and initiatives are driving databases of this size?

Isaacson: We have seen users apply MapDB to everything from pure in-memory stores to large disk-based databases. So I would not say there is a "typical" size, but the good news is that a developer can comfortably scale a database as needed, from small to very large, without needing to be concerned with the details.
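In practice, that scaling path can come down to swapping one builder call, since the collection code is unchanged. A hedged sketch, again assuming the MapDB 3.x `DBMaker` API (the file name and map contents are hypothetical):

```java
import org.mapdb.DB;
import org.mapdb.DBMaker;
import org.mapdb.Serializer;

import java.util.Map;

public class ScaleSketch {
    // Small data set: keep everything in memory.
    static DB inMemory() {
        return DBMaker.memoryDB().make();
    }

    // Larger data set: back the same collections with a file,
    // using memory-mapped I/O where the OS supports it.
    static DB onDisk(String path) {
        return DBMaker.fileDB(path)
                .fileMmapEnableIfSupported()
                .make();
    }

    public static void main(String[] args) {
        DB db = inMemory(); // swap for onDisk("events.db") as data grows
        Map<Long, String> events = db
                .treeMap("events", Serializer.LONG, Serializer.STRING)
                .createOrOpen();
        events.put(1L, "startup");
        System.out.println(events.size()); // 1
        db.close();
    }
}
```

Because both factories return the same `DB` type, the application sees an identical `Map`, whether it is backed by heap memory or by a file on disk.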

Cloud Computing Journal: What sort of increased interest in Big Data have you seen over the past year or so, and what sort of questions do you anticipate from customers at Cloud Expo?

Isaacson: The interest in Big Data is explosive right now, especially given the new real-time element, and new data generators such as the Internet of Things.

Virtually every customer we are working with has some sort of Big Data initiative or requirement; it is becoming tightly integrated into business strategies and planning. Big Data will be a vital commodity in the economy from here on out and it will affect almost every business from large to small.

The types of questions we anticipate at Cloud Expo are:

  • What will it really take to provide an agile Big Data infrastructure?
  • How can real-time data flows be leveraged for real-time strategic advantage?
  • What capabilities will enable application developers to respond faster to business requirements, without the restrictive nature of current database technologies? In other words, how can we make the job for developers easier, while enabling far more powerful Big Data access?

In addition to all of these questions, we expect to get quite a few regarding our upcoming technology releases - we'll have many things to discuss with technologists and managers while at the event.

More Stories By Elizabeth White

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.


