Big Data Governance for Good or Evil

Lessons of the NSA PRISM Initiative

In the days since the news of the NSA’s secret PRISM spying – oops, surveillance initiative broke, there has been no end of consternation among the media and the Twitterverse. And regardless of where you fall on the political spectrum or what you think of the morality of the NSA’s efforts to collect information about our phone calls or social media interactions, one clear fact shines through: Big Data are real. They are here to stay. But they are also increasingly dangerous. As I explain in my book The Agile Architecture Revolution, the more powerful the technology, the more importance we must place on governance. So too with Big Data.

PRISM’s Big Data Governance Lessons
Most people would agree that finding terrorists and stopping them before they can wreak havoc is a good thing. It is also safe to assume that most people would allow that the US Government should be in the intelligence-gathering business, if only to stop the aforesaid terrorists. Countries have been gathering intelligence for millennia, after all, and victories frequently go to the adversary with the better intelligence. Why, then, are people livid about the NSA this time around?

The answer, of course, is that we’re not angry that the NSA is gathering intelligence on terrorists. We’re upset that the NSA is gathering intelligence on everybody else, including ourselves. We’re not talking about some James Bond-style spy mission here. We’re talking about Big Data.

Here, then, is PRISM Big Data lesson number one: It’s not just the data you want that are important, you also have to worry about the data you don’t want. Traditional data governance generally focuses on the data you want: let’s make sure our data are clean, correct, and properly secured. When we have a limited quantity of data and they all have value, then issues like data quality are relatively straightforward (although achieving data quality in practice may still be a major headache).

In the Big Data scenario, however, we’re miners looking for that nugget of gold hidden in vast quantities of dross. Yes, we must govern that nugget of value, but that’s the easy task, relatively speaking. The lesson from PRISM is that we must also govern the dross: the data we don’t want, because they open up a range of governance challenges like the privacy issues at the core of the PRISM scandal.

Your Big Data governance challenge may not be privacy related, but the fact remains that the more leftover data you have, the harder it is to govern them. After all, just because you don’t find value in Big Data doesn’t mean your competition or a hacker won’t.

The second lesson from PRISM: metadata may be Big Data as well. Data professionals are used to thinking of metadata as having technical value but little worth outside the bowels of the IT organization. In the case of PRISM, however, the NSA went after call detail records (CDRs), not the calls themselves. True, I felt a strangely geeky thrill when President Obama used the word metadata – and used it correctly, by the way – but the recent focus on call metadata only serves to highlight the fact that the metadata themselves may be the most valuable Big Data you own. Ask yourself: how robust is your metadata governance? If it’s not every bit as rock solid as your everyday data governance, then perhaps you’re not ready for Big Data after all.
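To see why call metadata are so potent on their own, consider how little it takes to reconstruct a social graph from call detail records. The sketch below uses entirely invented numbers and a simplified CDR layout (caller, callee, timestamp, duration); it is an illustration of the principle, not any real system:

```python
from collections import defaultdict

# Hypothetical CDRs: (caller, callee, timestamp, duration_seconds).
# All values are made up for illustration.
cdrs = [
    ("555-0101", "555-0202", "2013-06-01T09:15", 320),
    ("555-0101", "555-0303", "2013-06-01T09:40", 45),
    ("555-0202", "555-0303", "2013-06-02T14:05", 610),
    ("555-0101", "555-0202", "2013-06-03T08:50", 120),
]

def contact_graph(records):
    """Build an undirected call graph: who talks to whom, and how often."""
    graph = defaultdict(lambda: defaultdict(int))
    for caller, callee, _ts, _dur in records:
        graph[caller][callee] += 1
        graph[callee][caller] += 1
    return graph

g = contact_graph(cdrs)
# The metadata alone reveal the social network: 555-0101's strongest
# tie is 555-0202 (two calls), without ever touching the call audio.
print(dict(g["555-0101"]))  # {'555-0202': 2, '555-0303': 1}
```

Four records already expose who is central to the network, which is precisely why metadata deserve governance every bit as strict as the data themselves.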

PRISM lesson number three: Big Data analytics apps can be data governance tools themselves, particularly when the central challenge is data quality. Terrorists, after all, aren’t quite stupid enough to send tweets like buying #plasticexplosives now, meet me at the #Boston #Marathon. They may be fanatics, but let’s posit that we’ve already taken out the real numbskulls, OK? We can safely assume terrorists are actively seeking to obscure their communications, which, from the enterprise perspective, is an example of (in this case intentionally) poor data quality.

The NSA naturally has sophisticated algorithms for cutting through such obfuscation. As your Big Data sets grow, you’ll need similarly sophisticated tools for cleaning up run-of-the-mill data quality issues. Remember, the bigger the data sets, the more diverse and messy your data quality challenges will become. After all, fixing mailing address formats in your ERP system is dramatically simpler than bringing a vast hodgepodge of structured, semi-structured, and unstructured information into some kind of order.
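Even the mundane end of that spectrum, normalizing inconsistent formats, benefits from programmatic cleanup at scale. A minimal sketch of the idea, using a hypothetical helper and invented sample values rather than any particular tool’s API:

```python
import re

# Four representations of the same (fictional) phone number,
# as they might arrive from different source systems.
raw = ["(617) 555-0142", "617.555.0142", "617-555-0142", "16175550142"]

def normalize_phone(value):
    """Collapse assorted US phone formats to one canonical form."""
    digits = re.sub(r"\D", "", value)  # strip everything but digits
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop the country code
    if len(digits) != 10:
        return None  # flag for manual review rather than guessing
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

# All four variants collapse to a single canonical value.
print({normalize_phone(v) for v in raw})
```

Rule-based normalization like this handles the structured cases; the semi-structured and unstructured hodgepodge is where the genuinely sophisticated tooling comes in.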

On to PRISM lesson number four: Your Big Data analytics results may not only be valuable, they may also be dangerous. While it’s common to liken Big Data analytics to mining for gold, in reality it may be more like mining for uranium. True, uranium has monetary value, but put too much pure uranium in the same place and you’re asking for Big Trouble – Trouble with a capital T.

For example, US Census data are publicly available, but they are not allowed to provide any personally identifiable information. However, if it turns out that there is, say, only one Native American family with two children in a given zip code, then it may be possible to uniquely identify them by crunching the data. As a result, the Census Bureau must be very careful not to publish any data that may lead to such results.
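The Census Bureau’s defense against this kind of re-identification is disclosure avoidance, for instance suppressing any published cell whose count falls below a threshold. A simplified sketch of that idea (the records, categories, and threshold are invented for illustration):

```python
from collections import Counter

# Hypothetical census-style microdata: (zip_code, ethnicity, children).
records = [
    ("02134", "White", 2), ("02134", "White", 2), ("02134", "White", 2),
    ("02134", "Native American", 2),   # a unique combination
    ("02134", "Asian", 1), ("02134", "Asian", 1), ("02134", "Asian", 1),
]

def publishable_counts(rows, k=3):
    """Suppress any cell with fewer than k members before publication,
    the simplest form of disclosure avoidance."""
    counts = Counter(rows)
    return {cell: n for cell, n in counts.items() if n >= k}

# The unique Native American family's cell (count 1) is withheld;
# only the cells of size 3 or more survive publication.
print(publishable_counts(records))
```

The governance point: the danger lies not in any single record but in what the aggregate results reveal, so the analytics output itself must be governed.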

Similarly, a significant danger in the NSA analysis is the risk of false positives. Mistakenly identifying an innocent citizen as a terrorist is an appalling risk that outweighs ordinary privacy concerns – at least in the opinion of the innocent citizen. And while on the one hand, the more data the NSA crunches, the less likely a false positive may be, it also follows that such false positives are all the more dangerous for their rarity.
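The arithmetic behind this danger is worth spelling out. Even a remarkably accurate classifier, applied to an entire population with a tiny base rate of actual terrorists, produces far more false positives than true ones. All numbers below are purely illustrative:

```python
# Back-of-the-envelope Bayes calculation with illustrative numbers.
population = 300_000_000
terrorists = 1_000
sensitivity = 0.999          # P(flagged | terrorist)
false_positive_rate = 0.001  # P(flagged | innocent)

true_positives = terrorists * sensitivity
false_positives = (population - terrorists) * false_positive_rate

# P(terrorist | flagged), by Bayes' theorem: of everyone flagged,
# what fraction are actually terrorists?
precision = true_positives / (true_positives + false_positives)

print(f"{false_positives:,.0f} innocents flagged; "
      f"only {precision:.2%} of flags point at real terrorists")
```

With these assumptions, roughly 300,000 innocent people get flagged and well under one percent of flags are genuine: the base rate, not the algorithm’s accuracy, dominates the outcome.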

On to the fifth lesson, what ZapThink likes to call the Big Data corollary to Parkinson’s Law. You may recall that Parkinson’s Law states that the amount of work you have will expand to fill the available time. The Big Data corollary states that the amount of data you collect will expand to consume your ability to store and process it. In other words, if it’s possible to collect Big Data, then somebody will. It’s a question of what to do with it, not a question of whether to collect it in the first place. So let’s not worry about whether the NSA should collect the data it does. If they don’t, then someone else will – or already has. Any Big Data governance effort faces the same challenge: what to do with your data, not whether to collect it in the first place.

Finally, the sixth lesson, which is actually a lesson from something the NSA isn’t doing. Note that in the case of the NSA, current data are more valuable than historical data, even historical data that are one day old. Their paramount concern is to mine current intelligence: what terrorists are doing right now. But your problem area might find value in historical data as well as current data. If your problem deals with historical trends, then your data sets have just ballooned again, as have your data governance challenges.

The ZapThink Take
The NSA was only collecting phone call metadata, because those metadata met their needs. But what about the data themselves – the call audio? Perhaps they are unable to collect such vast quantities of data today. Even so, it’s only a matter of time. The question is, once they’re able to collect all call audio, will they? Yes, of course they will. The corollary to Parkinson’s Law in action, after all.

In fact, we might as well just go ahead and assume that somewhere in the Federal Government, they’re collecting all the data – all the phone calls, all the emails, all the tweets, blog posts, forum comments, log files, everything. Because even if they aren’t quite able to amass the whole shebang yet, it’s just a matter of time till they can. And while this scenario seems like a page out of Orwell’s 1984, the most important lesson here is that data governance is now of central importance. It’s no longer a question of whether we can collect Big Data. The entire question is what we should do with Big Data once we have them.

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
