Data Control and Data Residency

Consider your options carefully when planning the move to the cloud

The benefits associated with adoption of the cloud are well documented and well understood. Organizations cite tremendous cost savings, fast deployment times, and streamlined application support and maintenance when compared to traditional on-premises software deployments. So what is holding many companies back from adopting the cloud? A recent Gartner report entitled "Five Cloud Data Residency Issues That Must Not Be Ignored" highlights one key reason for this hesitancy - enterprises' questions and concerns about jurisdictional and regulatory control, arising from a lack of clarity on where cloud data truly resides. The report recommends that enterprises adopt measures that simultaneously boost the security of sensitive data and help satisfy regulatory compliance with data residency laws.

While the report provides some excellent guidance on implementing one technique - encryption - to safeguard sensitive information in the cloud, it does not cover a few key points that deserve mention:

  • Tokenization should be given strong consideration as the data security technique that enterprises deploy when data residency is a critical concern.
  • If enterprises deploy encryption, they should take every measure to ensure they are using the strongest form of encryption possible (e.g., FIPS 140-2 validated modules) to guard against the inherent threats of multi-tenant cloud environments.

Why Tokenization?
Tokenization is a process by which a sensitive data field, such as a "Name" or "National ID Number," is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches to creating tokens exist, they are frequently just randomly generated values that have no mathematical relation to the original data field (a third-party evaluation of the PerspecSys tokenization approach is available for review). This underlies the inherent security of the approach - it is nearly impossible to determine the original value of a sensitive data field knowing only the surrogate token value. When deployed as a technique within a Cloud Data Protection Gateway, the token "vault" that matches the clear-text value with the surrogate token stays on-site within an organization's data center. The benefit from a data residency compliance perspective is therefore apparent - the data truly never leaves the enterprise's location.
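As a rough illustration of the vault concept described above, the following Python sketch pairs randomly generated surrogate tokens with their original values in an on-premises lookup table. The `TokenVault` class and its method names are hypothetical, shown only to make the idea concrete; this is not PerspecSys' actual implementation.

```python
import secrets

class TokenVault:
    """Illustrative on-premises token vault: maps random surrogate tokens
    back to original sensitive values. The token carries no mathematical
    relationship to the value it stands in for."""

    def __init__(self):
        self._vault = {}  # token -> original value (stays inside the data center)

    def tokenize(self, value: str) -> str:
        # Generate a random surrogate with no mathematical relation to the value.
        token = "TOK-" + secrets.token_hex(8)
        self._vault[token] = value
        return token  # only the token is sent on to the cloud application

    def detokenize(self, token: str) -> str:
        # Redeeming a token requires access to the on-premises vault.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("Jane Q. Public")   # e.g. "TOK-3f9a1c2d4b5e6f70"
print(token)
print(vault.detokenize(token))             # original value never left the enterprise
```

In this model only the token travels to the cloud; redeeming it requires access to the vault, which never leaves the enterprise's data center.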

How Encryption Differs
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform sensitive data's original value to a surrogate value. The surrogate can be transformed back to the original value via the use of a "key," which can be thought of as the means to undo the mathematical lock. While encryption clearly can be used to obfuscate a value, a mathematical link back to its true form still exists. As described, tokenization is unique in that it completely removes the original data from the systems in which the tokens reside (the cloud) and there is no construct of a "key" that can be used to bring it back into the clear in the cloud.
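To make the contrast concrete, here is a minimal sketch of key-based encryption using the open-source Python `cryptography` package (Fernet, an AES-based construction). It is illustrative only, not the gateway implementation discussed in this article; the point is that anyone holding the key can reverse the transformation, wherever the ciphertext happens to reside.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key is the "mathematical lock": whoever holds it can recover the plaintext.
key = Fernet.generate_key()
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"Jane Q. Public")   # surrogate value sent to the cloud
plaintext = cipher.decrypt(ciphertext)           # reversible wherever the key is available

print(ciphertext)
print(plaintext.decode())
```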

In our experience with many customers, it is this unique characteristic of tokenization that has made it the preferred approach selected by enterprises when they are explicitly trying to address data residency requirements. In the words of one of our largest customers (who selected tokenization as their data security approach), "encrypted data leaving your premises is still data leaving your premises."

But If Encryption Is Used - Deploy Using Best Practices
If an organization decides to deploy encryption to protect sensitive information going to the cloud, it needs to ensure that industry-standard best practices for encryption are followed. As highlighted in the Cloud Security Alliance's guidelines as well as numerous Gartner reports, the use of published, well-vetted, strong encryption algorithms is a must. In fact, the previously mentioned report "Five Cloud Data Residency Issues That Must Not Be Ignored" notes that enterprises need to ensure that the "strength of the security is not compromised." A good guideline is to look for solutions that support FIPS 140-2 validated algorithms from well-known providers such as McAfee, RSA, SafeNet, Symantec and Voltage Security. A unique and highly valued quality of the PerspecSys gateway is that cloud end users can still enjoy the full capabilities of cloud applications (such as SEARCH) even with data that is strongly encrypted with these industry-accepted, validated algorithms.
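As an illustration of what "published, well-vetted" means in practice, the sketch below uses AES-256-GCM via the Python `cryptography` library. This is an assumed example rather than any vendor's product code, and FIPS 140-2 validation is a property of the specific cryptographic module a vendor ships, not of this snippet.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)   # AES-256, a published, well-vetted algorithm
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # 96-bit nonce, unique per encryption
ciphertext = aesgcm.encrypt(nonce, b"4111 1111 1111 1111", None)  # None = no associated data
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
print(plaintext.decode())
```

Preserving application features such as SEARCH over data encrypted this way requires additional gateway-side techniques beyond what is shown here.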

Netting It Out
There is much to gain from using data obfuscation and replacement technologies to satisfy residency requirements and pave the way to cloud adoption. But equally, there is much to lose if the implementation is not well thought through. Do your homework: consider tokenization as an approach, question any encryption techniques that are not well vetted and accepted in the industry, and compare solutions from multiple vendors (a suggestion - refer to our whitepaper, "Critical Questions to Ask Cloud Protection Gateway Providers," as a guide). We know from our experience helping many organizations around the world tackle these challenges with our Cloud Data Protection Gateway that by charting your path carefully at the beginning of your project, you can arrive at a solution that fully meets the needs of your Security, Legal, and Business Line teams.

PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information please visit http://www.perspecsys.com/ or follow on Twitter @perspecsys.

More Stories By Gerry Grealish

Gerry Grealish is Vice President, Marketing & Products, at PerspecSys. He is responsible for defining and executing PerspecSys’ marketing vision and driving revenue growth through strategic market expansion and new product development. Previously, he ran Product Marketing for the TNS Payments Division, helping create the marketing and product strategy for its cloud-based payment gateway and tokenization/encryption security solutions. He has held senior marketing and leadership roles for venture-backed startups as well as F500 companies, and his industry experience includes enterprise analytical software, payment processing and security services, and marketing and credit risk decisioning platforms.
