Encryption in Use Deep Dive

What you need to know to secure and control your data

Encryption in Use – Fact and Fiction
Risk-conscious enterprises across the globe have been reluctant to embrace the public cloud model. For many, compliance requirements are the source of the reluctance. For others, the major hurdle is ceding control of their data to a cloud service provider without the provider accepting liability for customer data. Conforming to data residency regulations, when implementing a distributed services model, presents a further complication. Even as these challenges to adoption loom large, the economics and productivity benefits of cloud-based services remain compelling. For these organizations to make the transition to the cloud, a range of elements must be in place, including continuous monitoring of the cloud service provider's data center, enforcement of appropriate service level agreements, data classification and definition of internal processes to manage cloud-based services. Encryption in use is a critical piece of this puzzle, since it provides a mechanism for the enterprise to extend its boundary of control to data stored and processed within the cloud service provider's environment. However, not all encryption in use is created equal, nor is it equally secure or generic. A one-size-fits-all approach is likely to fall short of balancing security and functionality.

The Case for Encryption in Use
For almost as long as the field of information security has existed, encryption of data at rest and encryption of data in transit have served as cornerstone technologies for preventing access to sensitive, proprietary, confidential or regulated data. Both forms of encryption depend on cryptographic keys (symmetric keys, or a combination of public and private keys) to unlock the encrypted data. The great step forward for modern cryptography was the idea that the key used to encrypt your data could be made public while the key used to decrypt it could be kept private. In both cases the purpose is the same: to ensure that only users or systems with access to the key can access the data.
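To make that idea concrete, here is a minimal sketch of the public/private key concept. The use of the open-source Python cryptography package is an illustrative assumption; the article names no specific library. Anyone holding the public key can encrypt, but only the private key holder can decrypt.

```python
# A minimal sketch (assumed library: the open-source Python "cryptography"
# package) of the public/private key idea: anyone can encrypt with the
# public key, but only the private key holder can decrypt.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Generate a key pair; the public half can be shared freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

ciphertext = public_key.encrypt(b"sensitive record", oaep)

# Only the matching private key recovers the plaintext.
assert private_key.decrypt(ciphertext, oaep) == b"sensitive record"
```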

Encryption in use provides functionality that is almost counter-intuitive given the purpose of modern encryption for data at rest and data in transit: it works to ensure that the data remains encrypted even as users interact with it, performing operations such as search or sort. However, just like encryption for the other states of data, encryption in use serves a clear need. Without it, organizations cannot retain ownership and control of their data stored and processed in a cloud-based service, whether that control is required to address security, compliance, data residency, privacy or governance needs. Encryption in use is similar to format-preserving encryption in that it is applied in real time, but it allows for a far broader range of cloud service functionality and feature support.

Encryption in use enables enterprises to independently secure their data stored and processed at cloud service providers while holding on to the encryption keys. The ongoing revelations of government surveillance, backed by laws compelling cloud service providers to hand over customer data, highlight the challenge end users face in meeting their obligation to retain direct control of their cloud data. The recent set of recommendations from the Review Group on Intelligence and Communications Technologies appointed by the White House, focused on implementing better privacy protections, is only the first step in revisiting these policies.

Because encryption in use is an emerging area, the technology can easily be misunderstood, or even misrepresented. Typically, encryption in use entails a gateway, or proxy, architecture. The user accesses the application via the gateway, whether the application server is in the cloud or on premises. The key to decrypt the data resides in the gateway (or in an integrated HSM), ensuring that data stored and processed at the server is persistently encrypted, even though the encryption is entirely transparent to the user. Were the user to access the server directly, bypassing the gateway, the data would appear as a string of encrypted gibberish. As long as the gateway remains under the data owner's control, only authorized users can gain access to the data stored and processed at the cloud service provider, or other third party.
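As a rough illustration of that gateway pattern, the sketch below encrypts designated fields on the way out to the cloud service and decrypts them on the way back, with the key held only at the gateway. The field names are hypothetical, and Fernet (authenticated symmetric encryption from the Python cryptography package) stands in for whatever scheme a real gateway would use; this is not any particular product's design.

```python
# A rough sketch of the gateway pattern: sensitive fields are encrypted
# before they reach the cloud service and decrypted on the way back.
# Field names are hypothetical placeholders.
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"customer_name", "email"}  # hypothetical field names

class EncryptionGateway:
    def __init__(self):
        # In a real deployment the key would live in an HSM under the
        # data owner's control; it never reaches the cloud provider.
        self._fernet = Fernet(Fernet.generate_key())

    def outbound(self, record: dict) -> dict:
        """Encrypt sensitive fields before forwarding to the cloud."""
        return {
            k: (self._fernet.encrypt(v.encode()).decode()
                if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()
        }

    def inbound(self, record: dict) -> dict:
        """Decrypt sensitive fields on the way back to the user."""
        return {
            k: (self._fernet.decrypt(v.encode()).decode()
                if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()
        }

gateway = EncryptionGateway()
stored = gateway.outbound({"customer_name": "Alice", "plan": "gold"})
print(stored["customer_name"])                    # ciphertext "gibberish"
print(gateway.inbound(stored)["customer_name"])   # "Alice"
```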

In the event that the cloud service provider is required to hand over customer data in response to a government subpoena, it must meet its legal obligation. However, if encryption in use has been implemented, the service provider can hand over only encrypted gibberish. The request for data must then be directed to the entity that holds the encryption keys. Likewise, a rogue administrator, a hacker or a government entity that gained access to the user account would be able to view only unintelligible gibberish.

Not Some Kind of Magic
In order to deliver on the promise of encryption in use, the gateway must meet a robust set of requirements: comprehensive service functionality and watertight security based on a strong encryption scheme. In practical terms, this means that the entirety of the service's functional elements and behavior must be mapped, and that the encryption scheme must preserve functionality without compromising security. This is because the gateway must recreate the session for the cloud-facing leg and transpose encrypted data into the flow without disrupting functionality such as search, sort and index. Otherwise, the user experience degrades, and the productivity value proposition of the cloud-based service is undermined.
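One commonly described way to preserve exact-match search without exposing plaintext is a keyed "blind index": the gateway stores an HMAC of each value alongside its ciphertext and transposes queries into the same token. The sketch below is a generic illustration of that idea, not any specific vendor's scheme; the key, field layout and placeholder ciphertexts are hypothetical.

```python
# An illustrative sketch of a keyed "blind index" for preserving
# exact-match search over encrypted data. The index key, field layout
# and placeholder ciphertexts are hypothetical.
import hashlib
import hmac

INDEX_KEY = b"key-held-only-at-the-gateway"  # hypothetical

def blind_index(value: str) -> str:
    """Deterministic keyed token; equal inputs yield equal tokens."""
    return hmac.new(INDEX_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

# At write time the gateway stores ciphertext plus the index token.
rows = [
    {"ciphertext": "...", "name_idx": blind_index("Alice")},
    {"ciphertext": "...", "name_idx": blind_index("Bob")},
]

# At query time the gateway transposes the search term into the same
# token, so matching happens without the server ever seeing plaintext.
matches = [r for r in rows if r["name_idx"] == blind_index("alice")]
assert len(matches) == 1
```

Note the trade-off this implies: a deterministic token preserves search, but it also reveals which encrypted values are equal, which is exactly the kind of leakage the next paragraphs warn about when the scheme is applied carelessly.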

Vendors face another set of choices: take shortcuts that cover just enough ground to provide a superficial sense of security, or invest in extensive R&D work to deliver the optimal balance between functionality and strong security. For instance, vendors can opt to provide encryption for just a few data fields, out of hundreds or even a few thousand, to encompass a specific subset of the enterprise's information. Equally, they can choose to implement a cloud data encryption scheme that preserves features relying on referential integrity, such as sort, search and index, but that is easily reversible by attackers.

By way of illustration, if the scheme deterministically encrypts individual words into short AES blocks, the encoding pattern is consistent enough for common attacks to recover clear text from what might appear to be encrypted text. A variety of iterative attacks, such as chosen-plaintext attacks, will yield clear text if the encryption relies on a simplistic and consistent encoding pattern. So while the data may appear to be encrypted, and fewer engineering resources are required to support application features and functionality, the data protection in place is barely skin deep.
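The underlying weakness is easy to demonstrate. In the sketch below, AES in ECB mode (via the Python cryptography package) stands in for any deterministic per-block scheme: identical plaintext blocks produce identical ciphertext blocks, so repetition patterns survive encryption and give an attacker a foothold.

```python
# Why deterministic per-block encryption leaks patterns: with AES in
# ECB mode, identical plaintext blocks always yield identical
# ciphertext blocks, so an observer sees which values repeat without
# ever recovering the key.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)
encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()

block = b"confidential!!!!"                    # exactly one 16-byte AES block
ciphertext = encryptor.update(block + block)   # the same block, twice

# The repetition survives encryption: both ciphertext halves match.
assert ciphertext[:16] == ciphertext[16:]
```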

Encryption in use is not a kind of magic: it requires dedicated engineering expertise, with collaboration between infrastructure, information security and encryption experts. And the encryption scheme must be tailored to the specific application or service to deliver the appropriate balance of security and functionality.

Another significant consideration is evaluating encryption in use in the context of a specific application or service. From the customer's perspective, a single encryption platform for multiple applications is appealing: no customer wants to manage multiple appliances, management interfaces and vendors. The reality, however, is that striking an acceptable balance between security and functionality for any risk-conscious organization requires deep application knowledge and encryption-in-use expertise. Dig a little deeper into the degree of support for each application, or risk gambling on production readiness. The depth of support is as critical as its breadth.

Evaluating Encryption in Use Claims
Can enterprises rely on a standard validation for encryption in use? Precisely because encryption in use is a new area, third-party validation is a critical requirement before it is implemented in production environments. Unfortunately, the current set of standard validation and certification tests has limited applicability.

The most frequently cited third-party validation by vendors in the space is FIPS 140-2 validation. As critical as 140-2 validation is as an evaluation benchmark, and as specifically required as it is under some federal procurement mandates, it has limitations when applied to encryption in use.

Taking a step back, it's important to note the scope of FIPS validation. The process essentially verifies that the algorithms are implemented according to defined specifications. However, it does not provide any validation of how the platform uses the cryptographic module to support encryption in use.

For instance, FIPS validation doesn't prescribe a set of best practices for how to use the cryptographic module. Instead, it verifies that whenever the system invokes AES encryption, the module performs AES encryption according to the standard specification. FIPS validation is limited to the cryptographic modules used, not the overall integrity of the platform or the encryption scheme used in production environments. While FIPS validation is an important consideration, enterprises should be aware of its limitations as the sole third-party validation for encryption. As a real-world analogy, validation would demonstrate that a $500 bicycle lock is impervious to any lock-picking attempt, but when the lock is used to chain a bike to a fire hydrant, it does nothing to stop a thief from simply lifting the bike over the hydrant and riding away.

Hopefully this has been useful in helping you to determine the right approach your organization can take to secure and maintain control of your data. I look forward to hearing any further points I might have missed.

More Stories By Elad Yoran

Elad is Chairman and CEO of cloud encryption company Vaultive. His nearly 20 years in the cyber security industry span experience as an executive, consultant, investor, investment banker and several-time successful entrepreneur. Elad's entrepreneurial experience includes Riptech, the pioneering provider of managed security services to governments and Fortune 500 corporations around the world, acquired by Symantec Corporation; Sentrigo, a leading provider of database security, recently acquired by McAfee; and MediaSentry, a provider of anti-piracy technology solutions to the motion picture, music and software industries, acquired by SafeNet. Elad has also served as Vice President, Global Business Development at Symantec and as Vice President at Broadview International (acquired by Jefferies), an investment bank focusing on mergers and acquisitions in the technology industry, where he led the firm's information security practice. Elad has been recognized as "Entrepreneur of the Year" by Ernst & Young.
