Encryption in Use – Fact and Fiction
By Elad Yoran
May 19, 2014 10:00 AM EDT
Risk-conscious enterprises across the globe have been reluctant to embrace the public cloud model. For many, compliance requirements are the source of the reluctance. For others, the major hurdle is ceding control of their data to a cloud service provider without the provider accepting liability for customer data. Conforming to data residency regulations when implementing a distributed services model presents a further complication. Even as these challenges to adoption loom large, the economics and productivity benefits of cloud-based services remain compelling. For these organizations to make the transition to the cloud, a range of elements must be in place, including continuous monitoring of the cloud service provider's data center, enforcement of appropriate service level agreements, data classification and definition of internal processes to manage cloud-based services. Encryption in use is a critical piece of this puzzle, since it provides a mechanism for the enterprise to extend its boundary of control to data stored and processed within the cloud service provider's environment. However, not all encryption in use is created equal or equally secure, and a generic, one-size-fits-all approach is likely to fall short of balancing security and functionality.
The Case for Encryption in Use
For almost as long as the field of information security has been in existence, encryption of data at rest and encryption of data in transit have served as cornerstone technologies to prevent access to sensitive, proprietary, confidential or regulated data. Both forms of encryption operate through the exchange and presentation of keys, whether symmetric keys or a combination of public and private keys, that unlock the encrypted data. The great step forward for modern cryptography was the idea that the key used to encrypt your data can be made public, while the key used to decrypt it is kept private. The purpose of both forms is the same: to ensure that only users or systems with access to the right key can access the data.
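To make the public and private key idea concrete, here is a minimal sketch in Python using the third-party cryptography package (an illustrative choice on my part, not something prescribed by any standard discussed here). Anyone holding the published public key can encrypt, but only the holder of the private key can decrypt.

```python
# Minimal public/private key sketch using the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a key pair; the public half may be shared freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# OAEP padding is the standard choice for RSA encryption.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

ciphertext = public_key.encrypt(b"sensitive record", oaep)  # anyone can encrypt
plaintext = private_key.decrypt(ciphertext, oaep)           # only the key owner can decrypt
assert plaintext == b"sensitive record"
```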
Encryption in use provides functionality that is almost counter-intuitive to the purpose behind modern encryption for data at rest and data in transit: it works to ensure that the data remains in an encrypted state even as users interact with it, performing operations such as search or sort. However, just like encryption for other states of data, encryption in use serves a clear need. Without encryption in use, organizations cannot retain ownership and control of their data stored and processed in a cloud-based service – whether control is required to address security, compliance, data residency, privacy or governance needs. Encryption in use is similar to format preserving encryption in that it is applied in real time, but it allows for a far broader range of cloud service functionality and feature support.
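As an illustration of how operations can still run over encrypted data, here is a minimal sketch (Python, third-party cryptography package) of one well-known technique: storing randomized ciphertext alongside a keyed blind index, an HMAC of the value, so the server can answer equality searches without ever seeing plaintext. The record layout and field names are hypothetical, not any vendor's implementation.

```python
# Sketch of equality search over encrypted data via a keyed blind index.
import hashlib
import hmac
import os

from cryptography.fernet import Fernet

enc_key = Fernet.generate_key()   # encryption key: held by the gateway
index_key = os.urandom(32)        # separate key for the searchable index
f = Fernet(enc_key)

def blind_index(value: str) -> str:
    """Deterministic, keyed token: equal values give equal tokens."""
    return hmac.new(index_key, value.encode(), hashlib.sha256).hexdigest()

# What the cloud service stores: randomized ciphertext plus an opaque token.
cloud_rows = [
    {"ct": f.encrypt(name.encode()), "idx": blind_index(name)}
    for name in ["alice", "bob", "alice"]
]

# Equality search: the gateway sends only the token, never the plaintext.
token = blind_index("alice")
matches = [row for row in cloud_rows if row["idx"] == token]
print(len(matches))                 # 2
print(f.decrypt(matches[0]["ct"]))  # b'alice', decrypted back at the gateway
```

Note the tradeoff this sketch makes deliberately visible: the index token reveals which records are equal to each other, a form of leakage that becomes important later in this piece.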
Encryption in use enables enterprises to independently secure their data stored and processed at cloud service providers – while holding on to the encryption keys. The ongoing revelations of government surveillance, supported by laws compelling cloud service providers to hand over customer data, highlight the challenge end users face in meeting their obligation to retain direct control of their cloud data. The recent set of recommendations from the Review Group on Intelligence and Communications Technologies appointed by the White House, focused on stronger privacy protections, is only the first step in revisiting these policies.
Because encryption in use is an emerging area, the technology can be easily misunderstood, or even misrepresented. Typically, encryption in use entails the use of a gateway, or proxy, architecture. The user accesses the application via the gateway – whether the application server is in the cloud or on premise. The key to decrypt the data resides in the gateway (or in an integrated HSM), ensuring that data stored and processed at the server is persistently encrypted, even as the encryption is entirely transparent to the user. Were the user to access the server directly, bypassing the gateway, the data would simply appear as a string of encrypted gibberish. As long as the gateway remains under the data owner's control, only authorized users can gain access to the data stored and processed at the cloud service provider, or other third party.
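A minimal sketch of that gateway pattern, in Python with the third-party cryptography package, follows. The field names, classification policy and record shape are hypothetical placeholders, not a real product's API: the point is only that sensitive fields are encrypted on the way out and decrypted on the way back, with the key never leaving the gateway.

```python
# Sketch of the gateway/proxy pattern: sensitive fields are encrypted on the
# way out and decrypted on the way back. Field names are hypothetical.
from cryptography.fernet import Fernet

GATEWAY_KEY = Fernet.generate_key()    # stays on premise (or in an HSM)
fernet = Fernet(GATEWAY_KEY)
SENSITIVE_FIELDS = {"ssn", "salary"}   # illustrative classification policy

def outbound(record: dict) -> dict:
    """Encrypt sensitive fields before forwarding a record to the cloud."""
    return {
        k: fernet.encrypt(str(v).encode()).decode() if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

def inbound(record: dict) -> dict:
    """Decrypt sensitive fields in responses so the user sees plaintext."""
    return {
        k: fernet.decrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

stored_in_cloud = outbound({"name": "Ada", "ssn": "078-05-1120"})
print(stored_in_cloud["ssn"])    # an opaque token: the server sees only gibberish
print(inbound(stored_in_cloud))  # transparent round trip for the authorized user
```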
In the event that the cloud service provider is required to hand over customer data in response to a government subpoena, it must meet its legal obligation. However, if encryption in use has been implemented, the service provider can only hand over encrypted gibberish. The request for data must then be directed to the entity that holds the encryption keys. Likewise, a rogue administrator, a hacker or a government entity that gained access to the user account would only be able to view unintelligible gibberish.
Not Some Kind of Magic
In order to deliver on the promise of encryption in use, the gateway must meet a robust set of requirements: comprehensive service functionality and watertight security based on a strong encryption scheme. In practical terms, this means that the entirety of the service's functional elements and behavior must be mapped, and that the encryption scheme must preserve functionality without compromising security. This is because the gateway must recreate the session for the cloud-facing leg and transpose encrypted data into the flow without disrupting functionality like search, sort and index. Otherwise, the user experience is degraded, and the productivity value proposition of the cloud-based service is undermined.
Vendors face another set of choices: take shortcuts that cover just enough ground to provide a superficial sense of security, or invest in extensive R&D work to deliver the optimal balance between functionality and strong security. For instance, vendors can opt to provide encryption for just a few data fields, out of hundreds or even a few thousand, to encompass a specific subset of the enterprise's information. Equally, they can choose to implement a cloud data encryption scheme that preserves features relying on referential integrity, such as sort, search and index, but that is easily reversible by attackers.
By way of illustration, if the scheme involves deterministically encrypting words into very short AES blocks, the encoding pattern is consistent enough for common attacks to yield clear text from what might appear to be encrypted text. There are a variety of iterative attacks, such as chosen plaintext attacks, that will yield clear text if the encryption relies on a simplistic and consistent encoding pattern. So while the data may appear to be encrypted, and fewer engineering resources are required to support application features and functionality, the data protection in place is barely skin deep.
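To see how skin-deep such a scheme can be, the sketch below builds a deliberately naive deterministic encoder (Python, third-party cryptography package; a toy construction of my own, not any vendor's product). Because equal words always produce equal ciphertexts, word frequencies pass straight through the encryption, which is exactly the structure that frequency analysis and chosen plaintext attacks exploit.

```python
# Deliberately naive deterministic encryption: AES-ECB per word, no randomness.
# Equal plaintexts always produce equal ciphertexts, so patterns leak.
import os
from collections import Counter

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.padding import PKCS7

key = os.urandom(32)

def encrypt_word(word: str) -> bytes:
    padder = PKCS7(128).padder()  # pad to the AES block size
    padded = padder.update(word.encode()) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return encryptor.update(padded) + encryptor.finalize()

words = "the quick brown fox jumps over the lazy dog the fox".split()
counts = Counter(encrypt_word(w) for w in words)

# The plaintext's repetition structure survives encryption intact:
print(counts[encrypt_word("the")])  # 3, same count as the word "the" itself
print(counts[encrypt_word("fox")])  # 2
```

A scheme that randomizes each ciphertext closes this leak, but then equality search and sort no longer come for free, which is precisely the engineering tradeoff described above.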
Encryption in use is not a kind of magic – it requires dedicated engineering expertise, with collaboration between infrastructure, information security and encryption experts. And the encryption scheme must be tailored to a specific application or service to deliver the appropriate balance of security and functionality.
Another significant consideration is evaluating encryption in use in the context of a specific application or service. From the customer's perspective, it is appealing to use a single encryption platform for multiple applications. No customer wants to manage multiple appliances, management interfaces and vendors. The reality, however, is that striking an acceptable balance between security and functionality for any risk-conscious organization requires deep application knowledge and encryption-in-use expertise. Dig a little deeper into the degree of support, or risk gambling on production readiness: the depth of support is as critical as its breadth.
Evaluating Encryption in Use Claims
Can enterprises rely on a standard validation for encryption in use? Precisely because encryption in use is a new area, third-party validation is a critical requirement before it is implemented in production environments. Unfortunately, the current set of standard validation and certification tests has limited applicability.
The most frequently cited third-party validation by vendors in the space is FIPS 140-2 validation. As critical as FIPS 140-2 validation is as an evaluation benchmark (and it is specifically required under some federal procurement mandates), it has real limitations for encryption in use.
Taking a step back, it's important to note the scope of FIPS validation. The process essentially verifies that the algorithms are implemented according to defined specifications. However, it does not validate how the platform uses the cryptographic module to support encryption in use.
For instance, FIPS validation doesn't outline a set of best practices on how to use the cryptographic module. Instead, it verifies that whenever the system invokes AES encryption, the module performs AES encryption according to the standard specification. FIPS validation is limited to the cryptographic modules used, not the overall integrity of the platform or the encryption scheme used in production environments. While FIPS validation is an important consideration, enterprises should be aware of its limitations as the sole third-party validation for encryption. To use a real-world analogy, validation would demonstrate that a $500 bicycle lock is impervious to any lock-picking attempt; but when the lock is used to chain a bike to a fire hydrant, it does nothing to stop a thief from simply lifting the bike over the hydrant and carrying it away.
Hopefully this has been useful in helping you determine the right approach your organization can take to secure and maintain control of your data. I look forward to hearing any further points I might have missed.