|By Gerry Grealish||
|January 10, 2014 10:45 AM EST||
Many news organizations, including The Washington Post, are reporting that the latest documents leaked by former NSA contractor turned whistleblower Edward Snowden show the NSA is in the early stages of building a quantum computer that could potentially crack most types of encryption. The NSA does disclose on its website that it is working on quantum computing technology; however, the status of that research was previously unknown. According to the new documents, the agency is developing a "cryptologically useful quantum computer" as part of a research program called "Penetrating Hard Targets," with the goal of using it to break encryption.
With headlines that scream, "NSA Secretly Funding Code-Breaking Quantum Computer Research," it's easy to see why many executives and enterprises are anxious and perhaps starting to lose faith in internet communications and transactions. Encryption is used to protect medical, banking, business and government records around the world. But, as many of the articles in the media point out, the reality is that quantum computing remains a theoretical research topic and is many years away from being a usable real-world technology. The Washington Post quotes Scott Aaronson, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology: "It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it."
As cryptography expert Bruce Schneier said in a piece in USA Today, "I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts."
Mr. Schneier's comments reaffirm the importance of never locking in, irrevocably, to a single encryption algorithm or technique. If an organization loses faith in the integrity of a specific encryption algorithm that has become core to many or all of the systems it runs, it would be in a very difficult position. The systems used to protect information, like cloud encryption gateways, need to be flexible enough to do their job regardless of which encryption algorithms are in use. This design approach gives organizations the flexibility to swap algorithms in and out over time, based on their preference, without impacting the core capabilities of the solutions built on these encryption modules.
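To make the idea concrete, here is a minimal sketch (in Python, with hypothetical names) of what such an algorithm-agnostic design looks like: the gateway depends only on an abstract cipher interface, so a distrusted algorithm can be swapped out without touching the rest of the system. The `XorCipher` below is a deliberately toy placeholder, not a secure algorithm.

```python
from abc import ABC, abstractmethod

class Cipher(ABC):
    """Interface that every pluggable encryption module must implement."""
    @abstractmethod
    def encrypt(self, plaintext: bytes) -> bytes: ...
    @abstractmethod
    def decrypt(self, ciphertext: bytes) -> bytes: ...

class XorCipher(Cipher):
    """Toy placeholder algorithm -- NOT secure, for illustration only."""
    def __init__(self, key: bytes):
        self._key = key
    def encrypt(self, plaintext: bytes) -> bytes:
        return bytes(b ^ self._key[i % len(self._key)]
                     for i, b in enumerate(plaintext))
    def decrypt(self, ciphertext: bytes) -> bytes:
        return self.encrypt(ciphertext)  # XOR is its own inverse

class Gateway:
    """A gateway that works with whichever Cipher is plugged in."""
    def __init__(self, cipher: Cipher):
        self.cipher = cipher
    def swap_algorithm(self, new_cipher: Cipher) -> None:
        # Replacing the algorithm leaves every other capability untouched.
        self.cipher = new_cipher
    def protect(self, record: bytes) -> bytes:
        return self.cipher.encrypt(record)

gw = Gateway(XorCipher(b"demo-key"))
protected = gw.protect(b"patient-id-1234")
```

In a real deployment the `Cipher` implementations would wrap validated cryptographic modules; the point of the sketch is only that nothing outside the interface needs to change when one is replaced.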
Even though quantum computers are years away, today's news is an important reminder: if you are using encryption to protect sensitive data, make sure you are using the strongest, most well-vetted techniques available. The National Institute of Standards and Technology (NIST) publishes the Federal Information Processing Standards (FIPS) for use across the United States federal government. FIPS 140-2 is a security standard, backed by an accreditation program, for validating that the cryptographic modules produced by private-sector companies meet well-defined security requirements. When protecting sensitive and private information, organizations should look for strong, industry-acknowledged encryption approaches that meet accredited standards such as FIPS 140-2 and have well-documented, third-party peer-reviewed security proofs.
Enterprises should also strongly consider the inherent strength of an alternative data protection technique known as tokenization, which has no keys to crack. Tokenization is a process by which a sensitive data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While there are various approaches to creating tokens, they are typically random values that have no mathematical relationship to the original data field. The inherent security of tokenization is that it is practically impossible to determine the original value of the sensitive data field from the surrogate token alone. If a criminal gained access to the token (in a cloud environment, for example), there is no "quantum computer" that could ever decipher it back into its original form.
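The process described above can be sketched in a few lines (an illustrative toy, with hypothetical names; a production token vault would live in a hardened, access-controlled datastore). Because the token is drawn at random, it carries no information about the original value; only a lookup in the vault can redeem it.

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: tokens are random surrogates with
    no mathematical relationship to the original data."""
    def __init__(self):
        self._token_to_value = {}  # the "vault"; secured in practice
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was seen before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # randomly generated surrogate
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can redeem a token for its original value.
        return self._token_to_value[token]

vault = TokenVault()
pan = "4111111111111111"
token = vault.tokenize(pan)
```

There is nothing for an attacker to "crack" in `token` itself: without access to the vault, every possible original value is equally consistent with it.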
For more information on encryption, tokenization and retaining control over sensitive data in the cloud, please visit our resource center.
PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information, please visit our website or follow @perspecsys on Twitter.