Industry Standards Drive Cloud Adoption

Applying this enabling technology to core business and mission objectives

With a focus on developing affordable solutions that drive innovation for our customers' missions, I believe that the development of cloud standards can have a positive impact on cloud adoption. The more than 370 members of the cross-domain Cloud Standards Customer Council (CSCC) are providing customer-focused business and mission requirements that help drive the adoption and usefulness of cloud standards, especially in the areas of security, interoperability and data portability. The public and private sectors are both making important contributions, and we need to continue the progress industry and government have made in the development of standards.

In thinking about the way standards positively impact cloud adoption, three main themes arise:

First, open standards are important not only because they help accelerate the adoption of cloud computing, but also because technical cloud standards enable more innovative business models and technical solutions. With open standards in place we can more easily create new services - such as harnessing the power of mobile and utilizing social media collaboration - and do so more quickly. It's like building a foundation from standard building blocks rather than unique fragments.

For that innovation to occur, industry offers a unique perspective in the development of these technical standards. The CSCC has developed industry use cases that highlight the gaps standards need to fill, and these use cases help shape the priorities of standards development organizations. They also accelerate the development of reference implementations (or cloud prototypes) that guide the development of cloud standards and are generally required for their approval.

Second, we're encouraged by an increased standards development focus on the three areas that matter most to cloud adoption - cloud security (which remains the top impediment), cloud interoperability (hybrid cloud models are driving interoperability requirements), and data portability (the need to move data easily from the enterprise to clouds and across clouds). Organizations that produce standards have examined these three priority areas and created significant momentum around them. Soon we'll have standards that help customers engage cloud providers with best practices on the three issues that matter most, with a beneficial effect on cloud adoption.

And third, there is the positive influence of government. Government is playing an important leadership role in the development of cloud standards and making significant advances. For example, the National Institute of Standards and Technology (NIST), the Department of Homeland Security and the General Services Administration have all been extremely focused on cloud standards, with special attention to security, cloud interoperability and data portability. In fact, NIST's definition of cloud computing and its cloud security baseline as expressed in the Federal Risk and Authorization Management Program (FedRAMP) are becoming de facto standards. The efforts of these government agencies have had a significant impact on cloud adoption and the development of open cloud standards.

As cloud computing based on open standards continues to gain ground and issues like security, interoperability and portability are addressed, we'll see greater ability to apply this enabling technology to core business and mission objectives.

More Stories By Melvin Greer

Melvin Greer is Senior Fellow and Chief Strategist, Cloud Computing, in the Lockheed Martin Chief Technology Office. With over 25 years of systems and software engineering experience, he is a recognized expert in Service-Oriented Architecture and Cloud Computing. He serves as a principal investigator in advanced research studies, advancing the body of knowledge in basic research and critical, highly advanced engineering and scientific disciplines. Mr. Greer is a Certified Enterprise Architect, Vice-chair of the Network Centric Operations Industry Consortium (NCOIC) Cloud Computing Working Group, and an Advisory Council member of the Cloud Security Alliance.

Greer received his BS in Computer Information Systems and Technology and his MS in Information Systems from American University, Washington, D.C. He also completed the Executive Leadership Program at Cornell University's Johnson Graduate School.
