The Federal Government Journey to Cloud Computing: Lessons Learned

When selecting the appropriate deployment model don’t reflexively pick private as the “obvious low risk choice”

In February 2011, Vivek Kundra announced the “Cloud First” policy across the US Government. The directive, issued through the Office of Management and Budget, required agencies to give cloud technology first priority when developing IT projects. He also described cloud computing as a “10-year journey.” According to a Deltek report, federal agency spending on cloud computing will grow from $2.3 billion in fiscal 2013 to $6.1 billion by fiscal 2018. That forecast makes it all the more important to understand what has happened over the past few years. In my opinion, the Top 5 most important lessons learned are:

  1. When selecting the appropriate deployment model (Public, Private, Hybrid or Community), don’t reflexively pick private as the “obvious low-risk choice.” A private cloud with no resource sharing doesn’t deliver the promised cost savings. Do the math, do the science and do the engineering. Develop a real business case, starting with functional requirements tied to the mission. If the numbers don’t make sense, don’t do it.
  2. Failing to modify business processes to take advantage of the parallel nature of cloud computing platforms will yield only minimal improvements in those processes. Government IT managers must accept that cloud computing means purchasing a service, not purchasing technology. This usually represents a fundamental change in how technology is acquired and managed.
  3. Treating your cloud transition as only an IT project is a big mistake. Business/Mission owners and Procurement officials must be intimately involved. According to an Accenture report sponsored by the Government Business Council, the challenges federal agencies have experienced in cloud development have restrained deployments to date, but alleviating these impediments should spur adoption. Agencies moving to cloud infrastructure need to develop standardized procurement requirement statements and SLAs that address both cybersecurity and operational issues in enough detail to minimize interpretation disputes.
  4. Cloud transitions have significant education and cultural challenges, so cloud transition strategies also require a robust change management plan. Change is hard, and change in government is harder, which makes a well-formed plan for communication and change management incredibly important. Implementing cloud-based best practices requires an immense and continuous effort to ensure that new practices are embraced.
  5. Federal agencies need to improve the maturity of their respective enterprise architectures; the lack of a mature architecture makes cloud transitions difficult. This will require focused agency leadership. GAO’s experience has shown that attempting to modernize and evolve IT environments without an architecture to guide and constrain investments results in operations and systems that are duplicative, not well integrated, costly to maintain, and ineffective in supporting mission goals.
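The “do the math” advice in lesson 1 can be made concrete with a back-of-the-envelope comparison. The sketch below, with purely illustrative figures (the $500k annual cost, server count, utilization rate and hourly rate are assumptions, not real pricing), shows why a lightly utilized private cloud can cost far more per *used* server-hour than a pay-per-use public cloud:

```python
# Illustrative business-case sketch: effective cost per utilized
# server-hour for a fixed-capacity private cloud vs. a pay-per-use
# public cloud. All inputs are hypothetical.

HOURS_PER_YEAR = 8760

def private_cost_per_used_hour(annual_fixed_cost, servers, utilization):
    """Fixed annual cost spread over only the hours actually consumed."""
    hours_available = servers * HOURS_PER_YEAR
    hours_used = hours_available * utilization
    return annual_fixed_cost / hours_used

def public_cost_per_used_hour(rate_per_hour):
    """Pay-per-use: idle time costs nothing, so the rate is the cost."""
    return rate_per_hour

if __name__ == "__main__":
    # Assumed: $500k/year to run 100 private servers at 20% utilization,
    # vs. a comparable public instance billed at $0.40/hour.
    private = private_cost_per_used_hour(500_000, 100, 0.20)
    public = public_cost_per_used_hour(0.40)
    print(f"private: ${private:.2f}/used hour, public: ${public:.2f}/used hour")
```

Under these assumed numbers the private option works out to several dollars per used hour against $0.40 for public; the point is not the specific figures but that the comparison must be run with your agency's real costs and utilization before picking a deployment model.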

The road ahead still has storm clouds and heavy rain, but all in all, we’ve made a good start.

[Republished from "On The FrontLines" magazine's "Cloud Computing in Government: Lessons Learned" issue. Download the full 20-page issue online at http://digital.onthefrontlines.net/i/319551]

Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2012)

More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission-critical solutions. He has served in various senior management positions, including VP & GM Cloud Services at NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes an MSEE (Computer Engineering), an MA in National Security & Strategic Studies and a BS in Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy, having earned specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
