
Cloud as Delivery Vehicle – The Next Wave of the Internet

Meet FireHost at Cloud Expo New York

"While cost has traditionally been a foundational benefit of the cloud, we believe that other factors can play a more important part in defining the cloud of tomorrow," said Todd Gleason, Vice President of Technology at FireHost, in this exclusive Q&A with Cloud Expo conference chairs Larry Carvalho and Vanessa Alvarez. "We believe that the generalist cloud provider - those low-cost commodity clouds that created price competition and insecure clouds - are going to either be gobbled up or beaten up as the industry evolves from this generalist mentality to focus on specialist clouds that have a particular focus."

Cloud Computing Journal: How are cloud standards playing a role in expanding adoption among users? Are standards helping new business models for service providers?

Todd Gleason: Standardization is important, but we're concerned because security desperately needs to be part of this effort, and today it is not addressed enough, if at all. Much of the open-standards work in cloud and other IT technologies has traditionally not been security-conscious, making adoption riskier than many want to admit or truly understand. So while standards are a good thing, we believe security should be baked in at their inception. We advocate for innovation, but the innovative spirit must not be blind to the need to protect vendors and customers.
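
To make "baked in at inception" concrete, here is a minimal, hypothetical sketch (not drawn from FireHost or any specific standard): a service descriptor whose schema treats encryption and transport security as required fields with no permissive defaults, so an insecure configuration fails validation instead of slipping through.

```python
from dataclasses import dataclass


class InsecureDescriptorError(ValueError):
    """Raised when a service descriptor fails the security baseline."""


@dataclass(frozen=True)
class ServiceDescriptor:
    # Hypothetical interchange format: security settings are required
    # fields, not optional extras bolted on after the fact.
    name: str
    image: str
    tls_min_version: str      # e.g. "1.2"
    encrypt_at_rest: bool
    exposed_ports: tuple

    def validate(self) -> None:
        if self.tls_min_version not in ("1.2", "1.3"):
            raise InsecureDescriptorError(
                f"{self.name}: TLS {self.tls_min_version} is below the baseline")
        if not self.encrypt_at_rest:
            raise InsecureDescriptorError(
                f"{self.name}: encryption at rest is not optional")
        if 21 in self.exposed_ports or 23 in self.exposed_ports:
            raise InsecureDescriptorError(
                f"{self.name}: plaintext legacy ports may not be exposed")


# A descriptor that omits or weakens these settings never reaches deployment.
svc = ServiceDescriptor(name="billing", image="registry.example/billing:1.0",
                        tls_min_version="1.2", encrypt_at_rest=True,
                        exposed_ports=(443,))
svc.validate()
```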

Cloud Computing Journal: How are hybrid clouds evolving to allow the coexistence of private and public clouds? What are the challenges to achieving a true hybrid cloud scenario?

Gleason: Too much attention and marketing real estate have been spent debating the merits of public, private and hybrid clouds. Definitions of what's public, what's private and what's hybrid are wide-reaching, and they are often created to suit a vendor's point of view rather than to give the industry an educational, objective delineation. From where we sit, use cases and security are the keys to disarming this debate. The conversation needs to shift from which model is best in general to which one is appropriate for an individual vendor, provider or customer, and it must address and incorporate security, because regardless of the model chosen, a cloud is insecure unless it's built secure. Use cases vary by industry and by company, but security is a common denominator that applies to all of them. This is where the industry discussion needs to shift.

Cloud Computing Journal: Are on-premises software vendors successfully migrating their business to a SaaS model? What are the challenges faced in this journey?

Gleason: Most are, and the bigger questions are how effective they will be at delivering from the cloud and whether that delivery will be secure. Will they expand their business model and provide infrastructure, or will they stay in their niche as software-only players? And for those who do not migrate, will they become more inefficient and outdated?

Cloud as a delivery vehicle is the next wave of the Internet, especially as automation, software techniques and infrastructures increasingly adapt to applications and to servicing their performance. The efficiencies available today are miles ahead of an insulated, on-premises software solution - a situation that gets much sweeter when security is architected into the infrastructure. However, we are not even close to being a truly application-centric world.

Truthfully, though, we need to focus on what the customer wants. Every model will be viable for some customers, so some software vendors may become irrelevant because the cloud provides a better way to service the customer's needs. The cloud industry can be too hung up on terms such as hybrid, public, SaaS or PaaS. Instead, we have to focus on customers' needs and on the applications that run their businesses, give them operational competitive advantages and provide a truly seamless experience.

Cloud Computing Journal: What are the challenges for end users to adopt a new model for application development using Platform as a Service? Are vendors doing enough to meet their needs?

Gleason: Platforms are important for reducing time-to-market for SaaS or internal use cases, and it's good to see the industry shifting from treating infrastructure as the center of the universe to asking how it can be architected to service applications in a more on-demand, dynamic, intelligent manner. But, again, where is security in the discussion? It's nowhere to be found in the majority of industry conversations and vendor communications. It badly needs to be.

In addition to security, there are two other big to-dos that need to be addressed: orchestration and inter-cloud technology. To make applications work well across infrastructures, those infrastructures must work well together in an inter-cloud fashion. This brings us back to security. Businesses will have many clouds, as will their customers. The weakest link is the most insecure cloud, so change must happen to bring security into the larger, strategic conversations rather than focusing so much on architecting an infrastructure or specific software techniques.
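
To make the "weakest link" point concrete, here is a hedged sketch with hypothetical names throughout (not FireHost's or any vendor's API): an inter-cloud placement routine that checks every candidate cloud against a common security baseline and refuses to place a workload if even one target falls short, rather than silently degrading to the least secure member.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class CloudTarget:
    # Hypothetical capability summary reported by each provider or region.
    name: str
    encrypts_at_rest: bool
    enforces_mtls: bool
    audited_compliance: bool


BASELINE = ("encrypts_at_rest", "enforces_mtls", "audited_compliance")


def weakest_links(targets: List[CloudTarget]) -> List[str]:
    """Return the names of clouds that fail any baseline control."""
    return [t.name for t in targets
            if not all(getattr(t, control) for control in BASELINE)]


def place_workload(workload: str, targets: List[CloudTarget]) -> None:
    failing = weakest_links(targets)
    if failing:
        # The deployment is only as secure as its least secure member,
        # so refuse placement instead of degrading silently.
        raise RuntimeError(
            f"cannot place {workload}: baseline not met by {', '.join(failing)}")
    print(f"{workload} placed across {[t.name for t in targets]}")


place_workload("orders-api", [
    CloudTarget("private-dc", True, True, True),
    CloudTarget("public-east", True, True, True),
])
```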

Cloud Computing Journal: With several vendors lowering costs for infrastructure, is there a way for new cloud service providers entering this space to make money?

Gleason: While cost has traditionally been a foundational benefit of the cloud, we believe that other factors can play a more important part in defining the cloud of tomorrow. We believe that generalist cloud providers - those low-cost commodity clouds that created price competition and insecure clouds - are going to be either gobbled up or beaten up as the industry evolves from this generalist mentality toward specialist clouds, each with a particular focus. For FireHost, that focus is security. We evolved out of the need for security and have created an infrastructure that provides an advanced security architecture, top security personnel and compliance expertise for customers that are security- and compliance-driven.

•   •   •

Todd Gleason, Vice President of Technology at FireHost, is a central figure in continuously innovating and architecting the secure cloud infrastructure for FireHost and its customers. He brings 15 years of global IT and R&D experience, including deep knowledge of security, cloud, networking, compute, virtualization, storage, and application delivery technologies.

Gleason is driven by the notion that customers ultimately care about having their applications perform efficiently in a secure environment. With this in mind, Gleason has displayed a knack for understanding the latest technology trends in the industry, synthesizing what is applicable for FireHost's customers, and incorporating relevant technology into the company's cloud infrastructure to ensure businesses' applications and data are protected in a cutting-edge, secure cloud. Gleason has helped redefine industry expectations about performance, security, and compliance in the cloud. His creative approach to architecting FireHost's secure cloud shows that security can be incorporated without compromising infrastructure performance, improving customers' risk management while maintaining efficient use of infrastructure resources.

Prior to his role with FireHost, Gleason worked as director of information technology at Panini America, formerly Donruss Trading Cards, which is now a subsidiary of the global Panini conglomerate. There he implemented a high-performance infrastructure and unique business applications enabling rapid high-quality production workflows and integration into the Panini global business. Gleason holds a degree in computer information systems from Remington College.
