Weighing the Options for Onboarding Data into the Cloud

One of the questions we hear most frequently is “how do I get my data into the cloud?” For many organizations, the benefits of expanding on-premise data storage to include hybrid cloud storage have begun to resonate, but they struggle to get started as they determine how to move data into the cloud. The decision on how to onboard initial data to the cloud, or what we call the initial ingest, is one that cannot be overlooked.

While there is more than one way to perform the initial ingest, it shouldn’t be a surprise that the best solution varies from case to case. Relevant factors influencing the decision include the amount of data intended for ingestion, the amount of available bandwidth, and the timeframe in which you want to load the data. Most organizations will typically decide on one of the following three methods for the initial ingest:

  • Use existing bandwidth to perform the transfer over time
  • Increase or “burst” bandwidth for the duration of the transfer
  • Ship media directly to a cloud provider

Use existing bandwidth
Calculating how long it takes to upload a large amount of data across a WAN involves a bit of straightforward arithmetic. For instance, an uplink speed of 100 Mbit/sec should be able to push roughly 1 TB per day.
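
For illustration, that arithmetic can be sketched in a few lines of Python, assuming full, sustained utilization of the link and ignoring protocol overhead:

    # Back-of-the-envelope estimate: how much a 100 Mbit/sec uplink can push in a day
    uplink_mbit_per_sec = 100
    seconds_per_day = 24 * 60 * 60
    bytes_per_day = uplink_mbit_per_sec * 1_000_000 / 8 * seconds_per_day
    print(f"{bytes_per_day / 1e12:.2f} TB per day")  # prints ~1.08 TB per day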

While this approach sounds cut and dried, in practice, organizations need to consider a few additional factors (a rough sketch of the adjusted estimate follows the list):

  • Subtract typical WAN usage to more accurately calculate available bandwidth
  • Employ bandwidth throttling and scheduling to minimize impact on existing applications
  • Cache/buffer the data so users and applications can continue to access it during the ingest process – sometimes starting with a large buffer and shrinking it over time
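
The sketch below extends the earlier arithmetic to account for these factors. The dataset size, typical WAN usage, and nightly upload window are illustrative assumptions, not figures from any particular deployment:

    # Adjusted estimate: only part of the link is free, and uploads are
    # throttled to a nightly window to avoid impacting existing applications.
    dataset_tb = 50            # assumed size of the initial ingest
    link_mbit = 100            # raw uplink speed
    typical_usage_mbit = 40    # assumed average existing WAN traffic
    window_hours_per_day = 10  # assumed off-hours window for uploads

    available_mbit = link_mbit - typical_usage_mbit
    bytes_per_day = available_mbit * 1_000_000 / 8 * window_hours_per_day * 3600
    print(f"Estimated ingest time: {dataset_tb * 1e12 / bytes_per_day:.0f} days")

Numbers like these make it clear when bursting bandwidth or shipping media becomes the more practical choice.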

Temporarily increase bandwidth
For circumstances where existing bandwidth will not onboard data into the cloud in a timely manner, another option is to temporarily increase bandwidth during the upload process. Some telcos and internet providers offer bursting capability for short durations lasting weeks or months. Once the ingest completes, bandwidth can be restored to its previous level to accommodate the normal course of data accesses and updates.

An alternative to increasing bandwidth is using a temporary colocation or data center facility that has higher-bandwidth access to the cloud provider. This adds costs for transportation, equipment setup and leasing, but may offer a cost-effective compromise.

Physically ship media
Ultimately, if data cannot be onboarded in a timely manner via network (let’s say it’s a few PB in size), shipping physical media to a cloud provider is the next option. While this option may seem deceptively easy, it’s important not to ignore best practices when physically shipping media.

Whereas many organizations have adopted a “zero trust” model for their data already stored in the cloud (meaning all data is encrypted with a set of keys maintained locally), transporting data requires similar safeguards.
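
As a minimal sketch of that safeguard (client-side encryption with a key that never leaves the premises), the following uses the third-party Python cryptography package; it is an illustrative example rather than any particular vendor's implementation:

    # Encrypt data locally before it is written to shipped media ("zero trust"):
    # only ciphertext leaves the premises; the key stays with the organization.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # retain locally; never ship with the media
    aesgcm = AESGCM(key)

    def encrypt_for_shipping(path: str) -> bytes:
        """Return nonce + ciphertext for a file destined for shipped media."""
        nonce = os.urandom(12)                  # unique nonce per encryption
        with open(path, "rb") as f:
            plaintext = f.read()
        return nonce + aesgcm.encrypt(nonce, plaintext, None)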

This week, TwinStrata announced the latest release of CloudArray, which includes a secure import process that encrypts and encapsulates data into object format stored in the cloud prior to shipping the data. Following the same security practice used for storing data online in the cloud eliminates security compromises that could lead to data breaches.

The bottom line
While there are benefits to expanding on-premise storage infrastructure with a secure, hybrid cloud strategy, often the starting point involves answering the question of how to get the initial data there. Choosing the right option can satisfy the need for timeliness while mitigating risks around security and disruption.

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
