Weighing the Options for Onboarding Data into the Cloud

One of the questions we hear most frequently is "how do I get my data into the cloud?" For many organizations, the benefits of expanding on-premise data storage to include hybrid cloud storage have begun to resonate, but they struggle to get started as they determine how to move data into the cloud. The decision on how to onboard initial data to the cloud, or what we call the initial ingest, is one that cannot be overlooked.


While there is more than one way to perform the initial ingest, it shouldn't be a surprise that the best solution varies from case to case. Relevant factors influencing the decision include the amount of data to be ingested, the amount of available bandwidth, and the timeframe in which you want the data loaded. Typically, most organizations will decide on one of the following three methods for the initial ingest:

  • Use existing bandwidth to perform the transfer over time
  • Increase or “burst” bandwidth for the duration of the transfer
  • Ship media directly to a cloud provider

Use existing bandwidth
Calculating how long it takes to upload a large amount of data across a WAN involves a bit of straightforward arithmetic. For instance, a dedicated uplink of 100Mbit/sec can push just over 1TB per day.

While this approach sounds cut and dried, in practice, organizations need to consider a few additional factors:

  • Subtract typical WAN usage to more accurately calculate available bandwidth
  • Employ bandwidth throttling and scheduling to minimize impact on existing applications
  • Cache/buffer the data so users and applications can continue to access it during the ingest process – sometimes starting with a large buffer and shrinking it over time
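The arithmetic above can be wrapped in a small helper. This is an illustrative sketch (the function name and example figures are ours, not from any particular tool) that converts link speed to throughput per day and subtracts typical background WAN usage:

```python
def ingest_days(data_tb, link_mbps, typical_usage_mbps=0.0):
    """Estimate days needed to upload `data_tb` terabytes (decimal TB).

    Subtracts typical background WAN traffic from the link speed to get
    the bandwidth actually available for the ingest.
    """
    available_mbps = link_mbps - typical_usage_mbps
    if available_mbps <= 0:
        raise ValueError("no bandwidth left over for the ingest")
    # Mbit/s -> TB/day: 86,400 seconds/day, 8 bits/byte, 1e6 MB/TB
    tb_per_day = available_mbps * 86_400 / 8 / 1_000_000
    return data_tb / tb_per_day

# A dedicated 100 Mbit/s link moves ~1.08 TB/day, so 10 TB takes ~9.3 days;
# reserve 40 Mbit/s for existing applications and it stretches to ~15.4 days.
print(ingest_days(10, 100))       # ≈ 9.26
print(ingest_days(10, 100, 40))   # ≈ 15.43
```

Running numbers like these up front makes it obvious whether the existing link is viable or whether one of the next two options is needed.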

Temporarily increase bandwidth
For circumstances where existing bandwidth will not onboard data into the cloud in a timely manner, another option is to temporarily increase bandwidth during the upload process. Some telcos and internet providers offer bursting capability for short durations lasting weeks or months. Once the ingest completes, bandwidth can be restored to its previous level to accommodate the normal course of data accesses and updates.

An alternative to increasing bandwidth is using a temporary colocation or data center facility that has higher-bandwidth access to the cloud provider. This adds the additional costs of transportation, equipment setup and leasing but may offer a cost-effective compromise.
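A quick way to size a burst is to work backwards from the deadline. The sketch below (our own illustration; the 50TB and two-week figures are hypothetical) shows both directions of the calculation:

```python
def transfer_days(data_tb, mbps):
    """Days to move `data_tb` decimal terabytes at a sustained `mbps` rate."""
    return data_tb * 8_000_000 / mbps / 86_400

def required_mbps(data_tb, deadline_days):
    """Sustained rate (Mbit/s) needed to finish the ingest by the deadline."""
    return data_tb * 8_000_000 / (deadline_days * 86_400)

# 50 TB over an existing 100 Mbit/s link takes ~46 days; bursting to
# 1 Gbit/s for the duration of the ingest cuts that to under 5 days.
print(transfer_days(50, 100))    # ≈ 46.3
print(transfer_days(50, 1000))   # ≈ 4.63
print(required_mbps(50, 14))     # ≈ 331 Mbit/s to finish in two weeks
```

The `required_mbps` figure is what you would take to a provider when asking whether a burst (or a temporary colocation link) of that size is available and what it costs.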

Physically ship media
Ultimately, if data cannot be onboarded in a timely manner via the network (let's say it's a few PB in size), shipping physical media to a cloud provider is the next option. While this option may seem deceptively easy, it's important not to ignore best practices when physically shipping media.
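The same arithmetic shows why shipping wins at petabyte scale. A rough break-even sketch (the one-week door-to-door turnaround is an assumption covering copy, transit, and load time, not a quoted figure):

```python
def network_days(data_tb, mbps):
    """Days to move `data_tb` decimal terabytes over a `mbps` link."""
    return data_tb * 8_000_000 / mbps / 86_400

def faster_to_ship(data_tb, mbps, shipping_turnaround_days=7):
    """True when physically shipping media (assumed ~1 week door to door,
    including copy and load time) beats the network transfer."""
    return shipping_turnaround_days < network_days(data_tb, mbps)

# 1 PB (1,000 TB) over 100 Mbit/s would take ~926 days -- ship it.
print(network_days(1000, 100))     # ≈ 925.9
print(faster_to_ship(1000, 100))   # True
print(faster_to_ship(2, 1000))     # False: ~0.19 days over 1 Gbit/s
```

At a few PB, no realistic burst closes that gap, which is why the security of the shipped media becomes the real question.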

Whereas many organizations have adopted a “zero trust” model for their data already stored in the cloud (meaning all data is encrypted with a set of keys maintained locally), transporting data requires similar safeguards.

This week, TwinStrata announced the latest release of CloudArray, which includes a secure import process that encrypts and encapsulates data into the same object format used for cloud storage before the media ships. Following the same security practice used for storing data online in the cloud eliminates security compromises that could lead to data breaches in transit.

The bottom line
While there are benefits to expanding on-premise storage infrastructure with a secure, hybrid cloud strategy, the starting point often involves answering the question of how to get the initial data there. Choosing the right option can satisfy the need for timeliness while mitigating risks around security and disruption.

The post Weighing the options for onboarding data into the cloud appeared first on TwinStrata.


More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
