

Cloud Computing Public or Private? How to Choose Cloud Storage

Understanding cloud computing, your data and usage

Factors That Will Impact the Public Versus Private Decision
Initial Investment
There is often an assumption that private clouds require a million-dollar capital outlay and racks and racks of equipment. In reality, a private cloud can be built for under $5,000, and deployment is simple: users can download the software and have a cloud running in under an hour.

Public clouds can start as low as $100. Since there is no hardware or software to purchase, the initial investment is trivial assuming your applications can speak the required protocols.

Amount of Data
Cloud storage is known for its massive scalability but most companies start out small. Private clouds can start in the few TB range and provide easy scale out of capacity by adding additional nodes or disks.

Public clouds start even smaller. For example, a public cloud makes it easy to back up a single laptop or deploy an application starting at a few GBs. As you grow you can lease more capacity and the cost scales linearly.
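The linear scaling above is easy to see in a back-of-the-envelope cost model. The per-GB rate below is hypothetical, not any provider's actual pricing:

```python
def monthly_storage_cost(gb_stored, rate_per_gb=0.15):
    """Public cloud storage bills scale linearly with the capacity used."""
    return gb_stored * rate_per_gb

# A single laptop backup vs. a growing application data set
# (hypothetical $0.15/GB-month rate).
print(monthly_storage_cost(5))    # 0.75
print(monthly_storage_cost(500))  # 75.0
```

Doubling the data simply doubles the bill; there is no step-function hardware purchase to plan for.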

Longevity of Data
How long you plan on keeping data in the cloud can greatly impact your selection. As data ages within the public cloud, the cost continues to rise. If you are publishing frequently changing or short-lived content, such as movie trailers or daily newscasts, the flexibility of a public cloud is a good solution.

Private clouds are licensed like enterprise software. Longevity of data does not increase the cost of the solution, which bodes well for archive or content-repository applications.
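The trade-off between recurring public cloud charges and a one-time private cloud outlay can be sketched as a break-even calculation. All figures here are hypothetical, chosen only to illustrate the shape of the curves:

```python
def cumulative_public_cost(gb, months, rate_per_gb_month=0.15):
    # Public cloud: you pay for the same data again every month it is retained.
    return gb * rate_per_gb_month * months

def cumulative_private_cost(upfront, months, monthly_ops=100.0):
    # Private cloud: one-time license/hardware outlay plus a flat operating
    # cost; holding the data longer does not raise the storage bill itself.
    return upfront + monthly_ops * months

# Find the month when a long-lived 5 TB archive becomes cheaper in-house
# (hypothetical $5,000 upfront cost and $100/month operations).
gb = 5000
for month in range(1, 121):
    if cumulative_private_cost(5000, month) < cumulative_public_cost(gb, month):
        print(f"break-even at month {month}")  # break-even at month 8
        break
```

Short-lived content never reaches the break-even point, which is exactly why the public cloud suits movie trailers and daily newscasts while archives favor private deployment.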

Required Performance
Private clouds are deployed inside the firewall and accessed over the Ethernet LAN at wire speed. Read access in the 100 MB/s range per node is not uncommon. Adding nodes provides additional performance to the cloud. Files can be replicated to many nodes, each of which can serve requests independently.

Public clouds are accessed over the Internet and are limited by your and the provider's bandwidth. This usually caps out around 10 MB/s. To scale performance you can open additional 10 MB/s connections, but doing so increases the bandwidth charges.
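The aggregate-throughput difference is simple multiplication, sketched below with the per-node and per-link figures quoted above (which are typical ranges, not guarantees):

```python
def private_cloud_throughput(nodes, per_node_mb_s=100):
    # LAN access: each node holding a replica can serve reads independently
    # at wire speed, so aggregate throughput grows with the node count.
    return nodes * per_node_mb_s

def public_cloud_throughput(connections, per_link_mb_s=10):
    # Internet access: parallel connections add throughput, but each one
    # also adds to the bandwidth bill.
    return connections * per_link_mb_s

print(private_cloud_throughput(4))  # 400 (MB/s aggregate on the LAN)
print(public_cloud_throughput(4))   # 40 (MB/s over the Internet)
```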

Access Patterns and Locations
Public cloud offerings typically include replication of data to multiple geographically dispersed locations, sometimes for an extra fee. If your users are global and can benefit from locality of data, a public cloud can sometimes substitute for a content distribution network.

Private clouds are typically deployed in a single location for LAN-based access. Remote users will need to connect over the WAN and work with Internet-type latencies. Larger private cloud deployments can include multiple locations and start to approach the public cloud distribution, albeit at a higher initial investment.

Security and Data Isolation
There are many published opinions and dedicated websites covering the security of public cloud offerings. The bottom line: it comes down to control of your data. Public clouds are just that, public. Isolation of data is only as strong as the virtualization technologies used to build the cloud and the provider's firewall. If you are concerned about the data being outside of your company, it should not be in a public cloud.

Private clouds are owned, deployed and managed by internal employees. Data is isolated based on your requirements and security is based on internal processes.

Confidentiality and Destruction of Data
Similar to security, confidentiality of data is a factor to consider when choosing a cloud storage solution. Legal obligations follow control of the data: if the service provider is subpoenaed for your data, which it controls, it must comply regardless of your knowledge or objections.

With private clouds you maintain control and have input, or at least knowledge of, legal activities. When it comes time to destroy or delete the data, it is in your power and can be confirmed by your own team.

Service Level Agreements
Private clouds have different mechanisms for data availability and service of access. Most keep multiple copies of files on multiple nodes and treat each node as a failure domain. Individual server failures do not bring down the cloud or cause data loss, so most SLAs are satisfied. Be sure to have a complete understanding of the architecture and its capabilities when selecting and deploying a private cloud.
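The failure-domain idea can be illustrated with a toy replica-placement map. File and node names here are hypothetical; real clouds manage placement automatically:

```python
# Each node is a failure domain; a file replicated to two or more nodes
# survives any single node failure.
placement = {
    "report.pdf":  {"node1", "node2"},
    "video.mov":   {"node2", "node3"},
    "archive.tar": {"node1", "node3"},
}

def available_after_failure(failed_node):
    """Files still readable from at least one surviving replica."""
    return {f for f, nodes in placement.items() if nodes - {failed_node}}

# Losing node2 leaves every file reachable on a surviving node.
print(sorted(available_after_failure("node2")))
```

With two replicas, any single node can fail without data loss; raising the replica count widens the failure domains the cloud can tolerate.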

Public cloud SLAs are published by the provider and are their responsibility. Remediation is typically a cash payment and while they will do their best to recover data, there is no guarantee of data availability. SLAs can also be impacted by Internet connectivity. If your link goes down, you cannot access your data and there is no remediation (unless your network provider has guaranteed uptime).

In-House Technical Resources
Public clouds remove the need for server and storage administrators. Data center space and its associated costs go away, enabling businesses to focus on core competencies, although not all technical resource requirements disappear. Many public offerings use newer protocols such as WebDAV or REST for access. If your applications do not support these protocols, adopting them will require changes in process, code, and/or technical staff.
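To make the protocol gap concrete, here is the rough shape of a REST object write. The host and bucket names are hypothetical, and real providers layer authentication headers on top of this, so the sketch only constructs the request rather than sending it:

```python
def build_rest_put(host, bucket, key, body: bytes):
    """Construct an HTTP PUT for a REST object store (hypothetical endpoint
    naming; real providers also require authentication headers)."""
    url = f"https://{host}/{bucket}/{key}"
    headers = {
        "Content-Length": str(len(body)),
        "Content-Type": "application/octet-stream",
    }
    return "PUT", url, headers

method, url, headers = build_rest_put(
    "storage.example.com", "backups", "laptop.img", b"...")
print(method, url)  # PUT https://storage.example.com/backups/laptop.img
```

An application written against a local filesystem has none of this plumbing, which is why moving it to a REST-only public cloud means code changes or a gateway.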

Deployed inside the firewall, private clouds do require system administrators, but at a capacity-per-administrator ratio that dwarfs traditional NAS deployments: a single system administrator can easily manage a 100-node cloud with part-time effort. Private clouds leverage widely deployed standards, such as NFS or CIFS, removing the need for custom application development.
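Because a private cloud is exposed as a standard NFS or CIFS mount, applications use ordinary file I/O with no new protocol. In the sketch below a temporary directory stands in for the mount point (e.g. /mnt/cloud), since an actual mount depends on your environment:

```python
import os
import tempfile

# A private cloud mounted over NFS/CIFS looks like any other directory, so
# existing applications need no code changes. A temp dir stands in for the
# mount point here.
mount_point = tempfile.mkdtemp()

path = os.path.join(mount_point, "report.pdf")
with open(path, "wb") as f:       # plain POSIX file I/O, no REST plumbing
    f.write(b"quarterly results")

with open(path, "rb") as f:
    data = f.read()
print(len(data), "bytes stored")  # 17 bytes stored
```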

How Do I Choose?
Choosing the appropriate solution is not an exercise in rocket science and comes down to a basic understanding of your data and usage. Apply the above criteria to your use case and the answer will emerge.

More Stories By Mike Maxey

Mike Maxey is director of product management for ParaScale, a Silicon Valley startup focused on addressing the exploding bulk storage requirements for digital content and archival data.
