The Virtual Public-Private Cloud Connection

Secure, optimized tunnels to a remote site, e.g., the cloud. Haven’t we been here before?

In the continuing discussion around Business Intelligence in the cloud comes a more better (yes, I did in fact say that) discussion of the reasons why you'd want to put BI in the cloud and, appropriately, some of the challenges. As previously mentioned, BI data sets are, as a rule, huge. Big. Bigger than big. Ginormous, even. One of the considerations, then, if you're going to leverage a cloud-based business intelligence offering – or any offering that requires very, very large data sets and files – is how the heck you're going to transfer all that data to the cloud in a timely fashion.

The answer is, apparently, to ship a disk to the provider. Seriously. I couldn’t make this stuff up if I tried.
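To see why a provider would say that with a straight face, run the arithmetic. The figures below are purely illustrative (no particular provider's link speeds or data sizes are implied), but the shape of the result is not:

```python
# Back-of-envelope: how long does it take to push a large BI data set
# to a cloud provider over a typical WAN link? All numbers here are
# illustrative assumptions, not measurements from any real provider.

def transfer_days(data_tb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to move `data_tb` terabytes over a `link_mbps` link
    running at the given effective efficiency (protocol overhead, contention)."""
    bits = data_tb * 1e12 * 8                      # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400                          # seconds -> days

for tb, mbps in [(1, 100), (10, 100), (10, 1000)]:
    print(f"{tb:>3} TB over {mbps:>4} Mbps: {transfer_days(tb, mbps):.1f} days")
```

Ten terabytes over a 100 Mbps link is on the order of a week and a half of sustained transfer; next to that, an overnight courier carrying a disk looks like a high-bandwidth (if high-latency) network link.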

Ann All, in “Pros and Cons of Business Intelligence in the Cloud” paraphrases a post on the subject by “two guys from Persistent Software, Mukund Deshpande and Shreekanth Joshi.”

Moving large data sets to the cloud could get costly. They recommend shipping disks, an approach they say is often recommended by cloud providers like Amazon.

The original authors cite network costs as the reason to choose that never-gets-old "sneakernet" option.

Moving data to the cloud – Large data sets in silos sitting on premises need to get to the cloud before they can be completely used. This is an expensive proposition due to the network costs. The cheapest option is to ship disks, and this is often recommended by cloud providers like Amazon. Although this introduces latencies, it is often acceptable for most BI options.

It's not just latency that's the problem – though latency measured in days is certainly not a good thing – you're also talking about shipping off disks and taking the risk that they will get "lost" en route, damaged, or misplaced at some point.

Think I'm kidding about that risk?
WHY ARE WE REINVENTING THE WHEEL?

Actually, the basic problem is our perception of "the cloud" as an external, completely separate entity – a perception cloud providers share. The cloud is "over here," your data is "over there," and ne'er the twain shall meet. That makes it costly and time-consuming to transfer extremely large files to the cloud. Hence the suggestion, by cloud providers and bloggers alike, to "ship some disks" instead.

But as Christofer Hoff has pointed out in the past, why can't there be "private, external clouds" or, at a minimum, "private, secure access" to external clouds that, as a benefit of such a point-to-point connection, also employs WAN optimization and application acceleration techniques to improve the performance of large data transfers across the Internet?

Haven’t we been here before? Haven’t we already addressed this exact scenario when organizations realized that using the Internet to connect with partners and remote offices was less expensive than dedicated lines? Didn’t we go through the whole PKI / SSL VPN / WAN optimization controller thing and solve this particular problem already?

Seems we did, and there’s no reason that we can’t apply what we’ve already learned and figured out to this problem. After all, if you treat the cloud as the “headquarters” and its customers as “remote offices” you have a scenario that is not unlike those dealt with by organizations across the world every day.

In fact, Amazon just announced its version of a cloud VPN, and Google's SDC (Secure Data Connector) has been around for quite some time, providing essentially the same kind of secure tunnel and access-to-the-cloud functionality.


BUT IT'S NOT THE SAME

The scenario is almost identical, and yet in many ways it isn't. First, organizations have control – physical control – over remote offices. They can determine what products, solutions, and services will be implemented and where, and they can deploy them as they see fit. This is not true with the cloud, both from the cloud provider's perspective and from the customer's perspective.

After all, the solution to the pain point of hefty data transfer from organization to cloud is a combination of WAN optimization and secure remote access, but what we don't want is the traditional always-on, point-to-point secure tunnel. A cloud provider has (or will have, hopefully) more customers than it can possibly support with such a model. And the use of such secure tunnels is certainly sporadic; there's no need for an "always on" connection between organizations and their cloud provider.

What’s needed is a dynamic, secure, optimized tunnel that can be created and torn down on an on-demand basis. The cloud provider needs to ensure that only those organizations who are authorized are allowed to create such a tunnel, and it needs to be deployed on a platform that is able to be integrated into the provisioning process such that the management of such external connectivity and access doesn’t end up consuming human operational cycles.
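That provisioning logic can be sketched in a few lines. Everything here is hypothetical – the `TunnelBroker` and `Tunnel` names are invented for illustration, and no real provider's API is implied – but it shows the two properties that matter: only authorized customers get a tunnel, and tunnels are created and torn down on demand rather than left always-on:

```python
# Hypothetical sketch of provider-side, on-demand tunnel provisioning.
# TunnelBroker/Tunnel are invented names; no real provider API is implied.
import secrets
from dataclasses import dataclass

@dataclass
class Tunnel:
    customer_id: str
    token: str          # opaque handle for this short-lived session

class TunnelBroker:
    """Authorizes a customer, creates a short-lived tunnel, and tears it
    down when the transfer completes - on demand, not always-on."""
    def __init__(self, authorized: set):
        self.authorized = authorized
        self.tunnels = {}               # token -> Tunnel, active sessions only

    def open(self, customer_id: str) -> Tunnel:
        if customer_id not in self.authorized:
            raise PermissionError(f"{customer_id} is not authorized")
        t = Tunnel(customer_id, secrets.token_hex(16))
        self.tunnels[t.token] = t
        return t

    def close(self, token: str) -> None:
        self.tunnels.pop(token, None)   # tear down; nothing lingers

broker = TunnelBroker(authorized={"acme-corp"})
t = broker.open("acme-corp")
# ... transfer the data set over the optimized, encrypted session ...
broker.close(t.token)
assert t.token not in broker.tunnels    # session is gone once the job is done
```

The point of the sketch is the lifecycle: because `open` and `close` are cheap, programmatic calls, they can be folded into the provider's provisioning process instead of consuming human operational cycles for each customer connection.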

But Lori, you say, looking at the eye-candy diagram, it looks like such a solution requires hardware at both ends of the connection.

Yes, yes it does. Or at least it requires a solution at the cloud provider and a solution at each customer site. Maybe that’s hardware, maybe it’s not. But you aren’t going to get around the fact that a secure, encrypted, accelerated on-demand session that enables a more efficient and secure transfer of large data sets or virtual machine images across the Internet is going to require some network and application level optimization and acceleration. We’ve been down this road before, we know where it ends: point-to-point encrypted, optimized, and accelerated tunnels.
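What the "optimization" half of that buys you can be seen in miniature: compressing redundant data before it enters the encrypted tunnel cuts the bytes that actually cross the wire. Here zlib stands in for a WAN optimization controller's compression and deduplication; real appliances do considerably more, but the principle is the same:

```python
# WAN optimization in miniature: shrink the payload before it enters the
# encrypted tunnel. zlib is a stand-in for an appliance's compression and
# deduplication; the payload is a made-up, highly redundant BI export.
import zlib

payload = b"SELECT region, SUM(sales) ..." * 10000   # repetitive, like real exports
compressed = zlib.compress(payload, level=6)

ratio = len(compressed) / len(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes ({ratio:.1%} of original)")
```

Structured BI data is typically highly redundant, so the bytes-on-wire reduction is dramatic; combine that with TCP-level acceleration and you get the "more efficient" part of the transfer, while the tunnel's encryption supplies the "secure" part.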

You’ll note that Amazon and Google already figured this one out, and yes, it’s going to be proprietary and it’s going to require software/hardware/something on both ends of the connection.

The difference with cloud – and it is a big difference, make no mistake about that – is that a cloud provider needs to support hundreds or perhaps thousands of periodic sessions with remote sites. That means it needs to be on-demand and not always-on as most site-to-site tunnels are today.

Your next observation will be that if this is going to require a solution on both sides of the relationship, then who gets to decide what that solution is? Good question. Probably the cloud provider. Again, you'll note that Amazon and Google have already decided, at least in the short term. There are no standards for point-to-point connectivity of this kind. There are IPsec VPNs and SSL VPNs, of course, but there are no standards for WAN optimization and no way to connect product A to product B, so the short-term answer is a single-vendor solution. The long-term solution would be to adopt standards for WAN optimization, but before that can happen, WAN optimization controllers need to support an on-demand model. Most, unfortunately, do not.

You can, of course, continue to use a sneakernet. It’s an option. You can also continue to transfer over the Internet in the clear, which may or may not be an acceptable method of data transfer for your organization. But there is another solution out there and it’s not nearly as difficult to implement as you might think – as long as you have a solution capable of providing such a service in the first place and your cloud provider is willing to offer or, apparently, reinvent it.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
