


Useful Cloud Advice, Part Two. Applications

Where cloud usage makes sense given the current state of cloud evolution

This is the second part of this series on things you need to consider, and where cloud usage makes sense given the current state of cloud evolution. The first one, Cloud Storage, can be found here. The point of the series is to help you figure out what you can do now, and what you have to consider when moving to the cloud. This will hopefully help you weigh your options when pressure from the business or management to “do something” mounts. Once again, our definition of cloud is Infrastructure as a Service (IaaS) – “VM containers” – not SOA or other variants of cloud. For our purposes, we’ll also assume “public cloud”. The reasoning here is simple: if you’re implementing an internal cloud, you’re likely already heavily virtualized and you don’t have the external vendor issues, so you don’t really need this advice – though some of it will still apply to you, so read on anyway.


By now, most of you will have read about the FBI seizures in Texas data centers that impacted innocent corporations because of where the data the FBI was after was stored in these shared environments. If you haven’t, take a second and read the article on CNET – it’s worth the read (hat tip to both James Urquhart for his excellent article, and George V. Hulme for pointing it out). This is an issue that you’re not going to get relief from a vendor for. The government is going to have to acknowledge, when raiding a shared datacenter, that VMs, not hardware, are the evidence it’s after. Law enforcement always lags behind tech innovation, so I’m confident that will come, but be aware of it if you are placing applications in the cloud between now and then.

There’s the further issue of what standards and regulations your cloud provider will support. I haven’t checked too deeply into the cloud space, but what I’ve seen is a wildly varying range of coverage, so check into your chosen vendor before you get knee deep in implementation. This seems obvious to those who have a solid outsourcing policy/procedure in place, but isn’t obvious to those who don’t do a lot of outsourcing, so it’s worth mentioning. Your provider must not cause you to violate any standard or regulation you are subject to, since it’s your data they hold.


Next up is the security of your data – both that which is stored at the cloud service provider and that which is transferred over the wire between you and them. You need a way to ensure encryption is turned on for all connections back to the datacenter (more on these connections in a moment). There are solutions that will take care of this, both offered by some CSPs and through third-party products like our LTM/LTM-VE combination to secure your connections, so you just have to concern yourself with the data at rest. If that data is in a database, you can encrypt it with the database’s tools; if it is in a file accessed by a web application, you’ll have to encrypt/decrypt in code.

There have been many people saying that your VM is on the public Internet, and so would your server be if it were a physical install in your datacenter, and to a limited extent they’re right. There are not a ton of known weaknesses introduced by VM engines, but that is not a complete answer. IBM used to tout how secure mainframes were because there were no major penetrations of them… because most of them weren’t on the Internet. Eventually, one came along. Expect the same of a cloud platform. It’s not that VMs are inherently insecure – indeed, in many ways they help with security – but they are another layer between application and hardware in which holes can be found. So eventually one will be found. Secure your data, for safety’s sake.


There is one truth to putting applications in the cloud: eventually they will increase traffic on your Internet connection. No matter how you slice it, putting an application out there will require you to have access to the information it is responsible for. Be it a dashboard that is monitored or heavy traffic coming back for authentication, you’re going to see more traffic, so make sure you can handle it. Products like BIG-IP LTM can help by optimizing TCP connections and doing compression should you need it, but there’s more to this than just the bandwidth.
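To see why compression helps on these links, here’s a quick standard-library sketch – a stand-in for what a device like BIG-IP LTM would do on the wire, with an invented payload for illustration:

```python
import gzip

# A hypothetical payload of the kind an app sends back to the datacenter --
# repetitive, structured text compresses especially well.
payload = b'{"user": "alice", "action": "login", "status": "ok"}\n' * 500

compressed = gzip.compress(payload)
ratio = len(compressed) / len(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes ({ratio:.1%} of original)")
```

Repetitive application chatter like this typically shrinks dramatically; binary or already-compressed media will not, which is why compression is a “should you need it” feature rather than a cure-all.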

The more connected to your datacenter a remote app is, the more fragile it is, and the greater a security risk it is. This is just a function of networking: if your connection goes down, a dependent application goes down with the datacenter, and if there are WAN link – or even LAN – slowdowns or congestion at your datacenter, dependent applications will feel it. If you give your cloud VMs too much trust on their trips back to the datacenter, they may become a trusted gateway into your network should a hacker gain access to your application in the cloud. This is true of any Internet-enabled server, so it should be minimized by locking them down, but it can also be minimized by containing connections between the two.


You never, ever, ever get something worth having for free. Cloud is cheaper up-front than purchasing more computing power, but you should watch the TCO numbers closely to make certain you’re getting the bargain you think you are. Cloud Service Providers are not some great evil that wants to get rich off of you, but they are a business, and businesses exist to make money. Keep an eye on your total monthly bill, know what they’re charging you for and what the rates are, and at least as importantly, under what circumstances those rates can change. It is a simple exercise to concoct a hypothetical scenario where you pay far more for cloud VMs than for in-house VMs, but you know your specific situation and your vendors’ rates (or proposed rates if you’re in the selection phase), so hypothetical situations aren’t necessary – just do some extra homework.
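That homework is simple arithmetic. All figures below are invented purely for illustration – substitute your actual quotes and contract terms – but a back-of-the-envelope comparison looks like this:

```python
# All figures are hypothetical -- plug in your own vendor's numbers.
months = 36  # a three-year comparison window

# In-house: big up-front spend, modest monthly run cost.
inhouse_hardware = 25_000          # servers, licenses
inhouse_monthly = 400              # power, space, share of admin time
inhouse_tco = inhouse_hardware + inhouse_monthly * months

# Cloud: no up-front spend, but per-VM and bandwidth charges every month --
# and rates that can change unless they're locked in at contract time.
cloud_monthly = 900                # VM instances + egress bandwidth
cloud_tco = cloud_monthly * months

print(f"In-house 3-year TCO: ${inhouse_tco:,}")
print(f"Cloud    3-year TCO: ${cloud_tco:,}")
```

At these made-up rates the cloud comes out ahead, but nudge `cloud_monthly` up (or stretch the window past three years) and the comparison flips – which is exactly why you run the numbers with your own rates rather than trusting anyone’s hypothetical.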


These are all issues you can work around, and that’s where this blog is going: how to choose applications wisely, and when to consider cloud usage. Lots of other people have proposed where cloud makes the most sense, so I’ll keep this short and just offer you some solid suggestions.

Best Fit.

As a first project, you want something that is not highly sensitive security-wise and won’t require a ton of connections back to the datacenter. The usefulness of cloud can then be evaluated in the context of lower risk, while you explore how to fill the gaps listed above.

1. Application Testing. If you’re already developing in VMs and utilizing scrubbed data for testing, this is a no-brainer. You can throw your test environment out in the cloud and see if it flies. The organizations that have already done this say it’s a great fit, though if you rely on infrastructure like BIG-IP to help cut development time or improve application performance, check whether the vendor in question supports cloud deployments, or at a minimum has a virtual version of their product.

2. Stand-Alone Departmental Apps. An application being implemented for a department that doesn’t require connections to datacenter services can be a good fit for the cloud, but make sure the business leader understands the limitations of connecting back to the datacenter in the future. It never fails that this “stand-alone” application will need access to datacenter services, but if they’re willing to work with you on what/where/when/how, you can take advantage of the application’s infancy to put it in the cloud. Since we’re talking about VM hosting services, you can also tell them that if the application eventually needs access to the datacenter, you can move it back in-house.

3. Email. This is one that a lot of organizations with global operations move early, simply because you can free up space in many datacenters around the globe, move to a single email system, and hold your vendor accountable for access… but remember that internal email will leave the company’s network, so you’ll have to make sure you’re covered on the security front.

Worst Fit.

1. Sensitive stuff. Anything you’re going to be audited on, or that falls under a regulation requiring you to control the hardware, is a bad fit, as is anything that passes a ton of personally identifiable information – though the latter might work, depending upon your requirements and your vendor. You can talk with them, but this project will take a lot longer to get running than the ones in the “best fit” category.

2. Constantly high-volume applications. Just be careful doing TCO comparisons on these applications. Cloud is still new and we haven’t heard the horror stories yet, but no architecture is a best fit for everything, and again, you could conceivably end up paying more for such an application than for purchasing hardware if you do a three-year comparison. Do your homework and you’ll be okay, as long as you lock rates in at contract negotiation time.


The longer-term appeal of cloud is scalability, particularly for new applications in an existing organization or a new organization. If your application goes wild and your hits spike by a factor of 10,000 overnight, you can be covered if you have access to auto-scale in (or into) the cloud. That’s quite an appeal, but first let’s get everyone comfortable with the idea of cloud and understanding its ins and outs. Technology is full of great ideas that have flopped because, as my dear mother says, they “got the cart before the horse”.

With an ever-increasing amount of infrastructure available in VMs for supporting the cloud, we’ll eventually have a situation where your datacenter walls are somewhat transparent, but for now you should choose carefully what goes to the cloud and what stays safely at home.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University and an M.S. in Computer Science from Nova Southeastern University.
