


Useful Cloud Advice, Part Two: Applications

Where cloud usage makes sense given the current state of cloud evolution

This is the second part of a series about the things you need to consider, and where cloud usage makes sense, given the current state of cloud evolution. The first one, Cloud Storage, can be found here. The point of the series is to help you figure out what you can do now, and what you have to consider when moving to the cloud. Hopefully it will help you weigh your options when pressure from the business or management to “do something” mounts. Once again, our definition of cloud is Infrastructure as a Service (IaaS) – “VM containers” – not SOA or other variants of cloud. For our purposes, we’ll also assume “public cloud”. The reasoning here is simple: if you’re implementing an internal cloud, you’re likely already heavily virtualized and you don’t have the external vendor issues, so you need this advice less – though some of it will still apply to you, so read on anyway.


By now, most of you will have read about the FBI seizures in Texas data centers that impacted innocent corporations, because the data the FBI was after was stored in the same shared environments. If you haven’t, take a second and read the article on CNET – it’s worth the read, and we’ll wait for you (hat tip to both James Urquhart for his excellent article, and George V. Hulme for pointing it out). Okay then, this is an issue that you’re not going to get relief from a vendor for. The government is going to have to acknowledge, when raiding a shared datacenter, that VMs – not hardware – are the evidence they’re after. Law enforcement always lags behind tech innovation, so I’m confident that will come, but be aware of it if you are placing applications in the cloud between now and then.

There’s the further issue of which standards and regulations your cloud provider will support. I haven’t checked too deeply into it in the cloud space, but what I’ve seen is a wildly varying range of coverage, so check into your chosen vendor before you get knee-deep in implementation. This seems obvious to those that have a solid outsourcing policy/procedure in place, but isn’t obvious to those that don’t do a lot of outsourcing, so it’s worth mentioning. Your provider must not cause you to violate any standard or regulation you are subject to – since it’s your data they have.


Next up is the security of your data – both that which is stored at the cloud service provider and that which is transferred over the wire between you and them. You need a way to ensure encryption is turned on for all connections back to the datacenter (more on these connections in a moment). There are solutions that will take care of this, both offered by some CSPs and through third-party products like our LTM/LTM-VE combination to secure your connections, so you just have to concern yourself with the data at rest. If that data is in a database, you can encrypt it with the database’s tools; if it is in a file accessed by a web application, you’ll have to encrypt/decrypt in code.
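As a small illustration of the in-transit side, a client can refuse any connection back to the datacenter that isn’t properly encrypted and verified. This sketch uses Python’s standard ssl module; the hostname is a hypothetical placeholder, not a real endpoint.

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a TLS context that refuses unverified or legacy-protocol connections."""
    ctx = ssl.create_default_context()             # loads the system CA bundle
    ctx.check_hostname = True                      # reject certificate/hostname mismatches
    ctx.verify_mode = ssl.CERT_REQUIRED            # require a valid certificate chain
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # disallow SSLv3/TLS 1.0/1.1
    return ctx

# Wrap any socket headed back to the datacenter with this context, e.g.:
# with socket.create_connection(("dc.example.com", 443)) as sock:
#     with strict_tls_context().wrap_socket(sock, server_hostname="dc.example.com") as tls:
#         ...  # traffic on this socket is now encrypted and verified
```

The point is that encryption is enforced in one place, rather than hoped for at each call site.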

There have been many people saying that your VM is on the public Internet, just as your server would be if it were a physical install in your datacenter, and to a limited extent, they’re right. There are not a ton of known weaknesses introduced by VM engines, but that is not a complete answer. IBM used to tout how secure mainframes were because there were no major penetrations of them… because most of them weren’t on the Internet. Eventually, one came along. Expect the same of a cloud platform: it’s not that VMs are inherently insecure – indeed, in many ways they help with security – but they are another layer between app and hardware that holes can be found in. So eventually one will be found. Secure your data, for safety’s sake.


There is one truth to putting applications in the cloud: eventually they will increase traffic on your Internet connection. No matter how you slice it, putting an application out there will require you to have access to the information that it is responsible for. Be it a dashboard that is monitored, or heavy traffic coming back for authentication, you’re going to see more traffic – make sure you can handle it. Products like BIG-IP LTM can help by optimizing TCP connections and doing compression should you need it, but there’s more to this than just the bandwidth.
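To see why compression helps with that extra bandwidth, here’s a quick sketch using Python’s standard zlib module on a repetitive, text-like payload; the sample telemetry data is entirely made up.

```python
import zlib

# Made-up, repetitive JSON-ish telemetry - the kind of chatter that flows
# between a cloud-hosted app and a monitoring dashboard back home.
payload = b'{"status": "ok", "cpu": 12, "mem": 512}\n' * 200

compressed = zlib.compress(payload, 6)   # level 6: the usual speed/size tradeoff
ratio = len(compressed) / len(payload)

print(f"original: {len(payload)} bytes, "
      f"compressed: {len(compressed)} bytes ({ratio:.1%} of original)")
```

Repetitive text compresses dramatically, which is exactly why a compressing proxy or ADC pays off on chatty cloud-to-datacenter links.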

The more connected to your datacenter a remote app is, the more fragile it is, and the greater the security risk. This is just a function of networking: if your connection goes down, a dependent application goes down with the datacenter, and if there are WAN link – or even LAN – slowdowns or congestion at your datacenter, dependent applications will feel it. If you give your cloud VMs too much trust on their trips back to the datacenter, they may become a trusted gateway into your network should a hacker gain access to your application in the cloud. This is true of any Internet-enabled server, so it should be minimized by locking them down, but it can also be minimized by containing connections between the two.
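One way to “contain connections between the two” is to treat the cloud VM’s view of the datacenter as an explicit, deny-by-default allowlist rather than general trust. A minimal sketch – the service names and ports here are hypothetical:

```python
# Hypothetical allowlist: the only datacenter endpoints a cloud VM may reach.
# In practice this lives in firewall rules or security groups, not app code,
# but the policy shape is the same.
ALLOWED_BACKENDS = {
    ("auth.internal.example.com", 636),    # LDAPS, for authentication only
    ("reports.internal.example.com", 443), # read-only reporting API
}

def may_connect(host: str, port: int) -> bool:
    """Deny by default; permit only explicitly listed datacenter endpoints."""
    return (host, port) in ALLOWED_BACKENDS
```

A compromised cloud VM can then only touch the two services it legitimately needs, not wander your internal network.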


You never, ever, ever, get something worth having for free. Cloud is cheaper up-front than purchasing more computing power, but you should watch the TCO numbers closely to make certain you’re getting the bargain you think you are. Cloud Service Providers are not some great evil that wants to get rich off of you, but they are a business, and businesses exist to make money. Keep an eye on your total monthly bill, know what they’re charging you for and what the rates are, and at least as importantly, under what circumstances those rates can change. It is a simple exercise to concoct a hypothetical scenario where you pay far more for cloud VMs than in-house VMs, but you know your specific situation and your vendors’ rates (or proposed rates if you’re in the selection phase), so hypothetical situations aren’t necessary, just do some extra homework.
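That homework can be as simple as a cumulative-cost comparison over the planning horizon. This sketch uses entirely made-up figures, purely to show the shape of the calculation:

```python
def breakeven_month(cloud_monthly, hw_upfront, hw_monthly, horizon=36):
    """Return the first month where cumulative cloud cost exceeds the
    in-house cost (hardware purchase plus monthly operations), or None
    if cloud stays cheaper over the whole horizon."""
    for month in range(1, horizon + 1):
        cloud_total = cloud_monthly * month
        inhouse_total = hw_upfront + hw_monthly * month
        if cloud_total > inhouse_total:
            return month
    return None

# Hypothetical figures: $700/mo cloud vs. $12,000 hardware + $250/mo operations.
# Over a three-year horizon, the cloud bill overtakes in-house in month 27.
m = breakeven_month(700.0, 12_000.0, 250.0)
```

Plug in your own vendor’s rates (and rate-change clauses) instead of hypotheticals, and the three-year picture falls out quickly.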


These are all issues you can work around, and that’s where this blog is going: how to choose applications wisely, and when to consider cloud usage. Lots of other people have proposed where cloud makes the most sense, so I’ll keep this short and just offer you some solid suggestions.

Best Fit.

As a first project, you want something that is not highly sensitive security-wise and won’t require a ton of connections back to the datacenter. The usefulness of cloud can then be evaluated in a context of lower risk, while exploring how to fill the gaps listed above.

1. Application Testing. If you’re already developing in VMs and utilizing scrubbed data for testing, this is a no-brainer. You can throw your test environment out in the cloud and see if it flies. The organizations that have already done this say it’s a great fit, though if you rely on infrastructure like BIG-IP to help cut development time or improve application performance, check to see if the vendor in question supports cloud deployments, or at a minimum has a virtual version of their product.

2. Stand-Alone Departmental Apps. An application being implemented for a department that doesn’t require connections to datacenter services can be a good fit for the cloud, but make sure the business leader understands the limitations of connecting back to the datacenter in the future. It never fails that this “stand-alone” application will need access to datacenter services, but if they’re willing to work with you on what/where/when/how, you can take advantage of the application’s infancy to put it in the cloud. Since we’re talking about VM hosting services, you can also tell them that if the application eventually needs access to the datacenter, you can move it back in-house.

3. Email. This is one that a lot of organizations with global operations move early, simply because you can free up space in many datacenters around the globe, move to a single email system, and hold your vendor accountable for access… but remember that internal email will leave the company’s network, so you’ll have to make sure you’re covered on the security front.

Worst Fit.

1. Sensitive stuff. Anything you’re going to be audited on, or that is subject to a regulation requiring you to have control of the hardware, is a bad fit, as is anything that passes a ton of personally identifiable information – though the latter might work, depending upon your requirements and your vendor. You can talk with them, but this project will take a lot longer to get running than the ones in the “best fit” category.

2. Constantly high-volume applications. Just be careful doing TCO comparisons on these applications. Cloud is still new, and we haven’t heard the horror stories yet, but no architecture is a best fit for everything, and again, you could conceivably end up paying more for such an application than for purchasing hardware, if you do a three-year comparison. Do your homework, and you’ll be okay, as long as you lock rates in at contract negotiation time.


The longer-term appeal of cloud is scalability, particularly for new applications in an existing organization, or for a new organization. If your application goes wild and your hits spike by a factor of 10,000 overnight, you can be covered if you have access to auto-scale in (or into) the cloud. That’s quite an appeal, but first let’s get everyone comfortable with the idea of cloud, and understanding the ins and outs. Technology is full of great ideas that have flopped because, as my dear mother says, they “got the cart before the horse”.
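At its simplest, that auto-scale decision is just a threshold rule: grow the instance count until per-instance load falls back to a target. A toy sketch, with entirely hypothetical numbers:

```python
import math

def desired_instances(current, requests_per_instance,
                      target_per_instance=500.0,
                      min_instances=1, max_instances=50):
    """Scale the instance count so per-instance load approaches the target,
    clamped to a sane minimum and a cost-capping maximum."""
    total_load = current * requests_per_instance      # aggregate requests/sec
    needed = math.ceil(total_load / target_per_instance)
    return max(min_instances, min(max_instances, needed))

# An overnight spike: 2 instances suddenly seeing 50,000 req/s each
# would want 200 instances, but the cap holds spend at 50.
spike = desired_instances(2, 50_000.0)
```

Real auto-scalers add cooldowns and smoothing on top of this, but the cap matters most: unbounded scaling is also an unbounded bill.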

With an ever-increasing amount of infrastructure available in VMs for supporting the cloud, we’ll eventually have a situation where your datacenter walls are somewhat transparent, but for now you should choose carefully what goes to the cloud and what stays safely at home.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University and an M.S. in Computer Science from Nova Southeastern University.
