
Useful Cloud Advice, Part Two. Applications

Where cloud usage makes sense given the current state of cloud evolution

This is the second part of this series on things you need to consider, and where cloud usage makes sense given the current state of cloud evolution. The first one, Cloud Storage, can be found here. The point of the series is to help you figure out what you can do now and what you have to consider when moving to the cloud. Hopefully it will help you weigh your options when pressure from the business or management to “do something” mounts. Once again, our definition of cloud is Infrastructure as a Service (IaaS) – “VM containers” – not SOA or other variants of cloud. For our purposes, we’ll also assume “public cloud”. The reasoning here is simple: if you’re implementing an internal cloud, you’re likely already heavily virtualized and you don’t have the external vendor issues, so you don’t really need this advice – though some of it will still apply to you, so read on anyway.


By now, most of you will have read about the FBI seizures in Texas data centers that impacted innocent corporations because of where the data the FBI was after was stored in these shared environments. If you haven’t, take a second and read the article on CNET – we’ll wait, and it’s worth the read (hat tip to both James Urquhart for his excellent article and George V. Hulme for pointing it out). Okay then, this is an issue that you’re not going to get relief from a vendor for. The government is going to have to acknowledge when raiding a shared datacenter that VMs, not hardware, are the evidence it’s after. Law enforcement always lags behind tech innovation, so I’m confident that will come, but be aware of it if you are placing applications in the cloud between now and then.

There’s the further issue of what standards and regulations your cloud provider will support. I haven’t checked too deeply into it in the cloud space, but what I’ve seen is a wildly varying range of coverage, so check into your chosen vendor before you get knee deep in implementation. This seems obvious to those who have a solid outsourcing policy/procedure in place, but it isn’t obvious to those who don’t do a lot of outsourcing, so it’s worth mentioning. Your provider must not cause you to violate any standard or regulation you are subject to, since it’s your data they hold.


Next up is the security of your data – both the data stored at the cloud service provider and the data transferred over the wire between you and them. You need a way to ensure encryption is turned on for all connections back to the datacenter (more on these connections in a moment). There are solutions that will take care of this, both offered by some CSPs and through third-party products like our LTM/LTM-VE combination to secure your connections, so you just have to concern yourself with the data at rest. If that data is in a database, you can encrypt it with the database tools; if it is in a file accessed by a web application, you’ll have to encrypt/decrypt in code.
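Where you control the client code, the in-transit half is easy to enforce directly. A minimal sketch using Python’s standard ssl module (the datacenter hostname is a placeholder, not a real endpoint): refuse any connection back to the datacenter that isn’t certificate-verified TLS.

```python
import socket
import ssl

def make_strict_tls_context() -> ssl.SSLContext:
    """Build a TLS context that refuses unverified or legacy-protocol peers."""
    ctx = ssl.create_default_context()            # verifies certs against system CAs
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSLv3 / TLS 1.0 / TLS 1.1
    return ctx                                    # check_hostname is on by default

def connect_to_datacenter(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a verified TLS socket to a (hypothetical) datacenter endpoint."""
    raw = socket.create_connection((host, port), timeout=10)
    # server_hostname enables SNI and hostname verification
    return make_strict_tls_context().wrap_socket(raw, server_hostname=host)
```

With a context like this, a plaintext or self-signed endpoint fails loudly at connect time instead of silently moving your data in the clear.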

There have been many people saying that your VM is on the public Internet just as your server would be if it were a physical install in your datacenter, and to a limited extent, they’re right. There are not a ton of known weaknesses introduced by VM engines, but that is not a complete answer. IBM used to tout how secure mainframes were because there were no major penetrations of them… largely because most of them weren’t on the Internet. Eventually, one came along. Expect the same of a cloud platform: it’s not that VMs are inherently insecure – indeed, in many ways they help with security – but they are another layer between app and hardware that holes can be found in. So eventually one will be found. Secure your data, for safety’s sake.


There is one truth to putting applications in the cloud: eventually they will increase traffic on your Internet connection. No matter how you slice it, putting an application out there will require you to have access to the information it is responsible for. Be it a dashboard that is monitored or heavy traffic coming back for authentication, you’re going to see more traffic, so make sure you can handle it. Products like BIG-IP LTM can help by optimizing TCP connections and doing compression should you need it, but there’s more to this than just the bandwidth.
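To get a feel for what compression buys you on chatty, text-heavy traffic, here’s a quick standard-library sketch. The payload is made up for illustration; real savings depend entirely on how repetitive your data is.

```python
import gzip
import json

# A hypothetical monitoring-dashboard payload flowing between
# a cloud-hosted app and the datacenter.
payload = json.dumps(
    [{"host": f"vm-{i:03d}", "status": "ok", "cpu": 12.5} for i in range(500)]
).encode()

compressed = gzip.compress(payload)
ratio = len(compressed) / len(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

Highly structured JSON like this often shrinks dramatically, which is why compression at the network layer can defer a bandwidth upgrade, though it spends CPU to do it.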

The more connected to your datacenter a remote app is, the more fragile it is, and the greater a security risk it is. This is just a function of networking – if your connection goes down, a dependent application goes down with it, and if there are WAN-link – or even LAN – slowdowns or congestion at your datacenter, dependent applications will feel it. If you give your cloud VMs too much trust on their trips back to the datacenter, they may become a trusted gateway into your network should a hacker gain access to your application in the cloud. This is true of any Internet-enabled server, so it should be minimized by locking them down, but it can also be minimized by containing connections between the two.


You never, ever get something worth having for free. Cloud is cheaper up front than purchasing more computing power, but you should watch the TCO numbers closely to make certain you’re getting the bargain you think you are. Cloud service providers are not some great evil that wants to get rich off of you, but they are businesses, and businesses exist to make money. Keep an eye on your total monthly bill, know what they’re charging you for and what the rates are, and at least as importantly, under what circumstances those rates can change. It is a simple exercise to concoct a hypothetical scenario where you pay far more for cloud VMs than for in-house VMs, but you know your specific situation and your vendors’ rates (or proposed rates if you’re in the selection phase), so hypothetical situations aren’t necessary – just do some extra homework.
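A back-of-the-envelope comparison makes the point. Every number below is invented for illustration – substitute your vendor’s actual rates and your own hardware and operating costs.

```python
def three_year_cost_cloud(hourly_rate: float,
                          hours_per_month: int = 730,
                          months: int = 36) -> float:
    """Cumulative cost of one always-on cloud VM at a flat hourly rate."""
    return hourly_rate * hours_per_month * months

def three_year_cost_inhouse(hardware: float,
                            monthly_opex: float,
                            months: int = 36) -> float:
    """Up-front hardware purchase plus monthly power/admin for in-house."""
    return hardware + monthly_opex * months

# Hypothetical figures: $0.50/hr VM, always on, vs. a $6,000 server
# costing $150/month to run.
cloud = three_year_cost_cloud(0.50)
inhouse = three_year_cost_inhouse(6000, 150)
print(f"cloud: ${cloud:,.0f}  in-house: ${inhouse:,.0f}")
```

With these made-up numbers the always-on cloud VM comes out more expensive over three years – exactly the kind of crossover a real TCO comparison with your own rates will expose.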


These are all issues you can work around, and that’s where this blog is going. How to choose applications wisely, and when to consider cloud usage. Lots of other people have proposed where cloud makes the most sense, so I’ll keep this short, and just offer you some solid suggestions.

Best Fit.

As a first project, you want something that is not highly sensitive security-wise and won’t require a ton of connections back to the datacenter. The usefulness of cloud can then be evaluated in the context of lower risk, while you explore how to fill the gaps listed above.

1. Application Testing. If you’re already developing in VMs and utilizing scrubbed data for testing, this is a no-brainer. You can throw your test environment out in the cloud and see if it flies. The organizations that have already done this say it’s a great fit, though if you rely on infrastructure like BIG-IP to help cut development time or improve application performance, check to see if the vendor in question supports cloud deployments, or at a minimum has a virtual version of their product.

2. Stand-Alone Departmental Apps. An application being implemented for a department that doesn’t require connections to data center services can be a good fit for the cloud, but make sure the business leader understands the limitations of connecting back to the datacenter in the future. It never fails that this “stand-alone” application will need access to data center services, but if they’re willing to work with you on what/where/when/how, you can take advantage of the application’s infancy to put it in the cloud. Since we’re talking about VM hosting services, you can also tell them that if the application eventually needs access to the datacenter, you can move it back in-house.

3. Email. This is one that a lot of organizations with global operations move early, simply because you can free up space in many datacenters around the globe, move to a single email system, and hold your vendor accountable for access… But remember that internal email will leave the company’s network, so you’ll have to make sure you’re covered on the security front.

Worst Fit.

1. Sensitive stuff. Anything you’re going to be audited on, or that is subject to a regulation requiring you to control the hardware, is a bad fit, as are things that pass a ton of personally identifiable information. The latter might work, depending upon your requirements and your vendor – you can talk with them, but this project will take a lot longer to get running than the ones in the “best fit” category.

2. Constantly high-volume applications. Just be careful doing TCO comparisons on these applications. Cloud is still new, and we haven’t heard the horror stories yet, but no architecture is a best fit for everything, and again, you could conceivably end up paying more for such an application than for purchasing hardware if you do a three-year comparison. Do your homework and you’ll be okay, as long as you lock rates in at contract negotiation time.


The longer-term appeal of cloud is scalability, particularly for new applications in an existing organization or a new organization. If your application goes wild and your hits spike by a factor of 10,000 overnight, you can be covered if you have access to auto-scale in (or into) the cloud. That’s quite an appeal, but first let’s get everyone comfortable with the idea of cloud and understanding the ins and outs. Technology is full of great ideas that have flopped because, as my dear mother says, they “got the cart before the horse”.

With an ever-increasing amount of infrastructure available in VMs for supporting the cloud, we’ll eventually have a situation where your datacenter walls are somewhat transparent, but for now you should choose carefully what goes to the cloud and what stays safely at home.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
