
Using the Cloud to Rein in Rogue IT

Five Good Reasons Not to Go Rogue

Cloud services have opened up new opportunities for organizations to operate more efficiently, flexibly, and competitively, but what happens when IT just isn’t moving quickly enough to meet the growing infrastructure needs of various departments?

Well, often, they’ll go rogue. Being able to procure infrastructure in a utility model makes the cloud an attractive choice for departments wanting to sidestep corporate IT processes, controls, and timelines. At the same time, these deployments can become a compounding collection of headaches for IT departments, whose mandates aren’t always understood by every department.

While these out-of-band rogue deployments can bring the enabling potential of the cloud to projects, they also pose security and compliance concerns, and can miss not only the ongoing technology support and stewardship but also the big-picture strategic advantage an enterprise’s IT department brings to the organization.

According to Phil Shih, a senior analyst focused on hosting at Tier1 Research, it’s an emerging issue, but one that has already started to show its face in enterprises.

“We’ve definitely seen this develop across large organizations and large enterprises,” Shih notes. “Development teams have work that requires infrastructure that IT is unable to meet in an efficient and cost-effective manner, so they’re turning to the cloud.”

Shih suggests that traditional infrastructure procurement through IT can take as much as six to nine months for some organizations and require a budget commitment of a year or more, making it hard for developers to resist the call of deploying something themselves by simply “pulling out a credit card and expensing it to their internal budget.”

And there are numerous advantages, Shih says, from the perspective of the rogue developers. Traditional infrastructure procurement can result in overspending, especially in testing and training environments where the need for infrastructure ebbs and flows. The cloud instead provides elastic infrastructure that is turned on when needed, scaled up or down, and turned off when it’s not. For the same reason, it creates flexible cost structures that are comfortable for a department ‘going it alone’ without IT support or approval.

Without IT support on a project, predicting the nature, scope, and duration of infrastructure needs can be even more daunting, making the cloud even more appealing. In fact, even with IT involvement, planning infrastructure needs can be a serious challenge and a strong driver for cloud services.

Shih draws parallels between enterprise departments deploying rogue IT projects and small businesses, a segment he says has been a hungry early adopter of cloud.

“Smaller organizations are nimble enough and open enough to implement cloud,” Shih says. “With larger IT organizations, they often don’t work that way, so it’s a bigger transition for them.”

Of course, parallels also can exist in size and scope, but it all comes back to the single biggest factor: people who need infrastructure quickly.

So, with all this opportunity for efficiency and productivity, where is the problem?

The problem is that IT is tasked with fulfilling the business’s infrastructure needs and tends to have a greater understanding of its regulatory requirements, processes, security controls, and overall strategy.

Mitigating security risks can be a non-stop, full-time job for IT. Threats of losing or compromising corporate data come from all sides today: from hackers and criminal elements, to industrial espionage, to simple and unfortunate human error. Does a department running its own IT projects understand these risks? What steps can it take to ensure the correct security protocols are deployed throughout the project?

And the risks are great. Besides meeting the regulatory standards of security and privacy an organization has committed to, businesses today live and die on the faith their customers place in the protection of their data and information.

If a department uses the cloud to deploy a new application, for example, do its developers even know where any collected data resides? Many departments outside of IT might not recognize that knowing and controlling where data resides can be a regulatory requirement for their business.

What about Sarbanes-Oxley, PCI-DSS (if in the payment card industry), HIPAA (if in healthcare), or PIPEDA (if in Canada)? Are rogue developers aware of the implications these (and other) acts and standards have for their applications and data? Quite possibly not.

According to Tier1’s Shih, rogue IT deployments in larger enterprises today are mainly used as test or training environments and, as such, involve little live data; however, the cloud makes it easy to scale up to live application environments, and it’s safe to assume businesses will ultimately be dealing with rogue customer-facing applications.

“It definitely complicates management and regulatory compliance. You have to understand where your data is stored and how it’s accessed and how it’s secured,” Shih says. “IT has to be involved. It’s a good idea for IT in large organizations to understand what their staff is doing in the cloud and to bring it in under the larger IT strategy of the organization.”

A management nightmare begins to emerge, unless, of course, IT can find a way to commit to a comparably quick response to departmental needs, while ensuring the oversight required for regulatory and security compliance.

The irony is that the cause for the symptoms can also be the cure: the cloud.

Through cloud services, IT departments can internally "sell" computing resources as utilities, providing the scalability and flexibility needed to have departments deploy the technologies they need, when they need them.

This can be done through the development of private Infrastructure-as-a-Service clouds, with IT-approved virtual machines doled out as a utility to internal customers. These clouds could be built and managed internally, delivered through third-party cloud providers, or deployed as a hybrid solution extending existing IT infrastructure into the provider’s cloud.

In simplest terms, IT becomes the formerly rogue internal customer’s cloud services provider.

IT can turn to reliable third-party cloud providers where it can be confident of security controls, service levels, and scalability. IT knows the questions to ask to ensure the business’s needs are met. Better still, the right provider will be as concerned with answering those questions of privacy and compliance as the IT department is with asking them.

IT oversight in cloud projects can also bring to bear a wealth of IT acumen, expertise in software licensing, and economies of scale by tying together projects across the enterprise.

As well, IT departments, with their expertise in negotiating with outside technology suppliers and their industry knowledge, are often best equipped to find a provider, such as CentriLogic, that can offer cloud services meeting enterprise requirements for capacity, security, and scalability while providing a true virtual extension of existing physical infrastructure.

In fact, while some departments might be frustrated by corporate IT’s perceived inability to keep up with the nimbleness they need, by offering infrastructure through cloud services IT departments can deliver the speedy infrastructure deployment desired, coupled with their own expertise, helping ensure the project really works, meets standards of compliance, and guards the business’s intellectual property.

Five Reasons Not to Go Rogue

1. Failure to meet regulatory compliance
2. Placing corporate and customer data at risk
3. Poor link to overall corporate IT strategy
4. Waste of IT expertise and market insight
5. Lost advantage of economies of scale

More Stories By Jim Latimer

Jim Latimer is Vice President of Client Solutions for CentriLogic, a trusted provider of global data center solutions delivering enterprise-class hosting, cloud computing, and managed services to businesses for which data center outsourcing is critical to success. For more information, visit www.centrilogic.com.
