Containers Expo Blog: Article

Can Virtualization Help with Governance?

Five ways data virtualization improves data governance

As with motherhood and apple pie, who can argue with data governance?

Business users like it because it assures critical business decisions are made based on sound data.

IT likes data governance because as the organization's data stewards, it shows they are doing a good job.

Compliance officers and risk managers like data governance because it lets them sleep at night.

Data Governance Is Challenging
Liking it is one thing.  Doing it is another.

Enterprises are struggling to turn the concept of data governance into a reality due to significantly growing data volumes, variety and variability, along with onerous new compliance requirements.

Effective data virtualization can improve data governance in numerous ways.

Five Requirements for More Effective Data Governance
Many articles and white papers define data governance, so it does not make sense to include a lengthy treatment here.  However, it is helpful to identify data governance's most critical requirements.

Data governance is a set of well-defined policies and practices designed to ensure that data is:

  • Accessible - Can the people who need it access the data they need? Does the data match the format the user requires?
  • Secure - Are authorized people the only ones who can access the data? Are non-authorized users prevented from accessing it?
  • Consistent - When two users seek the "same" piece of data, is it actually the same data? Have multiple versions been rationalized?
  • High Quality - Is the data accurate? Has it been conformed to meet agreed standards?
  • Auditable - Where did the data come from? Is the lineage clear? Does IT know who is using it and for what purpose?

Data Virtualization Helps Five Ways
Enterprises cannot buy data governance solutions off-the-shelf because effective data governance requires complex policies and practices, supported by software technology, integrated across the wider enterprise IT architecture.

As such, enterprises are turning to enabling technologies such as data virtualization to support the accessibility, security, consistency, quality, and auditability required for effective data governance.

Data Accessibility
It is generally agreed that as much as 80 percent of any new development effort is spent on data integration, making data access, rather than application development, the most time-consuming and expensive activity.

Most users access their data via business intelligence (BI) and reporting applications.  These applications typically rely on data integration middleware to access and format the data, before the application displays it.  So, ensuring proper governance falls on the data integration middleware.

By eliminating the physical builds and testing that replication and consolidation approaches require, data virtualization provides a more agile and cost-effective method to access, integrate, and deliver data.  This agility lets enterprises provide data access faster and more easily.
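As a minimal sketch of that idea (the source systems, schemas, and function names below are invented for illustration, not any product's API), a virtual view computes its answer from the live sources at query time instead of from a replicated copy:

```python
# Hypothetical source systems, queried live at request time -- no ETL
# build, no staged copy of the data.
CRM = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]
ORDERS = [{"cust_id": 1, "total": 250.0}, {"cust_id": 1, "total": 100.0}]

def customer_order_view():
    """Join CRM and order data at query time; nothing is replicated."""
    totals = {}
    for order in ORDERS:
        totals[order["cust_id"]] = totals.get(order["cust_id"], 0.0) + order["total"]
    return [
        {"name": c["name"], "order_total": totals.get(c["cust_id"], 0.0)}
        for c in CRM
    ]

view = customer_order_view()  # recomputed from the sources on every call
```

Because each call reads the sources directly, the view reflects current data with no physical build step to develop, test, or maintain.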

Data Security
Ensuring that only authorized users can see appropriate data and nothing more is a critical data governance requirement.  This is a straightforward task for single systems and small user counts, but becomes more complex and difficult in larger enterprises with hundreds of systems and thousands of users.

As a first step, many enterprises have implemented single-sign-on technologies that allow individuals to be uniquely authenticated in many diverse systems. However, implementing security policies (i.e., authorization to see or use certain data) in individual source systems alone is often insufficient to ensure the appropriate enterprise-wide data security.  For some hyper-sensitive data, encryption as it moves through the network is a further requirement.

Data virtualization not only leverages single-sign-on capabilities to authorize and authenticate individuals, it can also encrypt any and all data.  As such, data virtualization becomes the data governance focal point for implementing security policies across multiple data sources and consumers.
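A hedged sketch of this focal-point idea (the roles, columns, and data below are hypothetical, not a real product's API): because every request passes through the virtualization layer, one central policy table can filter every result before delivery.

```python
# Hypothetical roles, columns, and data -- a sketch of centralized policy
# enforcement at the virtualization layer.
ROW = {"patient": "Jane Doe", "ssn": "000-00-0000", "diagnosis": "example-dx"}

# One policy table governs every source and consumer.
POLICIES = {
    "clinician": {"patient", "diagnosis"},  # clinical fields only
    "billing":   {"patient", "ssn"},        # billing fields only
}

def fetch(row, role):
    """Deliver only the columns the requesting role is authorized to see."""
    allowed = POLICIES.get(role, set())
    return {k: v for k, v in row.items() if k in allowed}
```

Granting or revoking an authorization is then a single policy change at the virtualization layer, rather than an update to each individual source system.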

Data Consistency
Consider the following commonplace scenario:  Two people attend a meeting with reports or graphs generated from the "same" data, but they show different numbers or results. Likely, they believed they were using the same data.  In reality, they were each using their own replicated, consolidated, aggregated version of the data.

Data virtualization prevents this scenario by establishing consistent, complete canonical data definitions that apply across all business uses.
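The point can be sketched in a few lines (the data and names are invented for illustration): when both consumers draw from one canonical definition, their numbers cannot drift apart the way two independently maintained extracts can.

```python
# Illustrative data; in practice this would live in the source systems.
SALES = [("east", 100), ("west", 150), ("east", 50)]

def canonical_revenue_by_region():
    """The single, shared definition of 'revenue by region'."""
    totals = {}
    for region, amount in SALES:
        totals[region] = totals.get(region, 0) + amount
    return totals

# Two consumers, one definition -- the two reports must agree.
finance_report = canonical_revenue_by_region()
marketing_report = canonical_revenue_by_region()
assert finance_report == marketing_report
```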

Data Quality
Correct and complete data is a critical data governance requirement.  However, data quality is often implemented as an afterthought to data creation and modification, and it is usually performed during data consolidation.  This approach impedes the achievement of good data quality across the enterprise.

The modern trend in data quality and governance, however, is to push the practices of ensuring quality data back toward the source systems, so that data is of the highest quality right from the start.

Data virtualization leverages these "systems of record" when delivering data to the consumer, so it naturally delivers high-quality data. In addition, data virtualization allows data quality practices like enrichment and standardization to occur inline, giving the data stewards more options for ensuring data is of the highest quality when it reaches the consumer.
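A minimal sketch of inline standardization (the alias table and values below are hypothetical): the quality rule runs in the delivery path, so the consumer always receives conformed values regardless of how each source spells them.

```python
# Raw values as three hypothetical sources might store them.
RAW = [{"country": "usa "}, {"country": "U.S.A."}, {"country": "United States"}]

# Standardization rule applied inline, as the data is delivered.
ALIASES = {"usa": "US", "u.s.a.": "US", "united states": "US"}

def standardize(record):
    """Conform a country value to the agreed standard code."""
    key = record["country"].strip().lower()
    return {"country": ALIASES.get(key, record["country"])}

clean = [standardize(r) for r in RAW]  # every consumer sees "US"
```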

Data Auditability
On the data source side, good data governance policy requires that IT can explain where data comes from, and prove its source. On the data consumer side, good data governance policy requires that IT show who used the data, and how it was used.

Traditional data integration copies data from one place to another.  As a result, the copied data becomes "disconnected" from the source, making it difficult to establish a complete source-to-consumer audit trail.

Data virtualization integrates data directly from the original source and delivers it directly to the consumer.  This end-to-end flow, without a disconnected copy of the data in the middle, simplifies and strengthens data governance. When auditing is required, full lineage is readily available at any time within the data virtualization metadata and transaction histories.
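A hypothetical sketch of how that lineage falls out naturally (view names, sources, and users below are invented): because every request flows through the virtualization layer, it can record source-to-consumer lineage as a side effect of delivery.

```python
# Audit trail captured by the virtualization layer as it serves requests.
audit_log = []

def deliver(view_name, sources, user):
    """Serve a view and record its lineage: which sources, which consumer."""
    audit_log.append({"view": view_name, "sources": sources, "user": user})
    return f"data from {view_name}"

deliver("customer_360", ["crm_db", "orders_db"], user="analyst_7")
# audit_log now answers both governance questions: where the data came
# from (sources) and who consumed it (user).
```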

Bottom-line
As data governance becomes increasingly prevalent in enterprise information management strategies, forward-looking organizations are deploying methods that simplify data governance.  Data virtualization platforms such as Composite 6 not only make data governance easier in practice, but also shorten the time to achieve the data governance benefits of consistent, secure, high-quality data for more intelligent business decision-making.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Masters of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
