Apache Spark: A Key to Big Data Initiatives | @CloudExpo #Microservices

As with other data processing technologies, Spark is not suitable for all types of workloads

Apache Spark continues to gain a lot of traction as companies launch or expand their big data initiatives. There is no doubt that it’s finding a place in corporate IT strategies.

The open-source cluster computing framework was developed in the AMPLab at the University of California at Berkeley in 2009 and became an incubated project of the Apache Software Foundation in 2013. By early 2014, Spark had become one of the foundation’s top-level projects, and today it is one of the most active projects managed by Apache.

Because Spark is optimized to run in memory, it can process data much faster than disk-based approaches such as MapReduce. As a result, Spark can deliver much higher performance for certain types of applications. By letting programs load data into a cluster's memory and query it repeatedly, the framework is well suited to machine learning algorithms.
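
To make that access pattern concrete, here is a minimal PySpark sketch of loading a dataset once, caching it in cluster memory, and querying it repeatedly. The file path and column name are hypothetical placeholders, not part of any real deployment.

```python
# A minimal PySpark sketch of the in-memory pattern described above.
# The parquet path and the "score" column are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cache-example").getOrCreate()

# Load once, cache in cluster memory, then query repeatedly --
# the access pattern that iterative machine learning workloads rely on.
events = spark.read.parquet("hdfs:///data/events")  # hypothetical path
events.cache()

# Each pass reuses the cached data instead of rereading it from disk.
for threshold in (10, 100, 1000):
    count = events.filter(events["score"] > threshold).count()
    print(threshold, count)

spark.stop()
```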

As with other data processing technologies, Spark is not suitable for all types of workloads. But companies launching big data efforts can leverage the framework for a variety of projects, such as interactive queries across large data sets; the processing of streaming data from sensors, as with Internet of Things (IoT) applications; and machine learning tasks.
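
As one illustration of the streaming workload mentioned above, the sketch below uses Spark Structured Streaming to compute a rolling per-device average from simulated sensor readings. The socket source, host, port, and field layout are assumptions made for the example; a production IoT pipeline would more likely read from Kafka or a similar message bus.

```python
# A hedged sketch of a sensor-streaming workload with Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import window, avg

spark = SparkSession.builder.appName("sensor-stream").getOrCreate()

# Assume each line arriving on the socket looks like "device_id,temperature".
raw = (spark.readStream
       .format("socket")
       .option("host", "localhost")
       .option("port", 9999)
       .load())

readings = raw.selectExpr(
    "split(value, ',')[0] AS device_id",
    "CAST(split(value, ',')[1] AS DOUBLE) AS temperature",
    "current_timestamp() AS event_time")

# Rolling one-minute average temperature per device.
averages = (readings
            .groupBy(window("event_time", "1 minute"), "device_id")
            .agg(avg("temperature").alias("avg_temp")))

query = averages.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```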

In addition, developers can use Spark to support other processing tasks, taking advantage of the open source framework’s huge set of developer libraries and application programming interfaces (APIs) and comprehensive support of popular languages such as Java, Python, R and Scala.
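
As a small example of those libraries in action, the following snippet fits a linear regression with Spark's MLlib from Python; the toy data and column names are made up purely for illustration.

```python
# A toy MLlib example; data and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("mllib-example").getOrCreate()

df = spark.createDataFrame(
    [(1.0, 2.0, 5.0), (2.0, 1.0, 4.0), (3.0, 3.0, 9.0)],
    ["x1", "x2", "label"])

# Assemble the feature columns into a single vector column, then fit.
assembler = VectorAssembler(inputCols=["x1", "x2"], outputCol="features")
model = LinearRegression(featuresCol="features", labelCol="label") \
            .fit(assembler.transform(df))

print(model.coefficients, model.intercept)
spark.stop()
```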

Apache Spark has three key things going for it that IT organizations should keep in mind:

  1. The framework’s relative simplicity. The APIs are designed for interacting easily and rapidly with data at scale, and they are structured so that application developers can start using Spark right away.
  2. The framework is designed for speed, operating both in-memory and on disk. Spark’s performance can be even greater when supporting interactive queries of data stored in memory.
  3. Spark supports multiple programming languages, as mentioned above, and it includes native support for tight integration with leading storage solutions in the Hadoop ecosystem and beyond (see the sketch after this list).
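
To illustrate the storage integration referred to in the third point, here is a minimal sketch of Spark reading Parquet files from HDFS and querying them with Spark SQL. The path, view name, and column names are hypothetical, and enabling Hive support assumes a configured metastore.

```python
# A minimal sketch of reading Hadoop-ecosystem storage; paths, columns,
# and the Hive metastore are assumptions for illustration only.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("storage-integration")
         .enableHiveSupport()   # optional: also exposes Hive metastore tables
         .getOrCreate())

# Read columnar data directly from HDFS (hypothetical path)...
sales = spark.read.parquet("hdfs:///warehouse/sales")

# ...and query it with Spark SQL (assumes "region" and "amount" columns).
sales.createOrReplaceTempView("sales")
top_regions = spark.sql(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC")
top_regions.show()

spark.stop()
```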

Spark is proving to be well suited for a number of business use cases and is helping companies to transform their big data initiatives and deliver analytics much faster and with greater efficiency.

One company, a provider of cloud-based predictive analytics software specifically designed for the telecommunications industry, is using the full Spark stack as part of its Hadoop-based architecture on MapR. This has helped the company achieve horizontal scalability on commodity hardware and reduce storage and computing costs.

The new technology stack allows the software company to continuously innovate and deliver value to its telecommunications customers by offering predictive insights from the cloud. Today’s telecommunications data comes in higher volumes, at greater frequency, and in more complex structures, particularly as new types of devices generate IoT data and mobile phones support a fast-growing number of apps. The company needs this data to generate predictive insights using data science and predictive analytics, and Spark helps make that possible.

Another business benefiting from Spark is a global pharmaceuticals manufacturer that relies on big data solutions for drug discovery processes. One of the company’s areas of drug research requires lots of interaction with diverse data from external organizations.

Combined Spark and Hadoop workflow and integration layers enable the company’s researchers to leverage thousands of experiments that other organizations have conducted, giving the pharmaceuticals company a significant competitive advantage. The big data solutions the company uses allow it to integrate and analyze data so that it can speed up drug research.

These technologies are now being used for a variety of projects across the enterprise, including video analysis, proteomics, and metagenomics. Researchers can access data directly through a Spark API on a number of databases with schemas designed for their specific analytics needs.
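
The paragraph above describes researchers querying purpose-built databases through a Spark API; the sketch below shows one common way to do that, reading an external database over JDBC into a DataFrame. The connection string, credentials, table, and column names are all placeholders, and the appropriate JDBC driver must be on the Spark classpath.

```python
# A hedged illustration of database access through the Spark DataFrame API.
# Connection details, table, and column names are hypothetical; the JDBC
# driver jar is assumed to be available on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-access").getOrCreate()

experiments = (spark.read
               .format("jdbc")
               .option("url", "jdbc:postgresql://dbhost:5432/research")
               .option("dbtable", "experiments")
               .option("user", "analyst")
               .option("password", "secret")
               .load())

# Researchers can inspect the schema and run ad-hoc analytics directly.
experiments.printSchema()
experiments.groupBy("assay_type").count().show()

spark.stop()
```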

And a third business use case for Spark comes from a service provider that delivers analytics services to various industries. The company deployed the Spark framework in conjunction with its Hadoop big data initiative, and is able to dramatically cut query times and improve the accuracy of analytics results. That has enabled the company to provide enhanced services to its customers.

Clearly, Apache Spark can provide a number of benefits to organizations looking to get the most value out of their information resources and the biggest returns on their big data investments. The framework provides the speed and efficiency improvements companies need to deliver on the promise of big data and analytics.

To further explore the advantages of Spark, see the free interactive eBook, Getting Started with Apache Spark: From Inception to Production, by James A. Scott.

More Stories By Jim Scott

Jim has held positions running Operations, Engineering, Architecture and QA teams in the Consumer Packaged Goods, Digital Advertising, Digital Mapping, Chemical and Pharmaceutical industries. Jim has built systems that handle more than 50 billion transactions per day and his work with high-throughput computing at Dow Chemical was a precursor to more standardized big data concepts like Hadoop.
