GOING HYBRID
The Next Era of Data Management

CONTENTS

PAGE 14
GOING HYBRID: THE NEXT ERA OF DATA MANAGEMENT

PAGE 16
A HYBRID APPROACH TO DATA PROCESSING
MemSQL

PAGE 17
DATA VIRTUALIZATION: THE FOUNDATION FOR A SUCCESSFUL HYBRID DATA ARCHITECTURE
Denodo Technologies

PAGE 18
POWERING REAL-TIME APPLICATIONS AND OFFLOADING OPERATIONAL REPORTS WITH AN OPERATIONAL DATA LAKE
Splice Machine

PAGE 19
ACCELERATE BUSINESS INSIGHTS BY MANAGING HYBRID DATA IN MEMORY
GridGain Systems
SUPPORT HYBRID
STORAGE SYSTEMS
ACHIEVE QUICK
IMPLEMENTATION
Hybrid data storage and warehouse appliances can be deployed quickly within existing infrastructures, at relatively low cost and with a small footprint. These appliances offer high-capacity alternatives for mixed application workloads and virtualized environments.
ENABLE DATA AS A
SERVICE (DAAS)
The challenge is to provide a data environment from which the entire organization can benefit, enabling all parties, no matter how distributed they are, to acquire, transform, move, clean, stage, model, govern, deliver, explore, collect, replicate, share, analyze, catalog, publish, search, back up, and archive the data they are working with. Ultimately, this ends up as a data-as-a-service layer that provides for all of these requirements while ensuring control, security, privacy, reliability, and scalability, along with a great user experience on the front end.
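As a concrete illustration of that layer, the operations above can be unified behind a single service interface that couples every data request with governance. The sketch below is hypothetical, not the API of any product named in this ebook; the `DataCatalog` class and its methods are invented for illustration:

```python
# Minimal, hypothetical sketch of a data-as-a-service facade.
# All names (DataCatalog, publish, search, acquire) are illustrative,
# not the API of any vendor mentioned in this article.

class AccessDenied(Exception):
    pass

class DataCatalog:
    """In-memory stand-in for a DaaS layer: publish, search, and acquire,
    with a per-user permission check guarding data access."""

    def __init__(self):
        self._datasets = {}      # dataset name -> list of records
        self._permissions = {}   # user -> set of readable dataset names

    def grant(self, user, dataset):
        self._permissions.setdefault(user, set()).add(dataset)

    def publish(self, user, dataset, records):
        self.grant(user, dataset)          # publisher can read what it wrote
        self._datasets[dataset] = list(records)

    def search(self, keyword):
        # Catalog search is metadata-only, so no permission check here.
        return [name for name in self._datasets if keyword in name]

    def acquire(self, user, dataset):
        if dataset not in self._permissions.get(user, set()):
            raise AccessDenied(f"{user} may not read {dataset}")
        return list(self._datasets[dataset])

catalog = DataCatalog()
catalog.publish("etl_job", "sales_2015", [{"region": "EMEA", "total": 42}])
catalog.grant("analyst", "sales_2015")
print(catalog.search("sales"))     # ['sales_2015']
```

The point of the facade is that control and security are enforced in one place, rather than re-implemented by every consumer of the data.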
Sponsored Content
differentiation, for example, by
building a single, 360-degree view of
their customers and leveraging advanced
predictive analytics in Apache Hadoop.
Red Hat and Hortonworks are committed
to helping enterprises mine and monetize
their data for deeper business insights.
Learn more about the collaboration at
hortonworks.com/partner/redhat.
HYBRID DEPLOYMENT
SCENARIOS
Hortonworks and Red Hat provide many infrastructure choices for deploying the Hortonworks Data Platform (HDP): on-premises, cloud, and virtualized. Further, our customers have a choice of deploying on Linux or Windows operating systems. We believe you should not be limited to just one option, but should be able to choose the best combination of infrastructure and operating system for the usage scenario. In a hybrid deployment model, you should have all of these options. Our customers ask us to meet their organizations' requirements for the following scenarios:
Cluster Backup
IT Operations teams expect Hadoop to provide robust, enterprise-level capabilities like other systems in the data center, and business continuity through replication across on-premises and cloud-based storage targets is a critical requirement. In HDP 2.2, Hortonworks helped extend the capabilities of Apache
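Cross-site replication of this kind is, at its core, an incremental copy of changed files from one storage target to another. In real Hadoop deployments this is typically done with a tool such as Apache DistCp rather than hand-written code; the stdlib-only sketch below (all names invented for illustration) shows only the copy-what-changed logic behind such a backup:

```python
# Stand-in sketch of incremental replication to a backup target.
# Real Hadoop clusters would use a tool such as Apache DistCp;
# this stdlib-only version just illustrates copy-what-changed.

import hashlib
import shutil
from pathlib import Path

def file_digest(path):
    """Content hash used to detect changed files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(source, target):
    """Copy files from source to target, skipping unchanged ones.
    Returns the relative paths that were actually copied."""
    copied = []
    for src in Path(source).rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source)
        dst = Path(target) / rel
        if dst.exists() and file_digest(dst) == file_digest(src):
            continue  # unchanged since the last backup run
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        copied.append(str(rel))
    return copied
```

Run twice against the same source, the second pass copies nothing, which is what makes repeated backup windows cheap.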
CONCLUSION
Hybrid is more than just a good idea. It's the way forward. As traditional lines blur (IT vs. business, cloud vs. on-premises, big data vs. EDW), it is important that enterprises are prepared to juggle the demands of traditional data systems with a modern data architecture. Monetizing all types of data has emerged as the new battleground, and a hybrid model for data management ensures that tomorrow's enterprises are set up for success.
Call to action: Learn more at http://hortonworks.com/labs/redhat/ and work through the tutorials.
HORTONWORKS
For more information, visit
www.hortonworks.com
Gartner: Hybrid Transaction/Analytical Processing Will Foster Opportunities for Dramatic Business Innovation
https://www.gartner.com/doc/2657815/hybrid-transactionanalytical-processing-foster-opportunities
memsql.com/download
MEMSQL
www.memsql.com
OPERATIONAL DATA
LAKE STRUCTURES THE
UNSTRUCTURED
While Hadoop is a great platform
for unstructured data, it traditionally
has not been conducive to structured,
relational data. Hadoop uses read-only
flat files, which can make it very difficult
to replicate the cross-table schema in
structured data.
A data lake is operationalized via a Hadoop RDBMS (see Figure 1), where Hadoop handles the scale-out and the RDBMS functionality supports structured data and reliable real-time updates. With this setup, the operational data lake is never overwhelmed like a traditional ODS. It's nearly bottomless, or limitless, in its scalability: companies can continue to add as much data as they want because expansion costs so little.
With the data lake, users can extract
structured metadata from unstructured
data on a regular basis and store it in the
operational data lake for quick and easy
querying, thus enabling better real-time
data analysis. And, just as importantly,
because all data is in a single location, the
operational data lake enables easy queries
across structured and unstructured data
simultaneously.
Figure 1. Operational Data Lake Architecture
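To make the metadata-extraction idea concrete, here is a small sketch. It uses Python's built-in sqlite3 purely as a stand-in for the Hadoop RDBMS (Splice Machine itself speaks standard SQL over Hadoop), and the log format, field names, and schema are invented for illustration:

```python
# Sketch: pull structured metadata out of unstructured text and
# store it in an RDBMS for SQL querying. sqlite3 stands in for the
# Hadoop RDBMS here; the log format and schema are illustrative.

import re
import sqlite3

RAW_EVENTS = [
    "2015-03-01 ERROR user=alice msg='checkout failed'",
    "2015-03-01 INFO  user=bob   msg='login ok'",
    "2015-03-02 ERROR user=alice msg='payment timeout'",
]

# Structured fields hiding inside each unstructured line.
PATTERN = re.compile(r"(\d{4}-\d{2}-\d{2})\s+(\w+)\s+user=(\w+)")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, level TEXT, user TEXT, raw TEXT)")

for line in RAW_EVENTS:
    m = PATTERN.search(line)
    if m:
        # Store the extracted metadata alongside the raw text, so both
        # the structured and unstructured views live in one place.
        conn.execute("INSERT INTO events VALUES (?, ?, ?, ?)", (*m.groups(), line))

# The formerly unstructured data is now queryable with ordinary SQL.
rows = conn.execute(
    "SELECT user, COUNT(*) FROM events WHERE level='ERROR' GROUP BY user"
).fetchall()
print(rows)  # [('alice', 2)]
```

Keeping the raw line in the same table is what allows a single query to span structured and unstructured data, as the article describes.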
SPLICING TOGETHER A
SOLUTION: A CASE STUDY
Marketing services company Harte
Hanks needed to power its campaign
management and BI applications to
deliver 360-degree customer views to
its client base, but found that its queries
were slowing to a crawl, taking half an
hour to complete in some cases. Given the
company's prediction that its data would
grow by 30% to 50%, query performance
would only get worse.
Harte Hanks replaced its Oracle RAC databases with Splice Machine, a Hadoop RDBMS, and experienced a 3x to 7x increase in query speeds at a cost 75% lower than its Oracle implementation.
Splice Machine allows Harte Hanks to seamlessly support its OLTP and
CONCLUSION
Creating a Hadoop-based operational data lake to support core applications and services involves selecting scale-out technologies that can effectively encompass the best of all worlds. A Hadoop RDBMS like Splice Machine brings together the scalability of Hadoop, the ubiquity of industry-standard SQL, and the transactional integrity of a fully ACID-compliant RDBMS.
An operational data lake can be an excellent way of implementing a hybrid architecture approach, not only to ride the wave of big data but also to ensure that businesses face smooth sailing in the future.
SPLICE MACHINE
To learn more about how Splice
Machine can power an operational
data lake for your enterprise, visit
www.splicemachine.com.
THE PROMISE
Modern in-memory technology provides the most logical and comprehensive way to harness the computing power necessary to manage the growing demands of hybrid data management. The GridGain In-Memory Data Fabric, available as an open source project (Apache Ignite, incubating) and as a hardened enterprise product, offers companies unique capabilities and a competitive advantage in managing diverse data with the speed and scale necessary to address the requirements of modern cloud, big data, social, and IoT applications.
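GridGain's native interface is Java (Apache Ignite), so as a language-neutral sketch of the core pattern an in-memory layer provides, here is a tiny read-through cache. Every name below is illustrative; this is not GridGain's API:

```python
# Illustrative read-through cache, the core pattern behind in-memory
# data management: keep hot data in RAM, fall back to the slow store
# on a miss. This is NOT GridGain's API, whose real interface is Java.

class ReadThroughCache:
    def __init__(self, backing_store):
        self._store = backing_store   # callable: key -> value
        self._memory = {}             # the in-memory tier
        self.misses = 0               # how often we hit the slow path

    def get(self, key):
        if key not in self._memory:
            self.misses += 1
            self._memory[key] = self._store(key)  # load once, keep in RAM
        return self._memory[key]

def slow_database_lookup(key):
    # Stand-in for a disk-based RDBMS or data lake read.
    return f"row-for-{key}"

cache = ReadThroughCache(slow_database_lookup)
cache.get("cust-1")   # miss: loads from the backing store
cache.get("cust-1")   # hit: served from memory
print(cache.misses)   # 1
```

Serving repeated reads from memory instead of disk is where the speed-at-scale claim for in-memory hybrid data management comes from.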
It's easy to test our promise. Download a free evaluation copy of the GridGain In-Memory Data Fabric at http://www.gridgain.com/download/.
GRIDGAIN SYSTEMS
www.gridgain.com