CA ERWIN
Modeling

THOUGHT LEADERSHIP SERIES

REALE MUTUA
SIMPLIFIES
BACK OFFICE
MANAGEMENT
WITH CA ERWIN
DATA MODELER

Couchbase
NOSQL DATABASES
IN A WORLD OF
BIG DATA

Aerospike
DEMAND FOR
FAST DATA RENEWS
THE NEED FOR
IN-MEMORY
DATABASES

ENABLING THE REAL-TIME ENTERPRISE
MAY 2014


7 QUALITIES OF HIGHLY EFFECTIVE REAL-TIME ENTERPRISES

Real-time information processing, a concept that has been around for a long time, has been in vogue lately. One reason for its popularity is the fact that real-time-capable technology and online services have become very affordable, even for small businesses. Another factor is that real time has the attention and interest of the boardroom and executive suite. The idea of being able to instantaneously sense and respond to threats and opportunities, in the style of a lion in the jungle, has a lot of appeal for business leaders vying for an edge in a fiercely competitive global economy.

At a systems level, real-time capabilities have provided enterprise software architects and administrators with ways to keep close tabs on their systems, to be able to respond quickly in the event of performance lags or outages. Now, managers want the same kind of sense-and-respond capabilities applied to the business at large, in order to react to events arising within their operations, among customers, or within markets.

Of course, this raises a lot of questions. Namely, what information, coming out of data stores that may scale into the petabytes, is of material importance to the business and needs to be identified and thrust forward for real-time delivery and action? Who gets to decide what data is important? How will decision makers know the data is accurate and properly vetted? Are there some data flows that need to be intentionally slowed down to ensure quality and accuracy? What is the best technology to deliver such real-time capabilities when and where it is needed? Who will be the recipient of such data: human beings or applications? And which is more important: that the data be real-time fresh, or that decision makers be able to access data instantaneously, regardless of its age? These are all questions that need to be answered by business and technology leaders as they move forward.

With technology chipping away at the time it takes to gather relevant and accurate data, there's less need for bureaucratic, hierarchical decision-making structures.
However, becoming a real-time enterprise means more than systems upgrades to handle nanosecond response times. Becoming a real-time enterprise requires managing and structuring the organization in new, more open ways. The traditional command-and-control structures that have been in place for decades within many organizations are not built to handle real-time engagement; they are designed to slow or impede reaction times to events on the ground. There was a good reason for this as well, since gathering enough of the right data to make an informed decision was often a laborious, time-consuming process.
Now, with technology continuing to steadily chip away at the time it takes to gather relevant and accurate data, there's less of a requirement for bureaucratic, hierarchical decision-making structures. Plus, today's hyper-competitive global marketplace is increasingly unforgiving of slow decision times. Visibility needs to occur at all levels of the organization, with information streaming in from a multitude of external and internal data sources and flowing through analytical engines that quickly process a wide and ever-changing variety of real-time data feeds. Intelligence needs to be embedded directly into business processes, capable not only of notifying decision makers or consuming applications of what's going on, but also of being prescriptive: providing courses of action that need to be taken.
Emerging technologies now becoming part of the enterprise scene, such as in-memory technology, cloud, mobile, and NoSQL databases, are bringing more real-time capabilities to the fore.
The enormous competitive pressure to become a real-time enterprise is reflected in a survey of organizations conducted by Unisphere Research, a division of Information Today, Inc. The survey of 323 data managers finds considerable pressure within organizations for data and information to be delivered, as needed, in a rapid fashion. A majority of respondents, 77%, describe their end users' interest in or demand for access to real-time data (defined for purposes of the survey as "up to the moment") to accelerate and improve decision-making as important. Close to half of data executives and professionals, 49%, describe this level of interest on the part of end users as "very" to "extremely" important.
Traditional data environments are not keeping up with the explosive growth of data volume (or "big data") and the demand for real-time analytics, the survey also finds. Fewer than one out of 10 data warehouse sites in the survey, for example, can deliver analysis in what respondents would consider a real-time timeframe. Overall, existing database and data warehouse environments are time-consuming for both administrators and end users.
For starters, these new platforms and services are increasing the speed of business itself, enabling faster innovation and time to market at far less cost than in times gone by. In addition, less reliance on the extract, transform, and load (ETL) paradigm means source data arrives at a faster rate. Organizations that enable all levels of decision makers, from line-of-business managers to customer care representatives to field service technicians, to assemble their own reports and interfaces will also achieve faster time to market. Data sources are arising or evolving too quickly, and organizations can't afford to extract, transform, and load every data bit that flows through their networks.
This calls for a cultural shift within many of today's organizations, one that empowers end users at all levels to work with and make decisions based on real-time data. Timeframes are too short, and competition is too intense. Organizations that are achieving this real-time advantage have the following characteristics:
1. Successful real-time enterprises enable real-time priorities and decisions to be driven by the business. It's essential to anchor real-time deployments to ongoing business needs, through a cross-enterprise team or board that will direct where and how real-time technology will be applied. In addition, not every piece of information can or should be brought forward in real time. Often, only a small fraction of data needs to be processed and acted upon in real time, such as a logistics problem, a weather event, or a larger-than-usual order suddenly hitting the pipeline. Let the business decide what it wants to see in real time.
2. Real-time enterprise leaders build organizational support. Since real time will touch many parts of the business, calculating and presenting the business case for the real-time enterprise requires broad organizational backing. And, since data may be moving at light speed across the enterprise, it will be essential to work with data source owners to ensure that data is of high quality, timely, and relevant.

Emerging technologies now becoming part of the enterprise scene, such as in-memory technology, cloud, mobile, and NoSQL databases, are bringing more real-time capabilities to the fore.
3. Real-time enterprises inventory existing technologies and platforms. The typical enterprise these days has an array of different types of applications, databases, and analytic tools. When it comes to BI and analytics, it's likely that a company has many BI tools that business users are employing, from jury-rigged spreadsheets to OLAP cubes to dashboard environments to behind-the-scenes application analytics. Real-time proponents need to consider exactly which areas of the enterprise real-time applications and platforms will be replacing.
4. Real-time enterprises ensure employees and stakeholders are well-prepared and well-trained. With more and faster analytic capabilities shifting to end users, there will be a shift in the way business intelligence and analytics are handled within enterprises. Along with an analytics-driven culture comes responsibility: the need for more user training and awareness, so that users are in a position to take advantage of the increased speed of analysis. In addition, IT and data management professionals will require additional training to be able to manage and support these new environments.

5. Real-time enterprises manage users' expectations. Moving to real time means decision makers' expectations will be raised as well. They will expect responsiveness and answers to queries as fast as they can obtain them from an outside search engine. They will expect the data they examine to reflect what is going on in and around the business at that moment. Technology alone will not turn the organization overnight into a real-time enterprise. It will take time to ensure that the correct data is being accessed and flowing through the system. Business processes themselves will need to be reconfigured or realigned to take advantage of real time. Business users need to be made aware that moving to a real-time enterprise is a journey, not a sprint.
6. Real-time enterprises introduce real-time capabilities in a gradual and highly targeted fashion. As with any major technology shift, deliberate and steady wins the race; avoid attempting a "big bang" conversion all at once. Develop a plan and a roadmap, and apply the solution first to the most pressing problems, where slow response times and bottlenecks are frequently seen. Build your business case around early wins. Move department by department.
7. Real-time enterprises measure everything, but avoid old formulas. Measuring cost of ownership and potential return on investment for real-time implementations was vexing until recently, when the solutions were expensive to purchase and required teams of consultants to install. Even in the current era of low-cost open-source and cloud solutions, the returns on business intelligence and analytics are in and of themselves difficult to calculate, with tangible benefits hard to assess. Blazing-fast analytics "at the speed of thought" can free up end users' imaginations, enabling them to ask questions they wouldn't even have considered before. As a result, conventional technology measures may not be suitable for this new environment.

Joe McKendrick

sponsored content


Reale Mutua Simplifies Back Office Management with CA ERwin Data Modeler
BUSINESS

The largest Italian mutual insurance company

The Reale Mutua Group operates in the insurance sector in Italy and Spain through its parent company, Società Reale Mutua di Assicurazioni, and its subsidiaries. The group offers a range of insurance and financial products for risk protection, healthcare insurance, pensions, and savings. It is also active in the real estate, banking, and services sectors. Established in Turin in 1828, Reale Mutua di Assicurazioni is the largest Italian mutual insurance company. Today the company is a mid-size organization, offering customized solutions and attentive service to its clients.

CHALLENGE

Simplify information management

The continuous growth of the amount of information to be processed, and the need to make data easily shareable across all business lines, prompted Reale Mutua Assicurazioni to undertake a simplification initiative. A number of factors increased the complexity of managing data, including:
• A greater volume of information replicated across multiple systems
• Greater information exchange with external authorities such as IVASS and ANIA
• A growing need to exploit data for business intelligence applications
• The need for transcoding among the systems used by the various business lines
Alexis Mendoza from the IT Architecture Office at Reale Mutua Assicurazioni comments, "The growing demand for data integration and for a common, company-wide data processing method highlighted the inadequacy of the pre-existing solution, which was based on a data warehouse environment (levels I and II) but built in a siloed fashion."

Reale Mutua Assicurazioni therefore decided to reorganize its back office data management through a new first- and second-level EDW (Enterprise Data Warehouse). The initiative was accompanied by a new strategic plan supported by the senior management team, with the involvement of end users.
The EDW system had to provide an overall picture that met the needs of the various business lines, and specifically provide:
• A series of Key Performance Indicators (KPIs) valid for the whole company
• A single master system capable of containing all certified domain information
• A single point for the integration and consolidation of the information coming from the front office, to reduce or eliminate point-to-point integrations
• A system protected from data modification, to avoid inconsistencies
The ultimate goal of the initiative was to develop a unified and shared business intelligence system for the directors of all business lines.
SOLUTION

A common data model for the entire company

To achieve these goals, Reale Mutua Assicurazioni first created a conceptual data model that would represent the information assets managed by front-office systems. The data model used standardized nomenclature to define each attribute, entity, and relationship, according to users' specifications. Approximately ten databases, implemented on Oracle, DB2, and SQL Server relational database management systems, were analyzed.
Mendoza recalls, "We needed to identify, analyze, and resolve all potential conflicts within the data in terms of naming, structure, and coding. We also needed to resolve redundancies and verify the data integrity according to business rules. The conceptual data model that we established, and the associated glossary, provided the basis for designing the EDW solution."

CA ERwin Data Modeler was used for data modeling, data documentation, and physical implementation of the database.


The tool allowed the documentation of all the metadata necessary to build a common business glossary for all areas of the company, including the description of attributes, value domains, entities, relationships between entities, and mappings to data sources. CA ERwin Data Modeler was also widely used to validate the conceptual data model and to publish its content in formats more familiar to end users, such as Excel and HTML.
BENEFITS

Standardized and effective information management

The main benefits achieved with the adoption of the new back office EDW include:
• Implementation of a unified conceptual data model
• Less redundancy of information across various systems
• KPIs that are now common to all business lines
• More efficient and faster information retrieval, thanks to a unified software integration layer between front and back offices
• Stabilization of the back-office environment, due to its separation from the front-office processes
"Using CA ERwin Data Modeler to analyze metadata and create reports has been key to our ability to share the model with business stakeholders during validation meetings, and has helped us establish a common, company-wide language," says Mendoza. "Furthermore, CA ERwin Data Modeler has enabled us to gather data from all company models, including both the data warehouse and its feeding systems, in a single repository, so that we can govern their evolution in a consistent and integrated manner."
CA ERWIN MODELING
www.erwin.com

sponsored content


NoSQL Databases
in a World of Big Data
Big data is a big term. It encompasses concepts about data types, dozens of different technologies to manage those data types, and the ecosystem around all those technologies. And everything in it moves fast!

The key to being successful in big data initiatives is being able to manage the speed, scale, and structure at sub-millisecond speed.
There was a time when big data was Hadoop. It was offline analytics. That's no longer the case. It's a solution. It's a solution that includes Hadoop but is not Hadoop. It's a solution that meets both real-time analytical requirements and offline analytical requirements. It's a solution that meets both analytical requirements and operational requirements.
In today's increasingly connected
world, Hadoop can no longer meet big
data challenges by itself. The explosion
of user-generated data is being followed
by an explosion of machine-generated
data. While Hadoop is great for storing
and processing large volumes of this
data, it is not equipped to ingest it at
this velocity. This has led to the birth of
a new generation of systems engineered
specifically for real-time analytics.
As technology advances at an ever-increasing rate, so do best practices for big data solutions: a modern big data solution relies on real-time data processing via stream processing. A modern big data solution leverages integration with Storm for real-time processing, Couchbase Server for high-performance data access, Hadoop for offline analytics, Elasticsearch, and more. It enables real-time analysis and search while meeting operational requirements. In order to enable real-time analysis and search, a modern big data solution requires a high-performance NoSQL database that is scalable. The NoSQL database must fulfill operational requirements while meeting the performance requirements necessary to enable real-time analysis and search. Couchbase Server is that NoSQL database.
THERE ARE THREE BIG DATA CHALLENGES:

1. The amount of data being generated: data volume.
2. The rate at which data is being generated: data velocity.
3. The rate at which information must be generated: information velocity.
Hadoop addresses data volume. It can
store and process a lot of data, later. It
scales out to store and process more data.
Hadoop does not address data velocity.
However, it meets offline analytical
requirements.
Couchbase Server addresses data
velocity. It is a high-performance NoSQL
database that can store a lot of data,
now. It scales out to store a lot of data,
faster. Couchbase Server does not address
information velocity. It can store and
process data at rest. However, it meets
operational requirements.
Storm addresses information velocity.
It can process a real-time stream of data.
It scales out to process streams of data,
faster. Storm does not address volume
or data velocity. It does not store data.
It processes data in motion. However, it
meets real-time analytical requirements.
All three big data challenges can be met by integrating Storm, Couchbase Server, and Hadoop to create a faster big data solution. Hadoop and Couchbase Server form the inner core of a faster big data solution. Hadoop enables big data analysis. It powers analytical applications. Couchbase Server enables high-performance data access. It powers operational applications. Sqoop can be used to replicate data between Hadoop and Couchbase Server. It can create a big data refinery. The outer core of a faster big data solution is a stream processing system such as Storm. It enables real-time analysis of data before it enters the core. Couchbase Server fills the gaps between Storm and Hadoop while powering operational applications.
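To make the hand-off between the stream layer and the operational layer concrete, here is a minimal, illustrative sketch (not taken from the article) of a Storm bolt that takes an event off the real-time stream and writes it into Couchbase Server, where operational applications can read it immediately. It assumes the Storm 0.9-era bolt API (backtype.storm) and the Couchbase Java SDK's CouchbaseClient; the class name, bucket name, and the event_id/payload tuple fields are hypothetical.

```java
import java.net.URI;
import java.util.Arrays;
import java.util.Map;

import backtype.storm.task.TopologyContext;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Tuple;

import com.couchbase.client.CouchbaseClient;

// Illustrative bolt: consumes events from the real-time stream and writes
// them to Couchbase Server so operational applications can read them immediately.
public class EventToCouchbaseBolt extends BaseBasicBolt {

    private transient CouchbaseClient couchbase;

    @Override
    public void prepare(Map stormConf, TopologyContext context) {
        try {
            // Connect to a hypothetical "events" bucket.
            couchbase = new CouchbaseClient(
                Arrays.asList(URI.create("http://127.0.0.1:8091/pools")),
                "events", "");
        } catch (Exception e) {
            throw new RuntimeException("Could not connect to Couchbase", e);
        }
    }

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        // Assumes the upstream spout emits tuples with these two fields.
        String eventId = input.getStringByField("event_id");
        String payloadJson = input.getStringByField("payload");

        // Write the event so it is immediately available for low-latency reads.
        couchbase.set("event::" + eventId, 0, payloadJson);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // Terminal bolt: nothing is emitted downstream.
    }

    @Override
    public void cleanup() {
        if (couchbase != null) {
            couchbase.shutdown();
        }
    }
}
```

Creating the client in prepare() rather than in the constructor keeps the bolt serializable when the topology is submitted; a batch tool such as Sqoop can then move the accumulated documents between Couchbase Server and Hadoop for offline analysis, as described above.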
The best thing about building a real-time big data solution is designing the architecture. It's like playing with Legos. The pieces come in many shapes and sizes, and it's up to you to identify and connect the pieces necessary to build the most efficient and effective solution possible. It's an exciting challenge. What will your architecture look like?

Learn how real companies like LivePerson are making their big data faster. Be aware of slow big data. And explore Big Data Central to learn how the innovative enterprise is building big data solutions with Hadoop and NoSQL.

COUCHBASE
www.couchbase.com

sponsored content


Demand for Fast Data Renews the Need for In-Memory Databases
To date, much of the enterprise agenda
has focused on big data for strategic
planning. That is changing in the wake
of successes by Internet advertising and
marketing firms in using fast, contextual
data to interact with customers based on
what they are interested in, what they are
doing, and where they are now.
In the process, companies are turning
to in-memory databases to handle
extremely large volumes of information
within milliseconds.
In fact, Forrester estimates that more than 50% of enterprises will be using in-memory databases by 2017.[1] Forrester, which published this estimate in "TechRadar: Enterprise DBMS, Q1 2014,"[2] observes in the report that, "Storing and processing customer data in-memory enables upselling and cross-selling new products based on customer likes, dislikes, friend circles, buying patterns, and past orders."

Many in-house enterprise in-memory database projects are just getting off the ground. However, pioneers in Internet advertising, marketing, and retail provide real-world examples from which other companies can learn.
THE POWER OF IN-MEMORY AND NOSQL

Supporting real-time customer interactions often relies on the ability to respond based on in-the-moment contextual data along with terabytes of historical and analytical data. This has led many businesses to combine in-memory and NoSQL database capabilities.
Take, for example, The Trade Desk Inc., a global demand-side platform for real-time bidding advertising, founded by pioneers in real-time bidding. The Trade Desk uses the Aerospike in-memory NoSQL database in multiple datacenters to process millions of transactions per second (TPS) for hundreds of application servers and to manage billions of records and terabytes of data across multiple clusters. Real-time data is augmented with historical context processed in Hadoop clusters, combined in Aerospike, and used for real-time bidding and real-time attribution. Aerospike database software runs on bare-metal Dell servers, each with four 400 GB Intel 3700 solid-state drives (SSDs).

Dave Pickles, The Trade Desk's co-founder and CTO, observes, "Aerospike's flash-optimized architecture takes advantage of Intel SSDs to deliver the predictable low latency that enables us to receive a request from across the Internet, perform the multiple database lookups that are crucial to making the best decision, make a note of that decision with a database write, and send the bid response back to the sell-side platform within 100 milliseconds."
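As a rough illustration of that lookup-then-write pattern, the sketch below uses the Aerospike Java client to read a user profile and record a bid decision with a single write. It is not The Trade Desk's actual code: the namespace, set, and bin names (rtb, profiles, decisions) are hypothetical, and the real bidding logic that combines the profile with Hadoop-derived historical context is reduced to a placeholder.

```java
import com.aerospike.client.AerospikeClient;
import com.aerospike.client.Bin;
import com.aerospike.client.Key;
import com.aerospike.client.Record;
import com.aerospike.client.policy.WritePolicy;

// Illustrative sketch of a bid-time lookup and decision write against Aerospike.
public class BidDecisionStore {

    private final AerospikeClient client = new AerospikeClient("127.0.0.1", 3000);

    public String decideAndRecord(String userId, String auctionId) {
        // Look up in-the-moment user context; namespace/set names are hypothetical.
        Key profileKey = new Key("rtb", "profiles", userId);
        Record profile = client.get(null, profileKey);

        // Placeholder for the real bidding logic, which would combine this
        // profile with historical context computed offline in Hadoop.
        String decision = (profile != null) ? "bid" : "no-bid";

        // Make a note of the decision with a single database write.
        WritePolicy policy = new WritePolicy();
        Key decisionKey = new Key("rtb", "decisions", auctionId);
        client.put(policy, decisionKey, new Bin("user", userId), new Bin("decision", decision));

        return decision;
    }

    public void close() {
        client.close();
    }
}
```

In practice, these calls and everything around them have to fit inside the 100-millisecond budget Pickles describes, which is why the predictable low latency of the flash-optimized store matters.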
While The Trade Desk concentrates on processing millions of TPS, eXelate focuses on global scale, with data centers worldwide managing several tens of terabytes of data.

eXelate powers digital marketing decisions worldwide for agencies, platforms, and publishers. It runs Aerospike databases on Internap bare-metal cloud servers with direct-attached SSDs in four Internap data centers. Data is synchronized across Internap's high-bandwidth networks via Aerospike cross-datacenter replication. The Aerospike-Internap combination enables eXelate to provide its customers with 8,000 distinct segments of demographic, behavioral, and purchase intent data on more than 700 million consumers within strict, 100-millisecond service-level agreements (SLAs), cost-effectively and with 100% uptime.
Elad Efraim, eXelate chief technology officer and co-founder, noted, "Combining the cost-effective, high performance of Internap's bare-metal cloud and Aerospike's NoSQL database, we were able to accomplish programmatic scaling of our infrastructure to support one trillion real-time data transactions monthly for more than 200 marketers and publishers worldwide, giving us an edge to compete in today's high-speed, data-driven Internet economy."
PURE IN-MEMORY IN THE CLOUD

Simultaneously, more companies are using pure in-memory front-facing databases in the cloud to drive customer interactions. One example is Snapdeal, India's largest online marketplace, which has a network of more than 20,000 sellers serving 20 million-plus members, one out of every six Internet users in India.
The Snapdeal.com platform enables sellers to list products for sale on the site, manage inventory, and make pricing changes in real time based on what is happening in the marketplace. High volume (for example, a pair of shoes sells every 30 seconds) means that thousands of sellers are making dynamic price adjustments.
Today, the Java-based Snapdeal
inventory and pricing management system
uses Aerospike to provide predictable
sub-millisecond responses while managing
100 million-plus objects stored in 32 GB
of DRAM. Product and price changes are
made to both Aerospike and a MySQL
database while seller rankings and product
details are read from Aerospike. The
implementation runs on two Linux servers
on the Amazon Elastic Compute Cloud
(EC2), and it takes advantage of Amazon
Elastic Block Store (EBS) for persistent
block-level cloud storage.
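A rough sketch of that dual-write pattern, assuming the Aerospike Java client and plain JDBC for MySQL, might look like the following. The table, namespace, set, and bin names are hypothetical; Snapdeal's actual schema and code are not described in the article.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

import com.aerospike.client.AerospikeClient;
import com.aerospike.client.Bin;
import com.aerospike.client.Key;
import com.aerospike.client.Record;

// Illustrative sketch of the dual-write pattern described above: price changes
// go to both MySQL and Aerospike, while latency-sensitive reads come from
// the DRAM-backed Aerospike store only.
public class PriceStore {

    private final Connection mysql;           // JDBC connection, opened elsewhere
    private final AerospikeClient aerospike;  // e.g. new AerospikeClient("127.0.0.1", 3000)

    public PriceStore(Connection mysql, AerospikeClient aerospike) {
        this.mysql = mysql;
        this.aerospike = aerospike;
    }

    public void updatePrice(String productId, long price) throws SQLException {
        // 1. Write to the relational copy (hypothetical table and columns).
        try (PreparedStatement ps =
                 mysql.prepareStatement("UPDATE product SET price = ? WHERE id = ?")) {
            ps.setLong(1, price);
            ps.setString(2, productId);
            ps.executeUpdate();
        }

        // 2. Write-through to Aerospike so buyers see the new price immediately.
        Key key = new Key("catalog", "products", productId);
        aerospike.put(null, key, new Bin("price", price));
    }

    public Long readPrice(String productId) {
        // Reads never touch MySQL; they are served from DRAM-backed Aerospike.
        Record record = aerospike.get(null, new Key("catalog", "products", productId));
        return (record == null) ? null : record.getLong("price");
    }
}
```

The design point suggested by the article is that the relational database keeps a conventional copy of product data, while every latency-sensitive read (seller rankings, product details) is served from the in-memory Aerospike namespace.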
Amitabh Misra, Snapdeal vice president of engineering, explains, "With Aerospike, we can push through huge price changes while maintaining the same response time experience on the buyers' side, even with millions of buyers. That has been the biggest advantage."

[1] Forrester Research, "TechRadar: Enterprise DBMS, Q1 2014," by Noel Yuhanna with Boris Evelson, Brian Hopkins, and Emily Jedinak, February 13, 2014.
[2] Forrester Research, "TechRadar: Enterprise DBMS, Q1 2014," February 13, 2014.

AEROSPIKE

www.aerospike.com
