Monday 3 August 2009

Top 5 Trends in BI in 2009


Trend #1: Complex Event Processing (CEP) comes of age


The first generation of successful enterprise data
warehouses uncovered new insights and led to
innovative ways to improve business. These systems are
optimized for one-time queries on mostly static data
captured in the data warehouse long after the event
that generated it has occurred. The paradigm is long-lived
data, short-lived queries.
CEP, a logical follow-on from business activity
monitoring (BAM), enables analysis of data streams
and linking of seemingly unrelated events in a
meaningful way. Instead of storing data and having
the execution of a query as the catalyst for results, a
continuous query system effectively "stores" the queries,
and the arrival of new data initiates new results,
generating real-time insight and/or triggering
appropriate action. (The new paradigm is long-lived
queries, short-lived data).
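As a concrete illustration, here is a minimal sketch (in Python, with hypothetical names rather than any real CEP engine's API) of a continuous query that persists while short-lived events stream past it:

```python
# A minimal sketch of the "long-lived queries, short-lived data" paradigm.
# Registered queries persist; each arriving event is evaluated against them
# and then discarded, aside from a short sliding window. All names here
# (ContinuousQuery, on_event, etc.) are illustrative, not a vendor's API.
from collections import deque
from time import time

class ContinuousQuery:
    def __init__(self, predicate, window_seconds, threshold, action):
        self.predicate = predicate          # which events this query watches
        self.window = deque()               # short-lived data: recent matches only
        self.window_seconds = window_seconds
        self.threshold = threshold          # how many matches trigger the action
        self.action = action                # callback fired in real time

    def on_event(self, event):
        now = time()
        # Expire matches that have left the sliding window.
        while self.window and now - self.window[0] > self.window_seconds:
            self.window.popleft()
        if self.predicate(event):
            self.window.append(now)
            if len(self.window) >= self.threshold:
                self.action(event, len(self.window))
                self.window.clear()

# Example: link seemingly unrelated events -- three failed logins
# within a minute are flagged as a possible intrusion.
queries = [
    ContinuousQuery(
        predicate=lambda e: e["type"] == "login_failed",
        window_seconds=60,
        threshold=3,
        action=lambda e, n: print(f"ALERT: {n} failed logins for {e['user']}"),
    )
]

for event in [{"type": "login_failed", "user": "jdoe"}] * 3:
    for q in queries:           # the arrival of data drives query evaluation
        q.on_event(event)
```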
CEP has heretofore been conspicuously missing from
the mainstream BI arena, necessitating stovepipe CEP
implementations that are only loosely integrated with
organizations' existing visualization, reporting,
dashboarding, information modeling, metadata, and
other BI infrastructure components. We are seeing
indications of that changing as leading BI vendors
partner with and acquire CEP engine providers, and
as BI users incorporate CEP in growing numbers.


Trend #2: Convergence of structured and unstructured data

This has been a topic of interest for years, but it is
coming up in more conversations with current and
prospective customers than ever before. In the HP
2009 survey, 60% of respondents indicated that they
have an identified need to analyze unstructured data
as part of their BI systems, with over half of those either
doing the unstructured data analysis today, or
developing the capability.
As retailers strive to be more customer-centric, healthcare
organizations to improve and manage patient care efficiently,
and financial institutions to better detect risk and fraud
threats, they will hit limitations if they do not include
unstructured data in their ever-expanding analyses.
Using only structured data as a proxy for "what is
happening" and making an inference from that,
without correlating with available unstructured data,
can lead to very wrong decisions. For example, coded
diagnoses targeted for the payer often do not indicate
what's really wrong with a patient. Analysis of cancer
case reimbursements might indicate that more money
should be put into brain cancer research because of its
prevalence. But, if any cancer metastasizes to the
brain, it's often coded as brain cancer because of the
greater likelihood of reimbursement. Patient file notes
would indicate the true diagnosis.
Another common business driver is to mine call center
service logs and e-mail together to better understand
customers, detect problems early, and discern the actual
causes of problems.
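To make the idea tangible, here is a toy sketch of such a correlation, with hypothetical records and a naive keyword scan standing in for real text analytics:

```python
# A toy sketch of correlating structured records with unstructured text,
# in the spirit of the patient-notes example above. The data, field names,
# and keyword list are all hypothetical.
import re

claims = [  # structured: coded diagnoses as submitted for reimbursement
    {"case_id": 1, "coded_dx": "brain cancer"},
    {"case_id": 2, "coded_dx": "brain cancer"},
]
notes = {   # unstructured: free-text clinician notes for the same cases
    1: "Primary lung carcinoma with metastasis to the brain.",
    2: "Glioblastoma confirmed; primary brain tumor.",
}

PRIMARY_SITES = ["lung", "breast", "skin", "brain"]

def primary_site(note_text):
    """Naive keyword scan standing in for real text analytics/NLP."""
    for site in PRIMARY_SITES:
        if re.search(rf"primary {site}|{site} carcinoma", note_text.lower()):
            return site
    return "unknown"

for claim in claims:
    inferred = primary_site(notes[claim["case_id"]])
    coded_site = claim["coded_dx"].split()[0]
    flag = "MISMATCH" if inferred not in ("unknown", coded_site) else "ok"
    print(claim["case_id"], claim["coded_dx"], "->", inferred, flag)
# case 1 is coded as brain cancer, but the note reveals a lung primary
```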

Trend #3: The line is blurring between data warehouse, operational data store (ODS) and operational systems

Initially, users expected ERP systems to provide needed
reporting. When these systems couldn't meet
requirements due to backlog and overload, users
turned to the data warehouse. Traditional BI satisfies
most strategic reporting and analysis, but not real-time
operational reporting, with its demands for high-volume
real-time data updates, high availability, and high
operational query throughput.
Operational reporting has high overhead and often
ties up the data warehouse, preventing other analytics
from running.
We are seeing operational reporting as a top business
initiative, and increasing interest in the use of a data
provisioning platform as companies need to extend the
data warehouse to more operational use. The platform
needs to go beyond the capabilities of an ODS,
providing operational reporting, data cleansing,
metadata management and data warehouse staging.
Such a data hub enables agility and new applications,
while preserving and enhancing the existing data
warehouse structure, and does it in a much more
efficient and cost-effective manner than using disparate
independent data marts. A hub that connects to
existing enterprise service buses (ESBs) and allows
architectural flexibility, including federation for remote
data, reflects the changing nature of the business while
allowing centralized control over data quality and
data access privileges.
The result is the ability to do operational BI, which
involves embedding and automating analytics in a
process so that a person (or another process) can
act on generated information in real time, making
decisions and taking action in the context of a
business process.
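A minimal sketch of what that embedding might look like, assuming a hypothetical fraud-scoring model and thresholds:

```python
# A minimal sketch of operational BI as described above: an analytic model
# embedded directly in a business process, so the decision happens inline
# rather than in a separate reporting cycle. Names and thresholds are
# hypothetical placeholders.

def fraud_score(order):
    """Stand-in for a model served from the analytics platform."""
    score = 0.0
    if order["amount"] > 1000:
        score += 0.5
    if order["ship_country"] != order["billing_country"]:
        score += 0.4
    return score

def process_order(order):
    """The operational process itself, with analytics embedded in-line."""
    score = fraud_score(order)           # real-time analytic call
    if score >= 0.8:
        return "hold_for_review"         # action taken inside the process
    return "approve"

print(process_order({"amount": 1500, "ship_country": "BR",
                     "billing_country": "US"}))   # -> hold_for_review
```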


Trend #4: Data integration focus gaining new momentum

Many BI systems in place today were built for strategic
decisions, the sweet spot of traditional BI. Analysis is
done by a small number of people, over a period of
time, allowing for analysts to manually cleanse and
reconcile data from multiple disparate sources, and to
ensure that business rules are applied appropriately
and consistently. Many organizations would like to
increase their intelligence by giving more employees
access to these analytic tools, and applying them to
operational decisions. But it's more than a matter of
increasing capacity for data volumes and query
throughput, and giving the users simpler tools. The
limitation of first generation BI systems is not simply
their inability to handle large volumes of data and
users, but their lack of data integration rigor, including
data cleansing, MDM, and metadata management.
Operational analysis does not afford the time for
manual oversight to ensure proper quality,
reconciliation and classification of the source data,
which may have to be served directly to the applications
or processes where decisions are made.
Organizations intent on leveraging their data and
expanding their analytic capability are recognizing the
value of an underlying infrastructure which provides
well-integrated, high quality data to applications,
processes and people.
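The following sketch illustrates the kind of automated cleansing rules that replace manual reconciliation; the records, field names, and rules are hypothetical:

```python
# A small sketch of automated cleansing/standardization rules of the kind
# that replace manual reconciliation in an operational setting.
import re

CANONICAL_STATES = {"calif.": "CA", "california": "CA", "ca": "CA"}

def cleanse(record):
    """Apply consistent business rules so every consumer sees the same data."""
    out = dict(record)
    out["name"] = " ".join(record["name"].split()).title()      # whitespace/case
    out["phone"] = re.sub(r"\D", "", record["phone"])[-10:]     # digits only
    out["state"] = CANONICAL_STATES.get(record["state"].strip().lower(),
                                        record["state"].strip().upper())
    return out

# Two systems describe the same customer differently; after cleansing the
# records can be matched automatically instead of by an analyst.
crm = {"name": "ACME  corp", "phone": "(415) 555-0100", "state": "Calif."}
erp = {"name": "Acme Corp",  "phone": "415-555-0100",   "state": "CA"}
print(cleanse(crm) == cleanse(erp))   # -> True
```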
According to Gartner, "Contemporary pressures are
leading to an increased investment in data integration
in all industries and geographic regions."[6] In addition,
"recent focus on cost control has made data integration
tools a surprising priority as organizations realize the
'people' commitment for implementing and supporting
custom-coded or semi-manual data integration
approaches is no longer reasonable."[7] The weak
economy will drive M&A in many industries, resulting in
a further need to integrate disparate data to get a
single view of the business, supporting continued
demand for MDM. And financial regulation is likely
to increase. The transparency needed for regulatory
compliance requires a consistent and complete view of
the data that represents the performance and
operation of the business.
In addition to early implementations, we are now also
seeing the results of more recent data warehouse
modernization and data mart consolidation projects
that were undertaken to cut costs, improve performance
and provide more headroom. Where the approach was
to move existing data structures to a new platform to
meet those immediate goals without addressing the
fundamental data integration issues, organizations are
left with the same unwieldy data structure as before,
preventing them from expanding the use of the data
warehouse to meet additional business needs.
Organizations are realizing the need for an overall
enterprise information management (EIM) strategy in
order to leverage data as a corporate asset, to apply
advanced analytics that will help them achieve a
discipline of fact-based decision making, and to eliminate
the wastefulness of different teams using different tools
with little consistency and lots of overlap and
redundancy. They are also seeing that data integration
is a critical component of an overall EIM strategy.
Inconsistent meanings create barriers to reliable
analytics. As the boundaries between application
domains like CRM, ERP and product lifecycle
management continue to erode, there is a growing
need to create an enterprise-wide information strategy
to ensure semantic consistency for all users,
applications and services.

[6], [7] Gartner, "Magic Quadrant for Data Integration Tools," by Ted Friedman, Mark A. Beyer, Andreas Bitterer, 22 September 2008.

Trend #5: Analytics moves to the front office — More sophistication in the hands of business users

Companies are looking to apply advanced analytics to
ERP, CRM and supply chain management systems in
order to achieve strategic competitive differentiation.
The traditional approach to analytics has been to hire
modelers with PhDs who spend three months
developing a model, yielding at most a few dozen or a
few hundred models per year. The modeling runs offline to do
customer segmentation, for example. Capturing this
sophistication in tools that can be used by business
managers enables the development and use of not
hundreds, but thousands of models, with a much
shorter time to market. This approach makes it possible
for someone who doesn't know what a neural network
is to use one as a mainstream capability.
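A sketch of what that packaging might look like, using scikit-learn's off-the-shelf neural network behind a plain business-facing function (data and names hypothetical):

```python
# A sketch of "using a neural network without knowing what one is":
# the modeling sophistication is captured behind a plain business-facing
# function. Uses scikit-learn; the data and function names are hypothetical.
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy training data: [annual_spend, visits_per_month] -> churned (1) or not (0)
X = [[100, 1], [120, 2], [3000, 12], [2800, 10], [90, 1], [3100, 11]]
y = [1, 1, 0, 0, 1, 0]

# The "tool" a vendor would ship: preprocessing + neural net, pre-wired.
_model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
_model.fit(X, y)

def churn_risk(annual_spend, visits_per_month):
    """All a business manager sees: a question in, an answer out."""
    prediction = _model.predict([[annual_spend, visits_per_month]])[0]
    return "high" if prediction == 1 else "low"

print(churn_risk(110, 1))    # -> high (a neural net answered, invisibly)
```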
There is an Internet influence on interfaces as well.
Instead of pulling data from multiple sources and
building an analysis cube, the user will go to a portal
and request data elements. Provisioning will be
automated rather than manual, assuming that a data
integration infrastructure has been put in place.

2 comments:

Will Dwinnell said...

"Trend #5: Analytics moves to the front office — More sophistication in the hands of business users "

I challenge this notion. The experience with BI tools, query front-ends and even spreadsheets has been that unsophisticated workers abuse data and turn out questionable results.

I worked as a post-sales consultant for one of the biggest BI vendors and watched thousands of licenses be deployed. My question is: When a business analyst runs into the room screaming that "2 out of 3 of our customers also use brand X", does he or she mean "67% of thousands", or literally "2 out of 3"?

KIRAN PANDE said...

Agree! In certain cases, verdicts from sample populations should not be labelled as percentages, since strata will differ by geography.