Proceedings of the 44th Hawaii International Conference on System Sciences - 2011

Business Intelligence Maturity:
Development and Evaluation of a Theoretical Model
Gerrit Lahrmann Frederik Marx Robert Winter Felix Wortmann
Institute of Information Management
University of St. Gallen
St. Gallen, Switzerland
{gerrit.lahrmann, frederik.marx, robert.winter, felix.wortmann}@unisg.ch

Abstract

In order to identify and explore the strengths and weaknesses of business intelligence (BI) initiatives, managers in charge need to assess the maturity of their BI efforts. For this, a wide range of maturity models has been developed, but these models often focus on technical details and do not address the potential value proposition of BI. Based on an extensive literature review and an empirical study, we develop and evaluate a theoretical model of impact-oriented BI maturity. Building on established IS theories, the model integrates BI deployment, BI usage, individual impact, and organizational performance. This conceptualization helps to refocus the topic of BI maturity on business needs and can be used as a theoretical foundation for future research.

1. Introduction

Business intelligence (BI) is a topic widely discussed in information systems (IS) literature. Since its first mention [35], BI has become an essential component of the information supply infrastructure and a contributor (and prerequisite) to overall organizational success [14, 60]. Therefore, BI has gained steadily increasing managerial and scientific attention.

Over time, the role of BI and its impact have evolved [60]. Its role has changed from a single-analytical-application view to an organizational capability of strategic importance [41]. Challenges of technical implementation are more and more replaced by questions about the business value of the overall approach, e.g. strategic business alignment, competence in usage, operations, and further development of a broad solution architecture [46, 58]. Recently, BI has been denominated a top business priority, after formerly being denominated a top technology priority [37, 46].

In order to be more effective, BI requires a comprehensive overall view of the design and changing of its structures [48]. An established means to identify strengths and weaknesses of certain domains of an organization are maturity models (MMs) [38]. They consist of multiple, archetypal levels of maturity of a certain domain and can be used, e.g., for organizational assessment and development [38]. In the case of BI, quite a high number of MMs has been proposed [34, 60].

In previous research, we identified and analyzed a total of ten BI MMs (summarized in section 2.3) [34]. Regarding MM content, our results show that classic IT topics, e.g. applications, data, and infrastructure, are highly present, while other topics like efficiency, organizational structures, staff, and strategy are rarely addressed. Overall, the links between BI technology, BI usage, and organizational performance stay unclear. Furthermore, we identified methodical gaps in the design process of the MMs regarding the reliability (i.e. the theoretical foundation of the model) and the explication of the underlying maturity concept (i.e. the fundamental understanding of BI maturity).

The work presented within this paper is part of a larger research project (cf. fig. 1). In order to address the above-mentioned gaps, our overall research goal is to develop a methodically sound BI MM which is comprehensive as regards content and which clearly guides its users in how to design BI in order to contribute to overall organizational performance. As a prerequisite towards this goal, a sound theoretical foundation is needed. Based on the results of previous research in the form of a qualitative literature review and on an empirical study, the objective of this paper therefore is to develop and evaluate a theoretical model of impact-oriented BI maturity. This model will be the foundation for the development of an assessment instrument for impact-oriented BI maturity and for the confirmation and broad application of an impact-oriented BI MM.

The paper is structured as follows. Next, foundations of BI and MMs are presented and the related work is analyzed. Following this, we develop and evaluate a theoretical model of impact-oriented BI maturity. Concluding, we discuss implications for practice and research and give an outlook on future research.


[Figure 1 depicts the overall research process: a qualitative literature review and a quantitative 1st survey lead to the state of the art of BI MMs and to the theoretical model of impact-oriented BI maturity (the focus of this paper); qualitative case studies and/or expert interviews and a quantitative 2nd survey then lead to an assessment instrument for an impact-oriented BI MM and to the confirmation and application of the impact-oriented BI MM.]

Figure 1. Overall research process

2. Research framework

2.1 Business intelligence

Mentioned as early as 1958 [35], BI was initially coined as a collective term for data analysis tools [2]. Meanwhile, the understanding has broadened towards BI as an encompassment of all components of an integrated decision support infrastructure [3]. In BI systems, data from operational systems is combined with analytical frontends to "present complex and competitive information to planners and decision makers." [40] A central component of BI systems is the data warehouse (DW), which integrates data from various transactional IS for analytical purposes [31, 32].

But still, "there is no universally-accepted definition of BI." [60] In the context of this article, we adhere to the BI definition of Wixom & Watson [60], which also includes processes (including the usage of data, i.e. business processes): "Business intelligence (BI) is a broad category of technologies, applications, and processes for gathering, storing, accessing, and analyzing data to help its users make better decisions."

Advantages of this definition are, for example, that it does not restrict BI to analytical frontend applications, but includes "getting data in […] and getting data out." [60] Furthermore, it does not only focus on technology and/or applications. For our purpose, it is important to have a BI working definition which is not too narrow in scope, as we want to cover all kinds of aspects which can be attributed to BI.

Besides data warehousing ("getting data in"), there are other terms which consistently occur within the context of BI. Or, as Herschel [28] puts it: "Today, the practice of BI clearly employs technology. However, it is prudent to remember that BI is also about organizational decision-making, analytics, information and knowledge management, decision flows and processes, and human interaction." Further related terms, partially used synonymously for BI, are decision support systems, executive information systems, management support systems, and (business/corporate) performance management (PM) [12, 30, 40, 57].

2.2 Maturity models and development thereof

Maturity describes a "state of being complete, perfect or ready." [53] To reach a desired state of maturity, an evolutionary transformation path from an initial to a target stage needs to be progressed [20]. MMs are used to guide this transformation process. Initially proposed in the 1970s [25], more than a hundred MMs have been published in the field of IS to date [4, 38]. As these high numbers led to a certain arbitrariness of the design process [4, 15, 38], methods for the design of MMs were developed.

Important characteristics of MMs are the maturity concept, the dimensions, the levels, the maturity principle, and the assessment approach [33]. An overview of these characteristics is given in table 1.

Table 1. Properties of MMs [33]

Property: Maturity concept
Description: Three different maturity concepts (or understandings of maturity) can be distinguished [38]. People (or workforce) capability defines "the level of knowledge, skills, and process abilities available for performing an organization's business activities." [13] Process maturity defines "the extent to which a specific process is explicitly defined, managed, measured, controlled, and effective." [42] Object (or technology) maturity defines the respective level of development of a design object [24].

Property: Dimension
Description: Dimensions are specific capability areas, process areas, or design objects structuring the field of interest. They should be exhaustive and distinct [15, 38]. Each dimension is further specified by a number of measures (practices, objects, or activities) at each level [15, 20].

Property: Level
Description: Levels are archetypal states of maturity of a certain dimension or domain. Each level has a distinguishing descriptor clearly providing the intent of the level and a detailed description of its characteristics.

Property: Maturity principle
Description: MMs can be continuous or staged. Continuous MMs allow a scoring of activities at different levels. Therefore, the level can be either the (weighted) sum of the individual scores or the individual levels in different dimensions. Staged models require compliance with all elements of one level [20]. They specify a number of goals and key practices to reach a predefined level. Staged MMs reduce the levels to the defined stages, whereas continuous MMs open up the possibility of specifying situational levels.

Property: Assessment
Description: The assessment approach can be qualitative, using descriptions, or quantitative, using e.g. Likert scales [20].
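
The continuous versus staged distinction under "Maturity principle" in Table 1 can be made concrete with a small sketch. It is an illustration of the principle only, not taken from any of the MMs discussed here; the dimensions, scores, weights, and level requirements below are invented for the example.

```python
# Illustrative sketch of the two maturity principles from Table 1.
# All dimensions, scores, weights, and level requirements are invented.

# Continuous principle: the overall level is a (weighted) sum/average
# of the scores reached in the individual dimensions.
def continuous_level(scores: dict[str, float], weights: dict[str, float]) -> float:
    total_weight = sum(weights.values())
    return sum(scores[d] * weights[d] for d in scores) / total_weight

# Staged principle: a level counts as reached only if *all* of its required
# elements are fulfilled; the result is the highest such level.
def staged_level(fulfilled: set[str], requirements: dict[int, set[str]]) -> int:
    level = 0
    for lvl in sorted(requirements):
        if requirements[lvl] <= fulfilled:
            level = lvl
        else:
            break
    return level

if __name__ == "__main__":
    scores = {"data": 4.0, "applications": 3.0, "strategy": 2.0}
    weights = {"data": 1.0, "applications": 1.0, "strategy": 2.0}
    print(continuous_level(scores, weights))  # weighted average, here 2.75

    requirements = {1: {"standard reports"},
                    2: {"standard reports", "DW in place"},
                    3: {"standard reports", "DW in place", "BI strategy defined"}}
    print(staged_level({"standard reports", "DW in place"}, requirements))  # 2
```
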


The literature proposes slightly different MM development processes which all adhere to a basic design [4, 15, 38]. De Bruin et al. [15] propose a development process consisting of the phases scope, design, populate, test, deploy, and maintain. Becker et al. [4] propose a similar process emphasizing the use of existing MMs and an iterative development. Table 2 gives an overview of the basic MM development process.

Table 2. Basic MM development process [33]

Phase: Scope
Description: The scope phase defines the focus and identifies the relevant stakeholders and targeted audiences of the model. It determines the balance between complex reality and model simplicity.

Phase: Design
Description: In the design phase, the principle concept of maturity and the structure of levels, dimensions, and sub-dimensions (i.e. the meta-model) are outlined. Based on this, the descriptors of the levels and their definitions are derived. Thereby, the design process can follow a top-down or a bottom-up approach. A top-down approach first specifies the levels and their descriptions. A bottom-up approach first defines dimensions and characteristics representing maturity and then derives descriptions from them.

Phase: Populate
Description: In the populate phase, the corresponding characteristics are determined. It also defines the maturity assessment, which includes the specification of instruments.

Phase: Test
Description: The constructed model is tested for content completeness and intended model scope accuracy, and the assessment instrument is tested regarding validity and reliability.

Phase: Deploy
Description: The model is deployed to the initial stakeholders and to an independent community.

Phase: Maintain
Description: The model is maintained to ensure its evolution.

For designing and populating MMs, different exploratory research methods as well as combinations thereof are proposed. Commonly mentioned are literature analysis, Delphi studies, case studies, and focus groups [4, 15]. Quantitative methods are less frequently used for constructing MMs [20], as these require a sound theoretical foundation. Testing is also mostly done using qualitative methods. The choice of the relevant research method is influenced by the scope, stakeholders, and targeted audiences [38].

As our objective (cf. section 1) is to develop and evaluate a theoretical model of impact-oriented BI maturity as a foundation for an impact-oriented BI MM, the study at hand contributes to the design phase of the MM development process.

2.3 Overview of BI maturity models and requirements elicitation for an improved model

Quite a high number of BI MMs has been proposed [34, 60]. In previous research, we identified and analyzed a total of ten BI MMs [34]. Table 3 gives an overview of the identified MMs.

Table 3. Overview of BI maturity models [34]

No. | Name (year(s)) | Source(s) | Addressed topic(s)
1 | Watson et al. (2001) | [56] | DW
2 | SAS (2004, 2009) | [26, 50] | Inf. mgt.
3 | Eckerson (2004, 2009) | [16, 17] | BI
4 | SMC (2004, 2009) | [8, 51] | BI
5 | Cates et al. (2005) | [7] | BI
6 | Dataflux (2005) | [19] | Data mgt.
7 | Sen et al. (2006) | [52] | DW
8 | HP (2007, 2009) | [27, 29] | BI
9 | Gartner (2008) | [45] | BI & PM
10 | Teradata (2008) | [54] | BI & DW

We analyzed the BI MMs regarding methodology and content. Our findings show that future research should pay special attention to the theoretical foundation, validation, explication of the underlying maturity concept, and comprehensiveness of BI MMs.

The theoretical foundation describes whether the model is explicitly based on accepted (design) theories [5]. Only one model is theory-based: Watson et al. [56] refer to the stages of growth approach [25]. An explicated theoretical foundation helps to understand how the different parts of the MMs influence each other. Due to the missing theoretical foundation in most of the analyzed models, the links between BI maturity, BI impact, and organizational success stayed quite unclear.

The maturity concept outlines the fundamental understanding of BI maturity. The most common maturity concept is object-centric (e.g. from no explicit architecture to an enterprise DW at the highest levels of BI maturity), closely followed by people-centricity. Five models use a maturity concept which is either people-, process-, or object-centric. The other models use a mixed maturity concept. A methodically sound BI MM should explicate its understanding of BI maturity in order to make clear what exactly is measured and what the MM's purpose is.

Regarding the comprehensiveness of the BI MM content, our results show that classic IT topics, e.g. applications, data, and infrastructure, are highly present, while other topics like efficiency, organizational structures, staff, and strategy are rarely addressed (cf. Table 5 in the appendix). A bit surprising is the fact that organizational structure and strategy are rarely addressed, as BI organization (e.g. BI competency centers) and BI strategy (e.g. strategic alignment) are two topics highly present in current IS literature [22, 55]. With regard to people, users and staff are differentiated. Five models address users, but only one model explicitly addresses staff. A comprehensive BI MM should integrate all these and further aspects that are possibly neglected in current BI MMs.


3. Theoretical development

3.1 Re-conceptualizing BI maturity

In the following, we argue that maturity as a "state of being complete, perfect or ready" [53] has to reflect causes, e.g. "BI technology deployed", as well as effects, e.g. "organizational impact of BI realized". MMs focusing solely on effects do not give any insights on how to improve the situation at hand. Therefore, they are of limited practical utility. MMs focusing solely on causes do not give any insights on the value realized, thereby facing similar challenges. Building on this argumentation, we conceptualize BI on the basis of IS success models and their underlying theory [21, 43, 49]. By doing this, we try to bring together the rigor of the IS success models and the practical relevance of the MM field.

As written above, BI is most often understood as an IT artifact. But in order to address the above-mentioned comprehensiveness (cf. section 2.3), not only the BI IT artifact needs to be examined; other aspects, i.e. people with their capabilities and organizations (structures) with their practices, should also be considered. The theoretical foundation for this differentiation is provided by socio-technical theory (STT) [6]. STT differentiates a technical and a social system. The core message of STT is that these sub-systems are interdependent and need to work in harmony in order to maximize performance for an organization. Quite similar to calls to replace the IT artifact with a work system as the core subject matter of the IS field [1], this would enable a broader view on BI which also recognizes the capabilities and practices needed for successful BI.

Another important aspect is that technical maturity on its own does not lead to overall BI success. In the context of BI, "technologies, applications, and processes for gathering, storing, accessing, and analyzing data" are used to "help […] users make better decisions" (cf. section 2.1). Having the greatest architecture deployed does not imply overall BI value, as there might be no usage of the BI technologies and applications and therefore no impact on organizational performance. For IS in general, these ideas have been formalized in various IS success models [21, 43, 49]. To put it simply, their gist is that the deployment of an IS influences the individual and organizational use of this IS, which in turn influences the impact of this IS, both on an individual and an organizational level.

On the basis of this argumentation, we conceptualize BI maturity based on the three interrelated concepts "deployment", "use", and "impact" (cf. fig. 2). In the following, this conceptualization will serve as a foundation for the development of an integrated research model.

[Figure 2 depicts BI maturity as three interrelated concepts: deployment (capabilities, practices, IT artifact), use (individual, organizational), and impact (individual, organizational).]

Figure 2. Theoretical conceptualization of BI maturity (adapted from [21])

3.2 Integrated research model

Based on the re-conceptualization of BI maturity in the previous section, we begin to design our research model (cf. fig. 3) by detailing the impact of BI. On an individual level, the BI value lies in enabling better decisions (individual impact) [14, 59, 60]. On an organizational level, this will lead to a better overall organizational performance (organizational impact) [18].

[Figure 3 depicts the integrated research model: organizational support influences BI capabilities; BI capabilities influence BI IT and BI practices (deployment); BI IT and BI practices influence individual and organizational use (use); both forms of use influence individual impact, which in turn influences organizational impact (impact).]

Figure 3. Integrated research model

The impact of BI is a consequence of use, which can be differentiated into individual use (e.g. ease of use, efficiency, and effectiveness) and organizational use (e.g. covered business topics and spread throughout the organization) [49].


BI capabilities, BI practices, BI IT, and organizational support form the BI deployment system. As regards BI IT, BI applications, architecture, data, and infrastructure need to be considered [52, 56]. BI practices are formed on the one hand by the BI development, operations, and management processes (e.g. BI governance, change management), and on the other hand by strategic alignment, definition, and measurement of BI services [52, 56]. Both BI IT and BI practices have a direct impact on individual and organizational use and are a result of BI capabilities, which describe the team skills and the competencies of the BI staff [56, 59]. Based on findings of the critical success factors literature, overall organizational support in the form of sponsorship and championship is a precondition for establishing successful BI capabilities [44, 59]. Therefore, we differentiate this support from BI capabilities which can be attributed to the BI function itself, although the literature does not always support this distinction.

Altogether, successful deployment and successful usage lead to successful decisions, which will contribute to overall organizational success (cf. fig. 3).
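
To summarize the structure of the research model in fig. 3 in a compact, machine-readable form, the following sketch lists the hypothesized relationships as construct pairs. It is purely illustrative: the construct names are shorthand invented here, and the list reflects our reading of fig. 3, not an artifact of the original study.

```python
# Hypothesized paths of the integrated research model (cf. fig. 3),
# written as (predictor, dependent construct) pairs. Illustrative only.
RESEARCH_MODEL_PATHS = [
    ("organizational_support", "bi_capabilities"),    # deployment
    ("bi_capabilities",        "bi_it"),
    ("bi_capabilities",        "bi_practices"),
    ("bi_it",                  "individual_use"),     # deployment -> use
    ("bi_it",                  "organizational_use"),
    ("bi_practices",           "individual_use"),
    ("bi_practices",           "organizational_use"),
    ("individual_use",         "individual_impact"),  # use -> impact
    ("organizational_use",     "individual_impact"),
    ("individual_impact",      "organizational_impact"),
]

# Endogenous (dependent) constructs are those that appear as a target of
# at least one path; for each of these, fig. 4 reports an R^2 value.
endogenous = {target for _, target in RESEARCH_MODEL_PATHS}
print(sorted(endogenous))
```
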
4. Empirical evaluation

4.1 Survey design

For the design of the survey, we adhered to the process proposed by [39]. Based on validated instruments from established IS theories [1, 6, 21, 43, 49], we first compiled question groups for each construct. Second, we modified the wording to the BI context and converted all of the questions from their Likert scale format to a semantic differential scale format, following the guidelines provided by [11]. Semantic differential scales have been shown to reduce survey completion time (leading to superior efficiency), to provide a better SEM fit for research models than equivalent Likert scales, and to allow the same statistical analyses that can be performed on Likert scales [11]. Third, in multiple iterations senior researchers and graduate students identified poorly or ambiguously worded items and made minor wording changes until consensus was found and no further problems were uncovered. The remaining items were included in the survey instrument, each measured on a 5-point semantic differential scale. Before being distributed, the instrument was pretested as a whole. The final survey included demographic items and items for the representation of the eight constructs as depicted in fig. 3. In table 6 in the appendix, the specific items, organized by construct, are shown.

4.2 Data set

The data was collected using a written questionnaire distributed at a practitioner BI conference in May 2010. The 157 participants were specialists and executives working in the field of BI and DW in business, management, and IT functions. The conference was attended by representatives of companies of different industry sectors, sizes, and BI experience (cf. table 4).

A total of 103 questionnaires were returned. This corresponds to a response rate of 66 %. Because of missing values and contradictory values in control questions, ten questionnaires had to be excluded from the analysis. On the basis of these criteria, 93 questionnaires were selected for further analysis.

Table 4. Demographic profile of the data set

Industry sector | No. | %
Banking | 18 | 21%
Manufacturing | 4 | 5%
Insurance | 11 | 13%
Public government | 7 | 8%
Services | 12 | 14%
Software and IT | 12 | 14%
Telecomm. | 5 | 6%
Utilities | 7 | 8%
Wholes. & retailing | 5 | 6%
Other | 5 | 6%

Employees | No. | %
1-199 | 17 | 18%
200-999 | 15 | 16%
1000-5000 | 32 | 34%
> 5000 | 29 | 31%

BI exper. in years | No. | %
< 1 | 5 | 5%
1-5 | 26 | 29%
6-10 | 24 | 26%
> 10 | 36 | 40%

5. Results

Partial least squares (PLS) analysis was used to test the research model. PLS is a regression-based technique that can estimate and test the relationships among constructs [10]. PLS was chosen due to its robustness and the large number of constructs in the research model [23]. SmartPLS (version 2.0) was used for data analysis [47]. The research model consists of a measurement model and a structural model. The interrelations between indicator variables and latent variables are specified by the measurement model. The structural model delineates the causal relationships between latent variables.

5.1 Measurement model

Internal consistency (construct reliability), convergent validity, and discriminant validity determine the quality of the reflective measurement model [23].

As regards internal consistency, for each construct the composite reliability (CR) should exceed 0.70 and the average variance extracted (AVE) should exceed 0.50. As table 7 shows, these criteria were met for all constructs. Regarding convergent validity, table 6 lists the survey items and constructs and shows that the loadings of all items were significant at p < 0.001 and exceeded the minimum loading criterion of 0.70. Discriminant validity between constructs is acceptably high when the square root of the AVE for each construct exceeds the correlations between that construct and all other constructs [23]. Table 7 lists the correlation matrix, with correlations among constructs and the square roots of AVE as diagonal elements. According to table 7, the highest inter-construct correlation was 0.71, between organizational support and BI practices, which is still lower than the lowest square root of AVE (0.74 for BI practices). Thereby, the discriminant validity criterion was also met. Convergent and discriminant validity can be further confirmed when individual items load above 0.50 on their associated construct (cf. above) and load more highly on their associated construct than on any other construct. All items loaded on their constructs as expected.
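
For readers who want to retrace these measurement-model checks, the following sketch shows how composite reliability and AVE are typically computed from standardized item loadings. It is a generic illustration of the criteria cited from [23], not the SmartPLS implementation used in the study; the loadings shown are the BI capabilities loadings from Table 6, and the results only approximately match Table 7 because the published loadings are rounded.

```python
# Generic computation of composite reliability (CR) and average variance
# extracted (AVE) from standardized item loadings, as used in the
# measurement-model checks of section 5.1. Illustrative sketch only.

def composite_reliability(loadings: list[float]) -> float:
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    # with the error variance of a standardized item being 1 - loading^2.
    sum_l = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)
    return sum_l ** 2 / (sum_l ** 2 + error_var)

def average_variance_extracted(loadings: list[float]) -> float:
    # AVE = mean of the squared loadings.
    return sum(l ** 2 for l in loadings) / len(loadings)

if __name__ == "__main__":
    # Loadings of the five BI capabilities (CA) items from Table 6.
    ca_loadings = [0.88, 0.83, 0.85, 0.74, 0.87]
    cr = composite_reliability(ca_loadings)
    ave = average_variance_extracted(ca_loadings)
    print(f"CR  = {cr:.2f} (criterion: > 0.70)")
    print(f"AVE = {ave:.2f} (criterion: > 0.50)")
    print(f"sqrt(AVE) = {ave ** 0.5:.2f}")  # compare against inter-construct correlations
```
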


5.2 Structural model

At the structural level, good model fit is established with acceptably high internal consistency (cf. section 5.1), significant path coefficients, and acceptably high R² values [23]. The path coefficients indicate the strengths of the relationships between the dependent and independent variables. Standardized paths should be at least 0.20 and ideally above 0.30 in order to be considered meaningful [9]. Bootstrapping with 200 samples was used in order to determine the significance of the paths within the structural model. The R² values indicate the amount of variance of dependent variables explained by the independent variables. In general, no official guidelines exist, but, clearly, the larger the R² values, the better [23]. In early stages of theory building and depending on the object of investigation, lower values may be accepted, e.g. if other potential determinants influence the dependent variables. In the structural model at hand, this might be the case with the BI IT construct, as this may depend on variables not included in the model, e.g. general IT characteristics. Fig. 4 presents path coefficients and R² values for the research model. All path coefficients except one are significant at p < 0.001 or p < 0.01. No significance can be attested for the relationship between organizational use and individual impact. All R² values except BI capabilities and BI IT indicate a moderate model fit.

[Figure 4 shows the research model results: R² values of 0.00 (organizational support), 0.25 (BI capabilities), 0.23 (BI IT), 0.36 (BI practices), 0.38 (individual use), 0.52 (organizational use), 0.46 (individual impact), and 0.42 (organizational impact). Reported path coefficients are 0.50** (organizational support → BI capabilities), 0.48** and 0.60** (BI capabilities → BI IT and BI practices), 0.37**, 0.28*, 0.32*, and 0.52** (BI IT and BI practices → individual and organizational use), 0.59** (individual use → individual impact), 0.14 (organizational use → individual impact, not significant), and 0.65** (individual impact → organizational impact); * p < 0.01; ** p < 0.001.]

Figure 4. Research model results
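
The bootstrap significance test described above can be illustrated with a short, generic sketch. This is not the SmartPLS procedure behind fig. 4: it uses synthetic stand-in data and a simple standardized regression between two construct scores, purely to show how resampling yields a t-statistic for a path coefficient.

```python
# Generic bootstrap sketch for the significance of a single path coefficient,
# in the spirit of the 200-sample bootstrap described in section 5.2.
# Synthetic stand-in data; not the SmartPLS estimation used in the study.
import numpy as np

rng = np.random.default_rng(42)

def standardized_path(x: np.ndarray, y: np.ndarray) -> float:
    # Standardized bivariate regression coefficient (equals the correlation).
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float((x * y).mean())

# Synthetic "construct scores" for 93 respondents:
# an individual-use -> individual-impact style relationship.
n = 93
individual_use = rng.normal(size=n)
individual_impact = 0.6 * individual_use + rng.normal(scale=0.8, size=n)

estimate = standardized_path(individual_use, individual_impact)

# Bootstrap: resample respondents with replacement and re-estimate the path.
boot = np.empty(200)
for b in range(200):
    idx = rng.integers(0, n, size=n)
    boot[b] = standardized_path(individual_use[idx], individual_impact[idx])

t_stat = estimate / boot.std(ddof=1)
print(f"path = {estimate:.2f}, bootstrap SE = {boot.std(ddof=1):.2f}, t = {t_stat:.1f}")
```
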
6. Discussion

The results of our study provide several key findings which should help to improve the understanding of BI maturity and the development and use of BI MMs in research and practice.

6.1 Limitations

In general, there are two limitations which should be noted when interpreting our research results.

First, the size of the data set (93 questionnaires) and the heterogeneity of the sample need to be critically examined. A larger data set would provide better empirical support for our hypotheses. The data set contains answers of business (25 %), IT (50 %), management (18 %), and other (7 %) personnel. It can be questioned whether, e.g., IT personnel are able to give correct answers as regards organizational performance. In a future study, it might be advisable to relate certain parts of the study to certain functional areas of an organization.

Second, not all model-inherent hypotheses are empirically supported. On the one hand, BI capabilities and BI IT show rather low R² values. On the other hand, the results do not attest significance for the relationship between organizational use and individual impact. These issues should be addressed in future research.

6.2 Implications for research

Overall, our theoretical understanding of BI maturity, which comprises deployment, use, and impact of BI, has been empirically evaluated, but certain aspects need further attention.

BI capabilities and BI IT show rather low R² values. Therefore, the model needs to be extended by constructs which better explain the amount of variance of BI capabilities and BI IT as dependent variables. For BI IT, such constructs may be found in general IT characteristics or in the general development history of the BI IT (e.g. if heterogeneity was introduced by mergers & acquisitions). For BI capabilities, they may be found in company reputation and similar items, which may attract better educated BI personnel to an organization.


Some research attributes items like company reputation to organizational performance [36]. As a consequence, this would result in a cyclic causal chain.

The results did not attest significance for the relationship between organizational use and individual impact. Modifications of the model might be needed to better explain the causalities in this part of the model. Some ideas which might lead to a better model fit are the inclusion of culture aspects (e.g. fact-based decision making) and a modification of the individual impact construct.

As a next step in research, the development of an impact-oriented BI MM can now be based on the results of this study (cf. fig. 1). We understand impact-oriented BI maturity as a concept that encompasses deployment, use, and impact of BI on individual and organizational performance. The identified causal chains show which constructs contribute to BI success and how much they contribute.

Besides findings which relate to the research model itself, the use of a fast form approach led to many duly completed questionnaires. For future empirical studies, this might be a hint at how to methodically improve the quality and quantity (fewer terminations in filling out questionnaires) of the data collection.

6.3 Implications for practice

In section 3, we proposed a theoretical understanding of BI maturity which comprises deployment, use, and impact of BI. Within sections 4 and 5, we evaluated our theoretical model empirically, with the result that most of our hypotheses hold true. Based on our findings, practitioners now have a way to assess BI maturity in a manner that goes beyond measuring technical capabilities. On the basis of our results, the use and the impact of the socio-technical BI system should be considered as well.

One central statement can be derived from our findings that everybody responsible for BI certainly likes to hear: we empirically derived that financial and general support for BI by management and business functions has a positive impact on overall organizational performance.

7. Conclusion

Based on the results of previous research in the form of a qualitative literature review and on an empirical study, this paper developed and evaluated a theoretical model of impact-oriented BI maturity which aims to integrate the methodical strengths of established IS theories and the content of existing BI MMs.

The model of impact-oriented BI maturity is a specific conceptualization of BI maturity that should stand next to others. Nonetheless, the model should be understood as groundwork and will be the foundation for the development and the application of an impact-oriented BI MM.

8. References

[1] Alter, S., A Work System View of DSS in its Fourth Decade. Decision Support Systems, 2004. 38(3): p. 319-327.

[2] Anandarajan, M., A. Anandarajan, and C.A. Srinivasan, Business Intelligence Techniques - A Perspective from Accounting and Finance. 2004, Berlin: Springer.

[3] Baars, H. and H.-G. Kemper, Management Support with Structured and Unstructured Data – An Integrated Business Intelligence Framework. Information Systems Management, 2008. 25(2): p. 132-148.

[4] Becker, J., R. Knackstedt, and J. Pöppelbuß, Developing Maturity Models for IT Management - A Procedure Model and its Application. Business & Information Systems Engineering, 2009. 1(3): p. 213-222.

[5] Biberoglu, E. and H. Haddad, A Survey of Industrial Experiences with CMM and the Teaching of CMM Practices. Journal of Computing Sciences in Colleges, 2002. 18(2): p. 143-152.

[6] Bostrom, R.P. and S. Heinen, MIS Problems and Failures - A Socio-Technical Perspective. Part I - The Causes. MIS Quarterly, 1977. 1(3): p. 17-32.

[7] Cates, J.E., S.S. Gill, and N. Zeituny, The Ladder of Business Intelligence (LOBI): a framework for enterprise IT planning and architecture. International Journal of Business Information Systems, 2005. 1(1-2): p. 220-238.

[8] Chamoni, P. and P. Gluchowski, Integration trends in business intelligence systems: an empirical study based on the business intelligence maturity model. Wirtschaftsinformatik, 2004. 46(2): p. 119-128.

[9] Chin, W.W., Issues and Opinion on Structural Equation Modeling. MIS Quarterly, 1998. 22(1): p. vii-xvi.

[10] Chin, W.W., The Partial Least Squares Approach to Structural Equation Modeling, in Modern Methods for Business Research, G.A. Marcoulides, Editor. 1998, Lawrence Erlbaum Associates: Mahwah. p. 295-336.

[11] Chin, W.W., N. Johnson, and A. Schwarz, A Fast Form Approach to Measuring Technology Acceptance and Other Constructs. MIS Quarterly, 2008. 32(4): p. 687-703.

[12] Clark, J.T.D., M.C. Jones, and C.P. Armstrong, The Dynamic Structure of Management Support Systems: Theory Development, Research Focus, and Direction. MIS Quarterly, 2007. 31(3): p. 579-615.


[13] Curtis, B., W.E. Hefley, and S.A. Miller, The People Capability Maturity Model – Guidelines for Improving the Workforce. 2 ed. SEI Series in Software Engineering. 2010, Boston, MA: Addison-Wesley.

[14] Davenport, T.H., J.G. Harris, and R. Morison, Analytics at Work: Smarter Decisions, Better Results. 2010, Boston: Harvard Business Press.

[15] de Bruin, T., et al., Understanding the Main Phases of Developing a Maturity Assessment Model. in ACIS 2005. 2005. Sydney.

[16] Eckerson, W.W., Gauge Your Data Warehousing Maturity. DM Review, 2004. 14(11): p. 34.

[17] Eckerson, W.W., TDWI's Business Intelligence Maturity Model. 2009, The Data Warehousing Institute: Chatsworth.

[18] Elbashir, M.Z., P.A. Collier, and M.J. Davern, Measuring the effects of business intelligence systems: The relationship between business process and organizational performance. International Journal of Accounting Information Systems, 2008. 9: p. 135-153.

[19] Fisher, T., How Mature Is Your Data Management Environment? Business Intelligence Journal, 2005. 10(3): p. 20-26.

[20] Fraser, P., J. Moultrie, and M. Gregory, The Use of Maturity Models/Grids as a Tool in Assessing Product Development Capability. in IEEE IEMC 2002. 2002. Cambridge, UK.

[21] Gable, G.G., D. Sedera, and T. Chan, Re-conceptualizing Information System Success: The IS-Impact Measurement Model. Journal of the Association for Information Systems, 2008. 9(7): p. 377-408.

[22] Gansor, T., A. Totok, and S. Stock, Von der Strategie zum Business Intelligence Competency Center (BICC): Konzeption - Betrieb - Praxis. 2010: Hanser Fachbuchverlag.

[23] Gefen, D., D. Straub, and M.-C. Boudreau, Structural Equation Modeling and Regression: Guidelines for Research Practice. Communications of the AIS, 2000. 4(7): p. 1-77.

[24] Gericke, A., P. Rohner, and R. Winter, Networkability in the Health Care Sector - Necessity, Measurement and Systematic Development as the Prerequisites for Increasing the Operational Efficiency of Administrative Processes. in ACIS 2006. 2006. Adelaide.

[25] Gibson, C.F. and R.L. Nolan, Managing the four stages of EDP growth. Harvard Business Review, 1974. 52(1): p. 76-88.

[26] Hatcher, D. and B. Prentice, The Evolution of Information Management. Business Intelligence Journal, 2004. 9(2): p. 49-56.

[27] Henschen, D., HP Touts Neoview Win, Banking Solution, BI Maturity Model. Intelligent Enterprise, 2007. 10(10): p. 9.

[28] Herschel, R.T., Editorial Preface. International Journal of Business Intelligence Research, 2010. 1(1): p. i.

[29] Hewlett Packard, The HP Business Intelligence Maturity Model: Describing the BI journey. 2009. Available: http://h20195.www2.hp.com/V2/GetPDF.aspx/4AA1-5467ENW.pdf.

[30] Hostmann, B., N. Rayner, and G. Herschel, Gartner's Business Intelligence, Analytics and Performance Management Framework. 2009, Gartner: Stamford.

[31] Inmon, W.H., D. Strauss, and G. Neushloss, DW 2.0: The Architecture for the Next Generation of Data Warehousing. 2008, Amsterdam: Elsevier Science.

[32] Kimball, R., et al., The Data Warehouse Lifecycle Toolkit. 2 ed. 2008, New York: John Wiley & Sons.

[33] Lahrmann, G. and F. Marx, Systematization of Maturity Model Extensions. in DESRIST 2010. 2010. St. Gallen: Springer.

[34] Lahrmann, G., et al., Business Intelligence Maturity Models: An Overview. in itAIS 2010. 2010. Naples, Italy: Springer.

[35] Luhn, H.P., A Business Intelligence System. IBM Journal of Research and Development, 1958. 2(4): p. 314-319.

[36] Marchand, D., W. Kettinger, and J. Rollins, Information Orientation - The Link to Business Performance. 2001, New York: Oxford University Press.

[37] McDonald, M., Without the Business in Business Intelligence, BI Is Dead! 2010. Available: http://blogs.gartner.com/mark_mcdonald/2010/02/11/without-the-business-in-business-intelligence-bi-is-dead/.

[38] Mettler, T. and P. Rohner, Situational Maturity Models as Instrumental Artifacts for Organizational Design. in DESRIST 2009. 2009. New York.

[39] Moore, G.C. and I. Benbasat, Development of an Instrument to Measure the Perceptions of Adopting an Information Technology Innovation. Information Systems Research, 1991. 2(3): p. 192-222.

[40] Negash, S., Business Intelligence. Communications of the Association for Information Systems, 2004. 13: p. 177-195.

[41] Negash, S. and P. Gray, Business Intelligence, in Handbook on Decision Support Systems 2, F. Burstein and C.W. Holsapple, Editors. 2008, Springer: Berlin, Heidelberg. p. 175-193.


[42] Paulk, M.C., et al., Capability Maturity Model, Version 1.1. IEEE Software, 1993. 10(4): p. 18-27.

[43] Petter, S., W. DeLone, and E. McLean, Measuring information systems success: models, dimensions, measures, and interrelationships. European Journal of Information Systems, 2008. 17: p. 236-263.

[44] Ramamurthy, K., A. Sen, and A.P. Sinha, An empirical investigation of the key determinants of data warehouse adoption. Decision Support Systems, 2008. 44: p. 817-841.

[45] Rayner, N. and K. Schlegel, Maturity Model Overview for Business Intelligence and Performance Management. 2008, Gartner: Stamford.

[46] Richardson, J. and A. Bitterer, Findings: The Risks of Losing Faith in BI. 2010, Gartner: Stamford.

[47] Ringle, C., S. Wende, and A. Will, SmartPLS 2.0. 2005. Available: http://www.smartpls.de.

[48] Robbins, S.P., Organization Theory: Structure, Design, and Applications. 2 ed. 1987, Englewood Cliffs: Prentice-Hall.

[49] Sabherwal, R., A. Jeyaraj, and C. Chowa, Information System Success: Individual and Organizational Determinants. Management Science, 2006. 52(12): p. 1849-1864.

[50] SAS, Information Evolution Model. 2009. Available: http://www.sas.com/software/iem/.

[51] Schulze, K.-D., et al., Business Intelligence-Studie 2009. 2009, Steria Mummert Consulting AG: Hamburg.

[52] Sen, A., A.P. Sinha, and K. Ramamurthy, Data Warehousing Process Maturity: An Exploratory Study of Factors Influencing User Perceptions. IEEE Transactions on Engineering Management, 2006. 53(3): p. 440-455.

[53] Simpson, J.A. and E.S.C. Weiner, The Oxford English Dictionary. 2 ed. 1989, Oxford, UK: Oxford University Press.

[54] Töpfer, J., Active Enterprise Intelligence, in Active Enterprise Intelligence, J. Töpfer and R. Winter, Editors. 2008, Springer: Berlin, Heidelberg. p. 1-28.

[55] Vierkorn, S. and D. Friedrich, Organization of Business Intelligence. 2008, BARC Institute: Würzburg.

[56] Watson, H.J., T. Ariyachandra, and R.J. Matyska, Data warehousing stages of growth. Information Systems Management, 2001. 18(3): p. 42-50.

[57] Watson, H.J., Tutorial: Business Intelligence – Past, Present, and Future. Communications of the Association for Information Systems, 2009. 25: p. 487-510.

[58] Williams, S. and N. Williams, The profit impact of business intelligence. 2007, San Francisco, CA: Morgan Kaufmann.

[59] Wixom, B.H. and H.J. Watson, An empirical investigation of the factors affecting data warehousing success. MIS Quarterly, 2001. 25(1): p. 17-41.

[60] Wixom, B.H. and H.J. Watson, The BI-Based Organization. International Journal of Business Intelligence Research, 2010. 1(1): p. 13-28.

Appendix
Table 5. Content of BI maturity models [34]
[In the original table, ten additional columns indicate which of the analyzed maturity models (Watson et al., SAS, Eckerson, SMC, Cates et al., Dataflux, Sen et al., HP, Gartner, Teradata) address each dimension; only the dimension descriptions and the totals (∑) are reproduced here.]

Dimension | Description | ∑
Applications | Kinds of (analytical) applications in use, e.g. data mining, OLAP, or reporting | 6
Architecture | Overall structure of e.g. source systems, platform, integration infrastr., and appl. | 3
Behavior | Prevailing analytic decision culture in the org. (i.e. fact-based decision making) | 4
Change | Controlling and tracking of changes over time | 3
Data | # of subject areas, the data model(s) used, and the quality and quantity of data | 6
Efficiency | Ratio of resource input compared to resource output | 1
Impact | Individual impact and organizational impact | 6
Infrastructure | Components of the integration infrastr., e.g. data bases, application servers | 6
Org. structure | Characteristics, structure, and placement of the BI org. in the overall org. | 2
Processes | Degree to which BI-related activities are performed | 3
Staff | Experience, skills, and specialization of the BI staff | 1
Strategy | Strategic alignment of BI, e.g. on corporate, business, or IT objectives | 1
Users | Types, numbers, and locations of the BI users | 5


Table 6. Survey items and constructs

Construct/Item Key literature Mean S.D. Loading t-statistic


BI capabilities (CA) [56, 59]
high / low experience of (internal) BI staff 3.38 1.17 0.88 22.42
high / low business competency of (internal) BI staff 3.48 1.07 0.83 24.38
high / low technical competency of (internal) BI staff 3.73 1.04 0.85 27.03
high / low social competency of (internal) BI staff 3.60 0.86 0.74 13.71
high / low specialization of (internal) BI staff 3.39 1.00 0.87 17.73
BI practices (PR) [52, 56]
high / low strategic alignment on corporate and IT strategy 3.02 1.02 0.75 12.53
planned / chaotic further development 3.09 1.04 0.73 16.21
clearly / not clearly defined BI service portfolio 2.99 1.01 0.74 17.60
distinct / non distinct measurement of BI success 2.55 1.07 0.76 17.38
distinct / non distinct BI management processes 2.67 0.95 0.74 14.43
standardized / non standardized BI development processes 2.97 1.10 0.78 17.40
standardized / non standardized BI operations processes 3.16 1.08 0.74 15.06
BI information technology (IT) [52, 56]
extensive / limited set of analytical capabilities 3.35 1.09 0.76 24.02
homogeneous / heterogeneous analytical applications 3.09 1.19 0.81 18.02
standardized / non standardized (integration-) infrastructure 3.39 1.03 0.80 15.99
standardized / non standardized overall architecture 3.01 1.09 0.74 9.20
integrated / non integrated data 3.20 1.04 0.74 11.21
Individual impact (II) [14, 59, 60]
good / bad decisions 3.72 0.70 0.82 20.97
time-saving / time-consuming decision making 3.59 0.74 0.78 11.83
performance enhancing / performance degrading 3.44 0.76 0.79 14.91
productivity increasing / productivity decreasing 3.48 0.65 0.85 19.96
Individual usage (IU) [11, 21, 43, 49]
easy / hard 3.23 0.85 0.83 27.16
efficient / inefficient 3.20 0.87 0.86 21.58
effective / ineffective 3.31 0.86 0.87 33.87
helpful / unhelpful 3.53 0.83 0.87 31.75
quite useful / quite useless 3.55 0.83 0.85 32.14
Organizational impact (OI) [18]
high / low efficiency of internal processes 3.39 0.77 0.82 21.15
good / bad organizational performance 3.52 0.69 0.82 12.01
Organizational support (OS) [44, 59]
general / no general top-management support 3.45 1.07 0.83 13.94
financial / no financial top-management support 3.29 1.04 0.88 18.26
general / no general business support 3.34 0.91 0.80 16.69
Organizational usage (OU) [21, 43, 49]
high / low number of covered business topics 3.24 1.07 0.83 23.53
high / low spread throughout the organization 3.15 1.07 0.93 51.26
universal / insular usage 2.97 0.96 0.95 48.08

Table 7. Construct properties and inter-construct correlations

Construct AVE CR R² CA II IT IU OI OS OU PR
(The columns CA through PR list the inter-construct correlations; see note a.)
BI capabilities (CA) 0.65 0.90 0.25 0.81
Individual impact (II) 0.66 0.89 0.46 0.45 0.81
BI information technology (IT) 0.59 0.88 0.23 0.48 0.46 0.77
Individual usage (IU) 0.73 0.93 0.38 0.50 0.67 0.56 0.85
Organizational impact (OI) 0.66 0.79 0.42 0.28 0.65 0.37 0.43 0.81
Organizational support (OS) 0.68 0.87 0.00 0.50 0.65 0.58 0.60 0.54 0.83
Organizational usage (OU) 0.79 0.92 0.52 0.48 0.46 0.59 0.54 0.35 0.55 0.89
BI practices (PR) 0.55 0.90 0.36 0.60 0.45 0.60 0.55 0.42 0.71 0.69 0.74
(a) Diagonal elements represent the square root of AVE for that construct.
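
As a worked check of the discriminant validity argument in section 5.1, the following sketch applies the Fornell-Larcker comparison to the values reported in Table 7 (square roots of AVE on the diagonal, inter-construct correlations off the diagonal). The numbers are transcribed from Table 7; the comparison logic itself is generic and not part of the original study's tooling.

```python
# Discriminant validity check on the values reported in Table 7:
# the square root of AVE of each construct (diagonal) must exceed its
# correlations with all other constructs (off-diagonal).
constructs = ["CA", "II", "IT", "IU", "OI", "OS", "OU", "PR"]

sqrt_ave = {"CA": 0.81, "II": 0.81, "IT": 0.77, "IU": 0.85,
            "OI": 0.81, "OS": 0.83, "OU": 0.89, "PR": 0.74}

# Lower-triangular correlations from Table 7, keyed by construct pair.
corr = {
    ("II", "CA"): 0.45,
    ("IT", "CA"): 0.48, ("IT", "II"): 0.46,
    ("IU", "CA"): 0.50, ("IU", "II"): 0.67, ("IU", "IT"): 0.56,
    ("OI", "CA"): 0.28, ("OI", "II"): 0.65, ("OI", "IT"): 0.37, ("OI", "IU"): 0.43,
    ("OS", "CA"): 0.50, ("OS", "II"): 0.65, ("OS", "IT"): 0.58, ("OS", "IU"): 0.60,
    ("OS", "OI"): 0.54,
    ("OU", "CA"): 0.48, ("OU", "II"): 0.46, ("OU", "IT"): 0.59, ("OU", "IU"): 0.54,
    ("OU", "OI"): 0.35, ("OU", "OS"): 0.55,
    ("PR", "CA"): 0.60, ("PR", "II"): 0.45, ("PR", "IT"): 0.60, ("PR", "IU"): 0.55,
    ("PR", "OI"): 0.42, ("PR", "OS"): 0.71, ("PR", "OU"): 0.69,
}

highest = max(corr.values())          # 0.71 (organizational support / BI practices)
lowest_diag = min(sqrt_ave.values())  # 0.74 (BI practices)
print(f"highest correlation {highest} < lowest sqrt(AVE) {lowest_diag}:",
      highest < lowest_diag)

# Per-construct check: every correlation involving the construct
# stays below that construct's square root of AVE.
for c in constructs:
    rs = [r for (a, b), r in corr.items() if c in (a, b)]
    print(c, "passes the criterion:", all(r < sqrt_ave[c] for r in rs))
```
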
