ASSESSING THE INFORMATION SYSTEMS (IS) function's performance has long been an
important issue to IS executives. This interest is evident from the prominence of this
issue in the various IS issue studies [12, 13, 34, 49, 72] as well as the popularity of
annual publications such as ComputerWorld Premier 100 and InformationWeek 500,
which use surrogate metrics to assess overall IS functional performance (ISFP). Executives routinely seek evidence of returns on information technology (IT) investments and sourcing decisions, both types of choices that have become
more substantial and a competitive necessity. As the unit that has major responsibility for these decisions, the IS function is usually believed to be an integral part of
achieving organizational success. Yet the overall performance of the IS function has
proved difficult to conceptualize and to measure.
As the outsourcing of IS subfunctional areas such as data centers and help desks
has grown into the outsourcing of the entire IS function, there is an ever-growing
need for formal performance assessment [61], which would permit the establishment of
baseline measures for judging outsourcing success. So the issue of an overall IS
functional metric, which is, and has been, high among IS executives' priorities, is becoming even more important.
Although there has been a good deal of research on IS efficiency, effectiveness, and
success at various levels of analysis, overall functional-level performance is one of
the least discussed and studied. According to Seddon et al. [88], only 24 out of 186
studies between 1988 and 1996 can be classified as focusing on the IS functional
level. Nelson and Cooprider's [70] work epitomizes this need.
Moreover, while metrics and instruments exist to assess specific IS subfunctions
and subareas, such as data center performance, productivity, and data quality, these measures typically cannot be aggregated in any meaningful way. This limits
their usefulness as bases for identifying the sources of overall performance improvements or degradations. As an anonymous reviewer of an earlier version of this
paper said, "The critical issue is that the performance of the IS function is now under
the microscope and decisions to insource/outsource and spend/not spend must be
made in a structured context."
The objective of this research is to develop such an instrument: a scorecard
for evaluating overall ISFP.
The IS function uses resources to produce IS performance, which in turn influences both business process effectiveness and organizational performance.
The resources utilized by the IS function are shown in Figure 1 to be hardware,
software, human resources, and integrated managerial and technical capabilities [14,
15, 36]. The IS function is shown to produce systems, information, and services [56,
92], which collectively affect the organization in a fashion that is termed IS functional
performance (ISFP). ISFP is to be assessed through an IS functional scorecard (ISFS),
the development of which is the objective of this study.
In the theoretical model, IS outputs are also shown as significant enablers and drivers of business process effectiveness, since IS are often the basis for business process
operations and redesign [94, 113]. ISFP also is shown to influence business process
effectiveness, and both influence overall organizational performance [113].
Although it is not the primary purpose of this study to directly address the business
process effectiveness and organizational performance elements of Figure 1, data were
collected on these elements of the model for purposes of nomological validation of
the scorecard that is being developed.
The model of Figure 1 is based on streams of research in IS capabilities, IS effectiveness/success, IS service quality, IS functional evaluation, and IS subfunctional
assessment.
IS Capabilities
IS capabilities are integrated sets of hardware, software, human skills, and management processes that serve to translate financial investments in IS into IS performance
[17, 23, 42, 83, 99, 111, 113]. For instance, an IS strategic planning capability might
consist of well-trained planners, computer-based planning models, knowledgeable
IS Effectiveness/Success
DeLone and McLean [30] categorized over 100 IS dependent variables into six
categories and developed an IS success model to describe the relationships between
the categories. They concluded that IS success should be a multidimensional measure
and recommended additional research to validate the model. Other researchers have
since tested and expanded their model [7, 46, 79]. DeLone and McLean [31] have
updated the model based on a review of research stemming from their original work.
They concluded that their original model was valid and suggested that service quality be incorporated as an important dimension of IS success.
IS Service Quality
Recognizing the importance of the services provided by the IS function, the
SERVQUAL measure, originally developed in marketing [74], has been adapted to
measure IS service quality [75, 110]. However, the controversy over SERVQUAL in
marketing [27] has carried over into IS [52, 104], suggesting that more research needs
to be conducted to measure IS service quality. Proponents of this measure sometimes
advocate its use as a proxy for ISFP. However, as depicted in Figure 1, it is directly
applicable to only one of the three major outputs of the IS function.
IS Functional Evaluation
Only a few studies directly address the comprehensive evaluation of the performance
of the IS function. No one has developed a validated metric. Wells [112] studied
existing and recommended performance measures for the IS function and identified
six important goals/issues. Saunders and Jones [84] developed and validated 11 IS
function performance dimensions through a three-round Delphi study. They proposed
an IS function performance evaluation model to help organizations select and prioritize IS performance dimensions and to determine assessments for each dimension.
Both studies focused on top management's perspective of ISFP and did not offer any
specific measures.
IS Subfunctional Assessment
Measuring IS subfunctional performance has been important to IS practitioners and
academics, and such measures have been developed at a variety of levels using a
number of different perspectives. For instance, measurements have been made of the
effects of IS on users (e.g., [105]), learning outcomes (e.g., [1]), service (e.g., [52]),
e-business (e.g., [96]) and other contexts using economic approaches (e.g., [16]), a
financial perspective (e.g., [10]), a social science perspective (e.g., [80]), an IT value
approach [24], a business process viewpoint (e.g., [98]), and probably others.
So, there has been no paucity of interest in IS assessment, or in the development of
measures. However, there is a great need for a comprehensive measure of IS performance that will provide a configural, or Gestalt [41], view of an organizations
formal IS activities and facilitate decision making and functional improvement.
Despite its focus on users, this approach differs from the popular user satisfaction measures [6, 8, 35], because it is designed to assess people's perceptions of the
overall IS function rather than to capture users' attitudes toward a specific system.
Operationalization of Constructs
Two sets of constructs were operationalized in this study: the three-dimensional ISFS
construct and the constructs related to the consequences of ISFP. These consequences
included for more updated measures published subsequent to DeLone and McLean's
original review.
Information Effectiveness
Measures of information effectiveness assess the quality of the information provided
by IS as well as the effects of the information on the user's job. Although DeLone and
McLean's [30] information quality provided a good source for existing measures,
Wang and Strong [109] developed a more comprehensive instrument that encompasses all measures mentioned in DeLone and McLean's review. Therefore, the 118
measures developed by Wang and Strong make up the majority of items in this dimension. However, since the focus of their instrument is on the quality of information,
some new items were developed to ensure coverage of measures on the effects of information on the user's job.
Service Performance
Measures of service performance assess each user's experience with the services provided by the IS function in terms of the quality and flexibility of the services. The
entire IS SERVQUAL instrument is included in this dimension for comprehensiveness. New measures were also incorporated to augment the IS SERVQUAL items
based on the more comprehensive view of service performance proposed by Fitzgerald
et al. [38]. In addition, literature on three areas of IS functional services not
explicitly covered by the service quality literature (training [60, 65, 71], information
centers [11, 44, 47, 66], and help desks [20]) was also reviewed and included to
ensure the comprehensiveness of measures for this dimension.
In addition to utilizing existing items to measure these constructs, the emergence of
innovations that have come into use since most of the prior instruments were developed
prompted the inclusion of new items to measure the IS function's performance in seven
new areas: ERP [51], knowledge management [45, 64, 97], electronic business [9, 55],
customer relationship management [39], supply chain management [37], electronic
commerce [22, 33, 102], and organizational learning [86, 100]. In total, 31 new items
gleaned from the practitioner and research literatures to reflect potential user assessments of the IS function's contribution to those areas in terms of systems, information, and
services were incorporated to expand the item pools for each dimension.
Instrument Development
A total of 378 items were initially generated. Multiple rounds of Q-sorting and item
categorization were conducted [29, 68] to reduce the number of items and to ensure
the content validity of the ISFS instrument. The last round of Q-sort resulted in the
identification of subconstructs with multiple items for each of the three dimensions.
Table 2 shows the subconstructs for each dimension that resulted from the Q-sort
process.
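The level of agreement reached during such sorting rounds is commonly quantified with a hit ratio, the fraction of judges' placements that match an item's intended dimension. The following is a minimal sketch of that computation; the items, categories, and judge placements are invented for illustration and are not data from this study:

```python
# Hypothetical sketch of a Q-sort hit ratio. The placements below are
# invented; the paper does not report its raw sorting matrices.
from collections import Counter

# Intended dimension for each item (S = systems, I = information, V = services).
intended = {"item1": "S", "item2": "I", "item3": "V", "item4": "I"}

# Dimension that each of three judges assigned to each item.
placements = {
    "item1": ["S", "S", "I"],
    "item2": ["I", "I", "I"],
    "item3": ["V", "S", "V"],
    "item4": ["I", "V", "I"],
}

def hit_ratio(intended, placements):
    """Fraction of all judge placements that match the intended dimension."""
    hits = total = 0
    for item, cats in placements.items():
        counts = Counter(cats)
        hits += counts[intended[item]]
        total += len(cats)
    return hits / total

print(hit_ratio(intended, placements))  # 9 of 12 placements match -> 0.75
```

A low hit ratio for an item flags it for rewording or removal before the next sorting round.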
Table 2. Subconstructs for Each ISFS Dimension Resulting from the Q-Sort Process

Systems performance: effect on job; effect on external constituencies; effect on internal processes; effect on knowledge and learning; systems features; ease of use.
Information effectiveness: intrinsic quality of information; contextual quality of information; presentational quality of information; accessibility of information; reliability of information; flexibility of information; usefulness of information.
Service performance: responsiveness; reliability; service provider quality; empathy; training; flexibility of services; cost/benefit of services.
The Q-sorts resulted in an ISFS instrument that consists of 42, 36, and 32 items for
the three dimensions, respectively. All items were measured using a Likert-type scale
ranging from 1 (hardly at all) to 5 (to a great extent) with 0 denoting not applicable.
The final version of the instrument is in the Appendix.
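Because 0 denotes "not applicable" on this scale, such responses must be excluded before items are averaged into scale scores. A minimal sketch of that scoring rule, with invented responses:

```python
import numpy as np

# Hypothetical responses of one user to five items on the 1-5 scale,
# where 0 means "not applicable" and must not enter the average.
responses = np.array([4, 5, 0, 3, 0], dtype=float)

# Mask the not-applicable entries, then average the remaining items.
applicable = np.ma.masked_equal(responses, 0)
score = applicable.mean()
print(score)  # (4 + 5 + 3) / 3 = 4.0
```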
Two rounds of reminders were sent to initial nonrespondents to improve the response rate. In addition, where appropriate, letters were sent to the CIOs who returned the CIO survey soliciting additional user participation.
At the conclusion of data collection in 2001, 346 usable ISFS instruments and 130
Consequences of ISFP surveys were received, with 120 companies having responses
from at least one IS user and the CIO. This resulted in a response rate of 7.2 percent
for the CIO survey, 5.6 percent for the ISFS questionnaire, and 6.1 percent for matched-pair responses.
Two analyses were conducted to assess possible nonresponse bias. t-tests of company size in terms of revenue, net income, and number of employees between responding and nonresponding companies showed no significant differences. t-tests of
30 items randomly selected from the three ISFS dimensions (10 items each) between
the early (first third) and late (last third) respondents [4, 62] also showed no significant differences. Therefore, it can be concluded that there was no nonresponse bias in
the sample and that the relatively low response rate does not degrade the
generalizability of the ISFS instrument [57].
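The early-versus-late comparison [4] amounts to an independent-samples t-test on each selected item. A sketch with simulated data (the scores and group sizes below are invented, not taken from the study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical item scores (1-5 scale) for the first and last thirds of
# respondents; in the study this would be run for each of the 30 items.
early = rng.integers(1, 6, size=115).astype(float)
late = rng.integers(1, 6, size=115).astype(float)

# Independent-samples t-test; a nonsignificant result (p > 0.05) is taken
# as evidence that late respondents, who are assumed to resemble
# nonrespondents, do not differ from early ones.
t_stat, p_value = stats.ttest_ind(early, late)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```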
Sample Demographics
The participating companies represent more than 20 industries with nearly a quarter
of the companies in manufacturing (24.6 percent), followed by wholesale/retail (13.8
percent), banking/finance (10.8 percent), and medicine/health (7.7 percent). The range
of annual sales was between $253 million and $45.352 billion, with an average of $4
billion for the sample. For the Consequences of ISFP surveys, 46.9 percent of the
respondents hold the title of CIO. More than 80 percent of the respondents have titles
at the upper-management level, indicating that the surveys were completed by
individuals at the desired level. For the ISFS instrument, 47.7 percent
of the respondents were at the upper-management level and 39.9 percent at the
middle-management level. The respondents are distributed across all functional areas, with accounting and finance, sales and marketing, and manufacturing and operations being the top three.
Instrument Validation
INSTRUMENT VALIDATION REQUIRES THE EVALUATION of content validity, reliability,
construct validity, and nomological validity. Following Segars's [89] process for instrument validation, we first used exploratory factor analysis to determine the number
of factors, then used confirmatory factor analysis iteratively to eliminate items that
loaded on multiple factors, establishing unidimensionality.
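The exploratory step chooses the number of factors to carry into the confirmatory models. One common heuristic, used here purely for illustration, is the Kaiser criterion: retain as many factors as there are correlation-matrix eigenvalues greater than one. A sketch on simulated two-factor data (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 300 respondents on six items driven by two latent factors,
# standing in for a block of ISFS items (purely illustrative data).
n = 300
factors = rng.normal(size=(n, 2))
loadings = np.array([
    [0.8, 0.0], [0.7, 0.0], [0.75, 0.0],   # items loading on factor 1
    [0.0, 0.8], [0.0, 0.7], [0.0, 0.75],   # items loading on factor 2
])
items = factors @ loadings.T + 0.5 * rng.normal(size=(n, 6))

# Kaiser criterion: retain factors whose correlation-matrix eigenvalues
# exceed one.
eigenvalues = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
n_factors = int((eigenvalues > 1.0).sum())
print(n_factors)  # the criterion should recover the two simulated factors
```

The retained factor count then fixes the structure of the confirmatory models tested in the next step.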
Content Validity
Content validity refers to the extent to which the measurement items represent and
cover the domain of the construct [54]. It is established by showing that the items are
fit. First, items with standardized factor loading below 0.45 were eliminated [78] one
at a time. Second, error terms between pairs of items were allowed to correlate based
on a modification index. However, this modification was only implemented when
theories suggested that the two items should be correlated. This process was conducted iteratively by making one modification at a time until either good model fit
was achieved or no modification was suggested.
Following Segars and Grover's [90] procedure, after every measurement model
completed its modification process, pairs of models within each dimension were tested
iteratively to identify and eliminate items with cross-loadings. With all cross-loading
items eliminated, all factors within the same dimension were tested in a full measurement model. Again, items with cross-loadings in the full model were dropped. After
the full measurement models were purified, second-order models that reflect the subconstructs within each ISFS dimension were tested. The final, second-order measurement models for the three ISFS dimensions are presented in Figures 3, 4, and 5.
The chi-square and significant factor loadings provide direct statistical evidence
of both convergent validity and unidimensionality [91]. With each of the three ISFS
dimensions properly tested independently, all three dimensions were combined and
tested for model fit (Figure 6). The complete ISFS model showed
remarkably good fit given its high complexity.
Reliability
In assessing measures using confirmatory factor analysis, a composite reliability for
each factor can be calculated [5, 93]. This composite reliability is "a measure of
internal consistency of the construct indicators, depicting the degree to which they
indicate the common latent (unobserved) construct" [48, p. 612]. Another measure
of reliability is the average variance extracted (AVE), which reflects the overall amount
of variance that is captured by the construct in relation to the amount of variance due
to measurement error [48, 89]. The value of AVE should exceed 0.5 to indicate that
the variance explained by the construct is larger than measurement error. The construct reliability and AVE of all dimensions and subconstructs are presented in Table 3.
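Both statistics follow directly from the standardized loadings: for standardized loadings with error variances of one minus the squared loading, composite reliability is the squared sum of loadings divided by that same quantity plus the summed error variances, and AVE is the mean squared loading. A sketch with hypothetical loadings (not values from Table 3):

```python
import numpy as np

def composite_reliability(loadings):
    """Composite reliability from standardized loadings, with error
    variance of each standardized item taken as 1 - loading**2."""
    lam = np.asarray(loadings, dtype=float)
    errors = 1.0 - lam**2
    return lam.sum()**2 / (lam.sum()**2 + errors.sum())

def average_variance_extracted(loadings):
    """Mean squared standardized loading: variance captured by the
    construct relative to measurement error."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam**2))

# Hypothetical standardized loadings for a four-item subconstruct.
loadings = [0.85, 0.80, 0.75, 0.70]
print(round(composite_reliability(loadings), 2))       # 0.86
print(round(average_variance_extracted(loadings), 2))  # 0.6, above the cutoff
```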
Table 3 indicates that all subconstructs showed good composite reliability except
the IS training scale. However, there are some scales with an AVE below 0.50. This
suggests that even though all scales (except one) were reliable in measuring their
respective constructs, some of them were less capable of providing good measures of
their own construct. Despite the low AVE, those scales were retained to ensure the
comprehensiveness of the ISFS instrument.
Discriminant Validity
Discriminant validity refers to the ability of the items in a factor to differentiate themselves from items that are measuring other factors. In structural equation modeling
(SEM), discriminant validity can be established by comparing the model fit of an
unconstrained model that estimates the correlation between a pair of constructs and a
97
constrained model that fixes the correlation between the constructs to unity. Discriminant validity is demonstrated when the unconstrained model has a significantly
better fit than the constrained model. The difference in model fit is evaluated by the
chi-square difference (with one degree of freedom) between the models. Tests of all
possible pairs of subconstructs within each dimension were conducted; the results are
98
presented in Table 4. As shown, all chi-square differences are significant at p < 0.001,
indicating that each scale captures a construct that is significantly unique and independent of other constructs. This provides evidence of discriminant validity.
Nomological Validity
A nomological network that "specifies probable (hypothetical) linkages between the
construct of interest and measures of other constructs" [85, p. 14] further clarifies the
ISFS construct and provides an additional basis for construct validation. An
operationalization of the theoretical model of Figure 1 that considers the two important consequences of ISFP was used. This model consists of the rightmost portions of
Figure 1 that relate ISFP to business process effectiveness and to organizational performance.
Organizational Performance
Although a positive relationship between IS effectiveness and business performance
has been suggested, the evidence of such an effect has proved to be elusive [58].
Using user information satisfaction and strategic impact of IS as surrogate IS
effectiveness measures and several perceptual measures of business performance, Chan
et al. [21] empirically showed a significant positive relationship between IS and business performance. Since the ISFS is posited as a more comprehensive measure of IS
performance, the positive relationship should hold in this study.
Table 3. Composite Reliability and Average Variance Extracted (AVE) of the ISFS Dimensions and Subconstructs

Systems performance: impact on job; impact on external constituencies; impact on internal processes; impact on knowledge and learning; systems usage characteristics; intrinsic systems quality.
Information effectiveness: intrinsic quality of information; reliability of information; contextual quality of information; presentational quality of information; accessibility of information; flexibility of information; usefulness of information.
Service performance: responsiveness of services; intrinsic quality of service provider; interpersonal quality of service provider; IS training; flexibility of services; reliability of services.
considered to be acceptable in the literature [32, 107], seven items that assess the
CIO's perception of IS's contribution to improving the organization's performance in
those areas were used.
Business Processes Effectiveness
Aside from directly affecting organizational performance, the IS function should also
have an effect on organizational performance through its impact on the effectiveness
of business processes, as shown in Figure 1. IS have traditionally been implemented
to improve the efficiencies of internal operations. This use of IT has more recently
been applied in redesigning both intra- and interfunctional business processes [94].
Improvements to the value-chain activities through IT are captured in this construct. Based on Porter and Millar [76] and Davenport [28], Xia [113] developed a
39-item instrument to assess executives' perception of the extent to which IT improved the effectiveness of six value-chain activities. Data analysis resulted in six
factors: production operations, product development, supplier relations, marketing
services, management processes, and customer relations. Items representing those
six factors were generated for this construct.
Validity and Reliability of the Measures Used in Nomological Analysis. Although all
scales in the Consequences of ISFP survey were from previously tested instruments,
Table 4. Chi-Square Differences Between Constrained and Unconstrained Models for All Pairs of Subconstructs Within Each ISFS Dimension (systems performance, information effectiveness, and service performance); all differences are significant at p < 0.001.
since a different sample was used, tests were conducted to ensure the reliability and
validity of those constructs. Reliability was evaluated using Cronbach's alpha. Items
with low corrected item-total correlation, indicating low internal consistency for the
items, were dropped. Construct validity was assessed by exploratory factor analysis
using principal components extraction with oblique rotation. Table 5
presents the results of the analyses.
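The reliability screening described here (Cronbach's alpha plus corrected item-total correlations) can be sketched as follows; the response data are invented for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items;
    low values flag items for removal."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([
        np.corrcoef(items[:, j], total - items[:, j])[0, 1]
        for j in range(items.shape[1])
    ])

# Hypothetical 1-5 responses from six respondents to four scale items.
data = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 3],
    [1, 2, 1, 2],
    [4, 5, 4, 4],
])
print(round(cronbach_alpha(data), 3))  # 0.963
print(np.round(corrected_item_total(data), 2))
```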
Although both constructs had two factors extracted, the two factors were significantly correlated in both cases. Therefore, all items within each construct were retained and used to create an overall score for the construct. Items in the final
measurement models were used to create an overall score for the ISFS construct. The
average of all items for each construct was used to avoid problems that may occur due
to differences in measurement scales. Table 6 shows the correlation among the constructs.
As shown in Table 6, there were significant positive correlations between the ISFS
construct and the two consequence constructs. There was also significant positive
correlation between business processes effectiveness and organizational performance.
Although correlation is not sufficient to establish causal relationships, the purpose
here is to demonstrate the expected association between the constructs. Therefore, as
shown in Tables 5 and 6, the nomological network was supported.
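The construct scoring and correlation analysis above can be sketched as follows, with simulated company-level scores (the sample size mirrors the 120 matched pairs, but every value is invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical per-company construct scores; in the study each score is
# the average of a construct's items, here they are simulated directly.
n = 120
isfs = rng.normal(3.5, 0.5, size=n)
bp_effectiveness = 0.3 * isfs + rng.normal(3.0, 0.5, size=n)
org_performance = 0.7 * bp_effectiveness + rng.normal(1.0, 0.4, size=n)

# Pairwise Pearson correlations among the three constructs.
pairs = [
    ("ISFS vs. business process effectiveness", isfs, bp_effectiveness),
    ("ISFS vs. organizational performance", isfs, org_performance),
    ("BP effectiveness vs. organizational performance",
     bp_effectiveness, org_performance),
]
for name, x, y in pairs:
    r, p = stats.pearsonr(x, y)
    print(f"{name}: r = {r:.3f}, p = {p:.4f}")
```

As in the paper, such correlations show association only; establishing causal direction would require a different design.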
Table 5. Reliability and Factor Analysis Results for the Consequence Constructs

Business processes effectiveness: two factors extracted; variance explained 63.48 percent; Cronbach's alpha 0.759.
Organizational performance: two factors extracted; variance explained 70.04 percent; Cronbach's alpha 0.860.

Table 6. Correlations Among the Constructs

Organizational performance with business processes effectiveness: 0.750**
ISFS with business processes effectiveness: 0.214**
ISFS with organizational performance: 0.205**
** Correlation is significant at the 0.01 level (two-tailed).
tion and for researchers to use in studies that require ISFP as a dependent or independent construct, as well as in studies that seek to complement the ISFS through other
analyses.
REFERENCES
1. Alavi, M.; Marakas, G.M.; and Yoo, Y. A comparative study of distributed learning environments on learning outcomes. Information Systems Research, 13, 4 (December 2002), 404–415.
2. Anderson, J.C. An approach for confirmatory measurement and structural equation modeling of organizational properties. Management Science, 33, 4 (April 1987), 525–541.
3. Anderson, J.C., and Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103, 3 (May 1988), 411–423.
4. Armstrong, J.S., and Overton, T.S. Estimating nonresponse bias in mail surveys. Journal of Marketing Research, 14, 3 (August 1977), 396–402.
5. Bagozzi, R.P. An examination of the validity of two models of attitude. Multivariate Behavioral Research, 16, 3 (July 1981), 323–359.
6. Bailey, J.E., and Pearson, S.W. Development of a tool for measuring and analyzing computer user satisfaction. Management Science, 29, 5 (May 1983), 530–545.
7. Ballantine, J.; Bonner, M.; Levy, M.; Martin, A.; Monro, I.; and Powell, P.L. Developing a 3-D model of information systems success. In E.J. Garrity and G.L. Sanders (eds.), Information Systems Success Measurement. Hershey, PA: Idea Group, 1998, pp. 46–59.
8. Baroudi, J.J., and Orlikowski, W.J. A short-form measure of user information satisfaction: A psychometric evaluation and notes on use. Journal of Management Information Systems, 4, 4 (Spring 1988), 44–59.
9. Basu, A., and Kumar, A. Workflow management issues in e-business. Information Systems Research, 13, 1 (March 2002), 1–14.
10. Benaroch, M. Managing information technology investment risk: A real options perspective. Journal of Management Information Systems, 19, 2 (Fall 2002), 43–84.
11. Bergeron, F.; Rivard, S.; and De Serre, L. Investigating the support role of the information center. MIS Quarterly, 14, 3 (September 1990), 247–260.
12. Brancheau, J.C., and Wetherbe, J.C. Key issues in information systems management. MIS Quarterly, 11, 1 (March 1987), 23–45.
13. Brancheau, J.C.; Janz, B.D.; and Wetherbe, J.C. Key issues in information systems management: 1994–95 SIM Delphi results. MIS Quarterly, 20, 2 (June 1996), 225–242.
14. Broadbent, M., and Weill, P. Management by maxim: How business and IT managers can create IT infrastructures. Sloan Management Review, 38, 3 (Spring 1997), 77–92.
15. Broadbent, M.; Weill, P.; O'Brien, T.; and Neo, B.N. Firm context and patterns of IT infrastructure capability. In J.I. DeGross, S.L. Jarvenpaa, and A. Srinivasan (eds.), Proceedings of the Seventeenth International Conference on Information Systems. Atlanta: Association for Information Systems, 1996, pp. 174–194.
16. Brynjolfsson, E. The productivity paradox of information technology. Communications of the ACM, 36, 12 (December 1993), 67–77.
17. Byrd, T.A., and Turner, D.E. Measuring the flexibility of information technology infrastructure: Exploratory analysis of a construct. Journal of Management Information Systems, 17, 1 (Summer 2000), 167–208.
18. Cameron, K.S. A study of organizational effectiveness and its predictors. Management Science, 32, 1 (January 1986), 87–112.
19. Cameron, K.S., and Whetton, D.A. Some conclusions about organizational effectiveness. In K.S. Cameron and D.A. Whetton (eds.), Organizational Effectiveness: A Comparison of Multiple Models. New York: Academic Press, 1983, pp. 261–277.
20. Carr, C.L. Managing service quality at the IS help desk: Toward the development and testing of TECH-QUAL, a model of IS technical support service quality. Ph.D. dissertation, University of Minnesota, Minneapolis, 1999.
21. Chan, Y.E.; Huff, S.L.; Barclay, D.W.; and Copeland, D.G. Business strategic orientation, information systems strategic orientation, and strategic alignment. Information Systems Research, 8, 2 (June 1997), 125–150.
22. Chatterjee, D.; Grewal, R.; and Sambamurthy, V. Shaping up for e-commerce: Institutional enablers of the organizational assimilation of Web technologies. MIS Quarterly, 26, 2 (June 2002), 65–90.
23. Chatterjee, D.; Pacini, C.; and Sambamurthy, V. The shareholder-wealth and trading-volume effects of information technology infrastructure investments. Journal of Management Information Systems, 19, 2 (Fall 2002), 7–42.
24. Chircu, A.M., and Kauffman, R.J. Limits to value in electronic commerce-related IT investments. Journal of Management Information Systems, 17, 2 (Fall 2000), 59–80.
25. Churchill, G.A. A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16, 1 (February 1979), 64–73.
26. Cronbach, L.J., and Meehl, P.E. Construct validity in psychological tests. In D.M. Jackson and S. Messick (eds.), Problems in Human Assessment. New York: McGraw-Hill, 1967, pp. 57–77.
27. Cronin, J.J.J., and Taylor, S.A. SERVPERF versus SERVQUAL: Reconciling performance-based and perceptions-minus-expectations measurement of service quality. Journal of Marketing, 58, 1 (January 1994), 125–131.
28. Davenport, T.H. Process Innovation: Reengineering Work Through Information Technology. Boston: Harvard Business School Press, 1993.
29. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 3 (September 1989), 319–340.
30. DeLone, W.H., and McLean, E.R. Information systems success: The quest for the dependent variable. Information Systems Research, 3, 1 (March 1992), 60–95.
31. DeLone, W.H., and McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19, 4 (Spring 2003), 9–30.
32. Dess, G.G., and Robinson, R.B.J. Measuring organizational performance in the absence of objective measures: The case of the privately-held firm and conglomerate business unit. Strategic Management Journal, 5, 3 (July–September 1984), 265–273.
33. Devaraj, S.; Fan, M.; and Kohli, R. Antecedents of B2C channel satisfaction and preference: Validating e-commerce metrics. Information Systems Research, 13, 3 (September 2002), 316–333.
34. Dickson, G.W.; Leitheiser, R.L.; Nechis, M.; and Wetherbe, J.C. Key information systems issues for the 1980s. MIS Quarterly, 8, 3 (September 1984), 135–148.
35. Doll, W.J., and Torkzadeh, G. The measurement of end-user computing satisfaction. MIS Quarterly, 12, 2 (June 1988), 259–274.
36. Duncan, N.B. Capturing flexibility of information technology infrastructure: A study of resource characteristics and their measure. Journal of Management Information Systems, 12, 2 (Fall 1995), 37–57.
37. Fan, M.; Stallaert, J.; and Whinston, A.B. Decentralized mechanism design for supply chain organizations using an auction market. Information Systems Research, 14, 1 (March 2003), 1–22.
38. Fitzgerald, L.; Johnston, R.; Brignall, S.; Silvestro, R.; and Voss, C. Performance Measurement in Service Businesses. London: Chartered Institute of Management Accountants, 1993.
39. Gefen, D., and Ridings, C.M. Implementation team responsiveness and user evaluation of customer relationship management: A quasi-experimental design study of social exchange theory. Journal of Management Information Systems, 19, 1 (Summer 2002), 47–70.
40. Gerbing, D.W., and Anderson, J.C. An updated paradigm for scale development incorporating unidimensionality and its assessment. Journal of Marketing Research, 25, 2 (May 1988), 186–192.
41. Glazer, R. Measuring the knower: Towards a theory of knowledge equity. California Management Review, 40, 3 (Spring 1998), 175–194.
42. Gold, A.H.; Malhotra, A.; and Segars, A.H. Knowledge management: An organizational capabilities perspective. Journal of Management Information Systems, 18, 1 (Summer 2001), 185–214.
43. Goodhue, D.L., and Thompson, R.L. Task-technology fit and individual performance. MIS Quarterly, 19, 2 (June 1995), 213–236.
44. Govindarajulu, C., and Reithel, B.J. Beyond the information center: An instrument to measure end-user computing support from multiple sources. Information & Management, 33, 5 (May 1998), 241–250.
45. Grover, V., and Davenport, T.H. General perspectives on knowledge management: Fostering a research agenda. Journal of Management Information Systems, 18, 1 (Summer 2001), 5–22.
46. Grover, V.; Jeong, S.R.; and Segars, A.H. Information systems effectiveness: The construct space and patterns of application. Information & Management, 31, 4 (December 1996), 177–191.
47. Guimaraes, T., and Igbaria, M. Exploring the relationship between IC success and company performance. Information & Management, 26, 3 (March 1994), 133–141.
48. Hair, J.F.J.; Anderson, R.E.; Tatham, R.L.; and Black, W.C. Multivariate Data Analysis, 5th ed. Upper Saddle River, NJ: Prentice Hall, 1998.
49. Hartog, C., and Herbert, M. 1985 opinion survey of MIS managers: Key issues. MIS Quarterly, 10, 4 (December 1986), 351–361.
50. Hattie, J. Methodology review: Assessing unidimensionality of tests and items. Applied Psychological Measurement, 9, 2 (June 1985), 139–164.
51. Hitt, L.M.; Wu, D.J.; and Zhou, X. Investment in enterprise resource planning: Business impact and productivity measures. Journal of Management Information Systems, 19, 1 (Summer 2002), 71–98.
52. Jiang, J.J.; Klein, G.; and Carr, C.L. Measuring information system service quality: SERVQUAL from the other side. MIS Quarterly, 26, 2 (June 2002), 145–166.
53. Jöreskog, K.G. Testing structural equation models. In K.A. Bollen and J.S. Long (eds.), Testing Structural Equation Models. Newbury Park, CA: Sage, 1993, pp. 294–316.
54. Kerlinger, F.N. Foundations of Behavioral Research. New York: McGraw-Hill, 1978.
55. Kim, J.; Lee, J.; Han, K.; and Lee, M. Businesses as buildings: Metrics for the architectural quality of Internet businesses. Information Systems Research, 13, 3 (September 2002), 239–254.
56. King, W.R. Management information systems. In H. Bidgoli (ed.), Encyclopedia of Management Information Systems, vol. 3. New York: Academic Press, 2003.
57. King, W.R., and He, J. External validity, coverage and nonresponse errors in IS survey research. Katz Graduate School of Business, University of Pittsburgh, 2004.
58. Kohli, R., and Devaraj, S. Measuring information technology payoff: A meta-analysis of structural variables in firm-level empirical research. Information Systems Research, 14, 2 (June 2003), 127–145.
59. Kraemer, K.L.; Danziger, J.N.; Dunkle, D.E.; and King, J.L. The usefulness of computer-based information to public managers. MIS Quarterly, 17, 2 (June 1993), 129–148.
60. Kraut, R.; Dumais, S.; and Koch, S. Computerization, productivity, and quality of work-life. Communications of the ACM, 32, 2 (February 1989), 220–238.
61. Lacity, M., and Willcocks, L. Global Information Technology Outsourcing. Chichester, UK: Wiley, 2001.
62. Lambert, D.M., and Harrington, T.C. Measuring nonresponse bias in customer service mail surveys. Journal of Business Logistics, 11, 2 (1990), 5–25.
63. Larsen, K.R.T. A taxonomy of antecedents of information systems success: Variable analysis studies. Journal of Management Information Systems, 20, 2 (Fall 2003), 169–246.
64. Lee, H. Knowledge management enablers, processes, and organizational performance: An integrative view and empirical examination. Journal of Management Information Systems, 20, 1 (Summer 2003), 179–228.
65. Lee, H.; Kwak, W.; and Han, I. Developing a business performance evaluation system: An analytical hierarchical model. Engineering Economist, 40, 4 (Summer 1995), 343–357.
66. Magal, S.R.; Carr, H.H.; and Watson, H.J. Critical success factors for information center managers. MIS Quarterly, 12, 3 (September 1988), 413–425.
67. Mirani, R., and King, W.R. The development of a measure for end-user computing support. Decision Sciences, 25, 4 (July–August 1994), 481–498.
68. Moore, G.C., and Benbasat, I. Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2, 3 (September 1991), 192–222.
69. Myers, B.L.; Kappelman, L.A.; and Prybutok, V.R. A comprehensive model for assessing the quality and productivity of the information systems function: Toward a theory for information systems assessment. In E.J. Garrity and G.L. Sanders (eds.), Information Systems Success Measurement. Hershey, PA: Idea Group, 1998, pp. 94–121.
70. Nelson, K.M., and Cooprider, J.G. The contribution of shared knowledge to IS group performance. MIS Quarterly, 20, 4 (December 1996), 409–432.
71. Nelson, R.R., and Cheney, P.H. Training end users: An exploratory study. MIS Quarterly, 11, 4 (December 1987), 547–559.
72. Niederman, F.; Brancheau, J.C.; and Wetherbe, J.C. Information systems management issues for the 1990s. MIS Quarterly, 15, 4 (December 1991), 475–500.
73. Olian, J.D., and Rynes, S.L. Making total quality work: Aligning organizational processes, performance measures, and stakeholders. Human Resource Management, 30, 3 (Fall 1991), 303–333.
74. Parasuraman, A.; Zeithaml, V.A.; and Berry, L.L. Refinement and reassessment of the SERVQUAL scale. Journal of Retailing, 64, 4 (Winter 1991), 420–450.
75. Pitt, L.F.; Watson, R.T.; and Kavan, C.B. Service quality: A measure of information systems effectiveness. MIS Quarterly, 19, 2 (June 1995), 173–185.
76. Porter, M.E., and Millar, V.E. How information gives you competitive advantage. Harvard Business Review, 63, 4 (July–August 1985), 149–160.
77. Premkumar, G. Evaluation of Strategic Information Systems Planning: Empirical Validation of a Conceptual Model. Pittsburgh: University of Pittsburgh, 1989.
78. Raghunathan, B.; Raghunathan, T.S.; and Tu, Q. Dimensionality of the strategic grid framework: The construct and its measurement. Information Systems Research, 10, 4 (December 1999), 343–355.
79. Rai, A.; Lang, S.S.; and Welker, R.B. Assessing the validity of IS success models: An empirical test and theoretical analysis. Information Systems Research, 13, 1 (March 2002), 50–69.
80. Ryan, S.D., and Harrison, D.A. Considering social subsystem costs and benefits in information technology investment decisions: A view from the field on anticipated payoffs. Journal of Management Information Systems, 16, 4 (Spring 2000), 11–40.
81. Ryker, R., and Nath, R. An empirical examination of the impact of computer information systems on users. Information & Management, 29, 4 (1995), 207–214.
82. Saarinen, T. An expanded instrument for evaluating information system success. Information & Management, 31, 2 (1996), 103–118.
83. Santhanam, R., and Hartono, E. Issues in linking information technology capability to firm performance. MIS Quarterly, 27, 1 (March 2003), 125–154.
84. Saunders, C.S., and Jones, J.W. Measuring performance of the information systems function. Journal of Management Information Systems, 8, 4 (Spring 1992), 63–82.
85. Schwab, D.P. Construct validation in organizational behavior. In B.M. Staw and L.L. Cummings (eds.), Research in Organizational Behavior, vol. 2. Greenwich, CT: JAI Press, 1980, pp. 3–43.
86. Scott, J.E. Facilitating interorganizational learning with information technology. Journal of Management Information Systems, 17, 2 (Fall 2000), 81–114.
87. Seddon, P.B. A respecification and extension of the DeLone and McLean model of IS success. Information Systems Research, 8, 3 (September 1997), 240–253.
88. Seddon, P.B.; Staples, S.; Patnayakuni, R.; and Bowtell, M. Dimensions of information systems success. Communications of the AIS, 2 (November 1999), 2–39.
89. Segars, A.H. Assessing the unidimensionality of measurement: A paradigm and illustration within the context of information systems research. Omega, 25, 1 (February 1997), 107–121.
90. Segars, A.H., and Grover, V. Re-examining perceived ease of use and usefulness: A confirmatory factor analysis. MIS Quarterly, 17, 4 (December 1993), 517–525.
91. Segars, A.H., and Grover, V. Strategic information systems planning success: An investigation of the construct and its measurement. MIS Quarterly, 22, 2 (June 1998), 139–163.
92. Segars, A.H., and Hendrickson, A.R. Value, knowledge, and the human equation: Evolution of the information technology function in modern organizations. Journal of Labor Research, 21, 3 (Summer 2000), 431–445.
93. Sethi, V., and King, W.R. Development of measures to assess the extent to which an information technology application provides competitive advantage. Management Science, 40, 12 (December 1994), 1601–1627.
94. Sethi, V., and King, W.R. Organizational Transformation Through Business Process Re-Engineering. Upper Saddle River, NJ: Prentice Hall, 1998.
95. Steers, R.M. Problems in the measurement of organizational effectiveness. Administrative Science Quarterly, 20, 4 (December 1975), 546–558.
96. Straub, D.W.; Hoffman, D.L.; Weber, B.W.; and Steinfield, C. Measuring e-commerce in Net-enabled organizations: An introduction to the special issue. Information Systems Research, 13, 2 (June 2002), 115–124.
97. Sussman, S.W., and Siegal, W.S. Informational influence in organizations: An integrated approach to knowledge adoption. Information Systems Research, 14, 1 (March 2003), 47–65.
98. Tallon, P.P.; Kraemer, K.L.; and Gurbaxani, V. Executives' perceptions of the business value of information technology: A process-oriented approach. Journal of Management Information Systems, 16, 4 (Spring 2000), 145–173.
99. Tam, K.Y. The impact of information technology investment on firm performance and evaluation: Evidence from newly industrialized economies. Information Systems Research, 9, 1 (March 1998), 85–98.
100. Templeton, G.F.; Lewis, B.R.; and Snyder, C.A. Development of a measure for the organizational learning construct. Journal of Management Information Systems, 19, 2 (Fall 2002), 175–218.
101. Teo, T.S.H. Integration between business planning and information systems planning: Evolutionary-contingency perspectives. Ph.D. dissertation, University of Pittsburgh, 1994.
102. Torkzadeh, G., and Dhillon, G. Measuring factors that influence the success of Internet commerce. Information Systems Research, 13, 2 (June 2002), 187–204.
103. Torkzadeh, G., and Doll, W.J. The development of a tool for measuring the perceived impact of information technology on work. Omega, International Journal of Management Science, 27, 3 (June 1999), 327–339.
104. Van Dyke, T.P.; Kappelman, L.A.; and Prybutok, V.R. Measuring information systems service quality: Concerns on the use of the SERVQUAL questionnaire. MIS Quarterly, 21, 2 (June 1997), 195–208.
105. Venkatesh, V.; Morris, M.G.; Davis, G.B.; and Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 3 (September 2003), 425–478.
106. Venkatraman, N. Strategic orientation of business enterprises: The construct, dimensionality, and measurement. Management Science, 35, 8 (August 1989), 942–962.
107. Venkatraman, N., and Ramanujam, V. Measurement of business economic performance: An examination of method convergence. Journal of Management, 13, 1 (1987), 109–122.
108. Wand, Y., and Wang, R.Y. Anchoring data quality dimensions in ontological foundations. Communications of the ACM, 39, 11 (November 1996), 86–95.
109. Wang, R.Y., and Strong, D.M. Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12, 4 (Spring 1996), 5–34.
110. Watson, R.T.; Pitt, L.F.; and Kavan, C.B. Measuring information systems service quality: Lessons from two longitudinal case studies. MIS Quarterly, 22, 1 (March 1998), 61–79.
111. Weill, P. The relationship between investment in information technology and firm performance: A study of the valve manufacturing sector. Information Systems Research, 3, 4 (December 1992), 307–333.
112. Wells, C.E. Evaluation of the MIS function in an organization: An exploration of the measurement problem. Ph.D. dissertation, University of Minnesota, Minneapolis, 1987.
113. Xia, W. Dynamic capabilities and organizational impact of IT infrastructure: A research framework and empirical investigation. Ph.D. dissertation, University of Pittsburgh, 1998.
The extent that the information is:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Interpretable
Understandable
Complete
Clear
Concise
Accurate
Secure
Important
Relevant
Usable
Well organized
Well defined
Available
Accessible
Up-to-date
Received in a timely manner
Reliable
Verifiable
Believable
Unbiased
The extent that the information:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Can be easily compared to past information.
Can be easily maintained.
Can be easily changed.
Can be easily integrated.
Can be easily updated.
Can be used for multiple purposes.
Meets all your requirements.
The following statements ask you to assess the outcome of using the information that IS provides to you.
The extent that:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
The amount of information is adequate.
It is easy to identify errors in information.
It helps you discover new opportunities to serve customers.
It is useful for defining problems.
It is useful for making decisions.
It improves your efficiency.
It improves your effectiveness.
It gives your company a competitive edge.
It is useful for identifying problems.
IS Service Performance
The following statements ask you to assess the performance of services provided by the IS department or function. Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle the number 0.
The extent that the:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Training programs offered by the IS function are useful.
Variety of training programs offered by the IS function is sufficient.
IS function's services are cost-effective.
Training programs offered by the IS function are cost-effective.
IS function's services are valuable.
IS function's services are helpful.

The extent that the IS function:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Responds to your service requests in a timely manner.
Completes its services in a timely manner.
Is dependable in providing services.
Has your best interest at heart.
Gives you individual attention.
Has sufficient capacity to serve all its users.
Can provide emergency services.
Provides a sufficient variety of services.
Has sufficient people to provide services.
Extends its systems/services to your customers/suppliers.
The extent that IS people:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Provide services for you promptly.
Are dependable.
Are efficient in performing their services.
Are effective in performing their services.
Have the knowledge and skill to do their job well.
Are reliable.
Are polite.
Are sincere.
Show respect to you.
Are pleasant to work with.
Instill confidence in you.
Are helpful to you.
Solve your problems as if they were their own.
Understand your specific needs.
Are willing to help you.
Help to make you a more knowledgeable computer user.
Systems Performance
The following statements ask you to assess the extent that systems produce various outcomes for you. The term "systems" does not refer to the information itself. Rather, it refers to the capability to access, produce, manipulate, and present information to you (e.g., to access databases, or to develop a spreadsheet). Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle 0.
The extent that systems:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Make it easier to do your job.
Improve your job performance.
Improve your decisions.
The following statements ask you to assess general characteristics of the information systems that you use regularly. Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle 0.
The extent that:
(Circle one: 1 = Hardly at all, 5 = To a great extent, 0 = N/A)
Systems have fast response time.
System downtime is minimal.
Systems are well integrated.
Systems are reliable.
Systems are accessible.
Systems meet your expectations.
Systems are cost-effective.
Systems are responsive to meet your changing needs.
Systems are flexible.
Systems are easy to use.
System use is easy to learn.
Your company's intranet is easy to navigate.
It is easy to become skillful in using systems.
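Throughout the instrument, items are rated on a 1–5 scale, with 0 reserved for "not applicable." When tabulating responses, the 0s must be treated as missing rather than averaged in, or subscale means will be artificially deflated. A minimal sketch of that scoring rule (the function name and example ratings are illustrative, not part of the original instrument):

```python
def scale_mean(responses):
    """Mean of 1-5 Likert responses, treating 0 (N/A) as missing."""
    valid = [r for r in responses if 1 <= r <= 5]
    return sum(valid) / len(valid) if valid else None

# One respondent's hypothetical ratings for the three "systems outcomes"
# items; the 0 marks an item the respondent judged not applicable.
ratings = [4, 5, 0]
print(scale_mean(ratings))  # 4.5: the N/A item is excluded, not counted as zero
```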