CCE Theses and Dissertations

Date of Award

2009

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Information Systems (DISS)

Department

Graduate School of Computer and Information Sciences

Advisor

John Scigliano

Committee Member

William Hafner

Committee Member

Sumitra Mukherjee

Keywords

evaluation methods, evaluation model, information systems management, interpretive evaluation, IS/IT evaluation, IS/IT investments

Abstract

Evaluation is a vital, yet challenging, part of IS/IT management and governance. The benefits (or lack thereof) associated with IS/IT investments have been widely debated within academic and industrial communities alike. Investments in information technology may or may not result in desirable outcomes. Yet, organizations must rely on information systems to remain competitive. Effective evaluation serves as one pathway to ensuring success. However, despite a growing multitude of measures and methods, practitioners continue to struggle with this intractable problem.

Responding to the limited success of existing methods, scholars have argued that academicians should first develop a better understanding of the process of IS/IT evaluation. In addition, scholars have also posited that IS/IT evaluation practice should be tailored to fit a given organization's particular context. Of course, one cannot simply tell practitioners to "be contextual" when conducting evaluations and then hope for improved outcomes. Instead, having developed an improved understanding of the IS/IT evaluation process, researchers should articulate unambiguous guidelines to practitioners.

The researcher addressed this need using a multi-phase research methodology. To start, the researcher conducted a literature review to identify and describe the relevant contextual elements operating in the IS/IT evaluation process: the purpose of conducting the evaluation (why); the subject of the evaluation (what); the specific aspects to be evaluated (which); the particular evaluation methods and techniques used (how); the timing of the evaluation (when); the individuals involved in, or affected by, the evaluation (who); and the environmental conditions under which the organization operates (where). Based upon these findings, the researcher followed a modeling-as-theorizing approach to develop a conceptual model of IS/IT evaluation. Next, the conceptual model was validated by applying it to multiple case studies selected from the extant literature. Once validated, the researcher utilized the model to develop a series of methodological guidelines to aid organizations in conducting evaluations. The researcher summarized these guidelines in the form of a checklist for professional practitioners.

The researcher believes this holistic, conceptual model of IS/IT evaluation serves as an important step in advancing theory. In addition, the researcher's guidelines for conducting IS/IT evaluation based on organizational goals and conditions represent a significant contribution to industrial practice. Thus, the implications of this study come full circle: an improved understanding of evaluation should result in improved evaluation practices.
