An Input-Process-Output Structural Framework
for Evaluating Web-based Instruction

Chong Ho Yu, Ph.D.


Various evaluation methodologies have been proposed for assessing the impact of Web-based instruction (WBI) on learners (Svanum, Chen, & Bublitz, 1997; Swigger, Brazile, Lopez, & Livingston, 1997). Because the medium is relatively new and the scope of WBI remains vaguely defined, there is no unified theoretical framework for WBI evaluation. To fill this gap, a structural framework with an emphasis on input-process-output is proposed. The framework also serves as a reference for conceptualizing an evaluation from the abstract level to the specific level.

The input-process-output structural framework (see Figure 1) is a specification of how different input, intermediate, and output variables form causal relationships in a system. The structural framework has four levels of abstraction: paradigm, theory, model, and measurement.


The highest layer of the framework carries the highest degree of abstraction, and the degree of specification increases as the layers descend. A theory is an implementation of a paradigm, a model is a specification of a theory, and a measurement is the quantification, or empirical representation, of a model. Implementations of the structural framework may vary at the theory, model, and measurement levels; however, the emphasis on input-process-output is firmly embedded at the paradigm level.


In this writing, the meaning of "paradigm" is adopted from the usage by Thomas Kuhn (1962). In Kuhn's view, a paradigm is a set of metaphysical beliefs and underlying assumptions that make up a theoretical framework. In other words, a paradigm establishes the "boundary of the playground" and "rules of the game" for researchers in the discipline.

The proposed structural framework endorses a paradigm that emphasizes input-process-output of human learning inspired by cognitive psychology. This paradigm challenges the input-output emphasis under a "black-box" approach.

Under a "black box" approach, an input, which is usually a treatment, is delivered to the intended audience. Afterwards, the output, which is usually an observable outcome, is evaluated and the effectiveness of the treatment is inferred. This input-output approach leaves many important questions unanswered, such as "What are the properties of the instructional media?" "What effects result from these properties?" "How do the learners use the program?" and "How do the learners process the information?"

This "black box" approach could be traced back to Behaviorism and classic learning theory, in which the learner or the subject is treated as a black box. In Behaviorism, only stimulus (input) and responses (output) are emphasized for controlling behaviors. In this system the details of underlying structure, mechanism, and dynamics are either unknown or regarded as unimportant (Fontana, 1984).

With the increasing popularity of cognitive psychology, questions about human mental structure and process are being addressed. For instance, "How is knowledge represented?" "How does an individual acquire new knowledge?" (Kellogg, 1995) Because of the process-orientation of cognitive psychology, educational psychologists pay much attention to learning processes (Anderson, 1990; West, Farmer, & Wolff, 1991). In the context of Web-based instruction, questions about relationships between instructional media and learning processes should be raised. For example, "How does hyperlinking enhance connectivity of human schema? How does asynchronous collaboration on the Internet improve intrinsic motivation?"

Approaching from two dimensions

Under the input-process-output structural framework, the procedures of evaluating WBI could be viewed in two dimensions. The first dimension is the level of abstraction. As shown in Figure 1, approaching the research from the abstract layer to the specific layer can be visualized as a vertical process:

  • defining various constructs and concepts in the theory
  • defining variables, which indicate the constructs and concepts, in the model
  • developing or applying instruments to measure the variables in the measurement
  • determining the relationships among the variables based upon the data collected by the instruments

The second dimension is the input-process-output stream. As seen in Figure 1, this streaming can be viewed as a horizontal process:

  • defining the media properties
  • defining the mental constructs and mapping them to the media properties
  • defining the criteria of achievement specifying the behavioral outcomes.

In the following section, each component will be explained in terms of the second dimension.

Media properties

Different instructional designers may have different implementations of Web-based instruction. Inexperienced evaluators may ask an excessively broad question such as "Can Web-based instruction enhance learning?" It is important to define what WBI is by identifying which features are introduced by the implementation of the medium. Rather than assessing the overall WBI program, the evaluation should focus on those specific properties. Typical examples of WBI's properties are:

  • learner control modules that present self-customized sequences to various learners,
  • virtual communities that provide chat rooms, threaded discussions, and ListServ groups,
  • hypermedia with graphical representations and animated illustration of information,
  • hyperlinks to world-wide resources.

Different uses of those resources may contribute to different learning processes and outcomes. There are two ways to define variables and to develop instruments for the above media properties. The first way is to create another version of the treatment without these media properties and randomly assign users to this control group. In this fashion, group membership is a variable. Another way is to treat the tools that carry specific media properties as numeric variables and track the frequency of tool use. For example, the number of chat, threaded discussion, and ListServ messages could indicate the utilization of virtual communities. The amount of time spent and the number of pages accessed within the WBI could reflect user engagement. Several web traffic tracking technologies can be used to collect data for studying user patterns on WBI (Yu, Jannasch-Pennell, DiGangi, & Wasson, 1998).
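To make the second approach concrete, the sketch below derives two of the engagement variables mentioned above, pages accessed and time spent, from access-log records. The log format, field names, and data here are invented for illustration; a real WBI server log would need its own parser.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical access-log records: (user, timestamp, page).
# The format is illustrative, not a real server-log standard.
log = [
    ("alice", "1998-03-01 09:00:00", "/wbi/lesson1"),
    ("alice", "1998-03-01 09:05:00", "/wbi/chat"),
    ("alice", "1998-03-01 09:20:00", "/wbi/lesson2"),
    ("bob",   "1998-03-01 10:00:00", "/wbi/lesson1"),
    ("bob",   "1998-03-01 10:02:00", "/wbi/discussion"),
]

def usage_variables(records):
    """Derive per-user engagement variables: pages accessed and minutes spent."""
    page_counts = defaultdict(int)
    timestamps = defaultdict(list)
    for user, stamp, page in records:
        page_counts[user] += 1
        timestamps[user].append(datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S"))
    # Time spent = span between a user's first and last recorded access.
    minutes = {u: (max(ts) - min(ts)).total_seconds() / 60
               for u, ts in timestamps.items()}
    return dict(page_counts), minutes

pages, minutes = usage_variables(log)
```

Each user's page count and session length then become numeric variables in the model, alongside counts of chat, threaded-discussion, and ListServ messages collected the same way.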

Mental constructs

It is hypothesized that different media properties could change different aspects of user mentality. For instance, adaptive learning could lead to a positive attitude toward self-directed learning, virtual communities could strengthen the sense of collaboration, hypermedia could enrich visually-oriented cognitive processes, and hyperlinks to world-wide resources could enhance cross-cultural awareness. A mental construct can be broken down into several variables for in-depth investigation. For instance, self-directed learning readiness can be defined by eight factors: openness to learning opportunities, self-concept as an effective learner, initiative and independence in learning, informed acceptance of responsibility for one's own learning, a love of learning, creativity, future orientation, and the ability to use basic study and problem-solving skills (Guglielmino, 1977; Guglielmino & Murdick, 1997). These variables could be measured by validated and reliable instruments on attitudes and cognitive styles.

However, using standardized instruments is not without flaws. Usually the variables said to be latent dimensions of a broader construct are extracted by factor analysis, and Kelley (1940) warned that variables resulting from factor analysis are not timeless, spaceless, populationless truths. Thus, it is recommended that evaluators use standardized inventories as a reference for developing instruments adapted to their local treatment and audience.
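When a standardized inventory is adapted to a local treatment and audience, its reliability should be re-examined on the local sample rather than assumed. A minimal sketch of one common internal-consistency check, Cronbach's alpha, is shown below; the responses are invented for illustration.

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Cronbach's alpha for internal-consistency reliability.

    rows: one list of item scores per respondent (same items, same order).
    """
    k = len(rows[0])                 # number of items in the scale
    items = list(zip(*rows))         # transpose: scores grouped per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_var / total_var)

# Invented data: 4 respondents answering a 3-item attitude scale.
responses = [
    [1, 1, 1],
    [2, 2, 2],
    [3, 3, 3],
    [4, 4, 4],
]
alpha = cronbach_alpha(responses)
```

In this contrived example the items are perfectly consistent, so alpha equals 1.0; on real data, a value around 0.7 or higher is a commonly cited rule of thumb for acceptable internal consistency.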

Criteria of achievement

It is expected that the preceding attitudinal and cognitive changes could result in improvement in Internet knowledge and skills, which may include the knowledge and skills of using web search, e-mail, newsgroups, threaded discussion, and FTP. Usually the criteria of achievement are tied to specific instructional content, and therefore the instrument is locally developed. The strengths and weaknesses of locally-developed tests are discussed by Kubiszyn and Borich (1993).

Behavioral outcome

Finally, it is believed that test scores of Internet skills could predict the application of those skills as reflected by observable behaviors. For instance, if the participants of a WBI program are K-12 teachers, then incorporation of Internet technologies into teaching is expected. The variables may be the number of Web pages developed for class, the frequency of using the Internet as a source of teaching materials, and the frequency of using web-based communication among peers. These variables could be measured by survey, interview, and on-site observation.


Evaluation of learning is often limited by a "black-box" approach. However, there are many variables between the input and the output, such as user patterns of the treatment and attitudinal changes that precede behavioral changes. Further, certain methodologies subsumed by a paradigm may go unquestioned by evaluators. To remedy these shortcomings, the proposed structural framework attempts to provide a guideline for evaluators to approach a research question through levels of abstraction and the stream of input-process-output.


References

Anderson, J. R. (1990). Cognitive psychology and its implications. New York: W. H. Freeman.

Fontana, D. (Ed.). (1984). Behaviorism and learning theory in education. Edinburgh: Academic Press.

Guglielmino, L. M. (1977). Development of the Self-directed Learning Readiness Scale. Unpublished doctoral dissertation, University of Georgia, Athens, GA.

Guglielmino, P. J., & Murdick, R. G. (1997). Self-directed learning: The quiet revolution in corporate training and development. Advanced Management Journal, 62, 10-18.

Kelley, T. L. (1940). Comment on Wilson and Worcester's Note on Factor Analysis. Psychometrika, 5, 117-120.

Kellogg, R. T. (1995). Cognitive psychology. Thousand Oaks: Sage Publications.

Kubiszyn, T., & Borich, G. (1993). Educational testing and measurement: Classroom application and practice. New York: Harper Collins College Publishers.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.

Svanum, S., Chen, S. H., & Bublitz, S. (1997). Internet-based instruction of the principle of base rate and prediction: A demonstration project. Behavior Research Methods, Instruments, & Computers, 29, 228-231.

Swigger, K. M., Brazile, R., Lopez, V., & Livingston, A. (1997). The virtual collaborative university. Computers Education, 29, 55-61.

West, C. K., Farmer, J. A., & Wolff, P. M. (1991). Instructional design: Implications from cognitive sciences. Englewood Cliffs, New Jersey: Prentice Hall.

Yu, C. H., Jannasch-Pennell, A., DiGangi, S., & Wasson, B. (1998). Using on-line interactive statistics for evaluating Web-based instruction. Journal of Educational Media International, 35, 157-161.


