Learner-Centered Software Design to Support Students Building Models

Shari L. Jackson, Steven J. Stratford, Joseph S. Krajcik, Elliot Soloway

University of Michigan, 1101 Beal Ave., Ann Arbor, MI 48109

Paper proposal submitted to AERA 1995.

 

Introduction and Objectives: The goal of user-centered software design is to make computers easier to use, thus permitting the user to focus on using the technology to perform various tasks. For education, the challenge is learner-centered software design: supporting the user in learning about the task and developing expertise and understanding. We design software to be learner-centered by using the TILT model (Tools, Interfaces, Learner's needs, Tasks), which puts the learner at the center of software design issues and provides software-based scaffolding to meet the learner's needs (Soloway et al., 1994).

We have applied learner-centered principles to the design of a learning environment in the domain of scientific modeling and simulation. Model building is an important activity for developing understanding of scientific systems; however, current technology does not provide learners sufficiently accessible support for creating models. Therefore, we have developed the Modeler as a general-purpose tool to support students' model construction and evaluation tasks. With the Modeler, students can construct dynamic models quickly and easily, and run simulations based on their models to verify and analyze the results. The research described here focuses on the application of the learner-centered TILT model to the design of the Modeler software, and the impact of that design on the modeling activities of students in the domain of stream ecosystems.

Theoretical Framework: Scientists build models to test theories and to develop a better understanding of complex systems (Kreutzer, 1986). Similarly, we believe that by building models, students can construct their understanding of natural phenomena. This approach is consistent with current conceptions of learning in which students actively construct understanding by creating external representations of their knowledge (Papert, 1990; Perkins, 1986). Thus, by constructing external representations of scientific phenomena, learners build an internal, mental model of the phenomena.

One way in which students may construct these external representations is through the use of computer-based modeling tools. The modeling tools available to students fall into two categories: pre-defined simulations and modeling environments. Although pre-defined simulations such as Maxis' SimEarth and Wings for Learning's Explorer are user-friendly and provide a great deal of depth in their pre-programmed domains, they do not provide access to the underlying functions and representations that drive the simulation, nor the ability to add or change functionality. On the other hand, modeling environments, such as High Performance Systems' Stella or Knowledge Revolution's Working Model, allow unlimited flexibility in building models, but they are difficult to learn; building complex models requires a substantial commitment of time and effort to master a complex authoring language (Tinker, 1990). Thus, today's modeling tools inadequately address the needs of learners.

We designed the Modeler to provide learners with the combined flexibility and ease of use necessary in a computer-based learning tool. In particular, we designed the Modeler with scaffolding to address learners' needs regarding software Tasks, Tools, and Interfaces:

Tasks Learners need support for learning and understanding the task; the Modeler provides this support by constraining the complexity of the tasks involved in building models. To build a model, students select from a set of high-level objects, define the factors (measurable quantities or characteristics associated with each object), and define the relationships between the factors. For example, in our target domain of stream ecosystem modeling, students might start with the stream object, define its factors “phosphate” and “quality,” and then define the relationship between them, all in a matter of minutes.
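The structure a student assembles here is essentially a small set of objects, their factors, and the relationships among those factors. The following Python sketch is only an illustration of that structure, using hypothetical names; it is not the Modeler's implementation.

from dataclasses import dataclass, field

@dataclass
class Factor:
    name: str           # e.g., "phosphate" or "quality"
    value: float = 0.0  # current level during a simulation

@dataclass
class ModelObject:
    name: str                                         # a high-level object, e.g., "stream"
    factors: dict[str, Factor] = field(default_factory=dict)

@dataclass
class Relationship:
    source: Factor    # the factor that drives the change
    target: Factor    # the factor that responds
    description: str  # e.g., "as phosphate increases, quality decreases"

# A student's model is a collection of objects plus the relationships among their factors.
stream = ModelObject("stream")
stream.factors["phosphate"] = Factor("phosphate", 0.2)
stream.factors["quality"] = Factor("quality", 0.8)

model = {
    "objects": [stream],
    "relationships": [
        Relationship(stream.factors["phosphate"], stream.factors["quality"],
                     "as phosphate increases, quality decreases by less and less"),
    ],
}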

Tools Learners need tools that adapt to their level of expertise, so the Modeler provides a range of ways to define relationships. Initially, relationships can be defined qualitatively by selecting descriptors in a sentence, e.g., “As /stream /phosphate /increases, /stream /quality /decreases by /less and less” (Figure 1). As students' skills increase and their needs grow more sophisticated, they have the option of defining the relationship more quantitatively by entering data points into a table.
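One plausible way to read such a descriptor sentence is as the choice of a simple curve shape, with the data-point table as the more quantitative alternative. The sketch below illustrates that reading; the descriptor vocabulary, function names, and curve shapes are our assumptions, not the Modeler's internals.

import math

# Each (direction, rate) descriptor pair names a normalized shape f: [0, 1] -> [0, 1].
DESCRIPTOR_SHAPES = {
    ("increases", "more and more"): lambda x: x ** 2,
    ("increases", "less and less"): lambda x: math.sqrt(x),
    ("decreases", "more and more"): lambda x: 1 - x ** 2,
    ("decreases", "less and less"): lambda x: 1 - math.sqrt(x),
}

def qualitative_relationship(direction, rate):
    """Return a function giving the target level implied by a source level in [0, 1]."""
    return DESCRIPTOR_SHAPES[(direction, rate)]

def tabular_relationship(points):
    """Linear interpolation over student-entered (source, target) data points."""
    points = sorted(points)
    def f(x):
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return points[0][1] if x < points[0][0] else points[-1][1]
    return f

# "As stream phosphate increases, stream quality decreases by less and less."
quality_from_phosphate = qualitative_relationship("decreases", "less and less")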

Interfaces Learners often need extra motivation to sustain interest in a task, and the visually exciting, dynamic interface of the Modeler can help provide that motivation. Students run simulations to try out experiments using their models, such as exploring the impact of increased phosphate levels on overall stream quality. The objects in a model are displayed using photo-realistic graphics, and during a simulation, graphical meters provide real-time feedback on changing values (Figure 2). In addition, the background graphic is a photograph of the actual stream the students studied. By using a photo of their stream, we expect to make the task more concrete and authentic; such meaningful, personal tasks are more motivating for students (Blumenfeld et al., 1991).
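Running a simulation in this sense amounts to repeatedly propagating values through the defined relationships and updating the meters. The loop below is a simplified sketch of that idea only; the update rule, tick size, and names are our assumptions, not the Modeler's internals.

def run_simulation(factors, relationships, ticks=5):
    """factors: {name: value in [0, 1]}; relationships: [(source, target, f)]."""
    for t in range(ticks):
        # The "experiment": the student raises the phosphate level a little each tick.
        factors["phosphate"] = min(1.0, factors["phosphate"] + 0.1)
        # Propagate each relationship from its source factor to its target factor.
        updates = {target: f(factors[source]) for source, target, f in relationships}
        factors.update(updates)
        # Text readout standing in for the graphical meters.
        readout = "  ".join(f"{name}: {value:.2f}" for name, value in factors.items())
        print(f"t={t}  {readout}")

factors = {"phosphate": 0.2, "quality": 0.8}
relationships = [("phosphate", "quality", lambda p: 1 - p)]  # more phosphate, lower quality
run_simulation(factors, relationships)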

Figure 1: Defining relationships

Figure 2: Running simulations

Methods: We are working with science teachers to develop, pilot, and assess a high school science curriculum emphasizing project-based activities (Blumenfeld et al., 1991), routine use of computing technology, and collaboration. The pilot class contained 22 students, who were given everyday access to Macintosh color PowerBooks, commercial and research software, and various multimedia equipment. For one project, students spent several months investigating ecosystems; specifically, the ecosystem of a stream that runs behind their school. They collected a variety of biological, physical, and chemical data to determine the quality of the water. At the end of the school year, as the culminating activity of the project, they used the Modeler to build computer-based models of the ecosystem they had studied.

Students worked with the Modeler in groups of two over a period of one week. For the first three days, they used a printed guide (along with the program) designed to introduce them to modeling, teach them how to use the program, and help them design and test some simple models. Students were encouraged to predict, revise, evaluate, and elaborate as they learned to construct their models. On the fourth day, they were assigned an open-ended modeling task, in which they created their own models to represent one of three choices of ecological phenomena (for example, the impact of a pollution source on populations of stream macroinvertebrates). Teachers and researchers were available to give advice and to answer questions. On the fifth day, a researcher led the class in a discussion about the models they had constructed, giving the students an opportunity to describe and discuss their models with the class.

Data Analysis Techniques: Several data sources were used: the models the students constructed; software-generated log files of each group's activities with the program; and audio and video recordings of student conversations and computer activity. We used several different methods for reducing and analyzing the data. First, using the log files and the students' models, we reconstructed each model in concept map form. We evaluated the reconstructed models and compared them with appropriate scientific models, judging qualitatively for accuracy and completeness. Next, we examined patterns in the log files and categorized them according to model-building strategies. To accomplish this task, we classified log events as either building (e.g., defining a factor or relationship) or testing (e.g., running a simulation). We then looked for patterns within groups and compared patterns across groups. For example, we looked to see whether students followed a strategy of building all their factors and relationships first and then testing them as a whole, or an iterative strategy of building a few relationships, testing them, and then repeating the cycle.
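This reduction step can be pictured as a small classification-and-pattern count over the log event stream. The sketch below shows one way to do it; the event names and the cycle-counting heuristic are hypothetical stand-ins for the actual log vocabulary and coding scheme.

# Classify log events as building (B) or testing (T), then count build-to-test
# transitions: more transitions indicate a more iterative strategy.
BUILD_EVENTS = {"add_object", "define_factor", "define_relationship"}
TEST_EVENTS = {"run_simulation"}

def classify(events):
    """Reduce a list of event names to a B/T sequence, ignoring other events."""
    labels = []
    for event in events:
        if event in BUILD_EVENTS:
            labels.append("B")
        elif event in TEST_EVENTS:
            labels.append("T")
    return "".join(labels)

def build_test_cycles(sequence):
    """Count alternations from building to testing."""
    return sum(1 for a, b in zip(sequence, sequence[1:]) if a == "B" and b == "T")

log = ["add_object", "define_factor", "define_relationship", "run_simulation",
       "define_relationship", "run_simulation", "run_simulation"]
sequence = classify(log)              # "BBBTBTT"
cycles = build_test_cycles(sequence)  # 2, suggesting an iterative build/test strategy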

Finally, log file entries were combined with the conversation transcripts in order to match student dialog with the actions they took on the computer. We examined this combined data set for evidence of the impact of the learner-centered design features of the Modeler, such as qualitative expression, model expansion, model assessment, photo-realism, multiple modes of expression, object-oriented expression, and real-time visual feedback. Since the log files contained time stamps, we also reviewed this data to determine the amount of time between the expression of an idea and the implementation of that idea.
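Because both streams are time-stamped, combining them is essentially a merge by time, and the idea-to-implementation interval is the gap between an utterance and the first matching log action. The sketch below illustrates that logic; the record formats and example timestamps are assumptions for illustration only.

from datetime import datetime, timedelta

def merge_by_time(log_entries, utterances):
    """Interleave (timestamp, source, text) records in chronological order."""
    combined = [(t, "LOG", text) for t, text in log_entries] + \
               [(t, "TALK", text) for t, text in utterances]
    return sorted(combined, key=lambda record: record[0])

def idea_to_action_delay(utterance_time, log_entries, action):
    """Time from an idea being voiced to the first matching action in the log."""
    later = [t for t, text in log_entries if text == action and t >= utterance_time]
    return min(later) - utterance_time if later else None

t0 = datetime(1994, 5, 12, 10, 0, 0)
log = [(t0 + timedelta(seconds=90), "define_relationship phosphate->algae")]
talk = [(t0 + timedelta(seconds=30), "phosphate should make the algae grow")]

for timestamp, source, text in merge_by_time(log, talk):
    print(timestamp.time(), source, text)

delay = idea_to_action_delay(talk[0][0], log, "define_relationship phosphate->algae")
# delay is 60 seconds: the gap between voicing the idea and implementing it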

 

Results and Interpretations: Based on our analysis of the students' models, we found that their models were generally accurate and reasonable, but occasionally contained errors of representation. Most groups were able to set up at least three reasonable relationships in their models. One common error was to confuse a population's rate of growth with its count. Examining the summarized log files for student model-building strategies revealed that students who followed a more iterative building/testing approach tended to create more accurate and complete models. The process of testing their models revealed flaws or suggested additional relationships to be constructed.

Looking at the combined log file/transcript data set, we found that students were able to generate and test ideas within a very short period of time. For example, one pair created four interrelated relationships in four minutes, and in the next four minutes tested and verified their model and found another relationship to add. Furthermore, the combined transcripts show that the learner-centered features of the program supported students' model-building tasks, particularly by simplifying the complexity of the task, supporting qualitative definitions of relationships, and guiding students into explaining, expanding, and testing their models. For instance, one student, when testing her model, said, “This phosphate went up, algae went up, bacteria didn't go up though, did we define algae and bacteria relationships?” The interface provided her with immediate feedback that led her to discover an error in the model, which she later fixed. Students also used qualitative expressions to explain their models, for example, “drain pipe discharge will increase the phosphate, which will increase the algae, which will increase the bacteria, which will lower the oxygen, which will lower the stream quality. So if there are a lot of bacteria and a lot of oxygen it will lower the stream quality.”

Educational Significance: With regard to educational software research and development, this study has several important implications. First, the application of the TILT model of learner-centered design shows promise for the study of constructive educational software. The development of this model is particularly important as we continue to consider the ways in which students use software in classroom tasks to construct understanding. Second, our research has shown that students can use the learner-centered Modeler program not only to create models of phenomena they have observed, but to create them with relative ease and speed. Finally, we have seen that the Modeler can engage students in the process of planning and building computer-based models, a task that is usually inaccessible to learners in high school science classrooms. This research has also attempted to apply new technologies to problems of data collection, and these innovative techniques can help both researchers and practitioners observe, describe, and study the use of educational software in the classroom.

References

Blumenfeld, P., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991) “Motivating Project-Based Learning: Sustaining the Doing, Supporting the Learning,” Educational Psychologist, 26(3 & 4), 369-398.

Kreutzer, W. (1986) Systems Simulation: Programming Styles and Languages. Addison-Wesley, Wokingham, England.

Papert, S. (1990) Introduction. In I. Harel (Ed.), Constructionist Learning: A Fifth Anniversary Collection of Papers. MIT Media Laboratory, Cambridge, MA.

Perkins, D. N. (1986) Knowledge as Design. Lawrence Erlbaum Associates, Hillsdale, NJ.

Soloway, E., Guzdial, M., & Hay, K. E. (1994) “Learner-Centered Design: The Challenge for HCI in the 21st Century,” Interactions, 1(2), April, 36-48.

Tinker, R. F. (1990) Teaching Theory Building. Modeling: Instructional Materials and Software for Theory Building, NSF Final Report, TERC.
