A tool that engages student teachers in
learning and practicing classroom instruction techniques


Student teachers face their first day of teaching with little or no experience leading a classroom. LIZA is an online tool for practicing discourse between a live instructor and a simulated student, providing a risk-free environment in which to hone specific pedagogical and behavioral practices.

The instructor talks to the virtual student by voice and engages in a discourse aimed towards particular pedagogical or behavioral goals. The virtual student responds to the instructor in the manner an actual student might and helps the instructor practice techniques in an immersive manner.

The course of the conversation is controlled by AI technology that matches the instructor’s words and overall strategy with exemplary practices previously trained on similar situations.


The use of simulated environments to help people practice skills that are impractical to rehearse in the real world has a long history in the aviation industry and the military, but the affective nature of face-to-face instruction is more difficult and nuanced to reproduce than a physical environment such as an airplane. A number of approaches have been demonstrated, from simSchool, which uses AI to model students’ “emotional intelligence,” to the highly immersive but expensive Mursion system originally developed by the University of Central Florida.

LIZA takes a different approach to the problem by using less costly but still immersive technology, coupled with powerful machine-learning algorithms trained to target very specific pedagogical goals commonly found in fields like reading, mathematics, and special education.


The idea of a minimalist application for exploring complex human interactions is not new. In 1966, MIT computer scientist Joseph Weizenbaum created the Eliza program, which used the simple technology of the time to simulate the discourse between a Rogerian-style psychotherapist and their patient. The “patient” would type to the program, and the program would respond by simple word matching with vague responses such as “How did that make you feel?” and “Tell me more about that.” Weizenbaum was horrified by how much comfort people found in talking with Eliza, but the episode highlights how little realism people need to suspend their disbelief and become affectively engaged with the simplest of interactions.

How it works

The LIZA author constructs a scenario that outlines the techniques or behaviors they want the instructor to practice. This scenario contains a finite set of possible student responses that are delivered orally. The instructor speaks to the virtual student using a microphone, and a natural language processing component in LIZA converts the speech into text.
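To make the scenario idea concrete, here is a minimal sketch of how an authored scenario might be represented and validated: a stated goal plus a finite set of scripted student responses. The field names, the validation rules, and the sample data are all assumptions for illustration, not LIZA's actual file format.

```python
# Hypothetical scenario representation (illustrative data only).
SCENARIO = {
    "goal": "redirect off-task behavior without escalating",
    "responses": [
        "I wasn't doing anything!",
        "Fine, whatever.",
        "Okay, sorry, I'll get back to work.",
    ],
}

def load_scenario(data):
    """Validate an authored scenario: it must state a goal and provide
    a non-empty, finite set of possible student responses."""
    if not data.get("goal"):
        raise ValueError("scenario must state a pedagogical or behavioral goal")
    responses = data.get("responses", [])
    if not responses:
        raise ValueError("scenario must include at least one student response")
    return {"goal": data["goal"], "responses": list(responses)}
```

Keeping the response set finite is what lets the matching engine choose among known, authored outcomes rather than generating open-ended text.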

AI engine

LIZA’s machine-learning system is trained in advance by qualified teachers, who link many instructors’ prompts to the student responses those prompts typically elicit in the real world for a very specific goal. The instructor, in turn, responds to the virtual student until feedback indicates the simulation’s instructional goal has been achieved. The instructor receives significant real-time feedback on the effectiveness of their teaching.

3D environment

A minimalist rendering of the virtual student lets the instructor concentrate on the goals without getting swept up in the personification of a specific student, potentially making the practice more generalizable to the real world. Scott McCloud deftly described this process as “amplification through simplification,” a long-recognized principle in the use of images in instruction, where extraneous elements are eliminated in order to focus attention on the primary goal.

Student responses

The responses are coded to reflect particular affects, such as anger, boredom, or excitement, and the virtual student is animated to reflect that affect in order to help the instructor better discriminate among the responses.
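An affect-coded response might look like the sketch below. The affect labels, the class fields, and the animation-naming convention are assumptions made for illustration; the source does not specify LIZA's coding scheme.

```python
from dataclasses import dataclass

# Hypothetical set of affect codes (the actual scheme is not specified).
AFFECTS = {"anger", "boredom", "excitement", "confusion"}

@dataclass
class StudentResponse:
    """A scripted student response tagged with the affect it conveys;
    the affect drives the virtual student's animation."""
    text: str
    affect: str

    def __post_init__(self):
        if self.affect not in AFFECTS:
            raise ValueError(f"unknown affect: {self.affect}")

def animation_for(response):
    """Map an affect code to a (hypothetical) animation clip name."""
    return f"anim_{response.affect}"
```

Tagging each response with a single affect keeps the mapping from dialogue to animation simple and auditable by scenario authors.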


Click here for a short screencast

You can kick the app's tires by clicking this link

LIZA is produced by StageTools, an Emmy Award-winning maker of tools for filmmakers and educators since 1995, and is funded by a grant from the Chan Zuckerberg Initiative to Harvard University.

For more information

Bill Ferster
+1 540-592-7001