Human tutors are usually good teachers — but what if a computer program could do the same thing, right down to identifying students’ emotions and boredom levels?

Researchers from the University of Notre Dame, the University of Memphis and the Massachusetts Institute of Technology have collaborated on programs they call "AutoTutor" and "Affective AutoTutor," which not only handle the standard tutoring tasks, like assessing levels of knowledge and responding to questions, but also read facial expressions and body posture to sense a student's frustration or boredom.

What’s more, if such emotions are detected, the programs dynamically change their strategies to help students overcome those feelings so they can learn more effectively.

“[Unlike computers], humans have always communicated with each other through speech and a host of nonverbal cues such as facial expressions, eye contact, posture and gesture,” said Dr. Sidney D’Mello, a Notre Dame Assistant Professor of Psychology.

“Much like a gifted human tutor,” he added, “AutoTutor and Affective AutoTutor attempt to keep the student balanced between the extremes of boredom and bewilderment by subtly modulating the pace, direction and complexity of the learning task.”

No word on when the programs will be available to the general public, but they’re currently being used to help teach complex technical content in Newtonian physics, computer literacy and critical thinking — and in tests with 1,000 students, they’ve resulted in gains of about one letter grade.
