Robots may help students with autism learn the art of human interaction
Three Bradley students are experimenting with humanoid robots that may assist children with autism in an educational setting. Dr. Deitra Kuester, assistant professor of special education, and Dr. Chris Nikolopoulos, professor of computer science and information systems, are facilitating the project. Dr. Kevin Finson, professor of education, has also joined the team as a resource. They began the project in spring 2008, creating an innovative, cross-curricular collaborative opportunity for Bradley students.
Dr. Chris Nikolopoulos, left, Dr. Kevin Finson, and Dr. Deitra Kuester hope that robots in the classroom can help reduce the anxiety that more directional teaching styles often create in students with autism.
Autism Spectrum Disorder (ASD) is a developmental disorder involving varying degrees of deficit in socialization, communication, and learning. Individuals with ASD may range from highly functional to highly challenged.
The robots will act as human substitutes in teaching social skills to students with ASD. The ultimate goal is to use the robots to convey a variety of social skills, such as initiating communication, carrying on a reciprocal conversation, and, for individuals with ASD who are nonverbal, expressing wants and needs.
Robot use has gained momentum primarily in therapeutic settings for individuals with ASD. Kuester noted there is room for novel teaching techniques in this field because the current directional teaching style, which relies on human interaction, may contribute to anxiety for many children with ASD.
"The purpose of the project is to reduce the fear and complexity of working with a human to teach [students with ASD] social skills," Kuester says. She hopes this will result in students’ abilities to apply learned social skills to her target environments—school and home.
Kuester and Lauralyn Bogart, a senior learning behavior specialist, elementary education, and early childhood major, will first observe the experimental subjects and discuss desired social skills with their parents and teachers. "These could include teaching them how to greet another student, how to ask a student to come play, and, for older students, [it] could even be how to ask another student to a dance," Bogart says. "The possibilities are endless of what they could teach."
For now, the robots’ behavior will be scripted. They will be programmed with a social story or social script, two common strategies of teaching socially appropriate behavior. The robots will demonstrate a particular social skill in stages.
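A scripted social story like the one described above can be pictured as a fixed sequence of conversational turns that the robots deliver in order. The sketch below is purely illustrative; the script content, speaker names, and `play_script` helper are assumptions for this article, not the team's actual software:

```python
# A hypothetical "greeting" social script: an ordered list of
# (speaker, line) turns that two robots would act out for the student.
GREETING_SCRIPT = [
    ("Robot A", "Hi! My name is Robo. What's your name?"),
    ("Robot B", "Hi, Robo. My name is Buddy."),
    ("Robot A", "Would you like to come play with me?"),
    ("Robot B", "Yes, I would like to play!"),
]

def play_script(script, say=print):
    """Deliver each scripted turn in order, one stage at a time."""
    for stage, (speaker, line) in enumerate(script, start=1):
        # In a real robot, `say` would drive speech output; here it prints.
        say(f"Stage {stage} - {speaker}: {line}")

play_script(GREETING_SCRIPT)
```

Because the behavior is fully deterministic, the same script can be replayed identically in every session, which is part of what makes the approach predictable and low-anxiety for the student.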
Future investigations will attempt to equip robots with “artificial intelligence” characteristics, such as non-deterministic behavior, vision recognition, and more extensive natural language capabilities.
Social skills will be introduced to students in three stages. First, two robots will act out a script; next, a student will interact with a robot. Then, the goal will be for the student to demonstrate the behavior with another person.
The current investigation will introduce each social skill to the students in three stages. First, students will watch the programmed robots interact with each other.
Next, the instructor will replace one of the robots with a student, who will interact with the robot using a pre-scripted dialogue.
Finally, the teacher will remove the remaining robot. Once the student has learned the appropriate behavior without the complexity or anxiety of working with a human, the goal is for him or her to demonstrate and generalize the skill more readily with a person than if the skill had been taught through traditional modalities.
The three stages enable the robot to act out what the teacher would typically teach using lecture (or role play) strategies.
"It’s not a matter of the robot teaching the skill. The robot acts as an agent through which the skill is learned," Kuester says.
The project is in its early stages. Kuester and Nikolopoulos conceived the study as a project for Nikolopoulos’ computer science class. Four students initially created the first prototype and programmed it to Kuester’s specifications.
Will Herring, senior computer science major, is one of the two students continuing the project. "The field of this fusion between robotics and autism is fascinating, and a lot of good research is being done," Herring says. "Creating something with computer science that has a profound real-world impact would be a great step from all these years of instruction."
In addition to Herring and fellow senior computer science major Andrew Becker, Kuester added Bogart to the team this year. The three students will construct and program two more prototypes, giving the group four testing robots in total.
Kuester says testing a variety of robots is important for determining which best fits the future study, including how students will react to each robot and which is most cost-effective. Human-sized robots, for instance, are too expensive for many schools’ budgets.
The team is well on its way to creating the next prototypes. Members have completed one robot with minimal speech capabilities and plan to build a replica programmed with both speech and movement capabilities. Herring says each robot has its own programming language; he is learning the language for a robot that will be able to walk, run, sing, and dance.
Kuester says the robots should be ready for introduction into school and home environments by spring 2009. By May, the team should have data on how students interact with the robots and whether the robots perform as Kuester intends. Bogart and Kuester will observe the students and collect and analyze the data.