Asimo Robot (photo courtesy Honda)

Asimo is a humanoid robot designed and built by Honda to demonstrate its advanced technology. Our research group is currently working with Honda Research Institute USA on using Honda's imitation technology with Asimo to treat children with autism in a clinical setting.

We use an animated 3D avatar of Asimo (upper torso only) instead of the physical robot. The avatar is projected onto a flat panel mounted on one wall of the clinic room. A SwissRanger range camera detects the movements of a therapist in front of it, and the imitation software drives the avatar to mirror those movements. We plan to extend the system's capabilities with gesture recognition using Wii controllers.
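The per-frame imitation loop described above can be sketched as follows. This is a hypothetical outline, not the project's actual software: the pose estimator, the joint limits, and the avatar callback are all illustrative assumptions.

```python
# Hypothetical sketch of the per-frame imitation loop: a depth frame from
# the range camera is reduced to joint angles for the upper torso, which
# are then forwarded to the avatar renderer. All names are illustrative.

def clamp(value, lo, hi):
    """Keep a joint angle inside the avatar's allowed range."""
    return max(lo, min(hi, value))

def imitate_frame(depth_frame, pose_estimator, avatar):
    # 1. Estimate the therapist's pose (e.g. shoulder/elbow angles)
    #    from the 3D range data.
    joint_angles = pose_estimator(depth_frame)
    # 2. Clamp each angle to a safe range before sending (limits here
    #    are placeholders, roughly +/- 90 degrees in radians).
    safe = {name: clamp(a, -1.57, 1.57) for name, a in joint_angles.items()}
    # 3. Drive the avatar (in this system, via TCP/IP commands).
    avatar(safe)
    return safe
```

A real pose estimator would run on the range camera's point cloud; the clamping step stands in for whatever joint-limit handling the avatar software performs.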

Research Objectives

  • What are the differences between a virtual robot (avatar) and a real physical robot when used in treating children with autism in clinical settings?
  • Can a robot avatar help develop autistic children's capabilities in joint attention?
  • How can a therapist “program” the virtual robot in preparation for clinical sessions using a skill/behavior based interactive learning interface?
  • What interface techniques help therapists interact with the robot to better treat children with autism?

Robot Specifications

Sensors

Existing Sensors:

  • Range Sensor (SwissRanger 3000): Generates a 3D point cloud representing the range surface as seen from the camera.
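The range camera reports a depth value per pixel; a 3D point cloud follows from the standard pinhole back-projection. The sketch below assumes this model with placeholder intrinsics, not the camera's actual calibration (the 176x144 resolution is the SwissRanger 3000's, but the focal lengths and principal point here are illustrative).

```python
def depth_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with range `depth` (metres) into a
    3D camera-frame point using the pinhole model. fx, fy, cx, cy
    are camera intrinsics (placeholders, not real calibration)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def depth_image_to_cloud(depth_image, fx=250.0, fy=250.0, cx=88.0, cy=72.0):
    """Convert a row-major depth image (list of rows) into a point
    cloud, skipping invalid (zero) readings."""
    cloud = []
    for v, row in enumerate(depth_image):
        for u, d in enumerate(row):
            if d > 0:
                cloud.append(depth_to_point(u, v, d, fx, fy, cx, cy))
    return cloud
```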

Possible Sensors:

  • Visual Sensor (single or stereo camera): Enables computer vision algorithms to identify patterns. Could be placed in front of the therapist and child, or to the side.
  • Audio Sensor (one or more directional microphones): Allows detection of audio signals from different directions, possibly including voice commands.
  • Position Sensor (Wii remotes): Detects the locations of the operator's elbows, wrists, etc. The therapist holds the Wii remotes.
  • Motion Sensor (Wii remote): Detects the operator's direction of movement.
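As a first cut, motion from a Wii remote could be detected by thresholding its accelerometer readings. The sketch below is an assumed, untuned heuristic for illustration; a deployed gesture recognizer would use a trained classifier rather than a fixed threshold.

```python
import math

def detect_shake(samples, threshold=2.5, min_peaks=3):
    """Crude 'shake' detector over a window of (ax, ay, az)
    accelerometer samples in g. Counts samples whose acceleration
    magnitude exceeds the threshold. The threshold and peak count
    are illustrative placeholders, not tuned values."""
    peaks = sum(1 for ax, ay, az in samples
                if math.sqrt(ax * ax + ay * ay + az * az) > threshold)
    return peaks >= min_peaks
```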

Actuators

Existing Actuators: (theoretically, all joint movements are controlled through TCP/IP commands)

  • Arm movements: Up, down, in, out
  • Head/neck movements: Turn left/right, lean left/right, raise head up/down
  • Torso movements: Turn left/right, lean forward/backward

Possible Actuators:

  • Facial Expressions (is it possible?)
  • Speaker: produce sounds or speech

Processor

  • Processing is all done on a standalone laptop.

Communication

  • TCP/IP commands can be used to control the avatar's joint movements
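A minimal client for such a link might look like the sketch below. The newline-terminated `joint value` text format is an assumption made for illustration; the page does not document the actual command syntax, and the host/port are placeholders.

```python
import socket

def format_command(joint, value):
    """Encode one joint command as a newline-terminated ASCII line.
    This wire format is assumed for illustration only."""
    return f"{joint} {value:.3f}\n".encode("ascii")

def send_joint_command(host, port, joint, value):
    """Open a TCP connection to the avatar process and send a single
    joint command (hypothetical protocol)."""
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(format_command(joint, value))
```

For example, `send_joint_command("localhost", 9000, "head_yaw", 0.5)` would transmit `head_yaw 0.500`, assuming the avatar process listens on that port.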

Robot Skills/Behaviors

Existing skills/behaviors

  • Body movements: Arm, head/neck, torso movements.
  • Cognitive behaviors: Imitate

Possible skills/behaviors

  • Gestures: Nod, shake head, wave
  • Movements: Point at, reach out
  • Sounds: Make sounds/noises, speak
  • Emotions: Excited (arms up and moving left and right), scared (leaning backward with arms covering face)
  • Cognitive behaviors: Find faces, learn to identify objects, identify the direction of a sound source
  • Social behaviors: Keep gaze on the person in front; briefly turn to check out visual/sound changes (curiosity)
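Gestures and emotion displays like those listed above could be stored as named sequences of joint keyframes that a therapist triggers during a session. The joint names, angles, and timings below are invented for illustration and do not reflect Asimo's actual interface.

```python
# Hypothetical behavior library: each skill is a list of
# (joint, angle_radians, hold_seconds) keyframes played in order.
BEHAVIORS = {
    "nod": [("head_pitch", 0.3, 0.4), ("head_pitch", -0.1, 0.4),
            ("head_pitch", 0.0, 0.3)],
    "wave": [("arm_right_raise", 1.2, 0.5), ("wrist_right", 0.4, 0.3),
             ("wrist_right", -0.4, 0.3), ("arm_right_raise", 0.0, 0.5)],
    "excited": [("arm_left_raise", 1.4, 0.2), ("arm_right_raise", 1.4, 0.2),
                ("torso_lean", 0.1, 0.3)],
}

def play(name, send):
    """Replay a named behavior by passing each keyframe to `send`,
    a callable like send(joint, angle). Returns the number of
    keyframes played (0 for unknown behaviors)."""
    frames = BEHAVIORS.get(name, [])
    for joint, angle, _hold in frames:
        send(joint, angle)
    return len(frames)
```

Representing behaviors as data rather than code would let a therapist compose new skills in an interactive interface, as proposed in the research objectives.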

Desired skills/behaviors

(Please add here…)

ar/asimo-avatar.txt · Last modified: 2015/03/26 15:04 by ryancha
CC Attribution-Share Alike 4.0 International