[{{media:hcmi:180px-Troy1.jpg|thumbnail|TROY Robot}}]

TROY is a home-grown, torso-only humanoid robot designed and built in the Mechanical Engineering department. The arms of the robot resemble real human arms. The head is a monitor, so the robot can show animated facial expressions (emoticons). The head can also be rotated, giving a choice between a wide, short head and a thin, tall head.

== Research Objectives ==

* How can a humanoid robot assist in treating children with autism in clinical settings?
* Can a machine-like humanoid robot (one that only loosely resembles a human) help develop autistic children's capabilities in '''joint attention'''?
* How can a therapist "program" the robot in preparation for clinical sessions using a skill/behavior-based '''interactive learning interface'''?
* What '''interface techniques''' help therapists interact with the robot to better treat children with autism?

== Robot Specifications ==

=== Sensors ===

Existing sensors:
* '''Visual sensor''': The monitor (head) has a built-in webcam, which lets computer vision algorithms identify patterns such as faces (a face-detection sketch appears under Code Sketches below).
* '''Audio sensor''': The monitor (head) has a built-in microphone, which could support voice commands.

Possible sensors:
* '''Audio sensors''': Two additional microphones (as ears) could detect which direction a sound comes from.
* '''Range sensor''': An infrared proximity sensor could detect object distances and trigger events when an object is close.
* '''Position sensors''' (Wii remotes): Detect the locations of the operator's elbows, wrists, etc.; the therapist would hold the Wii remotes (sketched under Code Sketches below).
* '''Motion sensor''' (Wii remote): Detects the operator's direction of movement.

=== Actuators ===

Existing actuators:
* '''Arm movements''': Up, down, in, and out, with a 1-DOF elbow and a 3-DOF shoulder.
* '''Head/neck movements''': Turn left/right, lean left/right, and raise/lower the head, with a 2-DOF neck.
* '''Display''': The monitor (head) can display emotions and expressions.

Possible actuators:
* '''Speaker''': Make sounds or speech.

=== Processor ===

* All processing is done on a standalone laptop.

=== Communication ===

* The laptop communicates with the robot through a serial cable because USB is too slow (a serial command sketch appears under Code Sketches below).

== Robot Skills/Behaviors ==

=== Existing skills/behaviors ===

* Body movements: arm and head/neck motions.

=== Possible skills/behaviors ===

* Gestures: nod, shake head, wave, do the YMCA, flex muscles
* Movements: point at, reach out
* Sounds: make sounds/noises, speak
* Emotions: excited (arms up and moving left and right), scared (leaning backward with the arms covering the face)
* Cognitive behaviors: imitate, find a face, learn to identify objects, identify the direction of a sound source
* Social behaviors: keep gaze on the person in front; briefly turn to check out visual/sound changes (curiosity)

=== Desired skills/behaviors ===

(Please add here...)
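== Code Sketches ==

The sketches below are not part of TROY's existing software. They are minimal Python illustrations of how a few of the pieces described above might be driven from the laptop; every port name, device index, joint name, library, and command format in them is an assumption for illustration only.

=== Sending joint commands over the serial link ===

Since the laptop talks to the robot over a serial cable, a control program could use a serial library to send joint targets. This sketch assumes the pyserial package, the port name /dev/ttyUSB0, a 115200 baud rate, and a made-up ASCII command format; the robot's real wire protocol is not documented on this page, so treat the joint names and message layout as placeholders.

<syntaxhighlight lang="python">
# Minimal sketch of sending joint commands from the laptop over the serial link.
# Assumptions (not from the TROY documentation): the pyserial package, the port
# name "/dev/ttyUSB0", the 115200 baud rate, and the ASCII command format
# "<joint> <angle_degrees>\n" are all hypothetical placeholders.
import serial

JOINTS = ["shoulder_pitch", "shoulder_roll", "shoulder_yaw", "elbow",
          "neck_pan", "neck_tilt"]

def send_joint_command(port: serial.Serial, joint: str, angle_deg: float) -> None:
    """Send one joint target angle to the robot controller."""
    if joint not in JOINTS:
        raise ValueError(f"unknown joint: {joint}")
    command = f"{joint} {angle_deg:.1f}\n"
    port.write(command.encode("ascii"))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0) as port:
        # Raise the arms: shoulder pitch up, elbow slightly bent.
        send_joint_command(port, "shoulder_pitch", 90.0)
        send_joint_command(port, "elbow", 20.0)
</syntaxhighlight>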
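=== Finding a face with the head webcam ===

The "find face" and "keep gaze on the person in front" behaviors could start from standard face detection on frames from the monitor's built-in webcam. This sketch uses the OpenCV Haar cascade that ships with the opencv-python package; camera index 0 is a placeholder for whichever device the webcam exposes.

<syntaxhighlight lang="python">
# Sketch of the "find face" behavior using the head-mounted webcam.
# Assumptions: the opencv-python package and its bundled Haar cascade model;
# camera index 0 is a placeholder for the monitor's webcam device.
import cv2

def find_face_offset():
    """Return the horizontal offset (pixels) of the largest detected face
    from the image center, or None if no face is visible."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    camera = cv2.VideoCapture(0)
    try:
        ok, frame = camera.read()
        if not ok:
            return None
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        face_center_x = x + w / 2
        return face_center_x - frame.shape[1] / 2
    finally:
        camera.release()
</syntaxhighlight>

The returned offset could drive the 2-DOF neck so the robot keeps its gaze centered on the person in front of it, with occasional brief turns toward visual changes for the "curiosity" behavior.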
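=== Reading a Wii remote for imitation ===

If the therapist holds Wii remotes, the laptop could read their accelerometers over Bluetooth and map the motion onto the arms for an "imitate" behavior. This sketch assumes the Linux cwiid package (python-cwiid) and a Bluetooth adapter; the mapping from raw accelerometer counts to an elbow angle is a made-up placeholder, not a calibrated model.

<syntaxhighlight lang="python">
# Sketch of reading one Wii remote's accelerometer so the robot could imitate
# the therapist's arm motion. Assumptions: the cwiid package and a working
# Bluetooth adapter; the accelerometer-to-angle mapping is hypothetical.
import time
import cwiid

print("Press 1+2 on the Wii remote to pair...")
wiimote = cwiid.Wiimote()            # blocks until a remote connects
wiimote.rpt_mode = cwiid.RPT_ACC     # request accelerometer reports

for _ in range(100):                 # sample for roughly ten seconds
    x, y, z = wiimote.state["acc"]   # raw counts, roughly 0-255 per axis
    # Hypothetical mapping: tilt along one axis -> elbow angle command.
    elbow_deg = (y - 128) / 128.0 * 90.0
    print(f"acc=({x},{y},{z}) -> elbow target {elbow_deg:.0f} deg")
    time.sleep(0.1)
</syntaxhighlight>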
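=== A skill/behavior library for session scripts ===

One research objective asks how a therapist could "program" the robot through a skill/behavior-based interactive learning interface. A rough way to picture that is a library of named behaviors (matching the skill lists above) that a therapist sequences into a session script without writing low-level code. The class and behavior names here are hypothetical; real behaviors would send actuator and display commands.

<syntaxhighlight lang="python">
# Sketch of a skill/behavior library a therapist could compose into a session
# script before a clinical visit. The behavior names mirror the lists above;
# the callables that would actually drive the actuators are placeholders.
from typing import Callable, Dict, List

BehaviorFn = Callable[[], None]

class BehaviorLibrary:
    """Named, reusable behaviors that a session script can sequence."""

    def __init__(self) -> None:
        self._behaviors: Dict[str, BehaviorFn] = {}

    def register(self, name: str, fn: BehaviorFn) -> None:
        self._behaviors[name] = fn

    def run_session(self, script: List[str]) -> None:
        """Run the behaviors in the order chosen by the therapist."""
        for name in script:
            print(f"running behavior: {name}")
            self._behaviors[name]()

if __name__ == "__main__":
    library = BehaviorLibrary()
    # Placeholder behaviors; real ones would move the arms/neck or update the display.
    library.register("wave", lambda: None)
    library.register("nod", lambda: None)
    library.register("show_excited", lambda: None)
    library.run_session(["wave", "nod", "show_excited"])
</syntaxhighlight>

The point of the design is that the therapist only chooses and orders named behaviors; adding or tuning a behavior is a one-time registration step done before the clinical session.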