I am a PhD student in the Computer Science Department at BYU. I received my undergraduate degrees from Southern Oregon University in 1997. After working in industry for 9 years, I decided to return to school to pursue something I have always wanted to do, something I could enjoy for the rest of my life: Artificial Intelligence and Robotics. Hence, I am here.

Research Interests

We believe a robot is only an assistive tool to help a therapist/clinician in treating children with autism. The objective is not to provide a cool toy for an autistic child to play with, but to use the robot to encourage the child to develop socially normative behaviors such as turn taking, joint attention, and eye contact. Therefore, it is important for the robot to behave as the therapist wants it to, displaying the specific behaviors the therapist designed for a specific session to treat a specific child. However, therapists are typically not engineers or programmers, and should not need to know how the robot's internal mechanisms work. There is therefore a need for therapists to be able to manage a robot's autonomous behaviors by means they can easily understand.

We propose a framework in which a therapist manages a robot's autonomous behaviors by managing the information provided to the robot at different resolutions and at different phases of operation.

  • At the highest scale (lowest resolution), the robot (or rather the intelligent system as a whole) would create a general treatment plan based on general trends. The plan would include the areas the therapist should focus on to improve the autistic child's social interaction capabilities.
  • At the medium scale (medium resolution), the human operator can perform case-specific planning, especially between-session planning, and then provide the intelligent system with additional information such as areas of focus/interest for the next session.
  • At the lowest scale (highest resolution), the human operator can manage autonomous behaviors at the execution level during a live session by specifying areas of focus/interest in real time. Additional information, such as the desired intensity and the robot's level of reliance on the therapist, can also be specified by the human user as an indirect means of affecting the robot's autonomous behaviors.
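The three scales above could be represented with a minimal data structure. The sketch below is illustrative only: it assumes a 0–1 weight per focus area, with `set_focus` standing in for the between-session (medium-resolution) input and `adjust_live` for the real-time (high-resolution) input; all names are hypothetical, not part of any existing system.

```python
from dataclasses import dataclass, field

@dataclass
class FocusArea:
    name: str
    weight: float = 0.5   # 0 = ignore, 1 = top priority (assumed scale)

@dataclass
class SessionPlan:
    areas: dict = field(default_factory=dict)

    def set_focus(self, name, weight):
        # Medium resolution: between-session planning input.
        self.areas[name] = FocusArea(name, weight)

    def adjust_live(self, name, delta):
        # High resolution: real-time nudge, e.g. from a button press,
        # clamped to the 0-1 range.
        area = self.areas[name]
        area.weight = min(1.0, max(0.0, area.weight + delta))

plan = SessionPlan()
plan.set_focus("joint_attention", 0.75)
plan.set_focus("turn_taking", 0.5)
plan.adjust_live("turn_taking", 0.25)  # therapist bumps turn taking mid-session
```

The robot's behavior selection would then read these weights when deciding which interactions to initiate.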

Autonomous Components

  • Gestures and Expressions
    • Facial expressions with audio effects: happy, sad, scared, excited, etc.
    • Body movements and gestures: waving hello/goodbye, raising a hand, nodding/shaking the head, turning the body toward, turning the head toward (looking at), pointing at, etc.
  • Social behaviors
    • Politeness: turning to and following the person who is talking, making eye contact, greeting and thanking people, etc.
    • Curiosity: turning toward sudden, loud sounds or toward people entering the room.
    • Reference: pointing to objects or people when referring to them.
  • Cognitive capabilities
    • Sound source tracking
    • Face tracking
    • Object identification (learned at the planning phase)
    • Audio cue identification (pre-programmed at the planning phase)
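To make the relationship between the gesture primitives and the social behaviors concrete, here is a minimal sketch in which a "politeness" greeting is composed from primitives. The function names and the command log are placeholders for illustration, not an actual robot API.

```python
log = []  # stands in for commands dispatched to the robot

def turn_head_to(target):
    # Primitive: orient the head toward a person or object.
    log.append(f"turn_head_to({target})")

def wave():
    # Primitive: wave-hello gesture.
    log.append("wave")

def say(text):
    # Primitive: speak a phrase with matching audio effects.
    log.append(f"say({text!r})")

def greet_person(person):
    # Social behavior built from primitives: orient toward the
    # person, wave, and greet verbally.
    turn_head_to(person)
    wave()
    say(f"Hello, {person}!")

greet_person("Alex")
```

Higher-level social behaviors like curiosity or reference would be composed the same way, sequencing the primitives listed above.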

User Interface Components

  • Area of Focus Map Tool – This tool presents the therapist with a map of the areas (really a 1D list presented as a 2D map) the therapist can work on with the child. Examples include joint attention, turn taking, and verbal communication. The therapist can adjust the importance of each area with simple gestures for a specific child at a specific stage of the overall treatment. Once these are specified, the robot adjusts its autonomous behaviors to match the specified areas of focus and uses behaviors that promote the desired interactions from the child. This tool is only used in the planning phase because in the execution phase (during a live session), the therapist would not be working directly on a computer.
  • Area of Focus Wii Remote Tool – Ideally, the therapist would have mapped specific Wii remote buttons to certain areas of focus, and during a live session, the therapist can increase/decrease the desired effect for the top focus areas specified in the planning phase.
  • Autonomous Behaviors Intensity Control Tool – The therapist can control the intensity of certain autonomous behaviors at both the planning phase and the execution phase (during live session). At the planning phase, the therapist can adjust intensities with slider controls on a computer monitor. These controls can also be mapped to a Wii remote so adjustment can be accomplished during a live session. Example controls are:
    • Intensity of the verbal communication – high means the robot becomes very verbose and talks a lot; low means the robot is very concise and doesn't talk much.
    • Intensity of the social behaviors – high means the robot displays many human-like social behaviors; low means the robot appears less intelligent and displays few human-like social behaviors.
    • Reliance on the therapist – high means the robot tends to always refer to the therapist for help or for approval; low means the robot will perform tasks without referring to the therapist. This control can be used to promote interaction between the child and the therapist. It can also be used to encourage turn taking.
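As a rough illustration of how such intensity values could steer the robot, the sketch below assumes a 0–1 slider scale, where verbosity selects between concise and verbose utterances and reliance sets the probability of deferring to the therapist. All names, thresholds, and the scale itself are assumptions for illustration.

```python
def choose_utterance(verbosity, concise, verbose):
    # High verbosity: the robot talks a lot; low: it stays brief.
    # 0.5 is an assumed cutoff between the two modes.
    return verbose if verbosity >= 0.5 else concise

def should_ask_therapist(reliance, draw):
    # High reliance: the robot defers to the therapist more often.
    # `draw` stands in for a random number in [0, 1).
    return draw < reliance

line = choose_utterance(0.8, "Good.", "That was really great work, well done!")
ask = should_ask_therapist(0.9, 0.3)   # robot checks with the therapist
```

Mapping these same functions to Wii remote buttons would let the therapist shift the sliders mid-session without touching a computer.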


Personal Interests

I have too many hobbies. Some of the leading ones are soccer, billiards, music improvisation, and martial arts. In my spare time (if I can squeeze any out), I translate Chinese martial arts novels into English and maintain a personal blog, because I believe good things should be shared. Feel free to check them out:

ar/lanny-lin.txt · Last modified: 2014/08/11 19:56 by tmburdge
CC Attribution-Share Alike 4.0 International