• 9/24/10, 1pm. Discussed the status of the project, including what was learned over the summer, the status of our new Honda funding, and plans for the next round of clinical trials.
    • The clinicians requested the following:
      • That the robot Troy be moved back to the clinic for use in therapy with one of the patients.
      • Some new robot behaviors, including songs with actions. Tim Major will help with this task.
    • We had a discussion on the technology that is used to view and code videos from therapies and pre- and post-assessments. Bryan Morse indicated that his group may be able to help with vision-based tools. Mike Goodrich proposed the idea of using a Wii remote to allow the therapists to tag when desirable behaviors occur in therapy, to facilitate post-therapy retrieval and investigation of those behaviors (a sketch of this tagging idea follows these notes).
    • As part of our list of requirements for Honda: for the Honda imitation software to be useful, it must be controllable by a single clinician.
    • For future grant proposals, the idea of “low-dose” robot therapies should be a prominent selling point.
    • Some additional notes from this meeting are found here.
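    • A minimal sketch of the Wii-remote tagging idea above, written in Python. The wait_for_button() stand-in and the assumption that the tagger and the video recorder start at the same moment are ours for illustration; on Linux the stand-in could be replaced with real Wiimote input (e.g., via the cwiid library).

        import csv
        import time

        def wait_for_button():
            # Placeholder for real Wiimote input; blocking on the Enter key
            # keeps the sketch runnable anywhere.
            input("Press Enter to tag a behavior (Ctrl-C to finish): ")

        def run_session(outfile="session_tags.csv"):
            # Record elapsed times so each tag maps to an offset in the
            # session video (assumes both started together).
            start = time.monotonic()
            tags = []
            try:
                while True:
                    wait_for_button()
                    tags.append(time.monotonic() - start)
            except KeyboardInterrupt:
                pass
            with open(outfile, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["tag_number", "video_offset_seconds"])
                for i, t in enumerate(tags, 1):
                    writer.writerow([i, round(t, 1)])

        if __name__ == "__main__":
            run_session()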
  • 2/4/10, 4pm. We watched a compelling video of a child interacting with the robot Troy, and we discussed plans for a proposal to Willow Garage.
    • Examples of initiation, joint attention, and triadic interaction were observed. The child was not on the autism spectrum, but had some social impairments. It appeared that Troy facilitated child-therapist interactions, but anecdote is not evidence.
    • For Willow Garage, a proposal would need to tell the following story.
      • We need a robot and robot software that can be sustained in a lab over the long term, even without a company providing them.
      • To accomplish this, we would like to adopt an open source model for both robot and software.
      • This would not only facilitate sustainability in our lab, but could also help make it possible for other clinics to adopt successful strategies for robot-assisted therapies.
  • 12/7/2009, meeting with Behzad and Kikuo. Here is a summary of our discussions:
    • What is the most important needed technology for robot-assisted therapies?
      • Guiding principle: while a clinician is using a robot in therapy, he or she cannot deliver other therapies. Importance is therefore defined by what makes a therapist more powerful, either because the technology allows something to be done faster or more completely, or because it enables something that couldn't be done in other ways.
      • Gesture recognition might be helpful because it could allow the therapist to naturally shape robot behavior.
      • A combination of preprogrammed robot behavior and imitation might be helpful because it could allow the therapist to “naturally” move through a therapeutic plan/script (a sketch of this idea follows this list).
      • Allowing the robot to imitate the child directly would be very useful because it would open up new clinical opportunities. I believe that Honda said that they would look into this.
      • Programming by demonstration is probably not very important right now.
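      • A minimal sketch of the script-plus-imitation idea above, written in Python. The behavior names and the print statements are hypothetical placeholders, not the actual Honda or BYU software interfaces.

        class TherapySession:
            """Step through a scripted plan, or drop into imitation mode."""

            def __init__(self, script):
                self.script = list(script)
                self.step = 0

            def next_scripted_behavior(self):
                # Advance one behavior at a time through the therapeutic script.
                if self.step < len(self.script):
                    behavior = self.script[self.step]
                    self.step += 1
                    print(f"robot plays pre-programmed behavior: {behavior}")
                else:
                    print("script complete")

            def imitate(self):
                # Hand control to the imitation software until the therapist resumes.
                print("robot mirrors the therapist's movements (imitation mode)")

        session = TherapySession(["greet", "wave", "point_at_toy", "goodbye"])
        session.next_scripted_behavior()  # therapist presses "next"
        session.imitate()                 # therapist switches to imitation
        session.next_scripted_behavior()  # script resumes where it left off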
    • How can we best coordinate our work with other projects for robot-assisted therapy?
      • Mataric's work, Scaz's work, and Dautenhahn's work.
      • Cassell's work on virtual actors in therapies.
    • How can we calibrate expectations so that they match emerging capabilities in robot-assisted therapies?
      • Focus on supporting the therapist.
      • Focus on helping low-functioning children.
      • Emphasize using the robot to prime social/emotional 'wiring' in a child – engage and prime.
      • Use sabotage to turn attention to therapist.
      • Reward social/emotional interaction with therapist, using the robot as part of the reward.
      • Shift from child-robot interactions to child-therapist interactions, with the goal of eventually phasing out the robot.
    • What is the development plan?
      • Identify technologies on the critical path.
      • Constrain technology development to technologies that could be used in real therapies in real practice.
      • Modify technologies and therapies by introducing them into the clinic through pilot studies and carefully controlled clinical trials.
    • What gestures, performed by the robot or recognized by the robot, would be most useful? (A recognition sketch follows this list.)
      • Touchdown – elation
      • Hands on hips – disappointment
      • Hand on head – puzzlement
      • Beckoning, waving, clapping
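      • A hedged sketch, in Python, of recognizing some of the gestures above from a few body “key points” (the kind of motion variables mentioned in the 3/26/09 notes below). The key-point names, the upward-positive y convention, and the thresholds are illustrative assumptions, not part of any existing software.

        def classify_gesture(kp):
            # kp maps "head", "left_hand", "right_hand", "left_hip", and
            # "right_hip" to (x, y) positions, with y increasing upward.
            lh, rh, head = kp["left_hand"], kp["right_hand"], kp["head"]

            def near(a, b, tol=0.15):
                return abs(a[0] - b[0]) < tol and abs(a[1] - b[1]) < tol

            if lh[1] > head[1] and rh[1] > head[1]:
                return "touchdown"      # both hands above the head: elation
            if near(lh, kp["left_hip"]) and near(rh, kp["right_hip"]):
                return "hands_on_hips"  # disappointment
            if near(lh, head) or near(rh, head):
                return "hand_on_head"   # puzzlement
            return "unknown"

        print(classify_gesture({
            "head": (0.0, 1.7), "left_hand": (-0.2, 2.0), "right_hand": (0.2, 2.0),
            "left_hip": (-0.15, 1.0), "right_hip": (0.15, 1.0),
        }))  # -> touchdown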
  • 11/5/09, Francois Michaud visited BYU. The schedule for his visit can be found here.
  • 10/29/09, 2pm. We met as a complete team to discuss work in the clinic. We also discussed this paper. Here is a summary of our discussion:
    • We believe that imitation will be necessary, but recognize that the Honda imitation software must be improved before it can be used in the clinic.
      • We will explore better configurations for the lab, e.g., position of the camera, orientation of the clinician and child, etc.
      • We hope that Behzad will be able to help us tune the software to make it more robust when he visits in November.
      • We expect that Honda will need to improve the software before it will be usable in the clinic.
      • As a fallback for using the software, Jonathan and Sukhbat are learning how to use imitation to program robot behaviors.
    • It is time to try Nicole's robot and Dan's robot with some children.
      • The clinicians will need to be trained on using the GUI (graphical user interface) software.
      • The engineers will need to work with the clinicians to create behaviors that can be used in the clinic.
      • Alan Atherton will lead an effort to make the GUI software reliable enough to use.
      • Jonathan will work on creating a Wiimote controller to manage behavior sequencing.
    • The paper that we discussed presented some great technology for eye tracking, but:
      • We wonder about the therapeutic effectiveness of creating fully autonomous robots, since it is doubtful that social skills developed with robots will transfer to the real world.
      • Although eye gaze is correlated with attention, it is an overstatement to conclude that a child with autism who is gazing at something is necessarily attending to it.
  • 8/17/09, 11am. We met as a complete team earlier today to work out the details of moving BYU and Honda technology into the clinic. Here is a summary of our discussion:
    • We will initially introduce two technologies into the clinic:
      • Honda ML software and avatar, with the ability to imitate the clinician in real time and to play back pre-recorded behaviors. The software is ready. We are now working on some issues related to camera and projector placement.
      • Pleo, with pre-recorded behaviors that can be activated by the clinician via a handheld PDA. The software already supports this, and we are working with the clinicians to identify which behaviors they would like available.
    • Time frame for introduction of technologies:
      • Pleo: 1-2 weeks
      • Honda software: 2-3 weeks.
    • Bonnie, Martin, and Lee have identified two children with autism to participate in initial studies.
    • After introducing the avatar and Pleo, we will then introduce a humanoid Lego robot. This robot will also be capable of playing pre-recorded behaviors and of imitation using the ML software. Prototype hardware and interface software are ready to go. We are working on a few improvements before placing the system in the clinic within a couple of months.
  • 6/2/09, 1 pm, in the ME Conference Room (445 CTB). Discussed near-term strategies for using robot technology in the clinic. We also discussed this paper. Alan Olsen demonstrated the Pleo robot.
  • 3/26-3/27/09. Behzad Dariush from Honda Research Institute visited BYU. The group meeting, software work meetings, clinic visit, and lab tours were all very useful. During our meetings several ideas were discussed. Here is a summary of some of the key points:
    • Near-term work:
      • Use Pleo robot in clinical settings to get a feel for the types of behaviors that can be elicited.
      • Investigate the use of Honda software to imitate the therapist, rather than the child, in therapy sessions. This may promote interactions between the child and therapist, who is the key to initiating robot behaviors. Two potential implementations:
        • The robot or avatar imitates the therapist directly.
        • Online gesture recognition is used as a way to control/initiate autonomous behaviors of the robot or avatar. This could be used to control Pleo or another robot (possibly LEGO Mindstorms).
      • Use Honda software to do offline programming of robot and avatar behaviors. This may be a convenient way for therapists to program robots through imitation.
      • Create a bi-directional software interface to allow BYU researchers to access motion variables in Honda software. To start, access to “key points” and generalized coordinates would be useful.
      • Develop new behaviors for Pleo and create a remote control interface that will allow therapists to invoke different behaviors (a sketch of such an interface follows).
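      • A minimal sketch, in Python, of the remote-control idea in the previous item. trigger_behavior() stands in for whatever command actually plays a behavior on Pleo, and the behavior names are examples, not the clinic's real repertoire.

        BEHAVIORS = {"1": "wag_tail", "2": "look_at_child", "3": "sing", "4": "sleep"}

        def trigger_behavior(name):
            # Placeholder for the real command that starts a pre-recorded behavior.
            print(f"(sending '{name}' to the robot)")

        def remote_control():
            # Simple console loop; a handheld device would map its buttons the same way.
            print("keys:", ", ".join(f"{k}={v}" for k, v in BEHAVIORS.items()), "q=quit")
            while True:
                key = input("> ").strip()
                if key == "q":
                    break
                if key in BEHAVIORS:
                    trigger_behavior(BEHAVIORS[key])
                else:
                    print("unmapped key")

        if __name__ == "__main__":
            remote_control()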
    • Long-term work:
      • Continue development of a small, upper-body, humanoid robot. This robot will eventually be used in imitation, pointing, sabotage, and other activities.
      • Continue development of Honda software to enable pose imitation for small children.
  • 2/26/09, 4pm. Discussed the “I Am Robot” paper by V. Groom, L. Takayama, P. Ochi, and C. Nass (to appear in HRI2009). Discussed whether the absence of autonomy makes the Groom experiment simply a replication of Kiesler and Kiesler's decorated rock experiment. Also discussed whether programming a robot's behavior is analogous to building the robot and, if so, whether this would lead to self-extension. Finally, we discussed goals of the TiLAR project and ways the work might benefit real people. Goals include (1) improving a therapist's ability to help children with autism or SLI, (2) understanding the nature of ASD or SLI, and (3) improving diagnostic ability to differentiate between groups on the ASD spectrum. Potential ways to benefit real people, in order of the group's preference, are (1) using robots as one stage of therapy to help a clinician improve the abilities of a child and (2) possibly providing an assistive device that either allows the child to be higher functioning or makes a caregiver's job easier by improving some aspect of caregiver-child interaction.
  • 1/15/09, 4 pm, in the CS Conference Room. Martin, Bonnie, and Lee led a discussion on the clinical aspects of our work, including potential robot-based therapies. Notes from our discussion are found here.
  • 12/4/08, 3 pm, in the CS conference room. Mike Goodrich led a discussion on the state-of-the-art in robot-assisted therapy. The presentation slides are found here.
  • 11/6/08, 3 pm, in the CS conference room. A 1-hour Q&A session was held with Brian Scassellati from Yale University. The schedule for Brian's visit is available here.
  • 10/31/08, noon, in the CS conference room. We discussed the state-of-the-art in robot face design and brainstormed about what kind of robot face should be used to maximize potential therapeutic benefits for children with ASD or SLI. Perhaps most importantly, we discussed ways that we might gather data on what kind of face would be most useful without having to build several fully actuated robots. Mark Colton will explore this idea further with the goal of gathering data that can be used to motivate a robot design as well as be shared with other researchers in a publication.
  • 10/23/08, 3 pm, in the CS conference room.