The main DREAM experimental platform was the 58-cm-tall, 5-kg humanoid robot NAO, developed by Aldebaran Robotics. NAO is a 25-degree-of-freedom robot equipped with a rich array of sensors: 2 cameras, 4 directional microphones, a sonar rangefinder, 2 IR emitters and receivers, 1 inertial board, 9 tactile sensors, and 8 pressure sensors. NAO can detect and recognize pre-learned objects and faces, recognize words and sentences, and localize sounds in space. It also has various communication devices, including LED lights, two high-fidelity speakers, and a voice synthesizer with language-specific intonation and pronunciation.
The NAO robot has already been used in several experiments with ASD children3,4,5. Because of its size and appearance, NAO has been particularly well received by young children: they anthropomorphize it and readily engage in affective social interactions with it.
The second DREAM experimental platform was Probo, developed at the Vrije Universiteit Brussel. This robot was designed to focus on verbal and non-verbal communication and to act as a social interface by employing human-like social cues and communication modalities. It was well suited to this task because it has a fully expressive, anthropomorphic head: with 20 motors in the head, the robot can express attention and emotions through its gaze and facial expressions. To guarantee safe physical interaction between the robot and the children, compliant actuation systems and a layered structure of foam and fabric were implemented, contributing to a safe, soft and “huggable” interaction.
Probo has also been used with ASD children in previous studies6,7,8,9, all of which reported positive results. In all of these studies the robot was operated in a Wizard of Oz setup.
3 Tăpus, A., Peca, A., Aly, A., Pop, C., Jisa, L., Pintea, S., Rusu, A. & David, D. (2012), “Exploratory Study: Children with Autism’s Awareness of Being Imitated by Nao Robot”, 1st International Conference “Technologies for Autism: Tools, Trends and Testimonials”.
4 Villano, M., Crowell, C., Wier, K., Tang, K., Thomas, B., Shea, N., Schmitt, L. & Diehl, J. (2011), “DOMER: A Wizard of Oz Interface for Using Interactive Robots to Scaffold Social Skills for Children with Autism Spectrum Disorders”, in Proceedings of the 6th International Conference on Human-Robot Interaction, ACM, 279–280.
5 Belpaeme, T., Baxter, P., Read, R., Wood, R., Cuayahuitl, H. et al. (in press), “Multimodal Child-Robot Interaction: Building Social Bonds”, Journal of Human-Robot Interaction.
6 Saldien, J., Goris, K., Vanderborght, B., Vanderfaeillie, J. & Lefeber, D. (2010), “Expressing emotions with the social robot Probo.” International Journal of Social Robotics 2(4), 377–389.
7 Vanderborght, B., Simut, R., Saldien, J., Pop, C., Rusu, A., Pintea, S., Lefeber, D. & David, D. (2012), “Using the social robot Probo as social story telling agent for children with ASD”, Interaction Studies 13(3), 348–372.
8 Pop, C., Simut, R., Pintea, S., Saldien, J., Rusu, A., Vanderfaeillie, J., David, D., Lefeber, D. & Vanderborght, B. (under review), “Identifying Situation-based Emotions using the Social Robot Probo: A Case Study in Autism Spectrum Disorders”, International Conference on Innovative Technologies for Autism Spectrum Disorders. ASD: Tools, Trends and Testimonials.
9 Simut, R., Pop, C., Vanderfaeillie, J., Lefeber, D. & Vanderborght, B. (2012), “Trends and future of social robots for ASD therapies: potential and limits in interaction”, International Conference on Innovative Technologies for Autism Spectrum Disorders. ASD: Tools, Trends and Testimonials.
Vision sensors provide a noncontact, natural, unobstructed way of recording, monitoring and analysing the everyday behaviours of people with ASD, e.g. by analysing the facial and head movements of ASD individuals1 as well as their eye gaze. These visual cues are important because children with ASD usually exhibit a number of atypical visual behaviours and viewing strategies, such as reduced gaze towards the eyes and a preference for the mouth2.
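To make the idea of quantifying such viewing strategies concrete, the sketch below shows one simple way it could be done: classify gaze fixation points against face regions and report the proportion directed at the eyes versus the mouth. This is an illustrative sketch only, not the DREAM software; the region coordinates and fixation data are invented for the example.

```python
# Illustrative sketch: quantify eye- vs mouth-directed gaze by counting
# fixation points that fall inside axis-aligned face-region boxes.

def in_region(point, region):
    """Return True if the (x, y) point lies inside region = (x0, y0, x1, y1)."""
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def gaze_region_proportions(fixations, eye_region, mouth_region):
    """Proportion of fixations landing on the eye and mouth regions."""
    if not fixations:
        return {"eyes": 0.0, "mouth": 0.0}
    n = len(fixations)
    eyes = sum(in_region(p, eye_region) for p in fixations)
    mouth = sum(in_region(p, mouth_region) for p in fixations)
    return {"eyes": eyes / n, "mouth": mouth / n}

# Hypothetical face regions (image coordinates) and recorded fixations.
eye_box = (30, 20, 90, 40)
mouth_box = (45, 70, 75, 90)
fixations = [(50, 30), (60, 80), (65, 85), (100, 100)]
props = gaze_region_proportions(fixations, eye_box, mouth_box)
print(props)  # {'eyes': 0.25, 'mouth': 0.5}
```

A real pipeline would of course obtain the fixation points from an eye tracker or gaze-estimation model and track the face regions frame by frame, but the summary statistic is of this simple form.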
Given that placing sensors on children may impede therapy, DREAM will use RGB-D sensors such as the Microsoft Kinect®, rather than high-precision wearable motion tracking devices, together with adaptive action and behaviour analysis software. These will be augmented by techniques for multi-sensor data fusion. Touch sensors and RFID will optionally be considered for capturing environment-related information.
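As a concrete illustration of the multi-sensor data fusion mentioned above, the sketch below combines noisy one-dimensional position readings from two sensors by inverse-variance weighting, a standard fusion technique. The sensor names and numbers are assumptions for the example, not values from the DREAM system.

```python
# Illustrative sketch: inverse-variance-weighted fusion of independent
# sensor estimates. More precise sensors (lower variance) get more weight,
# and the fused estimate has lower variance than any single sensor.

def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs from independent sensors.
    Returns (fused value, fused variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused_value, fused_var

# Two hypothetical head-position readings (metres along one axis):
sensor_a = (1.20, 0.04)   # noisier RGB-D sensor
sensor_b = (1.10, 0.01)   # more precise sensor
value, var = fuse_estimates([sensor_a, sensor_b])
print(value, var)  # 1.12 0.008
```

Note how the fused estimate (1.12 m) lies closer to the more precise sensor's reading, and its variance (0.008) is smaller than either input variance.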
1 Madsen, M., El Kaliouby, R., Goodwin, M. & Picard, R. (2008), “Technology for just-in-time in-situ learning of facial affect for persons diagnosed with an autism spectrum disorder”, in: ‘Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility’, ACM, 19–26.
2 Bird, G., Catmur, C., Silani, G., Frith, C. & Frith, U. (2006), “Attention does not modulate neural responses to social stimuli in autism spectrum disorders”, Neuroimage 31(4), 1614–1624.