Published:
Professor Oliver Lemon and his team in the Interaction Lab, in the School of Mathematical and Computer Sciences, have been successful in a recent call from the European Commission Horizon 2020 funding programme.
The proposal, MultiModal Mall Entertainment Robot, which will bring around €900,000 over four years to the University, follows on from the team's previous work on socially intelligent robot communication in the FP7 JAMES project.
The project team will develop a humanoid robot (based on Aldebaran's Pepper platform) able to engage and interact autonomously and naturally in the dynamic environment of a public shopping mall, providing an engaging and entertaining experience for the general public. Using co-design methods, they will work with stakeholders including customers, retailers, and business managers to develop truly engaging robot behaviours, such as telling jokes or playing games, as well as providing guidance and information and collecting customer feedback.
Crucially, the robot will exhibit behaviour that is socially appropriate, combining speech-based interaction with non-verbal communication and human-aware navigation.
To support this behaviour, the team will develop and integrate new methods from audio-visual scene processing, social-signal processing, high-level action selection, and human-aware robot navigation. Throughout the project, the robot will be deployed in a large public shopping mall in Finland.
Collaborators on the project include IDIAP, Aldebaran Robotics, and CNRS, with the project led by the University of Glasgow.