The EmoTED project explores Virtual Reality technologies and techniques to help children with ASD practise the mimicry of emotions. It involved two academic partners, ENSAM and LIUM, and one ASD association, Cocci’Bleue. The project was funded by a local-authority programme for a period of 18 months. It followed an overall action-research methodology with one main objective: to design and develop a facial-recognition prototype that helps children with ASD learn to mimic emotions. The facial-recognition technique allows the system to animate a mirroring avatar in real time according to the child’s face while the child performs the mimicry activity.
The LIUM research work focused on the instructional design of the application from a gamification perspective. We proposed to address the gamification design by combining an Instructional Design perspective with a Domain-Specific Modeling (DSM) framework. The first perspective lets us consider the mimicry application as a Technology Enhanced Learning environment that is set up by providing a learning scenario taking the children’s profiles into account. The DSM framework considers the learning scenario as a model conforming to a specific metamodel that formalizes the game mechanisms and instructional design elements as concepts, properties, and relations. Using the Eclipse Modeling Framework (EMF) tooling, we proposed an iterative and incremental meta-modeling/modeling process to identify, formalize, and validate the game mechanisms and dynamics. Its originality lies in the involvement of ASD experts: they co-design the proposals with the modeling experts. This approach was made possible by the development of a text-based simulator that helps, together with mock-ups of the future application, to simulate game sessions and interactions according to various parameters specified within the learning scenario.
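To make the metamodel/model distinction and the text-based simulation more concrete, the sketch below shows one possible way such a workflow could look. It is a minimal illustration in Python rather than the project's actual EMF/Ecore artifacts; every class, field, and parameter name here is a hypothetical assumption, not taken from the EmoTED metamodel.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Metamodel-level concepts (illustrative names, NOT the project's Ecore metamodel):
# an emotion to mimic, a reward rule (game mechanism), and a learning scenario
# that bundles them with a child profile.

@dataclass
class Emotion:
    name: str
    difficulty: int          # e.g. 1 (easy) to 3 (hard)

@dataclass
class RewardRule:
    points_on_success: int   # points granted for a successful mimicry
    max_attempts: int        # attempts allowed before moving on

@dataclass
class LearningScenario:
    child_profile: str
    emotions: List[Emotion]
    reward: RewardRule

def simulate_session(scenario: LearningScenario,
                     successes: List[bool]) -> Tuple[int, List[str]]:
    """Text-based dry run of a game session: given assumed mimicry
    outcomes, compute the score and a human-readable log that domain
    experts can review against the intended game dynamics."""
    score, log = 0, []
    for emotion, ok in zip(scenario.emotions, successes):
        if ok:
            score += scenario.reward.points_on_success
            log.append(f"{emotion.name}: mimicked "
                       f"(+{scenario.reward.points_on_success} pts)")
        else:
            log.append(f"{emotion.name}: not recognized "
                       f"(up to {scenario.reward.max_attempts} attempts)")
    return score, log

# A scenario instance, i.e. a model conforming to the concepts above.
scenario = LearningScenario(
    child_profile="beginner",
    emotions=[Emotion("joy", 1), Emotion("surprise", 2)],
    reward=RewardRule(points_on_success=10, max_attempts=3),
)
score, log = simulate_session(scenario, successes=[True, False])
print(score)
for line in log:
    print(line)
```

In this spirit, the simulator lets ASD experts inspect a session trace produced from a candidate scenario and flag game mechanisms that do not fit the children's needs, before any of the VR application is built.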