Humans are superior to computers and robots at combining multimodal sensing with learned knowledge to choose the best actions. This project aims to develop human-inspired self-models of the behaviour of humans and robots, as well as of their environment, and to show that these models can predict future actions accurately. It builds on the concept of embodied cognition from psychology and philosophy, which proposes that the body plays a central role in intelligence. The goal is to develop effective reasoning models as an alternative to more traditional reactive systems.
The novelty of this project lies in the design of self-models and in combining them with predictive systems to accurately model behaviour and predict future actions and events. The models will be applied in embedded systems and tested in the interdisciplinary fields of music and robotics, where we expect significant breakthroughs compared to the state of the art.
This builds on ideas from our earlier work on addressing scalability through incremental approaches and hardware system design. Furthermore, exploring these systems in an interdisciplinary way is expected to contribute to groundbreaking results. The research is divided into six work packages: (1) Sensing human motion and state, (2) Sensor data analysis, (3) Self-modeling with embodied cognition, (4) Predictive models, and one work package for each of the two use cases. The use cases are active music control and coordinated robotic actions, respectively.