The BioGRAAM Project: Bio-inspired, Generic Robotic Autonomous And Adaptive Modules

The Holy Grail of robotics development is a robot with general-purpose capabilities and intelligence. In the seven decades since Alan Turing wrote about ‘thinking machines’, there has been continual hype about the imminent arrival of intelligent robots that will cook our dinner, drive our cars, explore Mars and help us in our old age. The reality has been very different: beyond some restricted uses, such as on the factory floor, the alluring promise and substantial potential impact have yet to materialise.

There is a reason for this: the nature of intelligence and behaviour is poorly understood in all sectors, including Psychology and Robotics. We have a resolution, one that significantly challenges current thinking and has the potential to overcome the impasse that has stifled progress towards the arrival of our robot companions.
We can prevail because of our radical understanding of the nature of behaviour and intelligence within living systems, that of perceptual control, and its application to the software control systems (the brains!) of robots through our RAPTA technology.

In contrast to convention, behaviour is about controlling how the world is perceived, not about computing specific actions. The Perceptual Control Theory (PCT) paradigm highlights the processes that make the difference between automatons and life and, hence, what has been missing from robotics to date. Our vision is to develop robotic systems that, at last, replicate the adaptive and purposeful nature of humans, in stark contrast to the rigid inflexibility of most existing systems.
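To make the contrast concrete, a single perceptual control loop can be sketched in a few lines of Python: the robot varies its output only to keep its perception close to a reference value. The class name, gain values and leaky-integrator output function below are illustrative assumptions for exposition, not the RAPTA implementation.

    class ControlUnit:
        """A single perceptual control unit: it varies its output so that
        its perception is kept close to a reference value."""

        def __init__(self, gain, slow=0.05):
            self.gain = gain    # how strongly error drives the output
            self.slow = slow    # smoothing factor for the integrating output
            self.output = 0.0

        def step(self, reference, perception):
            # Error is the difference between what we want to perceive
            # and what we currently perceive.
            error = reference - perception
            # A leaky-integrator output function: the output keeps changing
            # until the perception matches the reference, then settles.
            self.output += self.slow * (self.gain * error - self.output)
            return self.output

In use, the output drives the robot's actuators, the action changes the sensed variable, and the loop closes through the environment rather than through a pre-computed plan.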

The objective of the project, therefore, is to establish the core technology of generic robotic modules that can learn by themselves and be applied to any task. For this purpose, we will develop the RAPTA platform, an open source framework of software libraries and tools for designing, implementing and deploying the modules. Within the platform we will develop new machine learning mechanisms that give the modules the ability to adapt to unknown environments, and a new approach to active computer vision so that the modules can automatically learn new ways to perceive the world.
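As an indication of what 'learn by themselves' can mean in PCT terms, the classic mechanism is reorganisation: a unit's parameters drift at random, and the drift changes direction whenever control error stops improving. The sketch below, building on the hypothetical ControlUnit above, is a simplified illustration of that idea rather than the learning mechanism the project will necessarily adopt.

    import random

    def reorganise(unit, previous_error, current_error, step=0.2):
        # E. coli-style reorganisation (simplified): keep a random drift on the
        # unit's gain, and pick a new random direction whenever the control
        # error is no longer shrinking.
        if not hasattr(unit, "drift") or abs(current_error) >= abs(previous_error):
            unit.drift = random.uniform(-step, step)
        unit.gain += unit.drift
        return unit.gain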

We will develop and demonstrate our vision by implementing a series of robots that progress, cumulatively, to an end system of a bipedal, humanoid robot. This series will show how complex robot control systems can be assembled in a modular and incremental fashion.

The perceptual control approach unifies cognitive motivation, perception, action and the environment into a single dynamic component of a multi-levelled, hierarchical system. The main focus of the BioGRAAM project will therefore be the Cognitive Mechatronics aspect of the Robotics Core Technology topic. The hardware will be off-the-shelf mechatronics systems, modified for our purposes; the control system modules will bring this hardware to life, enabling the robots to interact dynamically and intelligently with the world.
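Continuing the earlier sketch, the hierarchical idea can be illustrated by letting a higher-level unit set the reference of the unit below it; the two-level arm example and its gains are hypothetical, not the project's actual module interface.

    # Building on the hypothetical ControlUnit above: a higher level controls a
    # more abstract perception by setting the reference of the level below it.
    hand_height = ControlUnit(gain=2.0)   # higher level: perceived hand height
    arm_angle = ControlUnit(gain=8.0)     # lower level: perceived joint angle

    def control_step(desired_height, sensed_height, sensed_angle):
        # The higher-level output becomes the lower-level reference...
        angle_reference = hand_height.step(desired_height, sensed_height)
        # ...and the lower-level output is what finally drives the motor.
        return arm_angle.step(angle_reference, sensed_angle)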

However, as our approach constitutes a model of the general function of intelligent systems, it also informs the other core technologies. The paradigm offers both a bottom-up and a top-down explanation of all levels of behaviour, and so indicates how AI and Cognition could be implemented in robotic systems. It also describes how behavioural systems, including humans and robots, interact and cooperate, and so provides considerable insight into Socially cooperative human-robot interaction.

Additionally, a crucial element of our paradigm is the way we think about behaviour and how to apply it to robots. Instead of thinking in terms of algorithmic sets of instructions, we need to think from the perspective of the robot, about the goals it wants to achieve. Therefore, to get developers to think differently about behaviour, we will build a model-based design configuration tool for designing, developing, executing and testing the modules for robotic systems. This tool will be the front end to the platform and will enable users to develop systems entirely graphically by selecting goals, thus avoiding the algorithmic mindset enforced by writing programming code.
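For illustration only, the kind of system description such a tool might generate is declarative, stating which perceptions to control and how they are layered, rather than listing steps; the structure and field names below are hypothetical, not the tool's actual format.

    # A hypothetical goal-oriented specification: what the robot should perceive,
    # not the instructions it should execute.
    grasp_task = {
        "goal": "object_in_hand",
        "perceptions": {
            "object_in_hand":   {"reference": 1.0, "gain": 5.0},
            "hand_over_object": {"reference": 1.0, "gain": 3.0},
            "gripper_closed":   {"reference": 1.0, "gain": 2.0},
        },
        # Higher-level goals set the references of the goals listed below them.
        "hierarchy": [
            ("object_in_hand", "hand_over_object"),
            ("object_in_hand", "gripper_closed"),
        ],
    }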

The project is led by the University of Manchester (Dr Warren Mansell) in conjunction with the robotics SME Perceptual Robots (Dr Rupert Young). Manchester provides the project coordination and theoretical foundation, while technical direction is led by Perceptual Robots. Mansell and Young are both experts in the underlying PCT paradigm, in their respective fields of Psychology and Robotics.

For the project we have brought together a pan-European consortium of experts comprising the University of Manchester (UK), Cyberbotics (Switzerland), Irida Labs (Greece), GII University da Coruna (Spain), Quasar Science Resources (Spain), BOC (Austria), JuMelia (UK), RoTechnology (Italy), Strane Innovation (France), Coventry University (UK) and Perceptual Robots (UK).