Autonomous Vehicles the Natural Way

Living systems act in the world in order to perceive it the way they want it to be. In other words, behaviour consists of varying output in order to control and maintain perceptual variables at desired values. This is in contrast to the conventional wisdom of computing specific actions from what is perceived, based on a predictive or world model.
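As a concrete illustration, here is a minimal sketch of a single perceptual controller in C++. The structure follows the classic reference-minus-perception loop of perceptual control theory; the variable names and constants are ours, chosen purely for illustration. Note that the unit never computes which action the situation requires: it simply varies its output until the perception matches the goal, even when an unmodelled disturbance appears.

```cpp
#include <cstdio>

// One elementary perceptual control unit. It never decides which action the
// situation "requires"; it varies its output so that the *perceived* value
// stays at the reference value.
struct ControlUnit {
    double gain;     // loop gain: how strongly error drives output
    double slowing;  // smoothing factor on the output (leaky integrator)
    double output = 0.0;

    double step(double reference, double perception, double dt) {
        double error = reference - perception;              // goal - percept
        output += (gain * error - output) * dt / slowing;   // adjust output
        return output;
    }
};

int main() {
    ControlUnit unit{ /*gain=*/50.0, /*slowing=*/1.0 };
    double disturbance = 0.0, perception = 0.0;

    for (int t = 0; t < 2000; ++t) {
        if (t == 1000) disturbance = 5.0;   // the world pushes back, unmodelled
        // Environment: the percept combines the unit's own output with the
        // independent disturbance.
        perception = unit.output + disturbance;
        unit.step(/*reference=*/10.0, perception, /*dt=*/0.01);
    }
    // The perception ends up close to the goal of 10.0 despite the disturbance.
    std::printf("perception = %.2f\n", perception);
}
```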

The Perceptual Control perspective provides a radically different way of understanding intelligence and behaviour. Applied to robotics, the resulting architecture greatly simplifies the problem of building artificial systems.

Perceptual control systems define dynamic, goal-oriented, adaptive, autonomous agents, so the approach is ideally suited both to explaining how humans drive and to implementing artificial autonomous vehicles.

The functionality of an artificial behavioural system is achieved by building up layers of capabilities, where each layer adds increasingly sophisticated abilities and behaviours. Each layer dynamically sets the goals of the layer below, so any complex system, such as an autonomous vehicle, is composed of a multitude of simple goal-based controllers at multiple levels, acting simultaneously.
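To make the layering concrete, here is a hedged two-level sketch, again in C++ with invented names and constants (this is not RAPTA's API): the upper unit controls perceived headway, and its output is not an action at all but the reference value handed to the speed unit below it, whose output finally acts on the world as acceleration.

```cpp
#include <cstdio>

// A two-level hierarchy: the upper unit's OUTPUT becomes the lower unit's
// GOAL (reference), and only the lowest level acts on the world.
struct Unit {
    double gain, slowing, output = 0.0;
    double step(double ref, double percept, double dt) {
        output += (gain * (ref - percept) - output) * dt / slowing;
        return output;
    }
};

int main() {
    Unit headway{ /*gain=*/4.0, /*slowing=*/2.0 };  // upper: gap to car ahead
    Unit speed{ /*gain=*/8.0, /*slowing=*/0.25 };   // lower: own speed

    double my_pos = 0.0, my_speed = 0.0;
    double lead_pos = 30.0, lead_speed = 12.0;      // car in front, at 12 m/s

    for (int t = 0; t < 6000; ++t) {                // 60 simulated seconds
        double dt = 0.01;
        double gap = lead_pos - my_pos;
        // Upper layer: its output (sign flipped so that a too-large gap
        // raises the goal speed) becomes the lower layer's reference.
        double speed_ref = -headway.step(/*goal gap=*/20.0, gap, dt);
        // Lower layer: its output is the acceleration applied to the world.
        double accel = speed.step(speed_ref, my_speed, dt);
        my_speed += accel * dt;
        my_pos += my_speed * dt;
        lead_pos += lead_speed * dt;
    }
    // Speed settles at the lead car's 12 m/s and the gap holds near the
    // 20 m goal (the finite loop gain leaves a modest offset).
    std::printf("gap = %.1f m, speed = %.1f m/s\n",
                lead_pos - my_pos, my_speed);
}
```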

The methodology for the perceptual control approach to robotics is outlined in the paper: Rupert Young, "A General Architecture for Robotics Systems: A Perception-Based Approach to Artificial Life", Artificial Life, 23(2), Spring 2017, pp. 236–286.

The videos below demonstrate a proof of concept of perceptual control applied to the operation of an autonomous vehicle. The observed behaviour is not specifically programmed, or defined by way of predictive models of the world, but emerges from the interaction of a set of perceptual controllers, each of which has its own perceptual goal to reach and maintain.
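The flavour of that emergence can be sketched as follows (hypothetical controllers and a deliberately simplified environment, for illustration only): two units run side by side with no coordination and no world model, one keeping a perceived lane offset at zero and one keeping a perceived speed at its goal, and together they produce "driving along the lane at speed".

```cpp
#include <cstdio>

// Two controllers acting simultaneously, with no coordination between them
// and no model of the road. The environment is deliberately abstracted:
// the lane unit's output acts directly as a lateral rate.
struct Unit {
    double gain, slowing, output = 0.0;
    double step(double ref, double percept, double dt) {
        output += (gain * (ref - percept) - output) * dt / slowing;
        return output;
    }
};

int main() {
    Unit lane{ /*gain=*/1.0, /*slowing=*/0.5 };   // perceived lateral offset
    Unit cruise{ /*gain=*/4.0, /*slowing=*/0.5 }; // perceived speed

    double offset = 3.0, speed = 0.0;   // start 3 m off-centre, at rest
    for (int t = 0; t < 3000; ++t) {    // 30 simulated seconds
        double dt = 0.01;
        double lateral_rate = lane.step(/*goal offset=*/0.0, offset, dt);
        double accel = cruise.step(/*goal speed=*/25.0, speed, dt);
        offset += lateral_rate * dt;    // each loop is closed independently
        speed += accel * dt;
    }
    // Both percepts sit at their goals: offset near 0 m, speed near 25 m/s.
    std::printf("offset = %.2f m, speed = %.1f m/s\n", offset, speed);
}
```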

In the demonstration the blue car is implemented with the perceptual control system: it moves in order to control its perceptual inputs at goal values. The other vehicles act as its environment, and as disturbances to its goals. The behaviour of those other vehicles is determined by formal (and more complex) rule-based and model-based computations within the simulation software.

Although limited to a circular track and basic graphics, the current demonstration is a proof of concept of the basic principles of the perceptual control paradigm. In the real world the controlled signals would be provided by more sophisticated perceptual systems, such as vision. The first challenge in applying the perceptual control solution is to understand what perceptual variables are being controlled. Provided these are determined and the corresponding perceptual capabilities are put in place, then even in more complex situations, such as turning at a junction, the autonomous vehicle would operate in a similar way to that shown in this demonstration.

The long version:

The short version:

The autonomous vehicle in the demo was implemented with RAPTA, our perceptual control modelling software, which enables the design, configuration, execution and monitoring of perceptual control hierarchies. It was integrated with Vissim, the traffic simulation software, as part of a software evaluation exercise for possible use in a feasibility study for the Department for Transport, in association with the University of Manchester.

The integration was performed by running RAPTA's Java functions through the Java Native Interface (JNI), called from the C++ code of Vissim's Driver Model DLL.
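In outline, such a bridge typically looks like the following sketch: the C++ side embeds a JVM once, then calls a Java method on every simulation step. The class and method names used here (rapta/Hierarchy, step) are illustrative guesses at the kind of entry points involved, not RAPTA's actual API.

```cpp
#include <jni.h>

// The C++ side embeds a JVM once, then calls into Java on each step.
static JavaVM *jvm = nullptr;
static JNIEnv *env = nullptr;

bool start_jvm() {
    JavaVMOption options[1];
    options[0].optionString = const_cast<char *>("-Djava.class.path=rapta.jar");
    JavaVMInitArgs args{};
    args.version = JNI_VERSION_1_8;
    args.nOptions = 1;
    args.options = options;
    args.ignoreUnrecognized = JNI_FALSE;
    return JNI_CreateJavaVM(&jvm, reinterpret_cast<void **>(&env), &args) == JNI_OK;
}

// Create the Java-side control hierarchy and pin it across DLL calls.
jobject make_hierarchy() {
    jclass cls = env->FindClass("rapta/Hierarchy");      // illustrative name
    jmethodID ctor = env->GetMethodID(cls, "<init>", "()V");
    return env->NewGlobalRef(env->NewObject(cls, ctor));
}

// Called from the driver model on each step: hand the current percept to
// Java, receive the control output back.
double step_hierarchy(jobject hierarchy, double perception) {
    jclass cls = env->GetObjectClass(hierarchy);
    jmethodID step = env->GetMethodID(cls, "step", "(D)D"); // double -> double
    return env->CallDoubleMethod(hierarchy, step, perception);
}
```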