Award details

Developing probabilistic frameworks for the detailed analysis of insect visual navigation behaviours

Reference BB/F010052/1
Principal Investigator / Supervisor Professor Phil Husbands
Co-Investigators / Co-Supervisors Dr Bartholomew Baddeley, Professor Thomas Collett, Professor Paul Graham, Professor Andrew Philippides
Institution University of Sussex
Department School of Engineering and Informatics
Funding type Research
Value (£) 271,689
Status Completed
Type Research Grant
Start date 01/12/2007
End date 30/11/2010
Duration 36 months

Abstract

Our ultimate goal is to provide a general mathematical framework for analysing insect visual learning and navigation in a novel way that recognises both the importance of dealing with uncertainty and the relationship between actions and their sensory consequences. Such a framework has recently been applied with great success to the related Simultaneous Localisation And Mapping (SLAM) problem in robotics. By handling uncertainty probabilistically, recent methods are able to map extended environments. We intend to apply the same probabilistic tools to the problem of modelling uncertainty within the insect navigation paradigm. This will provide a genuinely novel approach to the analysis of insect behaviour. In order to implement the proposed approach we need to construct probabilistic sensory and motor models. We can utilise existing models of both the optics and motor systems of insects, as well as more abstract behavioural descriptions, by adapting them into probabilistic forms. While conceptually simple, these models are non-trivial to build, and their construction will constitute a significant part of the research effort. We will test the motor and sensory models in simulation and implement the approach on a large gantry robot fitted with a panoramic camera. A major benefit of the proposed framework is that hypothetical insect control schemes, visual algorithms and navigation models can be compared directly, with the certainty, reliability and accuracy of the final position estimates serving as the metric. Moreover, determining accurate motor models will allow us to build better video tracking software for automated data acquisition in behavioural experiments with ants and bees. Finally, we will investigate how insects solve the data-association problem (recognising whether a visual feature has been seen before or not), an outstanding issue in robot navigation and the main reason for failure in SLAM approaches.
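As an illustrative sketch only (not part of the award text), a "probabilistic" sensory or motor model in the sense used above returns a distribution over outcomes rather than a point prediction. The Python below, with invented parameter names and assumed Gaussian noise, shows one minimal way such models might be expressed.

    import numpy as np

    # Hypothetical probabilistic motor model: given a commanded step length and
    # heading, return the mean predicted position and a covariance expressing
    # motor noise (assumed Gaussian in step length and heading).
    def motor_model(pos, step, heading, sigma_step=0.05, sigma_heading=np.deg2rad(5)):
        mean = pos + step * np.array([np.cos(heading), np.sin(heading)])
        # Jacobian of position with respect to (step, heading) maps the
        # command-space noise into x-y covariance.
        J = np.array([[np.cos(heading), -step * np.sin(heading)],
                      [np.sin(heading),  step * np.cos(heading)]])
        cov = J @ np.diag([sigma_step**2, sigma_heading**2]) @ J.T
        return mean, cov

    # Hypothetical probabilistic sensor model: bearing to a landmark with
    # Gaussian noise, standing in for a low-acuity panoramic measurement.
    def sensor_model(pos, landmark, sigma_bearing=np.deg2rad(3)):
        d = landmark - pos
        return np.arctan2(d[1], d[0]), sigma_bearing**2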

Summary

Ants, bees and wasps successfully find their way back to their nest sites after long and often convoluted foraging trips. The fact that these impressive navigational feats are achieved despite relatively poor visual acuity and simple nervous systems makes these insects particularly interesting subjects for study. Most current models of insect navigation tend to ignore the problem of noise. In this project we intend to construct a general probabilistic framework for the modelling and analysis of real and synthetic behaviours subject to noise.
The idea stems from the probabilistic Simultaneous Localisation and Mapping (SLAM) approach to robot navigation, which assumes that: (1) all information and output is (at least to some degree) uncertain, and (2) motor commands and their sensory consequences are strongly coupled. Robust mapping performance is achieved by combining multiple sources of uncertain information. In SLAM, the navigating agent relies on having models of both its sensors and its motors. These models specify what a sensor reading should be if the agent is a given distance from a sensed object, and how its position will change in response to a given motor command. With these two models, the agent can predict the sensory consequences of its actions and, by combining these predictions with the actual sensor readings, obtain a better estimate of its own position and of the mapped features.
By replacing the robot measurement and movement models with models of insect sensory and motor capabilities, we can turn the SLAM approach into a tool for analysing real and simulated behaviours. The sole constraint on the models is that they specify a probability distribution over their outputs rather than a single point prediction. The sensory and motor models that we employ can be applied to problems at different levels. Once constructed, the sensor models and framework provide a standardised test environment for comparing different navigation algorithms. In the proposed framework, navigation strategies can be expressed as closed-loop motor models, that is, motor models that take sensory feedback into account. Because different navigation algorithms can be implemented subject to the same constraints on visual acuity and accuracy of movement, more meaningful comparisons can be made. We can also investigate how the fine structure of movements affects localisation performance by comparing how uncertainty in the estimated positions of landmarks changes in response to different motor patterns. Some patterns of movement will provide more useful information than others, and we can investigate how closely observed behaviours match theoretically optimal movements that maximally reduce uncertainty within the SLAM framework.
Having built and tested the motor and sensory models in simulation, we will transfer them to a large gantry robot fitted with a panoramic camera to allow a more comprehensive evaluation of different control algorithms. The gantry robot set-up will also enable us to investigate how insects solve the data-association problem, which is an outstanding issue in robot navigation. Data-association refers to the problem of recognising whether a visual feature has been seen before or not; errors in data-association are the main reason for failure in current SLAM approaches.
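As a minimal sketch of the "combine predictions with actual sensor readings" step described above (the Gaussian fusion rule here is an assumed simplification, not the project's actual models), fusing a motor-model prediction with a sensed position in one dimension reduces to a Kalman-style weighted average whose variance is smaller than either input:

    # Fuse a motor-model prediction of position with a position implied by a
    # sensor reading; both are Gaussians given as (mean, variance).
    def fuse(pred_mean, pred_var, obs_mean, obs_var):
        k = pred_var / (pred_var + obs_var)        # Kalman-style gain
        mean = pred_mean + k * (obs_mean - pred_mean)
        var = (1.0 - k) * pred_var                 # always smaller than pred_var
        return mean, var

    # Example: a prediction of x = 1.0 (variance 0.04) fused with an observation
    # implying x = 1.2 (variance 0.09) gives roughly (1.06, 0.028) -- a tighter
    # estimate than either source alone.
    print(fuse(1.0, 0.04, 1.2, 0.09))

This shrinking of uncertainty is the kind of quantity the framework would use to score how informative a given pattern of movement is.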
To approach the data-association question in real insects, we will record the fine structure of their movements during acquisition and then use the gantry to emulate their visual input over the course of learning.
Finally, these techniques provide a potential solution to the problem of automated video tracking for data acquisition. The movement models described above can be incorporated into a video tracking system: given some idea of where an insect is likely to move next, its position is easier to follow from frame to frame, leading to more robust tracking performance.
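Purely as a hypothetical illustration of the tracking idea above (the function name, the constant-velocity assumption and the gate threshold are all invented), a movement model can constrain frame-to-frame tracking by predicting where the insect should appear next and accepting only detections that fall within the prediction's uncertainty:

    import numpy as np

    # Gated nearest-neighbour tracking step: a constant-velocity movement model
    # predicts the next position; candidate detections in the new frame are kept
    # only if they lie inside a Mahalanobis gate around that prediction.
    def track_step(prev_pos, velocity, detections, pred_cov, gate=3.0):
        predicted = prev_pos + velocity                  # movement-model prediction
        inv_cov = np.linalg.inv(pred_cov)
        best, best_dist = None, np.inf
        for det in detections:
            diff = det - predicted
            dist = np.sqrt(diff @ inv_cov @ diff)        # Mahalanobis distance
            if dist < gate and dist < best_dist:
                best, best_dist = det, dist
        # Fall back to the prediction if nothing passes the gate (e.g. occlusion).
        return best if best is not None else predicted

The same gating idea is one common way of phrasing data-association: a feature counts as "seen before" only if it falls within the uncertainty region predicted for an already-mapped feature.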
Committee Closed Committee - Animal Sciences (AS)
Research Topics Neuroscience and Behaviour
Research Priority X – Research Priority information not available
Research Initiative X – not in an Initiative
Funding Scheme X – not Funded via a specific Funding Scheme