SenseMaker - a multi-sensory, task-specific adaptable perception system
Funded by the European Commission under the Life-like Perception Programme.
SenseMaker Demonstration site
Read more on the SenseMaker project.
In contrast to the majority of artificial sensing systems, animal and plant organisms have available a vast array of modalities for interaction with their environment. Perception in living organisms is thus a complex process going beyond the simple detection and measurement of sensory stimuli. It depends on the integration of sensory information entering through a number of different channels and, in higher animals, is subject to modulation by higher cognitive processes acting at the cortical level, or via descending brain pathways to early stages of the sensory processing chain.
Internal representations of the sensory world are formed through unification of information from different sources. Interaction of information arriving through different sensory pathways may be complementary for object identification. For example, auditory information - e.g. the "moo" of a cow - may help to identify the visual entity - i.e. the shape of the cow. Perception is not a fixed concept: it is significantly modulated by the internal state of the organism and by many contextual factors such as past experience, internal prediction, association, or on-going motor behaviour. Repetitive experience of a particular sensory-motor context may lead to functional habituation, based on central prediction of the sensory world. In contrast, active goal-directed exploration of sensory space may focus attention on a particular sensory modality, on a particular region of body space, or on a particular temporal sequence of motor command and re-afferent sensory input. The importance attributed to each sensory modality in constructing this integrated representation of the sensory world also depends on its working range: for instance, vision is often the preferred sense when light conditions are good but touch, hearing, or smell, or a combination of all three, may serve better for object recognition in the dark.
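The weighting of modalities by working range described above can be illustrated with a standard model from the cue-combination literature: maximum-likelihood fusion, where each modality's estimate is weighted by its reliability (inverse variance). This is a minimal sketch for illustration, not part of the SenseMaker design; the numbers are invented.

```python
def fuse(estimates, variances):
    """Inverse-variance weighted fusion of unimodal estimates.

    Each modality reports an estimate of the same quantity plus a
    variance reflecting its current reliability. The fused estimate
    weights each cue by 1/variance, so the more reliable sense
    dominates automatically as conditions change.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

# Daylight: vision is precise (low variance), touch is noisy,
# so the fused estimate stays near the visual one.
daylight = fuse([10.0, 14.0], [0.1, 4.0])

# Darkness: visual variance grows, touch now dominates,
# and the fused estimate moves toward the haptic one.
darkness = fuse([10.0, 14.0], [9.0, 0.5])
```

No explicit "switch" between senses is needed: the transition from vision-dominated to touch-dominated perception falls out of the reliability weighting alone.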
The range of sensory modalities available to living organisms is large. In order to deal with different sorts of environmental milieux, nocturnal or aquatic animals have developed a variety of senses, which higher animals, such as humans, do not possess. These include echolocation used by bats, the electric organ combined with active electric senses used by many tropical freshwater fish, the magnetic sense of certain moles and birds, infrared sensors used by certain snakes and beetles, lateral line mechanoreceptive or hydrodynamic sensory systems used by fish and some marine mammals, and sonar and ultrasound also used by aquatic mammals. Humans themselves have developed certain artificial sensors, which detect physical or chemical signals that are not perceived directly by living organisms, but which if made available in a biologically compatible manner might also extend the human sensory range. These include, for example, sensors to detect chemical pollutants, x-rays, radioactivity, electromagnetic fields or cosmic radiation.
The SenseMaker Project
The project has two principal aims. The first is to combine biological, physical and engineering approaches to produce a multi-sensory, task-specific adaptable perception system. The second is to advance knowledge of natural systems and to find the links between what we consider as biological principles and the mathematics that humans have used so effectively in the construction of intelligent machines.
The first aim of the project is thus to conceive and implement electronic architectures that embody the features of living perceptual systems reviewed above and are able to merge sensory information obtained through different sensory modalities into a unified perceptual representation of the environment. The architectural design of the SenseMaker machine will be based on biological principles of sensory receptor and nervous system function, inspired by experimental studies of several different sensory modalities. The system will include higher cognitive levels modelled on psychophysical research paradigms, whose function will be to implement dynamic rules of cross-modal integration, activity- and time-dependent algorithms for internal prediction, goal-directed attention, and transitions between dominant or convergent sensory modalities according to changing environmental parameters.
As in living systems, we will seek to create a representation of the proximal environmental space, which is largely independent of the sensory substrates. The electronic architecture will have the capacity for auto-reconfiguration, forming supplementary cross-connections between the sensory receptor level of a given modality and the higher stages of processing specific to another sensory modality. The ultimate ambition is thus to generate the capacity to create entirely new senses based on hybrid system design.
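The auto-reconfiguration idea can be pictured as a routing table that maps each sensor front-end to one or more processing chains, where cross-connections can be added at runtime so a novel sensor feeds an existing modality's pathway. This is a hypothetical sketch of the concept only; all names and the routing scheme are illustrative, not taken from the SenseMaker design.

```python
class PerceptionRouter:
    """Toy routing layer between sensor front-ends and processing chains."""

    def __init__(self):
        # sensor name -> list of processing chains (callables)
        self.routes = {}

    def connect(self, sensor, chain):
        """Add a (possibly cross-modal) connection at runtime."""
        self.routes.setdefault(sensor, []).append(chain)

    def deliver(self, sensor, signal):
        """Push a raw signal through every chain connected to this sensor."""
        return [chain(signal) for chain in self.routes.get(sensor, [])]


router = PerceptionRouter()
router.connect("camera", lambda s: f"visual:{s}")

# Cross-modal reconfiguration: route an electric-field sensor into the
# tactile processing chain, giving the system a "new" hybrid sense.
router.connect("e-field", lambda s: f"tactile:{s}")
result = router.deliver("e-field", "pulse")
```

The point of the sketch is that the representation downstream of the chains never needs to know which physical sensor produced the signal, which is the substrate independence the paragraph above describes.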
The project partners form a multi-disciplinary team of biologists, neuroscientists, engineers and computer scientists, comprising:
University of Ulster, UK (co-ordinator)
Intelligent Systems Research Centre, Faculty of Informatics
Prof. TM McGinnity, Dr LP Maguire

Centre National de la Recherche Scientifique, France
The Integrative and Computational Neuroscience Research Unit
Dr Y Fregnac, Dr K Grant, Dr A Destexhe, Dr J Lorenceau, Dr T Bal, Dr D Shulz

Ruprecht-Karls-Universität Heidelberg, Germany
Electronic Vision Group, Kirchhoff-Institut für Physik
Prof. Dr K Meier, Dr J Schemmel

Trinity College, Ireland
Visual Cognition Group, Institute for Neuroscience, Department of Psychology
Dr F Newell

ENSEIRB-CNRS Université Bordeaux, France
IXL Laboratory, School of Electronics
Dr Sylvie Renaud-Le Masson
SenseMaker Partners' Websites
SenseMaker Portal Server
The SenseMaker project commenced on 1 June 2002 and has a three-year duration.