LENS: Locational Encoding with Neuromorphic Systems
LENS is a compact, brain-inspired localization system for autonomous robots. By combining a spiking neural network, a dynamic vision sensor, and a neuromorphic processor on a single SPECK™ chip, LENS performs real-time, event-driven place recognition with models 99% smaller and over 100× more energy-efficient than conventional systems. Deployed on a hexapod robot, it can learn and recognize over 700 places using fewer than 44k parameters, demonstrating the first large-scale, fully event-driven localization system on a mobile platform.
Applications
- Search and Rescue Robots: Navigate unstable environments without GPS or maps.
- Warehouse Automation: High-speed tracking with low energy use.
- Planetary Rovers: Suitable for Mars or lunar environments with no GPS.
- Wearable Robotics: Integration into assistive devices for indoor navigation.
As autonomous robots evolve toward greater adaptability and intelligence, traditional localization techniques often fall short in dynamic, unstructured environments. Locational Encoding with Neuromorphic Systems (LENS) represents a breakthrough in biologically inspired localization by emulating the spatial navigation mechanisms of the mammalian brain. Leveraging neuromorphic hardware and neural algorithms, LENS provides robust, power-efficient, and real-time localization for mobile robots operating in GPS-denied or visually complex environments.
Architecture of LENS
1. Sensor Input Layer
- Event-based Cameras: Unlike traditional frame-based vision, these sensors provide asynchronous, high-temporal-resolution visual input (see the representation sketch after this list).
- Inertial Measurement Units (IMUs): Capture real-time motion and orientation.
- Wheel Encoders / LIDAR: Optional modules depending on the robot’s form factor.
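To make the event-based input concrete, the sketch below shows one common way to represent a DVS event stream and bin it into a 2D count frame. This is a generic NumPy illustration, not code from the LENS implementation; the field names and binning strategy are assumptions.

```python
import numpy as np

# A DVS emits sparse events rather than frames. Each event carries a
# timestamp (microseconds), pixel coordinates, and a polarity flag that
# says whether brightness went up (+1) or down (-1).
event_dtype = np.dtype([("t", np.uint64), ("x", np.uint16),
                        ("y", np.uint16), ("p", np.int8)])

def bin_events(events: np.ndarray, width: int, height: int) -> np.ndarray:
    """Accumulate a slice of the event stream into a 2D count frame.
    Many SNN pipelines feed such sparse frames (or the raw events)
    into the first spiking layer."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (events["y"], events["x"]), 1)
    return frame

# Toy usage: three synthetic events on a 4x4 sensor.
events = np.array([(10, 0, 1, 1), (12, 2, 3, -1), (15, 0, 1, 1)],
                  dtype=event_dtype)
print(bin_events(events, width=4, height=4))
```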
2. Spiking Neural Network (SNN) Core
- Implements models of grid cells and place cells.
- Encodes velocity, acceleration, and visual landmarks into spike trains.
- Learns spatial representations through unsupervised Hebbian learning (sketched after this list).
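The Hebbian rule mentioned above can be sketched in a few lines: weights strengthen whenever pre- and postsynaptic neurons spike in the same timestep. This is a minimal, generic version with assumed learning-rate and threshold values, not the exact update used in LENS.

```python
import numpy as np

def hebbian_step(w, pre, post, lr=0.01, w_max=1.0):
    """One Hebbian update: dw = lr * (post outer pre) for coincident spikes.
    `pre` and `post` are binary spike vectors for a single timestep;
    clipping to [0, w_max] keeps the rule from running away."""
    w = w + lr * np.outer(post, pre)
    return np.clip(w, 0.0, w_max)

# Toy network: 8 sensory inputs -> 4 place-like output neurons.
rng = np.random.default_rng(0)
w = rng.uniform(0.0, 0.1, size=(4, 8))
for _ in range(100):
    pre = (rng.random(8) < 0.2).astype(float)   # random input spikes
    post = (w @ pre > 0.15).astype(float)       # crude threshold neurons
    w = hebbian_step(w, pre, post)
print(w.round(2))   # co-active input/output pairs end up strongly wired
```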
3. Neuromorphic Hardware Backend
- Loihi (Intel), SpiNNaker, or BrainScaleS systems provide energy-efficient, low-latency processing of SNNs.
- Parallelizes spike computations, making real-time localization feasible (see the back-of-the-envelope comparison below).
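The efficiency argument can be made tangible with a simple operation count: a dense, frame-based layer touches every synapse on every frame, while an event-driven layer only touches the synapses of inputs that actually spiked. The layer sizes and spike rate below are illustrative assumptions, not figures from the LENS paper.

```python
# Why sparse, event-driven computation is cheap: count synaptic operations.
inputs, outputs = 1024, 256
dense_ops = inputs * outputs                     # every synapse, every frame

spike_rate = 0.02                                # assume ~2% of inputs spike
event_ops = int(inputs * spike_rate) * outputs   # only active inputs fan out

print(f"dense:  {dense_ops:,} synaptic ops per frame")
print(f"sparse: {event_ops:,} synaptic ops per step "
      f"({dense_ops / event_ops:.0f}x fewer)")
```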
4. Map Formation and Path Integration
- Uses self-motion cues (proprioceptive input) to track position via path integration.
- Visual and auditory landmarks act as correction signals to prevent drift over time (see the sketch below).
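A minimal sketch of path integration with landmark correction follows. It uses plain dead reckoning rather than the neural implementation, and the blending factor and landmark format are assumptions made for illustration.

```python
import numpy as np

def path_integrate(velocities, dt, landmark_fixes, alpha=0.5):
    """Dead-reckon position from self-motion cues, correcting at landmarks.
    velocities: (T, 2) array of velocity estimates (m/s).
    landmark_fixes: {timestep: (x, y)} absolute positions from recognized
        landmarks; alpha blends each fix into the running estimate."""
    pos = np.zeros(2)
    trajectory = [pos.copy()]
    for t, v in enumerate(velocities):
        pos += v * dt                     # integrate self-motion (drifts)
        if t in landmark_fixes:           # landmark acts as correction signal
            pos = (1 - alpha) * pos + alpha * np.asarray(landmark_fixes[t])
        trajectory.append(pos.copy())
    return np.array(trajectory)

# Toy usage: noisy constant-velocity motion with one landmark fix.
rng = np.random.default_rng(1)
vels = np.tile([1.0, 0.0], (50, 1)) + rng.normal(0, 0.05, (50, 2))
traj = path_integrate(vels, dt=0.1, landmark_fixes={25: (2.6, 0.0)})
print(traj[-1])   # drift stays bounded thanks to the correction
```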
Accurate and reliable localization is a foundational capability for autonomous robots navigating real-world environments. Conventional approaches such as SLAM (Simultaneous Localization and Mapping), GPS-based positioning, and vision-based methods are widely used. However, they often face limitations in environments with poor lighting, occlusion, dynamic obstacles, or GPS unavailability.
To address these limitations, Locational Encoding with Neuromorphic Systems (LENS) introduces a new paradigm, grounded in neuroscience, by mimicking the brain’s navigation network, particularly grid cells and place cells in the hippocampus. This bio-inspired approach combines spiking neural networks (SNNs) with neuromorphic processors, offering high efficiency, noise resilience, and fast adaptation to novel terrains.
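For intuition, a place cell is often modeled as a neuron whose firing rate peaks at a preferred location and falls off with distance. The Gaussian tuning model below is a textbook simplification, not the representation LENS learns; the field width and peak rate are assumed values.

```python
import numpy as np

def place_cell_rate(pos, center, sigma=0.5, max_rate=20.0):
    """Gaussian place-field model: firing rate (Hz) as a function of
    distance between the robot and the cell's preferred location."""
    d2 = np.sum((np.asarray(pos) - np.asarray(center)) ** 2)
    return max_rate * np.exp(-d2 / (2 * sigma ** 2))

# A population of place cells tiling a corridor: the pattern of rates
# across the population encodes where the robot currently is.
centers = [(x, 0.0) for x in np.linspace(0, 4, 5)]
rates = [place_cell_rate((1.2, 0.0), c) for c in centers]
print([f"{r:.1f}" for r in rates])   # peaks at the cell nearest x = 1.2
```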
QUT robotics researchers have developed a new robot navigation system that mimics neural processes of the human brain and uses less than 10 per cent of the energy required by traditional systems.
In a study published in the journal Science Robotics, the researchers detail a new system which they call LENS – Locational Encoding with Neuromorphic Systems.
LENS uses brain-inspired computing to set a new, low-energy benchmark for robotic place recognition.
The research, conducted by first author neuroscientist Dr Adam Hines along with Professor Michael Milford and Dr Tobias Fischer, all from the QUT Centre for Robotics and the QUT School of Electrical Engineering and Robotics, uses an approach called neuromorphic computing.
“To run these neuromorphic systems, we designed specialised algorithms that learn more like humans do, processing information in the form of electrical spikes, similar to the signals used by real neurons,” Dr Hines said.
“Energy constraints are a major challenge in real-world robotics, especially in fields like search and rescue, space exploration and underwater navigation.
“By using neuromorphic computing, our system reduces the energy requirements of visual localisation by up to 99 per cent, allowing robots to operate longer and cover greater distances on limited power supplies.
“We have known neuromorphic systems could be more efficient, but they’re often too complex and hard to use in the real world – we developed a new system that we think will change how they are used with robots.”
In the study, the researchers developed LENS, a system able to recognise locations along an 8km journey while using only 180KB of storage – almost 300 times less than other systems.
LENS combines a brain-like spiking neural network, a special camera that reacts only to movement, and a low-power chip, all on one small robot.
“This system demonstrates how neuromorphic computing can achieve real-time, energy-efficient location tracking on robots, opening up new possibilities for low-power navigation technology,” Dr Hines said.
“Lower energy consumption can allow remotely operated robots to explore for longer and further.
“Our system enables robots to localise themselves using only visual information, in a way that is both fast and energy efficient.”
Dr Fischer, ARC DECRA Fellow, said the key innovation in the LENS system was a new algorithm that exploited two types of promising bio-inspired hardware: sensing, via a special type of camera known as an “event camera”, and computing, via a neuromorphic chip.
“Rather than capturing a full image of the scene that takes in every detail in each frame, an event camera continuously senses changes and movement every microsecond,” Dr Fischer said.
“The camera detects changes in brightness at each pixel, closely replicating how our eyes and brain process visual information.
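That description maps onto the standard event-camera pixel model: a pixel emits an ON or OFF event whenever its log-brightness changes by more than a contrast threshold since its last event. The sketch below implements this generic model; the threshold value is an assumption, and this is not the SPECK™ sensor's firmware.

```python
import numpy as np

def emit_events(ref_log_I, new_log_I, threshold=0.2):
    """Generic DVS pixel model: +1/-1 events where log-brightness has
    changed by more than `threshold` since the pixel's last event."""
    diff = new_log_I - ref_log_I
    pol = np.zeros_like(diff, dtype=np.int8)
    pol[diff > threshold] = 1            # brightness increased -> ON event
    pol[diff < -threshold] = -1          # brightness decreased -> OFF event
    # Pixels that fired reset their reference level; others keep theirs.
    ref = np.where(pol != 0, new_log_I, ref_log_I)
    return pol, ref

# Static pixels emit nothing; only the brightening edge produces events.
ref = np.log(np.full((2, 4), 100.0))
frame = np.full((2, 4), 100.0)
frame[:, 2] = 180.0                      # an edge moves into column 2
pol, ref = emit_events(ref, np.log(frame))
print(pol)
```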
“Knowing where you are, also known as visual place recognition, is essential for both humans and robots.
“While people use visual cues effortlessly, it’s a challenging task for machines.”
Professor Michael Milford, director of the QUT Centre for Robotics, said the study was representative of a key theme of research conducted by the centre’s researchers.
“Impactful robotics and tech means not only pioneering ground-breaking research, but also doing all the translational work to ensure it meets end user expectations and requirements,” Professor Milford said.
“You can’t just do one or the other.
“This study is a great example of working towards energy-efficient robotic systems that provide end-users with the performance and endurance they require for those robots to be useful in their application domains.”
Read the full article, “A compact neuromorphic system for ultra energy-efficient, on-device robot localization”, published online in Science Robotics.
Top image: Dr Adam Hines with his ‘green’ robot. L-R: Dr Tobias Fischer, Professor Michael Milford and Dr Adam Hines.
Media contact:
Rod Chester
QUT Media
07 3138 2361 / 0407 585 901 (After Hours)
Other Recent Developments and Research
Recent efforts in the LENS domain include:
- Western Sydney University’s MARCS Institute: Demonstrated LENS-style localization on mobile robots with Intel’s Loihi.
- ETH Zurich Neuromorphic Lab: Tested SNN-based grid cell models in obstacle-rich environments.
- DARPA Neovision2 Program: Integrated LENS-like methods into surveillance drones.