• 1. School of Information Science and Technology, Beijing University of Technology, Beijing 100124, P. R. China;
  • 2. Beijing Key Laboratory of Computational Intelligence and Intelligent Systems, Beijing 100124, P. R. China;
YU Naigong, Email: yunaigong@bjut.edu.cn

In animal navigation, head direction is encoded by head direction cells within the hippocampal formation and associated limbic structures of the brain. Even in darkness or unfamiliar environments, animals can estimate their head direction by integrating self-motion cues, although this process accumulates error over time and degrades navigational accuracy. Traditional strategies correct head direction with visual input, yet estimates obtained by combining visual scenes with self-motion information remain only partially accurate. This study proposed a calibration mechanism that dynamically adjusts the association between visual scenes and head direction according to the historical firing rates of head direction cells, without relying on specific landmarks. It also introduced a method to fine-tune error correction by modulating the strength of the self-motion input, thereby controlling the movement speed of the head direction cell activity bump. Experimental results showed that this approach effectively reduced the accumulation of self-motion errors and significantly improved the accuracy and robustness of the navigation system. These findings offer a new perspective for biologically inspired robotic navigation and underscore the potential of neural mechanisms for efficient, reliable autonomous navigation.
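The two ideas in the abstract, i.e., anchoring visual scenes to head direction via historical firing rates and scaling the self-motion input to control bump speed, can be illustrated with a minimal ring-attractor sketch. This is not the authors' implementation: the network size, gain, learning rate, time constants, and the helper functions (bump, decode, the single-scene association vector W_vis) are illustrative assumptions chosen only to make the mechanism concrete.

```python
# Minimal sketch of a head direction (HD) ring with (i) gain-modulated
# self-motion input that sets the bump's movement speed and (ii) a visual
# calibration term whose scene-to-HD association is weighted by the cells'
# historical (running-average) firing rates. All values are illustrative.
import numpy as np

N = 100                                               # HD cells on the ring
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)  # preferred directions

def bump(center, width=0.3):
    """Gaussian activity profile on the ring centred at `center` (rad)."""
    d = np.angle(np.exp(1j * (theta - center)))       # wrapped angular difference
    return np.exp(-d ** 2 / (2 * width ** 2))

def decode(r):
    """Population-vector readout of the bump position (rad)."""
    return np.angle(np.sum(r * np.exp(1j * theta)))

dt = 0.01          # simulation step (s)
vel_gain = 1.0     # self-motion gain: scaling this changes the bump's speed
lr_vis = 0.05      # learning rate of the scene-to-HD association
tau_hist = 2.0     # time constant of the firing-rate history (s)

r = bump(0.0)              # current HD cell firing rates
r_hist = r.copy()          # historical (low-pass filtered) firing rates
W_vis = np.zeros(N)        # association between one visual scene and HD cells

true_hd, est_hd = 0.0, 0.0
rng = np.random.default_rng(0)

for step in range(2000):
    omega = 1.0                                    # angular velocity cue (rad/s)
    true_hd = (true_hd + omega * dt) % (2 * np.pi)

    # Path integration: shift the bump with a noisy, gain-scaled velocity input.
    est_hd += vel_gain * (omega + rng.normal(0.0, 0.2)) * dt
    r = bump(est_hd)

    # Historical firing rate used to weight the visual association.
    r_hist += dt / tau_hist * (r - r_hist)

    # Whenever the (single, hypothetical) scene is observed: strengthen its
    # association with HD cells in proportion to their historical rates, then
    # inject the recalled association as a calibration current and re-read the
    # bump, pulling the estimate back toward the anchored direction.
    if step % 50 == 0:
        W_vis += lr_vis * r_hist
        r = r + 0.5 * (W_vis / (W_vis.max() + 1e-9))
        est_hd = decode(r)

print("true:", round(true_hd, 3), "estimated:", round(est_hd % (2 * np.pi), 3))
```

In this toy setup the scene is first seen while the estimate is still accurate, so the association anchors that heading; later visits recall it and cancel the drift accumulated by the noisy self-motion input, while vel_gain plays the role of the modulated self-motion strength described in the abstract.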

Copyright © the editorial department of Journal of Biomedical Engineering of West China Medical Publisher. All rights reserved