Welcome to The Neuromorphic Engineer
Learning to correct orientation estimates using spatial memory

The survival of many animals depends on knowing where they are in relation to sources of food and shelter. While spatial navigation clearly requires memory (i.e., storage of and access to spatial knowledge) and an estimate of one's position within that knowledge, the algorithms and supporting neural mechanisms are poorly understood. One way to estimate one's position is to maintain a quantitative estimate by integrating velocities from a reference point (odometry, or dead reckoning); alternatively, position can be inferred approximately through the recognition of position-linked sensory memories. In mammals, neurons in the hippocampal formation (head-direction,1 grid,2 and place cells3) have been discovered that appear to support both of these functions. In the absence of external sensory cues, animals can navigate successfully, but errors accumulate over time. When sensory cues are present, drifts in navigation are not observed.4 This suggests that animals navigate using noisy internal estimates of their position and use sensory cues to correct for the drift caused by the noise.

To begin exploring how the brain could perform such online corrections, we have used a rotational odometry ('head-direction') system, developed as part of a larger bat-echolocation modeling project, to link sensed objects to different directions in memory.5,6 Our mixed hardware/software system offers a biologically plausible model of how the brain could use this memory to keep its noisy orientation estimate aligned with the environment. In the system, a sonar head is mounted on a rotating platform from which the rotation velocity can be measured (see Figure 1 for a schematic of the environment). The head is equipped with a simple sonar system that can detect objects and report their range.
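The drift problem that motivates this work can be illustrated with a toy simulation (this is not the hardware system described here, just a minimal Python sketch with assumed noise parameters): integrating a noise-corrupted angular velocity causes the internal heading estimate to wander away from the true heading.

```python
import random

def integrate_heading(omegas, dt=0.01, noise_std=0.05, seed=0):
    """Dead reckoning: integrate angular-velocity samples (rad/s).

    Returns the true final heading and the internal estimate, which
    integrates a noise-corrupted copy of each velocity sample.
    """
    rng = random.Random(seed)
    true_theta = 0.0
    est_theta = 0.0
    for omega in omegas:
        true_theta += omega * dt
        # The estimate sees only a noisy velocity, so its error
        # performs a random walk and grows without bound over time.
        est_theta += (omega + rng.gauss(0.0, noise_std)) * dt
    return true_theta, est_theta

# Constant rotation at 1 rad/s for 10 s (1000 samples at dt = 0.01).
true_theta, est_theta = integrate_heading([1.0] * 1000)
drift = abs(est_theta - true_theta)
```

Without an external reference, nothing in this loop can bound `drift`; that is the role the spatial memory plays below.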
A neuromorphic VLSI-based head-direction (HD) system integrates angular velocity to maintain an estimate of the head's orientation (see Figure 2 for a detailed block diagram). The system suffers from integration errors, which accumulate over time and cause the estimate to drift away from the actual orientation. To mitigate this problem, a group of conjunctive cells uses Hebbian learning to associate incoming sonar data with the location of activity in the HD system: these neurons correct for drift in the orientation estimate whenever sensory data are available. Expectation cells are also important. Like the conjunctive cells, they learn the association between orientation and sensory input, but they reflect the activity of the object cells (i.e., the sensory input). By comparing the object and expectation cells, the system can determine whether the HD estimate is aligned with the environment or a correction is required.

Figure 1. A sonar transducer is mounted on a rotating platform from which the rotation velocity can be measured. The grey cone represents the effective field of view of the sonar. The individual targets are distinguished by their radial distance from the head.6

Figure 2. Block diagram of the system. Blocks enclosed in the dashed-line box are implemented in software; the other blocks are in hardware. Black arrows indicate hardwired synaptic connections, grey arrows show the plastic synaptic connections, and white arrows indicate hardwired synaptic connections that provide teacher signals to guide the learning process.6

We conducted an experiment with two objects at 45° and 100° (see Figure 3). In the two cases shown, the head orientation and the HD estimate are initially aligned. As the head begins to rotate, the noisy integration accumulates errors and the estimate begins drifting away from the true orientation. In the case without learning, the errors accumulate and the orientation error grows over time (left panel).
In contrast, the learning example (right panel) shows that once the objects are associated with specific orientations, each subsequent encounter realigns the HD system with the stored orientation.

Figure 3. Results from an experiment with two targets present (at 45° and 100°). With no correction, the HD-system estimate accumulates error over time (left). With correction, as the estimate drifts, spatial memories of the targets are used to reset it to the correct orientation (right).6

In summary, we have demonstrated how sensory cues in the environment can be dynamically associated with internal states to compensate for (and potentially calibrate against) drift in integration. We plan to extend this system to two-dimensional odometry (i.e., place cells) and more sophisticated object recognition.
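The Hebbian association-and-reset scheme can be sketched in software. This is not the neuromorphic implementation described above, just a minimal Python illustration under assumed simplifications: 36 discrete HD cells (one per 10°), one "object cell" per sonar-distinguished target, and a winner-take-all readout of the learned weights.

```python
N_HD = 36    # assumed: one head-direction cell per 10 degrees
N_OBJ = 2    # assumed: two object cells, one per target

# Plastic object-cell -> HD-cell weight matrix, initially blank.
weights = [[0.0] * N_HD for _ in range(N_OBJ)]

def hd_cell(theta_deg):
    """Index of the HD cell whose 10-degree bin covers theta_deg."""
    return int(theta_deg % 360) // 10

def learn(obj, theta_est_deg, rate=1.0):
    """Hebbian update: strengthen the synapse between the active
    object cell and the currently active HD cell."""
    weights[obj][hd_cell(theta_est_deg)] += rate

def correct(obj, theta_est_deg):
    """On re-detecting an object, snap the drifted estimate back to
    the centre of the HD cell it was associated with (if any)."""
    row = weights[obj]
    if max(row) == 0.0:
        return theta_est_deg          # no memory yet: keep the estimate
    return row.index(max(row)) * 10 + 5

learn(0, 45.0)                 # object 0 first detected at estimate 45°
corrected = correct(0, 52.0)   # later, drifted estimate is reset to 45°
```

In the real system the teacher signals and expectation cells decide *when* such a reset is warranted; here the reset is applied unconditionally on every re-detection, purely for illustration.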