Research Article | NEUROSCIENCE

Connecting multiple spatial scales to decode the population activity of grid cells

Science Advances  18 Dec 2015:
Vol. 1, no. 11, e1500816
DOI: 10.1126/science.1500816



Abstract

Mammalian grid cells fire when an animal crosses the points of an imaginary hexagonal grid tessellating the environment. We show how animals can navigate by reading out a simple population vector of grid cell activity across multiple spatial scales, even though neural activity is intrinsically stochastic. This theory of dead reckoning explains why grid cells are organized into discrete modules within which all cells have the same lattice scale and orientation. The lattice scale changes from module to module and should form a geometric progression with a scale ratio of around 3/2 to minimize the risk of making large-scale errors in spatial localization. Such errors should also occur if intermediate-scale modules are silenced, whereas knocking out the module at the smallest scale will only affect spatial precision. For goal-directed navigation, the allocentric grid cell representation can be readily transformed into the egocentric goal coordinates needed for planning movements. The goal location is set by nonlinear gain fields that act on goal vector cells. This theory predicts neural and behavioral correlates of grid cell readout that transcend the known link between grid cells of the medial entorhinal cortex and place cells of the hippocampus.
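To make the multi-scale population readout concrete, here is a minimal sketch (not taken from the article, and not the authors' code): it simulates grid modules whose spatial periods form a geometric progression with the scale ratio of about 3/2 mentioned in the abstract, and recovers position by a brute-force maximum-likelihood match of noisy module phases. All numerical parameters (number of modules, smallest period, noise level) are illustrative assumptions.

```python
# Minimal sketch: 1-D position decoding from nested grid modules.
# Each module reports a noisy phase (position modulo its period), standing in
# for a population-vector readout of that module's activity; the location is
# recovered by a maximum-likelihood search over candidate positions.

import numpy as np

rng = np.random.default_rng(0)

n_modules = 5          # assumed number of grid modules
smallest_period = 0.3  # assumed smallest lattice period (meters)
ratio = 1.5            # geometric scale ratio ~3/2, as suggested in the abstract
phase_noise = 0.03     # assumed circular noise (fraction of a period)

periods = smallest_period * ratio ** np.arange(n_modules)

def encode(position):
    """Noisy phase in [0, 1) reported by each module for a given position."""
    phases = (position / periods) % 1.0
    return (phases + phase_noise * rng.standard_normal(n_modules)) % 1.0

def decode(observed_phases, x_max, resolution=1e-3):
    """Maximum-likelihood position: the candidate x whose module phases best
    match the observation under wrapped (circular) Gaussian-like errors."""
    candidates = np.arange(0.0, x_max, resolution)
    cand_phases = (candidates[:, None] / periods[None, :]) % 1.0
    d = np.abs(cand_phases - observed_phases[None, :])
    d = np.minimum(d, 1.0 - d)                     # circular phase distance
    log_like = -np.sum((d / phase_noise) ** 2, axis=1)
    return candidates[np.argmax(log_like)]

# Demo: a position far beyond the smallest period is still recovered,
# because the multiple scales jointly disambiguate the location.
x_max = periods[-1]          # decode within the largest period
true_x = 0.8 * x_max
estimate = decode(encode(true_x), x_max)
print(f"true position: {true_x:.3f} m, decoded: {estimate:.3f} m")
```

Increasing the phase noise in this toy model produces exactly the failure mode the abstract warns about: the likelihood peak can jump to a distant candidate, i.e., a large-scale localization error, rather than degrading gracefully.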

Keywords
  • grid cell
  • entorhinal cortex
  • spatial cognition
  • goal-directed navigation
  • self-localization
  • maximum likelihood decoding
  • population vector
  • nonlinear gain fields
  • goal-vector cells

This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial license, which permits use, distribution, and reproduction in any medium, so long as the resultant use is not for commercial advantage and provided the original work is properly cited.

