Journal of Neuroscience, vol. 42, no. 6, pp. 1141-1153
Publisher
Society for Neuroscience
Abstract
It is clear that humans can extract statistical information from streams of visual input, yet how the brain processes sequential images into an abstract representation of the mean feature value remains poorly explored. Using multivariate pattern analyses of electroencephalography recorded while human observers viewed ten sequentially presented Gabors of different orientations and estimated their mean orientation at the end, we investigated the sequential averaging mechanism by tracking the quality of individual and mean orientation representations as a function of sequential position. Critically, we varied the sequential variance of Gabor orientations to understand the neural basis of perceptual mean errors that occur during a sequential averaging task. We found that the mean-orientation representation emerged at specific delays from each sequential stimulus onset and became increasingly accurate as additional Gabors were viewed. Especially in frontocentral electrodes, the neural representation of mean orientation improved more rapidly and to a greater degree in less volatile environments, while individual orientation information was encoded precisely regardless of environmental volatility. Computational analysis of the behavioral data also showed that perceptual mean errors arise from the cumulative construction of the mean orientation rather than from low-level encoding of individual stimulus orientations. Thus, our findings reveal neural mechanisms that differentially accumulate an increasingly abstract feature from concrete pieces of information across the cortical hierarchy, depending on environmental volatility.