Research on the neural mechanism of multimodal sensory information integration and decision making

[ Instrument Network · Instrument Development ] On October 10th, the journal Neuron published a research paper entitled "Neural Correlates of Optimal Multisensory Decision Making Under Time-Varying Reliabilities with an Invariant Linear Probabilistic Population Code". The work was a collaboration between the Space Perception Research Group of the Center for Excellence in Brain Science and Intelligence Technology (Institute of Neuroscience), Chinese Academy of Sciences; the CAS Key Laboratory of Primate Neurobiology; the Shanghai Center for Brain Science and Brain-Inspired Technology; and the Computational Cognitive Neuroscience group at the University of Geneva, Switzerland.
Organisms live in complex and changing environments, in which the reliability of different sensory inputs often varies over time. For example, when a driver suddenly enters fog on a highway, the reliability of visual information about the road drops rapidly, so the brain must immediately adjust its strategy, relying more on instrument readings and on vestibular, tactile, and auditory cues to judge the speed and direction of the vehicle. Only then can it make critical decisions such as whether to brake or fine-tune the steering wheel; otherwise small mistakes may lead to serious consequences. A large body of psychophysical experiments has shown that humans and many other animals can use Bayesian inference to estimate the reliability of different information sources and weight the evidence according to that reliability, which are the key steps in integrating sensory information and optimizing decisions. However, the principle of neural computation by which the brain achieves this process has remained unclear.
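The reliability-weighted combination described above has a simple closed form for two independent Gaussian cues: each cue is weighted by its inverse variance, and the combined estimate is more precise than either cue alone. The sketch below illustrates the principle with made-up numbers (it is not the study's code); note how the "fog" case automatically shifts weight to the non-visual cue.

```python
import numpy as np

# Two independent cues about heading (degrees), each with its own noise level.
# Bayesian inference weights each cue by its inverse variance (reliability).
def bayes_combine(mu_a, sigma_a, mu_b, sigma_b):
    w_a = sigma_b**2 / (sigma_a**2 + sigma_b**2)   # weight on cue A grows as A gets more reliable
    mu = w_a * mu_a + (1 - w_a) * mu_b
    sigma = np.sqrt(sigma_a**2 * sigma_b**2 / (sigma_a**2 + sigma_b**2))
    return mu, sigma

# Clear view: vision (cue A) is reliable, so the estimate follows vision.
print(bayes_combine(2.0, 1.0, -1.0, 4.0))
# In fog: the visual sigma rises, and weight shifts to the other cue automatically.
print(bayes_combine(2.0, 8.0, -1.0, 4.0))
```

The combined sigma is always smaller than that of the better single cue, which is exactly the behavioral signature of optimal integration tested later in the article.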
To study the neural mechanism of optimal integration and decision making with multimodal sensory information in complex environments, the Space Perception Research Group of the center established a virtual-reality experimental platform that uses vestibular- and visual-based self-motion perception as the model system. On this platform, the researchers trained macaques to discriminate their own direction of motion using sensory information from the vestibular and visual modalities (Fig. 1A). Importantly, the motion stimuli provided by the system first accelerate and then decelerate; because the vestibular organs of the inner ear are sensitive to acceleration, while the visual channel is mainly sensitive to velocity, the two streams of sensory information received by the brain have different temporal dynamics, thereby simulating the complex multimodal input of the natural environment, in which the reliability of the evidence changes in real time. After training, the macaques could discriminate finer changes in their own heading under multimodal stimulation (vestibular + visual) than under single-modality stimulation (vestibular only or visual only), and the size of this behavioral improvement matched the prediction of Bayesian optimal integration theory (Fig. 1B). These results indicate that macaques can indeed improve perceptual accuracy by integrating information from different sensory channels, with almost no loss of information in the process (i.e., "optimally").
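The quantitative test of optimality mentioned here works as follows: the psychophysical threshold from each single-cue condition corresponds to the sigma of a cumulative-Gaussian fit, and Bayesian theory then predicts the combined-condition threshold from the single-cue thresholds alone. A minimal sketch, with illustrative numbers rather than the paper's measured thresholds:

```python
import numpy as np

# Under optimal (inverse-variance-weighted) integration, the predicted
# combined threshold follows from the two single-cue thresholds alone.
def predicted_combined_threshold(t_vestibular, t_visual):
    return np.sqrt((t_vestibular**2 * t_visual**2) /
                   (t_vestibular**2 + t_visual**2))

# Illustrative single-cue thresholds in degrees (not the study's data).
t_pred = predicted_combined_threshold(3.0, 2.5)
print(round(t_pred, 3))  # prediction is below the better single cue (2.5)
```

Comparing this prediction with the threshold actually measured in the combined condition is the standard test of whether an animal integrates cues optimally.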
While the macaques were discriminating their own direction of motion, the researchers recorded the electrophysiological activity of neurons in a decision-related region, the lateral intraparietal area (LIP), in the posterior parietal cortex. They found that under the two single-modality conditions, LIP neurons accumulate evidence over time from different physical quantities: vestibular evidence from acceleration and visual evidence from velocity (Figures 1C and 1D, blue and red curves). The vestibular and visual evidence received by the neurons therefore does have reliability that varies in real time. So under multimodal conditions, how do neurons implement the two key steps of Bayesian optimal integration, namely estimating reliability and carrying out the weighted operation? One hypothesis is that the decision center collects the spiking activity of the neurons in each small time window (for example, tens of milliseconds) in real time, first evaluates the reliability of the evidence within that window, and then adjusts the weights of the evidence accordingly. Although this scheme is theoretically feasible, it introduces delays into the decision process and requires the synaptic strengths of the sensory inputs to be adjusted in real time, so it is not necessarily realistic for a biological brain. In contrast, a hypothesis called the "invariant linear probabilistic population code (ilPPC)" proposes that the instantaneous firing activity of a neural population can directly represent the reliability of the input: in this case, neurons only need to sum the sensory inputs linearly with fixed synaptic weights to achieve Bayesian-optimal integration. The computation proposed by the ilPPC hypothesis is therefore simpler, faster, and more feasible for a biological brain.
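The key intuition behind ilPPC is that for Poisson-like population codes, the log-likelihood of the stimulus is linear in the spike counts, and the overall firing gain tracks reliability. So a downstream reader can integrate evidence by simply summing spikes across neurons and time: high-reliability moments contribute more spikes and are therefore weighted more, with no explicit reliability estimate and no synaptic re-weighting. A toy simulation of this idea (a sketch of the principle, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(1)
prefs = np.linspace(-60, 60, 61)               # preferred headings (deg)

def spikes(s, gain, width=15.0):
    # Poisson spike counts from Gaussian tuning curves; 'gain' tracks the
    # momentary reliability of the sensory evidence in this time window.
    return rng.poisson(gain * np.exp(-0.5 * ((prefs - s) / width) ** 2))

# Evidence arrives in windows of varying reliability; the readout just ADDS
# spike counts, so high-gain windows automatically carry more weight.
true_heading = 8.0
gains = [20.0, 2.0, 15.0, 1.0]                 # illustrative reliability profile
summed = sum(spikes(true_heading, g) for g in gains)

# For dense Gaussian tuning, the maximum-likelihood heading estimate from a
# Poisson population reduces to the spike-count-weighted mean of preferences.
estimate = (summed * prefs).sum() / summed.sum()
print(round(estimate, 1))  # close to the true heading of 8.0
```

The same summation trick extends across modalities: adding the vestibular and visual population responses with fixed weights yields a combined code whose total activity again reflects the combined reliability.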
To test whether the experimental data are consistent with the ilPPC hypothesis, the researchers constructed and refined a neural network model based on the ilPPC theoretical framework (Fig. 1E). Theoretical analysis and numerical simulation confirmed that simple linear summation across modalities and across time of the vestibular and visual inputs by the neural population in the network automatically implements the reliability-dependent weighting of evidence, thereby optimally accomplishing the multisensory decision task (Figure 1F). Importantly, the neuronal activity in the network model shows the same characteristics as the real macaque LIP data (Figures 1G and 1H), suggesting that when faced with sensory inputs whose reliability varies in real time in complex environments, the brain can indeed use an invariant linear probabilistic population code to implement Bayesian-optimal decision making. This work therefore provides, for the first time, experimental and computational support for the ilPPC framework of optimal multisensory decision making, identifies the computational rule by which decision neurons accumulate complex multimodal sensory evidence, and bridges a long-standing gap between the fields of multisensory integration and perceptual decision making.
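The "summation across time" in the model matters because, as noted earlier, the stimulus accelerates and then decelerates, so the two modalities are reliable at different moments. A small sketch with illustrative numbers (not the paper's stimulus parameters) makes the temporal offset explicit, assuming a Gaussian speed profile, vestibular evidence proportional to |acceleration|, and visual evidence proportional to speed:

```python
import numpy as np

# Gaussian speed profile: accelerate, then decelerate, over a 1.5 s trial.
t = np.linspace(0.0, 1.5, 301)                       # time (s)
speed = np.exp(-0.5 * ((t - 0.75) / 0.25) ** 2)      # visual evidence ~ speed
accel = np.abs(np.gradient(speed, t))                # vestibular evidence ~ |acceleration|

# The two evidence streams peak at different times, so a readout that sums
# both streams over the whole trial collects each modality's evidence
# precisely when that modality is most reliable.
print("visual peak at t =", t[np.argmax(speed)])
print("vestibular peak at t =", t[np.argmax(accel)])
```

Because |acceleration| of a Gaussian speed profile peaks on the rising and falling flanks while speed peaks in the middle, plain temporal summation naturally produces the early vestibular and mid-trial visual weighting seen in the data.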
The study was completed under the joint supervision of Dr. Gu Yong of the Center for Excellence in Brain Science and Intelligence Technology (Institute of Neuroscience), Chinese Academy of Sciences, and Professor Alexandre Pouget of the University of Geneva. Hou Han was the first author of the paper, and Ph.D. students Zheng Qihao and Zhao Yuchen participated in the collection and analysis of the experimental data. The study was funded by basic science funds from both China and Switzerland.
