This doctoral thesis investigates the computational intricacies of the human brain, exploring cortical microcircuits, the extent of their information processing capacity (IPC), and their role in memory and temporal difference learning.
By reproducing and extending a network model of a cortical column, the first part shows that data-based connectivity improves computation by sharpening internal representations rather than by increasing retention time.
The second part introduces a novel application of the IPC metric to spiking neural networks (SNNs). This approach provides a comprehensive profile of the functions computed by SNNs, encompassing both memory and nonlinear processing. The study examines various encoding mechanisms and shows that the metric is predictive of performance in tasks with varying demands on nonlinear processing and memory. This exploration not only extends the utility of the IPC metric to more complex neural networks but also offers deeper insight into their computational capabilities.
The third part tests a hypothesis about the computation of temporal difference (TD) errors in the brain, focusing on two populations of cortical layer 5 neurons: CCS and CPn cells. By evaluating the memory of network models based on these populations through the lens of IPC, the research supports their proposed role in the computation of TD errors for continuous rate networks. SNN models, however, prove more challenging, exhibiting little ability to retain previous inputs.
In summary, this work extends existing research results and develops new methods for analyzing SNNs. It lays a solid foundation for future studies of the brain's computational processes and presents advanced tools and methods for exploring the intricate workings of biologically inspired neural networks.