This repository supports an ongoing effort to build a systems and computational neuroscience course for USTC students. The goal is a two-semester sequence: the first semester covers foundational material, and the second introduces more advanced topics for undergraduate and graduate students.
| | |
|---|---|
| When | Monday 2:00 pm – 3:35 pm, Spring 2026 |
| Where | G3-110, High-Tech Campus (高新区) |
| Instructor | 温泉 (Quan Wen) <qwen@ustc.edu.cn> |
| TA | 胡博洲 (Bozhou Hu) <hubozhou@mail.ustc.edu.cn> |
| Component | Weight |
|---|---|
| Homework | 70% |
| Final exam (take-home) | 30% |
- High-school-level knowledge of biology and neuroscience
- Proficiency in Python, MATLAB, or Julia
- Working knowledge of multivariate calculus, probability theory, linear algebra, and differential equations
Core
- Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems
- Principles of Neural Design
- Principles of Neurobiology
- Theoretical Neuroscience: Understanding Cognition
General audience
The emergence of intelligence and behavior from the complex interactions within the brain remains one of the most significant unsolved mysteries in modern science. In the last decade, rapid advancements in experimental tools have enabled us to monitor and manipulate brain circuits with unprecedented precision. Yet neuroscientists are still navigating the intricate landscapes of brain structures and dynamics. Mathematical theory has become essential for integrating seemingly unrelated evidence, generating new insights, guiding experiments, and identifying organizing principles of brain function.
This course explores how physics, engineering, and mathematics have shaped our understanding of the brain — in particular, the relationship between structure, dynamics, representation, and behavior. A central theme is comparing biological learning rules and architectures with modern machine learning methods. Special topics may include wiring optimization in neural circuits, attractor and chaotic dynamics in neural networks, sensory and motor representations, biological learning rules, Hopfield networks, and hierarchical control of behavior.
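To make the comparison between biological learning rules and machine-learning methods concrete, here is a minimal sketch of Hebbian storage and attractor recall in a Hopfield network. The network size, the number of stored patterns, and the amount of corruption are arbitrary choices for illustration, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                       # number of binary (±1) units
patterns = rng.choice([-1, 1], size=(3, n))  # three random patterns to store

# Hebbian storage: each weight sums the co-activations of its two units
# across the stored patterns; no self-connections
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

def recall(state, steps=50):
    """Iterate synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1                    # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping 10 of its 64 bits, then recall it
probe = patterns[0].copy()
probe[rng.choice(n, size=10, replace=False)] *= -1
restored = recall(probe)
print((restored == patterns[0]).mean())      # overlap with the stored pattern
```

Because only three patterns are stored (well below the classical capacity of roughly 0.14 patterns per unit), recall from the corrupted probe falls back into the stored attractor.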
The course consists of eight lectures (~90 minutes each). This semester, we explore the connections between natural and artificial intelligence.
- Biophysics of real neurons and dendritic integration vs. artificial activation functions
- Energy efficiency and parallel processing across biological and GPU architectures
- Information encoding (rate vs. temporal coding) and population vectors
- Network dynamics and neural manifolds; random matrix theory and stability in RNNs
- Biological plasticity (Hebbian learning and STDP) vs. gradient descent and backpropagation
- Biological approximations of backprop (e.g., predictive coding)
- The dopamine system and reward prediction errors in the basal ganglia
- Temporal difference (TD) learning and Q-learning mapped to biological pathways
- Visual hierarchies (retina to IT cortex) vs. convolutional neural networks (CNNs)
- Object recognition invariances and robustness in natural vs. artificial vision
- The hippocampal–entorhinal system: place cells, grid cells, and systems consolidation
- Catastrophic forgetting in AI and solutions via experience replay and complementary learning systems
- Motor subspace dynamics and hierarchical control of behavior
- Sensorimotor integration in model organisms (C. elegans, larval zebrafish) and implications for robotic control
- Few-shot learning and unsupervised learning gaps
- Evaluating modern architectures (Transformers/LLMs) against known biological cognitive structures
```
.
├── Books/             # Reading list and book resources
├── Homework/          # Problem sets and final exam
├── Just_For_Fun/      # Optional explorations and demos
├── Notes_of_Teacher/  # Lecture notes
├── Papers/            # Assigned and supplementary papers
└── Slides/            # Lecture slides
```