Continuous Attractor Networks with Realistic Neural Dynamics
My master’s thesis, Continuous Attractor Networks with Realistic Neural Dynamics, is available here.
Working memory, a cornerstone of cognitive functions like problem-solving and planning, has traditionally been linked to persistent neural activity. Attractor neural networks are known for forming persistent states, and existing models typically employ symmetric connectivity matrices. However, my research project investigated how asymmetric networks could produce neural dynamics that align more closely with experimental data.
Our recurrent neural network models were trained in TensorFlow on a generic working memory maintenance task, using a custom cost function tailored to a 1.5-second delay period. The trained networks were then analysed in several ways: fixed point analysis, principal component analysis (PCA) for visualizing network dynamics and attractor manifolds, and a decodability analysis.
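To give a flavour of the setup, here is a minimal sketch of this kind of training loop in TensorFlow. It is not the thesis code: the cue format, network size, time step, and stimulus duration are illustrative assumptions, and only the 1.5-second delay period is taken from the thesis.

```python
import numpy as np
import tensorflow as tf

DT = 0.01                      # assumed integration step (s)
STIM_STEPS = 25                # assumed cue duration (steps)
DELAY_STEPS = int(1.5 / DT)    # the 1.5 s delay period from the thesis
T = STIM_STEPS + DELAY_STEPS
N_REC = 100                    # assumed recurrent population size

def make_batch(batch_size=64):
    """A generic maintenance task: a scalar cue is shown briefly and must
    be reported by the readout throughout the delay."""
    cues = np.random.uniform(-1.0, 1.0, size=(batch_size, 1)).astype(np.float32)
    x = np.zeros((batch_size, T, 1), dtype=np.float32)
    x[:, :STIM_STEPS, :] = cues[:, None, :]      # cue on, then silence
    y = np.repeat(cues[:, None, :], T, axis=1)   # target at every time step
    return x, y

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(N_REC, activation="tanh",
                              return_sequences=True, input_shape=(T, 1)),
    tf.keras.layers.Dense(1),
])

def delay_loss(y_true, y_pred):
    # Score the readout only during the delay period, so the network is
    # judged purely on maintenance, not on its response to the stimulus.
    err = tf.square(y_true[:, STIM_STEPS:, :] - y_pred[:, STIM_STEPS:, :])
    return tf.reduce_mean(err)

model.compile(optimizer="adam", loss=delay_loss)
x, y = make_batch(512)
model.fit(x, y, batch_size=64, epochs=5)
```

Restricting the loss to the delay window is the important design choice here: the network is free to do whatever it likes while the stimulus is on, and is only penalised for failing to hold the cue afterwards.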
If you don’t want to read the whole thesis but do want to enjoy my favourite figure, then see this animation below (see section 3.4 of my thesis for a full explanation).
The figure above shows the dynamics of models trained with $M = 20$ initial conditions. Open circles correspond to $t = 0$. Left: symmetric network; right: unconstrained network. The symmetric network learns to solve the task by staying very close to its initial conditions, precluding a strongly dynamic code. The unconstrained network displays richer dynamics, which may reflect dynamic coding.
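For the curious, this kind of state-space view boils down to projecting the recurrent state trajectories onto their first two principal components. Below is a minimal sketch, not the thesis code: the array `hidden` and its shape are assumptions, standing in for the recurrent states collected over the $M$ trials.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

def plot_state_space(hidden):
    """Project trajectories onto the first two PCs.

    `hidden` is assumed to have shape (M_trials, T_steps, N_units),
    e.g. the sequence output of the recurrent layer for M initial
    conditions."""
    M, T, N = hidden.shape
    pca = PCA(n_components=2)
    # Fit PCA on all states pooled across trials and time, then
    # recover one 2-D trajectory per trial.
    proj = pca.fit_transform(hidden.reshape(M * T, N)).reshape(M, T, 2)
    for m in range(M):
        plt.plot(proj[m, :, 0], proj[m, :, 1], lw=0.8)
        # open circle at t = 0, matching the convention in the figure
        plt.plot(proj[m, 0, 0], proj[m, 0, 1], "o", mfc="none", mec="k")
    plt.xlabel("PC 1")
    plt.ylabel("PC 2")
    plt.show()
```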
Key Conclusions
- Demonstrated the creation of continuous attractor networks without strict requirements on connectivity matrices.
- Revealed that continuous ring attractor manifolds can emerge simply from training the network to store a discrete set of inputs.
- Demonstrated dynamic coding (more realistic dynamics) in asymmetric networks (more realistic networks).