A Dynamic Neural Field Model of Memory, Attention and Cross-Situational Word Learning

Authors: Bhat A. A., Spencer J. P., & Samuelson L. K.

Journal: Proceedings of CogSci 2018

Tags: word-learning, attention, memory, neural, DFT

Link: URL

Abstract:

WOLVES integrates dynamic neural fields for vision and language to model cross-situational word learning. Peaks represent sustained attention to objects and words; memory fields accumulate co-occurrence statistics. The model reproduces human looking and learning curves across 12 experiments.
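The co-occurrence accumulation described above can be sketched as a simple memory-trace update over a word-by-object map: cells strengthen where a word and an object are attended together and decay slowly elsewhere. This is a minimal illustration of the idea, not the model's actual equations; the function names, time constants, and build/decay form are assumptions.

```python
import numpy as np

def memory_trace_step(m, word_active, obj_active,
                      tau_build=100.0, tau_decay=2000.0, dt=1.0):
    """One Euler step of a word-object co-occurrence memory map.

    m            : (words, objects) trace in [0, 1]
    word_active  : (words,) attention to each word (0..1)
    obj_active   : (objects,) attention to each object (0..1)
    """
    coact = np.outer(word_active, obj_active)   # joint word-object activity
    build = coact * (1.0 - m) / tau_build       # strengthen co-active pairings
    decay = (1.0 - coact) * m / tau_decay       # slow forgetting elsewhere
    return m + dt * (build - decay)

# Toy schedule: word i always co-occurs with object i.
words, objects = 4, 4
m = np.zeros((words, objects))
for trial in range(500):
    w = np.zeros(words)
    o = np.zeros(objects)
    w[trial % 4] = 1.0
    o[trial % 4] = 1.0
    m = memory_trace_step(m, w, o)
# Diagonal (correct) pairings build up; off-diagonal cells stay near zero.
```

The fast-build/slow-decay asymmetry is what lets sparse, noisy co-occurrences accumulate into stable mappings across trials.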

Methodology:

Eight coupled fields evolve under local excitation and lateral inhibition. Word inputs excite word-object binding fields, and gaze follows activation peaks. Co-occurrence counts update associative maps across trials; simulations match children's preferential-looking data.
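The local-excitation/lateral-inhibition dynamics can be sketched as a 1D Amari-style neural field integrated with Euler steps, where a localized input drives a self-stabilizing activation peak. This is a generic single-field sketch, not WOLVES' eight-field architecture; the kernel amplitudes and widths, resting level, and time constant are placeholder values.

```python
import numpy as np

def mexican_hat(size=101, a_exc=8.0, s_exc=3.0, a_inh=4.0, s_inh=9.0):
    """Interaction kernel: narrow local excitation minus broad lateral inhibition."""
    x = np.arange(size) - size // 2
    return (a_exc * np.exp(-x**2 / (2 * s_exc**2))
            - a_inh * np.exp(-x**2 / (2 * s_inh**2)))

def dnf_step(u, stim, kernel, h=-5.0, tau=10.0, dt=1.0, beta=4.0):
    """One Euler step of tau * du/dt = -u + h + stim + conv(kernel, f(u))."""
    f = 1.0 / (1.0 + np.exp(-beta * u))              # sigmoidal firing rate
    interaction = np.convolve(f, kernel, mode="same")
    return u + dt * (-u + h + stim + interaction) / tau

# A Gaussian input at site 90 pulls the field above threshold there;
# lateral inhibition suppresses competing locations.
n = 181
u = np.full(n, -5.0)                                 # field starts at resting level
stim = 10.0 * np.exp(-(np.arange(n) - 90) ** 2 / (2 * 4.0 ** 2))
kernel = mexican_hat()
for _ in range(200):
    u = dnf_step(u, stim, kernel)
# A single suprathreshold peak forms near site 90 while the rest stays subthreshold.
```

In the full model, peaks of this kind in the visual and word fields stand in for sustained attention to an object or a word, and their coincidence drives the associative updates.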

Acknowledgements: