Thinking ahead: prediction in context as a keystone of language in humans and machines

Here we provide empirical evidence for a deep connection between autoregressive deep language models (DLMs) and the human language faculty, using a 30-minute spoken narrative and electrocorticographic (ECoG) recordings.


Finally, we demonstrate that contextual embeddings derived from autoregressive DLMs capture neural representations of the unique, context-specific meaning of words in the narrative.
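To make the encoding approach concrete, the following is a minimal sketch, not the authors' pipeline, of how contextual embeddings from an autoregressive DLM could be related to neural activity word by word. The specific model (GPT-2 via Hugging Face transformers), the ridge-regression encoding model, and the stand-in ECoG response array are all illustrative assumptions.

```python
# Minimal sketch: extract contextual embeddings from an autoregressive DLM and
# fit a linear encoding model to (stand-in) neural data. Model choice, regression
# method, and the synthetic ECoG signal are assumptions for illustration only.
import numpy as np
import torch
from transformers import GPT2Tokenizer, GPT2Model
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

# A short stand-in for the spoken narrative transcript.
narrative = "so there I was, standing in line at the bank, when the alarm went off"
inputs = tokenizer(narrative, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# One contextual embedding per token, taken here from the final hidden layer.
embeddings = outputs.hidden_states[-1].squeeze(0).numpy()  # (n_tokens, 768)

# Hypothetical per-token neural response at one electrode (random stand-in data).
rng = np.random.default_rng(0)
ecog = rng.standard_normal(embeddings.shape[0])

# Linear encoding model: predict neural activity from contextual embeddings,
# evaluated with cross-validation.
scores = cross_val_score(Ridge(alpha=1.0), embeddings, ecog, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.3f}")
```

With real ECoG data, the stand-in response would be replaced by the measured signal aligned to each word's onset, and the cross-validated fit would quantify how well the contextual embeddings capture that electrode's word-by-word activity.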