SNL 2023: Exploring Language Neuroscience
Abigail, Sarah and Marion participated in SNL 2023 in Marseille!
SNL 2023 was a blast!
Three days filled with amazing discussions led by renowned experts in various fields of language neuroscience.
Opening Presentation by Jean-Rémi King
The grand event commenced with a captivating presentation by Jean-Rémi King, who introduced the audience to cutting-edge computational projects at Meta. He presented recent work entitled “The Unreasonable Effectiveness of AI in Neurosciences” (a nod to physicist Eugene Wigner’s 1960 essay “The Unreasonable Effectiveness of Mathematics in the Natural Sciences”), in which researchers trained artificial intelligence (AI) models to decode MEG signals recorded in response to images. The presentation also delved into specific studies of listening and reading using MEG, iEEG, and more, shedding light on how brain responses correlate with model embeddings during language perception.
Presentation by Katie Slocombe
The conference continued with an impressive presentation by Katie Slocombe on her research into animal communication. She highlighted the specificity of chimpanzee vocalizations and calls, showing that calls are produced intentionally and are adapted to the social context and to the relationship between the chimpanzees involved. The talk then turned to a cross-species study of joint attention: the ability, when interacting with one or more peers, to focus attention on a shared target while exchanging eye contact. Like humans, great apes appear capable of joint attention between parent and juvenile.
Presentation by Asli Ozyurek
The conference concluded with an inspiring presentation by Asli Ozyurek, who detailed her remarkable studies of multimodal language. Understanding how the brain processes language requires reconsidering traditional assumptions: language science has typically focused on spoken languages, particularly their orthography and phonology, yet incorporating sign languages, body language, and gesture is crucial to a comprehensive understanding of the language faculty. Embracing this multimodal perspective challenges our assumptions about language structure, its use in interaction, its neural processing, its acquisition, and its evolution. Evidence from diverse bodies of research, spanning spoken and signed languages and including individuals with different sensory experiences, suggests that a multimodal view of language is essential for explaining its adaptability, resilience, and learnability across contexts and cultures.
NoCE Lab Representatives
The NoCE lab proudly sent three of its members to represent it at SNL this year, each of whom showcased their research through posters. The team presented both planned and ongoing projects using fMRI and computational approaches to examine the interaction between language and semantic knowledge. Marion showcased an exciting fMRI project design that enables tracking of neural coding from linguistic input to the sensorimotor implementation of instructions. Sarah presented her ongoing work on how technology, especially social media, is influencing the lexicon of Persian speakers. Abigail presented one project comparing embeddings cross-linguistically and a second using parametric modulation to examine how psycholinguistic properties of words influence the fMRI BOLD signal.
If you are already a member of SNL, you can find their contributions in the following sessions:
Abigail:
- E113: Deep Phenotyping of Lexico-Semantic Processing: The cneuromod-triplets Dataset
- A91: Complex Lexico-Semantic Networks: Cross-Linguistic Comparison of Embedding Cosine Similarity and Human Free Word Associations
Marion:
- E102: Tracking the Neural Coding Shift from Low-Level Linguistic Features
Sarah:
- A92: Syntactic-Semantic Analysis of the Emergence of Novel English-Persian Bilingual Complex Predicates