When processing language, the brain is thought to deploy specialized computations to construct meaning from complex linguistic structures. Recently, artificial neural networks based on the Transformer architecture have revolutionized the field of natural language processing. Transformers integrate contextual information across words via structured circuit computations.
People use language to influence others' beliefs and actions. Yet models of communication have diverged along these lines, formalizing the speaker's objective in terms of either the listener's beliefs or the listener's actions. We argue that this divergence lies at the root of a longstanding controversy over the Gricean maxims of truthfulness and relevance.
People use a wide range of communicative acts across different modalities, from concrete demonstrations to abstract language. While these modalities are typically studied independently, we take a comparative approach and ask when and why one modality might outperform another. We present a series of real-time, multi-player experiments asking participants to teach concepts using either demonstrations or language.