The way we talk can influence how we are perceived by others. Whereas previous studies have begun to explore the influence of social goals on syntactic alignment, in the current study we additionally investigated whether syntactic alignment in turn influences conversation partners' perception of the speaker. To this end, we developed a novel paradigm that measures the effect of social goals on the strength of one participant's syntactic alignment (the primed participant) while simultaneously obtaining their conversation partner's (the evaluator's) social evaluation of them. In Study 1, primed participants' desire to be rated favorably by their partner was manipulated by assigning pairs to a Control context (i.e., primed participants did not know they were being evaluated) or an Evaluation context (i.e., primed participants knew they were being evaluated). Surprisingly, the two contexts showed no significant difference in the strength with which primed participants aligned their syntactic choices with their partners' choices. In a follow-up study, we used a Directed Evaluation context (i.e., primed participants knew they were being evaluated and were explicitly instructed to make a positive impression). Again, however, there was no evidence supporting the hypothesis that participants' desire to impress their partner influences syntactic alignment. With respect to the influence of syntactic alignment on perceived likeability, Study 1 showed a negative relationship: the more primed participants aligned their syntactic choices with their partner's, the more that partner lowered their likeability rating after the experiment. However, this effect was not replicated in the Directed Evaluation context of Study 2. In other words, our results do not support the conclusion that speakers' desire to be liked affects how much they align their syntactic choices with their partner's, nor do they provide convincing evidence of a reliable relationship between syntactic alignment and perceived likeability.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4833301 | PMC |
| http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0153521 | PLOS |
Neuroimage
December 2024
Center for Brain Disorders and Cognitive Sciences, School of Psychology, Shenzhen University, Shenzhen 518060, China.
Bilingual individuals manage multiple languages that align in conceptual meaning but differ in form and structure. While prior research has established foundational insights into the neural mechanisms of bilingual processing, the extent to which the first (L1) and second (L2) language systems overlap or diverge across different linguistic components remains unclear. This study probed the neural underpinnings of syntactic and semantic processing in L1 and L2 in Chinese-English bilinguals (N = 44) who performed sentence comprehension tasks and an N-back working memory task during functional MRI scanning.
Top Cogn Sci
October 2024
Department of Language and Linguistics, University of Essex.
Past research suggests that working memory plays a role in determining relative clause attachment bias. Disambiguation preferences may further depend on processing speed and on the explicit memory demands of linguistic tasks. Given that working memory and processing speed decline with age, older adults offer a way of investigating the factors underlying disambiguation preferences.
Nat Commun
October 2024
Donders Centre for Cognitive Neuroimaging, Radboud University, Nijmegen, The Netherlands.
Humans excel at extracting structurally determined meaning from speech despite inherent physical variability. This study explores how the brain predicts and robustly understands spoken language, investigating the relationship between structural and statistical language knowledge in brain dynamics, with a focus on phase and amplitude modulation.
Neuron
September 2024
Princeton Neuroscience Institute and Department of Psychology, Princeton University, Princeton, NJ 08544, USA.
Effective communication hinges on a mutual understanding of word meaning in different contexts. We recorded brain activity using electrocorticography during spontaneous, face-to-face conversations in five pairs of epilepsy patients. We developed a model-based coupling framework that aligns brain activity in both speaker and listener to a shared embedding space from a large language model (LLM).
Curr Biol
August 2024
Institute for Biomagnetism and Biosignal Analysis, University of Münster, Münster, Germany; Otto-Creutzfeldt-Center for Cognitive and Behavioral Neuroscience, University of Münster, Münster, Germany.
Decoding human speech requires the brain to segment the incoming acoustic signal into meaningful linguistic units, ranging from syllables and words to phrases. Integrating these linguistic constituents into a coherent percept lays the foundation for compositional meaning and hence understanding. Prosodic cues, such as pauses, are an important aid to segmentation in natural speech, but their interplay with higher-level linguistic processing is still unknown.