Women perform better than men on tests of verbal memory, but the nature of this advantage has not been precisely established. To examine whether phonemic memory contributes to the female advantage, we administered a recall task containing nonsense words alongside other verbal memory tasks. Overall, there was the expected female advantage. However, an examination of the individual tests showed female superiority in recall of the real words but not the nonsense words.
DOI: http://dx.doi.org/10.2466/pr0.2003.93.1.263
Learn Mem
January 2025
Department of Psychiatry, Yale University, New Haven, Connecticut 06511, USA.
Emotional events hold a privileged place in our memories, differing in accuracy and structure from memories for neutral experiences. Although much work has focused on the pronounced differences in memory for negative experiences, there is growing evidence that positive events may lead to more holistic, or integrated, memories. However, it is unclear whether these affect-driven changes in memory structure, which have been found in highly controlled laboratory environments, extend to real-world episodic memories.
PLoS One
January 2025
Department of Electrical Engineering, College of Engineering, Taif University, Taif, Saudi Arabia.
Modernizing power systems into smart grids has introduced numerous benefits, including enhanced efficiency, reliability, and integration of renewable energy sources. However, this advancement has also increased vulnerability to cyber threats, particularly False Data Injection Attacks (FDIAs). Traditional Intrusion Detection Systems (IDS) often fall short in identifying sophisticated FDIAs due to their reliance on predefined rules and signatures.
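As a rough illustration of why rule-based checks miss well-crafted injections, the sketch below runs the classical residual test from DC state estimation (a minimal sketch; the measurement matrix H, threshold tau, and toy data are assumptions, not this study's IDS). An attack of the form a = Hc leaves the residual unchanged and slips past the check, which is the gap learning-based detectors aim to close.

```python
# Illustrative only: classical residual-based bad-data check for a DC
# state-estimation model z = H x + e. Stealthy FDIAs crafted as a = H c pass
# this test, which is why rule/threshold-based detection falls short.
import numpy as np

def residual_check(H: np.ndarray, z: np.ndarray, tau: float) -> bool:
    """Return True if the measurements look consistent (residual below tau)."""
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)  # least-squares state estimate
    return float(np.linalg.norm(z - H @ x_hat)) <= tau

rng = np.random.default_rng(0)
H = rng.normal(size=(8, 3))                      # assumed measurement matrix
x_true = rng.normal(size=3)
z = H @ x_true + 0.01 * rng.normal(size=8)       # clean, lightly noisy readings

print(residual_check(H, z, tau=0.1))             # True: clean data passes
z_naive = z.copy(); z_naive[0] += 5.0
print(residual_check(H, z_naive, tau=0.1))       # False: crude injection is caught
z_stealth = z + H @ np.array([1.0, 0.0, 0.0])    # stealthy FDIA with a = H c
print(residual_check(H, z_stealth, tau=0.1))     # True: the attack evades the rule
```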
Sci Rep
January 2025
Department of Computer Science and Engineering, E.G.S. Pillay Engineering College, Nagapattinam, 611002, Tamil Nadu, India.
In response to the pressing need for the detection of Monkeypox caused by the Monkeypox virus (MPXV), this study introduces the Enhanced Spatial-Awareness Capsule Network (ESACN), a Capsule Network architecture designed for the precise multi-class classification of dermatological images. Addressing the shortcomings of traditional Machine Learning and Deep Learning models, our ESACN model utilizes the dynamic routing and spatial hierarchy capabilities of CapsNets to differentiate complex patterns such as those seen in monkeypox, chickenpox, measles, and normal skin presentations. CapsNets' inherent ability to recognize and preserve crucial spatial relationships within images allows them to outperform conventional CNNs, particularly in tasks that require distinguishing visually similar classes.
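For readers unfamiliar with capsule networks, the sketch below shows the squash nonlinearity and the routing-by-agreement loop that CapsNets rely on (following Sabour et al., 2017). The tensor shapes and iteration count are illustrative assumptions; this is not the ESACN architecture itself.

```python
# Minimal sketch of capsule routing-by-agreement; shapes are illustrative.
import numpy as np

def squash(s: np.ndarray, axis: int = -1, eps: float = 1e-8) -> np.ndarray:
    """Scale vector length into [0, 1) while preserving direction."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat: np.ndarray, iterations: int = 3) -> np.ndarray:
    """u_hat: prediction vectors with shape (num_in, num_out, dim_out)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                          # routing logits
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True) # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)               # weighted sum of votes
        v = squash(s)                                        # output capsule vectors
        b = b + np.einsum('ijk,jk->ij', u_hat, v)            # reward agreement
    return v

u_hat = np.random.default_rng(1).normal(size=(32, 4, 8))     # 32 capsules -> 4 classes
print(dynamic_routing(u_hat).shape)                          # (4, 8)
```

The length of each output capsule vector (kept below 1 by the squash) can then be read as the probability that the corresponding class is present.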
Sci Rep
January 2025
Ministry of Higher Education, Mataria Technical College, Cairo, 11718, Egypt.
The current work introduces a hybrid ensemble framework for the detection and segmentation of colorectal cancer. The framework incorporates both supervised classification and unsupervised clustering to produce more understandable and accurate diagnostic results. The method combines several components: the CNN models ADa-22 and AD-22, transformer networks, and an SVM classifier.
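Since ADa-22 and AD-22 are not reproduced here, the sketch below only illustrates the generic deep-features-into-SVM pattern the abstract alludes to; feature_fn is an assumed placeholder for a CNN or transformer backbone, and the data are synthetic.

```python
# Hedged sketch of a "deep features -> SVM" hybrid classifier; feature_fn and the
# toy data are stand-ins, not the paper's ADa-22/AD-22 models or colorectal images.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def feature_fn(images: np.ndarray) -> np.ndarray:
    """Placeholder embedding: flatten patches (a real backbone would go here)."""
    return images.reshape(len(images), -1)

rng = np.random.default_rng(2)
X_img = rng.normal(size=(200, 16, 16))     # toy image patches
y = rng.integers(0, 2, size=200)           # toy labels (0 = benign, 1 = malignant)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(feature_fn(X_img[:150]), y[:150])
print(clf.score(feature_fn(X_img[150:]), y[150:]))   # near chance on random toy data
```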
Neural Netw
January 2025
School of Information Management and Engineering, Shanghai University of Finance and Economics, 200433 Shanghai, PR China.
Users may click on a news item because they are interested in its content or because the item carries important information and is highly popular. Modeling both aspects is crucial for accurate news recommendation. Most existing studies have focused on capturing users' preferences for news content; they are therefore limited in modeling users' preferences for news popularity and in capturing content and popularity preferences independently.
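As a purely illustrative sketch of treating these two aspects separately, the function below scores a candidate item by combining a content-relevance term with a popularity term through a gate; the names, shapes, and gating form are assumptions, not the model proposed in the paper.

```python
# Illustrative scoring of one (user, news) pair with separate content and
# popularity preference terms; all names and shapes are assumptions.
import numpy as np

def click_score(user_content: np.ndarray, news_content: np.ndarray,
                news_popularity: float, user_popularity_bias: float,
                gate: float) -> float:
    """gate in [0, 1] trades off content match against popularity appeal."""
    content_term = float(user_content @ news_content)         # content relevance
    popularity_term = user_popularity_bias * news_popularity  # popularity appeal
    return gate * content_term + (1.0 - gate) * popularity_term

rng = np.random.default_rng(3)
u = rng.normal(size=16)   # user content embedding
n = rng.normal(size=16)   # candidate news embedding
print(click_score(u, n, news_popularity=0.9, user_popularity_bias=0.4, gate=0.7))
```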