We introduce an incoherent adaptive imaging system based on optimization of an image quality metric measured using a coherent optical system. Experimental results and numerical simulations are presented that demonstrate adaptive correction of phase-distorted extended source images containing objects located at multiple distances.
DOI: 10.1364/ao.36.003319
Sci Rep
January 2025
Department of Biomedical Engineering, School of Life Science and Technology, Changchun University of Science and Technology, Changchun, 130022, China.
Cervical cell classification can determine the degree of cellular abnormality and pathological condition, helping doctors detect the risk of cervical cancer at an early stage and improving cure and survival rates for cervical cancer patients. To address the low accuracy of existing cervical cell classification methods, a deep convolutional neural network, A2SDNet121, is proposed. A2SDNet121 takes DenseNet121 as its backbone network.
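The abstract names DenseNet121 as the backbone but does not describe A2SDNet121's modifications, so the following is only a minimal numpy sketch of the dense-connectivity idea that defines DenseNet-style backbones: each layer receives the concatenation of all earlier feature maps and contributes a fixed number (`growth_rate`) of new channels. The toy linear+ReLU "layers" and all shapes here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_block(x, num_layers=4, growth_rate=32):
    """Toy dense block: each layer sees the concatenation of all
    previous feature maps (dense connectivity) and appends
    `growth_rate` new channels."""
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)        # dense connectivity
        w = rng.standard_normal((inp.shape[-1], growth_rate)) * 0.01
        features.append(np.maximum(inp @ w, 0.0))      # linear + ReLU stand-in
    return np.concatenate(features, axis=-1)

x = rng.standard_normal((1, 64))   # one sample with 64 input channels
out = dense_block(x)
print(out.shape)                   # (1, 64 + 4*32) = (1, 192)
```

The characteristic property, visible in the output shape, is that channels grow additively (input channels plus `num_layers * growth_rate`) rather than being replaced layer by layer.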
Sci Rep
January 2025
School of Electronic and Information Engineering, Changsha Institute of Technology, Changsha, 410200, China.
To address the limitations of the flipped classroom in personalized teaching and in improving interaction, this paper designs a new Virtual Reality (VR)-based flipped-classroom model for colleges and universities that incorporates the Contrastive Language-Image Pre-Training (CLIP) algorithm. Through cross-modal data fusion, the model closely couples students' operational behaviour with the teaching content, and improves teaching effectiveness through an intelligent feedback mechanism. The test data show that the similarity between the video and image modalities reaches 0.
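The cross-modal similarity that CLIP-style models report is the cosine similarity between L2-normalized embeddings from the two encoders. The paper's encoders and data are not available here, so this is a minimal sketch with made-up vectors standing in for real CLIP encoder outputs:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors,
    as used to compare CLIP image/text (or video/image) embeddings."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(a @ b)

# Hypothetical embeddings; real CLIP embeddings are e.g. 512-dimensional.
video_emb = np.array([0.20, 0.90, 0.10, 0.40])
image_emb = np.array([0.25, 0.85, 0.05, 0.50])

sim = cosine_similarity(video_emb, image_emb)
```

A similarity near 1 indicates closely aligned modalities; identical embeddings give exactly 1.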
Sci Rep
January 2025
College of Information Science and Technology, Hainan Normal University, Haikou, 571158, China.
Breast cancer is one of the most aggressive types of cancer, and its early diagnosis is crucial for reducing mortality rates and ensuring timely treatment. Computer-aided diagnosis systems provide automated mammography image processing, interpretation, and grading. However, since existing methods suffer from issues such as overfitting, lack of adaptability, and dependence on massive annotated datasets, the present work introduces a hybrid approach to enhance breast cancer classification accuracy.
Radiography (Lond)
January 2025
Department of Radiography, School of Allied Health Sciences, Faculty of Health Sciences and Veterinary Medicine, University of Namibia, P.O Box 13301, Windhoek, Namibia. Electronic address:
Introduction: Patient-centred care (PCC) is essential in radiography for polytrauma patients, emphasising empathy, clear communication, and patient well-being. Polytrauma patients require tailored imaging approaches, often involving multiple modalities. Managing and handling these patients during imaging are key components of radiography training for developing the necessary competencies.
Lancet Neurol
February 2025
Department of Clinical Neurological Sciences, University of Western Ontario, London, ON, Canada; Department of Cognitive Neurology, St Joseph's Health Care London, London, ON, Canada. Electronic address:
Background: No treatments exist for apathy in people with frontotemporal dementia. Previously, in a randomised double-blind, placebo-controlled, dose-finding study, intranasal oxytocin administration in people with frontotemporal dementia improved apathy ratings on the Neuropsychiatric Inventory over 1 week and, in a randomised, double-blind, placebo-controlled, crossover study, a single dose of 72 IU oxytocin increased blood-oxygen-level-dependent signal in limbic brain regions. We aimed to determine whether longer treatment with oxytocin improves apathy in people with frontotemporal dementia.