Historically, visual acuity has been the benchmark for visual function and is used to measure the therapeutic outcomes of vision-related services, products, and interventions. Suboptimal visual acuity can sometimes be corrected optically with proper refraction, but in many cases reduced vision reflects something more serious that can also affect other aspects of visual function, such as contrast sensitivity, color discrimination, peripheral field of view, and higher-order visual processing. Measuring visual acuity typically requires stimuli subject to some degree of standardization or calibration and has therefore largely been confined to clinical settings. However, we spend increasing amounts of time interacting with devices that present high-resolution, full-color images and video (hereafter, digital media) and can record our responses. Most of these devices can be used to measure visual acuity and other aspects of visual function, not only through targeted testing experiences but also from typical device interactions. There is growing evidence that prolonged exposure to digital media can lead to various vision-related issues (e.g., computer vision syndrome and dry eye). Our regular, daily interactions (digital behavior) can also be used to assess our visual function, passively and continuously. This allows us to expand vision health assessment beyond the clinic, to collect vision-related data across the full range of settings for typical digital behavior from practically any population of interest, and to further explore how our increasingly virtual interactions are affecting our vision. We present a tool that can be easily integrated into digital media to provide insights into our digital behavior.
Full text: PMC (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6292403) | DOI (http://dx.doi.org/10.2147/OPTH.S187131)
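As a rough illustration of the calibration such a tool would need, the sketch below converts the on-screen size of a letter and the viewing distance into the approximate logMAR acuity demand it represents. The function name, inputs, and example values are our own assumptions for illustration, not part of the published tool.

```python
# A minimal sketch, assuming a hypothetical calibration step: convert the
# physical size of an on-screen letter and the viewing distance into the
# logMAR acuity demand it represents. Names and numbers are illustrative only.
import math

def logmar_from_stimulus(letter_height_px: float,
                         pixels_per_cm: float,
                         viewing_distance_cm: float) -> float:
    """Approximate logMAR demand of an on-screen letter.

    A standard optotype at threshold (logMAR 0.0, Snellen 20/20) subtends
    5 arcminutes overall, i.e. 1 arcminute per stroke or gap.
    """
    letter_height_cm = letter_height_px / pixels_per_cm
    # Visual angle subtended by the whole letter, in arcminutes.
    angle_arcmin = math.degrees(
        2 * math.atan(letter_height_cm / (2 * viewing_distance_cm))) * 60
    # Minimum angle of resolution is the letter angle divided by 5.
    return math.log10(angle_arcmin / 5)

# Example: a 12-pixel letter on a ~96 DPI screen (~37.8 px/cm) viewed from
# 50 cm corresponds to roughly logMAR 0.64 (about Snellen 20/87).
print(round(logmar_from_stimulus(12, 37.8, 50), 2))
```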
PLoS One
January 2025
College of Arts, Anhui Xinhua University, Hefei, China.
To improve the expressiveness and realism of illustration images, this study combines an attention mechanism with a cycle-consistent adversarial network and proposes an efficient style transfer method for illustrations. The model draws on the image restoration and style transfer capabilities of both components and introduces an improved attention module that adaptively highlights the key visual elements of an illustration, preserving artistic integrity during style transfer. A series of quantitative and qualitative experiments demonstrates high-quality style transfer, in particular while retaining the original features of the illustration.
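As a hedged sketch of the general idea, not the paper's exact architecture, the PyTorch code below inserts a self-attention module after a CycleGAN-style residual block so the generator can re-weight salient regions of an illustration during style transfer; all module names and sizes are our own illustrative choices.

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """SAGAN-style self-attention over the spatial positions of a feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                    # residual connection

class ResBlockWithAttention(nn.Module):
    """CycleGAN-style residual block followed by self-attention."""
    def __init__(self, channels: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, 3),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, 3),
            nn.InstanceNorm2d(channels),
        )
        self.attn = SelfAttention(channels)

    def forward(self, x):
        return self.attn(x + self.conv(x))

# Example: one block applied to a batch of generator feature maps.
features = torch.randn(2, 256, 32, 32)
print(ResBlockWithAttention(256)(features).shape)  # torch.Size([2, 256, 32, 32])
```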
PLoS One
January 2025
Department of Ophthalmology, University of Washington, Seattle, WA, United States of America.
To investigate macula and optic nerve head (ONH) mitochondrial metabolic activity using flavoprotein fluorescence (FPF) in normal, glaucoma suspect (GS), and open-angle glaucoma (OAG) eyes, we performed a cross-sectional, observational study. The macula and ONH of each eye were scanned and analyzed with a commercially available FPF measuring device (OcuMet Beacon, OcuSciences Inc., Ann Arbor, MI).
PLoS One
January 2025
Orthopedics Department, First Teaching Hospital of Tianjin University of Traditional Chinese Medicine, Tianjin, China.
Objective: The objective of this systematic review and meta-analysis is to clarify the rehabilitation efficacy of virtual reality (VR) balance training after anterior cruciate ligament reconstruction (ACLR).
Methods: This meta-analysis was registered in PROSPERO with the registration number CRD42024520383. The electronic databases PubMed, Web of Science, Cochrane Library, MEDLINE, Embase, China National Knowledge Infrastructure, Chinese Biomedical Literature, China Science and Technology Journal Database, and Wanfang Digital Periodical database were systematically searched to identify eligible studies from their inception up to January 2024.
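For readers unfamiliar with the pooling step, the sketch below shows a DerSimonian-Laird random-effects model of the kind commonly used in such meta-analyses; the effect sizes and standard errors are placeholders, not data from the included trials.

```python
import math

def dersimonian_laird(effects, ses):
    """Pool study effect sizes with a DerSimonian-Laird random-effects model."""
    w = [1 / se**2 for se in ses]                                 # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))   # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]                     # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci, tau2

# Placeholder standardized mean differences (VR vs. control) and their SEs.
smd = [0.45, 0.30, 0.62, 0.20]
se = [0.21, 0.18, 0.25, 0.15]
print(dersimonian_laird(smd, se))
```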
J Neuroophthalmol
December 2024
Division of Ophthalmology (EB-S, AS, AA-A, AS-B, DW, SS, FC), Department of Surgery, University of Calgary, Calgary, Canada; Department of Biomedical Engineering (CN), University of Calgary, Calgary, Canada; Departments of Neurology (LBDL) and Ophthalmology (LBDL), University of Michigan, Ann Arbor, Michigan; and Department of Clinical Neurosciences (SS, FC), University of Calgary, Calgary, Canada.
Background: Optic neuritis (ON) is a complex clinical syndrome that has diverse etiologies and treatments based on its subtypes. Notably, ON associated with multiple sclerosis (MS ON) has a good prognosis for recovery irrespective of treatment, whereas ON associated with other conditions including neuromyelitis optica spectrum disorders or myelin oligodendrocyte glycoprotein antibody-associated disease is often associated with less favorable outcomes. Delay in treatment of these non-MS ON subtypes can lead to irreversible vision loss.
Cornea
January 2025
Department of Ophthalmology and Visual Sciences, Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, SP, Brazil.
Purpose: To evaluate the efficacy and safety of intense pulsed light (IPL) combined with meibomian gland expression (MGX) for the treatment of dry eye disease and meibomian gland dysfunction associated with chronic Stevens-Johnson syndrome and toxic epidermal necrolysis.
Methods: This prospective noncomparative interventional study included 29 patients (58 eyes) who underwent 3 sessions of IPL and MGX at 2-week intervals. Subjective symptoms (Ocular Surface Disease Index score) and objective dry eye tests (matrix metalloproteinase 9, tear meniscus height, bulbar redness score, tear film lipid layer thickness (LLT), Schirmer I test, conjunctival and corneal staining, meibomian gland loss, MGX score [meibomian gland score (MGS)], and tear break-up time) were assessed at baseline and after 4, 8, and 12 weeks.
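For context, the Ocular Surface Disease Index mentioned above is conventionally scored as shown in the brief sketch below; the sample responses are hypothetical, not patient data from this study.

```python
def osdi_score(responses):
    """OSDI = (sum of answered item scores x 25) / number of items answered.

    Each of the 12 items is scored 0-4; unanswered items are passed as None.
    The result ranges from 0 (no disability) to 100 (complete disability).
    """
    answered = [r for r in responses if r is not None]
    return sum(answered) * 25 / len(answered)

# Example: 10 of 12 items answered (hypothetical responses).
sample = [2, 3, 1, 4, 2, None, 0, 1, 3, 2, None, 2]
print(osdi_score(sample))  # 50.0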