The metaverse enables immersive virtual healthcare environments, presenting opportunities for enhanced care delivery. A key challenge lies in effectively combining multimodal healthcare data and generative artificial intelligence capabilities within metaverse-based healthcare applications. This paper proposes a novel multimodal learning framework for metaverse healthcare, MMLMH, based on collaborative intra- and intersample representation and adaptive fusion. Our framework introduces a collaborative representation learning approach that captures shared and modality-specific features across text, audio, and visual health data. By combining modality-specific and shared encoders with carefully formulated intrasample and intersample collaboration mechanisms, MMLMH achieves superior feature representation for complex health assessments. The framework's adaptive fusion approach, utilizing attention mechanisms and gated neural networks, demonstrates robust performance across varying noise levels and data quality conditions. Experiments on metaverse healthcare datasets demonstrate MMLMH's superior performance over baseline methods across multiple evaluation metrics. Longitudinal studies and visualization further illustrate MMLMH's adaptability to evolving virtual environments and its balanced performance across diagnostic accuracy, patient-system interaction efficacy, and data integration complexity. A distinctive advantage of the proposed framework is that it maintains consistent performance across diverse patient populations and virtual avatars, which could enable more personalized healthcare experiences in the metaverse. MMLMH's robust performance in these complex settings indicates that it can effectively combine and process information streams from multiple sources, making it well suited to next-generation healthcare delivery through virtual reality.
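The abstract does not disclose implementation details, but the adaptive fusion step it describes (attention weighting over modalities combined with element-wise gating) can be illustrated with a minimal sketch. All function names, shapes, and parameters below are hypothetical and not taken from the paper; the weights are random stand-ins for what would be learned parameters:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_attention_fusion(text_emb, audio_emb, visual_emb, rng):
    """Fuse three modality embeddings (each of shape (d,)) into one vector.

    1. Attention: score each modality against a query vector and
       softmax-normalize the scores into per-modality weights.
    2. Gating: a sigmoid gate modulates each embedding element-wise
       before the weighted sum.
    The query and gate parameters are sampled randomly here; in a real
    system they would be trained.
    """
    mods = np.stack([text_emb, audio_emb, visual_emb])   # (3, d)
    d = mods.shape[1]
    query = rng.standard_normal(d)                       # hypothetical attention query
    gate_params = rng.standard_normal((3, d))            # hypothetical gate parameters
    attn = softmax(mods @ query)                         # (3,) modality attention weights
    gates = sigmoid(gate_params)                         # (3, d) element-wise gates
    fused = (attn[:, None] * gates * mods).sum(axis=0)   # (d,) fused representation
    return fused, attn

rng = np.random.default_rng(0)
d = 8
fused, attn = gated_attention_fusion(rng.standard_normal(d),
                                     rng.standard_normal(d),
                                     rng.standard_normal(d),
                                     rng)
```

Because the attention weights sum to one, a degraded modality (e.g., noisy audio) can be down-weighted while the gates suppress unreliable feature dimensions, which is one plausible reading of the robustness to varying noise levels claimed in the abstract.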
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11899152
DOI: http://dx.doi.org/10.34133/research.0616
Research (Wash D C)
March 2025
The First Affiliated Hospital of Jinzhou Medical University, Jinzhou 121012, China.
PLoS One
March 2025
Human Bio Information Group, Ewha Womans University Seoul Hospital, Seoul, Republic of Korea.
The objective of this study is to explore innovative integration within the field of anatomy education by leveraging HoloLens 2 Augmented Reality Head-Mounted Display (AR HMD) technology and real-time cloud rendering. Initial 3D datasets, comprising extensive anatomical information for each bone, were obtained by 3D scanning a full-body cadaver of Korean male origin. These datasets were then refined in Blender to enhance visual fidelity and optimize polygon counts.
BJS Open
March 2025
Department of Colorectal Surgery, Manchester University NHS Foundation Trust, Manchester, UK.
Background: The metaverse is an emerging concept in surgery, attracting considerable interest for its highly immersive and interactive virtual environment. Despite its growing interest and importance in healthcare, the metaverse is still in an early phase of evolution and adoption in surgery, with ongoing debate about its definition and components. This scoping review summarizes the evidence and current understanding of the use of the metaverse in surgery.
BMC Health Serv Res
March 2025
School of Health Sciences, Istanbul Medipol University, Istanbul, Turkey.
Background: Digitalization has changed the environment in which healthcare institutions operate, prompting them to develop business models compatible with technological change and to adapt their organizations accordingly. Achieving this change and adaptation requires understanding healthcare professionals' perceptions of innovative work behavior. This study aims to examine the relationship between healthcare professionals' perceptions of innovative work behavior and their metaverse knowledge and awareness levels, together with demographic characteristics.
The metaverse, which integrates physical and virtual realities through technologies such as high-speed internet, virtual and augmented reality, and artificial intelligence (AI), offers transformative prospects across various fields, particularly healthcare. This integration introduces a new paradigm in AI-driven medical imaging, particularly in assessing brain age, a crucial marker for detecting age-related neuropathologies such as Alzheimer's disease (AD), using magnetic resonance imaging (MRI). Despite advances in deep learning for estimating brain age from structural MRI (sMRI), incorporating functional MRI (fMRI) data presents significant challenges due to its complex data structure and the noisy nature of functional connectivity measurements.