We introduce Hyper-YOLO, a new object detection method that integrates hypergraph computation to capture the complex high-order correlations among visual features. Traditional YOLO models, while powerful, have limitations in their neck designs that restrict the integration of cross-level features and the exploitation of high-order feature interrelationships. To address these challenges, we propose the Hypergraph Computation Empowered Semantic Collecting and Scattering (HGC-SCS) framework, which transposes visual feature maps into a semantic space and constructs a hypergraph for high-order message propagation. This enables the model to acquire both semantic and structural information, advancing beyond conventional feature-focused learning. Hyper-YOLO incorporates the proposed Mixed Aggregation Network (MANet) in its backbone for enhanced feature extraction and introduces the Hypergraph-Based Cross-Level and Cross-Position Representation Network (HyperC2Net) in its neck. HyperC2Net operates across five scales and breaks free from traditional grid structures, allowing for sophisticated high-order interactions across levels and positions. This synergy of components positions Hyper-YOLO as a state-of-the-art architecture across various model scales, as evidenced by its superior performance on the COCO dataset. Specifically, Hyper-YOLO-N significantly outperforms the advanced YOLOv8-N and YOLOv9-T, with improvements of 12% and 9%, respectively.
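The "collecting and scattering" step described above follows the general hypergraph convolution pattern: node features are gathered into hyperedges, then scattered back to nodes, propagating high-order messages in one pass. A minimal NumPy sketch of that gather-scatter step, assuming a simple degree-normalized incidence-matrix formulation; the function name and toy data are illustrative, not taken from the paper:

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One round of high-order message passing on a hypergraph.

    X:     (n_nodes, d_in)   node (semantic feature point) embeddings
    H:     (n_nodes, n_edges) incidence matrix; H[v, e] = 1 if node v
                              belongs to hyperedge e
    Theta: (d_in, d_out)      projection matrix (learnable in practice)

    Returns D_v^{-1} H D_e^{-1} H^T X Theta: each hyperedge averages
    its member nodes, and each node averages its incident hyperedges.
    """
    d_v = H.sum(axis=1)                       # node degrees
    d_e = H.sum(axis=0)                       # hyperedge degrees
    Dv_inv = np.diag(1.0 / np.clip(d_v, 1, None))
    De_inv = np.diag(1.0 / np.clip(d_e, 1, None))
    # node -> hyperedge gathering, then hyperedge -> node scattering
    return Dv_inv @ H @ De_inv @ H.T @ X @ Theta

# Toy example: 4 feature points grouped by 2 hyperedges.
X = np.arange(8, dtype=float).reshape(4, 2)
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 0]], dtype=float)
Theta = np.eye(2)
out = hypergraph_conv(X, H, Theta)
print(out.shape)  # (4, 2)
```

Because hyperedges can group feature points from any pyramid level or spatial position, this operation is not tied to a grid, which is what allows the cross-level, cross-position interactions the neck is built around.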
DOI: http://dx.doi.org/10.1109/TPAMI.2024.3524377
PLoS One
March 2025
Department of Physics, Portland State University, Portland, Oregon, United States of America.
The ability of microbial active motion, morphology, and optical properties to serve as biosignatures was investigated by in situ video microscopy in a wide range of extreme field sites where such imaging had not been performed previously. These sites allowed for sampling seawater, sea ice brines, cryopeg brines, hypersaline pools and seeps, hyperalkaline springs, and glaciovolcanic cave ice. In all samples except the cryopeg brine, active motion was observed without any sample treatment.
IEEE Trans Vis Comput Graph
March 2025
In Augmented Reality (AR), virtual content enhances user experience by providing additional information. However, improperly positioned or designed virtual content can be detrimental to task performance, as it can impair users' ability to accurately interpret real-world information. In this paper, we examine two types of task-detrimental virtual content: obstruction attacks, in which virtual content prevents users from seeing real-world objects, and information manipulation attacks, in which virtual content interferes with users' ability to accurately interpret real-world information.
Rev Sci Instrum
March 2025
School of Electrical and Information Engineering, Anhui University of Science and Technology, Huainan 232001, China.
Rotor attitude detection (RAD) is one of the key technologies for controlling permanent magnet spherical motors (PMSpMs). This paper proposes an improved you only look once v8n (YOLOv8n)-based RAD method for a PMSpM. The collection and annotation of the visual image datasets are described, and three different visual feature objects are defined for RAD.
Objective: To analyze the effects of multiplanar reconstruction (MPR) technology with multi-slice spiral CT (MSCT) in the etiological diagnosis of acute intestinal obstruction (AIO). Clear images greatly aid in determining the type and etiology of AIO, enabling doctors to quickly develop treatment plans that improve prognosis and efficacy.
Methods: The clinical data of patients with suspected AIO admitted to our hospital from May 2020 to May 2022 were retrospectively selected as the observation objects.
Adv Radiat Oncol
March 2025
Department of Radiation Oncology, University of Florida College of Medicine, Jacksonville, Florida.