Artificial intelligence is currently a hot topic in medicine. However, medical data are often sparse and hard to obtain, owing to legal restrictions and the shortage of medical personnel available for the cumbersome and tedious process of manually labeling training data. These constraints make it difficult to develop systems for automatic analysis, such as detecting disease or other lesions. To address this, this article presents HyperKvasir, the largest image and video dataset of the gastrointestinal tract available today. The data were collected during real gastro- and colonoscopy examinations at Bærum Hospital in Norway and were partly labeled by experienced gastrointestinal endoscopists. The dataset contains 110,079 images and 374 videos and represents anatomical landmarks as well as pathological and normal findings; the total number of images and video frames combined is around 1 million. Initial experiments demonstrate the potential benefits of artificial-intelligence-based computer-assisted diagnosis systems. The HyperKvasir dataset can play a valuable role in developing better algorithms and computer-assisted examination systems, not only for gastro- and colonoscopy but also for other fields of medicine.
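A common first step with an image dataset like this is to inventory the labeled portion by class before training a classifier. The sketch below is a minimal, hypothetical example: it assumes a one-subdirectory-per-class-label layout and `.jpg` files, and the class names used in the mock layout ("polyps", "cecum") are illustrative rather than a statement of the dataset's actual structure.

```python
import tempfile
from collections import Counter
from pathlib import Path

def count_labeled_images(root: Path) -> Counter:
    """Count images per class, assuming one subdirectory per class label."""
    counts = Counter()
    for img in root.rglob("*.jpg"):
        counts[img.parent.name] += 1
    return counts

# Demonstrate on a tiny mock layout; real class names and counts differ.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    for label, n in [("polyps", 3), ("cecum", 2)]:
        class_dir = root / label
        class_dir.mkdir()
        for i in range(n):
            (class_dir / f"img_{i}.jpg").touch()  # empty stand-in files
    print(dict(count_labeled_images(root)))  # per-class image counts
```

Such a per-class census is useful here because medical datasets are typically imbalanced, which informs the choice of sampling or loss weighting downstream.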

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7455694
DOI: http://dx.doi.org/10.1038/s41597-020-00622-y


Similar Publications

The Segment Anything model (SAM) is a powerful vision foundation model that is revolutionizing the traditional paradigm of segmentation. Despite this, a reliance on prompting each frame and large computational cost limit its usage in robotically assisted surgery. Applications, such as augmented reality guidance, require little user intervention along with efficient inference to be usable clinically.

Arthroscopy is a minimally invasive surgical procedure used to diagnose and treat joint problems. The clinical workflow of arthroscopy typically involves inserting an arthroscope into the joint through a small incision, during which surgeons navigate and operate largely by relying on their visual assessment through the arthroscope. However, the arthroscope's restricted field of view and lack of depth perception pose challenges in navigating complex articular structures and achieving surgical precision during procedures.

Widespread screening is crucial for the early diagnosis and treatment of glaucoma, the leading cause of visual impairment and blindness. The development of portable technologies, such as smartphone-based ophthalmoscopes, able to image the optical nerve head, represents a resource for large-scale glaucoma screening. Indeed, they consist of an optical device attached to a common smartphone, making the overall device cheap and easy to use.

Mediastinal lymphangiomas are rare benign tumors arising from lymphatic system malformations, most commonly seen in pediatric populations. In adults, they are exceedingly rare and present diagnostic challenges due to nonspecific symptoms and imaging overlap with other mediastinal masses. Diagnosis is typically based on imaging, including CT and MRI, with histopathology confirming the diagnosis.

When observing chip-to-free-space light beams emitted from silicon photonics (SiPh) circuits, manual adjustment of the camera lens is often required to obtain a focused image of the beams. In this Letter, we demonstrate an auto-focusing system based on the you-only-look-once (YOLO) model. The trained YOLO model exhibits high classification accuracy of 99.
