Purpose: To develop an automated method for the detection of retinal hemorrhages on color fundus images to characterize malarial retinopathy, which may help in the assessment of patients with cerebral malaria.
Methods: We examined a fundus image dataset from 14 patients previously diagnosed with malarial retinopathy (200 fundus images, an average of 14 images per patient). We developed a pattern recognition-based algorithm that extracted features from image watershed regions called splats (tobogganing). A reference standard was obtained by manual segmentation of the hemorrhages, which assigned a label to each splat. The splat features, together with the associated splat labels, were used to train a k-nearest neighbor classifier that learned the color properties of hemorrhages and identified the splats belonging to hemorrhages in a test dataset. In a crossover design experiment, data from 12 patients were used for training and data from the remaining two patients for testing, repeated over 14 different permutations; the resulting sensitivity and specificity values were averaged.
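The classification step described above can be sketched as follows: each splat is represented by a color feature vector, and a k-nearest neighbor vote over labeled training splats assigns it to the hemorrhage or background class. Everything here (the synthetic mean-RGB features, the class means, the choice of k) is illustrative, not the authors' implementation:

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Label each test splat by majority vote among its k nearest training splats."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)      # Euclidean distance to every training splat
        nearest = train_y[np.argsort(d)[:k]]         # labels of the k closest training splats
        preds.append(int(nearest.sum() * 2 > k))     # majority vote (1 = hemorrhage)
    return np.array(preds)

rng = np.random.default_rng(0)
# Synthetic mean-RGB splat features: hemorrhage splats drawn darker/redder (illustrative only).
bg = rng.normal([0.80, 0.50, 0.40], 0.05, size=(40, 3))   # background splats, label 0
hem = rng.normal([0.45, 0.15, 0.15], 0.05, size=(40, 3))  # hemorrhage splats, label 1
X = np.vstack([bg, hem])
y = np.array([0] * 40 + [1] * 40)

# Two unseen splats: one hemorrhage-like, one background-like.
test = np.array([[0.46, 0.16, 0.14], [0.79, 0.52, 0.41]])
print(knn_predict(X, y, test))  # → [1 0]
```

In the actual study the features came from watershed (tobogganing) splats on fundus images; here the feature vectors are stand-ins so the voting logic can run on its own.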
Results: The experiment yielded a splat-based hemorrhage detection sensitivity of 80.83% and a lesion-based sensitivity of 84.84%. The splat-based specificity was 96.67%, whereas the lesion-based analysis produced an average of three false positives per image. The area under the receiver operating characteristic curve was 0.9148 for the splat-based analysis and 0.9030 for the lesion-based analysis.
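For reference, the splat-level sensitivity and specificity reported above follow from confusion counts in the usual way. The counts below are invented solely to reproduce the reported rates; they are not the study's data:

```python
# Illustrative splat-level confusion counts (made up; chosen only to match the reported rates).
tp, fn = 970, 230       # hemorrhage splats correctly / incorrectly labeled
tn, fp = 29000, 1000    # background splats correctly rejected / falsely flagged

sensitivity = tp / (tp + fn)   # fraction of hemorrhage splats detected
specificity = tn / (tn + fp)   # fraction of background splats correctly rejected
print(round(sensitivity, 4), round(specificity, 4))  # → 0.8083 0.9667
```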
Conclusions: The method provides an automated means of detecting retinal hemorrhages associated with malarial retinopathy. The results agreed closely with the reference standard. With further development, this technique may provide automated assistance for screening and quantification of malarial retinopathy.
Full-text links:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3460387
DOI: http://dx.doi.org/10.1167/iovs.12-10191