Linear regression of optical coherence tomography (OCT) measurements of peripapillary retinal nerve fiber layer (RNFL) thickness is often used to detect glaucoma progression and forecast future disease course. However, current measurement frequencies suggest that clinicians often apply linear regression to a relatively small number of measurements (e.g., fewer than five). In this study, we estimate the accuracy of linear regression in predicting the next reliable measurement of average RNFL thickness, using Zeiss Cirrus OCT measurements from a sample of 6,471 eyes with glaucoma or glaucoma-suspect status. Linear regression is compared to two null models: no glaucoma worsening, and worsening due to aging. Linear regression on the first M ≥ 2 measurements was significantly worse than both null models at predicting a reliable (M+1)st measurement for 2 ≤ M ≤ 6. This range shrank to 2 ≤ M ≤ 5 when RNFL thickness measurements were first "corrected" for scan quality. Simulations based on measurement frequencies in our sample (on average 393 ± 190 days between consecutive measurements) show that linear regression outperforms both null models when M ≥ 5 if the goal is to forecast moderate (75th percentile) worsening, and when M ≥ 3 for rapid (90th percentile) worsening. If linear regression is used to assess disease trajectory from a small number of measurements over short time periods (e.g., 1-2 years), as is often the case in clinical practice, the number of OCT examinations needs to be increased.
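The evaluation described above can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' code: it fits ordinary least squares to the first M measurements, predicts the (M+1)st, and compares the absolute prediction error with the two null models. The helper name `prediction_errors` is hypothetical, and two details are assumptions rather than values from the paper: the null forecasts are anchored at the mean of the first M values, and the aging decline is set to a placeholder rate of 0.5 µm/year.

```python
import numpy as np

def prediction_errors(days, rnfl, m, aging_rate=-0.5 / 365.25):
    """Compare three forecasts of the (M+1)st RNFL measurement.

    days: measurement times in days from baseline
    rnfl: average RNFL thickness in micrometers
    m: number of initial measurements used for fitting (m >= 2)
    aging_rate: assumed age-related thinning in um/day; -0.5 um/year
        is a placeholder, not the rate used in the paper.
    """
    t_fit = np.asarray(days[:m], dtype=float)
    y_fit = np.asarray(rnfl[:m], dtype=float)
    t_next, y_next = float(days[m]), float(rnfl[m])

    # Ordinary least-squares trend line through the first M points.
    slope, intercept = np.polyfit(t_fit, y_fit, 1)
    pred_ols = slope * t_next + intercept

    # Null model 1: no worsening -- predict the mean of the first M values.
    pred_stable = y_fit.mean()

    # Null model 2: worsening due to aging -- the same anchor plus an
    # assumed age-related decline from the mean measurement time.
    pred_aging = y_fit.mean() + aging_rate * (t_next - t_fit.mean())

    return {name: abs(pred - y_next)
            for name, pred in [("linear regression", pred_ols),
                               ("no worsening", pred_stable),
                               ("aging only", pred_aging)]}

# Example: roughly yearly scans of an eye thinning faster than aging alone.
errors = prediction_errors(
    days=[0, 380, 720, 1110, 1480],
    rnfl=[92.0, 90.5, 89.8, 88.1, 87.0],
    m=4,
)
print(errors)
```

With few, widely spaced measurements, the fitted slope is dominated by measurement noise, which is why the abstract reports that the null models win at small M and regression only pulls ahead as M grows or as the true rate of worsening increases.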
Full text:
- PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10786363
- PLOS: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0296674