Content analysis of traditional and social media plays a central role in investigating features of media content, measuring media exposure, and calculating media effects. The reliability of content coding is usually evaluated using "Kappa-like" agreement measures, but these measures aggregate individual coder decisions and therefore obscure the performance of individual coders. Using a data set of media content, 105 advertisements for sports and energy drinks coded by five coders, we demonstrate that Item Response Theory (IRT) can track coder performance over time and provide coder-specific information on the consistency of decisions over qualitatively coded objects. We conclude that IRT should be added to content analysts' tool kit of useful methodologies for tracking and measuring content coders' performance.
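To illustrate the contrast the abstract draws between aggregate agreement measures and coder-specific IRT estimates, the sketch below fits a Rasch (1PL) model to a coder-by-item matrix of binary coding decisions, yielding a separate consistency estimate for each coder. This is not the authors' implementation: the data are simulated, and the joint maximum-likelihood fit via `scipy.optimize.minimize` is an illustrative assumption.

```python
# Minimal sketch, assuming simulated data and a Rasch (1PL) model fit by joint ML.
# Each coder receives an "ability" (consistency) estimate and each item a
# "difficulty" estimate, unlike an aggregate kappa, which yields a single value.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_coders, n_items = 5, 105          # mirrors the 5 coders and 105 ads in the study

# Simulate consistent (1) vs inconsistent (0) coding decisions under a Rasch model.
true_ability = rng.normal(0.5, 1.0, n_coders)     # coder-specific consistency
true_difficulty = rng.normal(0.0, 1.0, n_items)   # item-specific coding difficulty
p = 1 / (1 + np.exp(-(true_ability[:, None] - true_difficulty[None, :])))
responses = rng.binomial(1, p)                    # 5 x 105 matrix of decisions

def neg_log_lik(params):
    """Joint negative log-likelihood of the Rasch model."""
    ability = params[:n_coders]
    difficulty = params[n_coders:]
    logits = ability[:, None] - difficulty[None, :]
    return -np.sum(responses * logits - np.log1p(np.exp(logits)))

fit = minimize(neg_log_lik, np.zeros(n_coders + n_items), method="L-BFGS-B")
ability_hat, difficulty_hat = fit.x[:n_coders], fit.x[n_coders:]

# Anchor the scale (the model is identified only up to a location shift).
shift = difficulty_hat.mean()
ability_hat -= shift
difficulty_hat -= shift

for c, a in enumerate(ability_hat, start=1):
    print(f"coder {c}: estimated consistency (ability) = {a:.2f}")
```

In this framing, a coder whose decisions are inconsistent even on "easy" items stands out immediately, information that a single pooled kappa coefficient cannot reveal.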
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10691860 | PMC |
| http://dx.doi.org/10.1007/s11135-022-01397-7 | DOI Listing |