Objectives: We compared two complementary (rather than alternative) methods for assessing the readability of, and learning from, easy-to-read educational health materials co-written by physicians, educators and citizens.
Methods: Data from seven easy-to-read materials were analyzed. Readability formulae were computed, and ad hoc data on readability and learning were also collected.
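The abstract does not state which readability formula was used; as an illustration only, the sketch below computes the classic Flesch Reading Ease score, one widely used option, with a rough syllable-counting heuristic introduced here for demonstration.

```python
import re

def count_syllables(word: str) -> int:
    # Rough vowel-group heuristic; real syllable counting is language-specific.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

if __name__ == "__main__":
    sample = "Take one tablet with food. Call your doctor if the pain does not stop."
    print(f"Flesch Reading Ease: {flesch_reading_ease(sample):.1f}")
```

On this scale, higher values indicate easier text; whether the study used this formula, or another scale on which 54 corresponds to 'easy' for readers with a secondary education, is not specified in the abstract.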
Results: The respondents had a mean age of 48.5 ± 8.3 (SD) years (range 31-57 years). More than two thirds were female, and about half had a secondary education or higher. According to the readability scores (54 on average), the booklets were 'easy' for a reader with a secondary education or higher. Of the 747 participants, 70% found the booklets' language 'easy' or 'very easy' and 28% 'sufficiently easy' for laypersons to understand. About 98% of the readers found the booklets useful. After reading the booklet, 92% of the readers answered the cognitive items correctly (simple knowledge rate). The after-minus-before net increase in knowledge (cognitive or knowledge delta) was 24 ± 16%, ranging from 8 to 40%.
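As a minimal sketch of the two knowledge measures reported above (the item counts and answer figures below are illustrative assumptions, not data from the study), the simple knowledge rate is the share of cognitive items answered correctly after reading, and the knowledge delta is the after-minus-before difference:

```python
def simple_knowledge_rate(post_correct: int, n_items: int) -> float:
    # Share of cognitive items answered correctly after reading the booklet (%).
    return 100.0 * post_correct / n_items

def knowledge_delta(pre_correct: int, post_correct: int, n_items: int) -> float:
    # After-minus-before net increase in correct answers, in percentage points.
    return 100.0 * (post_correct - pre_correct) / n_items

# Illustrative figures only: 10 cognitive items, 7 correct before reading, 9 after.
print(simple_knowledge_rate(9, 10))   # 90.0
print(knowledge_delta(7, 9, 10))      # 20.0
```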
Conclusions: Readability scores are complementary: their availability does not replace the need to assess readability and learning by means of structured, tailored questionnaires.
DOI: http://dx.doi.org/10.1007/s10459-005-7852-2