Research Letter

A Comparative Analysis of the Quality of Patient Education Materials From Medical Specialties

Nitin Agarwal, BS1; David R. Hansberry, PhD1; Victor Sabourin, BA1; Krystal L. Tomei, MD, MPH1; Charles J. Prestigiacomo, MD1,2,3
Author Affiliations
1Department of Neurological Surgery, New Jersey Medical School, University of Medicine and Dentistry of New Jersey, Newark
2Department of Radiology, New Jersey Medical School, University of Medicine and Dentistry of New Jersey, Newark
3Department of Neurology and Neuroscience, New Jersey Medical School, University of Medicine and Dentistry of New Jersey, Newark
JAMA Intern Med. 2013;173(13):1257-1259. doi:10.1001/jamainternmed.2013.6060.

Given the seemingly boundless amount of information available online, it is understandable that the Internet has become one of the most commonly used sources of information, including health care–oriented resources. According to a 2011 study by the Pew Internet and American Life Project, 59% of Americans use the Internet to find and understand health care–oriented information.1 A potential problem, however, is the difficult reading level of patient-specific education materials. The average American adult reads at approximately a seventh to eighth grade level.2 The American Medical Association, the National Institutes of Health, and the US Department of Health and Human Services therefore advocate that patient education materials be written at a fourth to sixth grade reading level.2-4 In this Research Letter, we assess the readability of patient education resources using various readability parameters. To our knowledge, this is the first study to comprehensively compare the readability of patient education materials provided by various medical professional organizations.

Online patient education materials from 16 medical specialties were downloaded in 2012. For each specialty's website, material written specifically for patients was downloaded into Microsoft Office Word (Microsoft). Tables, figures, hyperlinks, and text unrelated to the patient education material, including copyright notices, disclaimers, and author information, were deleted.
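The authors performed this cleanup by hand in Word; purely as an illustration of the same preprocessing, the Python sketch below fetches a page and strips the excluded elements. The URL, the requests/BeautifulSoup tooling, and the keyword filter for copyright and disclaimer text are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: mirrors the manual cleanup described in the
# methods (removing tables, figures, hyperlinks, copyright notices,
# disclaimers, and author information). The URL is hypothetical.
import requests
from bs4 import BeautifulSoup

def extract_patient_text(url: str) -> str:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop tables, figures, and hyperlink markup, plus scripts and styles.
    for tag in soup.find_all(["table", "figure", "a", "script", "style"]):
        tag.decompose()
    # Approximate the removal of copyright notices and disclaimers with a
    # keyword filter (an assumption, not the authors' stated rule).
    lines = (line.strip() for line in soup.get_text("\n").splitlines())
    keep = [l for l in lines
            if l and not any(k in l.lower() for k in ("copyright", "disclaimer"))]
    return "\n".join(keep)

# Example (hypothetical URL):
# text = extract_patient_text("https://example.org/patient-education/topic")
```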

Readability assessment of each article was performed using Readability Studio Professional Edition, Version 2012.1 (Oleander Software). The analysis included the Coleman-Liau index, FORCAST formula, simple measure of gobbledygook (SMOG) grading, New Dale-Chall readability formula, Flesch Reading Ease, Flesch-Kincaid grade level, Fry graphical analysis, Gunning fog index, New Fog Count, and the Raygor readability estimate. A separate analysis was performed to identify grammatical errors, clichés, and use of the passive voice.
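Readability Studio computes these indices internally; for readers unfamiliar with them, the sketch below implements four of the named formulas (Flesch Reading Ease, Flesch-Kincaid grade level, SMOG, and Gunning fog) using their standard published coefficients. The vowel-group syllable counter is a crude stand-in assumption, so its scores will only approximate those of the commercial tool.

```python
# Illustrative sketch only: standard published formulas for four of the
# indices named in the text. Tokenization and syllable counting here are
# naive heuristics, not Readability Studio's internal method.
import math
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    # Gunning fog's "complex words" are approximated as 3+ syllables.
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    wps = n_words / sentences   # mean words per sentence
    spw = syllables / n_words   # mean syllables per word
    return {
        "flesch_reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,
        "flesch_kincaid_grade": 0.39 * wps + 11.8 * spw - 15.59,
        "smog_grade": 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291,
        "gunning_fog": 0.4 * (wps + 100 * polysyllables / n_words),
    }

# Example:
# print(readability(open("specialty_material.txt").read()))
```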

All readability assessments except the New Fog Count showed that patient education materials from all 16 medical specialties were written above the recommended sixth grade reading level (Table). The New Fog Count yielded the following scores near the recommended guidelines: dermatology, 4.3; obstetrics and gynecology, 6.0; plastic surgery, 6.1; and family medicine, 6.6. The New Dale-Chall readability formula showed that only dermatology, family medicine, and obstetrics and gynecology were within the boundaries of the average American adult reading level. Flesch Reading Ease analysis showed that most patient resources were “difficult.” On the Flesch-Kincaid grade level test, family medicine was the only specialty within the parameters of the average adult reading ability. Readability scores using the Fry graphical analysis ranged from the eighth grade level in family medicine to unclassifiable in dermatology, where the complexity of the patient education materials exceeded the 17th grade level.

Table. Individual Medical Specialty Readability Scores and Grammatical Errors

Overall, across all readability analyses of the 16 websites, the New Fog Count yielded the lowest mean grade level score (9.3), whereas SMOG grading yielded the highest (14.1). The proportion of sentences written in the passive voice ranged from 4% in family medicine to 27% in neurological surgery. Obstetrics and gynecology materials contained the most clichés, 40 in total, corresponding to 5.8 clichés per 50 pages. Obstetrics and gynecology materials also contained the highest total number of indefinite-article mismatches (the improper use of “a” or “an”), with 14 errors, corresponding to 1.8 errors per 50 pages.

Research conducted by the US Department of Education found that 12% of adults had proficient health literacy, 53% had intermediate health literacy, 22% had basic health literacy, and 14% had below basic health literacy.5 At the individual level, inadequate health literacy manifests as preventable problems such as recurrent hospitalizations and visits. At the national level, the economic consequences are substantial: inadequate health literacy has been estimated to cost the US economy between $106 billion and $236 billion annually.6

Our analysis across all 10 readability scales showed that none of the patient education resources provided by the 16 professional organizations met the recommended sixth grade maximum readability level, or even the seventh to eighth grade reading ability of the typical American adult. Website revisions may therefore be warranted to improve the readability and quality of these patient resources and to reach a broader patient population effectively. One simple adjustment is to write more plainly, which may increase comprehension regardless of the reader’s health literacy.7 Pictures and videos may also be effective at increasing a patient’s comprehension of health information that is too complex to convey fully through text alone.8 Future studies will seek to better characterize the relationship between readability and the effectiveness of multimedia in communicating health information, with the ultimate goal of improving patient comprehension and outcomes.

Corresponding Author: Dr Prestigiacomo, Department of Neurological Surgery, New Jersey Medical School, University of Medicine and Dentistry of New Jersey, 90 Bergen St, Ste 8100, PO Box 1709, Newark, NJ 07101 (c.prestigiacomo@umdnj.edu).

Published Online: May 20, 2013. doi:10.1001/jamainternmed.2013.6060

Author Contributions: Mr Agarwal and Dr Hansberry served as co–first authors, each with equal contribution to the manuscript.

Study concept and design: Agarwal, Hansberry, and Prestigiacomo.

Acquisition of data: Agarwal, Hansberry, and Sabourin.

Analysis and interpretation of data: Agarwal, Hansberry, Tomei, and Prestigiacomo.

Drafting of the manuscript: Agarwal, Hansberry, Sabourin, and Tomei.

Critical revision of the manuscript for important intellectual content: Agarwal, Hansberry, Tomei, and Prestigiacomo.

Statistical analysis: Agarwal and Hansberry.

Administrative, technical, and material support: Agarwal, Hansberry, Tomei, and Prestigiacomo.

Study supervision: Agarwal, Hansberry, Tomei, and Prestigiacomo.

Literature review: Sabourin.

Conflict of Interest Disclosures: None reported.

Additional Contributions: Chirag D. Gandhi, MD, provided guidance throughout the duration of this study.

References

1. Pew Internet & American Life Project. The Social Life of Health Information. 2011. http://pewinternet.org/~/media//Files/Reports/2011/PIP_Social_Life_of_Health_Info.pdf. Accessed February 28, 2012.
2. Walsh TM, Volsko TA. Readability assessment of internet-based consumer health information. Respir Care. 2008;53(10):1310-1315.
3. National Institutes of Health. How to Write Easy-to-Read Health Materials. http://www.nlm.nih.gov/medlineplus/etr.html. Accessed January 4, 2012.
4. Paasche-Orlow MK, Taylor HA, Brancati FL. Readability standards for informed-consent forms as compared with actual readability. N Engl J Med. 2003;348(8):721-726.
5. Kutner M, Greenberg E, Jin Y, Paulsen C. The Health Literacy of America’s Adults: Results From the 2003 National Assessment of Adult Literacy (NCES 2006–483). Washington, DC: National Center for Education Statistics; 2006.
6. US Department of Health and Human Services, Office of Disease Prevention and Health Promotion. National Action Plan to Improve Health Literacy. Washington, DC: US Department of Health and Human Services; 2010:3.
7. Parker R, Kreps GL. Library outreach: overcoming health literacy challenges. J Med Libr Assoc. 2005;93(4)(suppl):S81-S85.
8. Murphy PW, Chesson AL, Walker L, Arnold CL, Chesson LM. Comparing the effectiveness of video and written material for improving knowledge among sleep disorders clinic patients with limited literacy skills. South Med J. 2000;93(3):297-304.
