Readability standards for informed-consent forms as compared with actual readability

Michael K. Paasche-Orlow, Holly Taylor, Frederick L. Brancati

Research output: Contribution to journal › Article

Abstract

BACKGROUND: Institutional review boards (IRBs) are charged with safeguarding potential research subjects with limited literacy but may have an inadvertent role in promulgating unreadable consent forms. We hypothesized that text provided by IRBs in informed-consent forms falls short of the IRBs' own readability standards and that readability is influenced by the level of research activity, local literacy rates, and federal oversight.

METHODS: To test these hypotheses, we conducted a cross-sectional study linking data from several public-use sources. A total of 114 Web sites of U.S. medical schools were surveyed for IRB readability standards and informed-consent-form templates. Actual readability was measured with the Flesch-Kincaid scale, which assigns a score on the basis of the minimal grade level required to read and understand English text (range, 0 to 12). Data on the level of research activity, local literacy rates, and federal oversight were obtained from organizational Web sites.

RESULTS: The average readability score for text provided by IRBs was 10.6 (95 percent confidence interval, 10.3 to 10.8) on the Flesch-Kincaid scale. Specific readability standards, found on 61 Web sites (54 percent), ranged from a 5th-grade reading level to a 10th-grade reading level. The mean Flesch-Kincaid scores for the readability of sample text provided by IRBs exceeded the stated standard by 2.8 grade levels (95 percent confidence interval, 2.4 to 3.2; P
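The Flesch-Kincaid grade level used in the study maps average sentence length and average syllables per word onto a U.S. school-grade scale. A minimal sketch of the standard formula (0.39 × words-per-sentence + 11.8 × syllables-per-word − 15.59) is below; the regex-based syllable counter is a rough heuristic for illustration, not the validated counter the authors' software would have used:

```python
import re

def flesch_kincaid_grade(text: str) -> float:
    """Approximate Flesch-Kincaid grade level of English text.

    Uses the standard formula:
        0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    Syllables are estimated by counting contiguous vowel groups per word,
    which is a crude heuristic.
    """
    # Count sentence terminators; assume at least one sentence.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))

    def syllables(word: str) -> int:
        # Each run of vowels (incl. y) approximates one syllable.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    total_syllables = sum(syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (total_syllables / n_words) - 15.59
```

Short, plain sentences score near early grade levels, while the long, polysyllabic sentences typical of consent-form boilerplate push the score toward the 10th-grade-plus range reported in the study.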

Original language: English (US)
Pages (from-to): 721-726
Number of pages: 6
Journal: New England Journal of Medicine
Volume: 348
Issue number: 8
DOI: 10.1056/NEJMsa021212
State: Published - Feb 20 2003

Fingerprint

  • Consent Forms
  • Research Ethics Committees
  • Reading
  • Confidence Intervals
  • Research Subjects
  • Medical Schools
  • Research
  • Cross-Sectional Studies

ASJC Scopus subject areas

  • Medicine(all)

Cite this

Readability standards for informed-consent forms as compared with actual readability. / Paasche-Orlow, Michael K.; Taylor, Holly; Brancati, Frederick L.

In: New England Journal of Medicine, Vol. 348, No. 8, 20.02.2003, p. 721-726.

Research output: Contribution to journal › Article

Paasche-Orlow, Michael K. ; Taylor, Holly ; Brancati, Frederick L. / Readability standards for informed-consent forms as compared with actual readability. In: New England Journal of Medicine. 2003 ; Vol. 348, No. 8. pp. 721-726.
@article{5d289fb22b8e45e28f274e54f1c2b65d,
title = "Readability standards for informed-consent forms as compared with actual readability",
author = "Paasche-Orlow, {Michael K.} and Holly Taylor and Brancati, {Frederick L.}",
year = "2003",
month = "2",
day = "20",
doi = "10.1056/NEJMsa021212",
language = "English (US)",
volume = "348",
pages = "721--726",
journal = "New England Journal of Medicine",
issn = "0028-4793",
publisher = "Massachusetts Medical Society",
number = "8",

}

TY - JOUR

T1 - Readability standards for informed-consent forms as compared with actual readability

AU - Paasche-Orlow, Michael K.

AU - Taylor, Holly

AU - Brancati, Frederick L.

PY - 2003/2/20

Y1 - 2003/2/20


UR - http://www.scopus.com/inward/record.url?scp=0037456358&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0037456358&partnerID=8YFLogxK

U2 - 10.1056/NEJMsa021212

DO - 10.1056/NEJMsa021212

M3 - Article

C2 - 12594317

AN - SCOPUS:0037456358

VL - 348

SP - 721

EP - 726

JO - New England Journal of Medicine

JF - New England Journal of Medicine

SN - 0028-4793

IS - 8

ER -