Reliability, Readability and Quality of Online Information about Femoroacetabular Impingement

Document Type: RESEARCH PAPER

Authors

1 Rothman Institute at Thomas Jefferson University, 925 Chestnut St, Philadelphia, PA 19107

2 Sidney Kimmel Medical College at Thomas Jefferson University, 1025 Chestnut St, Philadelphia, PA 19107

Abstract

Background: The Internet has become the most widely used source for patients seeking more information about their health, and many sites geared towards this audience have gained widespread use in recent years. Additionally, many healthcare institutions publish their own patient-education websites with information regarding common conditions. Little is known about how these resources impact patient health, though, as they have the potential both to inform and to misinform patients regarding their prognosis and possible treatments. In this study we investigated the reliability, readability and quality of online information about femoroacetabular impingement, a condition which commonly affects young patients.
Methods: The terms “hip impingement” and “femoracetabular impingement” were searched in Google® in November 2013 and the first 30 results were analyzed. The LIDA scale was used to assess website accessibility, usability and reliability. The DISCERN scale was used to assess reliability and quality of information. The FRE score was used to assess readability.
Results: The patient-oriented sites performed significantly worse in LIDA reliability and DISCERN reliability. However, the FRE score was significantly higher in patient-oriented sites.
Conclusion: According to our results, the websites intended to attract patients searching for information regarding femoroacetabular impingement provide a highly accessible, readable information source, but do not appear to apply an amount of rigor comparable to that of the scientific literature or healthcare practitioner websites in matters such as citing sources, supplying methodology and including a publication date. This indicates that while these resources are easily accessed by patients, there is potential for them to be a source of misinformation.
 

Keywords


Introduction

The Internet has become the most widely used source of information for patients, and the most important tool for maintaining up-to-date knowledge within the scientific community. While the internet makes a tremendous breadth of knowledge readily available to patients, the quality of the available information on the web is an issue for content provided in various languages (1–4). According to a recent systematic review, information quality was an issue in 70% of studies assessing the quality of health information on the internet (5). In addition to concerns about the accuracy of the information available on health care websites, technical aspects of the page such as spam generation, accessibility, credibility, readability and accuracy, and end-user behavior also raise issues (6,7).

While controlling the information posted to the Internet is impossible due to its deregulated nature, attempts have been made to estimate the individual risk of finding inadequate information regarding various health conditions (8). The largest and oldest such effort is the Health on the Net Foundation, a non-governmental initiative that established a code of ethical conduct, the HONcode, aimed at pointing patients towards quality health information by certifying websites deemed reliable for patient information (9). Unfortunately, the majority of quality healthcare websites do not display the HONcode, which makes it difficult to use as a marker of quality information. The searcher's ability to formulate a query that avoids uncertified, low-quality pages is also an issue. To address this, search engines, including one from the Health on the Net Foundation, have been developed that return only certified pages, but this effort has not proved better than general search engines, largely due to the lower volume of information available (10).

This study aimed to evaluate the reliability and quality of information regarding femoroacetabular impingement (FAI) available online, specifically the sites which appear in the top 30 results of a Google search. FAI was chosen because the non-emergent nature of the procedures allows a patient ample time to research their condition, and because it affects a high proportion of young patients, a population which uses the Internet extensively. The LIDA scale was used to assess website accessibility, usability and reliability. The DISCERN scale was used to assess reliability and quality of information. The FRE score was used to assess readability. All authors report no conflicts of interest related to this subject.

 

Materials and methods

The terms “hip impingement” and “femoracetabular impingement” were searched in Google in November 2013 and the first 30 results were recorded and analyzed using the Flesch Reading Ease (FRE), LIDA, and DISCERN instruments. Only the first 30 were chosen because it has been shown that 90% of search engine users click on a link within the first 3 pages of results (7) [Figure 1]. Duplicate web pages, video-based websites and social media sites were excluded, as the tools used in this study are geared towards text-based sites. All results were relevant to the entered search terms. Websites were not excluded based on intended audience, as 62% of Internet users search medical literature and 28% search clinical trials when looking for health information (11). The intended audience was noted based on statements made within each page, and a comparison of patient-oriented versus practitioner-oriented websites was made. Websites were not analyzed based on related links to other pages within the site, unless the scoring scale was based on an element of the website as a whole.

 

Readability

The Flesch Reading Ease (FRE) formula was used to calculate the readability of each website’s text (12). The main body of each website was copied as plain text and pasted into Microsoft Word. Websites which required navigation between multiple tabs to view all of the information were analyzed based on the cumulative text of the entire page. Blog-style websites were analyzed based on the most recent posting. All titles, headers, image captions, author details, tables, lists, hyperlinks, citations, references, user comments and website information were deleted. Analysis was performed using the spelling and grammar function in Microsoft Word. The FRE formula typically yields a score between 0 and 100, with 100 representing very easy to read text and 0 representing very difficult to read text; there is theoretically no lower limit, and the upper limit is approximately 121.
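The published formula (12) combines average sentence length with average syllables per word. As a rough illustrative sketch only (the vowel-group syllable counter below is a simplification; this study relied on Microsoft Word's built-in implementation), the score can be computed as:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Higher scores indicate easier text; a text of one-syllable words in one-word sentences yields the theoretical maximum of about 121.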

 

Accessibility, Usability, Reliability

The LIDA instrument was used to evaluate the accessibility, reliability and usability of the selected websites. The accessibility portion consisted of a website-generated accessibility test with a maximum score of 54 and two other categories scored on a 0–3 scale (0 being never, 3 being always), bringing the maximum possible score to 60. The other two portions of the instrument used the same 0–3 scoring system to assess usability (clarity, consistency, functionality, engageability), with a maximum score of 54, and reliability (currency, conflict of interest, content production), with a maximum score of 27. Two supplemental tests for content production and output of content, with a combined maximum score of 24, were also analyzed. This was performed in a blinded manner by analyzing the text in isolation from each website. These two scores are analyzed separately from the general LIDA score.

 

Information Quality

The DISCERN instrument was used to evaluate the reliability, information quality and overall quality of selected websites (13). The information quality portion consisted of 8 scores, giving a maximum total of 40, the reliability portion consisted of 7 scores, giving a maximum total of 35, and the overall score was a single category. Each score is reported as a percentage of the total possible. Though the test is subjective in nature, ratings have been shown to be consistent between multiple scorers, even in the overall quality portion for which no objective measure is given (14). One difference between DISCERN and LIDA is that LIDA has a low score of 0 while DISCERN has a low score of 1. Thus, the lowest possible percentage score for the DISCERN scale is 20%, while the lowest possible LIDA score is 0%.
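As a small arithmetic illustration of the percentage conversion described above (the function name and signature are ours, not part of the DISCERN instrument): with 7 reliability items each scored 1–5, the floor of 7 out of a possible 35 produces the 20% minimum.

```python
def discern_percentage(raw_total: int, n_items: int, max_per_item: int = 5) -> float:
    """Express a raw DISCERN total as a percentage of the maximum possible score."""
    return 100.0 * raw_total / (n_items * max_per_item)

# 7 reliability items, each at the minimum score of 1 -> 20%
lowest_reliability = discern_percentage(7, 7)
```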

 

Statistical Analysis

The difference between the mean DISCERN scores of websites by search term was analyzed using one-way analysis of variance, as was the difference between FRE scores by search term. The correlation between the overall DISCERN score and the position of a website’s appearance on the Google® results list was analyzed using the Pearson correlation coefficient r, as were the correlations among the overall DISCERN score, overall LIDA score, and FRE score.
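As an illustrative sketch of the correlation measure named above (not the statistical software actually used in this study), the Pearson coefficient r between two lists of scores can be computed as:

```python
import math

def pearson_r(x: list, y: list) -> float:
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 or -1 indicates a strong linear association (e.g., between DISCERN score and search rank); a value near 0 indicates none.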

 

Results

We retrieved 44 web pages from the Google® search engine, of which 17 were found in searches on “femoracetabular impingement,” 19 were found in searches on “hip impingement”, and 9 were found in both. The pages were grouped according to the main audience being addressed, with 28 pages oriented towards patients, 9 towards orthopedic surgeons, 2 towards physical therapists, 1 towards athletic trainers, 1 towards primary care physicians, 1 towards radiologists, 1 towards health care administrators, and 1 towards both physicians and patients. Only 2 of the 44 pages were HONcode certified.

The patient-oriented sites consistently scored about 30 points worse on 100-point scales for LIDA reliability, LIDA supplemental production (P<0.00001), LIDA supplemental output (P=0.0001) and DISCERN reliability. However, the FRE score was 16.68 points higher in patient-oriented sites (P<0.00001) [Figure 2]. Although based on different criteria, LIDA and DISCERN scores correlated strongly for patient-oriented sites (Spearman’s rho 0.75; P<0.00001). For physician-oriented sites there was no such correlation (Spearman’s rho 0.14; P=0.62), but this appeared to be due to the saturation of values near 100 points, as seen in Figure 3.

Subgroup analysis of different search queries (“hip impingement” vs. “femoracetabular impingement”) included page rank as a potential predictor but this was not found to be significant.

 

Discussion

In this study, we aimed to determine the readability, reliability and information quality of web pages regarding femoroacetabular impingement. Our results show that patient-oriented sites score significantly higher on readability indexes but worse in reliability and content generation. This indicates that the websites intended to attract patients searching for health information provide a highly accessible, readable information source, but do not appear to apply an amount of rigor comparable to that of the scientific literature or healthcare practitioner websites in matters such as citing sources, supplying methodology and including a publication date. A more concerted effort to present patient-oriented information in a scientifically rigorous manner, while maintaining the readability and accessibility that make it such a highly utilized education tool, would allow the careful patient to consider how the information was gathered, check it against original sources, or judge how recently it was published. This would help ensure that the information patients find is both reliable and accessible.

The potential for internet health information to mislead a patient is a major area of concern in the medical literature, exacerbated by the dynamic nature of the internet (15,16). Several instruments are available to assess the reliability and quality of web pages offering health information, but few are validated and few are likely to be practically usable by the intended audience. There have been numerous attempts to assess reliability and readability across various health-related topics, and these yield consistently poor results (17–19).

In addition to reliability, measurements of readability are important because the information provided on the web is written above the reading level of the average US adult. The minimum level required for understanding the content corresponds to a high school degree, while the level of education necessary for understanding the privacy disclosures corresponds to 2 years of college-level education (20,21). The second major finding was that, within search terms and controlling for audience, there was no association between scores and rank in the query results. Fewer than a quarter of the links on the first page of search results led to relevant content.

One strength of this work is the use of three validated scores to assess the critical aspects of health information for patients (22). A possible limitation was our exclusive use of Google to gather websites for analysis. While Google is by far the most used search engine, it does not necessarily provide a complete picture of the information available through all search engines. It should also be noted that this analysis was performed on only one topic at one point in time, providing just a snapshot of a voluminous, highly dynamic source of information (20). Additionally, no validated tools exist to analyze the accuracy of the content available on a website. Thus, we were unable to analyze whether these sites contained factual inaccuracies and instead focused on content production and methodological rigor to assess the quality of the information.

Regardless of the above limitations, our findings confirm the commonly held perception among the medical profession that patient-oriented information provided on the internet is easier to understand and access than practitioner-oriented information but does not apply a comparable level of rigor. These websites therefore have the potential to mislead patients seeking further information about their health conditions, and this issue appears to apply to the topic of femoroacetabular impingement. This has implications for healthcare practitioners, because it can be assumed that patients will access these websites, which may contain inaccuracies, and may therefore misunderstand their diagnosis. This reinforces the importance of thorough patient education to ensure that patients understand their conditions accurately, and suggests ways in which patient-oriented online information could be improved to provide access to information that is both high quality and highly accessible.

  1. Küçükdurmaz F, Aytekin M, Tuncay I, Sen C. A Pilot Study About Quality Of Information at Health Related Web Sites in Turkish: Meniscus Tear. Nobel Med. 2013; 9(2):114–7.
  2. Smarrito S, Mitrofanoff M, Haddad R, Pavy B. Do we need a chart of quality for websites related to cosmetic surgery? Ann Chir Plast Esthet. 2003; 48(4):222–7.
  3. Kishimoto K, Yoshino C, Fukushima N. Study of the health food information for cancer patients on Japanese websites. Yakugaku Zasshi. 2010; 130(8):1017–27.
  4. Neumark Y, Flum L, Lopez-Quintero C, Shtarkshall R. Quality of online health information about oral contraceptives from Hebrew-language websites. Isr J Health Policy Res. 2012; 1(1):38
  5. Eysenbach G, Powell J, Kuss O, Sa E-R. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA. 2002; 287(20):2691–700.
  6. Adams SA. Revisiting the online health information reliability debate in the wake of “ web 2.0”: An inter-disciplinary literature and website review. Int J Med Inform. 2010; 79(6):391-400.
  7. Eysenbach G, Köhler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ. 2002; 324(7337):573–7.
  8. Maloney S, Ilic D, Green S. Accessibility, nature and quality of health information on the Internet: a survey on osteoarthritis. Rheumatology (Oxford, England). 2005;44(3):382–5.
  9. Health on the Net Foundation. The commitment to reliable health and medical information on the internet; 2014. Available from: http://www.healthonnet.org/HONcode/Patients/Visitor/visitor.html.
  10. Ilic D, Bessell TL, Silagy CA, Green S. Specialized medical search-engines are no better than general search-engines in sourcing consumer information about androgen deficiency. Hum Reprod. 2003;18(3):557–61.
  11. Pletneva N, Cruchet S, Simonet M-A, Kajiwara M, Boyer C. Results of the 10th HON survey on health and medical internet use. Stud Health Technol Inform. 2011; 169:73–7.
  12. Flesch R. A new readability yardstick. J Appl Psychol. 1948; 32(2):221–33.
  13. Charnock D. Quality criteria for consumer health information on treatment choices. London: Radcliffe Medical Press; 1998.
  14. Rees CE, Ford JE, Sheard CE. Evaluating the reliability of DISCERN: A tool for assessing the quality of written patient information on treatment choices. Patient Educ Couns. 2002; 47(3):273–5.
  15. Soot LC, Moneta GL, Edwards JM, Roon AJ. Vascular surgery and the Internet: A poor source of patient-oriented information. J Vasc Surg. 1999; 30(1):84–91.
  16. Jadad AR, Gagliardi A. Rating health information on the Internet: navigating to knowledge or to Babel? JAMA. 1998; 279(8):611–4.
  17. Fast AM, Deibert CM, Hruby GW, Glassberg KI. Evaluating the quality of Internet health resources in pediatric urology. J Pediatr Urol. 2013; 9(2):151–6.
  18. Hargrave DR, Hargrave UA, Bouffet E. Quality of health information on the Internet in pediatric neuro-oncology. Neuro-oncol. 2006; 8(2):175–82.
  19. Ansani NT, Vogt M, Henderson BAF, McKaveney TP, Weber RJ, Smith RB, et al. Quality of arthritis information on the Internet. Am J Health Syst Pharm. 2005; 62(11):1184–9.
  20. Berland GK, Elliott MN, Morales LS, Algazy JI, Kravitz RL, Broder MS, et al. Health Information on the Internet. JAMA. 2001; 285(20):2612–21.
  21. Graber MA, D’Alessandro DM, Johnson-West J. Reading level of privacy policies on Internet health Web sites. J Fam Pract. 2002; 51(7):642–5.
  22. Purcell GP, Wilson P, Delamothe T. The quality of health information on the internet. BMJ. 2002; 324:557.