
Pelvic vein embolization: an assessment of the readability and quality of online information for patients



Abstract

Background

Pelvic congestion syndrome is a controversial topic. Pelvic vein embolization is a minimally invasive treatment for pelvic congestion syndrome. We aimed to assess the quality of information available on the Internet and to determine how accessible the information provided by the main interventional radiology (IR) societies was to patients.

Materials and methods

The most commonly used term relating to pelvic vein embolization was searched across the five most-used English-language search engines, with the first 25 web pages returned by each engine included for analysis. Duplicate web pages, non-text content, and web pages behind paywalls were excluded. Web pages were analyzed for quality and readability using validated tools: DISCERN score, JAMA Benchmark Criteria, HONcode Certification, Flesch Reading Ease Score, Flesch–Kincaid Grade Level, and Gunning–Fog Index.


Results

The most common applicable term was “Pelvic Vein Embolization”. The mean DISCERN quality of the information provided by websites was “fair”. The Flesch–Kincaid readability tests and Gunning–Fog Index demonstrated an average “college level” of reading ease. HONcode certification was demonstrated in less than one third of web pages. Professional societies and scientific journals demonstrated the highest average JAMA and DISCERN scores, while for-profit organizations and healthcare providers demonstrated the lowest. Only information from 1 of 3 interventional societies was included in the first 25 search engine pages.


Conclusion

The quality of information available online to patients is “fair”, and outside of scientific journals the majority of web pages do not meet the JAMA benchmark criteria. These findings call for the production of high-quality, comprehensible content regarding interventional radiology to which physicians can reliably direct their patients for information.



Background

In an ever-expanding technological age, people increasingly turn to the internet for information and advice on many facets of everyday life, including healthcare (Marton and Wei 2012). Online health information spans a broad range, from patient experiences discussed in forums to esoteric scientific journals, and poor-quality information can negatively influence patient decision-making outside of the doctor-patient consultation (Papen 2013).

Interventional Radiology (IR) has revolutionized treatment for a wide range of conditions, including pelvic congestion syndrome (PCS), for which it offers pelvic vein embolization. While pelvic vein embolization can offer a high rate of symptom relief (Brown et al. 2018), the treatment of PCS remains a controversial area.

The primary purpose of this study was to assess the readability and quality of online patient information on pelvic vein embolization using a variety of validated instruments and quality measures. Pelvic congestion syndrome was chosen because it is controversial, so high-quality information would help patients navigate the issue and inform themselves. A secondary aim was to assess the information provided by the large English-speaking IR societies: the Cardiovascular and Interventional Radiological Society of Europe (CIRSE), the Society of Interventional Radiology (SIR), and the British Society of Interventional Radiology (BSIR).

Materials and methods

Web page selection process

Our study’s search strategy was similar to that used by Murray et al. (2018). A list of the most familiar search terms describing pelvic vein embolization was compiled from relevant literature and patient-information websites: “Pelvic Vein Embolization” and “Ovarian Vein Embolization”. Each of these terms was then searched across the five most popular English-language search engines (Google, Bing, Yahoo, and AOL Search) (Chris 2019). All searches were conducted from the same Internet Protocol address with cache and cookies cleared, to minimize the influence of previous queries. Only the top 25 web pages from each search engine were examined, as it has been shown that patients are unlikely to view beyond these results for health-related searches (O’Neill et al. 2014; Silberg et al. 1997). The exclusion criteria comprised web pages advertised by search engines, web pages behind paywalls, video- or audio-only content, geographically inaccessible web pages, duplicate web pages, and web pages that were subsections of others.
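The pooling and de-duplication step described above can be sketched as follows. This is an illustrative sketch only, with hypothetical helper names and example URLs; the study performed this screening manually across the five engines.

```python
# Sketch: pool the top-N results per engine, excluding duplicate web pages.
# Hypothetical data; the real screening also excluded ads, paywalls, etc.
from urllib.parse import urlparse

def normalize(url: str) -> str:
    # Treat http/https, "www." prefixes and trailing slashes as the same page.
    p = urlparse(url)
    return p.netloc.lower().removeprefix("www.") + p.path.rstrip("/")

def pool_results(results_per_engine, top_n=25):
    seen, pooled = set(), []
    for results in results_per_engine:
        for url in results[:top_n]:      # only the first 25 per engine
            key = normalize(url)
            if key not in seen:          # drop duplicate web pages
                seen.add(key)
                pooled.append(url)
    return pooled

pooled = pool_results([
    ["https://www.example.org/pve", "http://journal.example.com/article1"],
    ["https://example.org/pve/", "https://clinic.example.net/treatment"],
])
# The second engine's first URL is a duplicate of the first engine's.
```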


Each web page was assessed for quality using three validated methods: the JAMA Benchmark Criteria, Health on the Net Foundation (HONcode) certification, and the DISCERN instrument (O’Neill et al. 2014). The JAMA benchmarks comprise four criteria: (i) authorship, (ii) attribution of sources of information, (iii) disclosure of conflicts of interest, and (iv) currency of the information (Silberg et al. 1997). The publishing organization was recorded either from the web page itself or from its “About Us/Contact Us” section. The date of creation, or of the last reported update, was recorded to assess currency, similar to the strategy used by Alderson et al. (2019).

The DISCERN instrument is a 16-point questionnaire that assesses important aspects of information reliability, description of treatment choices, and overall information quality (Charnock et al. 1999). A higher score indicates higher quality healthcare information. DISCERN has demonstrated inter-observer reliability and construct validity when used either by medical professionals or laypersons (Alderson et al. 2019; Rees et al. 2002).

The Health on the Net code of conduct (HONcode) is a website standard that assesses the credibility and reliability of healthcare information. Prior studies have shown that HONcode is a marker of reliable medical information (Laversin et al. 2011) and is associated with superior clinical accuracy (Fallis and Fricke 2002). HONcode certification was recorded by checking the HONcode online database.


The National Institutes of Health and the American Medical Association recommend that the readability of health information for patients should not surpass a 6th grade reading level (Weiss 2003). Readability was assessed via an online analysis tool using three measures: (i) the Flesch Reading Ease Score (FRES), (ii) the Flesch–Kincaid Grade Level (FKGL), and (iii) the Gunning–Fog Index (GFI) (Readability Test Tool 2019). The FRES reports the reading ease of a text on a scale from 0 to 100, with 100 being the easiest text to read and 0 the most difficult. The FKGL formula depends on two variables, average sentence length and average number of syllables per word (Flesch 1948), and corresponds to the US school grade required to read the text, so a higher score indicates a more difficult passage (Kincaid et al. 1975). The GFI is a separate readability measure that additionally accounts for word complexity and unfamiliarity using the formula 0.4 [(words/sentences) + 100 (complex words/words)], with reference to a list of common words that are not considered complex regardless of syllable count; it estimates the number of years of education required to read an article (O’Neill et al. 2014; Walsh and Volsko 2008). The higher the GFI score, the more difficult the passage. Additionally, the qualification of the author and the web page owner were recorded.
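The three formulas can be sketched in Python. This is a minimal illustration, not the tool the study used: the helper names are hypothetical, and syllables are approximated by a crude vowel-group heuristic, so the scores will differ slightly from those of dedicated readability software.

```python
# Sketch of the FRES, FKGL and GFI formulas (approximate syllable counting).
import re

def count_syllables(word: str) -> int:
    # Approximate syllables as runs of vowels (crude but common heuristic).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    return {
        # Flesch Reading Ease: 0-100, higher = easier.
        "FRES": 206.835 - 1.015 * wps - 84.6 * spw,
        # Flesch-Kincaid Grade Level: approximate US school grade required.
        "FKGL": 0.39 * wps + 11.8 * spw - 15.59,
        # Gunning-Fog Index: 0.4 [(words/sentences) + 100 (complex/words)].
        "GFI": 0.4 * (wps + 100 * len(complex_words) / len(words)),
    }

scores = readability("Pelvic vein embolization is a minimally invasive "
                     "treatment. It is performed by interventional radiologists.")
```

Note how medical vocabulary drives the scores: nearly half the words in this two-sentence sample are "complex" (three or more syllables), pushing the GFI close to 20 years of education.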

Statistical methods

Spearman rank-order correlation was used to assess the relationship between web page quality scores and position in the search results. JAMA benchmark score, mean web page age, and DISCERN score were compared across web page owner categories by one-way analysis of variance (ANOVA). Significance was predetermined at p < 0.05. Analysis was performed using Stata/IC 15 software (StataCorp 2017). An online statistics tool was used to create box plots.
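For illustration, the Spearman rank-order statistic can be computed directly from its definition (made-up scores, not the study's data; this simple formula assumes no tied values):

```python
# Sketch: Spearman rank-order correlation between search position and a
# quality score, rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)). Illustrative data only.
def rank(values):
    # Assign 1-based ranks (assumes no ties for simplicity).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

search_rank = [1, 2, 3, 4, 5, 6, 7, 8]          # position in results
discern     = [54, 40, 47, 35, 42, 30, 38, 33]  # hypothetical quality scores
rho = spearman(search_rank, discern)            # negative: quality falls with rank
```

A negative rho here would mean quality declines further down the results page; significance would still need a p-value, as computed in the study with Stata.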

Institutional Review Board (IRB) approval was unnecessary, as no human subjects were involved in this study.


Results

Search terms

Analysis identified “Pelvic Vein Embolization” as the most common search term (824,000 hits). The first 25 search results from each of the 5 search engines were chosen for analysis, yielding a total of 125 items. Seventy-five search results were excluded: 55 duplicate web pages and 20 non-readable links (video, n = 13; paywalled web pages, n = 6; inaccessible website, n = 1). Fifty search results remained for analysis (Table 1).

Table 1 Web page ownership by organisation type

JAMA benchmark criteria

Compliance with the JAMA benchmarks was recorded separately for each website (Table 2). Only 14% of websites (n = 7) achieved the full JAMA benchmark score of 4, all of them scientific journals.

Table 2 Compliance with the Journal of the American Medical Association (JAMA) Quality Benchmarks


Authorship was attributed to doctors (n = 23) and non-medical authors (n = 3); in the remaining cases (n = 24), the author was not reported. Half of the websites (50%) provided the original date of publication or update (the most recent being recorded), with a mean age of 3.28 years.


Scientific journals had 100% compliance (n = 11), whereas both for-profit (n = 6) and non-profit (n = 8) organisations had 50%, professional societies had 30% (n = 1), and healthcare providers had the lowest compliance at 27% (n = 6).


Scientific journals had the highest average DISCERN score at 51.2, while for-profit organisations had the lowest at 35 (Table 3). 48% of websites were rated “very poor” or “poor” with a DISCERN score below 38, 32% achieved a “fair” score, and only 20% received a “good” score of 51–62. No websites scored well enough to be deemed “excellent” by the DISCERN rating (p < 0.0002) (Table 3).

Table 3 DISCERN score: quality of websites

HONcode Certification

In total, 10% of websites had HONcode certification: scientific journals had the highest rate at 6%, for-profit, non-profit, and professional-society websites each had 2%, and healthcare providers had no HONcode certification (Table 4).

Table 4 Summary of Results


In general, the average readability scores, FKGL 13.1 (p = 0.0412), GFI 15.6 (p = 0.0322), and FRES 41.3 (p = 0.0314), indicate a “college level” of reading ease. Table 4 shows the variation by web page owner.

Quality assessment

The mean JAMA score was 3.28 and the mean DISCERN score was 40.8. The correlation between search ranking and quality score failed to reach significance for either the JAMA benchmark criteria (rs = −0.2517, two-tailed p = 0.078) or the DISCERN score (rs = −0.1549, two-tailed p = 0.278).

Readability assessment

The average FRES score was 41.30, the average FKGL score 13.14, and the average GFI score 15.61. There was no significant association between web page owner and FRES, FKGL, or GFI score (rs = 0.12, p = 0.41; rs = −0.18, p = 0.21; and rs = −0.14, p = 0.33, respectively).

Of the patient-information web pages provided by IR societies, only the BSIR page placed in the top 25 search results; CIRSE and SIR material was not identified in the top 25 results of any search engine. The BSIR page had a JAMA score of 2, a “good” DISCERN score of 54, and readability scores of 53.8 (FRES), 10.3 (FKGL), and 13.6 (GFI).

The publishing journals of these societies fared as follows. CVIR had 2 papers (4 search engine results, 2 of which were duplicates and one paywalled) in the top 10 search results across all search engines. JVIR had the most results of all IR journals, with 5 papers across the search engines; two of these were paywalled.


Discussion

Pelvic congestion syndrome is undoubtedly a controversial topic in medicine. Within IR there is no consensus on the treatment approach, as most of the literature is a collection of techniques and targetable therapeutic sites, and procedures are often not covered by insurance companies owing to a lack of high-quality literature. Review of these websites makes clear that patients face a mix of information to decipher. While some websites do note other treatment options for PCS, including analgesia and surgical options, for-profit standalone clinics and private surgical centers often describe pelvic vein embolization as the only effective curative approach, with high success rates. The technique of embolization was also variably covered.

There was wide variation in web page quality as measured by JAMA and DISCERN scores. Professional societies and scientific journals showed the highest average scores. Of note, scientific journals also scored the lowest FRES results, signifying text that is more difficult for the public to read. The more numerous, more understandable web pages in the healthcare-provider category could easily tempt the general public to engage with the lower-quality information these providers offer.

Assessment against the JAMA benchmarks showed that most pages lacked authorship, references, and disclosures. Doctors represented the largest group of authors (46%); however, a further 48% of web pages did not report an author, which calls their legitimacy into question. The average DISCERN score was 40.8, a “fair” quality below what many patients would expect to find when looking for information. As expected, scientific journals showed the highest average DISCERN score and were the only category to fulfill all JAMA benchmark criteria. However, the target audience of scientific journals is not patients, and the medical terminology and “jargon” used may make the information inaccessible to them. Healthcare providers and for-profit organizations had the lowest JAMA and DISCERN scores, suggesting the provision of limited information that emphasizes their own particular interests. HONcode certification is an area in which all web pages could improve.

In general, the average readability scores indicate a “college level” of reading ease; web pages were thus of moderate difficulty in both word complexity and technical readability. To put this in perspective, this paper scores an FKGL of 14.1, a GFI of 16.3, and an FRES of 32.8, which is representative of the difficult average readability of the information patients access online when they search for pelvic vein embolization. As the general public increasingly relies on the Internet for information about procedures, it is important that the information available is at a suitable level. Instead of the suggested 6th grade readability level for health information (Weiss 2003), we have shown the level to be significantly higher.

Web-based health information is critical: for the ever-expanding proportion of the population relying on the Internet for information, it can alter behavior, reach peers in real time, increase satisfaction with care, improve health outcomes, and facilitate shared decision-making between patients and healthcare professionals (Suggs 2006; Daraz et al. 2011).

A better understanding of IR procedures would likely foster a more appropriate and well-informed decision-making process. Effective communication and dissemination of quality information on interventional procedures is important for the continued expansion of Interventional Radiology as a specialty, as well-informed patients are more likely to choose less invasive treatment options (Becker 2001). Many independent bodies and IR societies (CIRSE, SIR, BSIR) have patient-information web pages; however, these pages may not be found by patients, given their inclination not to proceed past the first 25 results. In fact, neither the CIRSE nor the SIR information page on pelvic vein embolization makes it into the top 100 Google search results. These organizations should aim to increase their online presence by moving higher up the search engine rankings.

Study limitations

Readability tools often do not take into account the content or complexity of medical vocabulary, or patients’ familiarity with medical terminology, and may under- or overestimate the actual readability of online health information (Smith et al. 2011; Pichert and Elam 1985). Scoring was performed by doctors rather than by patients, who are the target demographic of this study. This paper did not assess multimedia websites.

Future work

One intriguing area for future research relates to the use of mixed online media (e.g., video, audio) to deliver health care information. Mixed multimedia health information may be easier to understand than traditional text.


Conclusion

The reliability and quality of online content remains a critical issue for patients and doctors alike. This study demonstrates that, outside of scientific journals, the majority of web pages do not meet the JAMA benchmark criteria. Most patients would find these articles difficult to understand, have little means of judging which to trust, and could be misled by their content. Content producers on the Internet need greater awareness of quality and readability tools which, when applied, could improve their trustworthiness and patients’ understanding. We believe there is a need for a high-quality Interventional Radiology website that is current, impartial, well sourced, and easy to read at a level accessible to patients.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.


References

  1. Alderson JH, O'Neill DC, Redmond CE, Mulholland D, Lee MJ (2019) Varicocele embolization: an assessment of the quality and readability of online patient information. Acad Radiol [Epub ahead of print]

  2. Becker GJ (2001) The future of interventional radiology. Radiology 220:281–292


  3. Brown C et al (2018) Pelvic congestion syndrome: systematic review of treatment success. Semin Interv Radiol 35(1):35–40


  4. Charnock D, Shepperd S, Needham G et al (1999) DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 53:105–111


  5. Chris A (2019) Reliable Soft [website]. Accessed 16 Sep 2019

  6. Daraz L, MacDermid JC, Wilkins S et al (2011) The quality of websites addressing fibromyalgia: an assessment of quality and readability using standardised tools. BMJ Open 1:e000152.

  7. Fallis D, Fricke M (2002) Indicators of accuracy of consumer health information on the internet: a study of indicators relating to information for managing fever in children in the home. J Am Med Inform Assoc 9:73–79


  8. Flesch R (1948) A new readability yardstick. J Appl Psychol 32:221–233


  9. Kincaid JP, Fishburne RP Jr, Rogers RL et al (1975) Derivation of new readability formulas (automated readability index, fog count and Flesch reading ease formula) for navy enlisted personnel. Naval Technical Training Command, Millington, Research Branch


  10. Laversin S, Baujard V, Gaudinat A, Simonet MA, Boyer C (2011) Improving the transparency of health information found on the internet through the honcode: a comparative study. Stud Health Technol Inform 169:654–658


  11. Marton C, Wei CC (2012) A review of theoretical models of health information seeking on the web. J Documentation 68(3):330–352


  12. Murray TE, Mansoor T, Bowden DJ, O'Neill DC, Lee MJ, Chris A (2018) Uterine artery embolization: an analysis of online patient information quality and readability with historical comparison. Acad Radiol 25(5):619–625


  13. O’Neill SC, Baker JF, Fitzgerald C et al (2014) Cauda Equina syndrome: assessing the readability and quality of patient information on the internet. Spine 39:E645–E649


  14. Papen U (2013) Conceptualising information literacy as social practice: a study of pregnant women’s information practices. Inf Res 18:280


  15. Pichert JW, Elam P (1985) Readability formulas may mislead you. Patient Educ Couns 7:181–191


  16. Readability Test Tool (2019) WebFX, Harrisburg, PA. Accessed 23 Sep 2019

  17. Rees EC, Ford JE, Sheard CE (2002) Evaluating the reliability of DISCERN: a tool for assessing the quality of written patient information on treatment choices. Patient Educ Couns 47:273–275


  18. Silberg WM, Lundberg GD, Musacchio RA (1997) Assessing, controlling, and assuring the quality of medical information on the internet: Caveant lector et viewor--let the reader and viewer beware. JAMA 277:1244–1245


  19. Smith CA, Hetzel S, Dalrymple P, Keselman A (2011) Beyond readability: investigating coherence of clinical text for consumers. J Med Internet Res 13:e104


  20. Suggs LS (2006) A 10-year retrospective of research in new technologies for health communication. J Health Commun 11(1):61–74


  21. Walsh TM, Volsko TA (2008) Readability assessment of internet-based consumer health information. Respir Care 53:1310–1315


  22. Weiss BD (2003) Health literacy: a manual for clinicians. American Medical Association Foundation and American Medical Association, Chicago





Compliance with ethical standards statements

Ethics approval was unnecessary, as no human subjects were involved in this study.

Informed consent

Not applicable.


Funding

No funding was received for this paper.

Author information




RL gathered, analysed, and interpreted the data (HONcode, DISCERN, and JAMA online tools). RL, DON, and ML were major contributors in writing the manuscript. ML and DON made substantial contributions to the conception of this research. MB and JA made minor contributions to writing the manuscript and assisted with analysis; MB also assisted with the statistics. All authors read and approved the final manuscript.

Corresponding author

Correspondence to R. J. Lee.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable; no human subjects or their data were used.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit




Cite this article

Lee, R.J., O’Neill, D.C., Brassil, M. et al. Pelvic vein embolization: an assessment of the readability and quality of online information for patients. CVIR Endovasc 3, 52 (2020).



Keywords

  • Pelvic vein
  • Embolization
  • Pelvic congestion syndrome
  • Information
  • Patient
  • Internet
  • Online