Obstructive sleep apnoea patient resources—a longitudinal analysis of the quality and readability over a 5-year timespan
Original Article

William Theile1, Christopher K. T. G. Erian2, Michael M. A. S. Erian3, Leon Kitipornchai4

1Division of Ear Nose and Throat Surgery, Department of Surgery, Princess Alexandra Hospital, Brisbane, QLD, Australia; 2Division of Orthopaedic Surgery, Department of Surgery, Princess Alexandra Hospital, Brisbane, QLD, Australia; 3Division of Orthopaedic Surgery, Department of Surgery, Ipswich Hospital, Ipswich, QLD, Australia; 4Division of Ear Nose and Throat Surgery, Department of Surgery, Clinical Senior Lecturer University of Queensland Faculty of Medicine, Logan Hospital, Brisbane, QLD, Australia

Contributions: (I) Conception and design: W Theile, CKTG Erian; (II) Administrative support: All authors; (III) Provision of study materials or patients: None; (IV) Collection and assembly of data: W Theile, CKTG Erian, MMAS Erian; (V) Data analysis and interpretation: W Theile, CKTG Erian, MMAS Erian; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.

Correspondence to: William Theile, MD. Associate Academic/Lecturer/Fellow University of Queensland, Division of Ear Nose and Throat Surgery, Department of Surgery, Princess Alexandra Hospital, 199 Ipswich Rd, Woolloongabba, 4102 Brisbane, QLD, Australia. Email: William.theile@health.qld.gov.au.

Background: Obstructive sleep apnoea (OSA) is an increasingly important health condition with significant impacts on morbidity and mortality. Given the increasing reliance on the internet as a source of clinical information for patients, we aimed to explore the quality and readability of this content and how it may be evolving over time.

Methods: Ninety websites were scrutinised in 2019 and again 5 years later in 2024. The top 15 search results for the terms ‘Obstructive sleep apnoea’ and ‘Sleep apnea’ across Google, Bing, and Yahoo were evaluated using validated readability [Flesch-Kincaid Reading Ease (FKRE)] and quality assessment tools [DIssembling Information on Treatment Choices for Patients (DISCERN)].

Results: In 2019, FKRE scores ranged from 31 to 84 (‘difficult’ to ‘easy’ reading ease), with a mean score of 57.9 (standard deviation =11.28). In 2024, FKRE scores ranged from 16 to 80, with a mean score of 54.3 (standard deviation =13.17). DISCERN quality scores in 2019 ranged from 21 to 63, with a mean score of 36.6 (standard deviation =11.98); in 2024, they ranged from 32 to 68, with a mean score of 53.1 (standard deviation =8.63). A statistically significant negative correlation (Spearman’s r=−0.467, P=0.044) was observed in the 2024 data, indicating that higher-quality information was associated with lower readability and vice versa.

Conclusions: While patient resources for OSA have improved in quality over time, their readability has declined such that the majority are now too difficult for the average Australian to comprehend. More concerning still, the resources of superior quality were found to be more difficult to read. Among the sources evaluated, ResMed consistently provided the most balanced content in terms of both quality and readability over the past 5 years (https://www.resmed.com.au/sleep-apnea; FKRE: 2019 =65, 2024 =74; DISCERN: 2019 =63, 2024 =50).

Keywords: Sleep apnea; obstructive; patient education; health knowledge; health literacy


Received: 08 January 2025; Accepted: 07 July 2025; Published online: 30 October 2025.

doi: 10.21037/ajo-25-4


Introduction

Obstructive sleep apnoea (OSA) is an increasingly common health condition with a profound impact on sufferers’ quality of life (1). It has an estimated prevalence of up to 28% among adults in Western countries; men, the elderly and those with a high body mass index (BMI) are most commonly affected (2,3). OSA is characterised by repeated upper airway collapse during sleep, leading to recurrent apnoeas, desaturations and interrupted sleep, and is associated with significant morbidity and mortality (1). OSA is now known to predispose patients to coronary artery disease, hypertension and stroke, as well as to increase their risk of occupational and vehicular accidents (4-7). Beyond personal morbidity, OSA exacts a significant public health cost owing to direct healthcare-related costs, treatment of conditions resulting from OSA and lost productivity (8).

The increasing public health relevance of OSA has been accompanied by a marked increase in the amount of information available to patients concerning the condition. Much of this information is now sought online, reflecting increasing internet accessibility globally as well as evolving patterns in how patients engage with health information. Currently, nearly 13 million Australians have an active home internet subscription, with almost double that number having internet access via mobile devices (9). Likewise, Australians increasingly rely on online resources for health research: the proportion doing so rose from 22% in 2014–2015 to 46% in 2016–2017 (9). It can reasonably be extrapolated that these figures would be even higher in 2024.

Online searches for ear, nose and throat (ENT) conditions such as OSA have increased in recent years, a trend reflected across Australian search engine domains (Figure 1). For example, entering the search terms “obstructive sleep apnoea” and “sleep apnea” into Google (www.google.com.au) in 2019 yielded a total of 32.3 million results; in 2024, the same searches returned over 100 million. Likewise, Tassone et al. demonstrated in 2004 that 64% of ENT outpatients surveyed had internet access, and nearly 20% reported having used the internet to obtain information pertaining to their medical condition (10). A more contemporary study of 501 paediatric ENT outpatients and their caregivers reflects a similar trend, with 30% of caregivers reporting having accessed online health resources prior to their appointment (11). Importantly, 26% of such patients stated that the information they consulted ultimately informed management decisions regarding their child’s care. This demonstrates the significant role online resources play in patient decision-making.

Figure 1 Google trends analysis of Google.com.au searches of ‘sleep apnea’ in the past 10 years. Search date: 6/10/2024.

Appropriate patient education is known to underpin successful therapeutic relationships. It has been shown that effectively delivered patient education decreases hospital readmission rates, fosters patient collaboration in shared health decision making and, ultimately, motivates more satisfactory health outcomes (12-14). Contrastingly, information of dubious quality or that which cannot be comprehended by the average patient may impair sound clinical decisions (15). This also complicates the task of the clinician, as practitioners must overcome false impressions to adequately inform their patients regarding their conditions and management options. Thus, online healthcare information should both accurately distil clinical understanding and be communicated in a way that is comprehensible by its intended audience.

Two important determinants of the utility of healthcare resources are their quality and readability. The availability of high-quality information is of paramount importance to ensuring patients can make well-informed decisions. Beyond quality, the language in which healthcare information is communicated is an important determinant of how well a resource will be received: content with poor readability impairs a resource’s utility and, ultimately, the reach of the information (15). Consequently, the accessibility of a publication’s language is a key consideration for many stakeholders when devising online healthcare resources. Many tools exist to evaluate the quality of healthcare information, and a prominent challenge in evaluating and contrasting resource quality is the lack of consensus as to how best to do so. Validated instruments were therefore used in this study to assess quality and readability. The DIssembling Information on Treatment Choices for Patients (DISCERN) tool assesses 16 criteria to define the quality of written health information, producing a score out of 80, and the Flesch-Kincaid Reading Ease (FKRE) system produces a readability score out of 100 based on the average number of syllables, words and sentences in a resource (16-18). The use of these tools is well-established in the medical literature; for example, they have been employed to evaluate resources in neurology, oncology and orthopaedics, among other disciplines (19-21).

Despite the increasing ubiquity of online information specific to OSA, little is known of the quality and accessibility of the material available to patients and their caregivers. One study previously examined websites related to tonsillectomy and sleep apnoea in paediatric contexts, concluding that the resources encountered were of variable quality and above the recommended readability for most healthcare consumers (22). Other studies have analysed data specific to OSA surgery and compared search engines with artificial intelligence software (23-26). However, this is the first study to conduct a longitudinal analysis of the readability and quality of OSA information over time, comparing data collected in 2019 with data collected in the same manner, by the same researchers, 5 years later in 2024.

Patients who access online health information are more likely to undertake searches using open-access search engines instead of relying on academic or medical databases (27). Given this fact, our study sought to evaluate the quality and readability of online patient resources concerning OSA over a 5-year period. The goal of this study was to determine if good-quality information is readily accessible to patients and where clinicians can direct patients to access this. It was hypothesised that the websites encountered would be of variable quality and readability and the available resources would improve in quality and be of similar readability over a 5-year timeframe.


Methods

Search strategy

The websites included in this longitudinal analysis were retrieved via a varied search strategy to maximise the results’ generalisability. Searches were conducted via the three most popular search engines in Australia: Google (www.google.com.au), Bing (www.bing.com) and Yahoo! (au.yahoo.com). These websites together represent 98.2% of Australian online search volume (28).

To maximise the chances of replicating the typical online search strategy of a patient, the top 15 websites were collected for two broad search terms (‘obstructive sleep apnoea’ and the simplified generic term ‘sleep apnea’) across the three aforementioned search engines. Searches were conducted on 9 July 2019 using Australian iterations of the above search engines, and this process was repeated on 6 October 2024. Sponsored links and advertisement content were not included in the analysis, as these fluctuate in search rank position and hence were less likely to be used by the average patient. Duplicate websites, resources that were not open-access, websites with no distinct management information and non-English resources were also excluded (Figures 2,3). The study was discussed with the ENT Research Development team at the Princess Alexandra Hospital; no ethics approval was required as only publicly accessible data were used.

Figure 2 Flow chart outlining the search strategy and exclusion criteria 2019.
Figure 3 Flow chart outlining the search strategy and exclusion criteria 2024.

Assessment of information quality

In this study, the quality of the resources encountered was appraised via the DISCERN instrument. DISCERN is a quality-grading tool for written health information comprising 16 distinct criteria, each addressing a component considered to underpin quality healthcare information; a higher total score indicates better-quality information (16). Each criterion was rated from 1 to 5 (except question 2, which was rated from 0 to 5) and the resultant scores summed, giving a possible total of 15 to 80. The websites were then organised using previously published categorisations of DISCERN scores: 15–28 indicates poor quality, 29–41 fair quality, 42–54 good quality, 55–67 very good and 68–80 excellent (29).
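To make the banding concrete, the summation and categorisation described above can be sketched in a few lines of Python (a minimal sketch; the function name and example ratings are illustrative only and are not part of the DISCERN instrument):

```python
# Band a summed DISCERN score using the published categorisations (29).
# The example ratings below are hypothetical illustrations only.

def discern_band(total: int) -> str:
    """Map a total DISCERN score (15-80) to its published quality band."""
    if 15 <= total <= 28:
        return "poor"
    if 29 <= total <= 41:
        return "fair"
    if 42 <= total <= 54:
        return "good"
    if 55 <= total <= 67:
        return "very good"
    if 68 <= total <= 80:
        return "excellent"
    raise ValueError(f"score outside DISCERN range: {total}")

# 16 criteria, each rated 1-5 (question 2 may be rated 0-5).
example_ratings = [3, 0, 4, 3, 2, 3, 4, 3, 2, 3, 3, 4, 2, 3, 3, 3]
total = sum(example_ratings)  # 45, which falls in the 'good' band
```

Applying the same banding to the mean scores reported later (36.6 in 2019, 53.1 in 2024) reproduces the ‘fair’ and ‘good’ classifications, respectively.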

To determine the scores’ inter-rater reliability, a second reviewer blinded to the original DISCERN scores evaluated a random selection of 15 websites, chosen using the randomisation function in Excel. These scores were organised as per the above categorisations and contrasted against the original DISCERN ratings using Cohen’s quadratically weighted kappa (κ). Reliability in this study was thus determined using a methodology similar to that employed in the validation study of the DISCERN tool (30), minimising bias.
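Cohen’s quadratically weighted kappa penalises disagreements by the squared distance between ordinal categories, so a ‘fair’ versus ‘good’ disagreement costs far less than ‘poor’ versus ‘excellent’. The sketch below is a minimal pure-Python illustration of the statistic; the function name and rating vectors are hypothetical, and the study’s own calculations would ordinarily be performed with standard statistical software:

```python
# Cohen's quadratically weighted kappa for two raters over ordinal
# categories (0..n_categories-1). A sketch only; not the study data.

def quadratic_weighted_kappa(r1, r2, n_categories):
    n = len(r1)
    # Observed agreement matrix: counts of (rater1 category, rater2 category).
    observed = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(r1, r2):
        observed[a][b] += 1
    # Marginal totals for each rater.
    marg1 = [sum(observed[i]) for i in range(n_categories)]
    marg2 = [sum(observed[i][j] for i in range(n_categories))
             for j in range(n_categories)]
    num = den = 0.0
    for i in range(n_categories):
        for j in range(n_categories):
            # Quadratic disagreement weight: 0 on the diagonal, 1 at extremes.
            w = ((i - j) ** 2) / ((n_categories - 1) ** 2)
            num += w * observed[i][j]                 # observed disagreement
            den += w * marg1[i] * marg2[j] / n        # chance disagreement
    return 1.0 - num / den
```

Perfect agreement yields κ=1 and systematic maximal disagreement yields κ=−1, so the study’s values of 0.66 and 0.75 indicate substantial agreement.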

Assessment of information readability

The accessibility of each website’s language (its ‘readability’) was determined using the FKRE scoring system, which considers the average number of syllables, words and sentences in a publication and provides an objective readability score from 0 to 100; higher scores reflect easier-to-read resources and lower scores more complicated language (17,18). FKRE scores were calculated using the formula shown in Figure 4 via freely available online software, the “Web FX Readability Test” (31). As with the DISCERN scores, the FKRE results were grouped according to previously published categorisations: scores of 0–60 indicate difficult readability, 61–70 standard readability and 71–100 easy readability (32).

Figure 4 FKRE formula. FKRE, Flesch-Kincaid Reading Ease.
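The published Flesch-Kincaid Reading Ease formula is FKRE = 206.835 − 1.015 × (total words / total sentences) − 84.6 × (total syllables / total words). The sketch below applies it to pre-tallied counts; the helper name and example counts are illustrative only (the study itself used the Web FX Readability Test, and robust syllable counting is beyond this sketch):

```python
# Flesch-Kincaid Reading Ease from the published formula:
# FKRE = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
# Counts would normally come from a text parser; here they are supplied
# directly.

def fkre(total_words: int, total_sentences: int, total_syllables: int) -> float:
    return (206.835
            - 1.015 * (total_words / total_sentences)
            - 84.6 * (total_syllables / total_words))

# Hypothetical passage: 120 words, 8 sentences, 180 syllables.
score = fkre(120, 8, 180)  # 64.71, in the 'standard' band [61-70]
```

Longer sentences and more syllables per word both lower the score, which is why dense clinical prose tends to land in the ‘difficult’ band [0–60].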

Statistical methods

Standard data analyses were performed in Microsoft Excel and IBM® SPSS software (Build 1.0.0.1275). Descriptive statistics (mean, standard deviation, range) are reported. Correlations between the DISCERN and FKRE scores were assessed via Spearman’s rank correlation coefficient. Statistical significance was defined as P≤0.05.
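Spearman’s rank correlation operates on ranks rather than raw scores, making it robust to the non-normal score distributions seen here. A minimal pure-Python sketch is below (ties receive mean ranks); in practice SPSS, or a library routine such as scipy.stats.spearmanr, computes the same quantity, and the function names here are illustrative:

```python
# Spearman's rank correlation: Pearson correlation applied to ranks.

def _ranks(values):
    """Assign 1-based ranks, averaging over tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with values[order[i]].
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Perfectly concordant score pairs give ρ=1 and perfectly discordant pairs ρ=−1; the 2024 result of ρ=−0.467 sits between independence and perfect inversion.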


Results

The search yielded 90 websites in each of 2019 and 2024, of which 31 and 35 eligible websites were subjected to analysis, respectively (Figures 2,3).

Website quality

The DISCERN scores for the 31 evaluated websites in 2019 ranged from 21 to 63, with a mean score of 36.6 (standard deviation =11.98) (Figure S1). Against the previously published categorisations of DISCERN scores, a mean of 36.6 corresponds to ‘fair’ quality (29,33). Using this stratification, 11 websites were concluded to be of ‘poor’ quality [15–28], 13 were ‘fair’ [29–41], 4 were ‘good’ [42–54] and 3 were ‘very good’ [55–67]. No websites scored ‘excellent’ [68–80].

In 2024, the DISCERN scores for the 35 evaluated websites ranged from 32 to 68, with a mean score of 53.1 (standard deviation =8.63), corresponding to ‘good’ quality (Figure S2). No websites were concluded to be of ‘poor’ quality, 5 were of ‘fair’ quality, 20 were deemed ‘good’, 9 were evaluated as ‘very good’ and 1 website scored ‘excellent’.

Website readability

In 2019, the FKRE scores ranged from 31 to 84 (‘difficult’ to ‘easy’ reading ease), with a mean score of 57.9 (standard deviation =11.28) (Figure S1). Of the 31 websites appraised, 3 were adjudged to be of ‘easy’ readability [71–100], 12 were rated ‘standard’ [61–70] and 16 were ‘difficult’ to read [0–60].

In 2024, FKRE scores ranged from 16 to 80, with a mean score of 54.3 (standard deviation =13.17) (Figure S2). Of the 35 websites appraised, 2 were categorised as being of ‘easy’ readability, 8 were of ‘standard’ readability and 25 were determined to be ‘difficult’ to read. The calculated DISCERN and FKRE scores are summarised in Table 1 and visualised in the box and whisker plots in Figures 5,6.

Table 1

Summary of website DISCERN and FKRE scores in 2019 and 2024 showing the distribution of resource quality and readability

Instrument scores        No. of websites
                         2019          2024
DISCERN scores
   Poor [15–28]          11            0
   Fair [29–41]          13            5
   Good [42–54]          4             20
   Very good [55–67]     3             9
   Excellent [68–80]     0             1
   Total                 31            35
   Mean (SD)             36.6 (5.5)    53.1 (8.1)
FKRE score
   Difficult [0–60]      16            25
   Standard [61–70]      12            8
   Easy [71–100]         3             2
   Total                 31            35
   Mean (SD)             57.9 (6.7)    54.3 (11.9)

DISCERN, DIssembling Information on Treatment Choices for Patients; FKRE, Flesch-Kincaid Reading Ease; SD, standard deviation.

Figure 5 Box and whisker plot for FKRE scores. FKRE, Flesch-Kincaid Reading Ease.
Figure 6 Box and whisker plot for DISCERN scores. DISCERN, DIssembling Information on Treatment Choices for Patients.

The ResMed website was the most consistent resource in relation to both quality and readability over the 5-year period (34). Although at times some sites had better scores in either FKRE or DISCERN (Figures 7,8), on balance, the ResMed website consistently provided a better standard across both categories. In 2019, ResMed was of standard readability and very good quality, improving to easy readability and good quality in 2024 (FKRE: 2019 =65, 2024 =74; DISCERN: 2019 =63, 2024 =50).

Figure 7 Top 5 DISCERN scores 2024 and 2019. DISCERN, DIssembling Information on Treatment Choices for Patients; FKRE, Flesch-Kincaid Reading Ease.
Figure 8 Top 5 FKRE scores 2024 and 2019. DISCERN, DIssembling Information on Treatment Choices for Patients; FKRE, Flesch-Kincaid Reading Ease.

Association between readability and quality

There was no observed correlation between the DISCERN and FKRE scores in 2019 (Spearman’s r=0.076, P=0.683). In the 2024 data, however, a statistically significant negative correlation was observed (Spearman’s r=−0.467, P=0.044). These relationships are demonstrated in the scatter plots in Figure 9 and Figure 10. In the 2024 data series, therefore, higher-quality information was associated with lower readability and vice versa.

Figure 9 Scatter plot of 2019 DISCERN scores vs. FKRE scores. No significant correlation was demonstrated (Spearman’s r=0.076, P=0.6831). DISCERN, DIssembling Information on Treatment Choices for Patients; FKRE, Flesch-Kincaid Reading Ease.
Figure 10 Scatter plot of 2024 DISCERN scores vs. FKRE scores. Statistically significant negative relationship (Spearman’s r=−0.467, P=0.044). DISCERN, DIssembling Information on Treatment Choices for Patients; FKRE, Flesch-Kincaid Reading Ease.

Inter-rater reliability

Cohen’s quadratically weighted kappa demonstrated a substantial degree of agreement between the independent raters using the DISCERN tool in both 2019 (κ=0.66) and 2024 (κ=0.75). Crucially, these values exceeded the acceptable level of agreement between independent raters (κ=0.4) defined by Charnock et al. (30), demonstrating that the tool was used appropriately by both reviewers in this context.


Discussion

The number of Australian patients who regularly access the internet has increased significantly in the past few decades. In 2017, 86% of Australian households reported having internet access, a significant increase from the 56% reported in 2005 (9). Although no updated data have been published since 2017, it can reasonably be extrapolated that this proportion would be far higher in 2024. Information available on the internet is typically readily accessible, widely available and free of charge. These features have prompted ever more Australian patients to research their medical conditions before attending a clinic (9). However, as the publication of information on the internet is largely unregulated, online healthcare information varies widely in accuracy, utility and educational value (35). This should raise questions about the dependability of the non-scientific resources that such search engines return.

Our study results confirmed a high degree of variance in the quality of online health information regarding OSA. The study confirmed the hypothesis that the quality of information improved over the 5-year timeframe; however, this came at the expense of readability, with resources on average slightly harder to read. The majority of websites examined in 2019 were adjudged to be of ‘poor’ or ‘fair’ quality by the DISCERN instrument, whereas the clear majority in 2024 fell in the ‘good’ quality category. In terms of readability, most websites in both 2019 and 2024 scored ‘difficult’ according to the FKRE score.

Furthermore, we observed no strong correlation between quality and readability amongst the websites examined in 2019 (r=0.076, P=0.683), which aligns with studies of quality and readability in other surgical fields (36,37). However, in 2024 a significant inverse relationship between the quality and readability of resources was observed (r=−0.467, P=0.044), likely a result of increasingly detailed, high-quality information becoming available over the past 5 years.

The NSW government recommends patient resources be written at a level comprehensible to a person in Secondary School Grade 9 or below, corresponding to an FKRE score of 60–70. This is because 96% of people in NSW completed Grade 9 or above at secondary school, meaning the majority of Australians should be able to understand such resources (38). Given that only 23% of the 2024 resources fell in this category and 71% were more difficult than this to read, it is clear that while patient resources have improved in quality over time, their readability has declined such that the majority are now too difficult for the average Australian to comprehend. Clinicians should be aware of the information consulted by patients online so that they may better participate in shared dialogue with their patients and address any gaps in their knowledge. This awareness may also prompt clinicians to publish their own informative websites providing high-quality, easily readable content that they can recommend to their patients.

This study has found that, among the evaluated websites, ResMed consistently scored well on both quality and readability metrics, and clinicians can confidently refer patients to this resource for information. ResMed is an Australian company that provides equipment and services to patients, such as sleep assessments and devices to treat sleep apnoea; there is thus an intrinsic commercial bias in the content it provides. Despite this, with a clear target audience, expertise in the niche of sleep medicine and a vested interest in patient comprehension, the ResMed site has consistently been amongst the most readable while maintaining consistent quality scores. Other sites amongst the most readable (Figure 8) are primarily from government hospital/health services or non-profit medical blogs; however, their quality is consistently lacking, likely owing to poor resource allocation and a lack of the knowledge required to distil relevant, good-quality information into an easily comprehensible site. The highest-quality websites shifted from primarily medical blogs and general practitioner (GP) articles in 2019 to reviews authored by ENT surgeons in the National Library of Medicine and comprehensive best-practice tools targeted at clinicians in 2024 (Figure 7). This trend speaks to the increasing availability of high-quality online information that is less readable and more difficult for patients to understand.

Many similar readability and quality assessments of online health information have been reported in the literature. Examples are far-reaching and include analyses of abdominal aortic aneurysms, urological stents, shoulder arthritis and other medical conditions (36,37,39). In the ENT discipline, reviews have been conducted on tinnitus, subcapsular tonsillectomy and respiratory papillomatosis (27,40,41). Other studies related to ours are also noted across the literature. Chi et al. previously examined information available to patients concerning tonsillectomy and sleep apnoea in a paediatric context (22). In that study, websites on the topic were found to be of variable quality, with an average readability above that recommended for health publications (22). Similarly, the study elucidated a very weak correlation between quality (as assessed by the DISCERN instrument) and readability (determined via the subtly different Flesch Reading Ease score) of the websites encountered. That study differed from ours in that the searches were conducted via American iterations of popular search engines and utilised the combined search strategy of ‘tonsillectomy AND sleep apnea’; the tool used to assess readability also differed slightly from that employed in our inquiry (Flesch Reading Ease versus FKRE).

To our knowledge, this is the only study in the available literature to conduct a longitudinal analysis of online ENT patient resources for readability and quality. Continuity of authors and methodology across half a decade, allowing accurate comparison of the complexity and content of information, makes this study a novel contribution to the literature on OSA resources.

This study had some limitations that could be addressed in future enquiry. The searches were conducted via an Australian-based IP address and machine, which would preferentially retrieve Australian websites and resources. Furthermore, the search engines used were selected owing to their majority market share in Australia, and search engine usage trends are known to fluctuate across both countries and time. Additionally, as only English-language resources were retrieved via these search strategies, the exclusion of non-English resources limits generalisability. This study design may therefore have less applicability to patients and clinicians outside of Australia.

This study also did not account for patients who use alternate methods of retrieving healthcare information online. For example, patients may already depend on certain websites to conduct health-based research and hence would not use a search engine to obtain information. This is true not only of open access websites, such as government or consumer health group websites, but also paid applications, such as UpToDate® and Medscape®. While the latter is an example of a website primarily marketed for clinicians’ use, the layman’s version of certain pages exists to aid patient education. As our study only included websites that do not require a login, such websites were omitted from the analysis. Despite this, patients still appear more likely to utilise search engines when pursuing healthcare information. For example, a survey that examined internet usage behaviours in obtaining health-related resources determined that 77% of online health seekers begin their searches on either Google, Yahoo! or Bing (42).

Another limitation of our study is that only the first 15 search results for each search term were included in our analysis. Newer websites (as well as websites of lower quality) are typically indexed beyond the first page, meaning such resources were overlooked in this study. However, 75–90% of users do not click through to the second page of search results (43); our search strategy was therefore likely to represent that of a typical Australian patient online.


Conclusions

Although the quality of information has improved over time, its readability has slightly worsened to a level that most Australians would struggle to comprehend. More concerningly, the resources of superior quality were found to be more difficult to read. The internet has irreversibly altered the landscape of healthcare literature as it provides patients with a vast amount of information constantly at their fingertips. However, much of the information that patients encounter is of poor quality and conveyed in a manner that many may find difficult to interpret, thus impairing decision-making. Among the evaluated websites, ResMed consistently scored well in both quality and readability metrics.


Acknowledgments

None.


Footnote

Peer Review File: Available at https://www.theajo.com/article/view/10.21037/ajo-25-4/prf

Funding: None.

Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at https://www.theajo.com/article/view/10.21037/ajo-25-4/coif). The authors have no conflicts of interest to declare.

Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. The study was discussed with the ENT Research Development team at the Princess Alexandra Hospital, and no ethics approval is required as only publicly accessible data is used.

Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.


References

  1. Jordan AS, McSharry DG, Malhotra A. Adult obstructive sleep apnoea. Lancet 2014;383:736-47. [Crossref] [PubMed]
  2. Hamilton GS, Joosten SA. Obstructive sleep apnoea and obesity. Aust Fam Physician 2017;46:460-3.
  3. Paul C, Rose S, Hensley M, et al. Examining uptake of online education on obstructive sleep apnoea in general practitioners: a randomised trial. BMC Res Notes 2016;9:350. [Crossref] [PubMed]
  4. Somers VK, White DP, Amin R, et al. Sleep apnea and cardiovascular disease: an American Heart Association/American College of Cardiology Foundation Scientific Statement from the American Heart Association Council for High Blood Pressure Research Professional Education Committee, Council on Clinical Cardiology, Stroke Council, and Council on Cardiovascular Nursing. J Am Coll Cardiol 2008;52:686-717. [Crossref] [PubMed]
  5. Parati G, Lombardi C, Narkiewicz K. Sleep apnea: epidemiology, pathophysiology, and relation to cardiovascular risk. Am J Physiol Regul Integr Comp Physiol 2007;293:R1671-83. [Crossref] [PubMed]
  6. George CF. Reduction in motor vehicle collisions following treatment of sleep apnoea with nasal CPAP. Thorax 2001;56:508-12. [Crossref] [PubMed]
  7. Peppard PE, Young T, Palta M, et al. Prospective study of the association between sleep-disordered breathing and hypertension. N Engl J Med 2000;342:1378-84. [Crossref] [PubMed]
  8. Hillman DR, Murphy AS, Pezzullo L. The economic cost of sleep disorders. Sleep 2006;29:299-305. [Crossref] [PubMed]
  9. Australian Bureau of Statistics. Household Use of Information Technology, Australia, 2016-17 [updated 28 March 2018]. Available online: https://www.abs.gov.au/ausstats/abs@.nsf/mf/8146.0
  10. Tassone P, Georgalas C, Patel NN, et al. Do otolaryngology out-patients use the internet prior to attending their appointment? J Laryngol Otol 2004;118:34-8. [Crossref] [PubMed]
  11. Glynn RW, O'Duffy F, O'Dwyer TP, et al. Patterns of Internet and smartphone use by parents of children attending a pediatric otolaryngology service. Int J Pediatr Otorhinolaryngol 2013;77:699-702. [Crossref] [PubMed]
  12. Krumholz HM, Amatruda J, Smith GL, et al. Randomized trial of an education and support intervention to prevent readmission of patients with heart failure. J Am Coll Cardiol 2002;39:83-9. [Crossref] [PubMed]
  13. Volk RJ, Spann SJ, Cass AR, et al. Patient education for informed decision making about prostate cancer screening: a randomized controlled trial with 1-year follow-up. Ann Fam Med 2003;1:22-8. [Crossref] [PubMed]
  14. Krist AH, Woolf SH, Johnson RE, et al. Patient education on prostate cancer screening and involvement in decision making. Ann Fam Med 2007;5:112-9. [Crossref] [PubMed]
  15. Cline RJ, Haynes KM. Consumer health information seeking on the Internet: the state of the art. Health Educ Res 2001;16:671-92. [Crossref] [PubMed]
  16. Charnock D. The DISCERN Handbook. Oxford: Radcliffe Medical Press; 1998.
  17. Flesch R. A new readability yardstick. J Appl Psychol 1948;32:221-33. [Crossref] [PubMed]
  18. Kincaid JP, Fishburne RP Jr, Rogers RL, et al. Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Institute for Simulation and Training; 1975. Report No. 8-75.
  19. Kumar VS, Subramani S, Veerapan S, et al. Evaluation of online health information on clubfoot using the DISCERN tool. J Pediatr Orthop B 2014;23:135-8. [Crossref] [PubMed]
  20. Beauharnais CC, Aulet T, Rodriguez-Silva J, et al. Assessing the Quality of Online Health Information and Trend Data for Colorectal Malignancies. J Surg Res 2023;283:923-8. [Crossref] [PubMed]
  21. Cerminara C, Santarone ME, Casarelli L, et al. Use of the DISCERN tool for evaluating web searches in childhood epilepsy. Epilepsy Behav 2014;41:119-21. [Crossref] [PubMed]
  22. Chi E, Jabbour N, Aaronson NL. Quality and readability of websites for patient information on tonsillectomy and sleep apnea. Int J Pediatr Otorhinolaryngol 2017;98:1-3. [Crossref] [PubMed]
  23. Bhat A, Nesmith W, Durr ML, et al. Analysis of Internet-Based Written Materials on Surgery for Obstructive Sleep Apnea. OTO Open 2024;8:e158. [Crossref] [PubMed]
  24. Cheong RCT, Unadkat S, Mcneillis V, et al. Artificial intelligence chatbots as sources of patient education material for obstructive sleep apnoea: ChatGPT versus Google Bard. Eur Arch Otorhinolaryngol 2024;281:985-93. [Crossref] [PubMed]
  25. Incerti Parenti S, Bartolucci ML, Biondi E, et al. Online Patient Education in Obstructive Sleep Apnea: ChatGPT versus Google Search. Healthcare (Basel) 2024;12:1781. [Crossref] [PubMed]
  26. Jongbloed WM, Grover N. The utility of Chat Generative Pre-trained Transformer as a patient resource in paediatric otolaryngology. J Laryngol Otol 2024;138:1115-8. [Crossref] [PubMed]
  27. McKearney RM, MacKinnon RC, Smith M, et al. Tinnitus information online - does it ring true? J Laryngol Otol 2018;132:984-9. [Crossref] [PubMed]
  28. WebAlive. What Are Australia’s Major Search Engines and Directories? WebAlive; 2018 [cited January 10, 2018]. Available online: https://www.webalive.com.au/australias-major-search-engines-and-directories/
  29. Daraz L, Macdermid JC, Wilkins S, et al. The quality of websites addressing fibromyalgia: an assessment of quality and readability using standardised tools. BMJ Open 2011;1:e000152. [Crossref] [PubMed]
  30. Charnock D, Shepperd S, Needham G, et al. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 1999;53:105-11. [Crossref] [PubMed]
  31. WebFX. Readability Test Tool. 2019 [updated August 1, 2019]. Available online: https://www.webfx.com/tools/read-able/
  32. D'Alessandro DM, Kingsley P, Johnson-West J. The readability of pediatric patient education materials on the World Wide Web. Arch Pediatr Adolesc Med 2001;155:807-12. [Crossref] [PubMed]
  33. Hargrave DR, Hargrave UA, Bouffet E. Quality of health information on the Internet in pediatric neuro-oncology. Neuro Oncol 2006;8:175-82. [Crossref] [PubMed]
  34. ResMed. Sleep apnea. ResMed Australia. [cited 2024 Oct 6]. Available online: https://www.resmed.com.au/sleep-apnea
  35. Park E, Kim H, Steinhoff A. Health-Related Internet Use by Informal Caregivers of Children and Adolescents: An Integrative Literature Review. J Med Internet Res 2016;18:e57. [Crossref] [PubMed]
  36. Mozafarpour S, Norris B, Borin J, et al. MP68-20 Quality and readability of online information about ureteral stents. J Urol 2018;199:e924-5. [Crossref] [PubMed]
  37. Bailey M, Coughlin P, Griffin K, et al. Abdominal aortic aneurysm repair: quality and readability of online patient information. Br J Surg 2011;98:117. [Crossref] [PubMed]
  38. Puliuvea E. The importance of reading level for accessibility. NSW Government; 15 August 2023. Available online: https://www.nsw.gov.au/nsw-government/onecx-program/blog/why-reading-level-important
  39. Davis DE, Stoll L, Brolin T, et al. Quality and readability of online resources regarding shoulder osteoarthritis. Current Orthopaedic Practice 2017;28:474-8.
  40. Wong K, Levi JR. Partial Tonsillectomy. Ann Otol Rhinol Laryngol 2017;126:192-8. [Crossref] [PubMed]
  41. San Giorgi MRM, de Groot OSD, Dikkers FG. Quality and readability assessment of websites related to recurrent respiratory papillomatosis. Laryngoscope 2017;127:2293-7. [Crossref] [PubMed]
  42. Pew Research Center. Health fact sheet. 2013 [updated 2019]. Available online: https://www.pewresearch.org/internet/2013/01/15/health-online-2013/
  43. Goingclear Interactive. Google’s First Page: By the Numbers. 2018 [cited September 15, 2018]. Available online: https://goingclear.com/googles-first-page-by-the-numbers/
doi: 10.21037/ajo-25-4
Cite this article as: Theile W, Erian CKTG, Erian MMAS, Kitipornchai L. Obstructive sleep apnoea patient resources—a longitudinal analysis of the quality and readability over a 5-year timespan. Aust J Otolaryngol 2025;8:44.