What is the effect of the Informed Health Choices secondary school intervention on the ability of students in Rwanda to think critically about health choices after one-year follow-up? A cluster-randomized trial

Abstract

Aim

The aim of this study was to evaluate the effects of the Informed Health Choices secondary school intervention on the ability of students in Rwanda to think critically and make informed health choices after 1 year.

Methods

This was a two-arm cluster-randomized trial conducted in 84 lower secondary schools from 10 districts representing the five provinces of Rwanda. We used stratified randomization to allocate schools 1:1 to the intervention or control arm. One class in each intervention school had ten 40-min lessons taught by a trained teacher in addition to the usual curriculum. Control schools followed the usual curriculum. The primary outcome was a passing score (≥ 9 out of 18 questions answered correctly) for students on the Critical Thinking about Health Test completed 1 year after the intervention. We conducted an intention-to-treat analysis using generalized linear mixed models, accounting for the cluster design using random intercepts.

Results

After 1 year, 35 of 42 teachers (83.3%) and 1181 of 1556 students (75.9%) in the control arm completed the test. In the intervention arm, 35 of 42 teachers (83.3%) and 1238 of 1572 students (78.8%) completed the test. The proportion of students who had a passing score in the intervention arm was 625/1238 (50.5%) compared to 230/1181 (19.5%) in the control arm (adjusted odds ratio 7.6 [95% CI: 4.6–12.6], p < 0.0001). The adjusted difference in the proportion of students with a passing score was 32.2% (95% CI 24.5–39.8%).

Conclusions

The IHC secondary school intervention was effective after 1 year. However, the size of the effect was smaller than immediately after the intervention (adjusted difference 32.2% vs 37.2%) due to decay in the proportion of students in intervention schools with a passing score (50.5% vs 58.2%).

Trial registration

Pan African Clinical Trial Registry (PCTR), trial identifier: PACTR202203880375077. Registered on February 15, 2022.


Introduction

Over the past decade, the Informed Health Choices (IHC) network has worked on empowering the public to make informed health choices [1]. This is in response to the overwhelming amount of misinformation reaching the public through one-to-one communication, mass and social media, and other channels [1,2,3,4]. Exposure to misinformation and acting on poorly informed decisions may lead to harm and wasted resources. Systematic reviews of interventions to teach the public to think critically about health information have found only one small randomized trial of a school-based intervention to teach critical thinking about health to adolescents [5, 6]. To respond to this problem, the IHC network aimed to develop the capacity of young people to think critically about health claims, evidence, and choices. We have developed school-based educational interventions to do this [7, 8].

We initially developed and evaluated educational resources for primary school children [9,10,11]. The lessons in these resources focused on 12 prioritized key concepts. The key concepts are principles for thinking critically about whether to believe claims about the effects of interventions and for deciding what to do [12]. We also developed and evaluated a podcast for parents of primary school children that focused on nine key concepts [13,14,15].

Informed by this earlier work, we have developed educational resources for secondary schools. Adolescence is a critical age at which young people learn quickly [16]. We developed and evaluated the IHC secondary school intervention in East Africa (Kenya, Rwanda, and Uganda). We started by exploring the context for teaching critical thinking about health in lower secondary schools in East Africa [17,18,19]. This informed the design of digital learning resources that were suitable and scalable for the East African context. We worked with educational stakeholders to prioritize key concepts that were relevant to the targeted age group (13–15-year-old lower secondary school students) [20]. We employed a human-centered design approach to develop low-cost and scalable digital learning resources in close collaboration with students and teachers in East Africa [21].

To ascertain the effect of the intervention, we conducted randomized trials in Kenya, Rwanda, and Uganda [22,23,24]. The intervention was effective in all three countries [25]. We conducted process evaluations alongside the three trials and found that the intervention was largely implemented as intended [26,27,28]. Factors that facilitated effective implementation were that participants valued the content, the design of the resources, and teacher training workshops at the start of the school term when the 10 lessons included in the resources were taught. Factors that limited effective implementation were lack of printed materials for students, especially in schools with fewer computers, competing priorities, and the fact that schools focus on teaching what is prescribed in the curriculum and what is expected to be examined in national evaluations.

Systematic reviews have found little evidence of the effectiveness of educational interventions to teach adolescents to think critically about health [5, 6], and no evidence of the extent to which what is learned is retained. This study aimed to assess effects of the IHC secondary school intervention and retention of what was learned 1 year after the intervention was delivered in Rwanda.

Methods

Design

We report the findings after 1 year of a two-arm, cluster-randomized trial conducted in Rwanda; the 1-year follow-up data were collected between 12 and 23 June 2023. We previously reported findings immediately after the intervention [22]. We received ethical approval from the Rwanda National Ethics Committee (RNEC) (Approval No. 1019/RNEC/2020 and subsequent amendments No. 41/RNEC/2022 and No. 236/RNEC/2022). The trial protocol can be found online [26]. Before data collection, we obtained permission to conduct the study in schools from the Rwanda Basic Education Board. The trial was registered in the Pan African Clinical Trial Registry, trial identifier: PACTR202203880375077.

Setting and participants

The trial was conducted in lower secondary schools. We included private, public, and government-aided schools that followed the national curriculum. We recruited schools that had more than 100 students and 10 teachers, and that had computers and internet access. We excluded schools that were hard to reach and schools with a special needs or international curriculum.

We randomly selected 84 schools from 10 districts, two in each of the five provinces of Rwanda. We stratified schools by their performance on national examinations (low or high, as defined by the Rwanda National Examination and School Inspection Authority), and the number of schools selected per district was proportionate to the total number of schools in that district. We selected one science teacher and one second-year class in each school. Details of the setting, participants, and recruitment process can be found in our previous report [22]. We sought consent and assent from teachers, school directors, and students.

Random allocation and masking

We allocated schools in a 1:1 ratio to the intervention or control arm. We used block randomization to balance for school performance, with block sizes of six and four and equal numbers in each arm. Allocation was concealed and was conducted by a statistician who was not involved in the recruitment of schools or the analysis of data. The allocation list was not changed after randomization. We did not mask the trial participants or investigators.

Procedures

The intervention included ten lessons covering nine key concepts [8] taught in a single school term, in addition to the usual curriculum. The lessons were taught using the digital educational resources that we developed [21]. The resources included lesson plans and background information for each lesson, a teachers’ guide, a glossary, and other resources. Each lesson could be taught using a projector, if one was available, or a blackboard. All the schools in the Rwandan trial had projectors. The teachers who taught the lessons attended a 3-day teacher training workshop before teaching the lessons. Participants in the control schools followed the usual curriculum. A detailed description of the intervention is provided using the GREET checklist in Supplementary File S1 [29].

Students and teachers in both control and intervention schools completed the Critical Thinking about Health Test (Supplementary File S2) at the end of the school term and 1 year later. We developed this test to measure the ability of students to understand and apply the key concepts taught in the intervention arm. The questions were taken from the Claim Evaluation Tools item bank [30]. Based on cognitive interviews, we made minor modifications to the questions to ensure they were correctly understood and appropriate. We conducted a Rasch analysis to assess the validity and reliability of the test [31]. We used a combination of the Nedelsky and Angoff methods to determine the cut-off for passing and mastery scores [32].

Research assistants administered the test; those who were not available for the 1-year follow-up were replaced by new ones. We trained all research assistants before data collection. They supervised the test in schools to ensure independent answering. After the test, the research assistants scanned the answer sheets.

Outcomes

The primary outcome was the proportion of students with a passing score (≥ 9 out of 18 questions answered correctly) on the Critical Thinking about Health Test. Secondary outcomes were the proportion of teachers with a passing score, the proportion of students and teachers with a mastery score (≥ 14 out of 18), students’ and teachers’ mean scores (percent correct answers for the 18 multiple-choice questions), the proportion of students that answered both questions correctly for each of the nine concepts, intended behaviors, and self-efficacy.

Statistical analysis

We computed the sample size based on the primary outcome, using the University of Aberdeen Health Services Research Unit’s Cluster Sample Size Calculator [33]. We made the following assumptions: 39 students per cluster (one class in each school) based on education statistics [34]; an intraclass correlation of 0.19 and 30% of students achieving a passing score in the control arm, based on a previous trial in primary schools; a minimally important difference of 20%, based on at least 50% of students in the intervention arm having a passing score; an alpha of 1%; power of 90%; and a maximum 10% loss to follow-up. Based on these assumptions, we calculated a sample size of 84 schools.
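These assumptions can be checked approximately with the standard two-proportion sample-size formula inflated by a design effect for clustering. This is a minimal sketch, not the Aberdeen calculator itself; the rounding conventions (rounding clusters up before the loss-to-follow-up adjustment) are assumptions:

```python
from math import ceil
from statistics import NormalDist

def clusters_per_arm(p_control, p_intervention, alpha, power, m, icc, loss):
    """Approximate number of clusters per arm for a two-arm
    cluster-randomized trial comparing two proportions."""
    z = NormalDist().inv_cdf
    # Individuals per arm if individuals were randomized
    numerator = (z(1 - alpha / 2) + z(power)) ** 2
    variance = p_control * (1 - p_control) + p_intervention * (1 - p_intervention)
    n_individuals = numerator * variance / (p_intervention - p_control) ** 2
    # Inflate for clustering: design effect = 1 + (m - 1) * ICC
    design_effect = 1 + (m - 1) * icc
    clusters = ceil(n_individuals * design_effect / m)
    return ceil(clusters / (1 - loss))  # allow for loss to follow-up

# Assumptions reported above: 30% vs 50% passing, alpha 1%, power 90%,
# 39 students per cluster, ICC 0.19, 10% loss to follow-up
k = clusters_per_arm(0.30, 0.50, 0.01, 0.90, 39, 0.19, 0.10)
print(k, 2 * k)  # 42 clusters per arm -> 84 schools in total
```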

In the analysis, we estimated adjusted odds ratios and differences in means for binomial and continuous outcomes, respectively. We estimated adjusted odds ratios using mixed effects logistic regression. Adjusted differences in means were estimated using mixed effects linear regression. For outcomes measured at the level of student, we accounted for the cluster-randomized design using random intercepts at the level of school (the unit of randomization). Because there was a one-to-one relationship between teachers and schools, it was not necessary to account for clustering at the level of teachers. Except where noted below, all analyses were adjusted for the variable used in the stratified random allocation (low versus high school performance). To aid interpretation, we re-expressed odds ratios as adjusted differences, accounting for uncertainty of the odds in the control arm as well as the odds ratios. Missing test answers were counted as wrong answers. We followed the intention-to-treat principle throughout: all children and teachers who completed the test were included and analyzed in the arms to which they were allocated. We have reported 95% confidence intervals and two-sided p values, where appropriate, throughout. All statistical analyses were performed using Stata 16 (StataCorp LLC, College Station, Texas, USA).
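At its simplest, re-expressing an odds ratio as a difference in proportions means applying the odds ratio to the control-arm odds. The sketch below shows only that point-estimate arithmetic; the trial’s published differences additionally propagate uncertainty in both the control-arm odds and the odds ratio, so they are not reproducible from this conversion alone:

```python
def or_to_risk_difference(p_control, odds_ratio):
    """Re-express an odds ratio as a difference in proportions,
    taking the control-arm proportion as the baseline."""
    odds_control = p_control / (1 - p_control)
    odds_intervention = odds_ratio * odds_control
    p_intervention = odds_intervention / (1 + odds_intervention)
    return p_intervention - p_control

# With 50% risk in the control arm, an odds ratio of 2 implies a 2/3 risk
# in the intervention arm, i.e. a difference of about 16.7 percentage points.
print(round(or_to_risk_difference(0.5, 2.0), 3))  # 0.167
```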

We conducted two prespecified sensitivity analyses to explore the risk of bias due to attrition: inverse probability weighting (IPW) and Lee bounds. We calculated Lee bounds for mean score for students [35, 36]. These analyses provide sharp bounds on treatment effect under conditions in which missing outcomes maximally favor or disfavor the intervention. It was not possible to estimate Lee bounds for the teachers because there was no imbalance in the number of teachers lost to follow-up. To account for the cluster-randomized design for the students, we computed confidence intervals using imputed design effects to inflate the variances of the estimators. A design effect for a particular outcome was imputed as the ratio of the variance of the IPW effect estimate (which does account for the cluster design) to the variance of an estimate from the same model without the random intercepts term. It was not possible to adjust these analyses for the stratification variables.
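Lee bounds work by trimming the arm with the higher response rate so that both arms retain the same fraction of those randomized, then taking the mean after dropping either the highest outcomes (lower bound) or the lowest (upper bound). A minimal sketch of that trimming step, assuming the treated arm has the higher response rate and ignoring the clustering and design-effect adjustments described above:

```python
from statistics import mean

def lee_bounds(treated, control, n_treated, n_control):
    """Sharp bounds on a difference in mean outcomes under differential
    attrition (Lee's trimming bounds). `treated` and `control` hold the
    outcomes of responders only; n_treated and n_control are the numbers
    originally randomized. Assumes the treated arm responded more."""
    p_t = len(treated) / n_treated      # response rate, treated arm
    p_c = len(control) / n_control      # response rate, control arm
    q = (p_t - p_c) / p_t               # excess fraction of responders to trim
    keep = round(len(treated) * (1 - q))
    ranked = sorted(treated)
    lower = mean(ranked[:keep]) - mean(control)   # drop the highest outcomes
    upper = mean(ranked[-keep:]) - mean(control)  # drop the lowest outcomes
    return lower, upper

# Toy illustration: 4/5 responders in the treated arm, 3/5 in the control arm
bounds = lee_bounds([1, 2, 3, 4], [1, 2, 3], 5, 5)
```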

To put the effect of the intervention in the context of the effect sizes reported for other interventions to improve critical thinking or learning in schools, we estimated Hedges’ g (a standardized mean difference) for the adjusted difference in students’ mean scores. This was estimated as the ratio of the adjusted difference to within-cluster standard deviation [37].
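Hedges’ g as used here is simply the adjusted difference divided by the within-cluster standard deviation. A sketch with illustrative numbers, not the trial’s estimates:

```python
def hedges_g(adjusted_difference, within_cluster_sd):
    """Standardized mean difference: the adjusted difference in mean
    scores divided by the within-cluster standard deviation."""
    return adjusted_difference / within_cluster_sd

# Illustration only: a 16-point adjusted difference with a within-cluster
# SD of 20 points gives g = 0.8, a large effect by common benchmarks.
print(hedges_g(16.0, 20.0))  # 0.8
```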

We estimated adjusted odds ratios comparing students’ ability to correctly answer both multiple-choice questions for each of the nine concepts and present these results as a forest plot. For questions about intended behaviors and self-efficacy, we report numbers and percentages of students for each response option and estimates of adjusted odds ratios comparing dichotomized responses (e.g., very unlikely or unlikely, versus very likely or likely).

We performed two planned subgroup analyses as described in our trial protocol [22]. In the first, we estimated treatment effects for the primary outcome in schools with high and low performance as defined by National Examination and School Inspection Authority (NESA). In the second, we estimated treatment effects for the primary outcome in students whose English reading proficiency was assessed to be advanced, basic, or lacking. Students who correctly answered all four literacy questions in the Critical Thinking about Health Test were categorized as having advanced proficiency. Students who answered both basic questions correctly and one or both of the advanced questions incorrectly were categorized as having basic proficiency. Students who did not correctly answer both basic questions were categorized as lacking basic reading proficiency. For each subgroup analysis, we estimated odds ratios for the interactions between treatment and the variable defining the subgroups. We report these alongside p values testing hypotheses of no interaction.
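The reading-proficiency categorization described above can be expressed as a simple decision rule over the counts of correctly answered basic and advanced literacy questions (the function name and signature are illustrative):

```python
def reading_proficiency(basic_correct, advanced_correct):
    """Classify English reading proficiency from the four literacy
    questions in the test (two basic, two advanced), given the count
    of correct answers in each pair."""
    if basic_correct < 2:
        return "lacking"       # missed at least one basic question
    if advanced_correct == 2:
        return "advanced"      # all four literacy questions correct
    return "basic"             # both basic correct, >= 1 advanced wrong

print(reading_proficiency(2, 2))  # advanced
print(reading_proficiency(2, 1))  # basic
print(reading_proficiency(1, 2))  # lacking
```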

Lastly, we assessed whether the students who were randomized to the intervention liked the lessons, found them easy, and found them helpful. We report numbers and percentages of students for each response option as well as for dichotomized responses (e.g., liked the lessons a little or very much versus disliked the lessons a little or a lot).

Results

Characteristics of trial participants after 1 year

Between February 25, 2022, and March 29, 2022, we recruited 3128 students in second year of lower secondary and 84 science teachers. We randomly assigned 42 schools (1556 students and 42 teachers) to the control arm and 42 schools (1572 students and 42 teachers) to the intervention arm. All 84 schools participated in the follow-up study. After 1 year, 35 of 42 teachers (83.3%) and 1181 of 1556 students (75.9%) in the control arm completed the test. In the intervention arm, 35 of 42 teachers (83.3%) and 1238 of 1572 students (78.8%) completed the test after 1 year. Figure 1 shows the flow of schools, teachers, and students.

Fig. 1 Flow diagram of study participants in the 1-year follow-up trial

Characteristics of the schools, teachers, and students who participated in the trial are summarized in Table 1. The median number of students per class who completed the test after 1 year was 28 in the control arm and 32 in the intervention arm (compared to 39 and 40 per class at the end of the school term when the intervention was delivered). The percentage of female students who completed the test was 56.3% in the control arm compared to 56.2% in the intervention arm (compared to 53.8% and 56.0% at the end of the school term). The mean age of students who completed the test was 15.6 years at the time of the intervention in both arms (compared to 15.8 and 15.7 at the end of the school term). The number of teachers with a bachelor’s degree was 15/42 (35.7%) in the control arm compared to 26/42 (61.9%) in the intervention arm.

Table 1 Characteristics of participants in the trial

Main findings of the trial after 1 year

In the intervention arm, 625/1238 (50.5%) of the students had a passing score compared to 230/1181 (19.5%) in the control arm (adjusted odds ratio 7.6 [95% CI 4.6–12.6], p < 0.0001) (Table 2). The adjusted difference was 32.2% (95% CI 24.5–39.8%).

Table 2 Main results of the primary and secondary outcomes of the trial

In the intervention arm, 228/1238 (18.4%) of the students had a mastery score compared to 28/1181 (2.4%) in the control arm (adjusted odds ratio 11.7 [95% CI 5.4–25.2], p < 0.0001). The adjusted difference was 14.3% (95% CI 9.5–19.1%). The mean test score among students in the intervention arm was 50.6% (SD: 23.6%) compared to 34.7% (SD: 16.9%) in the control arm (adjusted mean difference 16.0% [95% CI 11.8–20.1%], p < 0.0001).

Among the teachers who completed the test after 1 year, 31/35 (88.6%) had a passing score in the intervention arm compared to 14/35 (40.0%) in the control arm (adjusted odds ratio 12.1 [3.4–42.7], p < 0.0001). The adjusted difference was 48.6% (95% CI 29.4–67.8%). In the intervention arm, 20/35 (57.1%) of the teachers had a mastery score compared to 1/35 (2.9%) in the control arm (adjusted odds ratio 45.3 [5.6–368.5], p < 0.0001). The adjusted difference was 54.3% (95% CI 37.0–71.6%). The mean test score among teachers in the intervention arm was 75.2% (SD: 17.5%) compared to 43.3% (SD: 14.9%) in the control arm (adjusted mean difference 31.9% [95% CI: 24.4–39.4%], p < 0.0001).

Performance of students on each concept

The proportion of students who answered both questions correctly was higher in the intervention arm than in the control arm for all nine concepts (Fig. 2). The adjusted difference was largest for the concept “Be cautious of small studies.” For that concept, 396/1238 (32.0%) answered both questions correctly in the intervention arm compared to 94/1181 (8.0%) in the control arm (adjusted difference 23.2% [95% CI 18.1–28.2%]). The adjusted difference was smallest for the concept “Weigh the benefits and savings against the harms and costs of acting or not.” For that concept, 313/1238 (25.3%) answered both questions correctly in the intervention arm compared to 198/1181 (16.8%) in the control arm (adjusted difference 7.2% [95% CI 2.3–12.2%]).

Fig. 2 Results for each key concept covered in the 1-year follow-up trial. p < 0.0001 for all comparisons. 1Number (%) of students answering both MCQs correctly. 2Adjusted odds ratios are re-expressed as adjusted risk differences. 3Intraclass correlation coefficient

Subgroup analysis on school performance and English proficiency

The effect of the intervention was similar in the high and low performing schools. The odds ratio of the interaction between school performance and the intervention was 0.9 (95% CI 0.3–2.6, p = 0.89) (Table 3).

Table 3 Subgroup analyses on school performance and English proficiency

The effect of the intervention was also similar for students with advanced and basic English reading proficiency (OR 0.7 [95% CI 0.4–1.2], p = 0.19) (Table 3). Students who lacked English reading proficiency were less likely to achieve a passing score compared to those with advanced reading proficiency (OR 0.3 [95% CI 0.2–0.5], p < 0.0001). However, the intervention was effective for students who lacked English proficiency (OR 3.9 [95% CI 2.1–7.3], p < 0.0001).

Self-efficacy and intended behaviors of students who participated in the trial after 1 year

There were only small differences in the proportion of students in the intervention and control arms who found it easy or very easy to know if a claim about treatment was based on a research study comparing treatments (adjusted difference 7.8% [95% CI 2.1 to 13.5%]), to find information about treatments that is based on research (adjusted difference 3.4% [95% CI − 2.5 to 9.2%]), to judge the trustworthiness of the results of a research study comparing treatments (adjusted difference 3.0% [95% CI − 2.6 to 8.6%]), or to judge the relevance of a research study comparing treatments (adjusted difference 5.5% [95% CI − 0.2 to 11.1%]) (Table S1).

There also were only small differences in the proportion of students in the intervention and control arms who were likely or very likely to find out the basis of the claim (adjusted difference 0.1% [95% CI − 5.6 to 5.9%]), to find out if a claim was based on a research study (adjusted difference 5.0% [95% CI − 0.6 to 10.7%]), or to participate in the research study if asked (adjusted difference 0.9% [95% CI − 3.7 to 5.6%]) (Table S2).

Most students in the intervention arm liked the lessons a little or very much (82.5%), found the lessons easy or very easy to understand (66.6%), and found what they learned helpful or very helpful (84.7%) (Table S3).

Sensitivity analysis

There were only small differences between the unweighted analysis and the analysis done using inverse probability weighting for students for passing scores (adjusted difference 32.2% vs 31.8%), mastery scores (adjusted difference 14.3% vs 13.7%), and the mean difference (16.0% vs 15.7%) (Tables 2 and 4).

Table 4 Sensitivity analysis

The Lee bounds were 14.2 to 17.5% for the adjusted difference in mean scores for students (95% CI 7.6–23.3%) (Table 4). It was not possible to calculate Lee bounds for the adjusted difference in mean scores for teachers because the number of teachers lost to follow-up was the same in the control and intervention arms.

Discussion

After 1 year, the IHC secondary school intervention was effective in improving students’ critical thinking about health claims compared to the usual curriculum. The effect for the primary outcome (passing scores) was smaller after 1 year compared to the end of the term when the intervention was delivered (adjusted difference 32.2% vs 37.2%). This was due to decay in the proportion of students in the intervention schools with a passing score compared to the end of the intervention term. Retention was 86.8% (50.5% passing in intervention schools after 1 year compared to 58.2% at the end of the intervention term) [22]. The proportion of students with a passing score in control schools was about the same at the end of the intervention term and after 1 year (19.5% vs 19.4%).

There also was a reduction in the proportion of students in the intervention schools with a mastery score and the size of the effect for mastery scores after 1 year compared to the end of the intervention term (adjusted difference 14.3% vs 22.3%). Retention was 78.3% (18.4% mastery in intervention schools after 1 year compared to 23.5% at the end of the intervention term). The effect on the mean score also was smaller after 1 year (adjusted difference 16.0% vs 20.8%). Retention for students in intervention schools was 91.3% (50.6% after 1 year compared to 55.4% at the end of the intervention term).

For teachers, the effect on passing scores was slightly less after 1 year (adjusted difference 48.6%) compared to the end of the intervention term (adjusted difference 50.0%). For mastery scores, the adjusted difference between the intervention and control schools was 54.3% after 1 year compared to 71.4% at the end of the intervention term. The adjusted difference for mean scores was 31.9% after 1 year compared to 36.9% at the end of the intervention term.

Both at the end of the intervention term and after 1 year, the intervention had similar effects in schools categorized as low and high performing. Students who had low English reading proficiency benefitted less from the intervention than students with advanced reading proficiency. Responses to the questions about self-efficacy, intended behaviors, and students’ perceptions of the lessons were also similar at the end of the intervention term and after 1 year.

The parallel 1-year follow-up studies in Kenya and Uganda had similar results (unpublished work). The proportion of students with a passing score in Kenya was 53.2% in intervention schools compared to 32.2% in control schools after 1 year (adjusted difference 21.2%, 95% CI 14.1–28.3%). In Uganda, the proportion of students with a passing score after 1 year was 53.4% compared to 33.1% (adjusted difference 21.9%, 95% CI 17.1–32.8%). Overall, across the three studies, 52.6% of students in intervention schools had a passing score after 1 year compared to 58.1% at the end of the intervention term (retention 90.5%). When adjusted for chance, retention was 88.3%. Taken together, these studies suggest that the findings are broadly applicable to schools that follow the national curriculum in East Africa. The extent to which the findings are applicable to schools that are hard to reach, special needs schools, schools that do not follow the national curriculum, and schools in other countries is uncertain.

A similar 1-year follow-up study of the effects of the IHC primary school intervention [9] found that the proportion of students with a passing score in intervention schools increased from 69.0% at the end of the intervention term to 80.1% after 1 year. However, the proportion of students in control schools with a passing score increased even more (from 26.8 to 51.5%), so the effect was smaller after 1 year (adjusted difference 39.5% vs 49.8%). No other randomized trials of school-based interventions to teach critical thinking about health have reported results after 1 year [38]. More broadly, a review of long-term retention of basic science knowledge found that decay in what students learn in school is common [39, 40].

A limitation of this study is the loss to follow-up. Overall, 22.7% of students who participated in the trial did not complete the CTH test after 1 year (24.1% loss to follow-up in control schools and 21.2% in the intervention schools). Loss to follow-up was 16.6% for teachers in both arms of the trial. However, based on the pre-specified sensitivity analyses we conducted, the loss to follow-up is unlikely to have substantially biased the effect estimates. The main reasons for both teachers and students being lost to follow-up were absence from school on the day the test was administered and change of school.

There were more teachers with a bachelor’s degree in the intervention schools compared to the control schools (61.9% vs 35.7%). This is unlikely to have biased the results for students, since critical thinking about health was not taught in control schools. We cannot rule out that it biased the results for teachers, but this seems unlikely since critical thinking is not included in the bachelor’s curriculum. It also is unlikely that it affected the applicability of the results. None of the teachers had taught critical thinking about health previously. All the teachers in intervention schools participated in the teacher training workshops, and all 42 teachers reported that the teacher training helped them to acquire the knowledge and skills needed to deliver the intervention [26].

Other limitations of this study are the same as those discussed in our report of the results at the end of the intervention term [20]: responses to the questions about self-efficacy, intended behaviors, and students’ perceptions of the lessons may have been biased to some extent by social desirability, and the Critical Thinking about Health Test was a treatment-inherent outcome measure.

Conclusion

The 1-year follow-up results and the initial results show that it is possible to teach critical thinking about health in Rwandan secondary schools. Understanding and the ability to apply the key concepts covered in the lessons was retained for at least a year by half of the students. However, there is a need to improve retention and the proportion of students who benefit from the lessons. One way of doing this would be to introduce subsequent lessons to reinforce what was learned and introduce new concepts. Another would be to modify the delivery of these lessons by, for example, increasing the amount of time allocated to teaching the lessons, providing students with direct access to printed or digital materials, integrating the lessons into the usual curriculum, and examining students’ ability to apply the concepts in national evaluations.

Data availability

All de-identified individual-participant data and the data dictionary will be made available on Zenodo. The study protocol with the detailed analysis plan can be found online at https://doi.org/10.5281/zenodo.6562788.

Abbreviations

CTH:

Critical Thinking about Health Test

IHC:

Informed Health Choices

NESA:

National Examination and School Inspection Authority

PACTR:

Pan African Clinical Trial Registry

RNEC:

Rwanda National Ethics Committee

References

  1. Informed Health Choices. https://www.informedhealthchoices.org/. Accessed 20 May 2024.

  2. Park E, Kwon M. Health-related internet use by children and adolescents: systematic review. J Med Internet Res. 2018;20(4):e120. https://www.jmir.org/2018/4/e120.

  3. Oxman M, Larun L, Pérez Gaxiola G, Alsaid D, Qasim A, Rose CJ, et al. Quality of information in news media reports about the effects of health interventions: systematic review and meta-analyses. F1000Research. 2021;10:433.

  4. Haber N, Smith ER, Moscoe E, Andrews K, Audy R, Bell W, et al. Causal language and strength of inference in academic and media articles shared in social media (CLAIMS): a systematic review. PLoS ONE. 2018;13(5):1–21.


  5. Cusack L, Del Mar CB, Chalmers I, Gibson E, Hoffmann TC. Educational interventions to improve people’s understanding of key concepts in assessing the effects of health interventions: a systematic review. Syst Rev. 2018;7(1):1–12.


  6. Nordheim LV, Gundersen MW, Espehaug B, Guttersrud O, Flottorp S. Effects of school-based educational interventions for enhancing adolescents’ abilities in critical appraisal of health claims: a systematic review. PLoS One. 2016;11(8):1–21.


  7. Nsangi A, Semakula D, Rosenbaum SE, et al. Development of the informed health choices resources in four countries to teach primary school children to assess claims about treatment effects: a qualitative study employing a user-centred approach. Pilot Feasibility Stud. 2020;6:18.


  8. Rosenbaum S, Moberg J, Oxman M, et al. Be smart about your health. 2022. https://besmarthealth.org/.

  9. Nsangi A, Semakula D, Oxman AD, Austvoll-Dahlgren A, Oxman M, Rosenbaum S, et al. Effects of the Informed Health Choices primary school intervention on the ability of children in Uganda to assess the reliability of claims about treatment effects: a cluster-randomised controlled trial. Lancet. 2017;390(10092):374–88.

  10. Nsangi A, Semakula D, Oxman AD, Austvoll-Dahlgren A, Oxman M, Rosenbaum S, et al. Effects of the Informed Health Choices primary school intervention on the ability of children in Uganda to assess the reliability of claims about treatment effects, 1-year follow-up: a cluster-randomised trial. Trials. 2020;21(1):1–22.

  11. Nsangi A, Semakula D, Glenton C, Lewin S, Oxman AD, Oxman M, et al. Informed health choices intervention to teach primary school children in low-income countries to assess claims about treatment effects: process evaluation. BMJ Open. 2019;9(9):e030787.

  12. Oxman AD, Aronson JK, Barends E, Boruch R, Brennan M, Chalmers I, et al. Key concepts for making informed choices. Nature. 2019;572:303–6.

  13. Semakula D, Nsangi A, Oxman M, et al. Development of mass media resources to improve the ability of parents of primary school children in Uganda to assess the trustworthiness of claims about the effects of treatments: a human-centred design approach. Pilot Feasibility Stud. 2019;5:155.

  14. Semakula D, Nsangi A, Oxman AD, et al. Effects of the Informed Health Choices podcast on the ability of parents of primary school children in Uganda to assess claims about treatment effects: a randomised controlled trial. Lancet. 2017;390(10092):389–98.

  15. Semakula D, Nsangi A, Oxman AD, et al. Effects of the Informed Health Choices podcast on the ability of parents of primary school children in Uganda to assess the trustworthiness of claims about treatment effects: one-year follow-up of a randomised trial. Trials. 2020;21:187.

  16. Dahl RE, Allen NB, Wilbrecht L, Suleiman AB. Importance of investing in adolescence from a developmental science perspective. Nature. 2018;554(7693):441–50.

  17. Mugisha M, Uwitonze AM, Chesire F, Senyonga R, Oxman M, Nsangi A, et al. Teaching critical thinking about health using digital technology in lower secondary schools in Rwanda: a qualitative context analysis. PLoS One. 2021;16(3 March):1–18.

  18. Chesire F, Ochieng M, Mugisha M, Ssenyonga R, Oxman M, Nsangi A, et al. Contextualizing critical thinking about health using digital technology in secondary schools in Kenya: a qualitative analysis. Pilot Feasibility Stud. 2022;8(1).

  19. Ssenyonga R, Sewankambo NK, Mugagga SK, Nakyejwe E, Chesire F, Mugisha M, et al. Learning to think critically about health using digital technology in Ugandan lower secondary schools: a contextual analysis. PLoS One. 2022;17(2):e0260367.

  20. Agaba JJ, Chesire F, Mugisha M, Nandi P, Njue J, Nsangi A, et al. Prioritisation of Informed Health Choices (IHC) key concepts to be included in lower secondary school resources: a consensus study. PLoS One. 2023;18(4):e0267422.

  21. Rosenbaum S, Moberg J, Chesire F, Mugisha M, et al. Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources [version 1; peer review: 1 approved with reservations]. F1000Research. 2023;12:481.

  22. Mugisha M, Nyirazinyoye L, Simbi CMC, Chesire F, Ssenyonga R, Oxman M, et al. Effects of the Informed Health Choices secondary school intervention on the ability of students in Rwanda to think critically about health choices: a cluster-randomised trial. J Evid Based Med. 2023;1–11.

  23. Chesire F, Kaseje M, Ochieng M, Mugisha M, Ssenyonga R, Oxman M, et al. Effects of the Informed Health Choices secondary school intervention on the ability of students in Kenya to think critically about health information for informed choices: a cluster-randomised trial. J Evid Based Med. 2023;1–10.

  24. Ssenyonga R, Oxman AD, Nakyejwe E, Chelagat F, Mugisha M, Oxman M, et al. Use of the Informed Health Choices educational intervention to improve secondary students’ ability to think critically about health interventions in Uganda: a cluster-randomised trial. J Evid Based Med. 2023;1–9.

  25. Chesire F, Mugisha M, Ssenyonga R, Rose CJ, Nsangi A, Kaseje M, et al. Effects of the Informed Health Choices secondary school intervention: a prospective meta-analysis. J Evid Based Med. 2023;1–11.

  26. Mugisha M, Oxman AD, Nyirazinyoye L, Uwitonze AM, Simbi CMC, Chesire F, et al. Process evaluation of teaching critical thinking about health using the Informed Health Choices intervention in Rwanda: a mixed methods study. Glob Health Sci Pract. 2024;12(6):e2300483. Available from: https://pubmed.ncbi.nlm.nih.gov/39706678/.

  27. Ssenyonga R, Lewin S, Nakyejwe E, Chelagat F, Mugisha M, Oxman M, et al. Process evaluation of teaching critical thinking about health using the Informed Health Choices intervention in Uganda: a mixed methods study. Glob Health Sci Pract. 2024;12(6). Available from: http://www.ncbi.nlm.nih.gov/pubmed/39706681.

  28. Chesire F, Oxman AD, Kaseje M, Gisore V, Mugisha M, Ssenyonga R, et al. Process evaluation of teaching critical thinking about health using the Informed Health Choices intervention in Kenya: a mixed methods study. Glob Health Sci Pract. 2024;12(6):e2300485. Available from: http://www.ghspjournal.org/lookup/doi/10.9745/GHSP-D-23-00485.

  29. Phillips AC, Lewis LK, McEvoy MP, et al. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med Educ. 2016;16:237.

  30. Austvoll-Dahlgren A, Semakula D, Nsangi A, Oxman AD, Chalmers I, Rosenbaum S, et al. Measuring ability to assess claims about treatment effects: The development of the “Claim Evaluation Tools.” BMJ Open. 2017;7(5).

  31. Dahlgren A, Semakula D, Chesire F, Mugisha M, Ssenyonga R, Nakyejwe E, et al. Critical thinking about treatment effects in Eastern Africa: development and Rasch analysis of an assessment tool. F1000Research. 2023;12:887.

  32. Nsangi A, Aranza D, Asimwe R, Munaabi-Babigumira SK, Nantongo J, Nordheim LV, et al. What should the standard be for passing and mastery on the Critical Thinking about Health Test? A consensus study. BMJ Open. 2023;13(2):e066890.

  33. Health Services Research Unit. Cluster Sample Size Calculator: User Manual. 1999;44.

  34. Republic of Rwanda Ministry of Education. Rwanda Education Statistics. 2019.

  35. Lee DS. Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects. Rev Econ Stud. 2009;76(3):1071–102.

  36. Tauchmann H. Treatment-effect bounds for non-random sample selection. Stata J. 2014;14(4):884–94.

  37. Hedges LV. Effect sizes in cluster-randomized designs. J Educ Behav Stat. 2007;32(4):341–70.

  38. Verdugo-Paiva F, Novillo F, Pena J, Avila-Oliver C, Rada G. Screening (partial report). Update of: Educational interventions to improve people’s understanding of key concepts in assessing the effects of health interventions. Epistemonikos Foundation. 2023. Available from: https://doi.org/10.5281/zenodo.7542970.

  39. Custers E. Long-term retention of basic science knowledge: a review study. Adv Health Sci Educ Theory Pract. 2010;15(1):109–28.

  40. Oxman AD, Nsangi A, Martinez Garcia L, et al. The effects of teaching strategies on learning to think critically in primary and secondary schools: an overview of systematic reviews. Eur J Educ. 2024;submitted.

Acknowledgements

We thank all the students, teachers, and school authorities, and the staff of the Rwanda Biomedical Centre, the Rwanda Basic Education Board, and the National Examinations and School Inspection Authority, who made this work possible. We also thank the research assistants who participated in the data collection, and Catty Mathews, who reviewed the draft manuscript as an external reviewer.

Funding

Open access funding provided by University of Oslo (incl Oslo University Hospital). The Research Council of Norway funded the trial (project number 284683, grant no. 69006, project title "Enabling sustainable public engagement in improving health and health equity", 1 August 2019 to 31 January 2024). Michael Mugisha was co-funded by the University of Rwanda's Centre of Excellence for Biomedical Engineering and E-health (UR-CEBE), which is funded by the African Development Bank. The funding bodies had no role in the design of the study or in writing the manuscript.

Author information

Authors and Affiliations

Authors

Contributions

MM was the principal investigator for this trial. He conceptualized the study, planned the data collection, managed the trial, and led the writing of the manuscript. LN and ADO supervised the trial, oversaw its implementation, and were members of the trial steering committee together with NKS and MK. DK and CMCS supported the coordination of fieldwork activities. SER, ADO, JM, and MO led the development of the intervention, and all authors except CJR and AD contributed to its development, review, and approval. AD led the development and validation of the outcome measure. CJR provided statistical advice and conducted the statistical analyses. All authors contributed to the protocol development and to the review and approval of this manuscript, had full access to all the data, and had final responsibility for the decision to submit for publication.

Corresponding author

Correspondence to Michael Mugisha.

Ethics declarations

Ethics approval and consent to participate

This research sought ethical approval from the Rwanda National Ethics Committee (RNEC) (Approval No. 1019/RNEC/2020 and subsequent amendments No. 41/RNEC/2022 and No. 236/RNEC/2022). Before data collection, we obtained permission to conduct the study in schools from the Rwanda Basic Education Board and the Ministry of Local Government in Rwanda. All the participants signed a consent form and students under 18 years signed assent forms.

Competing interests

MM, LN, CMCS, FC, RS, MO, AN, DS, JM, MK, SL, NKS, SER, and ADO were involved in both developing and evaluating the intervention.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Mugisha, M., Nyirazinyoye, L., Kayiranga, D. et al. What is the effect of the Informed Health Choices secondary school intervention on the ability of students in Rwanda to think critically about health choices after one-year follow-up? A cluster-randomized trial. Trials 26, 160 (2025). https://doi.org/10.1186/s13063-025-08779-w


Keywords