The 2018 Forum Report: PSA Response

The Forum on Education Abroad, also known as “the Forum,” is a large U.S.-based institutional membership organization and standards-setting body for the education abroad industry. The Forum’s website states that it “disseminates comprehensive standards of good practice, resources and training, advocates for education abroad and its value, and engages the field in critical dialogue to benefit students,” including on matters of “health, safety, security and risk management.”

In March of 2016, the Forum released a report that purportedly analyzed education abroad student deaths for the year 2014. The question that framed their analysis was this: which is safer, studying on campus or studying abroad? The resultant “key finding” was widely announced by the Forum: “College students studying abroad are less likely to die than college students studying on campuses in the U.S.” Thereafter, in press coverage of subsequent student deaths, news stories were accompanied by a restatement of the Forum’s key finding.

Two years later, the Forum asked the same question, though this time they considered seven years of data (2010-2016). They came to the same conclusion: “College students studying abroad are less likely to die than college students studying on campuses in the U.S.”

To determine on-campus risk, both Forum reports used data from what they refer to as “the Turner study.” In 2013, Turner and colleagues published the results of a pilot study, a small-scale study intended to surface potential issues before conducting a larger one. Their objective was to identify leading causes of death and mortality rates for students attending U.S. institutions of higher education. To accomplish this, Turner sent out 1,154 surveys to institutions of higher education.

Survey research is always challenging, and the Forum’s reliance on Turner’s study carries significant problems. For one thing, only 166 surveys were returned. Turner rightly dropped nine of those, for a final total of 157, a response rate of under 15% (157 of 1,154, about 13.6%) that must have been disappointing. That the study relies on a self-selected convenience sample with a low response rate limits the ability to make inferences and raises concerns about sampling error, the error that arises when a sample does not adequately represent the population it is drawn from. Moreover, since there is no standardized methodology for schools to track and report student deaths, survey information arrived in widely variable formats. Turner never conducted the larger study, so the response rate was never expanded, nor has the definition of student death been standardized.

To obtain information about deaths during study abroad, two large insurers confidentially shared claims data with the Forum.* This information represented insurance claims made by bereaved families for the cost of repatriating their child’s body. (Issues with these two data sources have been previously discussed in PSA’s blog “Scientific Research Versus The Forum Report.”**)

Table 1 of the 2018 Forum report lists decedents from 2010 through 2016 by gender, country of death, and cause of death. When comparing this information against Protect Students Abroad’s ad hoc data, we were able to surmise which students were included and which were not. For example, the son of my PSA colleague Ros Thackurdeen was not included; his program did not carry insurance. My son was also not included. Though his program was insured, the program’s carrier did not participate in the Forum’s data project. Moreover, Thomas’ body was never found, making a repatriation claim a non-issue.

Though personal, these examples are not exceptional. We are aware of other education abroad students who have died but who are not part of the Forum’s claims data. This may be because not every bereaved family makes a claim, because a claim is made through a different insurance policy or carrier, or because the program carried no insurance at all. All of these possibilities represent missing data, which is why insurance claims data is generally not considered high-quality information for assessing risk.

In summary, the datasets used by the Forum have weaknesses that were inadequately addressed in their reports. Much to the disappointment of PSA, the 2018 report asked the same hollow question asked in 2016, a question that undoubtedly means more to the education abroad industry than to student safety advocates and bereaved families. It uses the same qualitatively weak data, the same researcher, and the same flawed methodologies. The final product reads as if criticism could be answered simply by supplying larger numbers.

Improved understanding of risk can never result from replicating flawed numbers with more flawed numbers. This approach can only produce a larger version of itself, which will inevitably remain deficient. So for the remainder of this blog, PSA will focus on four specific weaknesses shared by both Forum reports. These weaknesses fall under the heading of bias: ways in which systematic error can be introduced into a research project, damaging the validity of its conclusions.*** We will discuss why Turner data and insurance data are not comparable, which ultimately compromises the Forum’s key finding.

Confounding

Students who study on their home campus likely differ in many ways from those who study abroad. Examples of such differences include age, gender, academic standing, disciplinary history, physical health, mental health, socioeconomics, previous travel experience, and so on.

While research methods can sometimes allow for comparing disparate groups, the Forum offers no information about these differences. This suggests that no attempts were made to understand, account for, or control for confounding factors. In a sense, by using Turner data and insurance data, the Forum compared apples and oranges, claiming that because both are “fruit”—that is, both groups are made up of students—they’re comparable.

Ascertainment bias

As previously discussed, one of the most basic problems with comparing Turner and insurance data is that there is no standardized definition of student death. This is true both for student deaths on the home campus and for student deaths during study abroad.

Perhaps this seems like a frivolous point; after all, even if students differ from one another, death is still death. Nevertheless, how student death is defined has enormous implications for what researchers count and what they do not. Given the distinctly different ways in which Turner and the insurance companies collected their data, and given the aforementioned problems with both data sets, we can rightly suspect that whom Turner did and did not count differed from whom the insurance companies did and did not count.

Healthy traveling student effect

The “healthy worker effect” is a type of selection bias that occurs when the subjects being studied (population groups of workers) differ from the comparison population in overall health. Likewise, the “healthy traveling student effect” is a type of selection bias that occurs when the students being studied differ from the comparison population in overall health.

It is reasonable to assume that students with pre-existing conditions would be less likely to study abroad than students without them. In other words, less healthy students are probably more likely to stay put, while healthier students may be more likely to venture out. Furthermore, students with marginal behavioral health may be less likely to study abroad than those who are behaviorally strong. Thus, the general population of students on study abroad may have better overall health than the general population of students on campus.

This type of self-sorting can impact research results in a significant way, by some estimates reducing measures of mortality in the healthier group by fifty percent. For this reason, when comparing Turner and insurance data, the Forum should have considered the pre-death health of both groups. They did not.

Missing deaths

Researchers use the term all-cause mortality to indicate all deaths that occur in a particular population experiencing a particular exposure during a particular period of time. For the purposes of the Forum reports, the exposure is either being a student on a college campus (Turner data) or being a student on study abroad (insurance data). The resulting rate is expressed as deaths per 100,000 students per year.

Turner found an all-cause mortality rate of 22.4, which the Forum annualized to account for the variability in study abroad program length. This shifted Turner’s death rate up to 29.4 deaths per 100,000 students per year. In contrast, insurance data from 2014 yielded an all-cause mortality rate of 13.5, while the 2010-2016 insurance data resulted in an all-cause mortality rate of 17.6.
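To make the arithmetic behind these figures concrete, here is a minimal sketch of how a rate of deaths per 100,000 students per year is computed, and of one plausible annualization. The death and enrollment counts below are invented for illustration, chosen only so the outputs match the 22.4 and 29.4 figures above, and the annualization shown (dividing by the average fraction of a calendar year each student is observed) is our assumption; the Forum does not fully document its calculation.

```python
# Minimal sketch: computing and annualizing an all-cause mortality rate.
# All inputs are hypothetical.

def mortality_rate(deaths, students, years_observed):
    """Deaths per 100,000 students per year of observation."""
    student_years = students * years_observed
    return deaths / student_years * 100_000

# A full calendar year of observation: 224 deaths among 1,000,000
# students yields 22.4 deaths per 100,000 student-years.
raw_rate = mortality_rate(deaths=224, students=1_000_000, years_observed=1.0)

# Annualizing: if each student is actually observed for only part of a
# year (here ~0.762 of a year, roughly a nine-month academic calendar),
# the same deaths accrue over fewer student-years, raising the rate:
# 22.4 / 0.762 is approximately 29.4.
annualized = mortality_rate(deaths=224, students=1_000_000, years_observed=0.762)

print(f"raw: {raw_rate:.1f}, annualized: {annualized:.1f}")
```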

All of these numbers seem low. For comparison, it may be useful to consider scholarly studies of youth deaths around the world. For example, in 2012, Gopal K. Singh published “All-Cause and Cause-Specific Mortality among U.S. Youth.” Singh found that youth between the ages of 15 and 24 had all-cause mortality rates ranging from 23.1 (Hong Kong) to 167.1 (Russia), with the United States at 80.1.*****

In light of this information, the Forum’s 13.5 and subsequent 17.6 all-cause mortality rates simply do not make sense. What’s going on? Is their analysis being impacted by the healthy traveling student effect? By ascertainment bias? Both? Surely the healthy traveling student effect would account for some of the lowered mortality rate among study abroad students; a rough illustration follows below. As for ascertainment bias, underascertainment in both student groups seems likely.
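As a back-of-the-envelope illustration of how much work the healthy traveling student effect alone could be doing, consider the following sketch. The fifty percent reduction is the estimate cited earlier from the healthy worker literature; whether it transfers to students is an assumption, not a finding.

```python
# Hypothetical illustration of the healthy traveling student effect.
# If self-selection alone cut mortality in the healthier (traveling)
# group by ~50%, a study abroad cohort could show half the campus rate
# even if going abroad added no safety whatsoever.
campus_rate = 29.4            # the Forum's annualized campus rate
selection_discount = 0.50     # assumed healthy-traveler reduction
expected_abroad = campus_rate * (1 - selection_discount)
print(expected_abroad)        # 14.7, not far from the reported 13.5
```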

To their credit, the authors of the 2016 Forum report seem to understand the weakness of their data, for they state, “With such a small number of deaths it is very difficult to infer anything to the general population of students studying abroad.” And yet the Forum leaps ahead to do just that, attaching a 99% confidence interval to their key finding.

But calculating confidence intervals only works if the numbers being compared have validity. The Forum report has calculated a 99% confidence interval of what? That the two populations (campus and study abroad) are different? This means nothing if the all-cause mortality rates are implausible in the first place. The Forum has not only produced apples and oranges; they’ve produced apples and oranges that are rotting with flawed methodologies.
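To see why a confidence interval built on a handful of deaths says so little, consider this hedged sketch using an exact Poisson interval. The death count and student-years below are invented, picked only to land near the Forum’s 13.5 rate, and the Garwood interval is a standard textbook method, not necessarily the one the Forum used.

```python
# Sketch: exact (Garwood) Poisson confidence interval for a mortality
# rate, showing how wide the interval is when deaths are few.
from scipy.stats import chi2

def poisson_rate_ci(deaths, student_years, conf=0.99):
    """Exact CI for a rate expressed per 100,000 student-years."""
    alpha = 1 - conf
    lower = chi2.ppf(alpha / 2, 2 * deaths) / 2 if deaths > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (deaths + 1)) / 2
    per_100k = 100_000 / student_years
    return lower * per_100k, upper * per_100k

# Hypothetical: 40 deaths over ~296,000 student-years puts the point
# estimate near 13.5 per 100,000, yet the 99% interval remains wide,
# spanning roughly 9 to 20.
low, high = poisson_rate_ci(deaths=40, student_years=296_000)
print(f"99% CI: {low:.1f} to {high:.1f} per 100,000 student-years")
```

Even under these generous assumptions the interval spans more than a factor of two, which is why so few deaths support so little inference about either population.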

The Forum’s all-cause mortality rate is so low that, if it is to be believed, U.S. students on study abroad are literally the safest youth in the world! At the very least, the reasons for such outlier numbers should have been interrogated in the discussion sections of both Forum papers. They were not. This failure to examine the data stringently raises many questions, from professional competence to motivation.

There is no way to soft-pedal our conclusion: bad studies can be worse than no studies. Neither the 2016 nor the 2018 Forum report is true academic research. Their findings are weak at best and misguided, even manipulative, at worst. They provide no scientific information upon which to draw conclusions about education abroad student safety, risk, and risk mitigation. Rather, these are industry reports, written by the industry for the industry. Sadly, their loose resemblance to research allows media to pick up and repeat the “key finding” like a tagline, sending an inaccurate message to students and their families.

While the first Forum report may have represented an attempt by the organization to move its membership toward shared safety data, its persistence with the same question, the same data, the same researcher, and the same research methods represents something else. In spite of proclamations by both the Forum and the education abroad industry that student safety is their first priority, a difficult truth is revealed by the failures of the second Forum report, by the industry’s unquestioning acceptance of its results, and by the industry’s persistent refusal to share safety data.

Survivor families understand that student safety risks during education abroad are unique and challenging, as are safety risks on campus. Before our children-as-students leave home for the world, they deserve safety information built upon the best academic scholarship. We continue to hope for, and work toward, the day when the education abroad industry recognizes transparent safety data and true scientific research as being in everyone’s best interest.

~

* Cultural Insurance Services International (CISI) and GeoBlue (GEO, formerly known as HTH)

** https://protectstudentsabroad.org/to-begin-again-3-2/

*** https://catalogofbias.org/

**** https://protectstudentsabroad.org/telling-the-untold-2/

***** https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3665977/