Shawn Vestal: Sturgis research called for greater skepticism
I wrote a column last week about a research paper that estimated the Sturgis (South Dakota) motorcycle rally had been a massive COVID-19 superspreader event.
Immediately, people began to tell me I was full of it. Some of this I ignored – in part because it came from COVID deniers, people who complained about the paper only in partisan terms, or those who relied on published rebuttals that misrepresented what the paper said. These days, that’s more normal than not.
But there was more. A friendly acquaintance on Twitter, a professor at Washington State University, told me he thought the Sturgis paper was “ridiculous.” Another person with a science background reached out to say it was the worst piece of research on the virus he’d seen.
I began to look into it more and found that others were roundly criticizing the Sturgis paper on scientific grounds. It became clear that I had accepted and amplified the study too uncritically, and that the paper itself had arrived at estimates that other epidemiologists considered deeply flawed.
And so, a meal of crow: While it’s known there was some spread from the rally, the Sturgis paper was an unreliable measure of that spread, and likely overstated it dramatically. And I revved the engine on that overstatement.
I should have worked harder to put this research to the test before publishing a column. One of the reasons I didn’t, I’m sure, was the degree to which the paper reinforced my own beliefs that virus skeptics, mask refuseniks, and those who ignore health guidelines and minimize the threat of the virus – as the half-million Sturgis ralliers did – are putting others at risk during the pandemic.
I still believe this. But I let that belief undercut my skepticism, as a lot of people much smarter than me have made clear.
One of these people was the guy who wrote the book on being full of it, University of Washington epidemiologist Carl Bergstrom, who has become a vigorous critic of COVID-19 crackpottery. I admire his book and his campaign, and I interviewed him and his co-author and UW colleague, Jevin West, for a Northwest Passages Book Club event last month.
Bergstrom is a blunt, tart voice against the wave of pandemic misinformation that has flooded the country. He suffers no fools; he's a warrior for a factual America, and he teaches his UW students to be the same.
That class has a Twitter account, which recently posted a link to a withering critique of the Sturgis study with this intro: “In our course, one of the rules that we stress for detecting misinformation is that if something seems too good or too bad to be true, it probably is and it’s time to dig deeper.”
The deeper dig came from Oxford epidemiologist and demographer Jennifer Beam Dowd, who published a piece on Slate titled, “No, the Sturgis rally did not cause 266,796 cases of COVID-19,” in which she took apart the methodology of the Sturgis study.
Dowd emphasized that producing a good estimate of the Sturgis effect would be difficult, because there are many reasons case counts might be higher in counties with a lot of Sturgis rally-goers than in counties without them.
It's likely, she said, that those counties already had case trends in place driven by factors other than rally attendance alone.
She also wrote that, given such high estimates, the researchers should have been skeptical about whether that much spread was even epidemiologically possible in the time frame of the rally. Her criticism of the study's methodology was detailed.
The dean of Brown University's School of Public Health, Ashish K. Jha, also weighed in, asserting that there was so much "noise" in the data that the findings were not supportable.
“So was Sturgis harmless?” he asked on Twitter. “No. But I doubt it caused 250,000 cases.”
Like Jha and others, Dowd does not doubt that there was COVID-19 spread as a result of the rally, and she notes that contact tracing in some communities has identified both cases and deaths associated with the event – "but in the range of hundreds."
Understanding science, and accurate reporting of it in the press, is incredibly important right now.
It’s important to me, personally – I try hard to make this column a place where the facts are reliable, whatever you think of the opinions.
In this case, I fell short of that standard. Many people much smarter than me are saying the paper was poor science.
Elevating it as I did was poor journalism.