So, I kinda wanted to hop in and subject this Tumblr post to Peer Review, because while I definitely agree that there is a big problem in the US regarding poor media literacy, the study featured here is... not a good measurement, and these bullet points make the results seem worse than they are.
Another person in the notes linked an additional article that contained the full study, and these are the questions it included:
A couple of problems right off the bat. The first one that jumps out is #6: "The Earth is between 5,000 and 10,000 years old." In this study, the correct answer would be that this is a statement of fact, because it is, well, stated as a fact. And according to the paper in question:
- A fact is a statement that can be verified. It can be proven to be true or false through objective evidence.
That's lovely and all, but nowhere in this study does it say that the participants were informed that this is the definition of "statement of fact" that the study was using. It's not exactly common knowledge that "statement of fact" and "fact" do not actually mean the same thing. A statement of fact is a claim that someone asserts as true; a fact is something that actually is true.
So, anyone who participated in this study who interpreted "statement of fact" to mean "fact" (and also wasn't a young-Earth conspiracy theorist) would have gotten this question wrong, because they know that the Earth is much older than 10,000 years.
Another, somewhat less obvious problem is in question 2, which the authors of the study address as follows:
- First, as noted by Firey (2018), item 2 perhaps could be categorized as a statement of opinion because the phrase “a significant portion” is subjective. We repeated Pew’s coding on our survey, and solid majorities on both surveys rated the item as a statement of fact. In retrospect, we would categorize the statement as borderline (see note 4 above). By the end of 2017, ISIS had lost 95% of its territory (Wilson Center, 2019). Because such an overwhelming loss would be difficult to deem insignificant, we retain the item as a statement of fact for present purposes.
Yes, I agree that nobody could reasonably argue that a 95% loss isn't significant. However, unless the study specifically recruited participants who were extremely knowledgeable about statistics on ISIS activity, the participants had no way of knowing that 95% was the amount in question. That "significant portion" could be anything!
Like, let's say that ISIS had lost 25% of its territory. That's a lot! That's over 10,000 square kilometers that would no longer be under ISIS control, so one could argue that this is a significant portion. But that's also only a fraction of their territory: ISIS would still control the majority of what it previously held - three times as much as the amount it lost. So one could also argue that this is an insignificant portion.
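(To spell out the arithmetic on my made-up 25% figure: 100% − 25% = 75% still held, and 75% ÷ 25% = 3. Whether losing a quarter while keeping triple that counts as "significant" is exactly the kind of judgment call that makes this read like an opinion question.)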
If I had been a participant in this study, I would have marked this as a statement of opinion. And thus I would have become part of the 95% who apparently can't differentiate between fact and opinion.
But, I was not a study participant. So, who was? What does this study say about its participants?
Well, here's what the study says about them:
- Data are from a national online survey we designed. The survey was fielded by YouGov from March 9 to March 14, 2019. There are 2,500 respondents.
That's it. No information about how participants were recruited or invited to take the survey. Nothing about what quality assurance methods were used to make sure that participants were following instructions, or that computer error didn't interfere with the data.
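For reference, here's a minimal sketch of the kind of basic quality screening I mean - flagging straight-liners, speeders, and failed attention checks. Every column name and threshold below is hypothetical, since the paper describes none of this:

```python
import pandas as pd

# Hypothetical response file: one row per respondent, q1-q8 hold the
# "fact"/"opinion" answers, plus a completion time and an attention check.
df = pd.read_csv("responses.csv")
items = [f"q{i}" for i in range(1, 9)]

# Straight-liners: gave the identical answer to every single item.
straight_lined = df[items].nunique(axis=1) == 1

# Speeders: finished implausibly fast (the cutoff is a judgment call).
too_fast = df["seconds_to_complete"] < 60

# Failed an instructed-response item, e.g. "select 'opinion' for this one."
failed_check = df["attention_check"] != "opinion"

clean = df[~(straight_lined | too_fast | failed_check)]
print(f"kept {len(clean)} of {len(df)} respondents after screening")
```

None of this is exotic. It's the kind of thing a Methods section should at least mention, even if only to say the survey firm handled it.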
Nothing about incentives given for the completion of the survey.
Oh, yes, that's something very important to note. You see, YouGov is not an academic research firm. It's a market research firm. It functions similarly to sites like Swagbucks or Survey Junkie. And this is how it markets itself to participants when you search for YouGov on Google:
That's the exact marketing used by Swagbucks. I use Swagbucks. I'm a pretty active user, in fact. I answer surveys and play mobile games to collect points, then redeem them, usually for Wal-Mart gift cards so I can save on groceries. I know how sites like these work.
And I also know that when I just want to get a survey completed, I bullshit. I just click whichever multiple choice option is closest to my cursor until I'm done and can get the points.
Guess what else I do: I lie. Sometimes a survey is listed on the site that's worth a lot of points but is only looking for people of a certain demographic. And I take it anyway. I give them whatever info is needed to get my points and get those gift cards. I have had a grand total of one cup of coffee in my life and hated it. But whenever a survey asks me if I have ever bought a certain brand of coffee, I say yes, and I review it.
Oh yeah, that's another point: there is no demographic information given about this study's participants. We know that participants were at least asked for their age and political affiliation ("Multivariate models include controls for age and partisanship"), but we, the readers, are never told what that spread turned out to be. We just have to take their word for it that YouGov's sampling methodology was sound.
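For anyone unfamiliar with the jargon, "controls for age and partisanship" usually just means those variables were included as covariates in a regression - something like the sketch below, with hypothetical variable names, since the paper doesn't publish its model specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: score = items classified "correctly",
# age in years, party = self-reported affiliation (categorical).
data = pd.read_csv("survey.csv")

# OLS with age and partisanship as controls; C(party) treats party
# as categorical, so each affiliation gets its own coefficient.
model = smf.ols("score ~ age + C(party)", data=data).fit()
print(model.summary())
```

Which is fine as far as it goes - but controlling for demographics inside a model is not the same thing as telling your readers what the sample actually looked like.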
If I had turned in this research paper for review, with this Methods section and this survey, my advisor would have slapped me.
So, what is my point? Why did I spend an hour writing this response to a Tumblr post? Well, because the findings of this study may be bullshit, but this whole post brings to light another big problem in the world of media and journalism.
The University of Illinois published this sloppy-ass study that I would have been embarrassed to hand in as a 10-point assignment. This survey made it through an institutional review board, a process that I know from experience can take months and dozens of rounds of review. It was then peer reviewed by Harvard, the school that people point to as the epitome of academic prestige, full of super geniuses, and added to their library. Then it was picked up by a Washington DC online newspaper and tweeted about by a member of the House of Representatives.
And finally, it was posted here on Tumblr, uncritically.
I don't know what percentage of Americans can't tell fact from opinion. And reading this study is not going to give me an accurate answer, because the study's design and the conclusions the authors reached are a mess.
All I can tell you is that academic and journalistic institutions alike need to do a better job of reviewing and thinking critically about the information they receive before they publish it to the world.