If you didn’t have time to skim the headlines over the last week, you missed a pair of hair-raising news narratives chronicling certain doom for the State of Florida. First, there was an “extremely dangerous” Category 5 hurricane, named Lee, chugging toward us. And second, the media (both national and Florida-based) are claiming that the Sunshine State is facing a massive university “brain drain,” fueled by GOP policies that are driving professors to seek employment in more “reasonable” states.
A few days ago we exposed the media’s false, intentionally misleading overhype of the “explosive” and “deadly” Hurricane Lee (read that here). Only in the last day or so have news outlets started being more honest about the storm’s path. That’s because the soon-to-be-forgotten Lee is now losing steam and turning harmlessly northward far out in the empty parts of the Atlantic Ocean (exactly as forecasters had always predicted).
Today, we’ll deal with Florida’s alleged “brain drain,” a not-so-new but still false narrative, fueled by politically motivated labor unions pushing unsupported survey data to compliant media outlets already sympathetic to the progressive cause. And as it turns out, there may be less of a brain drain and more of something else: either a “critical thinking drought,” given the media’s drone-like acceptance of the union narrative, or just willful and intentional bias toward the progressive side of the story.
Perhaps it’s a mixture of both. So before we sound the brain drain alarm bells, let’s unpack the assumptions behind this latest headline-grabbing tale, then dig into modern polling methodology and how to tell when poll results actually matter.
Here’s the Tampa Bay Times (paywalled) headline and lead paragraph:
New laws in Florida and elsewhere are pushing faculty to leave, survey says
Forty-six percent of Florida respondents said they planned to look for university jobs in other states.
A survey of more than 4,250 faculty across four states, including Florida, highlights growing concern over political involvement in higher education and a widespread desire to find new employment. Close to half the 642 respondents in Florida said they planned to seek employment in a different state within the next year.
Other progressive outlets, from Axios to Raw Story, also picked up the narrative and dutifully reported it without lifting a finger to question the source, or, as it turns out, anything else.
Other than reporting the number of professors who participated in the survey, conducted by the American Association of University Professors (AAUP), none of the media outlets included any information about the survey itself. Missing from every single story: the wording of the questions asked and the methodology for selecting the professors included in the results.
Longtime Florida pollster Steven Vancore of ClearView Research says media outlets shouldn’t report on polls at all without a grasp of those basic fundamentals.
“The gold standard for media should be if you don’t get the full methodology, you don’t see the entire questionnaire, and you don’t have an opportunity to interview the pollster, then you shouldn’t cover the poll,” Vancore says. “At the very least, reporters should disclaim a poll by explaining to readers that the pollster refused to make that critical information available.”
In this case, the survey wasn’t a traditional scientific poll – it was an instrument designed to throw fuel on the fire of the already burning “brain drain” news narrative, which has been pushed for months by progressive groups and their media allies. From a public relations perspective, the survey was just another tool to keep that narrative alive.
It’s worth noting that the media’s willful reporting of the AAUP’s survey, despite the lack of data to support its validity, stands in sharp contrast to how the media treated another college campus survey just a year ago. For those with short memories: a similar survey, commissioned by the Florida Legislature, aimed to examine the political climate on Florida campuses, but it differed from the AAUP’s survey in two respects. First, the Legislature’s survey actually disclosed its questions and methodology. Second, the Florida media completely dismissed its results, for reasons that don’t hold up to scrutiny (especially now that they have reported on the AAUP’s survey).
What makes the AAUP’s survey more valid than the Florida Legislature’s survey? Apparently – sadly – just the conclusion: one fits the desired narrative, the other doesn’t.
Which brings us to the larger question: With so much manufactured political hogwash permeating the media, how can anyone know which polls to rely on and which ones are just pushing a particular narrative?
The answer is simple:
Almost all public polls are designed to manipulate, not measure, public opinion
Whether it’s commissioned by a political campaign, a corporation, or an advocacy group like the AAUP, a public opinion poll or survey released to the public usually has one purpose: to directly influence the media narrative and drive stories about what the polling data shows. The vast majority of polls, however, are kept private, because the information they uncover is often highly sensitive and far more valuable when the poll’s funders can act on it before anyone else.
“The public does not see 90 percent of the polls that are conducted,” Vancore says. “My firm does about 30-40 polls a year; in the history of my firm, we’ve released only a handful.”
While public opinion polling is expensive, accurate public opinion polling is very expensive. As a result, only three types of organizations can generally justify the cost: political science departments at colleges and universities, major national media outlets, and political campaigns and powerful advocacy groups.
The problem, Vancore says, is that “polls are becoming more and more difficult to get right.”
Even though colleges and universities are able to leverage student labor to keep costs down, the shifting polling landscape has made gathering accurate data much more expensive.
“Over the past decade or so, polling methods have had to be adapted to changing dynamics among respondents,” Vancore explained. “First, people are more resistant to answering calls and agreeing to partake in polls. Second, with new phone technologies, there is far more room for error if not done correctly.”
The end result is that college and university polls have become much less reliable in recent years, while well-trained pollsters have shifted to more sophisticated – and more expensive – methods, including stratified sampling, novel data-gathering techniques, and more complex modeling in order to produce accurate results.
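To give a concrete sense of what those more sophisticated methods involve, here is a minimal sketch of one related technique, post-stratification weighting, in Python. It is purely illustrative: the strata, population shares, and responses are all hypothetical, and real pollsters weight across many more dimensions and much larger samples.

```python
# A purely illustrative sketch of post-stratification weighting.
# All strata, shares, and responses below are hypothetical.

# Known population shares for each stratum (e.g., from census data).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Raw responses as (stratum, answered_yes) pairs; note the 55+ group is over-sampled.
responses = [
    ("18-34", True), ("18-34", False), ("18-34", True),
    ("35-54", True), ("35-54", False),
    ("55+", False), ("55+", False), ("55+", True), ("55+", False), ("55+", False),
]

n = len(responses)

# Share of the sample each stratum actually represents.
sample_share = {
    s: sum(1 for stratum, _ in responses if stratum == s) / n
    for s in population_share
}

# Weight each respondent so strata count in proportion to the population.
weight = {s: population_share[s] / sample_share[s] for s in population_share}

unweighted = sum(answer for _, answer in responses) / n
weighted = (
    sum(weight[s] * answer for s, answer in responses)
    / sum(weight[s] for s, _ in responses)
)

print(f"Unweighted 'yes' share: {unweighted:.1%}")  # skewed by the over-sampled stratum
print(f"Weighted 'yes' share:   {weighted:.1%}")
```

Even this toy example shows why methodology matters: the published number depends on how respondents were selected and weighted, and without that information a reader has no way to judge what the percentages mean.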
Faster, cheaper polling proliferates
Modern technology has enabled cheaper, less accurate polls to thrive. That, combined with the media’s unquenchable thirst for polling data on which to base a story, means a lot of bad information is making its way into the public realm. The media uses polls to shape narratives without critical examination or explanation, and the problem is growing. Reporters evaluating the validity of a poll should always address a range of factors (or note their absence) in any story based on poll results or survey findings. But they usually don’t.
News consumers should read with a critical eye and dismiss most polling stories unless the story includes who conducted the poll, who paid for it, the underlying methodology, the sample size, the polling demographics, the actual wording of the questions asked, and the specific margin of error for each subset of data.
Unfortunately, many reporters fail to understand how a poll’s margin of error even works. According to Vancore, reporters often cite a poll’s overall margin of error without realizing that it changes for specific questions or subsets of the data; with a smaller subsample, the margin of error can be radically wider than the poll’s overall figure.
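To see why that matters here, run the numbers on the sample sizes reported in the survey coverage. The quick Python sketch below uses the textbook margin-of-error formula for a simple random sample, assuming the worst-case split of p = 0.5; keep in mind that a non-random, self-selected sample has no statistically valid margin of error at all, no matter its size.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n.

    Assumes the worst-case proportion p = 0.5 and ignores design effects
    from weighting or stratification, which would widen the interval.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes as reported in the coverage: ~4,250 faculty overall, 642 in Florida.
print(f"Overall sample (n=4,250): +/- {margin_of_error(4250):.1%}")
print(f"Florida subset (n=642):   +/- {margin_of_error(642):.1%}")
```

The error band more than doubles, from roughly ±1.5 percentage points for the full four-state sample to nearly ±4 points for the Florida subset, and that is exactly the kind of detail Vancore says reporters routinely miss.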
Polling silly season is upon us – what to look for
We’re getting close to the one-year mark before the 2024 elections, a period when publicly released polls will be pounced on by reporters like kids jumping on candy thrown from a busted piñata. Journalists and news consumers alike need to be particularly discerning of any story offering poll results as though they are meaningful and relevant.
In the case of the Florida faculty “brain drain” survey, not only was the methodology undisclosed, but the survey was also conducted by a labor union with an obvious progressive political agenda. Perhaps the media, in their eagerness for a good story and their rush to create a captivating narrative, simply overlooked those glaring red flags. More likely, they are just more apt to believe the survey’s findings and eager to report a narrative that validates their own worldview. Either way, the lack of transparency and the eagerness to report such thin survey results taint the validity of the resulting news coverage.
Just as bad, the selective treatment of two very different campus surveys undermines public trust and raises serious questions about journalistic integrity.
As polling grows more complex, media outlets owe it to their readers to delve into the methodologies and potential biases that shape the narratives being offered. Ignoring these intricacies not only misinforms the public but also erodes the very foundation of balanced and fair journalism.
Personally, I would be well pleased with a bit of liberal brain drain in our Florida universities. Of course, that would be with the current definition of liberal, i.e., Marxist, rather than the classical definition, i.e., a free thinker encouraging robust discussion of a wide variety of views!
@billknight When was the last time you were in a college classroom? Encouraging robust discussion is the norm in the classes on my Florida campus, at least until DeSantis’s policies cast a pall on it with threats to fire, suppress, or dox professors. Perhaps you should try to attend a robust discussion and see what it really looks like.
Hi Zen
I don’t believe the robust discussions will ever entail conservative thought. The “robust discussion” will be about how to destroy conservative thought with talking points.
Sadly, everyone knows the only diversity on college campuses is demographic; there is not even remotely a diversity of thought, which, of course, is ultimately what is SUPPOSED to happen on college campuses. Lockstep liberalism has had a strangling grip for more than 40 years now.
Conservatives are not welcome, which is why a “brain drain” is an oxymoron when it comes to genuine intelligence.
Academics are hired for lockstep liberalism and hewing to the litmus test of the “correct” politics.
May the “brain drain” begin and accelerate rapidly, as there are better, smarter conservatives even more qualified than the current placeholders, but we know we are not welcome there.
Their departure would be a net “brain GAIN” for all.
Please be willing to send me a private note as to where you teach, as maybe, just maybe, you really do have a diversity of thought where you “discuss,” but I’d need to see it to believe it.
Thanks for the “inside baseball” coverage. This explains a few things I have noticed about polling results over the last several years.
The “brain drain,” IF it were to happen, would be a “brain GAIN,” as conservatives would be there for their intellect, not for hewing to the litmus test of lockstep liberalism that permeates academia now.
Thanks BB for “shining the light of truth” on this typical political charade.
Ironic… criticizing the media for not reporting the poll’s specifics, while also not reporting the poll’s specifics in the critique. Pot, meet kettle.
Dumb, since I don’t have the poll’s specifics. That’s the whole point: they weren’t released.