Hey, Mom! The Explanation.

Here's the permanent dedicated link to my first Hey, Mom! post and the explanation of the feature it contains.

Wednesday, July 21, 2021

A Sense of Doubt blog post #2346 - Vaccinate Against Misinformation - Public Health Crisis!



There's more than one public health crisis, and too many people aren't taking either of them seriously enough.

Since early 2020 (let's say February, because that's when the schools where I work closed), we have been embroiled in a pandemic due to COVID-19, the disease caused by the novel coronavirus SARS-CoV-2, and by "we," I mean EVERYONE IN THE WORLD.

The other public health crisis has a longer history, which is not the entire history of FOX NEWS but at least the last five or six years: MISINFORMATION.

FOX NEWS (which is neither foxy nor actual "news" as I understand the term to mean factual and accurate reporting of current events) bombards its viewers (of which there are far too many) with a constant stream of misinformation, opinion and accusation disguised as fact, and a disregard for truth and reality that's so flagrant that not only has the network been sued for pandering its lies but the lawyers representing the network in court claimed that viewers do not take seriously ANYTHING Tucker Carlson has to say.

Except that they do take it seriously. And they repeat it. And it catches on and spreads like a VIRUS. Very contagious for some people with an enjoyment of conspiracy theory, a resentment of so-called liberals forcing them to be compassionate and civil (an action they called "political correctness" when they were being nice), and/or an undeveloped capacity for CRITICAL THINKING. 

My whole life (quite literally) I have fought against misinformation and championed belief in science. I started as a science nerd quite young with flash cards of the dinosaurs that I memorized before I could read and then in second grade I was enthralled by space because American astronauts were about to land on the moon, which happened the summer between second and third grades. There were several Apollo missions prior to number eleven when they landed and went walkabout.

Later, when I became a college teacher, I zealously advocated for students to use critical thinking, to think for themselves, and yet to trust experts after verifying their credibility and authority.

And yet, it has never been so DIFFICULT to convince people that science is real, that scientists who know more than they do are right (because these scientists have spent a lifetime learning what they know), and that part of CRITICAL THINKING is learning to verify and corroborate "facts," which is not something that can be done with about 90% of the content on FOX NEWS, let alone the forward from your Aunt Minnie or those other news networks that I am not even naming.

And now the SURGEON GENERAL has publicly declared the MISINFORMATION swamp created by these people a public health crisis as serious as (if not more serious than) COVID-19.

Truth, meet power.

Speaking truth to power has never been so hard.

If only there were a reverse brainwashing vaccine to put people back in their "right" minds so that they do not believe nonsense like the claim that the COVID vaccine alters one's DNA (not possible) or that thousands of vaccinated people are hospitalized and dying of COVID (also not true).

Do some extra reading. PLEASE.

The truth is out there.

https://knowledge.wharton.upenn.edu/article/rothschild-project-ratio/


Dr. Vivek Murthy has issued his first "surgeon general's advisory" during his term in the Biden administration, to call attention to the health dangers posed by misinformation about COVID-19 and the coronavirus vaccines. Murthy calls on all American institutions, and all Americans, to do everything they can to limit the spread of dangerous health misinformation. He didn't recommend warning labels be tattooed on the foreheads of all Fox News anchors or Republican members of Congress, at least not yet. Instead, he suggests people actually use their heads and seek credible sources of health information. We're doomed.

OK, we kid. Some. Seriously, go read the whole advisory, for your own edification and future Twitter arguments if nothing else. Minus the title page, table of contents, and references, it's just 12 pages, and very good reading!

Murthy announced the campaign against misinformation at Thursday's White House press briefing; here's the video:

Lies Are Hazardous To Your Health

Murthy noted that his report might seem a bit different from other surgeon general's advisories, which often address public health threats like smoking, addiction, and the like, but he added that right now, "we live in a world where misinformation poses an imminent and insidious threat to our nation's health."

We were impressed by Murthy's take on how bad information throws poison into the marketplace of ideas:

[The] truth is that misinformation takes away our freedom to make informed decisions about our health and the health of our loved ones.

During the COVID-19 pandemic, health misinformation has led people to resist wearing masks in high-risk settings. It's led them to turn down proven treatments and to choose not to get vaccinated. This has led to avoidable illnesses and death. Simply put, health [mis]information has cost us lives.


How Do We Fix This?

Murthy called for an "all of society" approach to combating COVID misinformation, starting with all of us: How about we all think twice before forwarding stuff we see online that sounds amazing, but might be bullshit? Instead, perhaps it would be a better idea to ask where that's coming from, and see if it is indeed credible or complete bullshit. (We really want Murthy, whose personality is like an acutely smart, well-informed, sciencey version of Kenneth the NBC Page, to just shout cusses some time, because he's so goddamn nice.)

He even suggested a slogan that might fit nicely on an old WW-II style poster: "If you're not sure, don't share."

Because it'll take more than just better individual behavior, Murthy also recommended that other institutions, like professional groups and foundations, help spread the word on how to spot health misinformation, and that they study how it spreads and what works to counter it. He even touched on one of my own favorite hobby horses, calling on schools and teachers to improve media and health information literacy. (The full report goes into a bit more detail on that, as well.)

Along similar lines, Murthy urged media organizations to "proactively address the public's questions without inadvertently giving a platform to health misinformation that can harm their audiences," which is a huge goddamn deal. There too, the full report has excellent guidelines, from training reporters to be more thoughtful about sources, to providing context when discussing health information, particularly warning against pushing "alternative" views that have no validity. (Dok's Hobby Horse time again: the term "truth sandwich," as a means of discussing disinformation without granting it credibility, doesn't appear in the report, but it damn well should).


Vaccinating Against Social Media Disease

Murthy also called out tech platforms for their role in spreading misinformation:

We're asking them to operate with greater transparency and accountability. We're asking them to monitor misinformation more closely. We're asking them to consistently take action against misinformation super-spreaders on their platforms.

Let's dive into that one a bit more, because the full report is quite good, pointing out that the very things that make social media so addicting (and profitable) also help promote the spread of bad information. We've removed the footnotes here:

First, misinformation is often framed in a sensational and emotional manner that can connect viscerally, distort memory, align with cognitive biases, and heighten psychological responses such as anxiety. People can feel a sense of urgency to react to and share emotionally charged misinformation with others, enabling it to spread quickly and go "viral."

Second, product features built into technology platforms have contributed to the spread of misinformation. For example, social media platforms incentivize people to share content to get likes, comments, and other positive signals of engagement. These features help connect and inform people but reward engagement rather than accuracy, allowing emotionally charged misinformation to spread more easily than emotionally neutral content. One study found that false news stories were 70 percent more likely to be shared on social media than true stories.

Third, algorithms that determine what users see online often prioritize content based on its popularity or similarity to previously seen content. As a result, a user exposed to misinformation once could see more and more of it over time, further reinforcing one's misunderstanding. Some websites also combine different kinds of information, such as news, ads, and posts from users, into a single feed, which can leave consumers confused about the underlying source of any given piece of content.

Yep, yep, and holy crap Yep! The specific recommendations for tech platforms are promising but may also involve some whistling in the dark, since so much of social media is about connecting people with content that triggers their neurotransmitters and reinforces biases.

Hell yes, we would support social media platforms making changes that would discourage the spread of dangerous garbage, but the trick will be getting the companies to act on recommendations like these:

Redesign recommendation algorithms to avoid amplifying misinformation, build in "frictions"— such as suggestions and warnings—to reduce the sharing of misinformation, and make it easier for users to report misinformation.

There's some of that going on already, clumsy though it may be, like popping up links to valid information. But fundamental changes to the algorithm might mean fewer clicks, and fewer eyeballs on ads.



Here Comes The Rightwing Freakout

Other recommendations are more proactive, and are already eliciting howls of "censorship!" from the usual paranoid sources, which, to be clear, howl about "censorship" no matter what. Among them, the advisory recommends beefing up moderation, both by AI and by actual humans, particularly when it comes to non-English language posts and livestreams. But the one that's really got the wingnuts freaking out is this'n:

Prioritize early detection of misinformation "super-spreaders" and repeat offenders.

Impose clear consequences for accounts that repeatedly violate platform policies.

Oh noes! The federal government wants to do fascism to unapproved views! Muh free speach! Mind you, social media platforms are not the federal government, not even when lawyers for a disgraced former president say they are.

One problem with the recommendations is that, as we note, they simply reinforce the Right's own delusions that they're being repressed, which spurs them to cling to disinformation ever more strongly. We're not entirely certain that's necessarily avoidable, though, and if social media platforms really do take more measures to reduce the size of the wingnut misinformation bubble, maybe that would still be good for everyone not already inside it?



Don't Expect Tech Platforms To Help Much

It's certainly something that deserves closer study, which, hey, is also one of Murthy's recommendations, both for nonprofits that study how misinformation spreads and can be dealt with, and for the platforms themselves. The advisory calls on platforms to "Give researchers access to useful data to properly analyze the spread and impact of misinformation," which sounds like a great idea, although it flies in the face of the platforms' actual behavior. They consider their algorithms and internal data so vital to their profits that they guard them with a paranoid fervor that may rival wingnuts' attachment to conspiracy theories.

Back in April, Facebook gutted its own data analytics outfit, CrowdTangle, as the New York Times reported just the day before Murthy's advisory. The company reassigned staff from the semi-independent operation and brought it more tightly under Facebook's own control, because CrowdTangle had been too darn transparent, and that was pretty embarrassing to Facebook.

Executives behind the move

argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

Oops, so much for cooperation and transparency! Facebook data revealed Facebook spreads misinformation and rightwing craziness, so Facebook decided the solution isn't to fix that, but to hide the data.

Damned if we can see any easy answer to that! But Joe Biden saying that Facebook is killing people with misinformation might help shame them. Or not, who knows?

Still, we like Murthy's recommendation that much of the solution to dealing with misinformation will require getting good information to people, on a one-to-one basis if needed. (That certainly seems to be what the research on communicating about climate change shows, too.) For all the crazy it revealed, that March Frank Luntz focus group on "vaccine hesitancy" suggested that people do at least trust doctors who give them unemotional facts, so that seems valuable too.

Now, if we could just find some way of making sure people actually can see doctors.

[Confronting Health Misinformation / White House / AP / NYT / NPR]

Yr Wonkette is funded entirely by reader donations. If you can, please help us bring you both the very best information AND fart jokes with a donation of $5 to $10 a month.

Do your Amazon shopping through this link, because reasons.





+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

- Bloggery committed by chris tower - 2107.21 - 10:10

- Days ago = 2210 days ago

- New note - On 1807.06, I ceased daily transmission of my Hey Mom feature after three years of daily conversations. I plan to continue Hey Mom posts at least twice per week but will continue to post the days since ("Days Ago") count on my blog each day. The blog entry numbering in the title has changed to reflect total Sense of Doubt posts since I began the blog on 0705.04, which include Hey Mom posts, Daily Bowie posts, and Sense of Doubt posts. Hey Mom posts will still be numbered sequentially. New Hey Mom posts will use the same format as all the other Hey Mom posts; all other posts will feature this format seen here.
