

If you’ve been following the news for the past few years, you’re probably convinced we’re living in a golden age of conspiracy theories and false information.

Whether it’s QAnon or the January 6 insurrection or anti-vaccine hysteria, many have come to believe that the culprit, more often than not, is bad information – and that the disinformation-industrial complex that creates and propagates it is breaking the human brain.

But I recently read an essay in Harper’s Magazine that made me wonder if the story is really that simple. I can’t say it changed my mind in any profound way about the real-world consequences of lies, but it did call into question some of my core assumptions about the online information ecosystem. The essay is called “Bad News: Selling the story of disinformation,” and the author is Joseph Bernstein, a senior technology reporter at BuzzFeed News.

Bernstein doesn’t deny that disinformation is a real thing. The problem is that we don’t have a coherent definition of the term. What you find in the literature, Bernstein says, is a hopelessly vague reference to information “which may lead to misconceptions about the state of the world.”

He argues that a definition that broad isn’t much use as the basis of objective study. It’s also not clear how disinformation differs from misinformation, beyond the former being considered more “intentionally” misleading. All of this leads Bernstein to conclude that even the people who study the subject can’t agree on what they’re talking about.

But the bigger – and far less understood – problem is that certain interests are invested in hyping disinformation as an existential crisis, both because it’s good for business and because it’s a way of avoiding the real roots of our problems.

I reached out to Bernstein for this week’s episode of Vox Conversations to talk about where he thinks the discourse around disinformation went wrong, and why it’s not at all clear that the internet has broken American society.

Below is an edited excerpt from our conversation. As always, there’s much more in the full podcast, so subscribe to Vox Conversations on Apple Podcasts, Google Podcasts, Spotify, Stitcher, or wherever you listen to podcasts.


Sean Illing

I’ve spent a lot of time over the last few years sounding the alarm about disinformation and misinformation and how big a problem they are, and I have to say, you really made me step back and think hard about how easily I bought the conventional wisdom on this stuff.

But let’s just start there: Do you think people like me, who’ve worried publicly about disinformation, have been part of a panic?

Joe Bernstein

I think bad information on the internet is a misunderstood and sometimes badly discussed topic. It’s a huge issue. It’s a new issue. It’s a very important issue. But like many problems, it helps to define it. And if you have trouble defining it, it helps to think about why. And when you start to think about why, it helps to think about who is trying to define the problem, and why.

And so I don’t even feel comfortable necessarily calling it a panic, because, as we saw with The Wall Street Journal’s series over the past few weeks, and then the Facebook whistleblower’s testimony, these are real problems. What’s not clear to me is whether we fully understand what the risks are, or whether we fully understand the categories being thrown around – and I’ve thrown them around as well – mis- and disinformation – and how they’re being used.

And what I really wanted to do with this piece is not to say that a handful of private companies having monopoly power over the flow of information is something we should just accept and live with, but to say that when we talk about the problem, we should understand who wants to address it and why.

Sean Illing

It may surprise people to learn that even the researchers who study disinformation can’t come up with a coherent definition of the term.

Joe Bernstein

[Laughs] What scholars would say is that they have a lexical problem. Everyone knows there is a problem, but everyone is attacking the problem using the same words with different ideas in their heads.

So the most comprehensive survey of the scholarly field is from 2018. It’s a review of the scientific literature called “Social Media, Political Polarization, and Political Disinformation.” And the definition it gives of misinformation – and this is a good, comprehensive survey of the field – is information that “may lead to misconceptions about the state of the world.”

Now, as far as I can tell, that definition applies to basically anything you might come into contact with online. And Sean, I should say, it trickles down to the definitions that technology companies use when they define mis- and disinformation. So – I may not have it exactly right – but TikTok’s definition of misinformation is something like “information that is not true or that can be misleading.” There just isn’t much there. There is a lot of good research, but for something that wants to be an objective science, there is no good objective basis.

Sean Illing

The big problem here is that we’re desperate for a neutral definition of disinformation, so that something can be called “disinformation” without it seeming political. But that doesn’t appear to be possible.

Joe Bernstein

Yes. And there was an interesting point I came across when I looked at the etymology of the word – it was actually borrowed from a Russian word that became popular in the early years of the Cold War: dezinformatsiya. It was originally defined in the 1952 Great Soviet Encyclopedia, which treated it as a weapon of the West: “The capitalist press and radio make extensive use of dezinformatsiya.”

I don’t want to be a complete relativist and say that nothing is true or false. Of course some things are. But on the internet in particular, context is very, very important, and it’s very difficult to classify particular nuggets of information as good or bad information.

Sean Illing

Is there a good definition of “disinformation”? How does it differ from “misinformation” or “propaganda”?

Joe Bernstein

I prefer the word propaganda to mis- and disinformation because I think it has a strong political connotation. I think there’s a broad understanding, among both the people who study it and the people who talk about mis- and disinformation in the media, that disinformation is more intentional than misinformation, and that malinformation is badly contextualized but still true, or “true,” information.

What I wanted to do with this piece is make clear that there is a politics behind these definitions and behind the way people use them. I don’t think there’s necessarily anything wrong with using these terms, as long as it’s clear that there are interests behind them.

And I’m not alleging any kind of broad conspiracy. I’m happy to say – maybe I didn’t say it enough – that there are people who are working in complete good faith, who care deeply about public discourse, who are studying this problem. I just want some acknowledgment that there is a politics behind the use of these terms, even if it’s a centrist or conventional liberal politics. I want that to be a feature of the discussion.

Sean Illing

A big claim in your piece is that the panic over disinformation has become a vehicle for propping up the online advertising economy. It may seem counterintuitive that big tech companies like Facebook would enthusiastically embrace the idea that “disinformation” is a big problem.

What does a company like Facebook stand to gain here? Why are they selling this so hard?

Joe Bernstein

Well, one of the ways I started thinking about this was through a buzzword I’ve used myself: the “information ecosystem.” It just kind of makes sense. We have a natural world of information, and then something pollutes it. And that got me thinking about other industries that pollute, industries that have had problems with pollution.

So take the tobacco industry – which has recently become a major point of comparison for big tech – well, cigarettes give people cancer. Or the fossil fuel industry, which pollutes and contributes to climate change. And there is good science behind those claims. And yet these industries spent years fighting the science, trying to undermine it.

And after Facebook was blamed for throwing the 2016 election to Trump and for Brexit, I was amazed at how little time it took for Mark Zuckerberg to publicly admit that misinformation was a problem. And maybe it’s true, but I don’t think the science is necessarily there. I don’t think the study of media effects in politics is necessarily there yet.

I mean, we’re still getting political science about the impact of Father Coughlin on the 1936 election. These are questions that get worked out over long stretches of time. And yet there was Mark Zuckerberg saying in public, “We’re going to fight misinformation.”

In part, I think that’s because Facebook has never had a particularly consistent press strategy. But part of it, I think, is that Facebook, like other big tech companies, realized very quickly that instead of saying in a kind of blanket way, “These claims aren’t true, there’s no empirical basis behind them,” it was a better strategy to co-opt the people who were doing this research, or at least to lay down their arms.

And I wondered why. From a PR perspective, it makes sense. But I also began to think about the nature of the claim that people who encounter bad information must be persuaded by it. And then I had a “eureka” moment: that is exactly how Facebook makes money. What Hannah Arendt called “the psychological premise of human manipulability” is a kind of gospel there.

And so, when we accept the idea that people endlessly believe whatever they see on Facebook, or on the internet generally, we are in some ways reinforcing the idea that ad copy works – that Facebook’s and Google’s ads, and online advertising in general, work.

I’m digressing a bit, but there’s a great book I read around that time by a guy who is now general counsel at Substack, a man named Tim Hwang, who worked at Google for a long time. The book is called Subprime Attention Crisis, and it’s basically about what a house of cards the online advertising industry is.

One very interesting piece of the Facebook whistleblower’s disclosures to the SEC, and one that has received almost no press attention, is her claim that, based on internal Facebook research, the company has misled investors about the reach and effectiveness of its advertising. And to me, the most damaging thing you can say about Facebook is that this industrial-scale information machine doesn’t actually work.

And so that kind of flipped everything I thought about this on its head. And that’s when I started writing the piece.

To hear the rest of the conversation, click here, and be sure to subscribe to Vox Conversations on Apple Podcasts, Google Podcasts, Spotify, Stitcher, or wherever you listen to podcasts.


