Anthony Longhitano and TEDxAsburyPark speaker Philip Bump talk about “How Subjective Media Replaced Institutions,” which will be presented at TEDxAsburyPark on May 18, 2019.
The following is an excerpt from the interview. To read and hear the full interview, CLICK HERE
Anthony Longhitano: Welcome, this is Expert Open Radio. I’m Anthony Longhitano, and I’m a volunteer with TEDxAsburyPark. Today we are talking with Philip Bump, who is scheduled to speak at this year’s TEDxAsburyPark conference on May 18th. Welcome, Philip.
Philip Bump: Thank you, sir.
Anthony Longhitano: Philip, why don’t we start by having you tell us a little bit about your background.
Philip Bump: Sure. I’m a national correspondent for the Washington Post. Prior to that I worked for a website called The Atlantic Wire, which is essentially a news blog for The Atlantic Magazine. And prior to that I worked for an environmental site called Grist. I’ve done a number of things which I think have helped inform my ability to report, particularly in politics, for the Post.
Anthony Longhitano: Why don’t you introduce the topic that you’re going to be talking about at our conference on May 18th.
Philip Bump: I’m going to be walking through the ways in which the American media landscape has evolved away from public trust in institutional media and into a shattered system of trust that lets in any number of random actors, with potential, and at times demonstrated, negative effects. The talk tries to look at how politics, partisanship, the Internet, all of these things have combined with the lack of confidence in institutions broadly, to give us a scenario in which there are a lot of people who don’t have confidence in the information they receive, and so they put confidence in faulty information.
Anthony Longhitano: And Philip, what’s the title of your talk?
Philip Bump: “How Subjective Media Replaced Institutions.”
Anthony Longhitano: What has brought you to focus attention on this at this point, especially with our TEDx conference?
Philip Bump: Well, I think it fit neatly with the theme of chaos. There certainly is an element of chaos in the current media landscape. But, more broadly, it’s something that, as someone who works for mainstream media organizations, I’ve paid a lot of attention to over the course of the past several years. Since prior to the 2016 election, I’ve seen lots of examples of ways in which people have seized upon information that is faulty, that is inaccurate. And since the 2016 election in particular, I’ve seen a lot of really negative effects of that. Even recently, the shooting at the synagogue down in Southern California is a direct offshoot of people putting confidence in a media ecosystem that spreads hate, spreads bigotry, and spreads and encourages violence. That is a direct result of the Internet, it is a direct result of people losing confidence in the validity of institutions, and I think it’s worth noting, particularly now.
Anthony Longhitano: The Internet has been around for years, and people have not vetted information, shall we say, on their own, for a long time. Why do you think, in recent years, we have a phenomenon where it seems like the Internet is out of control?
Philip Bump: I’d say a couple of reasons. The first is the advent of social media, which, in addition to what it does fairly obviously, sharing photos and so on and so forth, is the thing that’s really differentiated the mass media. Back in the 1980s it was not necessarily that reporters were everywhere in the way that they are currently. It was that the Washington Post had a distribution mechanism that most people didn’t have. ABC News had a distribution mechanism, namely the airwaves, that a lot of people didn’t have. Radio stations had a distribution mechanism people didn’t have. That’s gone out the window with the advent of smartphones and social media. People anywhere can post news any time about anything, and people have grown accustomed to seeing random Twitter users being the ones to break news. I mean, the guy who was sitting in Abbottabad observed the Osama bin Laden raid, right? He was the first person to report that happening, because he was on Twitter at the time. We’re used to seeing that, and what happens then, as a result, is we’ve built these little ecosystems, these little environments in which people can share information, and can distribute broadly to a large group of people what it is they’re seeing and hearing. A lot of times that is not accurate. For example, there have always been people who were into dressing up as animals; we call them ‘furries’ now. And the only reason we know the furries exist is because furries went online, and they found that they had this community of like-minded people. And now there are entire conventions where people dress up like animals. No one would ever have expected that. The Internet made that possible. I wouldn’t necessarily say it’s an innocent example, but it’s not something that is particularly nefarious; that’s a community that emerged because of the Internet.
Now what we’re seeing, though, is a lot of communities that are nefarious emerging… we’re seeing white nationalists, we’re seeing the alt-right, we’re seeing people who have motivations of hate. They, too, can find communities, and those communities not only help them share information, but they also provide a support structure. And so, in the same way that a ‘furry’ can go into a community and say, “You know what, you’re not so weird, I also enjoy dressing like a wolf,” now Nazis can go online and say, “You know what, you’re not so weird, I also think that people who aren’t white should be murdered.” And that, obviously, is a broadly negative thing for the country and the world.
Anthony Longhitano: Is there a case to be made that social media platforms have a responsibility to somehow try to ensure that the information that is shared on their platforms is curated, whether by a human being, a set of human beings, or some kind of AI algorithm?
Philip Bump: Yeah, I think they acknowledge that. Facebook, YouTube (which is owned by Google), and Twitter have all taken steps to try and root this stuff out. A good example is the rise of the Islamic State: the Islamic State was very, very deliberate about trying to reach out to people and build a community of like-minded individuals on social media. And the platforms got together, and they came up with a way of tamping that down. It hasn’t been 100% effective, but it has been very, very effective. The challenge is that the ideology and the means and the rhetoric of the Islamic State are much less integrated into American culture than are things like white nationalism, and hate, and very extreme political views. Those things are part of American culture in a way that the Islamic State isn’t. And so I think it’s proven more difficult for the platforms to weed those things out in a timely fashion.
Anthony Longhitano: Is there also a case where some of the functionality of the social media platforms that allows the amplification of these messages is actually built into the economic model of the platforms themselves? Not necessarily relative to the negative message itself, but obviously the more clicks, the more advertising, and the more advertising, the more revenue there is for the platform. The algorithms that determine whether something is popular, and therefore drive more attention from the audience, are the same mechanisms that can allow for the amplification of these hate messages.
Philip Bump: That’s exactly right. Over the course of the 2016 campaign we saw lots of examples where people understood that they could make money by sharing nonsense about the election. People were able to leverage Facebook to share this stuff, and Facebook got a cut of the profits from that. Facebook made money off the Russians who were trying to interfere with the election, right? They were paid money to run those ads. But I think over the long term, it is not good for the Facebook brand to be associated with things like genocide. It’s not helpful for Facebook, or for Google, or for Twitter to be seen as a place that coddles white supremacists. So while there is an economic incentive for them to allow people to share whatever they want to share, and they do, there is also an economic disincentive to allow people to share things which are obviously harmful to society.
Anthony Longhitano: How much of this phenomenon would you attribute to the fact that users are not necessarily going the extra step to vet information, to see if it’s accurate or not?
Philip Bump: That’s a huge problem. We’ve seen studies that look at the extent to which people accept or reject untrue information. There appears, according to some studies, to be some correlation between age and receptiveness to sharing or accepting untrue information. And that may itself be a side effect of having grown up in a media ecosphere in which you could generally trust what you read. If you grew up thinking that anything you read was reliable, and now you’re reading news sites that you may not have heard of before but that are sharing information, you may be more willing to trust those things. It certainly is the case that people need to be better attuned to whether or not they are getting accurate information, but part of the problem is that people are also living in a much more polarized political space. Because Republicans are so skeptical of Democrats, and Democrats are so skeptical of Republicans, people tend to seize upon news stories which reinforce their existing biases, and share them before they recognize that they may not be accurate. And so part of it, too, is just that everyone is so amped up, generally speaking, about politics that a lot of these political stories in particular get shared simply because they reinforce people’s biases.
Anthony Longhitano: Is the anonymity of the Internet also something that catalyzes this behavior?
Philip Bump: I think that it certainly is the case with something like 8chan, the message board where both the New Zealand shooter and the guy down in Southern California were active participants. There’s an anonymity there which lets people express really extreme views in a way that they probably wouldn’t if they had to put their names to them. That said, though, we’ve seen lots of other people undertake really horrific acts using their own names, and they aren’t particularly shy about it. So I don’t think anonymity is necessarily a component of people acting badly or sharing negative information, but I think it can, at times, exacerbate how negative the results of those things are.
Anthony Longhitano: Are there other people or resources that you could point our audience to if they are interested in following up on this topic on their own?
Philip Bump: There’s been a lot of good research data about partisan polarization, for example, from the Pew Research Center. There’s been a lot of good analysis of media usage by Pew as well. There are a lot of reporters who focus specifically on hate groups and organizations. Twitter, in general, is a good place to find and follow reporters who cover whatever subject you’re interested in, because reporters spend all day on Twitter. But beyond that, media criticism has shifted, to a large extent, toward trying to figure out how this emerged and how this problem can be fixed. There are a lot of places that have been covering that.
Anthony Longhitano: Do you think this is going to correct itself, with people becoming more aware of it or social media platforms taking action? Or is this something where there’s going to be a need for some kind of governmental regulation, to incent the social media platforms to accelerate their efforts to correct this?
Philip Bump: The social media platforms have indicated a willingness to try and self-police. There’s obviously a broader question of the extent to which any business is going to effectively police itself if doing so could undermine its profit motive. I think we’re going to have to wait and see. I don’t think Congress is going to pass anything over the short term which does much about this, and while President Trump has complained about social media platforms, generally through the lens of the unproven and undemonstrated argument that conservatives are unfairly targeted on the platforms, I think that it is unlikely, at this point, that any legislation will pass. I also think that it is in the companies’ self-interest to try and do as effective a job as they can to prevent that from happening, so that their hands aren’t tied by the government.
Anthony Longhitano: When I was watching and listening to your video application to give the talk at TEDxAsburyPark, the thing that came to my mind is the “yellow journalism” that took place in the late 1800s and early 1900s…
Philip Bump: Sure.
Anthony Longhitano: That was in the traditional, written word of newspapers. The media platforms at the time, the big newspapers, some of them were guilty, because of their own economic self-interest, of propagating false information that got people riled up and caused newspapers to be bought. How did that problem get solved, or did it self-correct?
Philip Bump: I think there was a sort of self-correction by media institutions, in which it became part of the culture to reinforce objectivity and a willingness to self-correct when mistakes were made. When we’re talking about things like William Randolph Hearst at the turn of the last century, who was taking steps to advocate his political positions through his papers and news reporting… that came to be seen as a negative in part because people like Hearst were doing it, right?
I think that World War II also played a role in this. There was a premium on accurate information, and there was a premium placed on America as sort of an idealized place and it was really after World War II that we saw this standard of “Okay, we are going to be objective, we are going to self-correct, we are going to do our best to represent fairly both sides of the issue to people.”
It is absolutely the case that, over the span of American history, the minority of time has been spent in a world where that was the predominant and prevailing attitude of news organizations. What’s different now is not that we are returning to the days of William Randolph Hearst controlling distribution mechanisms and making up stories as he goes; instead, there is no one controlling the distribution of information. It is this massive, massive marketplace, and in this particular case that has meant some bad actors have really been able to thrive. It’s more akin to the world prior to the modern medical industry, when people were selling snake oil on every corner and claiming that it cured things, until the medical industry cracked down and said, “Hey, you can die if you drink this stuff; here are standards that we’re applying,” and so on and so forth, which revised how that world worked. People put confidence in it because they found that they weren’t dying and these snake oil remedies didn’t work. The question is, is there something we can do similarly on the media end to revise how our marketplace works?
Anthony Longhitano: Well Philip, thank you very much for being with us today. This is Expert Open Radio. Here’s a reminder to get your tickets for the largest, highest-rated TEDx conference on the East Coast: TEDxAsburyPark, on Saturday, May 18, 2019. You’ll have an opportunity to hear Philip talk about “How Subjective Media Replaced Institutions” in more depth at the conference. So, thank you again, Philip.
Philip Bump: You bet, thank you.
Graphic created by Kel Grant.