
The role of social media companies in the 2024 election
Clip: 2/10/2024 | 6m 18s
Can social media companies safeguard the 2024 election against misinformation?
As 2024 election campaigns pick up steam, Meta announced this week that it would start labeling AI-generated images posted on Facebook, Instagram and Threads. In December, an advocacy group said Meta, YouTube and X have rolled back 17 policies intended to protect against hate speech and misinformation. John Yang speaks with Katie Harbath, former Facebook public policy director, to learn more.
JOHN YANG: Earlier this week, as the 2024 election campaigns picked up steam, Meta announced it would start labeling AI-generated images that appear on Facebook, Instagram and Threads to help users better judge what they're seeing.
In December, an advocacy group called Free Press said Meta, YouTube and X have rolled back a total of 17 policies intended to protect against hate speech and misinformation.
The group also said layoffs at the three companies make it harder to enforce the safeguards that remain.
Katie Harbath is the chief global affairs officer at Duco, which is a technology consulting firm.
She's also a former public policy director at Facebook.
Katie, this announcement from Meta, how big a deal is it?
How helpful is this going to be, and is it enough?
KATIE HARBATH, Chief Global Affairs Officer, Duco: So I think that this is just one of many important steps that all online platforms have to take when thinking about safeguarding elections, not only here in the United States, but around the globe.
And so we are facing new challenges with AI, as well as existing challenges that have been around for a long time, such as misinformation, foreign interference, and transparency around political ads.
And I think one of the challenges as we go into this cycle is the question of whether or not these platforms are prepared. We just don't know, because they are putting in a lot of investment and making a lot of changes, as you mentioned.
But we also don't know the twists and turns that await us over the course of this next year.
And so the real proof of this will be how these companies act as these different elections happen and as we see different forms of nefarious interference happen.
JOHN YANG: Given those unknowns, do we start this election season more vulnerable or less vulnerable than we were in 2020?
KATIE HARBATH: It's kind of hard to say if it's more or less.
It's just different.
You know, on top of the legacy platforms like Meta and Google, you also have a lot of newer platforms that, for the first time, are having to write the policies and build the tools for elections like these.
And so it's more of a kaleidoscope of different policies that we're looking at.
And so that's what makes some of this so uncertain and uncomfortable as we go into this.
But I think at the bare minimum, we at least know that these companies are paying attention to it.
JOHN YANG: Talk about those differences, the new fellas on the block, as opposed to the legacy platforms.
Do you have any sense of how they are developing their policies and how they compare to the legacy companies?
KATIE HARBATH: Absolutely.
Some, such as TikTok, have invested a lot into this.
In a recent hearing, their CEO mentioned they're going to invest $2 billion in trust and safety.
Meta has said that they've spent $20 billion over the last, I think, five or six years.
You also have some platforms like TikTok deciding just to not allow political ads altogether.
Others are choosing to try to also deemphasize politics, whereas platforms like Substack are trying to lean into politics.
Some are taking a very hands off approach to content moderation, and others are putting a lot more time and effort into it.
And so this makes it really challenging to understand the differences between all of these platforms, as they're all trying to learn the lessons from the past, but also understand the responsibility that they have going into this.
JOHN YANG: Two of the legacy platforms, YouTube and Meta, say they're allowing ads now that do question the validity of the 2020 election, which, of course, led to the January 6 assault on the Capitol.
At the same time, Meta says they are not allowing ads that would weaken or question current ongoing elections or future elections.
Is that slicing it pretty thin, do you think?
I mean, can they really make that distinction?
KATIE HARBATH: All of these policies are about nuance and trade-offs.
Even defining what is political is a very difficult thing to do.
And it's in these nuances where you actually see more of the difficult calls and more of the disagreement around it, and where you see challenges for these companies in enforcement of trying to determine what is or is not allowed.
And so we should never expect 100 percent perfection from these companies in trying to enforce this.
And it's really hard to take a blunt approach, because if you do, you actually end up taking down more legitimate speech than many people would probably want.
JOHN YANG: 2024 has one of the greatest concentrations of elections around the world in a long time.
Are platforms doing enough to protect those international elections, as well as the elections here in the United States?
KATIE HARBATH: This is one of the biggest open questions for me.
Some of the platforms, such as Meta, have actually announced things that they're doing for the upcoming Indonesian elections and for the Mexico elections coming up, but they're just not being as transparent about what that means around the globe.
Traditionally, we have seen just a lot more time, money and effort put into the English language, into elections in the U.S. and the EU.
And I am worried that so much attention is going to get sucked into those elections that we will forget about these elections all around the globe that will have just as much impact in sort of the future of the global world order and how we handle issues on everything from climate change to other geopolitical issues.
JOHN YANG: Do you think this sort of self-regulation, this self-policing by these platforms, is going to be enough, or do you think the government's going to have to step in with regulation?
KATIE HARBATH: Everyone has a role.
This technology is moving so fast that you kind of have to start with self-regulation.
You're seeing that with AI issues right now.
But we absolutely need the government, too.
And I think one of the problems or challenges with all of this is that no one's quite comfortable with the tech companies being in charge.
No one's comfortable with the government being in charge or others.
And instead, I think we need to build a system of checks and balances so that we're holding all these different entities accountable for the role that they need to play in safeguarding our information environment and elections overall.
JOHN YANG: Katie Harbath of the technology consulting firm Duco, thank you very much.
KATIE HARBATH: Thank you.