YouTube bans ‘doctored’ content
Worries over ‘deepfake’ videos loom amid US presidential primaries
Published: Feb 04, 2020 05:53 PM

In this file photo taken on May 26, 2010, YouTube headquarters is pictured in San Bruno, California, the United States. (Xinhua/AFP)

YouTube said Monday it would remove election-related videos that are "manipulated or doctored" to mislead voters, as part of its efforts to stem online misinformation.

The Google-owned service said it was taking the measures as it strives to become a "more reliable source" for news and to promote a "healthy political discourse" amid heightened fears over video fakes around the world.

Leslie Miller, YouTube's vice president of government affairs and public policy, said in a blog post that the service's community standards prohibit "content that has been technically manipulated or doctored in a way that misleads users... and may pose a serious risk of egregious harm."

The latest YouTube statement, which seeks to clarify its global policy on election misinformation, was announced as the US presidential primary season kicks off, with caucuses held in Iowa on Monday and the first primary next week in New Hampshire.

The move comes amid growing concern about so-called "deepfake" videos, which are altered using artificial intelligence to create credible-looking depictions of events, as well as "shallow" fakes that use more rudimentary techniques to deceive viewers.

Deepfakes and manipulated content have raised fears over the election process worldwide, amid notable incidents in Britain and India where videos were used for disinformation.

YouTube noted the policy also bans content that aims to mislead people about voting rules or the US census process that is underway.

Online platforms have come under pressure to root out misinformation in the wake of foreign manipulation efforts in the US in 2016 and elsewhere in recent years.

In the US, critics of online platforms have claimed not enough is being done to curb false claims by candidates themselves.

Google last year said it was stepping up efforts on election misinformation and would remove false claims in ads, including on YouTube, but the new statement appeared to offer specifics on certain kinds of content that will be blocked.

"The underlying standards YouTube explains and illustrates today do not appear to be brand new, but the company deserves praise for setting them out in clear terms and warning that it intends to enforce them vigorously," said Paul Barrett of the New York University Center for Business and Human Rights and author of a 2019 study on political disinformation.

The announcement underscores differing policies by major social networks on disinformation. 

Twitter has said that it would ban all political advertisements for candidates, while Facebook has maintained a hands-off policy for political speech and ads, with some exceptions for content that misleads users about voting times and places.

AFP