Proposals to change recommendations and curb conspiracies were sacrificed for engagement, staff say.
A year ago, Susan Wojcicki was on stage to defend YouTube. Her company, hammered for months for fueling falsehoods online, was reeling from another flare-up involving a conspiracy theory video about the Parkland, Florida, high school shooting that suggested the victims were “crisis actors.”
Wojcicki, YouTube’s chief executive officer, is a reluctant public ambassador, but she was in Austin at the South by Southwest conference to unveil a solution she hoped would help quell conspiracy theories: a tiny text box, drawn from websites like Wikipedia, that would sit below videos questioning well-established facts, such as the moon landing, and link viewers to the truth.
Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than $16 billion a year. But on that day, Wojcicki compared her video site to a different kind of institution. “We’re really more like a library,” she said, staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”
Since Wojcicki took the stage, prominent conspiracy theories on the platform—including one on child vaccinations; another tying Hillary Clinton to a Satanic cult—have drawn the ire of lawmakers eager to regulate technology companies. And a year later, YouTube is even more associated with the darker parts of the web.
The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.
Wojcicki and her deputies know this. In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of “alt-right” video bloggers, created an internal vertical that showed just how popular they were. Each time they got the same basic response: Don’t rock the boat.
The company spent years chasing one business goal above others: “Engagement,” a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.
Wojcicki would “never put her fingers on the scale,” said one person who worked for her. “Her view was, ‘My job is to run the company, not deal with this.’” This person, like others who spoke to Bloomberg News, asked not to be identified for fear of retaliation.
YouTube turned down Bloomberg News’ requests to speak to Wojcicki, other executives, management at Google and the board of Alphabet Inc., its parent company. Last week, Neal Mohan, its chief product officer, told The New York Times that the company has “made great strides” in addressing its issues with recommendations and radical content.
A YouTube spokeswoman contested the notion that Wojcicki is inattentive to these issues and that the company prioritizes engagement above all else. Instead, the spokeswoman said the company has spent the last two years focused squarely on finding solutions for its content problems. Since 2017, YouTube has recommended clips based on a metric called “responsibility,” which includes input from satisfaction surveys it shows after videos. YouTube declined to describe the metric more fully, but said it receives “millions” of survey responses each week.
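YouTube has not published how “responsibility” is actually computed, so the sketch below is purely illustrative: a toy Python model of what blending classic engagement signals (views, watch time, interactions) with post-watch satisfaction surveys could look like. Every name and number here (VideoStats, responsibility_score, the satisfaction_weight parameter, the 100-response cutoff) is a hypothetical assumption for illustration, not the company’s actual system.

```python
# Hypothetical sketch only: YouTube has not disclosed its "responsibility"
# formula. This illustrates the general idea reported above -- blending raw
# engagement with satisfaction-survey results, so a clip that is watched a
# lot but rated poorly gets demoted rather than amplified.

from dataclasses import dataclass

@dataclass
class VideoStats:
    views: int
    avg_watch_seconds: float   # average time spent per view
    interactions: int          # likes, comments and shares combined
    survey_responses: int      # completed post-watch satisfaction surveys
    survey_satisfied: int      # responses rating the video as satisfying

def engagement_score(v: VideoStats) -> float:
    """Classic engagement: views weighted by watch time plus interactions."""
    return v.views * v.avg_watch_seconds + 10.0 * v.interactions

def responsibility_score(v: VideoStats, satisfaction_weight: float = 0.5) -> float:
    """Blend engagement with survey satisfaction (illustrative formula only).

    With too few survey responses, fall back to a neutral 0.5 rating
    rather than trusting a noisy estimate.
    """
    if v.survey_responses >= 100:
        satisfaction = v.survey_satisfied / v.survey_responses
    else:
        satisfaction = 0.5  # not enough data; assume neutral
    # Multiplicative blend: low satisfaction drags down even huge engagement.
    return engagement_score(v) * (satisfaction ** (1.0 / satisfaction_weight))

# Under these toy numbers, a conspiracy clip with big watch time but poor
# ratings ranks below a smaller, well-rated video.
viral_junk = VideoStats(1_000_000, 300.0, 50_000, 5_000, 500)
solid_clip = VideoStats(200_000, 240.0, 20_000, 2_000, 1_800)
print(responsibility_score(viral_junk) < responsibility_score(solid_clip))  # True
```

The design point the sketch makes is the one at issue in the article: a pure engagement ranking rewards the viral clip, while any metric that multiplies in a satisfaction term can invert that ordering.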
“Our primary focus has been tackling some of the platform’s toughest content challenges,” a spokeswoman said in an emailed statement. “We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies — we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.”
In response to criticism about prioritizing growth over safety, Facebook Inc. has proposed a dramatic shift in its core product. YouTube, meanwhile, has struggled to explain any new corporate vision to the public and investors – and sometimes, to its own staff. Five senior personnel who left YouTube and Google in the last two years privately cited the platform’s inability to tame extreme, disturbing videos as the reason for their departure. Within Google, YouTube’s inability to fix its problems has remained a major gripe. Google shares slipped in late morning trading in New York on Tuesday, leaving them up 15 percent so far this year. Facebook stock has jumped more than 30 percent in 2019, after getting hammered last year.
YouTube’s inertia was illuminated again several weeks ago, after a deadly measles outbreak drew public attention to vaccination conspiracies on social media. New data from Moonshot CVE, a London-based firm that studies extremism, found that fewer than twenty YouTube channels that have spread these lies reached over 170 million viewers, many of whom were then recommended other videos laden with conspiracy theories.
The company’s lackluster response to explicit videos aimed at kids has drawn criticism from the tech industry itself. Patrick Copeland, a former Google director who left in 2016, recently posted a damning indictment of his old company on LinkedIn. While watching YouTube, Copeland’s daughter was recommended a clip that featured both a Snow White character drawn with exaggerated sexual features and a horse engaged in a sexual act. “Most companies would fire someone for watching this video at work,” he wrote. “Unbelievable!!” Copeland, who spent a decade at Google, decided to block the YouTube.com domain.