Can YouTube’s Users Help the Company Deal With Its “Moral” Problem?*
This month’s mini-case described the dilemma faced by Susan Wojcicki, CEO of YouTube: how far to promote “sustainability,” viewership, and (to some) free speech at the risk of losing what some see as the moral high ground, and with it long-term viewership.
As realJoannLyles put it, “YouTube t-h-i-n-k-s it has a suppression problem. Instead, YouTube has a moral problem …” David Wittenberg added, “There's no ‘right answer’ here. No matter what YouTube does or doesn't do, someone is bound to be offended, outraged, short-changed, or disappointed…”
Other comments reflected the nature of the dilemma, along with advice about what Wojcicki should do.
Philippe Gouamba’s comment framed the dilemma nicely: “As this Nation still believes in freedom of speech and hopefully will continue to do so in the foreseeable future, let there not be any sort of attack on this very fundamental right. On the other hand social media platforms still have an obligation to the public that they serve… YouTube needs to redouble its efforts and be totally relentless in its quest to position itself on the moral high-ground.” Shelley D. Chuchmuch put it succinctly in saying, “Free speech, free thought and the right to be heard. How far do we go to secure the block chain of information to ensure that content is free from hate?”
Gene identified the misuse of technology as the challenge, commenting: “YT's philosophy is representative of the Silicon Valley's culture as a whole—‘an algorithm solves everything’ and ‘just put it out and we'll fix it later.’ In traditional media, for example, considerable debate can go into the production of a program ... With YouTube, Facebook, and others, there is no mechanism to filter content based on good human judgment.” realJoannLyles added, “Tech can't make moral distinctions. Therefore, when a company fails to make moral distinctions, expect tech to fail in the same way. As one person, Susan Wojcicki is limited by her own instincts and biases. She and Google should start prioritizing the voices of their teams…”
David Wittenberg would have YouTube look to its users for a solution to the dilemma. As he put it, “My preferred solution errs on the side of personal liberty and adds the element of information. I'd institute a rating system like the ones currently in place for music, video games, and motion pictures. However, instead of a rating board, I'd let the users of YouTube determine the ratings. I'd want YouTube to develop a system for age-checking so that children were not exposed to (XXX) content rated that way. Of course, child pornography and illegal content would continue to be barred… People already consume tabloid newspapers, racist propaganda, biased news broadcasts, and pornography willingly. Let them at least know how the general public categorizes such materials… Everyone benefits from such a system. Viewers can find the content that they want to see and avoid what they don't want to see, publishers can attract the audiences they want, and commercial activity can grow accordingly.”
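The mechanics of what Wittenberg proposes could be quite simple: aggregate viewer votes into a consensus rating, then gate restricted ratings behind an age check. Here is a minimal sketch in Python; the rating categories, vote threshold, and age-gate rule are invented for illustration and come from neither Wittenberg nor YouTube:

```python
# Hypothetical sketch of user-determined content ratings, in the
# spirit of Wittenberg's proposal. Categories, the vote threshold,
# and the age-gate rule are all invented for illustration.
from collections import Counter

AGE_GATED = {"XXX"}  # ratings that require a verified-adult viewer

def consensus_rating(votes: list[str], min_votes: int = 50) -> str | None:
    """Return the majority user rating, or None until enough votes exist."""
    if len(votes) < min_votes:
        return None  # too little signal; leave the video unrated for now
    rating, _ = Counter(votes).most_common(1)[0]
    return rating

def may_view(rating: str | None, age_verified: bool) -> bool:
    """Gate age-restricted ratings; treat unrated content as adult-only."""
    if rating is None or rating in AGE_GATED:
        return age_verified
    return True

if __name__ == "__main__":
    votes = ["PG"] * 40 + ["XXX"] * 60
    print(consensus_rating(votes))               # "XXX"
    print(may_view("XXX", age_verified=False))   # False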
Can YouTube’s users help the company deal with its “moral” problem? What do you think?
*For the record, Susan Wojcicki tweeted in response to the column, “My #1 priority is responsibility, even if that comes at the expenses (sic) of growth.”
Original Column
On April 2, 2019, the leadership team at YouTube, headed by CEO Susan Wojcicki (“The most powerful woman on the internet,” according to Time magazine), was confronted with the posting on Bloomberg.com of an article by Mark Bergen titled “YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant.”
The article noted recommendations from current and former employees about ways to alter policies designed to increase “user engagement,” or the amount of time spent on the YouTube website. The policies, it was alleged, had led to the site hosting inappropriate material. Employees believed that their proposals had been ignored in management’s quest for higher revenues and profits.
YouTube is the video website and Google subsidiary on which creators post their content, host advertising, and build small (or not-so-small) businesses, all the while contributing to the Company’s revenues, recently estimated at about $16 billion annually. It is now the world’s most popular video site: millions of hours of content are posted daily for viewing by hundreds of millions of visitors, some of them subscribers to YouTube and creators’ sites. On the surface, YouTube’s business model appears to be the work of genius. But it has posed problems.
One problem was lack of profitability. Wojcicki, who got to know Google’s founders when she leased her garage to them for Google’s first office, became CEO of YouTube in 2014, facing the task of bolstering the Company’s growth and financial performance. Two years earlier she had come up with the controversial insight that a more profitable business model for YouTube should be based not on the number of eyeballs visiting the site but on the length of time those eyeballs remained engaged.
According to Google investor John Doerr, she inherited a stretch goal set in 2012 to increase viewer engagement by ten times in four years—to one billion hours of viewing daily. This would require greater promotion of the site to creators of content, some of whom were raising issues about their split of the revenue. It also required a rewrite of the recommendation algorithm that led viewers to related videos, thus building engagement. Google-trained artificial intelligence engineers were capable of calibrating the recommendation algorithm to do this. Decisions were based on the knowledge that “outrage equals attention” and that “clickbait” (misleading headlines for content of dubious value) had to be minimized to build viewer loyalty.
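To make the shift concrete, here is a minimal sketch of the difference between ranking recommendations by predicted clicks (eyeballs) and by expected watch time (engagement). All names and numbers are invented; this illustrates the objective described above, not YouTube’s actual system:

```python
# Hypothetical sketch contrasting a click-maximizing ranker with a
# watch-time-maximizing one. Nothing here reflects YouTube's actual
# recommendation algorithm; all names and weights are invented.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float           # predicted probability the viewer clicks
    expected_minutes: float  # predicted minutes watched if clicked

def rank_by_clicks(candidates: list[Candidate]) -> list[Candidate]:
    # Eyeball-count objective: maximize clicks, which rewards clickbait.
    return sorted(candidates, key=lambda c: c.p_click, reverse=True)

def rank_by_watch_time(candidates: list[Candidate]) -> list[Candidate]:
    # Engagement objective: maximize expected minutes of viewing,
    # i.e., p(click) * expected watch time given a click.
    return sorted(candidates,
                  key=lambda c: c.p_click * c.expected_minutes,
                  reverse=True)

if __name__ == "__main__":
    pool = [
        Candidate("clickbait", p_click=0.30, expected_minutes=1.0),
        Candidate("long_form", p_click=0.10, expected_minutes=12.0),
    ]
    print([c.video_id for c in rank_by_clicks(pool)])      # clickbait first
    print([c.video_id for c in rank_by_watch_time(pool)])  # long_form first
```

Under the second objective, a video that holds viewers for twelve minutes outranks one that merely attracts clicks, which is why watch time, rather than visits, became the metric to optimize.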
Unfortunately, cute puppy videos weren’t the only things producing engagement. Content such as hate speech and unsupported conspiracy theories was among the most popular viewer “draws” as well. One big step, for example, involved barring Alex Jones’ Infowars site and its contents (periodic hate speech) from the service, even though it was a big earner for him and for YouTube. Jones complained that his freedom of speech was being suppressed. The move was prompted by complaints from the public as well as criticism from YouTube employees. But YouTube continued to feature “borderline” material containing clearly false claims: that the planet is flat, that bogus cancer cures work, and that the United States government was involved in the 9/11 attacks.
Employee suggestions regarding ways of dealing with questionable material continued to surface; some former employees complained that the suggestions were ignored. According to one person who worked for Wojcicki, her attitude was that she would “never put her fingers on the scale.” Her view was, “My job is to run the company, not deal with this.” (This coincided with her view of YouTube as a “library” rather than a media site with different responsibilities.)
While accepting the notion that hate speech had no place on the site, Wojcicki rejected the idea that conspiracy theories or other borderline content such as “Every Cancer Can Be Cured in Weeks” should be barred. Instead, she endorsed another employee proposal to post “Info Boxes,” many containing blurbs from Wikipedia, to counter claims made for bogus cures or by conspiracy theorists.
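The misfires described below are easy to picture if the matching behind such boxes is naive. Here is a minimal sketch of keyword-based Info Box attachment and how it can tag the wrong video; the trigger terms and blurbs are invented, and this is not YouTube’s actual system:

```python
# Hypothetical sketch of naive keyword-based "Info Box" matching.
# Trigger terms and blurbs are invented for illustration; this is
# not YouTube's actual system.

INFO_BOXES = {
    # crude trigger terms -> encyclopedia blurb to attach
    ("cure", "cancer", "hoax"): "Wikipedia: No single cure for cancer exists ...",
    ("fire", "towers", "collapse"): "Wikipedia: The September 11 attacks were ...",
}

def attach_info_box(title: str) -> str | None:
    """Attach the first Info Box whose trigger terms overlap the title."""
    words = set(title.lower().split())
    for triggers, blurb in INFO_BOXES.items():
        if words & set(triggers):
            return blurb
    return None

# Misfire: a video of the Notre Dame fire trips the 9/11 box because
# the matcher keys on surface terms ("fire," "collapse"), not meaning.
print(attach_info_box("Notre Dame cathedral fire and collapse"))
```

A matcher that keys on surface features rather than meaning will pair unrelated events that merely share vocabulary, which is one plausible reading of the Notre Dame incident that follows.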
This involved a complex algorithm utilizing artificial intelligence that didn’t always work. For example, when Notre Dame burned several weeks ago, sites showing the fire carried a curious “Info Box” referring to 9/11 with no explanation, suggesting a sinister relationship of some kind. Other snafus, such as recommendations on the children’s YouTube service that directed children watching cute pet tricks to a site featuring animals having intercourse, suggested, as Harvard University fellow Brittan Heller put it, “They don’t know how the algorithm works …” It was also thought that clever creators of borderline material were designing their videos to trick the algorithm.

Growing responsibly?
At one time, content creators had been YouTube’s biggest problem, complaining about poor revenue splits from subscribers and advertisers. The stretch growth goal then led to difficult negotiations with Google’s leadership over more investment in the massive bandwidth the Company required. Ways of achieving higher engagement in pursuit of that goal next absorbed most of management’s attention, producing complaints about inappropriate material on the site even as others cautioned YouTube not to suppress freedom of speech. Now employee claims of having their recommendations ignored were surfacing, and all of this was attracting increased attention from government regulators. Clearly, the list of management challenges was growing longer.
In his book, Doerr points out that from the start of her YouTube assignment, Wojcicki “didn’t want to grow at any cost—she wanted to do this responsibly.” Perhaps in response to growing criticism, a February 2019 Google release reminded everyone that “The YouTube company-wide goal is framed not just as ‘Growth’ but as ‘Responsible Growth.’”
After the Company barely met its 2016 stretch goal, rumors arose of a new goal: achieving viewership greater than that of all commercial television (roughly five times larger).
Issues and pressures facing the Company in May 2019 suggest the question: What should the leadership of YouTube do? What do you think?
References:
John Doerr, Measure What Matters: How Google, Bono, and the Gates Foundation Rock the World with OKRs (New York: Portfolio/Penguin, 2018), pp. 154-171.
Mark Bergen, “YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant,” Bloomberg.com, April 2, 2019.
Kevin Roose, “YouTube’s Struggle to Silence the Stars Who Sell Conspiracy,” The New York Times, February 19, 2019, pp. B1-B2.