Monday, 26 March 2018

YouTube's A.I. finds that radicalization gets more ad revenue.

Tufekci points to yet another pathological social consequence of A.I. algorithms designed to keep people on a website longer and thus expose them to more of the advertisements that provide revenue. The YouTube algorithm "seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general." The author found that a YouTube video giving straightforward information on a Trump rally was followed by autoplay videos "that featured white supremacist rants, Holocaust denials and other disturbing content." The author created another YouTube account to watch videos of Hillary Clinton and Bernie Sanders, and was soon auto-directed to "videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11." The same pattern emerges with nonpolitical topics: "Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons."
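The mechanism Tufekci describes can be caricatured in a few lines of code. The toy simulation below is my own illustrative sketch, not YouTube's actual system: the single "extremity" score, the watch-time model, and the greedy selection rule are all invented assumptions. It simply shows how a recommender that maximizes expected watch time will ratchet a viewer toward ever more extreme items whenever engagement peaks at content slightly beyond the viewer's current taste.

```python
import random

# Toy catalog: each "video" has an extremity score in [0, 1].
# (Hypothetical data; real recommenders use learned embeddings, not one scalar.)
CATALOG = [{"id": i, "extremity": i / 99} for i in range(100)]

def expected_watch_time(video, viewer_taste):
    """Assumed engagement model: viewers watch longest the content that is
    slightly MORE extreme than what they are used to (the 'rabbit hole' assumption)."""
    gap = video["extremity"] - viewer_taste
    if gap < 0:                                  # milder than current taste -> boring
        return 1.0 + gap
    return 1.0 + 2.0 * gap - 4.0 * gap * gap     # peaks a bit beyond current taste

def recommend(viewer_taste, candidates):
    """Greedy engagement maximizer: pick the candidate with the highest predicted
    watch time. Note there is no notion of accuracy or social cost anywhere."""
    return max(candidates, key=lambda v: expected_watch_time(v, viewer_taste))

def simulate(start_taste=0.1, steps=15, seed=0):
    random.seed(seed)
    taste = start_taste
    for step in range(steps):
        candidates = random.sample(CATALOG, 20)          # slate of candidate videos
        choice = recommend(taste, candidates)
        taste = 0.7 * taste + 0.3 * choice["extremity"]  # taste drifts toward what is watched
        print(f"step {step:2d}: recommended extremity={choice['extremity']:.2f}, "
              f"viewer taste={taste:.2f}")

if __name__ == "__main__":
    simulate()
```

Running it shows the recommended extremity and the viewer's taste climbing step by step toward the top of the scale, even though the viewer started near the mild end and the system never "intended" radicalization; it only chased watch time.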
...a former Google engineer named Guillaume Chaslot...worked on the recommender algorithm while at YouTube...The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.
It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.
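The article does not detail how Mr. Chaslot's tracking program worked. A crawler of the general shape sketched below (my assumption of the approach, with `get_recommendations` left as a stub so as not to invent YouTube API calls) would be enough to record recommendation chains from different starting points and surface the videos the system pushes hardest.

```python
from collections import Counter
from typing import Callable

def crawl_recommendations(
    seed_ids: list[str],
    get_recommendations: Callable[[str], list[str]],  # stub: returns "up next" video ids
    hops: int = 5,
    branch: int = 3,
) -> Counter:
    """Follow recommendation chains from each seed and count how often each video
    is recommended. The most common entries approximate the content the recommender
    funnels viewers toward, regardless of where they started."""
    counts: Counter = Counter()
    frontier = list(seed_ids)
    for _ in range(hops):
        next_frontier = []
        for vid in frontier:
            recs = get_recommendations(vid)[:branch]
            counts.update(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts

# Example with a fake, hard-coded recommendation graph (purely illustrative):
if __name__ == "__main__":
    fake_graph = {
        "clinton_speech": ["debate_clip", "conspiracy_1"],
        "trump_rally": ["debate_clip", "conspiracy_1"],
        "debate_clip": ["conspiracy_1", "conspiracy_2"],
        "conspiracy_1": ["conspiracy_2"],
        "conspiracy_2": ["conspiracy_1"],
    }
    counts = crawl_recommendations(
        ["clinton_speech", "trump_rally"],
        lambda v: fake_graph.get(v, []),
    )
    print(counts.most_common(3))
```

In the toy graph, chains started from either a pro-Clinton or a pro-Trump seed end up counting the same conspiratorial videos most often, which is the kind of asymmetric funneling pattern Chaslot reported.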
Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.
YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims.
What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.


from Deric's MindBlog https://ift.tt/2pFr5My
via https://ifttt.com/ IFTTT
