
Does YouTube radicalize users? This study says no, but it's deeply flawed.

A new study of YouTube's algorithm attracting mainstream attention this weekend claims that the online video giant "actively discourages" radicalization on the platform. And if that sounds suspect to you, it should.

The study flies in the face of everything we know about YouTube's recommendation algorithm. There has been plenty of evidence that it pulls users down a rabbit hole of extremist content. A 2018 study of videos recommended to political viewers during the 2016 election found that an overwhelming majority were pro-Trump. In Brazil, far-right allies of the authoritarian Jair Bolsonaro say that neither he nor they would have won election without YouTube.

Now here comes Australian coder and data scientist Mark Ledwich, who conducted this new study along with UC Berkeley researcher Anna Zaitsev. The pair looked at 768 different political channels and 23 million recommendations for their research. All of the data was pulled from a fresh account that had never viewed videos on YouTube.



In a tweet, Ledwich presents some of their findings using oddly emotive language:

“It turns out the late 2019 algorithm

*DESTROYS* conspiracy theorists, provocateurs and white identitarians.”

Let's break down why the study doesn't measure up.

1. It's woefully limited

The first problem: the study ignores prior versions of the algorithm. Sure, if you’re using the “late 2019” version as proof that YouTube “actively discourages” radicalization now, you may have a point. YouTube has spent the year tweaking its algorithm in response to the evidence that the platform was recommending extremist and conspiratorial content. The company publicly announced this clean-up plan early in 2019.

But in a follow-up tweet, Ledwich says his study "takes aim" at the New York Times, in particular tech reporter Kevin Roose, "who have been on myth-filled crusade vs social media."

“We should start questioning the authoritative status of outlets that have soiled themselves with agendas," Ledwich continues — ironically, after having announced an agenda of his own.

Ledwich’s problem appears to be with Roose's article "The Making of a YouTube Radical." The story's subject, Caleb Cain, started being radicalized by YouTube video recommendations in 2014. Therefore, nothing about the 2019 YouTube algorithm debunks this story. The barn door is open, the horse has bolted.

Cain represents countless individuals who are now subscribed to extremist or conspiracy theory-related content. Creators publishing this content have had years to get a head start. They’ve already benefited from the old recommendation algorithm in order to reach hundreds of thousands of subscribers. These channels are now popular and their content spreads due to that popularity.

Roose hit back against Ledwich in a lengthy Twitter thread.


2. It's clearly slanted

The second problem has to do with the subjective and highly suspect way Ledwich and Zaitsev have grouped YouTube channels. The study categorizes CNN as "Partisan Left," no different from, say, the left-wing YouTube news outlet The Young Turks.

The study described channels in this category as “exclusively critical of Republicans” and said they “would agree with this statement: ‘GOP policies are a threat to the well-being of the country.’”

This is, of course, self-evidently ridiculous. CNN is a mainstream media outlet which employs many former Republican politicians and members of the Trump administration as on-air contributors. It is often criticized, most notably by one of its former anchors, for allowing these commentators to spread falsehoods unchecked.

Naming CNN as "partisan left" betrays partisanship at the root of this study.

Beyond that, the study has other partisan flaws, such as how it groups channels from right-wing partisans like Steven Crowder and Milo Yiannopoulos. The two are labeled simply as nonpartisan "provocateurs" willing to take any position for attention. That is a blatantly false description of both, and an inaccurate grouping for the study's own two examples.

3. It doesn't get YouTube

A third major problem: the researchers appear to not fully understand how YouTube works for regular users.

“One should note that the recommendations list provided to a user who has an account and who is logged into YouTube might differ from the list presented to this anonymous account,” the study says. “However, we do not believe that there is a drastic difference in the behavior of the algorithm.”

The researchers continue: “It would seem counter-intuitive for YouTube to apply vastly different criteria for anonymous users and users who are logged into their accounts, especially considering how complex creating such a recommendation algorithm is in the first place."

That is an incorrect assumption. YouTube's algorithm works by looking at what a user is watching and has watched. If you’re logged in, the YouTube algorithm has an entire history of content you’ve viewed at its disposal. Why wouldn't it use that?

It’s not just video-watching habits that YouTube has access to, either. There are other complex factors at play. Every time you hit "subscribe" on a YouTube channel, it affects what the algorithm recommends you watch.

Plus, since YouTube accounts are connected to Google accounts, being logged into any of Google’s services means you’re logged into YouTube as well, so you’re pretty much always accumulating data for its algorithm.

Any user can debunk this claim on their own simply by testing whether being logged in to a YouTube account matters. Being logged into an account versus browsing anonymously makes a major difference to the algorithm, as other researchers of YouTube radicalization have pointed out.

As experts in the field will tell you, it is extremely difficult to produce reliable, quantitative studies on YouTube recommendation radicalization for these very reasons. Every account will produce a different result based on each user’s personal viewing habits. YouTube itself would have the data necessary to effectively pursue accurate results. Ledwich does not.

We may never truly know the magnitude of YouTube radicalization. But we do know that this study completely misses the mark.
