Paper Title
"How over is it?" Understanding the Incel Community on YouTube
Paper Authors
Abstract
YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform has also come under fire for hosting inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on this community's evolution over the last decade and understanding whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared on Incel communities within Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is gaining traction and that, during the last decade, the number of Incel-related videos and comments rose substantially. We also find that users have a 6.3% chance of being suggested an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.
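The five-hop measurement described in the abstract can be illustrated with a random-walk simulation over a recommendation graph. The graph, labels, and function below are hypothetical (not from the paper); the sketch only shows the general idea of estimating the probability that a walk of a fixed number of recommendation clicks reaches a labeled video.

```python
import random

# Hypothetical toy recommendation graph: each video maps to the videos
# YouTube would recommend next. Labels are illustrative only.
GRAPH = {
    "v0": ["v1", "v2"],
    "v1": ["v2", "v3"],
    "v2": ["v0", "v4"],
    "v3": ["v4"],
    "v4": ["v0"],
}
INCEL_RELATED = {"v4"}  # assumed label set for this sketch

def hit_probability(start, hops, trials, seed=0):
    """Estimate the chance that a random walk of `hops` recommendation
    clicks, starting from `start`, encounters a labeled video."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        node = start
        for _ in range(hops):
            node = rng.choice(GRAPH[node])
            if node in INCEL_RELATED:
                hits += 1
                break  # count the walk once and stop early
    return hits / trials

# Fraction of 5-hop walks from a non-labeled start that hit a labeled video.
p = hit_probability("v0", hops=5, trials=10_000)
```

Averaging such walk outcomes over many starting videos yields a single per-start hit rate, analogous in spirit to the 6.3% figure reported above.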