YouTube and social media may amplify extremist viewpoints.
However, the video platform's recommendation algorithms appear to be wrongly blamed for reinforcing extreme views.
Since their birth, the Internet and the World Wide Web have been accompanied by both hopes and anxieties. Early utopias celebrated unrestricted communication and nearly unrestricted access to information. The contents of the voluminous Brockhaus look meager compared to what is readily and freely available on Wikipedia. On the other hand, the online encyclopedia also illustrates the drawbacks of open access and broad participation: some entries are of doubtful quality, while others are the site of constant battles over interpretation. The balance looks considerably worse for other online information sources, such as YouTube, that lack Wikipedia's standards and correction mechanisms.
The Internet has something for everyone, but it also carries plenty of rubbish and disinformation, along with a wide range of moral and political viewpoints. Digital communication, in particular, makes it easier to reach large numbers of people with views that the mainstream rejects or ignores altogether. If algorithms preferentially serve material that matches one's own interests and primarily connect like-minded people with one another, radical and extreme views are encouraged.
Anecdotal evidence points to two mechanisms of radicalization: recommendation systems that offer increasingly extreme content based on user behavior, and links that emerge between such content and more moderate material, for instance when figures from extremist or conspiracy-theory circles appear on channels with a larger audience. Channels that criticize the mainstream, which is usually perceived as "left", can thus become gateways to political radicalism.
A recent study by American computer and communication scientists is the first to examine, using meaningful and reliable data from the video platform YouTube, whether and how such online radicalization occurs. It tracks the browsing habits of a representative sample of more than 300,000 Americans over four years, from 2016 to 2019, both on and off the popular platform. During this period, the participants watched almost 10 million YouTube videos from over two million channels, of which only about a tenth related to the news and political information that was the subject of the investigation. More than half a million videos from over 1,000 channels could be assigned to political categories.
The study shows that even this rough categorization clearly distinguishes six "archetypes" of news consumption: news is received primarily through one of the categories, meaning the audience groups are largely homogeneous. Compared with politically moderate and apolitical material, right-wing extremist content and content hostile to "woke" positions is watched relatively rarely. Over the period under consideration, however, both categories gained significantly in popularity on YouTube. Above all, they appear to exert a strong pull: those who watch many videos from these categories in a single session use the associated channels considerably more often in the future.
But how important is the platform's recommendation algorithm in this? Not very, it appears. According to the data, extremist videos are more often reached from outside the platform. Nor is there any evidence of a drift toward ever more extreme material within a session. What the algorithms are frequently accused of, people seem to prefer to do themselves: they use sources of information that confirm, or radicalize, their own point of view.
Users, for their part, bring their views and interests with them and move through the digital landscape as they would through a library: purposefully, yet open to chance finds that match their own tastes.
©Translated from FAZ