Twitter says its algorithms favor tweets with right-leaning political content

Twitter has released the findings of research that examined its algorithms for potential political bias. According to the company, users with an algorithmically sorted home timeline see a mix of tweets from accounts they follow and recommended tweets based on their Twitter activity. This preliminary study evaluated the political content displayed in users’ timelines when they are organized by an algorithm rather than chronologically.

Twitter announced earlier this year that it would investigate the fairness of its algorithms and how they may be unwittingly contributing to harm. This new study is part of that effort; it focuses on tweets from elected officials in seven countries, as well as recommended content from news outlets surfaced by its algorithms.


The company examined whether its algorithms amplify certain political groups more than others and, if so, whether that pattern holds across multiple countries. It also looked at whether its algorithms favor some political news outlets over others. The study covered millions of tweets posted between April 1 and August 15, 2020.

Twitter has revealed some of the findings from this investigation. For example, compared with a chronological timeline, the algorithmically sorted timeline amplifies tweets about political content regardless of party affiliation. In addition, in six of the seven countries studied, tweets from the political right were amplified more than tweets from the political left.


Furthermore, Twitter discovered that its algorithm substantially amplifies content from right-leaning news outlets. Twitter does not appear to know why this is the case, stating in a blog post about the study that “additional root cause analysis is required to establish what, if any, modifications are required to lessen detrimental effects by our Home timeline algorithm.”

Rumman Chowdhury, Twitter’s Director of Software Engineering, explained:

In this study, we identify what is happening: certain political content on the platform is amplified. Because these observable patterns result from interactions between people and the platform, determining why they exist is a far more challenging question to answer. As researchers and practitioners embedded within a social media firm, the objective of the ML Ethics, Transparency, and Accountability (META) team is to identify and mitigate any unfairness that may develop.

Source: Twitter