EDITORIAL: Bursting the filter bubble, algorithms vs. reality

If one possesses an account with any of the social media giants, then one has undoubtedly been a victim of unwarranted emotional and potentially behavioral manipulation by each platform’s respective algorithm. Many users do not consider that although social sites present themselves as free social networks, there are still costs associated with their use. As these platforms have developed, the price has become the perception of our reality.

Even so, self-regulation remains the best means of controlling the impact these algorithms have. To counter their potential negative effects, people need to be informed about what these algorithms are, how they affect us and how to resist their influence.

What are social media algorithms?

A social media algorithm is a proprietary computational method a platform uses to present users with the content it predicts will best suit each user's experience on the site. Typically, these algorithms are trained to maximize content engagement on the consumer end, often using artificial intelligence to curate what users see based on their previous engagement and interaction with the platform.

More simply, each algorithm is unique to its respective social media platform and mathematically computes which content will most likely keep users engaged with the application for the maximum amount of time. Social media sites generate revenue by turning to advertisers, and algorithms help by placing advertising strategically between viewed content to maximize what advertisers pay the platform.

With the litany of information users engage with on social media, algorithms are able to ‘get to know’ users and cater content based on the continuous stream of data an algorithm receives. These algorithms constantly change and evolve, especially as users change habits or find new opportunities to interact or engage with a platform.
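The engagement-driven ranking described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual method: real ranking systems are proprietary and vastly more complex, and the topic, like and share signals used here are hypothetical stand-ins for the data platforms actually collect.

```python
# Hypothetical sketch of engagement-based feed ranking. Real platform
# algorithms are proprietary and far more sophisticated than this.

def engagement_score(post, user_history):
    """Score a candidate post by how much the user has already
    engaged with similar content (here, posts on the same topic)."""
    score = 0.0
    for past in user_history:
        if past["topic"] == post["topic"]:
            # Reward topics the user previously liked or shared;
            # the weights (1.0, 2.0) are arbitrary for illustration.
            score += past["likes"] * 1.0 + past["shares"] * 2.0
    return score

def rank_feed(posts, user_history):
    """Order candidate posts so the most 'engaging' appear first."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, user_history),
                  reverse=True)

history = [{"topic": "politics", "likes": 3, "shares": 1}]
posts = [{"topic": "science"}, {"topic": "politics"}]
print(rank_feed(posts, history)[0]["topic"])  # → politics
```

Because the score depends only on past engagement, the feed surfaces more of what the user already interacts with — the dynamic the rest of this piece examines.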

So what’s the problem?

Just as users should be more conscious of their screen time, they should be especially conscious of the cost of their activity on sites and in mobile applications that are constantly collecting data.

The main problem with algorithms lies in their patterned nature. Algorithms are computational mathematical equations designed to solve a problem. The problem social media algorithms solve is maximum user engagement with a platform.

The longer users utilize apps and engage with content, the more chances there are to serve them advertisements. More advertisements served to users means more money in the platform’s pocket.

This is harmful because it means that if someone repeatedly engages with ideologically preferred content, they are likely to be shown more of that content as a means of retention, even when it is factually inaccurate or entirely fictional. Regardless of ideological affiliation, these algorithms have the potential to push alternate viewpoints and perspectives entirely out of a user’s view.
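The feedback loop at work here can be sketched as a toy simulation: each click nudges the system's model of the user toward the clicked topic, so competing topics steadily vanish from the feed. The update rule and rate below are invented for illustration, not drawn from any real recommender system.

```python
# Toy model of the engagement feedback loop. The 0.3 learning rate
# and two-topic preference model are hypothetical illustrations.

def update_preferences(prefs, engaged_topic, rate=0.3):
    """Shift the platform's topic weights toward whatever the user
    just engaged with, and away from everything else."""
    for topic in prefs:
        target = 1.0 if topic == engaged_topic else 0.0
        prefs[topic] += rate * (target - prefs[topic])
    return prefs

# Start with balanced exposure to two topics.
prefs = {"politics": 0.5, "science": 0.5}

# The user keeps clicking political content, ten times in a row.
for _ in range(10):
    prefs = update_preferences(prefs, "politics")

print(round(prefs["politics"], 2), round(prefs["science"], 2))  # → 0.99 0.01
```

After only ten interactions, the competing topic's weight has collapsed to near zero: the "echo chamber" emerges from the math itself, with no editorial intent required.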

How bad could algorithms possibly be?

When users experience the results of an algorithm attempting to maximize their engagement by catering to their existing political views or other closely held beliefs, what results is an algorithmic echo chamber. This can be especially harmful because it leads users to believe reality is in exact alignment with their existing feelings, thoughts and opinions.

These algorithmic echo chambers are increasingly harmful as users get more and more of their news content from social media. The Pew Research Center reported in 2017 that two-thirds of American adults at least occasionally read news via social media. Despite this, the majority of users (57%) expect news on social media to be inaccurate.

However, that leaves a sizable minority who do not share that expectation, which helps popularize content that may not be accurate but caters to existing ideological proclivities. As a result, such content is shared, commented on or reacted to, and then served to other users who align with it.

Over time, this can begin to alter a user’s perception of reality. As viewpoints and ideologies that conflict with their own fade out of view, the user grows accustomed to a social media environment, and a reality, in which only one ideological position is presented rather than a myriad of perspectives that could help them better understand an issue.

This phenomenon is referred to as a “filter bubble.” Coined by internet activist Eli Pariser in 2010, a filter bubble is the result of an algorithm’s best guesses at content curation based on user information, search history and location among other collected data.

More recently, the effects of the filter bubble phenomenon have been applied to users’ engagement with these algorithmic echo chambers and the fake news content that plagued social networks during the 2016 presidential election and continues to do so today.

Breaking the feedback loop

As social media platforms continue to grow, it is imperative that users take the time to critically analyze the content in their feeds, especially before engaging or interacting with it.

When content seems sensationalized, especially when one agrees with it, one should verify its authenticity before engaging or interacting with it. This helps sensationalized content slowly fade out of relevance in one’s feed: the more comments, reactions or shares misleading content receives, the further that misinformation spreads.

To break the cycle, users must actively assess all content these algorithms show them and critically analyze the information presented before sharing, commenting or reacting.

Living with algorithms

Although social media sites allow one to stay connected to friends and family, share information and discuss ideological issues, users must remain vigilant and protect their perception of reality with every like, share or comment.

With an increase in awareness of these algorithms and their nature, users can expect to see advertisers and purveyors of misinformation alter their content in order to adapt to the constantly evolving nature of social media in the information age.

Ultimately, users are responsible for themselves and should constantly question the information presented to them in order to ensure they shape their reality before an algorithm shapes it for them.
