It is important to remember that social media platforms such as YouTube, Facebook, and Instagram all allow for the creation and exchange of content. Because of this, it is nearly impossible for our digital interactions not to intertwine with democracy, since we are constantly being fed filtered, often biased information. While this happens all the time, we experience it most when it is time to vote, to demonstrate, to campaign, to support, and of course, to complain (Margetts 2018). You can learn more about just how far this reaches in Democracy and Social Media. Even the tiniest acts of engagement online can prompt the algorithm to factor in information that originally would not have been considered at all. An example from my daily life is one you hear people tell all the time. There have been many instances where I would watch videos on my boyfriend's phone, on his YouTube account, that reflected his interests. Most of those videos were not ones I would typically search for on my own. But after one search on my own phone, those same kinds of videos were all that appeared on my feed. Stranger still, when I logged on to my own YouTube account, similar if not identical videos to the ones we had watched on his account were now displayed on my feed, even though I had never personally watched any of them on my phone. All of this leads me to wonder: to what extent do these companies go to filter our information? Are these algorithms causing me to miss out on other information?
WHAT ARE ALGORITHMS?
So before we can question the effect that algorithms have on our social and digital engagement, we must first understand what algorithms are. Algorithms are designed to find solutions that make things speedier, easier, and overall better. Companies such as Facebook and Google use algorithms to learn what their users like, what they may be interested in seeing, and any other information they can use to make the user experience easier and more effective. Algorithms personalize information for each user based on the data that tech companies collect. These companies take the information gathered from users' interactions and use it to filter and push out content that they believe their users will relate to or enjoy. While some may appreciate this, others feel that filtering users' feeds can be problematic. Algorithms play a big role in our everyday social media use, yet they are rarely something we have to think about while using these platforms. Platforms that rely heavily on algorithms include Facebook, Twitter, Instagram, and even YouTube, which is owned by Google.

YouTube & Google's Use of Algorithms

Creators on YouTube have to become well-versed in the algorithm formula YouTube uses if they want their content to reach people and generate success. Having recently started a YouTube channel, I have had first-hand experience learning how to navigate the algorithm on this site. It has become very important for content creators to learn the algorithm, and even how to game it, to ensure that their content reaches not only their existing viewers and subscribers but also new viewers who bring more traffic to their channel. What most people don't realize is just how complicated YouTube's algorithm can be, and how many factors are calculated to produce a feed that fits what each individual user would like to see.
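To make the idea of personalization concrete, here is a toy sketch of interest-based filtering. This is not YouTube's or Facebook's actual algorithm (which is proprietary and far more complex); the videos, tags, and scoring rule are all made up for illustration. It simply scores each candidate video by how often its topic tags appear in the user's watch history, then ranks the feed by that score.

```python
# Toy illustration of interest-based feed ranking (NOT any platform's real
# algorithm). Each video is scored by how often its topic tags show up in
# the user's watch history; higher scores float to the top of the feed.

from collections import Counter

def rank_feed(watch_history, candidates):
    """watch_history: list of topic tags from past views.
    candidates: dict mapping video title -> set of topic tags."""
    interest = Counter(watch_history)  # how often each topic was watched
    def score(tags):
        return sum(interest[t] for t in tags)
    return sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)

# Hypothetical user who mostly watches gaming videos:
history = ["gaming", "gaming", "cooking"]
videos = {
    "Speedrun highlights": {"gaming"},
    "Knife skills 101": {"cooking"},
    "City council meeting": {"politics"},
}
feed = rank_feed(history, videos)
# The gaming video outranks the rest; the politics video sinks to the bottom.
```

Even in this tiny sketch you can see the filter-bubble mechanism at work: the politics video gets a score of zero and lands last, simply because the user has never watched that topic before.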
Google uses the PageRank algorithm (Mayer 2005). PageRank is part of a ranking system that looks at over 100 different features to define and determine the relevancy of content. YouTube itself has a feature that lets you view the analytics of your channel and content, showing just how many factors go into getting your content out and to the top of the algorithm. If you take a look at YouTube Creator Academy, there are many videos that teach you what you can do to stay a part of the algorithm and keep your content relevant to your viewers.
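The core PageRank idea itself is small enough to sketch: a page is important if important pages link to it. Below is a minimal power-iteration version over a hypothetical three-page link graph; the page names are invented, and Google's production ranking layers many other signals on top of this.

```python
# Minimal sketch of the PageRank idea (power iteration) over a tiny,
# hypothetical link graph. Google's real ranking combines PageRank with
# many other signals; this shows only the link-based core.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # split this page's rank among its links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
# "home" is linked from every other page, so it earns the highest score.
```

The same logic transfers to recommendations: content that lots of well-ranked content points to keeps surfacing, which is part of why a channel that "enters the algorithm" tends to stay there.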
As you can see from my recommended feed on YouTube, these are clearly all videos I have expressed interest in viewing.
How Can This Be A Problem?
According to Sunstein (2017, p. 5), these algorithms can leave people isolated in their social media use. This type of isolation can lead to extremism and polarization. Not only can we become extreme by receiving only one kind of information, but the information we do receive is also more likely to be false. When using sites such as YouTube or Google, most users want to immediately find content that interests them. However, regardless of the topic of interest, different perspectives should still be presented. This quote from Sunstein's discussion of "The Daily Me" explains why this matters: "First, people should be exposed to materials that they would not have chosen in advance. Unplanned, unanticipated encounters are central to democracy itself." In short, people should receive not only information that interests them but also, to keep a healthy balance, different perspectives. An article at https://www.news.com.au presents an example of how algorithms can affect both the way we find information and its validity, through the story of a high school student researching a topic for a paper. Just by searching "The Federal Reserve" on YouTube, the student was sent down a rabbit hole of content that consisted mainly of conspiracy theories and false information about some very important and influential topics.
All of this makes us wonder whether a political agenda also plays a role in the algorithms of these big social companies. For the purposes of his project, a steady flow of consistent information might actually have been helpful. As highlighted in "In Praise of Echo Chambers," "Solidarity is important for political action" (Parker, 2017). Some would have us believe that our new technological resources will allow us to "embrace the platinum standard of the internet age" (Scalambrino, 2016). Many people agree with this line of thinking about how filter bubbles, algorithms, and echo chambers can be beneficial for the new age. The question we need to ask ourselves when considering our views on algorithms is a very familiar one: is it all worth it? To tailor that question to algorithms specifically, I will phrase it differently: are internet users driven by a disorganized, self-centered curiosity, or by a curiosity that reflects a genuine desire for knowledge and learning? Asking this helps us understand whether algorithms are tailoring our information so tightly that we no longer feel the curiosity to search out information beyond what we would normally see.
What Should Be Done About It?
With the advancement of our technology today, there is not much consensus on what should be done about the use of algorithms. As humans, we seek convenience, and algorithms give us just that. But even as we enjoy this convenience, we should question to what extent we allow companies to use algorithms to control what we see and engage with. One option is regulation requiring algorithms to surface not only personalized information on users' feeds but also randomized information. That way, people are not subjected only to biased posts and information, or in the case of YouTube, videos. In an article about filter bubbles, the author writes that we as internet users should "burst our bubbles" (Spohr 2017). This approach would encourage us to search for our own content instead of just scrolling and taking whatever information is fed to us.
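The "randomized information" idea above can be sketched in a few lines. This is purely an illustrative policy sketch, not any platform's real feature: the `random_share` ratio, the item names, and the blending rule are all assumptions. It keeps the personalized feed intact but mixes in a few items drawn from outside the user's usual interests.

```python
# Sketch of the "burst the bubble" idea: blend a personalized feed with a
# few randomly chosen items from outside the user's usual interests. The
# random_share ratio is a made-up policy knob, not a real platform setting.

import random

def mixed_feed(personalized, full_catalog, random_share=0.25, seed=None):
    """Return the personalized items plus a random sample of outside items."""
    rng = random.Random(seed)          # seed only for reproducible demos
    outside = [item for item in full_catalog if item not in personalized]
    n_random = max(1, int(len(personalized) * random_share)) if outside else 0
    feed = list(personalized) + rng.sample(outside, min(n_random, len(outside)))
    rng.shuffle(feed)                  # so the outside items aren't segregated
    return feed

catalog = ["a", "b", "c", "d", "x", "y"]           # hypothetical item IDs
feed = mixed_feed(["a", "b", "c", "d"], catalog, seed=0)
# The feed keeps every personalized item and adds one item from outside it.
```

The design point is that users still get the convenience they want, while the guaranteed random slice creates exactly the "unplanned, unanticipated encounters" Sunstein argues democracy depends on.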
Parker, E. (2017, May 22). In praise of echo chambers. The Washington Post. Retrieved from https://www.washingtonpost.com/news/democracy-post/wp/2017/05/22/in-praise-of-echo-chambers/?utm_term=.4657094cd971
Scalambrino, F. (Ed.). (2016). Social epistemology and technology: Toward public self-awareness regarding technological mediation (Collective studies in knowledge and society). London: Rowman & Littlefield International.
Spohr, D. (2017). Fake news and ideological polarization: Filter bubbles and selective exposure on social media. Business Information Review, 34(3), 150-160. doi:10.1177/0266382117722446
Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton, NJ: Princeton University Press.