Are we living in a world of manipulation?

Mighty Mo
3 min read · Sep 17, 2020

I was always aware that clicking a video in the “watch next” section of YouTube was feeding an algorithm that would then serve me similar videos to watch. What I didn’t know was that clicking on those recommended videos was more than just YouTube recognizing my interest: each click could put me into a matrix that I may never get out of.

I used to enjoy sitting down and watching video after video of clips from previous Olympics or recaps from the latest Dancing with the Stars episode. Because of that, I am part of an algorithm that keeps taking my information and serving me more videos it predicts I will like. That sounds like good marketing, but is it actually bad? I never had a problem with the idea that the next video to play would be similar to the one before. But is this just a small way that the media is manipulating me and my decisions without me even knowing it?
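To make that loop concrete, here is a toy sketch of how such a feedback loop might work. This is my own illustration, not YouTube’s actual system: the video list, the topic tags, and the simple interest counter are all made up, and a real recommender is vastly more complex.

```python
# Toy sketch of a recommendation feedback loop (illustration only,
# not YouTube's real algorithm; all data here is invented).
from collections import Counter

videos = [
    {"title": "Olympics highlights", "topic": "olympics"},
    {"title": "Dancing with the Stars recap", "topic": "dancing"},
    {"title": "Cooking basics", "topic": "cooking"},
    {"title": "More Olympics clips", "topic": "olympics"},
]

interest = Counter()  # a "profile" built purely from my clicks
watched = set()       # titles I have already seen

def click(video):
    # Every click feeds the loop by boosting that video's topic.
    watched.add(video["title"])
    interest[video["topic"]] += 1

def recommend():
    # Suggest the unwatched video whose topic I have clicked the most.
    unwatched = [v for v in videos if v["title"] not in watched]
    return max(unwatched, key=lambda v: interest[v["topic"]])

click(videos[0])             # I watch one Olympics clip...
print(recommend()["title"])  # ...and the top pick is "More Olympics clips"
```

Even in this tiny version, a single click is enough to tilt every future recommendation toward the same topic.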

“Algorithms and YouTube” reveals that the YouTube algorithm doesn’t want me to watch just one or two videos; it wants me to watch more! After all, the more I watch, the more advertisements I have to absorb. Is this another way for them to persuade me? It may be just a simple advertisement for paper towels, but that advertisement could change my mind about my current paper towel use and eventually lead me to change my paper towel brand for good. Or the ad could be about something much more serious, like the upcoming election.

In her TED Talk, “We’re building a dystopia just to make people click on ads,” Zeynep Tufekci says that experiments show that what the algorithm selects for you to watch can affect your emotions and political behavior. Facebook ran a study on election day that showed two different “vote today” messages: one that listed your Facebook friends who had clicked “I voted” at the bottom, and one without that feature. The version that included your friends created over 300,000 additional votes in the election. These algorithms can infer not only political views but religious affiliation, food preferences, and more; even if you never specifically disclose that information online, they have the power to figure it out. So, how do we stop this from happening?

Currently #4 in Netflix’s “Top 10 in the U.S. Today” category is The Social Dilemma. In this film, many tech experts share their concerns and fears about the danger and impact that social media can have on us. The documentary made me question whether having social media is worth it. Have these social media apps truly persuaded us, so that we believe everything we read and see to be true even though we don’t really know what the truth is? Have we fallen into a rabbit hole that we will not be able to get out of? What is a solution for keeping our information safe? At the end of the documentary they state, “we built these things, we have a responsibility to change it.” Does that mean those who made these apps in the first place need to go back and create a new algorithm that wouldn’t have as much control over what we believe? Can they possibly design it more humanely? Is that possible, or will we still have the problem of humans being persuaded too easily? If algorithms got us into this rabbit hole, would creating a new one get us out?
