Algorithms of hate and the digital divide
If recent events are any indication, the current use of technology has driven a wedge between people like never before.
The polarization of ideas, perspectives, ideologies, politics, identities and cultures, differences that are expected and should be celebrated in diverse, dynamic societies, has produced an undercurrent of fear of the other, fuelled by media that reinforce our own beliefs and disavow those of others. The consequences are felt most by a generation that, more often than not, is fed by, and fed to, an algorithm.
Imagine you are watching television and have a wide selection of channels to choose from: sports, news, cooking, mystery, sci-fi, the usual variety. You decide to watch the golf channel for a while because you like golf. When you are done, you go to the channel guide and discover that all your channels have changed to golf channels. Weird, but you like golf.
You go to the library. It has a great selection of thousands of books from all genres. You like mystery novels and pick one off the shelf to borrow. As you look up after reading the back cover, all the books in the library have changed to mystery novels. Mysterious, indeed.
On your way home you stop at the mall and see someone with spiked hair and wearing a Ramones T-shirt. “Hey, I like your outfit,” you casually say as you pass, and then in an instant everyone in the mall is dressed like a punk-rocker.
You might think you had lost your mind if these things happened to you, but not if you were on social media. On social media, the world around you changes based on how you interact with it.
The purpose of any social media platform is to keep you on the platform for as long as possible, thereby maximizing revenue. Algorithms monitor a user’s actions, choices, preferences and other data to serve content that is personalized, engaging, and likely to keep them on the platform. This means that as the user makes choices about content, the content they see in their feed changes around them.
Take YouTube, for example. If you choose a video on a particular topic, more of the same will show up as choices for your next video. This narrowing means you are more likely to stay and watch, because you are getting videos that appeal to you. Ironically, the more you choose, the narrower the scope of your future choices becomes. The algorithm will not disappoint in steering you to ever more specific content.
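To make the narrowing effect concrete, here is a minimal, hypothetical sketch of how a preference-reinforcing recommender behaves. The topics, weighting rule and function names are invented for illustration; no platform publishes its actual algorithm, and real systems are vastly more complex.

```python
import random
from collections import Counter

# Hypothetical toy recommender: each topic's weight grows with the number of
# times the user has already clicked it, so every click makes that topic more
# likely to be suggested again. Invented for illustration only.
TOPICS = ["golf", "cooking", "news", "music", "science"]

def recommend(click_history, pool_size=5):
    """Return a pool of suggested topics, biased toward past clicks."""
    counts = Counter(click_history)
    weights = [1 + counts[topic] for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=pool_size)

history = []
for _ in range(50):
    suggestions = recommend(history)
    # The simulated user simply clicks the first suggestion offered.
    history.append(suggestions[0])

# After a few dozen clicks, the history typically skews heavily toward
# whichever topics happened to be reinforced early on.
print(Counter(history))
```

Even a rule this crude compounds early choices into a narrower and narrower feed; engagement-driven systems pursue the same goal with far more signals.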
This is usually not a big problem, unless someone with a particular agenda wants people to look at specific content.
Algorithms, by the nature of personalizing content for engagement, tend to push groups of people into similar silos of content. For those who wish to promote specific ideas, whatever the motivation, this means users can be filtered into seeing specific types of content, a process that can happen gradually and grow more extreme.
Regardless of the platform, if a particular content creator’s engaging material is also not so innocent, it may lead some users to more content of that type, ever more specific and more dangerous. Over time, ideological content may become more prominent, leading to subtle yet purposeful changes in a person’s views, often without the user being aware that the content they are consuming, pushed by the algorithm, is changing those views.
This is not necessarily purposeful by any platform, algorithm, or technology. Hate is a particularly human capability.
But organizations and groups that understand how social media works can expertly game it to promote their viewpoints and gain support for their ideology. The result is a positive feedback loop that separates people by their choices, amplified by an algorithm that only wants to give you more of what you want, to the platform’s benefit and in service of the company’s business model.
It is the human manipulation that is concerning. The algorithm is just following its platform’s instructions.
Ask a young person what the last book they read was, what they watch on television, or when they last went to a social, and the answers may surprise you.
If they are like many out there, they don’t do any of that, but they have likely been on Reddit, YouTube, X (Twitter), or any number of online platforms. There may be an entire generation (and soon everyone else) living exclusively in the world of the algorithm, changing their world around them and maybe changing themselves in the process.
It can narrow anyone’s view, leaving them unable to see the other, and it creates a digital divide that is not only generational but ideological, built on the isolation of people and the tribalization of content. It is an unfortunate and ironic result of social media use, one with real social consequences.
David Nutbean writes from his experience as a college technology instructor, a school division administrator, and a computer science teacher.