It’s full of misinformation and propaganda unlike… You know… All those super reliable objective sources of information that you use?
Oh yes, VERY unlike those. Anything that can be traced and verified, and that isn’t read to you by an AI voice or by a white person claiming to be an American while trying very hard to suppress an Eastern European or East Asian accent. Another good trait to have would be anything that isn’t verifiably false.
You know the algorithm shows you what you interact with the most… Right?
Among other things.
If it only showed you what you interacted with the most, it’d be less of an issue, but that’s not how it works. That’s not even how it works on YouTube.
Well that, and whatever will keep you addicted and hopefully spending money. Rage, bias confirmation, propaganda that hits the class or group you belong to. And the more you trust it, the more they can use that trust.
Thanks for proving why that platform is just so damned dangerous. The ignorance it inspires is shocking.
It sure does, but it doesn’t only show you what you interact with the most. It shows you lots of other stuff too, and neither you nor I are privy to the exact algorithm, so don’t get too cocky thinking you have it all figured out. After all, “interacting with” can be something as small as lingering on a video just a bit too long: one second longer than your usual average view time. That’s all it takes for an algorithm to decide it’s worth pushing more content like it at you. And given that it’s a priority goal for propaganda, bots, and misinformation posters to craft their videos in a way that maximizes your engagement, that’s a trivial thing to accomplish.
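For the curious, here’s a minimal toy sketch of that dwell-time heuristic. This is purely my own illustration, not any platform’s actual code; the topic names, boost value, and smoothing factor are all made up. It just shows how a single second over your average watch time can be enough to tilt a ranked feed toward similar content.

```python
# Toy sketch (my own illustration, NOT any platform's real code) of the
# dwell-time heuristic described above: lingering on a video even slightly
# longer than your usual average boosts similar content in your feed.

from collections import defaultdict

def update_interests(interest_scores, avg_watch_seconds, video_topic, watch_seconds):
    """Boost a topic whenever watch time exceeds the user's running average."""
    if watch_seconds > avg_watch_seconds:       # even one second over counts
        interest_scores[video_topic] += 1.0     # hypothetical fixed boost
    # fold this view into the running average (simple exponential smoothing)
    avg_watch_seconds = 0.9 * avg_watch_seconds + 0.1 * watch_seconds
    return interest_scores, avg_watch_seconds

def rank_feed(candidate_videos, interest_scores):
    """Order candidate videos by how boosted their topic is."""
    return sorted(candidate_videos,
                  key=lambda v: interest_scores[v["topic"]],
                  reverse=True)

# Example: the user normally averages 30s per video, then lingers 31s on
# a rage-bait clip. That single extra second is enough to rank more of it first.
interests = defaultdict(float)
avg = 30.0
interests, avg = update_interests(interests, avg, "rage_bait", 31.0)
feed = rank_feed([{"id": 1, "topic": "cats"},
                  {"id": 2, "topic": "rage_bait"}], interests)
print([v["id"] for v in feed])   # the rage-bait video now ranks first
```

Real systems are far more opaque and use many more signals, which is exactly the point being made here.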
Algorithms are, by design, a way to remove your agency in finding information for yourself and instead give the platform control over the information you see. This is very handy and even innocent when you just want to see memes that you personally think are funny, but very dangerous when it’s used to mislead you or influence your behavior and thinking. And most people aren’t smart or tech-savvy enough to know how any of this works, which makes them very easy to manipulate.