• 0 Posts
  • 31 Comments
Joined 10 months ago
Cake day: January 18th, 2024

  • For the Meta apologists, I have a reality check for you:

    Threads was immediately flooded with radicalizing, extremist content, and there have already been instances of users being doxxed on Threads as a result of Meta’s information-harvesting practices. [1]

    Threads was marketed as being open to ‘free speech’ (read: hate speech and misinformation), and Meta encouraged the Far-Right movement to join; those users have already spread extremism, hate, and harassment on Threads. [2] Threads has also been a hotbed of Israel-Palestine misinformation and propaganda. [3] Meta fired fact-checkers just prior to Threads’ launch. [1]

    As already established, Meta also assisted in genocide! [4]

    Meta/FB/Instagram also have a well-documented history of facilitating the spread of misinformation and extremism, which contributed to the January 6th insurrection attempt. [5], [6]

    This really should be obvious by now… but Meta mines and sells their users’ information. [7] Just look at the permissions you have to grant them for Threads…

    FB users have to agree to all sorts of unethical things in the TOS, including giving Meta permission to run experiments on them without informed consent. [8] In their first published experiment, they manipulated users’ feeds with positive or negative content to see whether it affected their mood. It did, and they successfully induced depression in many of their users!

    I will now turn to an article that summarizes well the core practices of Meta as a company:

    • Elevates disinformation campaigns and conspiracy theories from the extremist fringes into the mainstream, fostering, among other effects, the resurgent anti-vaccination movement, broad-based questioning of basic public health measures in response to COVID-19, and the proliferation of the Big Lie of 2020—that the presidential election was stolen through voter fraud [16];

    • Empowers bullies of every size, from cyber-bullying in schools, to dictators who use the platform to spread disinformation, censor their critics, perpetuate violence, and instigate genocide;

    • Defrauds both advertisers and newsrooms, systematically and globally, with falsified video engagement and user activity statistics;

    • Reflects an apparent political agenda espoused by a small core of corporate leaders, who actively impede or overrule the adoption of good governance;

    • Brandishes its monopolistic power to preserve a social media landscape absent meaningful regulatory oversight, privacy protections, safety measures, or corporate citizenship; and

    • Disrupts intellectual and civil discourse, at scale and by design. [9]

  • The modern Republicans are domestic terrorists by their own admission. They deserve to be named and shamed for what they are.

    Both sides are not alike… The following is an excerpt from my blog post (no ads, no benefit to me).

    In a study evaluating Left-Wing and Right-Wing domestic extremism between 1994 and 2020, there was one fatality resulting from Left-Wing extremism, versus 329 fatalities resulting from Far-Right extremism over that period. [5]

    The Far-Right movement is the oldest and deadliest form of domestic terrorism in the United States, and the Anti-Defamation League’s Center on Extremism found that the Far-Right is responsible for 98% of extremist murders in the U.S. [24] Furthermore, in nearly every year since 2011, Far-Right terrorist attacks/plots have accounted for over half of all terror attacks/plots in the United States. [21]

    In the U.S., Right-Wing extremism was responsible for two-thirds of all failed, foiled, or successful terror attacks in 2019, and for 90% of attacks in the first half of 2020 alone. [21] Since 2013, Far-Right extremism has been responsible for more terror attacks/plots than Left-Wing, ethnonationalist, or religiously motivated extremism. [21]

    For the references, see my blog post.



  • GONADS125@feddit.de to Fediverse@lemmy.world · 0.19.3 is now the most installed version · edited 9 months ago

    It’s different on other platforms (like Mastodon), but on Lemmy, blocking an instance only hides posts from the blocked instance. Users from Threads would still be interacting in comments with the user who blocked their instance.

    Regardless, I believe they should be defederated by instance admins on ethical grounds. Meta/FB have run unethical experiments on their users without informed consent, including purposefully inducing depression in them.

    The fact that Meta has assisted in genocide should be grounds for defederation by instances which claim to protect and care about their users.

    Meta’s platforms have also played a key role in radicalizing users, and they purposefully marketed Threads to Far-Right extremists.

    Here’s my argument with citations

    There are also good arguments to defederate and block them from the fediverse based on EEE (Embrace, Extend, Extinguish).

    If an instance’s admins claim they care about protecting their users and providing a safe, healthy community but are federated with Threads, then they are either uninformed or liars.