For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses to American teenagers, even as its own research validated many child safety concerns.

The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to state officials suing the company on Tuesday. The lawsuit alleges that TikTok was designed with the express intention of addicting young people to the app. The states argue the multi-billion-dollar company deceived the public about the risks.

In each of the separate lawsuits state regulators filed, dozens of internal communications, documents and research data were redacted — blacked out from public view — because authorities entered into confidentiality agreements with TikTok.

But in one of the lawsuits, filed by the Kentucky Attorney General’s Office, the redactions were faulty. This was revealed when Kentucky Public Radio copied and pasted excerpts of the redacted material, bringing to light some 30 pages of documents that had been kept secret.

[…]

TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.

In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

TikTok: Time-limit tool aimed at ‘improving public trust,’ not limiting app use

The unredacted documents show that TikTok employees were aware that too much time spent by teens on social media can be harmful to their mental health. The consensus recommendation among academics is one hour or less of social media use per day.

The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.

[…]

  • Steve@communick.news · 3 months ago

    It’s always surprising to me that people think these harms are limited to kids and teens. These same issues affect everyone of all ages. Even I’ve noticed my attention span has been affected.

    • Alice@beehaw.org · 3 months ago

      I think it’s more that kids are the ones expected to be protected by the law, whereas adults are allowed to knowingly engage in addictive behavior, like alcohol and cigarettes.

      • averyminya@beehaw.org · 3 months ago

        Or we could, you know, force social media companies to not use psychologists to make their apps more addictive by design. Something called ethics.

        It’s extremely telling that you can look for a job as a psychologist for Meta and all the opportunities that are available are UX researchers.

      • Kissaki@beehaw.org · 3 months ago

        Can you explain what you mean by free speech?

        Is the free choice of content selection algorithm free speech? Isn’t the speech, the content, there either way, and could be selected through other alternative algorithms?

        Is using deliberately engaging or addicting design free speech? Isn’t the speech, the content, there either way?

      • Steve@communick.news · 3 months ago

        You’re conflating the free speech of individuals with the engagement-driven, black-box recommendation algorithms of corporations. It’s a common mistake; I think most people make it.

        A company can allow people to post things, and for people to see them if they like, without algorithmically pushing it in endless scrolling interfaces.

        For example, Lemmy and Mastodon. You only see what you choose to subscribe to. The sites don’t choose to push any content into your feed because an algorithm thinks you’ll like it.

        There is a big difference between the two.
        And removing the algorithms isn’t a hindrance to free speech, only profits.

  • FIash Mob #5678@beehaw.org · 3 months ago

    There’s plenty of evidence that social media is harmful to kids, teens, and adults. Several are referenced in this article: https://www.sciencenews.org/article/social-media-teens-mental-health

    We also know that FB has used its platform to run social experiments on users without their knowledge and has helped the proliferation of misinformation, to the tremendous detriment of our country and our people. With gen AI creating even more false information, it’s imperative that governments crack down.

    Social Media can and should be regulated just like any other objectionable form of speech.

  • Jagothaciv@kbin.earth · 3 months ago

    So does Facebook, and a whistleblower told on them. Where’s the charges? Where’s the regulation? Old ass stupid motherfucking senators can’t even do their job because they’re cock sleeves for Mark.

    20 years ago kids were complaining about mental health and social media, and nothing has been done about it whatsoever.

  • technocrit@lemmy.dbzer0.com · 3 months ago

    The pseudo-science is too damn high.

    There’s no conclusive evidence that “social media” is bad for kids, much less TikTok specifically or only.

    • tardigrada@beehaw.orgOP · 3 months ago

      There’s no conclusive evidence that “social media” is bad for kids, much less TikTok specifically or only.

      This is blatant misinformation and inconsistent with scientific evidence.

      Even TikTok’s own investigation says there’s strong harm caused by its own platform, let alone the strong body of research on TikTok and other platforms. Just read the article.