Today, Thorn, a prominent child safety organization, in partnership with Hive, a leading cloud-based AI solutions provider, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aimed at detecting previously unreported CSAM at scale.

  • Scratch@sh.itjust.works · 1 month ago

    Not to mention the self-image impact such things would have on women with smaller breasts, who (as I understand it) generally already struggle with poor self-image due to breast size.

    • sunzu2@thebrainbin.org · 1 month ago

      Clearly the state gives zero fucks about these women, or anyone else, or even “the children.”

      The Catholic Church is still around for a reason.

      • Halosheep@lemm.ee · 1 month ago (edited)

        Typically the state only cares about things it perceives as children.