• jsdz@lemmy.ml
    8 months ago

    vastly expands the pool of potential victims

    I’m not brave enough at the moment to say it isn’t some kind of crime, but creating such images (as opposed to spamming them everywhere, using them for blackmail, or whatever) doesn’t seem to be a crime that involves any victims.

    • FaceDeer@kbin.social
      8 months ago

      I’m brave enough to say what I am sure some people are thinking.

      If a pedophile can have access to a machine that generates endless child porn for them, completely cutting off the market for the “real thing”, then maybe that’s a step in a positive direction. Very far from perfect but better than the status quo.

      The ideal ultimate solution is to develop a treatment that pedophiles can use to just stop being pedophiles entirely. I bet most pedophiles would jump on such a thing. But until that magical day maybe let’s explore options that reduce the harm done to actually real children in the immediate term.

      • Scrubbles@poptalk.scrubbles.tech
        8 months ago

Some psychologists agree with you. Others say it would only make the problem worse, arguing it would make them want to escalate. That's definitely one I'm leaving to the professionals to debate, and I'll go with their conclusion.

    • SmoochyPit@beehaw.org
      8 months ago

      My bigger concern is the normalization of and exposure to those ideas and concepts (sexualization of children). That’s also why I dislike loli/shota media, despite it being fictional.

      That said, I still think it’s a much better alternative to CSAM and especially to actually harming a child for those who have those desires due to trauma or mental illness. Though I’m not sure if easy, open access is entirely safe, either.

      • ono@lemmy.ca
        8 months ago

        My bigger concern is the normalization of and exposure to those ideas and concepts

        The same concern has been behind attempts to restrict/ban violent video games, and films before that, and books before that. Despite generations of trying, I don’t think a causal link has ever been established.

  • Zagaroth@beehaw.org
    8 months ago

    Normally I err on the side of ‘art’ being separated from actual pictures/recordings of abuse. It falls under the “I don’t like what you have to say, but I will defend your right to say it” idea.

    Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

    • interolivary@beehaw.org
      8 months ago

      Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

      While I absolutely don’t want to sound like I’m defending the practice (because I’m not), I’m really not too sure of this. If this was true, would similar logic apply to other AI-generated depictions of illegal or morally reprehensible situations? Do photorealistic depictions of murder make it more likely that the people going out of their way to generate or find those pictures will murder someone or seek out pictures of real murder? Will depictions of rape lead to actual rape? If the answer to those or other similar questions is “no”, then why is child porn different? If “yes”, then should we declare all the other ones illegal as well?

It’s not that I think AI-generated child porn should be accepted, let alone encouraged, but as was pointed out, it might actually be counterproductive to ruin someone’s life over AI-generated material in which there is factually no victim, as reprehensible as the material may be; just because something is disgusting to most of us doesn’t mean that’s a good justification for making it illegal when there is no victim.

      The reason why I’m not convinced of the argument is that a similar one has been used when eg. arguing for censorship of video games, with the claim that playing “murder simulators” which can look relatively realistic will make people (usually children) more likely to commit violent acts, and according to research that isn’t the case.

I’d even be inclined to argue that being able to generate AI images of sexualized minors might make it less likely for a person to move on to, e.g., searching for actual child porn or committing abuse, as it’s a relatively easier and safer way for them to satisfy an urge. I wouldn’t be willing to bet on that, though.

  • artaxadepressedhorse@lemmyngs.social
    8 months ago

I am sort of curious, because I don’t know: of all the types of sexual abuse that happen to children (being molested by family or acquaintances, being kidnapped by the creep in the van, being trafficked for prostitution, abuse in church, etc.), how many cases deal exclusively with producing imagery?

    Next thing I’m curious about: if the internet becomes flooded with AI generated CP images, could that potentially reduce the demand for RL imagery? Wouldn’t the demand-side be met? Is the concern normalization and inducing demand? Do we know there’s any significant correlation between more people looking and more people actually abusing kids?

    Which leads to the next part: I play violent video games and listen to violent aggressive music and have for many years now and I enjoy it a lot, and I’ve never done violence to anybody before, nor would I want to. Is persecuting someone for imagining/mentally roleplaying something that’s cruel actually a form of social abuse in itself?

Props to anybody who asks hard questions, btw, because guaranteed there will be a lot of bullying on this topic. I’m not saying “I’m right and they’re wrong,” but there’s a lot of nuance here, and people seem pretty quick to hand the government and police incredible powers for… I dunno… how much gain, really? You’ll never get back rights that you throw away. Never. They don’t make ’em anymore these days.

    • Zagaroth@beehaw.org
      8 months ago

The issue here is that it enables those who make actual CP to hide their work more easily in the flood of generated content.

Animesque art is one thing; photorealistic art is another. Neither actually harms an underage person by existing, but photorealistic images enable actual abusers to hide themselves easily. So IMO, photorealistic “art” of this sort needs to be criminalized so that it can’t be used as a mask for actual CP.

  • Draedron@lemmy.dbzer0.com
    8 months ago

Isn’t it better that they are AI generated than real? Pedophiles exist and won’t go away, and no one can control that. So better they watch AI images than real ones, or worse.

        • barsoap@lemm.ee
          8 months ago

          Images, yes, but mixing concepts is a mixed bag. Just because the model can draw, say, human faces and dog faces doesn’t mean it has the understanding necessary to blend those concepts. Without employing specialised models (and yes of course the furries have been busy) the best you’ll get is facepaint. The pope at a beach bar doesn’t even come close to exercising that kind of capability: The pope is still the pope and the beach bar is still the beach bar, and a person is still sitting there slurping a caipirinha.

    • SmoochyPit@beehaw.org
      8 months ago

I can’t believe how hard it is to avoid drawn or generated CP on there, and you can only ignore one tag without premium, so it’s not viable to manually make a blocklist :(