• danny161@discuss.tchncs.de · 20 days ago

    That’s unfortunately (though I’m not entirely sure) probably the fault of Germany’s approach to this. The authorities usually don’t take these websites down; instead they try to find the people behind them and arrest them. The argument is that the operators would just use a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deletion is very effective. The German approach also completely ignores the victims’ side: they have to deal with old men masturbating to footage of them being raped, still available online. Very disturbing…

    • taladar@sh.itjust.works · 20 days ago

      Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off, but it is certainly not an easy choice to make.

      • Geetnerd@lemmy.world · 19 days ago

        Well, some pedophiles have argued that AI-generated child porn should be allowed, since no real humans are harmed or exploited.

        I’m conflicted on that. Naturally, I’m disgusted and repulsed. I AM NOT ADVOCATING IT.

        But if no real child is harmed…

        I don’t want to think about it, anymore.

        • quack@lemmy.zip · 19 days ago

          I understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder material), a lot of pedophiles are just straight-up sadistic fucks, and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.

          • Elaine Cortez@lemm.ee · 19 days ago

            I feel the same way. I’ve seen the argument that it’s analogous to violence in video games, but that’s pretty disingenuous, since people typically play video games for fun and escapism, whereas someone seeking out CSAM is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.

            • Schadrach@lemmy.sdf.org · 14 days ago

              A more apt comparison would be people who go out of their way to hurt animals.

              Is it? That person is going out of their way to do actual violence. Arguing that watching a slasher movie makes someone more likely to go commit murder feels like a much closer analogy to arguing that watching a cartoon of a child engaged in sexual activity or whatever makes someone more likely to molest a real kid.

              We could make it a video game about molesting kids, with Postal or Hatred as our points of comparison, if that would help. I’m sure someone somewhere has made such a game, and I’m absolutely sure you’d say someone playing COD is doing so for “fun and escapism” while someone playing that sort of game is doing so “in bad faith”, despite both being simulations of something that is definitely illegal, and despite the core of the argument being that one makes the person want to do the illegal thing more and the other does not.