cross-posted from: https://jamie.moe/post/113630

There have been users spamming CSAM content in !lemmyshitpost@lemmy.world causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.

I deleted every image from the past 24 hours personally, using the following command:

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
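One caveat worth knowing: by default, shred overwrites a file's contents but leaves the file in place under its original name; the -u flag makes it unlink the file afterwards. A sketch of the same approach with a dry run first (the pictrs path is an example from the command above; adjust it to your own install):

```shell
# Example pictrs data path; adjust to your deployment.
PICTRS=/srv/lemmy/example.com/volumes/pictrs/files

# 1. Dry run: list files created in the last 24 hours before destroying anything.
sudo find "$PICTRS" -type f -ctime -1 -print

# 2. Overwrite each match and then unlink it (-u); without -u, shred leaves
#    the scrambled file on disk under its original name.
sudo find "$PICTRS" -type f -ctime -1 -exec shred -u {} \;
```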

Note: Your local jurisdiction may impose a duty to report or other obligations. Check the rules that apply to you, but always prioritize ensuring that the content is no longer being served.

Update

Apparently the Lemmy Shitpost community has been shut down for now.

  • The Picard Maneuver

    So, from memory there has been:

    • This recent attack
    • Regular DDOS attacks
    • Frequent attempts to spam community creation
    • That one time the instance got hacked and set to redirect to shock sites

    Am I missing anything?

    This seems like more than just a few trolls. Maybe someone really doesn’t want to see user-owned social media take off.

    • Scrubbles

I see where you’re going with this, but no, people really are just absolutely horrible. Other social media platforms are simply already well set up to manage this, so we never see it. Lemmy wants to be open; this is the flip side of that openness.

      • @kromem@lemmy.world

        It’s generally easy to crap on what’s ‘bad’ about big players, while underestimating or undervaluing what they are doing right for product market fit.

A company like Meta puts hundreds of people in foreign nations through PTSD-inducing hell in order to moderate and keep their own networks clean.

        While I hope that’s not the solution that a community driven effort ends up with, it shows the breadth of the problems that can crop up with the product as it grows.

        I think the community will overcome these issues and grow beyond it, but jerks trying to ruin things for everyone will always exist, and will always need to be protected against.

To say nothing of the far worse sorts behind the production and more typical distribution of such material, whom Lemmy will also likely need to deal with more and more as the platform grows.

It’s going to take time, and I wouldn’t be surprised if the only way a federated social network can eventually exist is within onion routing or something similar: at a certain point, the gap in resources available to fight content litigation between a Meta and someone hosting a Lemmy server is impossible to close, and the privacy of hosts may need to be front and center.

        • @Zeth0s@lemmy.world

The solution in this case is absolutely AI filters. Unfortunately, you won’t find many people willing to build a robust model for that, because they’d be the ones getting the PTSD you mention.
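One way to filter without anyone having to view the material is hash matching against blocklists of digests of known-abuse images, which is roughly how industry tools like PhotoDNA work (those use perceptual hashes that survive re-encoding; the exact-match version below is only a minimal sketch, and all names in it are hypothetical):

```python
import hashlib
from pathlib import Path


def load_blocklist(path: str) -> set[str]:
    """Load a blocklist file with one lowercase hex SHA-256 digest per line."""
    lines = Path(path).read_text().splitlines()
    return {line.strip().lower() for line in lines if line.strip()}


def is_blocked(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Exact-digest check; real deployments use perceptual hashes instead,
    so that cropping or re-encoding an image does not evade the filter."""
    return hashlib.sha256(image_bytes).hexdigest() in blocklist


def scan_upload(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Return True if the upload should be rejected before pictrs stores it."""
    return is_blocked(image_bytes, blocklist)
```

The point of the design is that moderators only ever handle opaque digests, not the images themselves.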

          • @Haui@discuss.tchncs.de

IIRC, PTSD is something only certain people are prone to. We should probably focus on finding people who genuinely have no problem watching rough content. I have PTSD, so I’m probably not the right person for the job.

            • @Zeth0s@lemmy.world

I don’t want to try. I have a pretty low threshold. I set up the NSFW filter on Lemmy because I found the furry content that was common some time ago disturbing… I don’t even want to try anything worse than that.

      • HTTP_404_NotFound

        Yup.

I went a step further and commented out the pictrs-related configuration in lemmy.hjson too.
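For reference, the relevant section of lemmy.hjson looks roughly like the following. Key names and defaults vary between Lemmy versions, so treat this as a sketch and check your own file rather than copying it verbatim:

```hjson
{
  # ...database, federation, and other settings unchanged...

  # Commenting out the whole pictrs block stops Lemmy from talking to the
  # image server at all, instead of leaving it half-configured.
  # pictrs: {
  #   url: "http://pictrs:8080/"
  #   api_key: "CHANGE_ME"
  # }
}
```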

    • PastThePixels

      Yeah… Just wow. I disabled pictrs and deleted all its images, which also means all my community images/uploaded images are gone, and it’s more of a hassle to see other people’s images, but in the end I think it’s worth it.

Because it caches every image, pictrs was also taking up a massive amount of space on my Pi, which I also use for Nextcloud. So that’s another plus!

      • HTTP_404_NotFound

Note: apparently Lemmy will get pretty pissy if pictrs isn’t working… and the “primary” Lemmy GUI will straight-up stop working.

        Although, https://old.lemmyonline.com/ will still work.

And I am with you. My pictrs storage has ended up taking up quite a bit of room.

      • @rar@discuss.online

There has to be a more elegant way of dealing with this in the future, like decoupling Lemmy account hosting (which effectively means an ActivityPub/fediverse account) from Lemmy community hosting.

      • HTTP_404_NotFound

        Yup.

So far, mostly everything still appears to work. But trying to upload an image just throws an error:

        SyntaxError: Unexpected token ‘R’, “Request er”… is not valid JSON

        I don’t see a way to actually “gracefully” disable it, but, this works.

        Edit- don’t just stop pictrs.

Lemmy gets very pissy… and breaks.

  • @idle@158436977.xyz

I went ahead and just deleted my entire pictrs cache, and I will definitely disable caching of other servers’ images when that becomes available.

    • @jeffw@lemmy.world

It impacts everyone when this shit happens. It takes time for mods/admins to take it down. And you can’t unsee it.

      I hope nobody else has the misfortune of stumbling on that shit

      • Bleeping Lobster

There have been studies finding that playing Tetris for an hour or two after seeing something traumatic can prevent it from taking root in long-term memory.

I tried it once after accidentally clicking a link on Reddit that turned out to be gore. I can’t remember exactly what it was now (about 9 months later), so it must have worked.

      • @thrawn@lemmy.world

Yeah, you really can’t. I’m pretty desensitized from the earlier internet, with death and other shock-gore content, but I had managed to avoid CSAM until today. It was a lot worse than I expected; I felt my heart drop. Worse, my app autoplays GIFs in thumbnails, so it kept going while I was reporting it.

I’ve mostly forgotten, and it wasn’t on my mind until I saw this thread (it happened less than 24 hours ago), but even the slightest reminder is oddly upsetting. Wish I’d thought of the Tetris thing.

  • ugjka

Blocked lemmyshitpost some time ago because it is trash anyway.

    • JamieOP

      At this point, the community is clean. So unless more is posted, then you should be good. If someone searched for the community and caused a preview to load while the content was active though, then it could be an issue.

  • @Oneobi@lemmy.world

    Likely scum moves from reddit patriots to destroy or weaken the fediverse.

I remember when Murdoch hired that Israeli tech company in Haifa to find weaknesses in TV smart cards and then leaked them to destroy their market by flooding it with counterfeit smart cards.

    They are getting desperate along with those DDOS attacks.

    • OrbitJunkie

Could be, but more likely it’s just the result of self-hosted services: individuals exposing their own small servers to the wilderness of the internet.

These trolls also constantly try to post their crap to mainstream social media, but they have a harder time there. My guess is that they noticed Lemmy is gaining a lot of traction and has very poor media content control. Easy target.

Moderating media content is a difficult task, and centralized social media certainly have better filters and actual humans in place to review content. Sadly, only big tech companies can pay for the infrastructure to moderate media content at that scale.

      I don’t see an easy way for federated servers to cope with this.

      • @maxprime@lemmy.ml

        Yeah exactly. This is the main reason I decided not to attempt to self host a Lemmy instance. No way am I going to let anyone outside of my control have the ability to place a file of their choosing on my hardware. Big nope for me.

    • JamieOP

      Not really. You could technically locate the images and determine precisely which ones they are from their filenames, but that means you actually have to view the images long enough to pull the URL. I had no desire to view them for even a moment, and just universally removed them.

      As mentioned in my edit above though, ensure you are in compliance with local regulations when dealing with the material in case you have to do any preservation for law enforcement or something.

        • JamieOP

          From what I was informed, purging a post doesn’t remove the associated cached data. So I didn’t take any chances.

  • Dandroid

    I got lucky. I am not subscribed to this community, and I am the only person on my instance. But what if I was subscribed and hadn’t seen this post? This is too much responsibility for me.

    I just shut down my instance until we can disable cached images. If that never happens, then I’m not bringing it back up.

    Shout-out to https://github.com/wescode/lemmy_migrate. I moved my subscriptions over in a minute or two, and now, other than not having my post history, it’s exactly the same.

  • Neuromancer

If the source deletes the post, won’t that remove it from all the instances?

      • regalia

        This isn’t trolling, this is just disgusting crime.

        • @Not_Alec_Baldwin@lemmy.world

          The crime happened in the past when the children were abused. This is some weird amalgam of criminal trolling.

          Edit: yeah yeah I get that csam is criminal, that’s why I called it an amalgam. It’s both trolling and criminal.

          • chiisana

Depending on the jurisdiction (I am not a lawyer, etc.), I’d imagine with a fairly high degree of probability that redistribution of CSAM is also a crime.

          • @ChunkMcHorkle@lemmy.dbzer0.com

            The crime happened in the past when the children were abused.

            That’s true. You could look at it that way and stop right there and remain absolutely correct. Or, you could also look at it from the eventual viewpoint of that victim as a human being: as long as that picture exists, they are being victimized by every new use of it, even if the act itself was done decades ago.

            Not trying to pile on, but anyone who has suffered that kind of violation as a child suffers for life to some extent. There are many who kill themselves, and even more that cannot escape addiction because the addiction is the only safe mental haven they have where life itself is bearable. Even more have PTSD and other mental difficulties that are beyond understanding for those who have not had their childhood development shattered by that, or worse, had that kind of abuse be a regular occurrence for them growing up.

So to me, adding a visual record of that original violating act to the public domain, where anyone can find and use it for sick pleasure, is an extension of the original violation and not very different from it.

The visual records are kind of a sick gift that never stops giving, and worse still if the victim knows the pics or videos are out there somewhere.

            I am well aware not everyone sees it this way, but an extra bit of understanding for the victims would not go amiss. Imagine being an adult and browsing the web, thinking it’s all in the past and maybe you’re safe now, and stumbling across a picture of yourself being raped at the age of five, or whatever, or worse still, having friends or family or spouse or children stumble across it.

            So speaking only for myself, I think CSAM is a moral crime whenever it is accessed, one of the most hellish that can be committed against another human being, regardless of the specificities of the law.

            I don’t have a problem with much else that people share, but goddamn I do have a problem with that.

          • Dark Arc

            It’s still a crime. Taking the pictures is a crime. Sharing the pictures is also a crime.

  • @drcobaltjedi@programming.dev

    I was looking into self hosting. What can I do to avoid dealing with this? Can I not cache images? Would I get in legal trouble for being federated with an instance being spammed?

  • @itsdavetho@lemmy.world

I am literally going to give up social media in general if this doesn’t stop.

Saw it late last night, around 3am. It made me sick; I honestly almost cried, but I just closed the app and tried not to think about it.

Whatever the goal is, it’s a stark reminder that there are monsters creeping in the shadows everywhere you go.

      • 𝒍𝒆𝒎𝒂𝒏𝒏

You don’t; those are the collateral damage.

        IMO it’s better to just nuke every image from the last 24 hours than to subject yourself to that kind of heinous, disgusting content