I’m not going to name names here, because there are multiple places doing this and I’ve also forgotten which instance I saw it on, but I’ve noticed something disturbing about the automated repost bots. You know, the bots that copy whole Reddit communities over to Lemmy with streams of automated posts? I don’t like them in general, because when I reply to a post I want the OP to actually see my reply, but this issue is more ethical: the automated duplication of porn from Reddit to Lemmy.

Now, I know that these models have consented to their images being shared on the internet. However, in my own personal opinion, porn models should retain some control over how their image is shared on a public forum. In this case, the people posting their naked bodies have no control over how the image is distributed. They can’t delete it if they revoke consent later, and they can’t report creepy comments on their pictures. In most cases they probably don’t even know what Lemmy is, yet their images are being search-indexed and shared. The person whose body is in the picture has no control over the distribution. I consider that a form of non-consensual intimate media, and I don’t think these bots should be allowed to repost porn without asking permission from the user who originally shared it.

  • m-p{3}@lemmy.ca · 2 months ago
    Most, if not all, of these images are hosted on i.reddit.com or imgur, so the links stop working if the original Reddit or imgur post is taken down.

    That way, the OP on Reddit remains in control even if they’re unaware of the bot.

    It doesn’t stop someone from saving the image or video and republishing it somewhere else without the OP’s consent (either here or on Reddit), but that’s not what the bot is doing; it’s simply linking to the original source.

  • InquisitiveApathy@lemm.ee · 2 months ago
    You’re probably thinking of the Lemmit bot and community. There’s no reason not to name and shame here.

    While I find it weird, unnecessary, and kind of gross too, there’s an (or there should be an) implicit understanding that everything anyone puts on the internet is there to stay, forever. Rarely does deleting a photo actually remove it for good, so your point about consent isn’t quite relevant.

    The vast majority of the women being reposted are OnlyFans models, and they watermark their photos for exactly this reason; reposting like this only serves as extra advertising. For real exhibitionists and people who aren’t making money off the transaction, yes, it’s gross, but I don’t think you can write any universal rule that easily differentiates between the two without very heavy moderation.

    • Microw@lemm.ee · 2 months ago

      It’s not just women, btw. But I agree. I’m not sure whether Lemmit has a system in place for takedown requests based on personality rights; it should.

      • InquisitiveApathy@lemm.ee · 2 months ago

        You’re right, it’s not. Nudes of everyone, and the mirrors of SFW posts, are also a problem. I once made the mistake of commenting on a mirrored post without noticing, then immediately blocked the bot and instance once I realized. I see no value in addressing a lack of content that way.

        The problem with allowing takedown requests, though, is that most people who are being mirrored don’t know it’s happening. You can’t easily stop something you don’t know exists and never opted into in the first place.

  • sabreW4K3@lazysoci.al · 2 months ago

    I’m not a fan of stealing nudes from people who willingly share them online with a chosen community. But people will always find weird ways to justify theft.