• moreeni@lemm.ee · 1 year ago

    https://github.com/LemmyNet/lemmy/blob/main/LICENSE

    The software is provided as is. Period. They never agreed to be support staff for anyone, and the amount of work doesn’t really correlate with the amount of money they get from donations unless they both live in a third-world country.

    https://jacobtomlinson.dev/posts/2022/dont-be-that-open-source-user-dont-be-me/

    It’s just a matter of not being entitled, that’s it. All I’m asking is that people be more polite to FOSS devs. If they stop doing their work right now, what are you going to do? Implement the mod tools yourself? Then go ahead.

    • gabe [he/him]@literature.cafe · 1 year ago

      I’m sorry, but I have difficulty being polite to someone who has actively ignored safety concerns that were brought up months ago. FOSS or not.

      • Rodeo@lemmy.ca · 1 year ago

        Stop misconstruing it as safety. It’s about legality. Nobody’s safety is in jeopardy because they accidentally saw an illegal image. This is about following the law, not protecting the safety of users.

        • toasteecup@lemmy.world · 1 year ago

          “nobody’s safety is in jeopardy”

          You know, except for those abuse victims whose pictures are being spread around Lemmy. Just sayin’.

          • EhForumUser@lemmy.ca · 1 year ago

            The theory behind why CSAM is illegal is that if someone is willing to pay for it, that payment incentivizes the production of even more CSAM, and that incentivized additional production means even more abuse. A perfectly reasonable take, and something I think can be demonstrated.

            But why would accidentally seeing CSAM prompt you to pay for it and create that incentive? Are you worried that you’re a closeted pedophile who, after your first taste, will be ready to shower money on those who record such content so you can see more and more?

          • Rodeo@lemmy.ca · 1 year ago

            I thought it was pretty apparent we were talking about Lemmy, but okay.

            The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs have some direct power over that.

            I’m sure you feel very morally aloof with your righteous retort, though.

        • gabe [he/him]@literature.cafe · 1 year ago

          It ties into safety as well: websites have “trust and safety” teams, and this falls under that umbrella. Sorry for not being more precise.

          • Rodeo@lemmy.ca · 1 year ago

            No need to apologize; I just think “safety” is a misnomer here.

        • The Cuuuuube@beehaw.org · 1 year ago

          “CSAM laws aren’t for the safety of real people” is one of the hottest takes I’ve ever seen in my life.

          • Rodeo@lemmy.ca · 1 year ago

            Straight outta Reddit with that one.

            I’m just going to copy-paste my other comment:

            I thought it was pretty apparent we were talking about Lemmy, but okay.

            The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs have some direct power over that.

            I’m sure you feel very morally aloof with your righteous retort, though.

            • The Cuuuuube@beehaw.org · 1 year ago

              Yes. Obviously we’re talking about Lemmy. We just still fundamentally disagree on the forms of harm, psychic and physical, that can be experienced through the rapid propagation of CSAM. Lemmy’s lack of mod tools has been a major topic of discussion for a while now. I don’t care to carry on this conversation because it’s clear our starting points are too far apart to meet in the middle.

              • Rodeo@lemmy.ca · 1 year ago

                I think the other guy’s comment is well suited as a response to this, so again I’ll copy-paste:

                The theory behind why CSAM is illegal is that if someone is willing to pay for it, that payment incentivizes the production of even more CSAM, and that incentivized additional production means even more abuse. A perfectly reasonable take, and something I think can be demonstrated.

                But why would accidentally seeing CSAM prompt you to pay for it and create that incentive?

                How could reason possibly prevail when the subject matter is so sensitive?