Since Reddit content being used to train AI was part of what triggered their Dumb Actions™️, is there a way to deal with this on Lemmy? If there’s a way to license API access or the content itself under, say, LGPL to prevent commercial AI from using it that would be awesome. With the way ActivityPub works I’m not sure if that’s possible though.
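
From what I understand, ActivityPub objects are just JSON-LD, so an instance could in principle stamp a license onto everything it federates as an extension property. Here’s a rough sketch of what that might look like; the `license` term, the example.org context URL and the object shape are made up for illustration, since neither Lemmy nor the ActivityStreams vocabulary defines anything like this today:

```typescript
// Hypothetical sketch: attaching a license to an outgoing ActivityPub Note via a
// JSON-LD extension property. The "license" term, the example.org context URL and
// the example URLs below are illustrative, not anything Lemmy or AS2 specifies.

interface ApNote {
  "@context": (string | Record<string, string>)[];
  id: string;
  type: "Note";
  attributedTo: string;
  content: string;
  license?: string; // extension property, not part of the core AS2 vocabulary
}

function buildLicensedNote(id: string, author: string, content: string): ApNote {
  return {
    "@context": [
      "https://www.w3.org/ns/activitystreams",
      // Map the extension term so remote servers could resolve it, if they cared.
      { license: "https://example.org/ns#license" },
    ],
    id,
    type: "Note",
    attributedTo: author,
    content,
    // Declaring a license is the easy part; every federated copy would have to
    // preserve and honour this field, and nothing in the protocol enforces that.
    license: "https://creativecommons.org/licenses/by-nc-sa/4.0/",
  };
}

console.log(JSON.stringify(
  buildLicensedNote(
    "https://lemmy.example/post/123",
    "https://lemmy.example/u/alice",
    "Hello fediverse",
  ),
  null,
  2,
));
```

Even then, the field would only travel as far as the next server chooses to preserve it, and a scraper reading the public API or HTML just sees it as more data, so declaring a license and enforcing it are very different problems.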

  • nachtigall@feddit.de · 2 years ago

    That is actually quite an interesting question. What is the license of the content posted to Lemmy? Would it be legal to share posts, or to use code posted here in proprietary projects? Do people retain full copyright, thus making sharing illegal? Can an instance define a standard license for content in its legal terms (like Stack Overflow does)?

    Finally, who would enforce the license?

    Also, I don’t think the people who scrape training data care about any of this.

  • chobeat@lemmy.ml · 2 years ago

    Right now the whole model of generative AI, and of LLMs in general, is built on the assumption that training a machine learning model is not a problem for licenses, copyright and whatever. Obviously this is leading to huge legal battles, and until their outcome is clear and a new legal practice or specific regulations are established in the EU and the USA, there’s no point discussing licenses.

    Also, licenses don’t prevent anything; they are not magic. If small or big AI companies feel safe violating these laws, or simply profit enough to pay the fines, they will keep doing it. It’s the same with FOSS licenses: most small companies violate them, and unless you have whistleblowers, you never find out. Even then, the legal path is very long. Only big corporations scared of humongous lawsuits really care about it, but small startups? Small consultancies? They don’t care. Licenses are just a sign that says “STOP! Or go on, I’m a license, not a cop”.