• kromem@lemmy.world · 2 years ago

    This doesn’t work outside of laboratory conditions.

    It’s the equivalent of “doctors find cure for cancer (in mice).”

    • bier@feddit.nl · 2 years ago

      I like that example. Every time you hear about some discovery that X kills 100% of cancer cells in a petri dish, you have to remember: so does bleach.

      • Meowoem@sh.itjust.works · 2 years ago

        It’s clever, really. People who don’t like AI are very likely to also not understand the technology. If you’re going to grift, it’s a perfect set of rubes: tell them your magic code will defeat the evil magic code of the AI and that’s all they need to know; fudge some numbers and they’ll throw their money at you.

        • Misconduct@lemmy.world · 2 years ago

          What’s not clever is making stuff up to not really make a point after typing a whole paragraph lmao

  • bonus_crab@lemmy.world · 2 years ago

    Big companies already have all your uncorrupted artwork; all this does is prevent any new competition from cropping up.

      • hperrin@lemmy.world · 2 years ago

        You don’t follow the license that it was distributed under.

        Commonly: you use open source code in your project, that code is under a license requiring your project to be open source too, but you keep yours closed source.

        • fidodo@lemmy.world · 2 years ago

          I still wouldn’t call it stealing. I guess “broke open source code licenses” doesn’t have the same impact, but I’d prefer accuracy.

          • bamboo@lemm.ee · 2 years ago

            It’s piracy: distributing copyrighted works against the terms of their licenses. I agree stealing is not really the right word.

  • ScaredDuck@sopuli.xyz · 2 years ago

    Won’t this thing actually help the AI models in the long run? The biggest issue I’ve heard is the possibility of AI generated images getting into the training dataset, but “poisoned” artworks are basically guaranteed to be of human origin.

  • General_Effort@lemmy.world · 2 years ago

    Explanation of how this works.

    These “AI models” (meaning the free and open Stable Diffusion in particular) consist of different parts. The important parts here are the VAE and the actual “image maker” (U-Net).

    A VAE (Variational AutoEncoder) is a kind of AI that can be used to compress data. In image generators, a VAE is used to compress the images. The actual image AI only works on the smaller, compressed image (the latent representation), which means it needs a less powerful computer (and uses less energy). That’s what makes it possible to run Stable Diffusion at home.
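
    To make that concrete, here’s a minimal sketch (my illustration, not from the comment above) of encoding an image into Stable Diffusion’s latent space with the Hugging Face diffusers library; the model name and the 512×512 size are assumptions based on SD v1.x defaults:

    ```python
    import torch
    import numpy as np
    from PIL import Image
    from diffusers import AutoencoderKL

    # A commonly used drop-in VAE for Stable Diffusion v1.x (assumed here).
    vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")
    vae.eval()

    def to_tensor(img: Image.Image) -> torch.Tensor:
        # Convert a PIL image to the [-1, 1] float tensor the VAE expects.
        arr = np.asarray(img.convert("RGB").resize((512, 512)), dtype=np.float32)
        return torch.from_numpy(arr / 127.5 - 1.0).permute(2, 0, 1).unsqueeze(0)

    with torch.no_grad():
        pixels = to_tensor(Image.open("cat.png"))          # 1 x 3 x 512 x 512
        latents = vae.encode(pixels).latent_dist.sample()  # 1 x 4 x 64 x 64

    # The U-Net denoises in this 4x64x64 latent space: 48x fewer values than
    # the 3x512x512 pixel image, which is why it runs on a home GPU.
    print(latents.shape)
    ```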

    This attack targets the VAE. The image is altered so that its latent representation is that of a very different image, while still looking roughly the same to humans. Say you take an image of a cat and an image of a dog. You put both through the VAE to get their latent representations. Now you alter the cat image until its latent representation is similar to the dog’s, changing it only in small ways and checking that it still looks the same to humans. So what the actual image-maker AI “sees” is very different from the image the human sees.
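
    And a hedged sketch of the cat/dog attack just described, reusing vae and to_tensor from the previous snippet (again my illustration, not Nightshade’s actual code): an optimizer nudges the cat image toward the dog’s latent while a small per-pixel budget keeps the change nearly invisible to humans.

    ```python
    import torch
    import torch.nn.functional as F
    from PIL import Image

    vae.requires_grad_(False)  # we only optimize the image, not the VAE

    cat = to_tensor(Image.open("cat.png"))
    dog = to_tensor(Image.open("dog.png"))

    with torch.no_grad():
        target = vae.encode(dog).latent_dist.mean  # the latent to mimic

    delta = torch.zeros_like(cat, requires_grad=True)  # learned perturbation
    opt = torch.optim.Adam([delta], lr=1e-2)
    budget = 2 * 8 / 255  # an 8/255 pixel change, scaled to the [-1, 1] range

    for step in range(500):
        poisoned = (cat + delta).clamp(-1.0, 1.0)
        latent = vae.encode(poisoned).latent_dist.mean
        loss = F.mse_loss(latent, target)  # pull the cat's latent to the dog's
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)  # keep the change small to human eyes

    # (cat + delta) still looks like a cat to a person, but this particular
    # VAE now encodes it roughly where it encodes the dog.
    ```

    A real attack would use a perceptual similarity check (e.g. LPIPS) rather than a plain pixel clamp, but the idea is the same.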

    Obviously, this only works if you have access to the VAE used by the image generator. So, it only works against open source AI; basically only Stable Diffusion at this point. Companies that use a closed source VAE cannot be attacked in this way.


    I guess it makes sense if your ideology is that information must be owned and everything should make money for someone. I guess some people see a cyberpunk dystopia as a desirable future. I wonder if it bothers them that all the tools they used are free (e.g. the method to check that images still look similar to humans).

    It doesn’t seem to be a very effective attack, but it may have some long-term PR effect. Training an AI costs a fair amount of money. People who give that away for free probably still have some ulterior motive, such as being liked. If instead you get the full hate of a few anarcho-capitalists who threaten digital vandalism, you may be deterred. Well, my two cents.

  • Zealousideal_Fox900@lemmy.world · 2 years ago

    As an artist, Nightshade is not something I will ever use. All my art is public domain, AI included. Let people generate as many pigeon pictures as they want, I say!

        • Drewelite@lemmynsfw.com · 2 years ago

          Yeah, same. Empowering people to be more creative has never struck me as something that needs to be gatekept. Tools have constantly improved, allowing more people to become artists. If it’s the copying of styles you’re worried about, I’d take it up with every artist that’s learned from Picasso or Da Vinci.

  • vsis@feddit.cl · 2 years ago

    It’s not FOSS and I don’t see a way to review if what they claim is actually true.

    It may just be a way to help differentiate legitimate human-made work from machine-generated work, thus helping AI training models.

    Can’t demonstrate that either, because its license expressly forbids adapting the software to other uses:

    “Edit, alter, modify, adapt, translate or otherwise change the whole or any part of the Software nor permit the whole or any part of the Software to be combined with or become incorporated in any other software, nor decompile, disassemble or reverse engineer the Software or attempt to do any such things”

    sauce: https://nightshade.cs.uchicago.edu/downloads.html

    • JATth@lemmy.world · 2 years ago

      I read the article enough to find that the Nightshade tool is under an EULA… :(

      Because it definitely is not FOSS, use it with caution, preferably on a system not connected to the internet.

    • nybble41@programming.dev · 2 years ago

      The EULA also prohibits using Nightshade “for any commercial purpose”, so arguably if you make money from your art—in any way—you’re not allowed to use Nightshade to “poison” it.

      • Nommer@sh.itjust.works · 2 years ago

        This is the part most people will ignore, but I get that it’s mainly meant for big actors.

  • SPRUNT@lemmy.world · 2 years ago

    Is there a similar tool that will “poison” my personal tracked data? Like, I know I’m going to be tracked and have a profile built on me by nearly everywhere online. Is there a tool that I can use to muddy that profile so it doesn’t know if I’m a trans Brazilian pet store owner, a Nigerian bowling alley systems engineer, or a Beverly Hills sanitation worker who moonlights as a practice subject for budding proctologists?

    • Australis13@fedia.io · 2 years ago

      The browser add-on “AdNauseam” can help with that, although it’s not a complete solution.

    • Ghostalmedia@lemmy.world · 2 years ago

      The only way to taint your behavioral data so that you don’t get lumped into a targetable cohort is to behave like a maniac. As I’ve said in a past comment here, when you fill out forms, pretend your gender, race, and age are fluid. Also, pretend you’re nomadic. Then behave erratic as fuck when shopping online: pay for bibles, butt plugs, taxidermy, and PETA donations.

      Your data will be absolute trash. You’ll also be miserable, because every week you’ll be visiting the Amazon drop-off center to return gag balls and porcelain Jesus figurines.