• @[email protected]
    link
    fedilink
    English
    1411 months ago

    But did they super duper pinky promise, cross their heart, hope to die, poke a needle in their eye??

  • @[email protected]
    link
    fedilink
    English
    4511 months ago

    I’m betting the reason they want access to “moderate” your projects is to train their AI. Literally looking to steal artists work before it’s out the door.

    • @[email protected]
      link
      fedilink
      English
      1411 months ago

      That’s absolutely what’s going on.

      A fun way to combat this would be to get every artist to add giant, throbbing dicks to everything they create in Photoshop with the hope that it creates the thirstiest, nastiest AI model out there.

      • @[email protected]
        link
        fedilink
        English
        5
        edit-2
        11 months ago

        Not just dicks, but dicks mixed with other art so it just completely pollutes the training data and the AI has no idea how to draw anything without it kind of looking like a dick. Dicks with human and animal faces, boats shaped like dicks, dick buildings and landscapes etc.

        It would take an immense amount of bad data to actually work, but it would be funny.

  • @[email protected]
    link
    fedilink
    English
    77
    edit-2
    11 months ago

    Every time we’ve trusted a big tech company on an unverifiable promise, they’ve ended up shafting us. Just sayin’.

  • @[email protected]
    link
    fedilink
    English
    311 months ago

    They just wanna review your work 😀. What if you’re trying to put a penis on Trump’s face and it’s too big or it’s pointing the wrong way or something? You know. Wouldn’t you want to be told stuff like “the police are coming unless you erase this now!”? You know, things like that? It would definitely come in handy to catch kids doing nudes of others. Or adults doing nudes of other adults who didn’t know. I wouldn’t want to end up in a collage of nudes that’s 20MB, 1080p or 4K.

  • nelson · 42 points · 11 months ago

    Here’s a license change which implies we’re data-farming all your assets.

    Here’s my word that we’re absolutely not going to be doing that. Trust me bro.

  • @[email protected]
    link
    fedilink
    English
    4911 months ago

    “claims that the company often uses machine learning to review user projects for signs of illegal content”

    OK, so what happens when Florida starts deciding more content is illegal?

    Literally Big Brother shit.

  • @[email protected]
    link
    fedilink
    English
    1011 months ago

    Bullshit

    I have a sub. I downloaded their app. And then they have the gall to install anti-piracy software?

  • @[email protected]
    link
    fedilink
    English
    38
    edit-2
    11 months ago

    Riiiight. And, pray tell, Adobe, why in the everloving fuck would you ever need to “review” private content that’s not posted anywhere? Stop acting like you’re the goddamned pre-crime agency from Minority Report and keep your dirty paws off stuff people are creating privately.

    You are providing tools, and that’s it. I can do horrible, illegal shit with my drill, but that doesn’t give Black & Decker any right to break into my house to do random checks and see if I’m drilling through kneecaps instead of wooden planks…

  • @[email protected]
    link
    fedilink
    English
    1211 months ago

    Interesting: we get to either hate them for going full Big Brother, or hate them for going full Adobe in the first place. It’s nice to have a choice sometimes.