• @[email protected]
    link
    fedilink
    English
    107
    edit-2
    2 months ago

    Your “facial data” isn’t private information. You give it away every time you go outside.

    • @[email protected]
      link
      fedilink
      52 months ago

      Your face being outside isn’t your “facial data”. At minimum, that data requires a good-enough-quality image that is linked to some piece of your identity, e.g. a name or social security number. If you just walk around and people take photos of your face, they don’t have your “facial data”. That’s the entire reason reverse image search and similar services exist: it is NOT an easy problem, technically speaking.

    • ℍ𝕂-𝟞𝟝 · 43 · 2 months ago

      But your likeness does belong to you. Try making money off of an AI movie featuring Taylor Swift.

      • @[email protected]
        link
        fedilink
        322 months ago

        Don’t paparazzi make plenty of money off of selling unauthorized photos of celebrities? Celebrities can control some uses of their likeness, but not all of them.

        • @[email protected]
          link
          fedilink
          202 months ago

          True, though for now paparazzi photos are generally “here’s the celebrity in real life doing [x],” whereas AI is “the celebrity never did this thing, but we applied their image/voice so it looks like they did.” It’s really difficult for celebs to shut down tabloid or fan AI-generated garbage, but I think the bigger issue for them right now is film or music studios just using their likeness to keep the profits churning.

    • @[email protected]
      link
      fedilink
      332 months ago

      You’re talking about the American concept of having no privacy in public. Not all countries are like that.