• cum
    link
    fedilink
    English
    1
    1 year ago

    Wow, I’m shocked! Just like how OpenAI preached “privacy and ethics” and then went dead silent on data hoarding and scraping, then privatized their stolen scraped data. If they insist their data collection is private, then it needs regular external audits by strict data-privacy firms, just like they do with security.

  • @[email protected]
    link
    fedilink
    English
    32
    1 year ago

    What social contract? When sites regularly have a robots.txt that says “only Google may crawl”, they’re effectively helping enforce a monopoly, and that’s not a social contract I’d ever agree to.

  • palordrolap
    link
    fedilink
    209
    1 year ago

    Put something in robots.txt that isn’t supposed to be hit and is hard to hit by non-robots. Log and ban all IPs that hit it.

    Imperfect, but can’t think of a better solution.
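A minimal sketch of that log-and-ban idea. The trap path and the common-log-format pattern here are assumptions for illustration, not anything from the thread:

```python
# Sketch: collect IPs that requested a trap URL which robots.txt disallows.
import re

TRAP_PATH = "/secret-trap-page"  # also listed as Disallow: in robots.txt
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET (\S+)')  # common log format

def ips_to_ban(log_lines):
    """Return the set of client IPs that fetched the trap URL."""
    banned = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) == TRAP_PATH:
            banned.add(m.group(1))  # this client ignored robots.txt
    return banned
```

The resulting set could then be fed to a firewall rule or a fail2ban-style tool; that wiring is left out here.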

    • @[email protected]
      link
      fedilink
      English
      14
      1 year ago

      Yeah, this is a pretty classic honeypot method. Basically, make something available but invisible to the normal user. Then you know anyone who accesses it is not a normal user.

      I’ve even seen this done with Steam achievements before: there was a hidden achievement which was only obtainable via hacking, so anyone who used hacks immediately outed themselves with a rare achievement that was visible on their profile.

      • @[email protected]
        link
        fedilink
        English
        3
        1 year ago

        There are tools that just flag you as having gotten an achievement on Steam, you don’t even have to have the game open to do it. I’d hardly call that ‘hacking’.

      • @[email protected]
        link
        fedilink
        English
        13
        1 year ago

        That’s a bit annoying as it means you can’t 100% the game as there will always be one achievement you can’t get.

    • @[email protected]
      link
      fedilink
      English
      4
      edit-2
      1 year ago

      Better yet, point the crawler to a massive text file of almost but not quite grammatically correct garbage to poison the model. Something it will recognize as language and internalize, but severely degrade the quality of its output.
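One cheap way to mass-produce that kind of almost-but-not-quite-grammatical garbage is template filling: correct on the surface, meaningless underneath. The vocabulary and sentence template below are invented for illustration:

```python
# Sketch: generate filler text that parses as language but carries no meaning.
import random

NOUNS = ["archive", "crawler", "dataset", "contract", "server"]
VERBS = ["ingests", "forgets", "negotiates", "replicates", "disavows"]
ADJS  = ["quiet", "recursive", "unlicensed", "brittle", "verdant"]

def garbage_sentence(rng=random):
    # Grammatical on the surface, semantically empty underneath.
    return (f"The {rng.choice(ADJS)} {rng.choice(NOUNS)} "
            f"{rng.choice(VERBS)} every {rng.choice(ADJS)} {rng.choice(NOUNS)}.")

def garbage_page(n=200, rng=random):
    """Join n garbage sentences into one page of poison text."""
    return " ".join(garbage_sentence(rng) for _ in range(n))
```

Whether this actually degrades a model in practice is an open question; the sketch only shows how cheaply such text can be produced.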

    • Aatube
      link
      fedilink
      12
      1 year ago

      robots.txt is purely textual; you can’t run JavaScript or log anything. Plus, one who doesn’t intend to follow robots.txt wouldn’t query it.

      • @[email protected]
        link
        fedilink
        English
        15
        1 year ago

        Your second point is a good one, but you absolutely can log the IP that requested robots.txt. That’s just a standard part of any HTTP server, no JavaScript needed.

        • @[email protected]
          link
          fedilink
          English
          9
          1 year ago

          You’d probably have to go out of your way to avoid logging this. I’ve always seen such logs enabled by default when setting up web servers.

      • @[email protected]
        link
        fedilink
        English
        45
        1 year ago

        If it doesn’t get queried, that’s the fault of the web scraper. You don’t need JS built into the robots.txt file either. Just add a line like:

        Disallow: /here-there-be-dragons.html
        

        Any client that hits that page (and maybe doesn’t pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
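A rough sketch of serving that “long stream of nonsense”, using Python’s standard http.server. The trap path matches the Disallow example above; the filler vocabulary is made up:

```python
# Sketch: log and "reward" clients that hit the path robots.txt told them to avoid.
import http.server
import random

TRAP_PATH = "/here-there-be-dragons.html"  # the path disallowed in robots.txt
WORDS = ["the", "dragon", "hoards", "gilded", "manuals", "beneath", "archives"]

def nonsense(n_sentences=50):
    """Yield filler sentences that look vaguely like prose."""
    for _ in range(n_sentences):
        yield " ".join(random.choices(WORDS, k=8)).capitalize() + ". "

class TrapHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == TRAP_PATH:
            print("trap hit from", self.client_address[0])  # ban-list candidate
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            for sentence in nonsense():
                self.wfile.write(sentence.encode())
        else:
            self.send_error(404)
```

In practice you’d run this behind the real site and only route the trap path to it; a captcha check before banning, as suggested above, would avoid punishing curious humans.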

      • @[email protected]
        link
        fedilink
        English
        10
        1 year ago

        People not intending to follow it is the real reason not to bother, but it’s trivial to track who downloaded the file and then hit something they were asked not to.

        Like, 10 minutes work to do right. You don’t need js to do it at all.

    • Lvxferre [he/him]
      link
      fedilink
      English
      105
      edit-2
      1 year ago

      Good old honeytrap. I’m not sure, but I think that it’s doable.

      Have a honeytrap page somewhere in your website. Make sure that legit users won’t access it. Disallow crawling the honeytrap page through robots.txt.

      Then if some crawler still accesses it, you could record+ban it as you said… or you could be even nastier and let it do so. Fill the honeytrap page with poison - nonsensical text that would look like something that humans would write.

      • @[email protected]
        link
        fedilink
        English
        8
        1 year ago

        I’m the idiot human that digs through robots.txt and the site map to see things that aren’t normally accessible by an end user.

      • @[email protected]
        link
        fedilink
        English
        54
        1 year ago

        I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build blocklists by posting an email address on your website somewhere that was visible in the source code but not visible to normal users, like in a div positioned way off the left side of the screen.

        Anyway, spammers that run regular-expression searches for email addresses would email it and get their IPs added to naughty lists.

        I’d love to see something similar with robots.
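The off-screen trap described above might look something like this in the page source (the address, class name, and offset are all made up for illustration):

```html
<!-- Visible to scrapers reading the HTML, never seen by a human visitor. -->
<div class="contact-trap" style="position: absolute; left: -9999px;" aria-hidden="true">
  Reach us at <a href="mailto:trap-7f3@example.com">trap-7f3@example.com</a>
</div>
```

Any mail arriving at that address is, by construction, from something that scraped the source rather than read the page.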

        • Lvxferre [he/him]
          link
          fedilink
          English
          29
          edit-2
          1 year ago

          Yup, it’s the same approach as email spam traps, except for the naughty-list part. But… holy fuck, a shareable bot IP list is an amazing addition; it would increase the damage to those web-crawling businesses.

          • @[email protected]
            link
            fedilink
            English
            9
            1 year ago

            But with all of the cloud resources now, you can cycle through IP addresses without any trouble. Hell, you could just browse over IPv6 and not even worry, with how cheap those addresses are!

            • Lvxferre [he/him]
              link
              fedilink
              English
              9
              1 year ago

              Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.

        • Lvxferre [he/him]
          link
          fedilink
          English
          5
          1 year ago

          For banning: I’m not sure, but I don’t think so. It seems to me that prefetching behaviour is dictated by the page linking to another; to avoid any issue, all the site owner needs to do is not prefetch links to the honeytrap.

          For poisoning: I’m fairly certain that it doesn’t. At most you’d prefetch a page full of rubbish.

  • Cosmic Cleric
    link
    fedilink
    English
    120
    1 year ago

    As unscrupulous AI companies crawl for more and more data, the basic social contract of the web is falling apart.

    Honestly it seems like in all aspects of society the social contract is being ignored these days, that’s why things seem so much worse now.

    • @[email protected]
      link
      fedilink
      English
      14
      1 year ago

      Governments could do something about it, if they weren’t overwhelmed by bullshit from bullshit generators instead, and led by people driven by their personal wealth.

    • @[email protected]
      link
      fedilink
      English
      1
      1 year ago

      these days

      When, at any point in history, have people acknowledged that there was no social change or disruption and everyone was happy?

  • Optional
    link
    fedilink
    English
    107
    1 year ago

    Well the trump era has shown that ignoring social contracts and straight up crime are only met with profit and slavish devotion from a huge community of dipshits. So. Y’know.

    • @[email protected]
      link
      fedilink
      English
      6
      1 year ago

      Only if you’re already rich or in the right social circles though. Everyone else gets fined/jail time of course.

      • @[email protected]
        link
        fedilink
        English
        1
        1 year ago

        Meh maybe. I know plenty of people who get away with all kinds of crap without money or connections.

  • 𝐘Ⓞz҉
    link
    fedilink
    English
    27
    1 year ago

    No laws to govern so they can do anything they want. Blame boomer politicians not the companies.

    • @[email protected]
      link
      fedilink
      English
      1
      1 year ago

      I think that good behavior is implicitly mandated even if there’s nobody to punish you if you don’t.

    • gian
      link
      fedilink
      English
      16
      1 year ago

      Why not blame the companies? After all, they are the ones that are doing it, not the boomer politicians.

      And in the long term they are the ones that risk being “punished”; just imagine people getting tired of this shit and starting to block them at the firewall level…

      • @[email protected]
        link
        fedilink
        English
        2
        1 year ago

        Because the politicians also created the precedent that anything you can get away with, goes. They made the game, defined the objective, and then didn’t adapt quickly so that they and their friends would have a shot at cheating.

        There is absolutely no narrative of “what can you do for your country” anymore. It’s been replaced by the mottos of “every man for himself” and “get while the getting’s good”.

      • @[email protected]
        link
        fedilink
        English
        49
        edit-2
        1 year ago

        robots.txt is a file available at a standard location on web servers (example.com/robots.txt) which sets guidelines for how scrapers should behave.

        That can range from saying “don’t bother indexing the login page” to “Googlebot go away”.

        It’s also in the first paragraph of the article.
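For instance, the two cases mentioned above could look like this in a hypothetical robots.txt:

```
# https://example.com/robots.txt
User-agent: *
Disallow: /login          # "don't bother indexing the login page"

User-agent: Googlebot
Disallow: /               # "Googlebot go away"
```

Everything after a `#` is a comment; the directives themselves are just user-agent/path pairs.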

      • @[email protected]
        link
        fedilink
        English
        14
        1 year ago

        Robots.txt is a file that is accessible via an HTTP request. It’s a configuration file that sets rules for what automated web crawlers are allowed to do. It can set both who is and who isn’t allowed. Google is usually the most widely allowed bot, just because their crawler is how they find websites for search results. But it’s basically the honor system. You could write a scraper today that visits a page it’s been told it doesn’t have permission to view, ignore that, and still get the information.

        • Echo Dot
          link
          fedilink
          English
          5
          edit-2
          1 year ago

          I do not think it is even part of the HTTP protocol; I think it’s just a pseudo add-on. It’s barely even a protocol; it’s basically just a page that bots can look at, with no real pre-agreed syntax.

          If you want to make a bot that doesn’t respect robots.txt, you don’t even need to do anything complicated; you just leave out the code that looks at the page. It’s not enforceable at all.
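Python’s standard library even ships the “polite” side of this: urllib.robotparser will tell a crawler what the rules allow, but nothing forces a bot to consult it. A minimal offline demo:

```python
# The honor system in code: the parser answers, compliance is entirely opt-in.
from urllib.robotparser import RobotFileParser

rules = RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rules.can_fetch("MyBot", "https://example.com/private/data.html"))  # False
print(rules.can_fetch("MyBot", "https://example.com/index.html"))         # True
```

A bot that never calls can_fetch() simply fetches everything, and nothing in the protocol stops it.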

  • YTG123
    link
    fedilink
    English
    75
    1 year ago

    We need laws mandating respect of robots.txt. This is what happens when you don’t codify stuff

    • Armok: God of Blood
      link
      fedilink
      English
      11
      1 year ago

      Sounds like the type of thing that would either be unenforceable or profitable to violate compared to the fines.

      • @[email protected]
        link
        fedilink
        English
        21
        1 year ago

        The battle cry of conservatives everywhere: It’s too hard!

        Except if it involves oppressing minorities and women. Then it’s a moral imperative worth all the time and money you can shovel at it regardless of whether the desired outcome is realistic or not.

        • Jojo
          link
          fedilink
          English
          4
          1 year ago

          Seriously, could the party of “small government” get out of my business, please?

            • Jojo
              link
              fedilink
              English
              1
              1 year ago

              I just wish the push and pull of politics didn’t have to be played as a zero sum game. I wish someone could take the initiative and just…

              I think both parties in America sing pretty loud about “law and order.” I haven’t heard that cry particularly loudly from either side over the other. I don’t think I’ve heard anyone who claims to be a Democrat saying the end goal is “small government” but I have heard it from Republican voices.

              Honestly, I would really prefer if we were in a system that enabled more parties, so we didn’t have “parties” that did such contradictory things as the current ones…

              • @[email protected]
                link
                fedilink
                English
                1
                edit-2
                1 year ago

                The GOP has historically been the party of law and order. Hence why they implied that blue lives matter more than black lives.

                thatsthejoke.png

                Just like how one party impeached a president of the other for obstruction and abuse of power, and the other impeached a president for checks notes lying about a blowjob.

    • Echo Dot
      link
      fedilink
      English
      36
      1 year ago

      It’s a bad solution to a problem anyway. If we are going to legally mandate a solution I want to take the opportunity to come up with an actually better fix than the hacky solution that is robots.txt

    • @[email protected]
      link
      fedilink
      English
      19
      1 year ago

      Turning that into a law is ridiculous - you really can’t consider that more than advisory unless you enforce it with technical means. For example, maybe put it behind a login or captcha if you want only humans to see it

      • Kairos
        link
        fedilink
        English
        9
        1 year ago

        Are you aware of what “unlisted” means?

        • @[email protected]
          link
          fedilink
          English
          8
          edit-2
          1 year ago

          Yes, and there’s also no law against calling an unlisted phone number

          Also we already had this battle with robots.txt. In the beginning, search engines wouldn’t honor it either because they wanted the competitive advantage of more info, and websites trusted it too much and tried to wall off too much info that way.

          There were complaints, bad pr, lawsuits, call for a law

          It’s no longer the Wild West:

          • search engines are mature and generally honor robots.txt
          • websites use rate limiting to conserve resources and user logins to fence off data that there’s a reason to fence off
          • truce: neither side is as greedy
          • there is no such law nor is that reasonable
          • Kairos
            link
            fedilink
            English
            4
            1 year ago

            There’s also no law against visiting an unlisted webpage? What?

    • @[email protected]
      link
      fedilink
      English
      24
      1 year ago

      AI companies will probably get a free pass to ignore robots.txt even if it were enforced by law. That’s what they’re trying to do with copyright and it looks likely that they’ll get away with it.

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      1 year ago

      I hope not; laws tend to get outdated real fast. Who knows, robots.txt might not even be used in the future, and it’d just be sitting there taking up space for legal reasons.

      • Echo Dot
        link
        fedilink
        English
        5
        1 year ago

        We don’t need new laws we just need enforcement of existing laws. It is already illegal to copy copyrighted content, it’s just that the AI companies do it anyway and no one does anything about it.

        Enforcing respect for robots.txt doesn’t matter because the AI companies are already breaking the law.

          • Echo Dot
            link
            fedilink
            English
            3
            1 year ago

            Copyright law in general needs changing, though; that’s the real problem. I don’t see the advantage of turning a hacky workaround into a legally mandated requirement.

            Especially because there are many legitimate reasons to ignore robots.txt, including it being misconfigured, or it having been set up only for search engines when your bot isn’t a search-engine crawler.

      • jackeryjoo
        link
        fedilink
        English
        12
        1 year ago

        You can describe the law in a similar way to a specification, and you can make it as broad as needed. Something like the file name shouldn’t ever come up as an issue.

      • kingthrillgore
        link
        fedilink
        English
        4
        1 year ago

        robots.txt has been an unofficial standard for 30 years, and it’s augmented with sitemap.xml to help index uncrawlable pages and with Schema.org to expose contents for the Semantic Web. I’m not stating it should be a law, but to suggest changing norms as a reason is a pretty weak counterargument, man.

      • @[email protected]
        link
        fedilink
        English
        4
        1 year ago

        This seems to interestingly prove the point made by the person this is in reply to. Breaking laws comes with consequences. Not caring about a robots.txt file doesn’t. But maybe it should.

        • Tlaloc_Temporal
          link
          fedilink
          English
          3
          1 year ago

          My angle was more about all rules being social constructs, and said rules being important for the continued operation of society, but that’s a good angle too.

          Lots of laws don’t come with real punishments either, especially if you have money. We can change this too.

  • @[email protected]
    link
    fedilink
    English
    10
    1 year ago

    🤣🤣🤣🤣🤣🤣🤣 “robots.txt is a social contract” 🤣🤣🤣🤣🤣🤣🤣 🤡

    • TimeSquirrel
      link
      fedilink
      11
      edit-2
      1 year ago

      A lot of post-September 1993 internet users wouldn’t understand, I get it.

      • @[email protected]
        link
        fedilink
        English
        11 year ago

        post-September 1993

        you’re talking nonsense; for all I know, today is Wed 11124 Sep 1993

    • circuscritic
      link
      fedilink
      English
      7
      edit-2
      1 year ago

      I’ve just converted to polytheism and have begun praying to the Emoji God asking them to use 1,000 origami cry laughing Emojis to smite you down, so that you may die how you lived.

      I hope it won’t be quick, or painless, but that’s up to the Gods now.

      • Cosmic Cleric
        link
        fedilink
        English
        0
        1 year ago

        I hope it won’t be quick, or painless, but that’s up to the Gods now.

        Considering that we’re talking about emojis, it’ll definitely be silent.

      • Lvxferre [he/him]
        link
        fedilink
        English
        6
        1 year ago

        It’s completely off-topic, but you know 4chan filters? Like, replacing “fam” with “senpai” and stuff like this?

        So. It would be damn great if Lemmy had something similar. Except that it would replace emojis, “lol” and “lmao” with “I’m braindead.”

  • @[email protected]
    link
    fedilink
    English
    13
    1 year ago

    This is a very interesting read. It is very rare that people on the internet agree to follow one thing without being forced to.

    • Echo Dot
      link
      fedilink
      English
      16
      1 year ago

      Loads of crawlers don’t follow it; I’m not quite sure why AI companies not following it is anything special. Really, it’s just there to stop Google from indexing random internal pages that mess with your SEO.

      It barely even works for all search providers.

      • @[email protected]
        link
        fedilink
        English
        3
        1 year ago

        The Internet Archive does not make a useful villain and it doesn’t have money, anyway. There’s no reason to fight that battle and it’s harder to win.

  • molave
    link
    fedilink
    English
    23
    1 year ago

    Strong “the constitution is a piece of paper” energy right there

  • @[email protected]
    link
    fedilink
    English
    87
    edit-2
    1 year ago

    I would be shocked if any big corpo actually gave a shit about it, AI or no AI.

    if exists("/robots.txt"):
        no it fucking doesn't
    
    • BargsimBoyz
      link
      fedilink
      English
      2
      1 year ago

      Yeah, I always found it surprising that everyone just agreed to follow a text file on a website telling them how to act. It’s one of the worst-thought-out yet most significant parts of browsing, still out there pretty much from the beginning.

    • @[email protected]
      link
      fedilink
      English
      44
      1 year ago

      Robots.txt is in theory meant to be there so that web crawlers don’t waste their time traversing a website in an inefficient way. It’s there to help, not hinder them. There is a social contract being broken here and in the long term it will have a negative impact on the web.