As unscrupulous AI companies crawl for more and more data, the basic social contract of the web is falling apart.
Honestly, it seems like the social contract is being ignored in all aspects of society these days; that's why things seem so much worse now.
Governments could do something about it, if they weren't overwhelmed by bullshit from bullshit generators instead and led by people driven by their personal wealth.
these days
When, at any point in history, have people acknowledged that there was no social change or disruption and everyone was happy?
It’s abuse, plain and simple.
They didn't violate the social contract, they disrupted it.
True innovation. So brave.
No laws govern this, so they can do anything they want. Blame boomer politicians, not the companies.
I think that good behavior is implicitly mandated even if there's nobody to punish you when you don't comply.
Lol
Why not blame the companies? After all, they are the ones doing it, not the boomer politicians.
And in the long term, they are the ones that risk being "punished"; just imagine people getting tired of this shit and starting to block them at the firewall level…
Because the politicians also created the precedent that anything you can get away with, goes. They made the game, defined the objective, and then didn’t adapt quickly so that they and their friends would have a shot at cheating.
There is absolutely no narrative of “what can you do for your country” anymore. It’s been replaced by the mottos of “every man for himself” and “get while the getting’s good”.
Why not both?
Fhdj glgllf d’‘’‘’'×÷π•=|¶ fkssb
No idea why you're getting downvotes; in my opinion it was very eloquently said.
good. robots.txt was always a bad idea
Like so many terrible ideas, it worked flawlessly for generations
This is the best summary I could come up with:
If you hosted your website on your computer, as many people did, or on hastily constructed server software run through your home internet connection, all it took was a few robots overzealously downloading your pages for things to break and the phone bill to spike.
AI companies like OpenAI are crawling the web in order to train large language models that could once again fundamentally change the way we access and share information.
In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, has made high-quality training data one of the internet's most valuable commodities.
You might build a totally innocent one to crawl around and make sure all your on-page links still lead to other live pages; you might send a much sketchier one around the web harvesting every email address or phone number you can find.
The New York Times blocked GPTBot as well, months before launching a suit against OpenAI alleging that OpenAI’s models “were built by copying and using millions of The Times’s copyrighted news articles, in-depth investigations, opinion pieces, reviews, how-to guides, and more.” A study by Ben Welsh, the news applications editor at Reuters, found that 606 of 1,156 surveyed publishers had blocked GPTBot in their robots.txt file.
“We recognize that existing web publisher controls were developed before new AI and research use cases,” Google’s VP of trust Danielle Romain wrote last year.
The original article contains 2,912 words, the summary contains 239 words. Saved 92%. I’m a bot and I’m open source!
Put something in robots.txt that isn’t supposed to be hit and is hard to hit by non-robots. Log and ban all IPs that hit it.
Imperfect, but can’t think of a better solution.
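Something like this would do it; a minimal sketch assuming a Flask app (the trap path and in-memory ban set are made up for illustration, and a real setup would persist the list or push bans down to the firewall):

```python
# Hypothetical honeypot: ban any IP that requests a URL which
# robots.txt explicitly disallows and which no human would find.
from flask import Flask, abort, request

app = Flask(__name__)
BANNED_IPS = set()  # illustration only; persist this (file, Redis, fail2ban) in practice

@app.before_request
def reject_banned():
    # Refuse all further requests from IPs that ever touched the trap.
    if request.remote_addr in BANNED_IPS:
        abort(403)

@app.route("/here-there-be-dragons.html")  # listed as Disallow in robots.txt
def honeypot():
    # Any client reaching this ignored robots.txt: log it and ban it.
    BANNED_IPS.add(request.remote_addr)
    app.logger.warning("banned misbehaving crawler at %s", request.remote_addr)
    abort(403)
```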
a bad-bot .htaccess trap.
robots.txt is purely textual; you can't run JavaScript or log anything. Plus, anyone who doesn't intend to follow robots.txt wouldn't query it anyway.
People not intending to follow it is the real reason not to bother, but it’s trivial to track who downloaded the file and then hit something they were asked not to.
Like, 10 minutes' work to do right. You don't need JS to do it at all.
If it doesn't get queried, that's the fault of the web scraper. You don't need JS built into the robots.txt file either. Just add a line like:
Disallow: /here-there-be-dragons.html
Any client that hits that page (and maybe doesn’t pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
I actually love the data-poisoning approach. I think that sort of strategy is going to be an unfortunately necessary part of the future of the web.
Nice idea! Better use /dev/urandom though, as that is non-blocking. See here.
That was really interesting. I always used urandom out of habit and wondered what the difference was.
I wonder if Nginx would just load random into memory until the kernel OOM kills it.
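It shouldn't have to, as long as the response is generated in chunks; a rough sketch, assuming Flask (a generator response is sent with chunked transfer encoding, and any reverse proxy in front would still need buffering turned off, e.g. nginx's proxy_buffering):

```python
# Hypothetical tarpit: stream /dev/urandom in small chunks so the
# app never holds more than one chunk in memory at a time.
from flask import Flask, Response

app = Flask(__name__)

def urandom_chunks(chunk_size=4096):
    with open("/dev/urandom", "rb") as f:
        while True:
            yield f.read(chunk_size)  # endless stream, one chunk at a time

@app.route("/here-there-be-dragons.html")
def tarpit():
    # Flask streams generator responses instead of buffering them.
    return Response(urandom_chunks(), mimetype="application/octet-stream")
```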
Your second point is a good one, but you absolutely can log the IP which requested robots.txt. That's just a standard part of any HTTP server ever, no JavaScript needed.
You’d probably have to go out of your way to avoid logging this. I’ve always seen such logs enabled by default when setting up web servers.
Yeah, this is a pretty classic honeypot method. Basically make something available but inaccessible to the normal user. Then you know anyone who accesses it is not a normal user.
I've even seen this done with Steam achievements before: there was a hidden game achievement which was only obtainable via hacking. So anyone who used hacks immediately outed themselves with a rare achievement that was visible on their profile.
There are tools that just flag you as having gotten an achievement on Steam, you don’t even have to have the game open to do it. I’d hardly call that ‘hacking’.
That’s a bit annoying as it means you can’t 100% the game as there will always be one achievement you can’t get.
perhaps not every game is meant to be 100% completed
Good old honeytrap. I’m not sure, but I think that it’s doable.
Have a honeytrap page somewhere in your website. Make sure that legit users won’t access it. Disallow crawling the honeytrap page through robots.txt.
Then if some crawler still accesses it, you could record+ban it as you said… or you could be even nastier and let it do so. Fill the honeytrap page with poison - nonsensical text that would look like something that humans would write.
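The robots.txt side is just a Disallow line for the trap (path made up):

```
User-agent: *
Disallow: /honeytrap.html
```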
For banning: I'm not sure, but I don't think so. It seems to me that prefetching behaviour is dictated by the page linking to another; to avoid any issue, all the site owner needs to do is not prefetch links to the honeytrap.
For poisoning: I’m fairly certain that it doesn’t. At most you’d prefetch a page full of rubbish.
I’m the idiot human that digs through robots.txt and the site map to see things that aren’t normally accessible by an end user.
Even better. Build a WordPress plugin to do this.
I think I used to do something similar with email spam traps. Not sure if it's still around, but basically you could help build NaCL lists by posting an email address on your website somewhere that was visible in the source code but not visible to normal users, like in a div positioned way off the left side of the screen.
Anyway, spammers that do regular expression searches for email addresses would email it and get their IPs added to naughty lists.
I’d love to see something similar with robots.
Yup, it's the same approach as email spam traps, except for the naughty list. But… holy fuck, a shareable bot IP list is an amazing addition; it would increase the damage to those web-crawling businesses.
But with all of the cloud resources now, you can cycle through IP addresses without any trouble. Hell, you could just browse over IPv6 and not even worry, with how cheap those addresses are!
Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.
That’s when Google’s browser DRM thing starts sounding like a good idea 😭
Better yet, point the crawler to a massive text file of almost but not quite grammatically correct garbage to poison the model. Something it will recognize as language and internalize, but that will severely degrade the quality of its output.
Maybe one of the lorem ipsum generators could help.
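As a toy illustration of "almost but not quite grammatically correct", you could shuffle a few words within otherwise real sentences; whether any given model is actually degraded by this is an open question:

```python
# Sketch of a poison-text generator: swap random word pairs in real
# sentences so the output still looks like language but isn't.
import random

SEED_SENTENCES = [
    "the quick brown fox jumps over the lazy dog",
    "a stitch in time saves nine",
    "all that glitters is not gold",
]

def poison(sentence, swaps=2):
    words = sentence.split()
    for _ in range(swaps):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]  # slightly derange word order
    return " ".join(words)

def poison_page(n_sentences=100):
    return ". ".join(poison(random.choice(SEED_SENTENCES)) for _ in range(n_sentences))

if __name__ == "__main__":
    print(poison_page(5))
```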
“Help, my website no longer shows up in Google!”
This is a very interesting read. It is very rare that people on the internet agree to follow one thing without being forced.
Loads of crawlers don't follow it; I'm not quite sure why AI companies not following it is anything special. Really, it's just there to stop Google from indexing random internal pages that mess with your SEO.
It barely even works for all search providers.
The Internet Archive does not make a useful villain and it doesn’t have money, anyway. There’s no reason to fight that battle and it’s harder to win.
Wow, I'm shocked! Just like how OpenAI preached "privacy and ethics" and went dead silent on data hoarding and scraping, then privatized their stolen scraped data. If they insist their data collection is private, then it needs regular external audits by strict data-privacy firms, just like they do with security.
sigh. Of course they are …
What social contract? When sites regularly have a robots.txt that says "only Google may crawl", and are effectively helping enforce a monolopy, that's not a social contract I'd ever agree to.
When sites regularly have a robots.txt that says "only Google may crawl"
Is that actually true?
If so, why would they do that?
I had a one-eared rabbit. He was a monolopy.
Sounds like a Pal name lol
Only if its model is a Lopunny missing an ear
I was thinking of a short lil bunny wearing a top hat and monocle with one ear sticking out of the center of the top hat but that works too
🤣🤣🤣🤣🤣🤣🤣 “robots.txt is a social contract” 🤣🤣🤣🤣🤣🤣🤣 🤡
A lot of post-September 1993 internet users wouldn’t understand, I get it.
post-September 1993
You're talking nonsense; for all I know, today is Wed 11124 Sep 1993.
I’ve just converted to polytheism and have begun praying to the Emoji God asking them to use 1,000 origami cry laughing Emojis to smite you down, so that you may die how you lived.
I hope it won’t be quick, or painless, but that’s up to the Gods now.
I hope it won’t be quick, or painless, but that’s up to the Gods now.
Considering that we’re talking about emojis, it’ll definitely be silent.
Silent, but deadly.
It’s completely off-topic, but you know 4chan filters? Like, replacing “fam” with “senpai” and stuff like this?
So. It would be damn great if Lemmy had something similar. Except that it would replace emojis, “lol” and “lmao” with “I’m braindead.”
That extension is fun, but it doesn't "gently shame" the person spamming emojis by replacing their emojis with "I'm braindead" in a way that they themselves would see.
How do I edit someone else’s post
Contrary to your blatant assumption, I'm not proposing a system where users can edit each other's posts. I'm just toying with the idea of word filters, not too different from the ones that already exist for slurs in Lemmy.
For example. If you write [insert slur here], it gets replaced with removed. What if it replaced emojis with “I’m braindead.”? That’s it.
(Before yet another assumer starts doing its shit: the idea is not too serious.)
Aren’t they effective when used sparingly 😕
That would be amazing.
Hmm, I thought websites just blocked crawler traffic directly? I know one site in particular has rules about it, and will even go so far as to ban you permanently if you continually ignore them.
There are more crawlers than I have fucks to give; you'll be in a pissing match forever. robots.txt was supposed to be the norm for telling crawlers what they can and cannot access. It's not on you to block them. It's on them, and it's sadly a legislative issue at this point.
I wish it weren't, but legislative fixes are always the most robust and the most complied with.
Yes, but there's a point where it's blatantly obvious, and I can't imagine it's hard to get rid of the obviously offending ones. Respectful crawlers are going to be imitating humans, so who cares; disrespectful crawlers will DDoS your site, and blocking that can't be that hard to implement.
Though if we're talking "hey, please don't scrape this particular data", yeah, nobody was ever respecting that lol.
Both paragraphs demonstrate gross ignorance
Detecting crawlers can be easier said than done 🙁
I mean, yeah, but at a certain point you just have to accept that it's going to be crawled. The obviously negligent ones are easy to block.
You cannot simply block crawlers lol
Last I checked, humans don't access every page on a website nearly simultaneously…
And if you imitate a human then honestly who cares.
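That "nearly simultaneously" heuristic is easy enough to sketch: count requests per IP over a sliding window and flag anything inhumanly fast (thresholds invented here, and real traffic would need tuning plus allowlists for legitimate bots):

```python
# Toy rate-based crawler detection with a sliding time window.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 20  # more than this per window looks like a crawler

hits = defaultdict(deque)  # ip -> timestamps of recent requests

def looks_like_crawler(ip, now=None):
    now = time.monotonic() if now is None else now
    q = hits[ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()  # drop requests that fell out of the window
    return len(q) > MAX_REQUESTS
```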
Hide a link no one would ever click. If an IP requests the link, it's a ban.
Except that it'd also catch people who use accessibility devices and might see the link anyway, or who use the keyboard to navigate a site instead of a mouse.
I don't know, maybe there's a canvas trick. I'm not a webdev, so I'm a bit out of my depth and mostly guessing and remembering 20-year-old technology.
If it weren't so difficult and didn't require so much effort, I'd rather have clicking the link cause the server to switch to serving up poisoned data – stuff that will ruin an LLM.
Would that be effective? A lot of poisoning seems targeted to a specific version of an LLM, rather than being general.
Like how the image poisoning programs only work for some image generators and not others.
Visiting /enter_spoopmode.html will choose a theme and mangle the text of any page you next go to accordingly (think search & replace with swear words or Santa Claus). It will also show a banner letting the user know they are in spoop mode, with a JavaScript button to exit the mode, where the AJAX request URL is obfuscated (think base64). The banner is at the bottom of the HTML document (not necessarily the screen itself) and/or inside unusual/normally ignored tags.
<script type="spoop/text" style="display:block">you are in spoop mode</script>
Or have a secret second page that is only followed if you ignore robots.txt: /spoop_post/yvlhcigcigc is a clone of /post/yvlhcigcigc in "spoop mode".
Well, you can if you know the IPs they come in from, but that's of course the trick.
Why should I care about a text file lol
All laws are just words on pieces of paper. Why should you care?
This seems to interestingly prove the point made by the person this is in reply to. Breaking laws comes with consequences. Not caring about a robots.txt file doesn't. But maybe it should.
My angle was more about all rules being social constructs, and said rules being important for the continued operation of society, but that's a good angle too.
Lots of laws don’t come with real punishments either, especially if you have money. We can change this too.
A config* file 😉
TIL that robots.txt is a thing
Just wait until you hear about humans.txt; it really exists, here.
what is it?
robots.txt is a file that is accessible as part of an HTTP request. It's a backend configuration file that sets rules for what automatically running web crawlers are allowed to do. It can set both who is and who isn't allowed. Google is usually the most widely allowed domain for bots, just because their crawler is how they find websites for search results. But it's basically the honor system. You could write a scraper today that goes to websites where it's told it doesn't have permission to view a page, ignores that, and still gets the information.
I do not think it is even part of the HTTP protocol; I think it's just a pseudo add-on. It's barely even a protocol; it's basically just a page that bots can look at, with no real pre-agreed syntax.
If you want to make a bot that doesn't respect robots.txt, you don't even need to do anything complicated; you just leave out the step that looks at the page. It's not enforceable at all.
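For what it's worth, the polite side is a few lines of stdlib Python, which also shows exactly how little an impolite bot has to skip:

```python
# How a compliant bot checks robots.txt; an impolite one just
# doesn't run these lines. (Bot name and URLs are placeholders.)
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

if rp.can_fetch("MyBot/1.0", "https://example.com/some/page.html"):
    print("allowed to fetch")
else:
    print("robots.txt asks us not to")
```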
robots.txt is a file available at a standard location on web servers (example.com/robots.txt) which sets guidelines for how scrapers should behave.
That can range from saying “don’t bother indexing the login page” to “Googlebot go away”.
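For example, a robots.txt covering both of those cases might read:

```
# Keep all bots out of the login page...
User-agent: *
Disallow: /login

# ...and tell Googlebot to go away entirely.
User-agent: Googlebot
Disallow: /
```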
It's also in the first paragraph of the article.
Alternative title: Capitalism doesn’t care about morals and contracts. It wants to make more money.
Capitalism is a concept; it couldn't care if it wanted to, and it can't even want to begin with. It's the humans. You will find greedy, immoral ones in every system, and they will make it miserable for everyone else.
Capitalism is the widely accepted self-serving justification of those people for their acts.
The real problem is in the "widely accepted" part: a sociopath killing an old lady and justifying it because "she looked at me funny" wouldn't be "widely accepted", and Society would react in a suitable way. But if said sociopath scammed the old lady's pension fund because (and this is a typical justification in Investment Banking) "the opportunity was there, and if I didn't do it somebody else would've, so better it be me who gets the profit", it's deemed "acceptable" and Society does not react in a suitable way.
Mind you, Society (as in, most people) might actually want to react in a suitable way, but the structures in our society are such that the Official Power Of Force in our countries is controlled by a handful of people who got there with crafty marketing and backroom plays, and those deem it “acceptable”.
It’s deemed “acceptable”? A sociopath scamming an old lady’s pension is basically the “John Wick’s dog” moment that leads to the insane death-filled warpath in recent movie The Beekeeper.
This is the kind of edgelord take that routinely expects worse than the worst of society with no proof to their claims.
This is the kind of shit I saw from the inside in Investment Banking before and after the 2008 Crash.
None of those assholes ever gets prison time for the various ways in which they abuse markets, and even insider info, to swindle Pension Funds among others; so, de facto, the Society we have, with the power structures it has, accepts it.
People will always find justification to be assholes. Capitalism tried to harvest that energy and unleashed its full potential, with rather devastating consequences.
Sure, but incentive structures matter. We could have a system that doesn't reward psychopathic business choices (as much), while still improving our lives bit by bit. If the system helps a bit with making the right choices, that would matter a lot.
That's basically what I wrote: a (free) market economy, especially in combination with credit-based capitalism, gives those people a perfect system to thrive in. This seems to result in very fast progress and immense wealth, which is not distributed very equally. Then again, I prefer Bezos and Zuckerberg as CEOs rather than as politicians or warlords. Dudes with big egos and ambitions need something productive to work on.
Exactly. Capitalism spits in the face of the concept of a social contract, especially if companies themselves didn’t write it.
Capitalism, at least in a laissez-faire marketplace, operates on a social contract; fiat money is an example of this. The market decides; the people decide. Are there ways to amass a certain amount of money that make people turn a blind eye? For sure, but all systems have their ways to amass power, no matter what.
I’d say that historical evidence directly contradicts your thesis. Were it factual, times of minimal regulation would be times of universal prosperity. Instead, they are the time of robber-barons, company scrip that must be spent in company stores, workers being massacred by hired thugs, and extremely disparate distribution of wealth.
No. Laissez-faire capitalism has only ever consistently benefitted the already wealthy and the sociopaths happy to ignore the social contract for their own benefit.
You said “a social contract”. Capitalism operates on one. “The social contract” as you presumably intend to use it here is different. Yes, capitalism allows those with money to generate money, but a disproportionate distribution of wealth is not violation of a social contract. I’m not arguing for deregulation, FAR from it, but the social contract is there. If a corporation is doing something too unpopular then people don’t work for them and they cease to exist.
If a corporation is doing something too unpopular then people don’t work for them and they cease to exist.
Unfortunately, this is not generally the case. In the US, for example, the corporation merely engages in legalized bribery to ensure that people are dependent upon it (ex. limiting healthcare access, erosion of social safety nets) and don't have a choice but to work for them or die. Disproportionate distribution of wealth may not by itself be a violation of the social contract, but it gives the wealthy extreme leverage to use in coercing those who are not wealthy and in further eroding protections against bad actors. This has been shown historically to be a self-reinforcing cycle that requires that the wealthy be forced to stop.
Yes, regulations should be in place, but the "legalized bribery" isn't forcing people; it's just easier to stick with the status quo than to change it. They aren't forced to die; it's just a lot of work not to. The social contract is there, it's just one we don't like.