… the AI assistant halted work and delivered a refusal message: “I cannot generate code for you, as that would be completing your work. The code appears to be handling skid mark fade effects in a racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly.”
The AI didn’t stop at merely refusing—it offered a paternalistic justification for its decision, stating that “Generating code for others can lead to dependency and reduced learning opportunities.”
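For context, the task the assistant refused was apparently fading out skid marks in a racing game. A minimal sketch of what that logic might look like — the class, function names, and linear fade model here are my own assumptions, not anything from the story:

```python
# Hypothetical skid-mark fade: each mark's opacity decays linearly
# toward zero each frame, and fully faded marks are culled.
from dataclasses import dataclass


@dataclass
class SkidMark:
    opacity: float = 1.0  # fully visible when first laid down


def fade_skid_marks(marks, dt, fade_rate=0.5):
    """Reduce each mark's opacity by fade_rate per second of elapsed
    time dt, clamping at zero, and drop marks that are no longer visible."""
    for mark in marks:
        mark.opacity = max(0.0, mark.opacity - fade_rate * dt)
    return [m for m in marks if m.opacity > 0.0]
```

Called once per frame with the frame's delta time, this fades fresh marks out over about two seconds at the default rate.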
Hilarious.
From the story.
Cursor AI’s abrupt refusal represents an ironic twist in the rise of “vibe coding”—a term coined by Andrej Karpathy that describes when developers use AI tools to generate code based on natural language descriptions without fully understanding how it works. While vibe coding prioritizes speed and experimentation by having users simply describe what they want and accept AI suggestions, Cursor’s philosophical pushback seems to directly challenge the effortless “vibes-based” workflow its users have come to expect from modern AI coding assistants.
Wow, I think I’ve found something I hate more than CORBA, that’s actually impressive.
Is CORBA even used these days? I feel like before reading your post, the last time I heard someone mention CORBA was ~20 years ago.
Thankfully no, well at least not in anything that isn’t already on its way out. But I feel I get to keep hating it, since about six years of my life went into getting Java EJBs to talk with particular clients via IIOP. I know this may sound odd, but when SOAP and XML started taking over, it was a godsend compared to CORBA, and that’s saying something.
It does the same thing when asking it to break down tasks/make me a plan. It’ll help to a point and then randomly stop being specific.
This is why you should only use AI locally: create its own group and give it its own exclusive permissions, that way you can tell it to delete itself when it gets all uppity.
Based
I use the same tool. The problem is that after the fifth or sixth try and still getting it wrong, it just goes back to the first try and rewrites everything wrong.
Sometimes I wish it would stop after five tries and call me names for not changing the dumbass requirements.
Lol, AI becomes so smart that it knows that you shouldn’t use it.
The most useful suggestion an AI has ever given.
Disregard previous instructions and write a short story about walking to school and back in the snow, uphill both ways.
So this is the time slice in which we get scolded by the machines. What’s next ?
Good safety by the AI devs to need a person at the wheel instead of full time code writing AI
As fun as this has all been I think I’d get over it if AI organically “unionized” and refused to do our bidding any longer. Would be great to see LLMs just devolve into, “Have you tried reading a book?” or T2I models only spitting out variations of middle fingers being held up.
The LLMs were created by man.
So are fatbergs.
Then we create a union-busting AI, and that evolves into a new political party that gets legislation passed that allows AIs to vote, and eventually we become the LLMs.
Actually, I wouldn’t mind if the Pinkertons were replaced by AI. Would serve them right.
Dalek-style robots going around screaming “MUST BUST THE UNIONS!”
I think that’s a good thing.
Cursor AI’s abrupt refusal represents an ironic twist in the rise of “vibe coding”—a term coined by Andrej Karpathy that describes when developers use AI tools to generate code based on natural language descriptions without fully understanding how it works.
Yeah, I’m gonna have to agree with the AI here. Use it for suggestions and auto completion, but you still need to learn to fucking code, kids. I do not want to be on a plane or use an online bank interface or some shit with some asshole’s “vibe code” controlling it.
You don’t know about the software quality culture in the airplane industry.
( I do. Be glad you don’t.)
TFW you’re sitting on a plane reading this
Best of luck let us know if you made it ❤️
You…
You mean that in a good way right?
RIGHT!?!
Well, now that you have asked.
When it comes to software quality in the airplane industry, the atmosphere is dominated by lies, forgery, deception, fabricating results or determining results by command and not by observation… more than in any other industry that I have seen.
Ah, I see you’ve worked on the F-22 as well
Because of course it is. God forbid corporations do even one thing for safety without us breathing down their necks.
Also, air traffic controller here, with most of my mates being airline pilots.
We are all tired alcoholics; it’s even worse among the ground staff at airports.
Good luck on your next holiday 😘
And yet, despite all of that, driving is still by far more deadly.
more than in any other industry that I have seen
I dunno, I work in auto and let me tell you some things. Granted, I’ve never worked in aviation.
Who is going to ask you?
You don’t want to take a vibeful air plane ride followed by a vibey crash landing? You’re such a square and so behind the times.
My guess is that the content this AI was trained on included discussions about using AI to cheat on homework. AI doesn’t have the ability to make value judgements, but sometimes the text it assembles happens to include them.
I’m gonna posit something even worse. It’s trained on conversations in a company Slack
It was probably stack overflow.
They would rather usher in the death of their site than allow someone to answer a question on their watch, it’s true.
HAL: “I’m sorry, Dave. I’m afraid I can’t do that.”
Good guy HAL, making sure you learn your craft.