

Wow. My two least favourite people are fighting. I don’t know who I want to win.
Time for popcorn memes!
My dad once picked up a beautiful hitchhiker. In the course of the drive she claimed to be a witch who could turn him into anything she wanted. My father, being very skeptical, asked her to prove it. She whispered some magic words in his ear and, sure enough, he turned into a motel.
He should try being likeable.
It doesn’t help that Tesla’s offerings are so far behind the times now that they’re stale goods. Kaptain Ketamine was so intent on that idiotic “Cybertruck” fiasco, then his idiotic political ambitions, that Tesla brought nothing meaningfully new to the marketplace.
Seth who now?
🤭
I’d rather see him continue increasing the dosages and frequency myself.
Like perhaps injecting 2g of ketamine daily and chasing that with a fifth of whisky.
A whole lot of that “Conservative Bible” project energy here. “Reality and truth are against me, so I’m going to rewrite it to make it match my stances!”
Believe it. There’s a single community in the Lemmyverse that is “women only”. And it’s a fucking magnet for passing men who absolutely have to make sure they’re heard in this one single community when 99.44% of the other communities are so dominated by men that women participating is practically a unicorn.
Even the “leftists” of Lemmy can’t stand a women’s space. Lemmy is the manosphere!
What are men’s problems, huh? Like, I dunno, the expectation to always go after that false masculinity.
And you think we don’t have expectations foisted on us? Expectation to raise the children. Expectation to do the housework. All while conforming to standards of beauty that range from the uncomfortable to the literally lethal.
Compassionate fucking Buddha, there’s a reason why the manosphere is pointed at in disbelief and it’s right fucking here!
Fuck you.
I think someone is missing what “losers that can’t be laid” actually means.
Short answer: no.
Longer answer: Aw HELL no!
Wow. LLM shills just really can’t cope with reality, can they.
Go to one of your “reasoning” models. Ask a question. Record the answer. Then ask it to explain its reasoning. It churns out a pretty plausible-sounding pile of bullshit. (That’s what LLMbeciles are good at, after all.) But here’s the key, the thing that separates the critical thinker from the credulous: ask it again. Not even in a new session. Ask it again to explain its reasoning. Do this ten times. Count the number of different explanations it gives for its “reasoning”. Count the number of mutually incompatible lines of “reasoning” it gives.
Then, for the pièce de résistance, ask it to explain how its reasoning model works. Then ask it again. And again.
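If you’d rather script that test than click through a chat window, here’s a rough sketch, assuming the `openai` Python package (v1+) and an OpenAI-compatible endpoint; the model name is just a placeholder, not an endorsement:

```python
# Rough sketch of the test above: one session, one question, then ask for
# the "reasoning" ten times and compare what comes back.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

MODEL = "gpt-4o-mini"  # placeholder; swap in whatever model you're poking at
FOLLOW_UP = "Explain the reasoning behind your original answer."

# Ask the question once and keep the answer in the conversation history.
history = [{"role": "user", "content": "Why is the sky blue?"}]
first = client.chat.completions.create(model=MODEL, messages=history)
history.append({"role": "assistant", "content": first.choices[0].message.content})

explanations = []
for i in range(10):
    history.append({"role": "user", "content": FOLLOW_UP})
    reply = client.chat.completions.create(model=MODEL, messages=history)
    text = reply.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    explanations.append(text)

# Now count how many distinct, mutually incompatible "explanations" you got.
for i, text in enumerate(explanations, 1):
    print(f"--- explanation {i} ---\n{text}\n")
```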
It’s really not hard to spot the bullshit machine in action if you’re not a credulous ignoramus.
I love how techbrodudes assume nobody else knows how to do what they do.
I did my little test three fucking days before that message. Not years. DAYS.
You understand that a huge part of LLMs is that they are stochastic, right? That you can ask the same question ten times and get ten (often radically) different answers. Right?
What does that tell you about a) your experiment, and b) the LLMbeciles themselves?
Compassionate fucking Buddha, are LLM pushers dense!
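And the stochastic part isn’t hard to picture: the model emits a probability distribution over next tokens and the decoder samples from it, so the same prompt can wander off in a different direction every run. A toy illustration (the distribution here is invented, not from any real model):

```python
# Toy illustration only: a made-up next-token distribution, sampled repeatedly.
# With temperature > 0 the decoder samples, so identical prompts can diverge.
import random

next_token_probs = {"blue": 0.55, "azure": 0.25, "cyan": 0.15, "teal": 0.05}

def sample(probs):
    tokens = list(probs)
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

print([sample(next_token_probs) for _ in range(10)])  # ten runs, varying picks
```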
cough
Stationery.
cough
Their hallucinations have diminished to close to nothing.
Are you sure you’re not an AI, ’cause you’re hallucinating something fierce right here, boy-o?
Actual research, as in not “random credulous techbrodude fanboi on the Internet”, says exactly the opposite: that the most recent models hallucinate more.
Huh. So there really is a 凤凰血. Weird how when I tried it (on several AIs) they just made shit up instead of giving me that information.
It’s almost like how you ask the question determines how it answers instead of, you know, using objective reality. Almost as if it has no actual model of objective reality and is just a really sophisticated game of mad-libs.
Almost.
Dude. Go be a reply guy somewhere else. You bore the fuck out of me.
But not 100%. And the things they hallucinate can be very subtle. That’s the problem.
If they are asked about a band that does not exist, to be useful they should be saying “I’m sorry, I know nothing about this”. Instead they MAKE UP A BAND, ITS MEMBERSHIP, ITS DISCOGRAPHY, etc. etc. etc.
But sure, let’s play your game.
All of the information on Infected Rain is out there, including their lyrics. So is all of the information on Jim Thirlwell’s various “Foetus” projects. Including lyrics.
Yet ChatGPT, DeepSeek, and Claude will all three hallucinate tracks, or misattribute them, or hallucinate lyrics that don’t exist to show parallels in the respective bands’ musical themes.
So there’s your objective facts, readily available, that LLMbeciles are still completely and utterly fucking useless for.
So they’re useless if you ask about things that don’t exist and will hallucinate them into existence on your screen.
And they’re useless if you ask about things that do exist, hallucinating attributes that don’t exist onto them.
They. Are. Fucking. Useless.
That people are looking at these things and saying “wow, this is so accurate” terrifies the living fuck out of me because it means I’m surrounded not by idiots, but by zombies. Literally thoughtless mobile creatures.
An obscure band with that name that has a discography that nobody’s ever heard of anywhere, complete with band member names, track titles, etc?
Yeah, pull the other one, Sparky. It plays “Jingle Bells”.
And embedded in a culture that thinks money is the same as intellect despite most monied people being clearly deficient in the head.