I think we’re running out of advancements that make life better; now all technology does is make production cheaper and increase shareholder value.
I just moved. In the past month, the number of companies where I had to argue with an AI phone system that refused to let me speak to an actual person is more than 10. I’m sure that made their costs cheaper, but I know it made their value to me go down.
I hate what AI has become and is being used for. I strongly believe it could have been used far more ethically; a solid example is Perplexity, which shows you the sources it used right at the top, the first thing you see when it gives a response. Everything else does the opposite, even Gemini, despite it being rather useful in day-to-day life when I need a quick answer and I’m not in a position to hold my phone, like while driving, doing dishes, or doing yard work with my earbuds in.
Yes, you’re absolutely right. The first StarCoder model demonstrated that it is in fact possible to train a useful LLM exclusively on permissively licensed material, contrary to OpenAI’s claims. Unfortunately, the main concerns of the leading voices in AI ethics at the time this stuff began to really heat up were a) “alignment” with human values / takeover of super-intelligent AI and b) bias against certain groups of humans (which I characterize as differential alignment, i.e. with some humans but not others). The latter group has since published some work criticizing genAI from a copyright and data dignity standpoint, but their absolute position against the technology in general leaves no room for re-visiting the premise that use of non-permissively licensed work is inevitable. (Incidentally they also hate classification AI as a whole; thus smearing AI detection technology which could help on all fronts of this battle. Here again it’s obviously a matter of responsible deployment; the kind of classification AI that UHC deployed to reject valid health insurance claims, or the target selection AI that IDF has used, are examples of obviously unethical applications in which copyright infringement would be irrelevant.)
criticizing genAI from a copyright
There is a Russian phrase, “fight of a beaver with a donkey,” which loosely means a fight between two shits. Copyright is cancer and the capitalist abuse of genAI is cancer.
Copyright is actually very important, especially to independent authors, photographers, digital artists, traditional artists, videographers (YouTubers, for example), and especially movie producers. Copyright protects their work from being taken by someone else and claimed as their own. However, special cases do exist where other individuals are allowed to use copyrighted material that is not theirs; this is where fair use comes into play. If we did not have fair use but still had copyright, the large majority of YouTube videos would be illegal, from commentary videos to silly meme videos. So calling copyright a cancer is like wanting their work left out in a field of monkeys while hoping the monkeys don’t notice it. Spoiler: they always do.
and claimed as their own
You are talking about either the right to a name or the right of authorship. I don’t remember which is which, but in normal countries (aka in Europe) it is an inalienable right, unlike copyright, which can be sold.
If you want copyright THAT badly, you should demand making it inalienable instead of protecting the status quo of total publisher control.
EDIT: the right to a name is to have your name on the art; the right of authorship is to call yourself the author.
EDIT2: copyright outside of capitalism just does not make sense.
I get what you mean, but people might read this and think Perplexity is an ethical company.
https://opendatascience.com/perplexity-ai-ceo-offers-to-step-in-amidst-nyt-tech-workers-strike/
Interesting, I definitely see where his mind is at with that offer, and how it was easily misunderstood
Ummmm I don’t think that’s the right take away from this story, though you’re certainly entitled to a different opinion
Yes. Microsoft Recall was a great idea…
Sarcasm
No lie, I actually really love the concept of Microsoft Recall. I’ve got ADHD and am always trying to retrace my steps to figure out problems I solved months ago. The problem is, for as useful as it might be, it’s just an attack surface.
One of the leading sources of enshittification.
But the poor shareholders!
they don’t care. you’re not the audience. the tech industry lives on hype. now it’s ai because before that they did it with nft and that failed. and crypto failed. tech needs a grift going to keep investors investing. when the bubble bursts again they’ll come up with some other bullshit grift because making useful things is hard work.
Yup, you can see it in talks on annual tech conferences. Last year it was APIs, this year it’s all AI. They’ll just move on to the next trendy thing next year.
To be fair, APIs have been around since the 70s, and are not trendy; they’re just required to have a common interface for applications to request and perform actions with each other.
But yeah, AI is mostly trendy bullshit
I was referring mostly to security conferences. Last year almost every vendor was selling API security products. Now it’s all AI-infused products.
Last year it was APIs
Hahaha the inane shit you can read on this website
Have you been to any appsec conferences last year? It was all API security. This year it was all AI-leveraged CI/CD, code/vulnerability review, etc.
NFTs didn’t fail; it was just the idiotic selling of jpgs for obscene amounts that crashed (most of that was likely money laundering anyway). The tech still has a use.
Wouldn’t exactly call crypto a failure, either, when we’re in the midst of another bull run.
when we’re in the midst of another bull run.
Oh, that’s nice. So, whose money are you rug pulling?
Words have meaning, let’s use them properly, okay?
Somebody has to hold the bag. I don’t think this is functionally different.
I was ok with crypto and nft because it was up to me to decide if I want to get involved in it or not.
AI does seem to have an impact on jobs; at least, employers are trying to use it to see if it will actually allow them to hire less staff. I see that for SWE. I don’t think AI will do much there, though.
it’s not up to you, it just failed before it could be implemented. many publishers had already committed to in-game nfts before they had to back down because it fell apart too quickly (and some still haven’t). if it had held on for just a couple more years there wouldn’t be a single aaa title without nfts today.
crypto was more complicated because unlike those two you can’t just add it and say “here, this is all crypto now”; it requires prior commitment and it’s way too complicated for the average person. plus it doesn’t have much benefit: people already give you money and buy fake coins anyway.
I’m giving examples from games because it’s the most exploitative market but these would also seep into other apps and services if not for the hurdles and failures. so now we’re stuck with this. everyone’s doing it because it’s a gold rush except instead of gold it’s literal shit, and instead of a rush it’s literal shit.
— tangent —
… and just today I realized I had to switch back to Google assistant because literally the only thing gemini can do is talk back to me, but it can’t do anything useful, including the simplest shit like converting currency.
“I’m sorry, I’m still learning” – why, bitch? why don’t you already know this? what good are you if I ask you to do something for convenience and instead you tell me to do it manually and start explaining how I can do the most basic shit that you can’t do as if I’m the fucking idiot.
But then we wouldn’t have to pay real artists for real art anymore, and we could finally just let them starve to death!
In film school (25 years ago), there was a lot of discussion around whether or not commerce was antithetical to art. I think it’s pretty clear now that it is. As commercial media leans more on AI, I hope the silver lining will be a modern Renaissance of art as (meaningful but unprofitable) creative expression.
The issue is that the 8 hours people spend in “real” jobs are a big hindrance and could be spent doing art instead, and most of those ghouls now want us to do overtime for the very basics. Worst case scenario, it’ll be a creativity drought, with idea guys taking the place of real artists by using generative AI. Best case scenario, the AI boom totally collapses and all commercial models become expensive to use. Seeing where the next Trump administration will take us, it’s a second gilded age + heavy censorship + potential deregulation around AI.
If your motives are profit, you can draw furry porn or get a real job.
Strangely, that is a lot of who is complaining. It was a Faustian bargain: draw furry porn and earn money, but never be allowed to use your art in a professional sense ever again.
Then AI art came and replaced them, so it became lose-lose.
I don’t know where else you could find enough work to sustain yourself other than furry porn and hentai before AI. Post-AI, even that is gone.
Eh, I’ve made a decent living making commercials and corpo stuff. But not for lack of trying to get paid for art. For all the money I made working on ~50 short films and a handful of features, I could maybe buy dinner. Just like in the music industry, distributors pocket most of the profit.
Art seems like a side hustle or a hobby, not a main job. I can’t think of a faster way to hate your own passion.
I wanted to work as a programmer, but getting a degree taught me I’m too poor to do it as a job: I’d need 6 more papers and to have known the language for longer than it has existed just to get an interview. Having fun building a stupid side project to bother my friends, though.
Exactly. I can code and make a simple game app. If it gets some downloads, maybe pulls in a little money, I’m happy. But I’m not gonna produce endless mtx and ad-infested shovelware to make shareholders and investors happy. I also own a 3D printer. I’ve done a few projects with it and I was happy to do them, I’ve even taken commissions to model and print some things, but it’s not my main job as there’s no way I could afford to sit at home and just print things out all month.
My only side-hustle-worthy skill is fixing computers, and I’d rather swallow a hot soldering iron than meet a stranger and get money involved.
Yiff in hell furf-
Wait, what
You are overexaggerating under the assumption that there will exist a social and economic system based on greed and death threats, which sounds very unreali-- Right, capitalism.
Is there a way to fight back? Like, I don’t need Adobe in my Microsoft Word at work. Can I just make a script that constantly demands absolutely drivel AI content from it, and set it running over the weekend while I’m not there, to burn up all their electricity and/or processing power?
They would probably detect that and limit your usage.
Even not using their service still leaves its pollution. IMO the best way to fight back is to support higher pollution taxes. Crypto, AI, whatever’s next - it should be technology agnostic.
I got a Christmas card from my company. As part of the Christmas greeting, they promoted AI, something to the extent of “We wish you a merry Christmas, much like the growth of AI technologies within our company,” or something like that.
Please no.
That’s so fucking weird wtf. Do you work for Elon Musk or something lmao
I work as a dev in an IT consulting company. My work includes zero AI development, but other parts of the company are embracing it.
Same, and same…
LMAO
According to some meme I saw, it’s gonna fuck your wife in 2025.
I mean, it opens the door to new kinks
You mean that?
Absolutely. More spice
deleted by creator
I like this typography, but I don’t like somebody pretending they represent everybody.
It has represented over 1000 people so far.
Probably, but voting is supposed to mean “I think this topic is worth discussing,” not “I agree with everything about this blurb.”
“AI” isn’t ready for any type of general consumer market and that’s painfully obvious to anyone even remotely aware of how it’s developing, including investors.
…but the cost-benefit analysis on being first to market with anything even remotely close to the universal applicability of AI is so absolutely insanely on the “benefit” side that it’s essentially worth any conceivable risk, because the benefit if you get it right is essentially infinite.
It won’t ever stop
I kind of like AI, sorry.
But it should all be freely available & completely open sourced since they were all built with our collective knowledge. The crass commercialization/hoarding is what’s gross.
I mean, you’re technically correct from a copyright standpoint, since it would be easier to claim fair use for non-commercial research purposes. And bots built for one’s own amusement with open-source tools are way less concerning to me than black-box commercial chatbots that purport to contain “facts” when they are known to contain errors and biases, not to mention vast amounts of stolen copyrighted creative work. But even non-commercial generative AI has to reckon with its failure to recognize “data dignity”, that is, the right of individuals to control how data generated by their online activities is shared and used… virtually nobody except maybe Jaron Lanier and the folks behind Brave is even thinking about this issue, but it’s at the core of why people really hate AI.
You had me in the first half, but then you lost me in the second half with the claim of stolen material. There is no such material inside the AI, just the ideas that can be extracted from it. People hate their ideas being taken by others, but this happens all the time, even by the people who claim that is why they do not like AI. It’s somewhat of a rite of passage for your work to become so liked by others that they take your ideas, and every artist or creative person at that point has to swallow the tough pill that their ideas are not their property, even when their way of expressing them is. The alternative would be dystopian, since the same companies we all hate, which abuse current genAI as well, would hold the rights to every idea possible.
If you publicize your work, your ideas being ripped from it is an inevitability. People learn from the works they see and some of them try to understand why certain works are so interesting, extracting the ideas that do just that, and that is what AI does as well. If you hate AI for this, you must also hate pretty much all creative people for doing the exact same thing. There’s even a famous quote for that before AI was even a thing. “Good artists copy, great artists steal.”
I’d argue that abusing AI to replace (or consider replacing) artists and other working creatives, to spread misinformation, to simplify scams, to waste resources by using AI where it doesn’t belong, and any other unethical uses of AI are far worse than it tapping into the same freedom we all already enjoy. People actually using AI for good will not be pumping out cheap AI slop; instead they weave it into their process to the point that it is not even clear AI was used at the end. They are not the same and should not be confused.
a rite of passage for your work to become so liked by others that they take your ideas,
ChatGPT is not a person.
People learn from the works they see […] and that is what AI does as well.
ChatGPT is not a person.
It’s actually really easy: we can say that chatgpt, which is not a person, is also not an artist, and thus cannot make art.
The mathematical trick of putting real images into a blender and then outputting a Legally Distinct™ one does not absolve the system of its source material.
but are instead weaving it into their process to the point it is not even clear AI was used at the end.
The only examples of AI in media that I like are ones that make it painfully obvious what they’re doing.
Yes! So much better statement.
Yeah. I’ve been interested in AI for most of my life. I’ve followed AI developments, and tinkered with a lot of AI stuff myself. I was pretty excited when ChatGPT first launched… but that excitement turned very sour after about a month.
I hate what the world has become. Money corrupts everything. We get the cheapest most exploitative version of every possible idea, and when it comes to AI - that’s pretty big net negative on the world.
I like what we could be doing with AI.
For example, there’s one AI I read about a while back that was given data sets on all the known human diseases and the medications used to treat them.
Then it was given data sets of all the known chemical compounds (or something like that, I can’t remember the exact wording).
Then it was used to find new potential treatments for diseases, like new antibiotics. Basically, it gives medical researchers leads to follow.
That’s fucking cool and beneficial to everyone. It’s a wonderful application of the tech. Do more of that please.
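That style of compound screening can be sketched at a toy level. Everything below is invented for illustration (the fingerprints, compound names, and drug names are made up); real drug-discovery systems like the one described train deep networks on large curated chemical datasets, but the core idea of ranking untested compounds by resemblance to known actives looks roughly like this:

```python
def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprints (as sets of
    substructure IDs), a standard cheminformatics similarity measure."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# Made-up fingerprints for two drugs known to work against some disease.
known_antibiotics = {
    "drug_A": {1, 4, 7, 9},
    "drug_B": {2, 4, 7, 11},
}

# Made-up untested compounds we want to prioritize for lab testing.
candidates = {
    "compound_X": {1, 4, 7, 10},   # shares many substructures with drug_A
    "compound_Y": {3, 5, 8, 12},   # shares almost nothing with either drug
}

def score(fingerprint):
    # A candidate is promising if it resembles any known active drug.
    return max(tanimoto(fingerprint, fp) for fp in known_antibiotics.values())

ranked = sorted(candidates, key=lambda name: score(candidates[name]), reverse=True)
print(ranked)  # compound_X ranks first: it most resembles the known actives
```

The model hands researchers a ranked list of leads rather than a finished drug, which matches the "leads to follow" framing above.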
What you are talking about is machine learning, which is called AI. What the post is talking about is LLMs, which are also called AI.
AI by definition means anything that exhibits intelligent behavior and is not natural in origin.
So when you use GMaps to find the shortest path between 2 points, that’s also AI (specifically, search algorithms like A* or Dijkstra’s).
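For what it’s worth, the pathfinding point is easy to make concrete. Below is a minimal sketch of Dijkstra’s algorithm, the textbook shortest-path search; the road network and distances are made up, and real map services layer heuristics (A*) and heavy preprocessing on top of this basic idea:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path by cumulative edge weight.
    graph: dict mapping node -> list of (neighbor, distance) pairs.
    Returns (total_distance, path), or (inf, []) if goal is unreachable."""
    queue = [(0, start, [start])]  # (distance so far, node, path taken)
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Toy road network: distances in km (invented numbers).
roads = {
    "home": [("cafe", 2), ("park", 2)],
    "cafe": [("office", 5)],
    "park": [("office", 1)],
    "office": [],
}
print(dijkstra(roads, "home", "office"))  # (3, ['home', 'park', 'office'])
```

The priority queue always expands the cheapest known route first, which is why the park detour beats the cafe route here.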
It is pointless to argue/discuss AI if nobody even knows which type they are specifically talking about.
I’m talking about AI in the context of this conversation.
I’m sorry it upsets you that capitalism has yet again redefined another word to sell us something else, but nobody here is specifically responsible for the language we’ve been given to talk about LLMs.
Perhaps writing to Merriam-Webster about the issue could reap the results you’re looking for.
LLMs are an instance of AI. There are many. Typically, the newest promising one is what the media will refer to as “AI” because the media don’t do subtlety.
There was a time when expert systems were the one thing the media considered to be AI (and were overhyped to the point of articles wondering if they’d make jobs like general practitioner obsolete). Now it’s generative neural nets. In twenty years it’ll be something else.
The issue is, people tend to overgeneralize and also tune out when some buzzword is repeated too much.
So this negatively affects the entire field of AI, whichever kind is meant.