Headline: Google reinvents testing automation tool, now with AI
Anthropic released an API for the same thing last week.
Yeah, but they encourage confining it to a virtual machine with limited access.
I bought it so I could use it, though. The AI can buy its own computer to use!
How do they not get that people don’t fucking want this? It’s like they’re in a race to see who can develop the shittiest product.
They are monopolists/oligopolists, operating like a cartel.
They create new paradigms as they please, because there is no alternative.
Consumer preferences don’t mean dick in a highly uncompetitive market with absurd costs of entry.
Shareholders are the customers now. People are livestock to be milked dry, worked to the bone, and chopped up and served to corporations.
Because people are the product, and these anti-features improve the extortability of that product.
AI is so hot right now and these incompetent doorknobs have FOMO.
It’s like they’re in a race to see who can develop the shittiest product.
Because it is. That’s why it is called enshittification.
They’re not ultimately making it for people to use. They’re creating a playground for AI to work and learn in, thereby letting their AI access human behavior data that other companies don’t yet have.
Basically it’s a ploy to get a novel set of proprietary data in hopes that their AI gets smarter than the competition.
Nationalize AI companies.
I bet you plenty of people absolutely do want this.
Most people can barely use a computer and would love this if it worked well.
You forget a huge percentage of users can barely access their emails.
Lemmy is very tech-centric, and most users here are far from the average consumer in technical literacy.
You just aren’t the target audience.
They will implement it, they are just trying different methods until one sticks.
I could see the appeal as open-source, self-hosted software.
Not from data vacuums.
*privacy-respecting. We all know this is meant for data harvesting.
I’m just waiting for someone to make something like this that runs locally and doesn’t feed a never-ending data black hole. It doesn’t need to be that powerful to be useful, either.
Instead, you can directly give it commands in your browser and it should automatically do everything you need, including filling out forms and clicking buttons. AI-tasked examples include opening pertinent web pages, compiling search data into easily readable tables, purchasing products, or booking flights.
This sounds like it would be great for people with accessibility needs—if only Google was trustworthy and had a fiduciary duty to humanity…
Alas.
Silver lining is that this may be the end of captchas
Only the end of captchas for people who use Google’s identification standards. With device IDs and accounts, they already usher many users past captchas, and I think the more people who do that, the worse captchas will get for those of us who don’t.
Nope, still need that training data to sell to self-driving car tech.
I guess it’s just Puppeteer + AI prompting.
Unless AI gets significantly better in the next year or so, I doubt it’s gonna be any better than someone spending an hour writing a Puppeteer scraping script.
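For reference, a hand-written Puppeteer scraper really is about this much code. This is only a sketch: the URL and the CSS selectors (.result-row, .airline, .price) are placeholders I made up, not selectors from any real site.

```typescript
// Minimal hand-rolled scraper of the kind described above.
// Assumes Node.js with the puppeteer package installed (npm i puppeteer).
import puppeteer from "puppeteer";

async function scrapeFlights(): Promise<void> {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Hypothetical search-results page; swap in the real target URL.
  await page.goto("https://example.com/flights?from=LAX&to=JFK", {
    waitUntil: "networkidle2",
  });

  // Pull one record per result row. Selectors are placeholders.
  const rows = await page.$$eval(".result-row", (els) =>
    els.map((el) => ({
      airline: el.querySelector(".airline")?.textContent?.trim() ?? "",
      price: el.querySelector(".price")?.textContent?.trim() ?? "",
    }))
  );

  console.table(rows);
  await browser.close();
}

scrapeFlights().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The AI layer would presumably replace the hard-coded navigation and selectors with model-chosen actions; whether that beats an hour of writing the above by hand is exactly the open question.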
Perfect. It can click buttons and complete tasks.
No sir, I did not accept your terms and conditions, my browser did.
Can’t wait till AI can just learn and utilize my consciousness on my behalf. That way I don’t need to exist.
Didn’t Windows also start tracking how you use Windows?
Google doesn’t have the leverage to get it into your OS against your wishes. They can’t even force me to keep their apps (Search, Assistant) on their own Android.
WELL DON’T JINX IT
Google should create AI agents to use their services and pay for their bullshit since they like AI agents so much.
This whole AI thing is starting to feel less like focused research and more like Free Jazz jam night at the local dive.
It could be used for amazing things, but it’s currently in that phase where there’s a rapid frenzy to make anything, regardless of moral and ethical implications, just to cash in before it inevitably gets monopolised.
Imagine if it were actually more open-source and privacy-focused. Yes, it would likely learn at a slower pace, but at least it would be something more for the people rather than the big corpos.
regardless of moral and ethical implications
You’re pointing at capitalism, not AI.
just to cash in before it inevitably gets monopolised.
Yep, capitalism.
Only everybody gets their pocket picked
Just like my local dive…
Corpornet