AI stands for artificial income.
The start(-up?)[sic] generates up to $2 billion annually from ChatGPT and an additional $1 billion from LLM access fees, translating to an approximate total revenue of between $3.5 billion and $4.5 billion annually.
I hope their reporting is better than their math…
Maybe they also added $500M for stuff like DALL-E?
Good point - I guess it could have easily fallen out while being edited, too
Probably used ChatGPT….
[deleted by creator]
I hope so! I am so sick and tired of AI this and AI that at work.
I do expect them to receive more funding, but I also expect that to be tied to pricing increases. And I feel like that could be what breaks them.
In my team, we’re doing lots of GenAI use cases, and far too often it’s a matter of slapping a chatbot interface onto a normal SQL database query (a sketch of that pattern is below), just so we can tell our customers and their bosses that we did something with GenAI, because that’s what they’re receiving funding for. Apart from these user interfaces, we’re hardly solving problems with GenAI.
If the operating costs go up and management starts asking what the pricing for a non-GenAI solution would look like, I expect the answer to be rather devastating for most use cases.
Like, there’s maybe still a decent niche in that developing a chatbot interface is likely cheaper than a traditional interface, so maybe new projects start out with a chatbot interface and later get a regular GUI to reduce operating costs. And of course, there is the niche of actual language processing, for which LLMs are genuinely a good tool. But yeah, it’s going to be interesting to see how many real-world use cases remain once the hype dies down.
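To make that concrete, here’s a minimal sketch of the pattern, assuming the openai Python client; the schema, table, and database name are all hypothetical:

```python
# Minimal sketch of the "chatbot on a SQL database" pattern.
# Assumes the openai Python client with OPENAI_API_KEY set in the
# environment; the schema and names here are made up.
import sqlite3
from openai import OpenAI

client = OpenAI()

SCHEMA = "orders(id INTEGER, customer TEXT, total REAL, created_at TEXT)"

def answer(question: str, db_path: str = "shop.db") -> str:
    # Step 1: the only GenAI part, turning the question into SQL.
    sql = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": (
                "Translate the user's question into one SQLite SELECT "
                f"statement against this schema: {SCHEMA}. Reply with SQL only."
            )},
            {"role": "user", "content": question},
        ],
    ).choices[0].message.content

    # Step 2: an entirely ordinary database query, same as ever.
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(sql).fetchall()
    return f"{sql}\n{rows}"
```

Step 2 is the part customers actually need, and it worked exactly the same before the chatbot was bolted on.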
It’s also worth noting that smaller models work fine for these types of use cases, so it might just make sense to run a local model at that point.
Bubble, meet pop.
Last time a batch of these popped up, they were saying OpenAI would be bankrupt in 2024, so I guess they’ve made it to 2025 now. I wonder if we’ll see similar articles again next year.
I will be in a perfect position to snatch a discount H100 in 12 months
Oh no!
Anyway…
Now’s the time to start saving for a discount GPU in approximately 12 months.
They don’t use consumer GPUs; they use specialized data-center hardware like the H100.
Everyone who doesn’t have access to those is using GPUs, though.
We are talking specifically about OpenAI, though.
Yep, and if OpenAI goes under, the whole market will likely crash; people will dump the GPUs they’ve been using to train models, and then boom, you’ve got a bunch of GPUs available.
That would depend entirely on why OpenAI might go under. The linked article is very sparse on details, but it says:
These expenses alone stack miles ahead of its rivals’ expenditure predictions for 2024.
Which suggests this is likely an OpenAI problem and not an AI-in-general problem. If OpenAI goes under, the rest of the market may actually surge as competitors devour OpenAI’s abandoned market share.
People who were previously at the high end of the GPU market can now afford used H100s -> they sell their GPUs -> we can maybe afford them
The hermit crab gambit: everyone line up in order of size!
Can I use an H100 to run Helldivers 2?
This sounds like FUD to me. If OpenAI were really in that much trouble, it would be acquired pretty quickly.
They’re wholly owned by Microsoft, so it’d probably be mothballed at worst.
For another conversation, I need some evidence of that. Where did you find it?
I was wrong!
Thanks
Half.
Ah that’s news to me, my fault
I hope not; I use it a lot for quick programming answers and prototypes, and for theory in my actuarial science MBA.
I use it all the time for work, especially for long documents and formatting technical documentation. It’s all but eliminated my [removed] work. A lot of people are sour on AI because "it’s not going to deliver on generative AI etc etc", but it doesn’t matter. It’s super useful, and we’ve really only scratched the surface of what it can be used for.
I also think we just need to find the use cases where it works.
While it won’t solve everything, it has solved some things. Like you’ve found, I’ve used it for generating simple artwork for internal documents that would never get design funding (and even if they did, I’d have spent much more time dealing with a designer), rewriting sentences so they sound better, grammar checking, quick lookups as a search engine or encyclopedia, copywriting some unimportant texts…
I’d pay a few bucks per month if it weren’t free; I give that much to Grammarly and barely use it.
So I guess the next step is just reducing the cost of running these models, which is not that hard, as we can see from the open-source space.
I find you can just run local models for that. For example, I’ve been using gpt4all with the Phind model and it works reasonably well.
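If it helps, the gpt4all Python bindings make this only a few lines; the model filename below is just an example, so substitute whichever GGUF file you’ve actually downloaded:

```python
# Minimal local-inference sketch using the gpt4all Python bindings.
# The model filename is an example (a Phind CodeLlama build); use
# whatever GGUF model you have in your gpt4all models directory.
from gpt4all import GPT4All

model = GPT4All("Phind-CodeLlama-34B-v2.Q4_0.gguf")  # loads (or fetches) the model
with model.chat_session():
    print(model.generate(
        "Write a Python function that reverses a string.",
        max_tokens=200,
    ))
```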
How much computing power do they need? My PC is pretty old :/
If you have a GPU, it should work, but it will likely take longer to generate answers than an online service.
Is 1) the fact that an LLM can be indistinguishable from your original thought and 2) an MBA (lmfao) supposed to be impressive?
I don’t think that person is bragging, just saying why it’s useful to them
PLEASE!!!
womp womp