• @[email protected]
    13 points · 10 months ago

    The start(-up?)[sic] generates up to $2 billion annually from ChatGPT and an additional $ 1 billion from LLM access fees, translating to an approximate total revenue of between $3.5 billion and $4.5 billion annually.

    I hope their reporting is better than their math…
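    (For reference, taking the quoted figures at face value, they don’t reconcile with the stated total:)

    $$\$2\,\text{B} + \$1\,\text{B} = \$3\,\text{B} \;\neq\; \$3.5\text{–}4.5\,\text{B}$$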

  • Ephera
    20 points · 10 months ago

    I do expect them to receive more funding, but I also expect that to be tied to pricing increases. And I feel like that could break their neck.

    In my team, we’re doing lots of GenAI use-cases and far too often, it’s a matter of slapping a chatbot interface onto a normal SQL database query, just so we can tell our customers and their bosses that we did something with GenAI, because that’s what they’re receiving funding for. Apart from these user interfaces, we’re hardly solving problems with GenAI.
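    For illustration, a minimal sketch of what that pattern often amounts to, with a hypothetical table and a hard-coded question-to-query mapping standing in for the LLM: the “GenAI” layer just routes a user question to a fixed SQL query and wraps the result in conversational text.

    ```python
    import sqlite3

    # Hypothetical example data; in the real use case this would be an existing business DB.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, revenue REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("EMEA", 1200.0), ("EMEA", 800.0), ("APAC", 500.0)])

    # The "chatbot" layer: in production an LLM maps the user's question to one of these
    # canned queries (or fills in its parameters); here that mapping is hard-coded.
    CANNED_QUERIES = {
        "revenue by region": "SELECT region, SUM(revenue) FROM orders GROUP BY region",
    }

    def answer(question: str) -> str:
        sql = CANNED_QUERIES.get(question.lower().strip())
        if sql is None:
            return "Sorry, I can only answer a fixed set of questions."
        rows = conn.execute(sql).fetchall()
        # Wrapping the plain query result in conversational text is the part the LLM usually does.
        return "Here is the revenue by region: " + ", ".join(f"{r}: {v:.0f}" for r, v in rows)

    print(answer("Revenue by region"))
    ```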

    If operating costs go up and management starts asking what a non-GenAI solution would cost, I expect the answer to be rather devastating for most use-cases.

    Like, there’s maybe still a decent niche in that developing a chatbot interface is likely cheaper than building a traditional interface, so maybe new projects will start out with a chatbot interface and later get a regular GUI to reduce operating costs. And of course, there is the niche of actual language processing, for which LLMs are genuinely a good tool. But yeah, it’s going to be interesting to see how many real-world use-cases remain once the hype dies down.

    • ☆ Yσɠƚԋσʂ ☆OP
      5 points · 10 months ago

      It’s also worth noting that smaller models work fine for these types of use cases, so it might just make sense to run a local model at that point.
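      As a rough sketch of what that might look like (assuming an Ollama server running locally with a small model already pulled; the model name and prompt are placeholders):

      ```python
      import requests  # third-party package; any HTTP client works

      # Assumption: an Ollama server is listening on its default port 11434 and a small
      # model such as "llama3.2" has already been pulled locally.
      def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
          resp = requests.post(
              "http://localhost:11434/api/generate",
              json={"model": model, "prompt": prompt, "stream": False},
              timeout=120,
          )
          resp.raise_for_status()
          return resp.json()["response"]

      if __name__ == "__main__":
          print(ask_local_model("Rephrase politely: the report is late again."))
      ```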

  • @[email protected]
    7 points · 10 months ago

    Last time a batch of these articles popped up, they were saying OpenAI would be bankrupt in 2024, so I guess they’ve made it to 2025 now. I wonder if we’ll see similar articles again next year.

  • Riskable
    34 points · 10 months ago

    Now’s the time to start saving for a discount GPU in approximately 12 months.

    • FaceDeer
      17 points · 10 months ago

      They don’t use consumer GPUs; they use more specialized devices like the H100.

          • @[email protected]
            1 point · 10 months ago

            Yep, and if OpenAI goes under, the whole market will likely crash. People will dump the GPUs they’ve been using to create models, and then boom, you’ve got a bunch of GPUs available.

            • FaceDeer
              1 point · 10 months ago

              That would depend entirely on why OpenAI might go under. The linked article is very sparse on details, but it says:

              These expenses alone stack miles ahead of its rivals’ expenditure predictions for 2024.

              Which suggests this is likely an OpenAI problem rather than an AI-in-general problem. If OpenAI goes under, the rest of the market may actually surge as its rivals devour OpenAI’s abandoned market share.

          • @[email protected]
            7 points · 10 months ago

            People who were previously at the high end of the GPU market can now afford used H100s -> they sell their GPUs -> we can maybe afford those.

  • arran 🇦🇺
    4 points · 10 months ago

    This sounds like FUD to me. If OpenAI were really going under, it would be acquired pretty quickly.

  • @[email protected]
    4 points · 10 months ago

    I hope not; I use it a lot for quick programming answers and prototypes, and for theory in my actuarial science MBA.

    • @[email protected]
      3 points · 10 months ago

      I use it all the time for work, especially for long documents and formatting technical documentation. It’s all but eliminated my removed work. A lot of people are sour on AI because “it’s not going to deliver on generative AI etc etc”, but it doesn’t matter. It’s super useful, and we’ve really only scratched the surface of what it can be used for.
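      As a hypothetical sketch of that kind of workflow (the model name, prompt, and notes are placeholders; this assumes the official openai Python client with an API key in the environment):

      ```python
      from openai import OpenAI  # assumes the `openai` package and OPENAI_API_KEY are set up

      client = OpenAI()

      ROUGH_NOTES = """
      setup: install deps, run migrations, start the worker. config is via env vars:
      DB_URL required, CACHE_URL optional. logs go to stdout.
      """

      def format_docs(notes: str) -> str:
          # Ask the model to turn rough notes into structured documentation.
          completion = client.chat.completions.create(
              model="gpt-4o-mini",  # placeholder; any chat-capable model works
              messages=[
                  {"role": "system",
                   "content": "Rewrite the user's notes as clean Markdown documentation "
                              "with headings, a setup section, and a configuration table."},
                  {"role": "user", "content": notes},
              ],
          )
          return completion.choices[0].message.content

      if __name__ == "__main__":
          print(format_docs(ROUGH_NOTES))
      ```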

      • @[email protected]
        1 point · 10 months ago

        I also think we just need to find the use cases where it actually works.

        While it won’t solve everything, it has solved some things. Like you’ve found, I’ve used it for generating simple artwork for internal documents that would never get design funding (and even if they did, I’d have spent much more time dealing with a designer), rewriting sentences so they sound better, grammar checking, quick search-engine and encyclopedia lookups, copywriting some unimportant texts…

        I would pay a few bucks per month if it weren’t free; I already give that to Grammarly and barely use it.

        So I guess the next step is just reducing the cost of running those models, which doesn’t seem that hard, judging by the open-source space.

    • @[email protected]
      2 points · 10 months ago

      Is 1) the fact that an LLM can be indistinguishable from your original thought and 2) an MBA (lmfao) supposed to be impressive?