Investors are barely breaking even, as the venture is making hardly any profit amid a chip shortage, divided interests, and more.

… OpenAI has already seen a $540 million loss since debuting ChatGPT.

… OpenAI spends approximately $700,000 per day to run the tool.


⚠️ First off, apologies, as I didn’t cross-check. Take it w/ a grain of salt.


This piece of news, if true, somehow explains why OpenAI has been coming up w/ weird schemes for making $$$, like entering the content moderation space.

On a similar note, I wonder if this has been a key driver (behind the scenes) of the recent investment in open-source AI initiatives (Haidra comes to mind?). Perhaps some corporations that haven’t got enough $$$ to fund their own dedicated research group are looking to benefit from an open-source model?

  • Clymene@lemmy.ml (+71/−2) · 1 year ago

    Too much is made of the shrinking user base. I’m sure they’ll come back with a vengeance come the start of the school year in the northern hemisphere.

    Also, maybe a tool like this shouldn’t be privately funded? Most of the technology is based on university-funded research we all paid for. mRNA vaccine research was similarly funded with public money, mostly in universities, and now we have to pay some private company to sell it back to us. How is that efficient? AI should be common property.

    • Uranium3006@kbin.social (+26) · 1 year ago

      honestly I’d rather have open-source AI I can run locally. even for something like GPT-4, an enterprise-scale operation could afford the hardware

    • Ubermeisters@lemmy.zip (+11/−1) · 1 year ago (edited)

      If it’s made from all of us it should be free for all of us.

      I’m fine with these researchers going out and scraping the social networks to train models; it’s incredibly advantageous to society in general. But there’s gotta be crystal-clear transparency, and it’s gotta be limitlessly free to all who want it.

      It’s the only way that any of this won’t result in another massive boundary between the 1% and us pod-living grunts. It’s already a divisively powerful technology when harnessed adversarially; that power is reduced when everyone has access to it as well.

  • j4k3@lemmy.world (+59/−8) · 1 year ago

    OpenAI died the moment Meta’s Llama model weights were replicated as completely open source. The outcome is guaranteed. It does not matter how much better the enormous proprietary model can be; people will never be okay with the level of intrusive data mining required for OpenAI’s or Google’s business model. Personal AI tech must be open source and transparent, with offline execution. AI is the framework of a new digital economy, not the product.

    • TheEntity@kbin.social (+86/−1) · 1 year ago

      people will never be okay with the level of intrusive data mining required for OpenAI or Google’s business model

      Where do you meet these people? I need more of such people in my life.

    • griD@feddit.de (+17/−1) · 1 year ago

      AI is the framework of a new digital economy, not the product.

      That is one interesting sentence. Thanks.

    • krellor@kbin.social (+4) · 1 year ago

      I don’t think it’s so much that the Meta model was replicated as that they fully open-sourced it, with a license for research and commercial use.

      I actually think the market demand for fully offline AI will be fairly small. The largest potential customers might be governments that require fully offline hosting, and there is a small group of companies servicing that niche. But even government customers who require that their data be segmented are simply having enclaves set up by the big cloud platforms, which guarantee that inputted data isn’t fed into the training process and doesn’t leave the customer environment.

      I fully support folks who sustain open source AI frameworks, but in terms of commercial customers that will drive industry trends with dollars, I expect there will be demand for hosted solutions that use proprietary models.

        • AngrilyEatingMuffins@kbin.social (+8) · 1 year ago (edited)

          The courts aren’t going to side with copyright holders. As much as the US loves its ridiculous copyright laws, it loves profit and being on the bleeding edge of tech more. There is absolutely ZERO chance that the United States will let China, which does not care about IP, carry the keys to the AI kingdom.

    • Tigbitties@kbin.social (+15) · 1 year ago

      $700k/day. That’s about $260 million per year. There are 360 million paid seats of Microsoft 365, so they’d only have to raise prices by about $0.73 per seat per year to cover the cost. No one would notice.
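A quick back-of-envelope check of the arithmetic above. The $700k/day figure comes from the article and the 360 million paid-seat count from the comment, both unverified; exact arithmetic lands slightly below the comment’s rounded numbers:

```python
# Annualize the claimed daily cost and spread it over the claimed
# number of paid Microsoft 365 seats. Both inputs are unverified.
daily_cost = 700_000                  # USD per day (article's claim)
annual_cost = daily_cost * 365        # USD per year

paid_seats = 360_000_000              # paid seats (comment's claim)
per_seat = annual_cost / paid_seats   # extra charge per seat per year

print(f"annual cost: ${annual_cost:,}")       # annual cost: $255,500,000
print(f"per seat per year: ${per_seat:.2f}")  # per seat per year: $0.71
```

So roughly $255M/year, or about $0.71 per seat per year, versus the comment’s rounded $260M and $0.73; the conclusion is unchanged either way.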

    • The Prompt:
      How OpenAI can generate more revenue?

      The Response:
      OpenAI, the organization behind the development of AI models like GPT-3, has a few strategies to generate revenue:

      1. Licensing and Subscription Models: OpenAI can license its AI models and technologies to businesses, developers, and organizations for various applications. This could involve offering different pricing tiers based on usage, features, and support.

      2. API Access: OpenAI can offer access to its AI models through APIs (Application Programming Interfaces). Developers and companies can integrate these APIs into their products, applications, or services, paying for the amount of usage.

      3. Customized Solutions: OpenAI can collaborate with specific industries or companies to develop customized AI solutions tailored to their needs. These solutions can range from customer support bots to content creation tools.

      4. Premium Features: OpenAI could offer advanced features, improved performance, or specialized functionality as part of premium packages, enticing users to upgrade their subscriptions.

      5. Partnerships and Collaborations: OpenAI can collaborate with other tech companies, research institutions, or industries to jointly develop and commercialize AI applications.

      6. Consulting and Services: OpenAI could provide consulting services, training, and expertise in AI technology to help organizations integrate AI into their operations effectively.

      7. Training and Workshops: OpenAI can offer training programs, workshops, and educational resources to help individuals and businesses better understand and utilize AI.

      8. Data Services: OpenAI can offer datasets, annotations, and data-related services to researchers, businesses, and developers working on machine learning projects.

      9. Vertical-Specific Solutions: Developing AI solutions tailored for specific industries, such as healthcare, finance, or marketing, and offering them as industry-specific packages.

      10. Community Contributions: OpenAI can build a community around its AI models and offer a platform for users to share and monetize their own creations, benefiting from the broader ecosystem.

      It’s important to note that OpenAI’s approach to generating revenue will likely be influenced by factors such as market demand, ethical considerations, user feedback, and the evolving landscape of AI technologies and regulations. OpenAI aims to strike a balance between generating revenue and advancing its mission of ensuring AI benefits all of humanity.

  • saud@lemmy.ml (+31) · 1 year ago

    I really don’t understand how this is possible when Microsoft has invested billions into OpenAI

    • atticus88th@lemmy.world (+21/−3) · 1 year ago

      All it takes is a couple of dudes buying a couple of yachts, private planes, maybe another home or two. And poof, it’s gone.

    • Uncle_Bagel@midwest.social (+11) · 1 year ago

      Burning through billions of investors’ money isn’t the same as being profitable. The Silicon Valley gravy train is over, and investors are actually demanding to start seeing returns on their investments.

  • donuts@kbin.social (+31/−2) · 1 year ago

    AI as a business is already running on fumes, and it’s going to become even more expensive once intellectual property law catches up to them. We can only hope that the AI bubble bursting doesn’t take the entire market economy down with it…

      • donuts@kbin.social (+2) · 1 year ago

        I mean, I get you, but personally I don’t really like the idea of millions of innocent people losing their homes and most of their savings because some fucking dweebs decided to put all of our collective wealth in legally dubious automatic junk “content” generators. I’ve lived through enough crashes to know that it’s never the big guys that get fucked when everything goes tits up, it’s us, our parents, our grandparents, etc.

        • borlax@lemmy.borlax.com (+1) · 1 year ago

          Yeah status quo is the only reason to not throw caution to the wind and burn the whole thing down. It’s why nothing will ever get better.

    • 👁️👄👁️@lemm.ee (+1) · 1 year ago

      Well, it doesn’t help that ChatGPT is unoptimized as fuck, with something like 175B parameters for 3.5 and reportedly over a trillion for 4

  • DefinitelyNotAPhone [he/him]@hexbear.net (+25/−1) · 1 year ago

    Company whose business model is based entirely on running an enormous, expensive LLM and then serving content with it publicly for free, with no greater idea of how to actually turn that into a business, is going under. In other news, water still wet.

    I’ll admit I thought the AI bubble was going to last longer than a few months (and inevitably FAANG will probably artificially extend it until even they have to admit there’s not a ton of productive real world uses for it), but I suppose late stage capitalism has to speedrun the boom-bust cycle as it gets increasingly desperate for profit.

    • RubberDucky@programming.dev (+13) · 1 year ago

      And instead of trying to make it use fewer resources to run, as Llama tries to, OpenAI just makes a new GPT that needs even more resources

      • Durotar@lemmy.ml (+2) · 1 year ago (edited)

        OpenAI just makes a new GPT that needs even more resources

        If they have investors who are paying for that, I see no problem. Operating at a loss isn’t newsworthy nowadays; this is the new reality.

    • somename [she/her]@hexbear.net (+5) · 1 year ago

      Well, AI is still going to be a buzzword for capitalists to throw around, as it does actually have uses and big profit potential in certain fields. Just, like, it’s certain fields. Then the grifters will continue trying to extrapolate that success to increasingly far-removed use cases, with increasingly stupid promises.

    • Kayn@dormi.zone (+1) · 1 year ago

      It has found a few legitimate uses though, hasn’t it? GitHub Copilot comes to mind, although the legal implications of it are up in the air.

  • 👁️👄👁️@lemm.ee (+16) · 1 year ago

    They also didn’t design ChatGPT to be power efficient at all, so that’s bloating up their operating costs a ton.

  • boyi@lemmy.sdf.org (+16) · 1 year ago (edited)

    Sorry to say, but I would take this with a grain of salt. Not making profits is part of the business model of these pioneering companies. Google, Amazon, Uber, etc. were in the negatives for many years, and they absorbed the losses in order to become the dominant brands that users end up depending on. At that point they start to charge exorbitantly and forcibly add unneeded features that exert more control over their users, but there’s nothing users can do but pay, for the simple fact that they can’t do without them.

    • Sinonatrix [comrade/them]@hexbear.net (+8) · 1 year ago (edited)

      be """worth""" $2.7T

      unable to afford the world’s most hyped research project despite it burning less than $1B

      Is this IBMification, or whatever tech bros are calling late capitalism now?

  • roguetrick@kbin.social (+10) · 1 year ago

    High interest rates, baby. I noticed this was happening when people were complaining about lowered quality because they were using less resource-intensive operations.