A Blog by Jonathan Low


Oct 11, 2023

Why VCs Are Concerned That AI Has Yet To Be Profitable For Big Tech

Venture investors' concern about AI may be growing as even massively funded big tech companies lose money on sales of their AI models. The problem is that operational costs are proving higher than expected, causing corporate customers to proceed carefully, or even cut back. The economies of scale that tech users have come to expect are not yet evident with AI.

These concerns may begin to generate a more cautious approach to AI investment as venture investors assess whether the big players' inability to turn a profit will present hurdles too difficult for startups or early-stage companies looking to attract increasingly impatient investors. JL

Benj Edwards reports in Ars Technica:

While companies are still investing in AI, the cost of running AI models is proving to be a significant hurdle. Microsoft's GitHub Copilot has been operating at a loss despite 1.5 million users and integration into half of their coding projects. Users pay a fee of $10 a month, but the cost to Microsoft exceeds $20 a month. Each ChatGPT query may cost 4 cents to run. Corporate customers are unhappy with the high running costs, which are tied to AI computations that require new calculations for each query, unlike software's economies of scale. This makes flat-fee models for AI services risky, as increasing customer usage drives up operational costs, leading to losses. "We might be near the peak (of AI investment) before reality sets in."

Big Tech companies like Microsoft and Google are grappling with the challenge of turning AI products like ChatGPT into a profitable enterprise, reports The Wall Street Journal. While companies are heavily investing in AI tech that can generate business memos or code, the cost of running advanced AI models is proving to be a significant hurdle. Some services, like Microsoft's GitHub Copilot, drive significant operational losses.


Generative AI models used for creating text are not cheap to operate. Large language models (LLMs) like the ones that power ChatGPT require powerful servers with high-end, energy-consuming chips. For example, we recently cited a Reuters report with analysis that claimed each ChatGPT query may cost 4 cents to run. As a result, Adam Selipsky, the chief executive of Amazon Web Services, told the Journal that many corporate customers are unhappy with the high running costs of these AI models.

The current cost challenge is tied to the nature of AI computations, which often require new calculations for each query, unlike standard software that enjoys economies of scale. This makes flat-fee models for AI services risky, as increasing customer usage can drive up operational costs and lead to potential losses for the company.
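The flat-fee risk described above is simple arithmetic, sketched below with figures taken from the article (a $10-a-month subscription and an estimated 4-cent per-query compute cost); the function and numbers are illustrative, not Microsoft's actual cost model.

```python
# Illustrative sketch: why flat-fee pricing is risky when every query
# carries a marginal compute cost. Figures come from the article's
# estimates; real per-query costs vary by model and workload.

FLAT_FEE = 10.00        # monthly subscription price, in dollars
COST_PER_QUERY = 0.04   # estimated compute cost per query, in dollars

def monthly_margin(queries_per_month: int) -> float:
    """Revenue minus compute cost for one user in one month."""
    return FLAT_FEE - COST_PER_QUERY * queries_per_month

# Usage level at which the subscription exactly covers compute cost.
break_even = FLAT_FEE / COST_PER_QUERY

print(f"Break-even usage: {break_even:.0f} queries/month")
for q in (100, 250, 500, 2000):
    print(f"{q:>5} queries/month -> margin ${monthly_margin(q):+.2f}")
```

Under these assumed numbers, a user breaks even at 250 queries a month; a heavy user running 2,000 queries costs the provider far more than the flat fee brings in, which is the dynamic behind the reported $80-a-month power users.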

Some companies are trying to dial back costs, while others continue to invest more deeply in the tech. Microsoft and Google have introduced more expensive AI-backed upgrades to their existing software services, while Zoom reportedly tried to reduce costs by sometimes using a less complex in-house AI model for some tasks. Adobe is approaching the problem with activity caps and charging based on usage, while Microsoft and Google are typically sticking with flat fees.


Microsoft's head of corporate strategy, Chris Young, thinks seeing a return on investments in AI will take additional time as people figure out the best ways to use it. "We’re clearly at a place where now we’ve got to translate the excitement and the interest level into true adoption," he told the outlet.

Notably, the WSJ report claims that Microsoft's GitHub Copilot, which assists app developers by generating code, has been operating at a loss despite attracting more than 1.5 million users and being integrated into nearly half of their coding projects. Users pay a flat fee of $10 a month for the service, but the cost to Microsoft exceeds $20 a month per user on average, according to a person familiar with the matter. In some cases, individual power users have cost the company as much as $80 a month.

One of the reasons AI services are so costly is that some companies have been reaching for the most powerful AI models available. For example, Microsoft uses OpenAI’s most complex LLM, GPT-4, for many of its AI features. GPT-4 is among the largest and most expensive AI models to operate, demanding significant computing power. The WSJ quipped that using this model for basic tasks such as summarizing an email is like "getting a Lamborghini to deliver a pizza," suggesting that using the most capable AI models can be overkill for simple tasks.


Along these lines, Microsoft has been exploring less costly alternatives for its Bing Chat search engine assistant, including Meta's Llama 2 language model. However, over time, due to advances in AI acceleration hardware, the costs to operate these complex models will likely come down. Whether those advancements can come soon enough to match this year's hype cycle over AI is uncertain.

While there's still excitement in the sector, the WSJ reports that we might be near the peak before a reality check sets in. Some experts anticipate a more stringent financial approach in the near future, highlighted by May Habib, CEO of generative AI firm Writer, who told the outlet, "Next year, I think, is the year that the slush fund for generative AI goes away." This suggests that we may soon see the industry transition from enthusiasm and experimental budgets to a phase where the focus will be on whether these AI models can actually contribute to company profitability.
