How Do Chatbots Make Money?


A recent article in the Harvard Business Review (HBR) examines how the tech companies behind ChatGPT, Bard, and LLaMA make money.

In an interview with HBR, Andy Wu, the Arjun and Minoo Melwani Family Associate Professor of Business Administration at Harvard Business School, talked about “the challenging economics of AI, how business models are likely to differ from traditional software models, and some of the potentially painful tradeoffs ahead for companies such as Google, Microsoft, and others”.

Below are a few selected excerpts from Prof. Wu’s interview:

  • “I think the basic economics of a generative AI are being overlooked. There are significant unanswered questions in terms of how people will actually make money with this technology. Google and OpenAI and others can’t lose money in perpetuity. But it’s not yet obvious to anyone exactly how this will be monetized. At minimum, I can tell you that we are going to need new business models, and the integration of generative AI is going to transform how we monetize software and the business model.
  • “Our notions of fixed cost and variable costs are different here than they were for any other form of computing we’ve lived through in the past. The key insight is that the variable cost of delivering generative AI to an end user is not zero… which means we can’t necessarily be handing out future software-as-a-service applications containing generative AI for free to anyone or even as a paid subscription without usage limits as we are used to today. Usage pricing is going to be much more important.
  • “A second distinction is that a significant portion of the core technology is open source, and a lot of the data being used to train these models is public data and may be copyrighted but is publicly available online. The barriers to entry for AI are not as high as it may seem. So many companies will be in the game, at least for specific vertical AI models and applications.
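Prof. Wu’s point about nonzero variable costs can be illustrated with a simple sketch. All prices and costs below are made-up assumptions, not figures from the interview: with a flat subscription, margin shrinks as usage grows, whereas usage pricing keeps revenue tied to cost.

```python
# Hypothetical illustration of why nonzero per-query inference costs
# push generative AI toward usage-based pricing.
# All numbers are assumed for the sketch, not real figures.

COST_PER_QUERY = 0.01     # assumed variable cost of serving one query ($)
FLAT_SUBSCRIPTION = 20.0  # assumed flat monthly subscription price ($)
PRICE_PER_QUERY = 0.02    # assumed usage-based price per query ($)

def monthly_margin_flat(queries: int) -> float:
    """Flat subscription: revenue is fixed, but cost scales with usage."""
    return FLAT_SUBSCRIPTION - queries * COST_PER_QUERY

def monthly_margin_usage(queries: int) -> float:
    """Usage pricing: revenue scales with cost, so margin stays positive."""
    return queries * (PRICE_PER_QUERY - COST_PER_QUERY)

for q in (500, 2_000, 10_000):
    print(q, round(monthly_margin_flat(q), 2), round(monthly_margin_usage(q), 2))
```

Under these assumed numbers, a heavy user (10,000 queries) turns the flat subscription unprofitable, while usage pricing remains in the black — the “usage limits” tradeoff Wu describes.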