
This snack byte will take about four minutes to consume.
More than two years after ChatGPT burst onto the scene, software companies still haven’t nailed down a compelling pricing model for AI tools. As a technology expert who’s been tracking these trends, I can tell you that the challenge is as tricky as balancing a supercomputer on a unicycle.
For years, vendors have typically charged a monthly per-user fee for AI features, assistants, and copilots, mirroring how they price Software as a Service (SaaS) products. But here's the kicker: the compute demands of AI are astronomical. When you're running models that chew through massive amounts of processing power, you end up with eyebrow-raising price tags.
Take Microsoft's Copilot, for instance. Some chief information officers (CIOs) balked at paying $30 per user per month to add Copilot to Microsoft 365, a 60% premium over the non-AI version of the suite. Greg Meyers, Chief Digital and Technology Officer at Bristol Myers Squibb, lamented, "A year ago everything was way overpriced. Most companies overestimated how much more we would be willing to pay for an AI feature."
Even United Airlines’ CIO Jason Birnbaum admits that while AI promises to revolutionize work, the current pricing models leave many enterprises on the fence. “We’re in a place where prices are high and simultaneously companies are trying to understand how to drive value out of it,” he explained.
For general-purpose tools like Copilot, which are charged on a per-seat basis, Birnbaum cautioned, "We're not really ready to deploy it on a broad basis." It's as if we're all trying to decide whether AI tools are luxury sports cars or economy sedans: both get you where you need to go, but the price difference is staggering.
Vendors are now pivoting their pricing strategies in a bid to attract a broader user base. Alphabet’s Google, for example, announced in January a shift in its Business Standard plan for its Workspace productivity suite. Previously, enterprises were charged $12 per user per month plus an extra $20 for access to its Gemini AI business tools.
Now, Google is rolling out a $14 package that bundles Gemini AI features right into Workspace. It's a bold move that simplifies the cost equation and makes AI adoption an easier sell.
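The math behind that move is worth a back-of-the-envelope sketch. Using only the per-user figures cited above ($12 base plus a $20 Gemini add-on, versus the new $14 bundle), with illustrative seat counts, the bundle's appeal depends entirely on how many users would have taken the add-on:

```python
# Rough cost comparison of the Workspace pricing change described above.
# Prices are the per-user figures cited in the article; seat counts are
# illustrative, not drawn from any real deployment.

OLD_BASE = 12.00          # old Business Standard price, per user/month
OLD_GEMINI_ADDON = 20.00  # old Gemini add-on, per user/month
NEW_BUNDLED = 14.00       # new bundled price, per user/month

def monthly_cost(seats: int, ai_seats: int, bundled: bool) -> float:
    """Monthly bill for `seats` users, `ai_seats` of whom get AI features."""
    if bundled:
        return seats * NEW_BUNDLED  # AI is included for every seat
    return seats * OLD_BASE + ai_seats * OLD_GEMINI_ADDON

seats = 1000
for ai_seats in (0, 100, 1000):
    old = monthly_cost(seats, ai_seats, bundled=False)
    new = monthly_cost(seats, ai_seats, bundled=True)
    print(f"{ai_seats:4d} AI seats: old ${old:,.0f}/mo vs bundled ${new:,.0f}/mo")
```

At these rates, a 1,000-seat shop breaks even once 100 users would have bought the old add-on; beyond that, the bundle is strictly cheaper, which is presumably the point.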
Microsoft, never one to be outdone, has introduced consumption-based pricing with its new Microsoft 365 Copilot Chat. Instead of a flat per-user monthly fee, customers now pay a few cents per use, depending on the interaction. This flexible model could lower the barrier to entry for enterprises trying to scale AI adoption across their organizations.
During Microsoft’s earnings call last week, CEO Satya Nadella revealed that customers who initially purchased the $30 Copilot offering expanded their seat count by more than 10X over the past 18 months. Jared Spataro, Microsoft’s Chief Marketing Officer for AI at Work, noted, “A per user per month charge can sometimes be difficult if you’re trying to go to broad scale because you’re just not sure how to value something.”
It seems Microsoft is betting that a more granular, usage-based model will win favor with cost-conscious CIOs.
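The per-seat versus consumption trade-off reduces to a simple break-even question. Here is a minimal sketch, taking the $30 per-seat price cited above and assuming a placeholder rate of roughly three cents per interaction ("a few cents per use"); actual Copilot Chat rates vary by interaction type:

```python
# Break-even sketch: flat $30/user/month license vs. consumption pricing.
# SEAT_PRICE is cited in the article; PER_USE_PRICE is an assumed
# placeholder, not a published Microsoft rate.

SEAT_PRICE = 30.00    # per user per month
PER_USE_PRICE = 0.03  # assumed ~3 cents per interaction (illustrative)

def cheaper_plan(uses_per_month: int) -> str:
    """Return which plan costs less for one user at a given usage level."""
    consumption = uses_per_month * PER_USE_PRICE
    return "per-seat" if consumption > SEAT_PRICE else "consumption"

break_even = SEAT_PRICE / PER_USE_PRICE
print(f"Break-even: {break_even:.0f} uses/month")  # 1000 at these rates
print(cheaper_plan(200))   # light user: consumption wins
print(cheaper_plan(2000))  # heavy user: per-seat wins
```

This is why consumption pricing lowers the barrier to broad rollout: the long tail of occasional users costs almost nothing, while power users can be moved onto seats once their usage justifies it.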
On the flip side, some enterprises are taking matters into their own hands. Kathy Kay, CIO of Principal Financial, mentioned that she’s already testing the new Copilot Chat tool to assess its cost-effectiveness. She even hinted that if vendors don’t price their AI solutions fairly, companies might simply build their own in-house capabilities.
Nationwide's Chief Technology Officer Jim Fowler echoed this sentiment: "If they aren't fair and equitable in how they price those tools and transactions, they're actually going to incent me to build my own capability over buying theirs." In this rapidly evolving, high-stakes environment, the pricing debate remains as freewheeling as the startups driving it.
Salesforce is also jumping into the fray. In September, the company unveiled a new pricing plan that allows enterprises to toggle between per-month licenses for human employees and consumption-based models for AI agents. Bill Patterson, an executive vice president at Salesforce, admitted that for some of the AI investments made over the past two years, "the jury is still out" on whether the current pricing models truly capture the value of these advanced tools.
Meanwhile, Amazon Web Services (AWS) is betting on a different strategy with its Bedrock platform. AWS offers access to AI models from companies like Anthropic, Meta, and Mistral AI using either a no-commitment, pay-as-you-go pricing model starting at less than one cent per interaction or a time-based commitment starting at $25 per hour.
Amazon’s work assistant, Amazon Q, is also available at tiers ranging from $3 to $20 per user per month. These flexible pricing models are designed to keep pace with the rapidly declining costs of running AI models, ensuring that enterprises aren’t left feeling like they’re overpaying for something they can eventually build in-house.
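The same break-even logic applies to Bedrock's two modes. A minimal sketch, using the cited $25-per-hour commitment and an assumed half-cent per interaction (the article says only "less than one cent", so this rate is illustrative):

```python
# AWS Bedrock-style pricing modes mentioned above: no-commitment
# pay-as-you-go vs. a time-based commitment. HOURLY_COMMIT is the cited
# starting rate; PER_INTERACTION is an assumed placeholder, not a quoted
# AWS price.

PER_INTERACTION = 0.005  # assumed half a cent per call (illustrative)
HOURLY_COMMIT = 25.00    # cited starting rate for a time-based commitment

def hourly_break_even() -> float:
    """Interactions per hour above which the hourly commitment is cheaper."""
    return HOURLY_COMMIT / PER_INTERACTION

print(f"{hourly_break_even():.0f} interactions/hour")
```

At these assumed rates a workload needs to sustain 5,000 calls per hour before committed capacity pays off, which is why experimentation starts on pay-as-you-go and only steady production traffic migrates to commitments.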
All this pricing experimentation comes at a time when the underlying costs of AI, especially the compute power required to run these models, are falling. Earlier this year, OpenAI CEO Sam Altman admitted on X (formerly Twitter) that their $200-a-month ChatGPT Pro plan was losing money because users were consuming more than anticipated.
In contrast, the ChatGPT Enterprise plan is currently priced between $30 and $45 per seat, suggesting that different market segments will drive different pricing strategies.
From my perspective as a long-time technology observer, this period in AI pricing is both exhilarating and a bit nerve-wracking. The current market is akin to the early days of the dot-com boom—filled with promise, innovation, and the occasional spectacular miscalculation. Investors, CIOs, and tech vendors are all trying to navigate these uncharted waters, and the pressure to find a sustainable model is immense.
In a world where technology is advancing faster than ever, and where our modern-day gold rush is defined by data centers and AI models, one thing is clear: the pricing models for these powerful tools are still a work in progress.
Whether you’re a Wall Street financier, a seasoned CIO, or simply someone who marvels at the future of technology, the journey to sustainable AI pricing promises to be as dynamic and unpredictable as the AI itself.
As companies continue to experiment with different pricing strategies—from per-user fees to consumption-based models—the AI landscape will keep evolving.
And if one thing remains certain, it’s that in this wild, wild world of technology, the rules of the game are still being written, one innovative, eyebrow-raising, and occasionally hilarious pricing model at a time.