A new business model for AI?

This post is inspired by Frank Slootman, CEO of Snowflake. In an interview with CNBC, he commented:
“AI is not going to be cheap. I mean, somebody's paying for these wonderful Nvidia results. There needs to be a business model that's associated, you know, with the technology. One of the great things about Search, when it showed up was not only that it was great technology, but they also had a business model that paid for it. We need that here as well. Otherwise, it's fun and games, and it's an expensive hobby, that's not going to last…”
He was alluding to another Nvidia blowout quarter. Jensen Huang, CEO of Nvidia, explained some of the trends driving their demand:
“The world has something along the lines of about a trillion dollars’ worth of data centers installed in the cloud and enterprise and otherwise…And that trillion dollars of data centers is in the process of transitioning into accelerated computing and generative AI.”
This, of course, raises the question: is ALL the data in ALL those data centers fodder for processing with Nvidia products? It could be, but not necessarily at today’s premium prices. This documentary does a nice job describing Nvidia’s evolution from gaming chips to a powerhouse in generative AI. It touches on “premium” use cases in healthcare, design, etc. that nicely leverage its technology.
My recommendation would be that vendors poll their customers for use cases where they are willing to pay premium pricing. And think broader: the investment is not just going to be around the GPUs, LLMs and other plumbing. Vendors will need to accumulate data around high-value applications. That will call for hiring unique domain experts and accumulating plenty of domain-specific data not easily accessible today in the cloud. It is squirreled away somewhere in that trillion dollars’ worth of data centers and, even worse, in corporate spreadsheets.
Let’s face it: the vast majority of enterprise data in the cloud today is back-office accounting, HCM, procurement and basic CRM data. If you expect premium pricing, it will likely come from operational data unique to verticals and countries. To access that data, you will have to acquire specialist vendors, as Oracle did with Cerner. Not cheap: Oracle paid $28 billion for a small fraction of largely US patient-specific data, and it will have to navigate HIPAA and other privacy constraints around that data. Another option: come up with incentives for some of the most innovative customers in each domain to share their data to train your machines. Either way, be prepared to invest.
For many vendors, their predictive AI may offer more value to customers than generative AI. If you can preclude unplanned shutdowns of expensive assets with preventive-maintenance AI, or reduce production and logistics footprint, waste and scrap through better demand-forecasting AI, that may be exactly what your customers need.
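As a toy illustration of the kind of predictive-maintenance signal this refers to (all names, readings and thresholds below are hypothetical, not any vendor’s actual product or API), a simple baseline check flags an asset for inspection when a sensor reading drifts well outside its recent normal range:

```python
# Toy predictive-maintenance check: flag a reading that drifts
# more than `k` standard deviations from the recent baseline.
# Purely illustrative; real systems use far richer models.
from statistics import mean, stdev

def needs_inspection(history, latest, k=3.0):
    """Return True if `latest` deviates more than k sigma from `history`."""
    mu = mean(history)
    sigma = stdev(history)
    return abs(latest - mu) > k * sigma

# Hypothetical vibration readings from a pump, in arbitrary units
baseline = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49]
print(needs_inspection(baseline, 0.50))  # reading near baseline -> False
print(needs_inspection(baseline, 0.95))  # sharp drift -> True
```

The point is not the statistics but the economics: even a crude signal like this, trained on domain-specific operational data, can avert an unplanned shutdown worth far more than the software that produced it.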
I like Frank’s call for a business model for AI. I think vendors should start by finding out what customers are willing to pay for as “premium” use cases, then work backwards and figure out whether and how to acquire the premium infrastructure, domain knowledge and data to train machines for those ambitious use cases.
That would be a business model better aligned with customer value and far easier to sell.
August 27, 2023 in AI, ML, Analytics, Big Data, Industry Commentary | Permalink