A new business model for AI?

This post is inspired by Frank Slootman, CEO of Snowflake. In an interview with CNBC, he commented:
“AI is not going to be cheap. I mean, somebody's paying for these wonderful Nvidia results. There needs to be a business model that's associated, you know, with the technology. One of the great things about Search, when it showed up was not only that it was great technology, but they also had a business model that paid for it. We need that here as well. Otherwise, it's fun and games, and it's an expensive hobby, that's not going to last…”
He was alluding to another Nvidia blowout quarter. Jensen Huang, CEO of Nvidia, explained some of the trends driving their demand:
“The world has something along the lines of about a trillion dollars’ worth of data centers installed in the cloud and enterprise and otherwise…And that trillion dollars of data centers is in the process of transitioning into accelerated computing and generative AI.”
This, of course, raises the question – is ALL the data in ALL those data centers fodder for processing with Nvidia products? It could be - but not necessarily at today’s premium prices. This documentary does a nice job describing Nvidia’s evolution from gaming chips to a powerhouse in generative AI. It touches on “premium” use cases in healthcare, design, etc. that nicely leverage its technology.
My recommendation would be that vendors poll their customers for use cases where they are willing to pay premium pricing. And think broader - the investment is not just going to be around the GPUs, LLMs and other plumbing. Vendors will need to accumulate data around high-value applications. That will call for hiring unique domain experts and accumulating plenty of domain-specific data not easily accessible today in the cloud. It is squirreled away somewhere in that trillion dollars’ worth of data centers, and even worse, in corporate spreadsheets.
Let’s face it – the vast majority of enterprise data in the cloud today is back-office accounting, HCM, procurement and basic CRM data. If you expect premium pricing, it will likely come from operational data unique to verticals and countries. To access that data, you will have to acquire specialist vendors, as Oracle did with Cerner. Not cheap – Oracle paid $28 billion for a small fraction of largely US patient-specific data, and it will have to navigate HIPAA and other privacy constraints around that data. Another option – come up with incentives for some of the most innovative customers in each domain to share their data to train your machines. Either way, be prepared to invest.
For many vendors, their predictive AI may offer more value to customers than generative AI. If you can prevent unplanned shutdowns of expensive assets with preventive-maintenance AI, or reduce production and logistics footprint, waste and scrap through better demand-forecasting AI, that may be exactly what your customers need.
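As a rough illustration of the preventive-maintenance idea, here is a minimal sketch that flags drifting sensor readings using scikit-learn's IsolationForest. The asset, the sensor values and the thresholds are all made up for illustration; this is not any particular vendor's product, and a real system would use actual telemetry and far more careful feature engineering.

```python
# A minimal, illustrative predictive-maintenance sketch: learn what
# "normal" sensor behavior looks like, then flag readings that drift.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated hourly (vibration, temperature) readings from a healthy asset.
normal_history = rng.normal(loc=[0.5, 70.0], scale=[0.05, 2.0], size=(1000, 2))

# Fit an anomaly detector on the known-good history.
detector = IsolationForest(contamination=0.01, random_state=42)
detector.fit(normal_history)

# New readings: the last few drift upward, as a failing bearing might.
new_readings = np.vstack([
    rng.normal(loc=[0.5, 70.0], scale=[0.05, 2.0], size=(20, 2)),
    rng.normal(loc=[0.9, 85.0], scale=[0.05, 2.0], size=(5, 2)),
])

flags = detector.predict(new_readings)  # -1 = anomalous, 1 = normal
for i, flag in enumerate(flags):
    if flag == -1:
        print(f"reading {i}: anomalous - schedule an inspection")
```

The point of the sketch is the business model, not the algorithm: the detector only earns premium pricing when it is trained on domain-specific operational data of the kind discussed above.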
I like Frank’s call for a business model for AI. I think vendors should start with finding out what customers are willing to pay for as “premium” use cases, then work backwards and figure out whether/how to acquire the premium infrastructure, domain knowledge and the data to train machines for those ambitious use cases.
That would be a business model better aligned with customer value and far easier to sell.
As we have moved to virtual briefings, I have increasingly been excerpting short video segments (with permission) as part of my Analyst Cam series.
We are running a series of posts on Workday’s Industries. Tom Peff in Product Marketing kicked it off with their Planning and Analytics across the service sector, where they focus. Yesterday, Greg Volpe, Solution Marketing Director, drilled down into analytical applications in Higher Education. Today, William Bercik, Sr. Director, Healthcare Strategy, will do the same for Healthcare. In the New Year, we will have a focus on Workday’s transactional capabilities in some of these sectors.
In the graphic below, William addresses the organizational areas Workday covers for healthcare customers. Workday does not have clinical functionality, but it does have impressive administrative capabilities.
He covers several use cases and solution sets – Advanced Analytics, Planning Model Toolkit and Cost & Profitability Management.
He also provides a synopsis of implementations at Fairview Health, which operates 10 hospitals and 48 primary care clinics, and at Christiana Care, with 50,000 patient admissions a year.
As we have moved to virtual briefings, I have increasingly been excerpting short video segments (with permission) as part of my Analyst Cam series.
We are running a series of posts on Workday’s Industries. Yesterday, Tom Peff in Product Marketing kicked it off with their Planning and Analytics across the service sector, where they focus. Today, Greg Volpe, Solution Marketing Director, drills down into analytical applications in Higher Education. Tomorrow, William Bercik will do the same for Healthcare. In the New Year, we will have a focus on Workday’s transactional capabilities in some of these sectors.
Workday has over 350 customers in this vertical. Greg defines them as ranging from “small community colleges up to large research institutions and institutions with medical facilities”. He identifies some of the challenges the industry faces. Even before COVID, there was a secular decline in student enrollments, especially among foreign students. Most educational institutions also persist with dated technologies. With growing cost pressures, there are moves to shared-service models across multiple entities. And there is a clear need for better profitability analysis.
He does a nice job walking through 9 planning trends in the graphic below:
He runs through case studies around faculty and other staff insights and planning at Wake Forest University, and financial planning at the University of Virginia.
Nice overview of the state of the sector and Workday’s coverage in 22 minutes.
As we have moved to virtual briefings, I have increasingly been excerpting short video segments (with permission) as part of my Analyst Cam series.
Over the next few days, we will run a series of posts on Workday’s Industries. Kicking it off today is Tom Peff in Product Marketing. He covers Workday’s Planning and Analytics across the service sector, where they focus. Tomorrow, Greg Volpe will drill down into the analytical applications in Higher Education, and the day after, William Bercik will do the same for Healthcare. In the New Year, we will have a focus on Workday’s transactional capabilities in some of these sectors.
Tom starts off acknowledging that planning and analytics applications tend to be horizontal and apply across industries. However, Workday development has also focused on areas “that benefit certain industries disproportionately.” He discusses each of the unique functionalities below.
He next talks about Industry Models mapped to key Use Cases in 13 industries. As he says, each has micro-verticals – so under banking, you also have credit unions, regional banks and asset management. You can filter using a vertical lens. If you are a nonprofit, what are the critical models that you need to run your business? Grant planning, program expense budgeting, personnel planning, membership and events, etc.
He then presents a library of 70 common templates that partners have developed around use cases in different industries. That is part of a marketplace that Workday is evolving, which will not just focus on Planning apps. It will also include apps built on their development platform, Extend, and packaged applications that partners have built.
Finally, he provides a glimpse at how Workday is organizing for more of a vertical push.
As we have moved to virtual briefings, I have increasingly been excerpting short video segments (with permission) as part of my Analyst Cam series.
This time it is Matthew Wright, Founder and CEO of Specright, which provides specification data management to Fortune 1000 companies with complex needs.
They were introduced to me by SAP.iO Foundries, which represents SAP’s startup programs, including accelerators, that enable startups to deliver value to SAP customers (see their presentation here).
Specright was one of the startups in their Sustainability cohort last year, but as you will hear, precision in specifications can have an impact on every process that touches raw materials, ingredients, formulas, packaging and finished goods.
Matthew’s passion comes from a couple of decades in the packaging industry. As he describes in his book The Evolution of Products and Packaging, we have seen an explosion in SKUs and complexity in supply chains. As he says, existing ERP, PLM and other software were designed before complexity grew exponentially, and the lack of focus on specifications is the root cause of many supply chain issues we are seeing. BTW, Gartner identified them as a Cool Vendor in their Supply Chain category in 2020.
He presents on three customers and how they have benefited from better specs – one reduced scrap production, the second reduced spend on corrugated packaging, and the third improved sourcing of a critical ingredient across multiple product SKUs. He also describes how they have helped industries set standards and work with other bodies which are tasked with defining them.
Very nicely done, and I am looking forward to reading his book.
As we have moved to virtual briefings, I have increasingly been excerpting short video segments (with permission) as part of my Analyst Cam series.
This time it is Rich Wagner, CEO of Prevedere, which offers what it calls “intelligent forecasting”. I would say it is helping move the analytical world forward by focusing on external signals, not just building on historical trends. Thanks to the Workday Ventures team for the introduction.
“To deliver those forecasts, the NHC [National Hurricane Center] team takes an invasive approach to collecting a wide variety of external, real-time data. Too much business forecasting today is based on internal and historical data or looks at Google or external sources, such as Bloomberg and Gartner for its primary data.”
The case study described all the data the NHC collects via satellites, buoys and other sensors, and from Hurricane Hunter flights and other sources; how it uses supercomputers to crunch multiple, often conflicting models; and how it audits its performance at the end of each season and keeps improving storm track forecasts every year.
Listening to Rich, I got a similar vibe, and as he wrote in Forbes:
“Companies should be checking the millions of global data series available from thousands of public and private sources across the world related to markets, industry, weather, trade policy, geopolitics, macroeconomics, consumer behavior and more. Most often, companies will focus on their internal historical data trends coupled with anecdotal viewing of general industry data sets and gross domestic product (GDP), all of which are no longer predictable. Companies need all of the data necessary to provide a holistic view of global, regional and local economies in order to monitor for key signals of upcoming changes in demand.”
And these days, almost every enterprise can easily access machine learning capabilities to home in on variables that are particularly relevant to their industry, geography, channels and other attributes. The definition of the book of record needs to become much wider.
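As a hedged sketch of what that can look like in practice, the snippet below screens a set of external indicators against an internal demand series at different lead times. The indicator names and all the numbers are invented for illustration; this is not Prevedere's method, just a minimal version of the external-signal idea.

```python
# Illustrative only: rank hypothetical external indicators by how well
# they correlate with internal demand at various lead times (in months).
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 36  # three years of monthly data

# Internal book of record: monthly unit demand (synthetic trend + noise).
demand = pd.Series(100 + np.arange(n) * 1.5 + rng.normal(0, 4, n), name="demand")

# Hypothetical external series; housing_starts is built to lead demand.
external = pd.DataFrame({
    "housing_starts": demand.shift(-3).fillna(demand.mean()) + rng.normal(0, 6, n),
    "consumer_sentiment": rng.normal(95, 5, n),
    "freight_index": rng.normal(110, 8, n),
})

# Score each indicator at 0-6 month lead times.
rows = []
for col in external.columns:
    for lead in range(7):
        rows.append((col, lead, external[col].shift(lead).corr(demand)))

ranked = pd.DataFrame(rows, columns=["indicator", "lead_months", "corr"])
print(ranked.sort_values("corr", ascending=False).head(5))
```

In this toy setup, housing_starts at a three-month lead should rank first, which is the whole value of the exercise: finding which external signals move ahead of your demand, rather than extrapolating from internal history alone.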
He also discusses how his tools integrate with other planning tools and with ERP applications like those from Workday.
We have a nice conversation about a customer, Tractor Supply Co. They have taken advantage of market signals about the growth of a “Country Suburban” demographic, and it has shown up in their new store locations and in SKUs that emphasize items like backyard poultry supplies.
He also has a very interesting POV on how the 2008 recession was very different from the 2020 pandemic-induced one. We also discuss how enterprises are increasingly looking for integrated planning tools and emphasizing consensus forecasts across functional silos.
I found the conversation riveting. Enjoy it below.
When they held ZohoDay in February, Zoho had briefed me in detail on their BI roadmap. I had given them some feedback. So did several other analysts during the day. They took that input and managed to work through the chaos of the recent COVID surge in India and launched their expanded BI platform this week.
Zoho Analytics is already used by more than 50,000 organizations. Interestingly, Zoho says 60% of existing users have chosen it as their BI tool even when running on non-Zoho applications and data sources.
I have excerpted below 3 new components demoed by Chandrashekar LSP, the Chief Evangelist for that product area. There is a definite emphasis on self-service.
It starts off with DataPrep, an AI-driven data preparation tool. Next, at 4:35, he walks through Data Stories, which includes a portal builder (Zoho Sites) and presentation software (Zoho Show). Finally, at 8:37, he presents the augmented analytics layer, which leverages Ask Zia, Zoho's conversational interface.
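To give a feel for the kind of drudgery such data-preparation tools automate, here is a hand-rolled pandas sketch with made-up sample data. This is not Zoho's engine or API, just an illustration of the cleanup steps involved: trimming stray whitespace, normalizing categories, handling missing values and coercing numeric types.

```python
# Illustrative manual data prep in pandas; tools like DataPrep aim to
# suggest or automate steps like these. Sample data is invented.
import pandas as pd

raw = pd.DataFrame({
    "region": [" east", "West ", "east", None],
    "revenue": ["1,200", "950", "1,100", "800"],
})

clean = raw.copy()
# Trim whitespace, normalize casing, and label missing regions.
clean["region"] = clean["region"].str.strip().str.title().fillna("Unknown")
# Strip thousands separators and coerce revenue to a numeric type.
clean["revenue"] = clean["revenue"].str.replace(",", "").astype(float)
print(clean)
```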
The recent enhancements make it even more intuitive for end users and should make it even more compelling for use either with Zoho data (especially if an enterprise is using many of the nearly 50 integrated Zoho One apps) or with data from other sources.
In the 45th episode of the series, we have Pete Schlampp, EVP Product Development at Workday.
Last week, I had the chance to present to some Chief Data Officers. I made the point that 2020 was an inflection point for data and analytics. We saw countless new data streams related to COVID-19, the new census, stats from shortened sports seasons, polls from the turbulent US elections, supply chain disruptions and many more. We also saw a new generation of players that leverage enterprise data go public and otherwise gain prominence - Snowflake, Palantir and C3.ai among them.
Workday is better known for its transaction capabilities in finance, HCM and procurement, but it has gradually become an analytics powerhouse with acquisitions like Adaptive, Platfora, Gridcraft and others. During the pandemic, they launched several bite-sized apps which leveraged their analytics, and which I profiled in the Analyst Cam, including VIBE for diversity management and Vaccine Management.
As Aneel Bhusri, the Co-CEO, recently said, "the future is about data, insights and predictions".
So it was nice to spend time with Pete, who has been the key architect of Prism Analytics, the cornerstone of many of those apps. We also discuss his expanded role, which extends beyond analytics, his world view of "data-infused applications", machine learning, and the role of Workday partners and their domain knowledge in the creation of many more bite-sized apps.
In the 32nd episode of the series, we have Bret Greenstein, SVP and Global Markets Head of AI & Analytics at Cognizant.
"How has COVID changed the way customers are looking at analytics?" I asked that in this post, which covered 12 data streams and use cases from 2020.
I invited Bret to talk about how the world changed with his client base and from his regular interactions with Chief Data Officers.
He presents a series of simple questions - "What happened?", "Why did it happen?", "How did we miss that?" - and provides plenty of examples.
He makes a compelling case for why decisioning needs to be refreshed, with a very effective set of questions enterprises should be asking post-pandemic about their data strategies and analytics.
I have been doing video interviews with a number of C-level executives about the acrobatics they have been seeing in various vertical sectors during the COVID-19 crisis and the "new normal" they can expect as the economy wakes up. Here is the index to the growing list of interviews.
This time it is Brett Hurt, CEO and co-founder of data.world, a Public Benefit Corporation (and Certified B Corporation) focused on building the world's largest collaborative public data catalog. He has always been a data guy - prior to this, he founded Bazaarvoice (which went public) and Coremetrics (which was acquired by IBM).
Brett is also the co-owner of Hurt Family Investments (HFI), alongside his wife, Debra. They are involved in 93 startups (and counting) and 29 VC funds, mostly based in Austin, TX.
It has been an amazing year for new data streams - COVID-19 related, the census, the election polling, shortened sports seasons - see others I had identified here. So it was nice to talk with someone who has been a lifelong data guy.
He says "data is one of the least networked resources". At one extreme, online advertising data has been highly optimized to help companies market and sell. He hopes vast troves of cancer, climate change, nutrition, poverty and other data which are highly siloed can evolve on that trajectory. His mission is help those data silos, make cost of accessing that data much lower and to democratize the traditional world of analytical tools which were designed by "IT people for use by IT people". He talks in terms of "data philanthropists" - bringing open source concepts to data world.
He talks about data as facilitating "storytelling" and also has advice for young data scientists. We spend some time on the Certified B entity format, which prioritizes purpose alongside profit. We also spend time talking about the phenomenon that is Austin, TX, where he was born and raised and where he is obviously very well connected.
A bit of trivia - he missed explaining the owl icon on his swag. OWL is an acronym for Web Ontology Language, which helps with knowledge representation.
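For the data geeks, here is a tiny, self-contained sketch of knowledge representation using rdflib, a Python RDF library that supports the OWL vocabulary. The example ontology (datasets and the topics they cover) is hypothetical, purely to show how classes, properties and instances fit together in a knowledge graph.

```python
# A minimal knowledge-representation sketch with rdflib. The vocabulary
# here is invented for illustration, not data.world's actual schema.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/data#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Declare a small ontology: datasets and the topics they cover.
g.add((EX.Dataset, RDF.type, OWL.Class))
g.add((EX.Topic, RDF.type, OWL.Class))
g.add((EX.covers, RDF.type, OWL.ObjectProperty))
g.add((EX.covers, RDFS.domain, EX.Dataset))
g.add((EX.covers, RDFS.range, EX.Topic))

# Instances: a cancer-research dataset tagged with its topic.
g.add((EX.cancerTrials2020, RDF.type, EX.Dataset))
g.add((EX.oncology, RDF.type, EX.Topic))
g.add((EX.cancerTrials2020, EX.covers, EX.oncology))

print(g.serialize(format="turtle"))
```

Linking siloed datasets through shared ontologies like this is what makes data "networked" rather than trapped in isolated files.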
A wide-ranging, very optimistic conversation. Ontology, knowledge graphs - data geeks will especially enjoy listening to him.