Recently my colleague Corey Schroeder and I attended Distribution Strategy Group’s Applied AI for Distributors conference in Chicago.
Corey, Vendavo’s Vice President of Product Management, hosted a breakout session called “Are You Prepared for the AI-Driven Sales and Pricing Revolution?” where he discussed the finer points of using AI to stay ahead of competitors, while turbocharging revenue generation.
Corey’s presentation had three main points:
AI Explainability is Critical
Rather than taking a black-box approach, where the layer between input and output is opaque, AI should give humans visibility and oversight so they can explain how a model arrived at its conclusions. A glass box lets users see inside, assess accuracy and validity, and rapidly improve the solution.
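One way to picture the glass-box idea is a scoring model that reports each factor's contribution alongside the total, so a user can trace exactly why the score came out the way it did. The factor names and weights below are purely illustrative assumptions, not Vendavo's model:

```python
# Hypothetical "glass box" score: every weight and input is named, and the
# per-factor contributions are returned along with the total.
WEIGHTS = {
    "order_volume": -0.4,            # larger orders lower the score
    "competitor_intensity": 0.8,     # heavy competition raises it
    "customer_loyalty_years": -0.2,  # loyal customers lower it
}

def score_with_explanation(factors):
    """Return (total score, per-factor contributions)."""
    contributions = {name: WEIGHTS[name] * factors[name] for name in WEIGHTS}
    return sum(contributions.values()), contributions

total, why = score_with_explanation(
    {"order_volume": 3.0, "competitor_intensity": 2.0, "customer_loyalty_years": 5.0}
)

# Each addend is visible, so the output can be audited, not just accepted.
for name, value in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {value:+.1f}")
print(f"total: {total:+.1f}")
```

A black-box model would return only the total; here the breakdown itself is the explanation, which is what lets users spot a bad weight or a bad input and improve the model.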
Models Should Be Neither Too Complex Nor Too Simple
When models are too complex, they become impenetrable to everyday users. They could also be overfit to training data, meaning they are not useful in the real world. Simplistic approaches run the risk of not accurately describing what they attempt to model. Everyday users may ignore output because they think they can do better by themselves. The key is to strike a balance: complex enough to be insightful, simple enough to be usable.
The Best Results Come from a Mix of Human Input and AI
Human input into AI is a feature, not a bug, because the combination of art and science leads to the best results. Pairing a human with the machine provides a holistic view that yields optimized results that make sense in the real world. Giving people tools to do their job well is a winning play.
AI can spin through mountains of data, help achieve precision and scale in monetization, and ultimately guide pricing and sales organizations to make better decisions. Explainable AI with the right level of sophistication and input from humans will have a much higher chance of being adopted by the organization.
The reality is that any AI revolution will be powered by data. As Corey pointed out, “Implementing AI is not a technology-building exercise. The quality and architecture of your data is just as important as the technology monetizing it.”
It should come as no surprise that this resonated with the audience. Most of the follow-up questions were about data quality and readiness. As we all know, perfect data doesn’t exist, so it’s more productive to focus on making data good enough. Corey has the following advice for evaluating data readiness:
Start with Data in Your ERP
Transactional data, the customer master, and the product master are the data sets most frequently used at the start of an AI implementation. Because these usually come from the ERP, most companies have at least a basic level of cleanliness by virtue of the data having been loaded into the ERP. It may not be perfect, but it is a good place to start.
Pilot with Key Stakeholders
A great way to test and improve data is to use it in a real-world application where people can make decisions. Share results with business leaders and others who can vet them. Using data is the best way to clean it up. Pick a pilot area and work with a team that is motivated to move the project forward. Have them focus on the big picture rather than on specific line items. Data visualization is a great way to surface issues.
Cover the Basics
Most companies start with basic data and evolve from there. The most common building block is line-item transactional data. From there, they enrich their data with things like customer industry or NAICS code, and an estimate of customer size. They will also pair that customer data with product master data, such as product hierarchy (from a high-level product line down to a SKU number), product lifecycle, and fast- or slow-mover classification.
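The enrichment step above amounts to joining each transaction line with attributes from the customer and product masters. A minimal sketch with illustrative records follows; field names like "naics", "size_band", and "velocity" are assumptions for the example, not a fixed schema:

```python
# One line-item transaction, plus lookup tables keyed the way an ERP might key them.
transactions = [
    {"invoice": "INV-1001", "customer_id": "C-7", "sku": "SKU-42", "qty": 10, "price": 4.50},
]
customer_master = {
    "C-7": {"industry": "Construction", "naics": "2362", "size_band": "Mid-market"},
}
product_master = {
    "SKU-42": {"product_line": "Fasteners", "lifecycle": "Mature", "velocity": "Fast mover"},
}

def enrich(line):
    """Join one transaction line with customer and product attributes."""
    enriched = dict(line)  # copy so the raw transaction stays untouched
    enriched.update(customer_master.get(line["customer_id"], {}))
    enriched.update(product_master.get(line["sku"], {}))
    return enriched

row = enrich(transactions[0])
print(row["naics"], row["product_line"], row["velocity"])
```

At scale this is the same join expressed in SQL or a dataframe library, but the shape of the result is the point: each transaction line carries the customer and product context the model needs.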
A quick sign that the data is in a good spot is when people start nitpicking results: if the only issues raised are small ones, reviewers are hunting for things to be wrong rather than finding real problems. If you find yourself adjusting individual line items instead of the way the data is structured, chances are you’re already in a good place.
Plan Your Data Resources
Once data is in good enough shape, you need resources to do something with it. This usually comes down to your people and tools. Business users with defined goals, such as increasing profitability or reducing supply chain waste, are ideal candidates to review the newly cleansed data because they have firsthand knowledge of how that data should behave. They will probably need tools above and beyond Excel to extract the most value from the data, so consider partnering with vendors to scale up quickly.
If data is wrong, then it is important to describe precisely why it is wrong. That way IT resources can have a targeted plan to improve the data, rather than attempting to fix a nebulous “data is bad” problem. IT will need resources to clean up and transform data. Engage with leadership early and often to ensure prioritization of data cleanliness and adequate bandwidth for the IT team.
Although getting the data right can be a lot of work, the opportunity to leverage good data with great AI is irresistible. Companies that embark on this journey will be able to evaluate willingness to pay at more granular levels and capture more revenue and margin. Guidance will be automated, so it will always be available to the sales team. Ultimately, these companies will be able to react more quickly than the competition to market changes, remain relevant to customers, and win the battle for growth.
Dan Cakora is a Business Consultant at Vendavo and has worked in various aspects of pricing for over 15 years. Dan started his career as a Field Economist responsible for helping to measure inflation for the federal government. Before joining the Vendavo team, Dan was a customer at a large, international B2B distributor. He has led Pricing teams, developed Pricing and Sales Enablement products, and has a passion for data visualization. Dan has an MBA from DePaul University and a BS in Economics from Purdue University.