I was listening to Jonathan Wright, Global Managing Partner for Supply Chain Consulting at IBM, talk with Malcolm Gladwell on his podcast episode "Managing supply chain volatility in the height of the shopping season" about the latest evolution of supply chain innovation spawned by the global pandemic.
The podcast got me thinking about the supply chain issues splattered all over the news lately, and looking back over the past several years (decades?) at the so-called innovations that some of my Optilogic colleagues and I have experienced in the supply chain technology space (CPFR, VMI, Causal Forecasting, EDI, Transora, visibility, etc.). We see all this hype around AI, ML, and data analytics and wonder whether any of the new stuff is really new, or just repackaged, renamed technology that was never properly deployed.
The podcast did get us thinking about how we are reaching back into some old tech to help solve the future's problems, perhaps with a twist here and there. Let's dive into some of the concepts brought up in the podcast.
Artificial Intelligence and Machine Learning
While these terms seem new and are everywhere today, the tech has been around since the 1950s. And while things have improved significantly since then, you just need to talk to Alexa or Siri to realize there's still room for improvement.
For supply chain, much of the progress has come in understanding the drivers of demand and supply: using these advanced algorithms to more deeply understand why demand or supply is changing. AI solutions such as IBM Watson are being deployed to better track and identify demand and supply issues in real time.
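To make "understanding drivers of demand" concrete, here is a minimal sketch. It is my own toy illustration with made-up weekly numbers (not how Watson or any particular product works): an ordinary least squares fit that estimates how much each candidate driver, such as discount depth or a promotion flag, contributes to demand.

```python
import numpy as np

# Hypothetical weekly data (invented for illustration): candidate demand
# drivers are price discount (%), a promotion flag, and temperature (F),
# with observed units sold for each week.
X = np.array([
    [0.0, 0, 60],
    [10.0, 1, 62],
    [0.0, 0, 70],
    [20.0, 1, 75],
    [5.0, 0, 68],
    [15.0, 1, 72],
], dtype=float)
y = np.array([100.0, 180.0, 110.0, 260.0, 120.0, 220.0])

# Add an intercept column and fit ordinary least squares.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

intercept, d_discount, d_promo, d_temp = coef
print(f"baseline demand ~ {intercept:.1f} units")
print(f"estimated effect per discount point ~ {d_discount:.1f} units")
```

A real deployment would of course use far more history, guard against correlated drivers (here, discounts and promotions move together), and validate out of sample; the point is only that "why is demand changing?" becomes an estimable question once drivers are in the data.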
The podcast talks about the interconnected supply chain, but that concept has been around since the days of using EDI to send orders to suppliers. The real breakthrough, it seems, is the ability to predict events, disruptions, and spikes in demand before they happen, and many companies have been investing heavily in control towers to monitor these events.
Take this technology beyond day-to-day visibility and incorporate it into longer-term risk and resiliency planning, and you get another use we are starting to see: building a broader understanding of the key dimensions that impact metrics other than cost and service.
An interesting point on the podcast came when Jonathan walked through an example in which analysts from two companies drew differing conclusions from AI projections of a supply shortfall and its expected recovery time. One analyst looked at the projections and assumed inventory wouldn't be an issue. The other dug a bit deeper, visited the supplier, and realized the recovery would take much longer. The former company ran out of supply; the latter planned for alternate suppliers and was able to meet demand.
This is where advanced solvers still need the art of the human touch: context to grasp what the data is saying and how believable it is, and analyst experience to make the final call on how best to move ahead.
A great moment in the podcast came when Malcolm asked Jonathan, "So when do analysts start believing the data?" Jonathan couldn't really answer, and in all fairness, I don't think anyone could. It's part of the art that goes into analytics and supply chain design. It's why you look at scenarios and alternatives to test possible outcomes, and why you dig a bit deeper to truly understand what the machines and algorithms are telling you.
Seeing Multiple Universes and Outcomes
The other piece of the podcast that struck me was when Jonathan talked about how few companies have a digital map of their supply chain. It struck me because supply chain design technology has been around since the 80s (maybe earlier), and after all that time, the ability to build out alternate scenarios to manage risk, resiliency, and disruptions is still not core to many Fortune 1000 companies.
Part of the issue stems from a lack of vision at these companies, part from the lack of usability in the technology and the inability to truly model at the SKU/transaction level, and still another part from the shortage of skilled resources who understand the end-to-end dynamics of a global supply chain. This type of deep understanding at the transaction level, used to evaluate alternate supply chains strategically, not just for the next week but for the next few months and years, is where we see these AI technologies meet the prescriptive technologies used for designing current and future supply chain ecosystems.
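As a deliberately tiny illustration of evaluating alternate supply chains across scenarios, here is a sketch in plain Python with invented numbers (none of these figures come from a real engagement): it scores a single-source strategy against a dual-source strategy over a handful of disruption "universes" for the primary supplier.

```python
DEMAND = 100  # units per period (invented)

# Disruption "universes": (probability, primary supplier capacity available)
SCENARIOS = [(0.80, 100), (0.15, 50), (0.05, 0)]

PRIMARY_COST = 1.0  # $/unit, cheaper but disruption-prone
BACKUP_COST = 1.4   # $/unit, pricier but assumed always available

def evaluate(primary_share, backup_share):
    """Expected cost and expected fill rate of a sourcing split."""
    exp_cost = exp_fill = 0.0
    for prob, cap in SCENARIOS:
        primary_units = min(DEMAND * primary_share, cap)
        backup_units = DEMAND * backup_share
        filled = min(primary_units + backup_units, DEMAND)
        exp_cost += prob * (primary_units * PRIMARY_COST + backup_units * BACKUP_COST)
        exp_fill += prob * filled / DEMAND
    return exp_cost, exp_fill

for label, split in [("single-source", (1.0, 0.0)), ("dual-source", (0.7, 0.3))]:
    cost, fill = evaluate(*split)
    print(f"{label}: expected cost ${cost:.2f}, expected fill rate {fill:.1%}")
```

Under these made-up numbers, dual sourcing costs more in expectation (about $105.50 vs. $87.50) but lifts the expected fill rate (93.5% vs. 87.5%), which is exactly the kind of cost-versus-resiliency trade-off that scenario design is meant to surface; real design tools do this at SKU/transaction scale with optimization rather than enumeration.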
Some of our recent engagements at Optilogic have combined deep learning with optimization, and simulation with genetic algorithms. The flexibility of powerful solvers, coupled with cloud architecture tied to simple interfaces and tailored outputs, keeps solutions more understandable for businesses than traditional platforms and adds agility in the time it takes to deliver this technology.
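For readers unfamiliar with the mechanics, here is a bare-bones genetic algorithm in Python. It is a toy with an invented fitness function, not anything from our engagements: it evolves a backup-sourcing share that trades a resiliency benefit against a cost premium, using the classic selection, crossover, and mutation loop.

```python
import random

random.seed(42)  # deterministic run for illustration

# Toy fitness (invented curve): resiliency shows diminishing returns as
# the backup share x in [0, 1] grows, while the cost premium is linear.
def fitness(x):
    resiliency = 1 - (1 - x) ** 2
    cost_penalty = 0.6 * x
    return resiliency - cost_penalty

POP, GENS, MUT = 20, 40, 0.1
pop = [random.random() for _ in range(POP)]

for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]              # selection: keep the fittest half
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2               # crossover: blend two parents
        child += random.gauss(0, MUT)     # mutation: small random nudge
        children.append(min(1.0, max(0.0, child)))
    pop = parents + children

best = max(pop, key=fitness)
print(f"best backup share ~ {best:.2f}")
```

In a real engagement the fitness evaluation would be a simulation run of the candidate supply chain design rather than a one-line formula, which is what makes the pairing of simulation and genetic algorithms attractive: the GA only needs a score, not gradients or a closed-form model.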
What do you think? We'd love to hear from you on this topic.
About the author:
John is the Vice President of Business Development at Optilogic. Prior to joining, he was part of the leadership team at LLamasoft, Inc. helping the company go from 12 people to over 500 across 10 years, with roles in pre-sales, professional services, alliances, and country manager. John also worked in pre-sales and business development roles for several small supply chain companies with technologies in inventory optimization, demand planning and causal forecasting, network optimization, S&OP, and finite capacity scheduling.