Science = art.
Both are well crafted, full of meaning, and clean in design. And modeling is the perfect combination of art and science. It is both wildly creative and filled with patterns in nature. But what exactly is modeling?
Modeling is a mathematical formulation made up of a series of linear equations: a set of variables that represent the real world. But which world do they represent? This is where modeling becomes an art. Each model, each constraint, and each variable is an abstraction of reality, but there are an infinite number of realities. The modeler's job is to decompose these realities through a series of often non-linear layers, pulling data from execution systems (CRM, TMS, WMS, etc.) and adding knowledge of both the current design and the potential future designs to be studied, resulting in a model that can be analyzed.
The modeler crosses the line from scientist to explorer on the journey of discovering what opportunities exist in the model. The modeler studies possible future outcomes, then begins interweaving these paths with the current state, discovering which what-if scenarios can be leveraged to minimize cost or maximize the opportunity in the future we want to create.
We have compiled 4 general principles to help guide a model designer in the creative thinking and model creation process:
Timeframe
The perspective of the model is limited to the timeframe chosen. If the timeframe is infinite, there is no need to model fixed costs, as they are overwhelmed by cumulative operating costs. Conversely, if modeling only a year into the future, fixed capital costs will overpower the model if not accounted for properly.
Keeping this in mind, a modeler can apportion fixed costs using accounting best practices for the time value of money. When dealing with fixed costs, a modeler should study their effect during the sensitivity analysis stage to ensure the model interprets them as expected.
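One standard way to apportion a fixed cost over a finite model horizon is the capital recovery factor (the annuity formula). The sketch below is illustrative only; the $1M facility cost, 8% discount rate, and 10-year lifetime are invented numbers, not figures from any real model.

```python
# Sketch: annualizing a fixed capital cost so it is comparable with
# operating costs over a finite model horizon. All inputs below are
# illustrative assumptions.

def capital_recovery_factor(rate: float, years: int) -> float:
    """Standard annuity formula: converts a present value into
    equal annual payments over `years` at discount `rate`."""
    growth = (1 + rate) ** years
    return rate * growth / (growth - 1)

def annualized_fixed_cost(capex: float, rate: float, years: int) -> float:
    """Spread a one-time capital cost into an equivalent annual cost."""
    return capex * capital_recovery_factor(rate, years)

# A hypothetical $1M facility written off over 10 years at an 8% rate:
annual = annualized_fixed_cost(1_000_000, 0.08, 10)
print(f"Annualized fixed cost: ${annual:,.0f}")
```

The annualized figure (roughly $149K per year in this example) can then be fed into a one-year model alongside operating costs without the capital cost overpowering everything else.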
Level of Detail
Modelers are often constrained by available processing power. Even with a cloud-based optimization studio like Optilogic, there is a limit to the compute available, and a limit to our ability to digest the results.
Our advice: simplify your model wherever you can without losing information relevant to the question you are answering. Machine learning offers excellent clustering algorithms for understanding your products, demand, and supply chain before committing to an aggregation strategy. A simple combination of k-means clustering and Pareto analysis covers many cases and is easy to implement; affinity analysis can also suggest good aggregation strategies. Whatever approach you choose, keep it simple and avoid complicated abstractions that leave you scratching your head at the outputs and what to do with them.
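As a minimal sketch of the k-means idea, the snippet below clusters a handful of hypothetical SKUs on a few supply chain features before aggregation. The SKU data is invented for the example, and scikit-learn is assumed to be available; a real study would use many more SKUs and features chosen with the business.

```python
# Illustrative sketch: clustering SKUs into candidate aggregation groups.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical SKUs: [annual demand (units), unit value ($), weight (kg)]
skus = np.array([
    [120_000,  3.50, 0.2],   # high-volume, low-value
    [115_000,  4.10, 0.3],
    [  9_000, 85.00, 1.5],   # low-volume, high-value
    [  7_500, 92.00, 1.2],
    [ 40_000, 15.00, 5.0],   # mid-volume, heavy
    [ 38_000, 12.50, 4.6],
])

# Scale features so demand volume does not dominate the distance metric.
scaled = StandardScaler().fit_transform(skus)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # each SKU is assigned to one of 3 aggregation groups
```

Each cluster can then be modeled as a single aggregate product, shrinking the model while preserving the structure that matters.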
When hunting for aggregation dimensions, start with demand volume, then layer in company business rules, bills of materials, demand and product SKU affinity analysis, and product similarity from a supply chain or production perspective.
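Starting with demand volume often means a Pareto (ABC) cut: model the high-volume A items individually and aggregate the long tail. The sketch below shows one common convention (A = top ~80% of volume, B = next ~15%, C = the rest); the demand figures and cutoffs are illustrative assumptions.

```python
# Sketch of a Pareto (ABC) classification by demand volume.

def abc_classify(demand_by_sku: dict, a_cut: float = 0.80,
                 b_cut: float = 0.95) -> dict:
    """Rank SKUs by demand and bucket them by cumulative volume share."""
    total = sum(demand_by_sku.values())
    running, classes = 0.0, {}
    for sku, demand in sorted(demand_by_sku.items(), key=lambda kv: -kv[1]):
        running += demand
        share = running / total
        classes[sku] = "A" if share <= a_cut else "B" if share <= b_cut else "C"
    return classes

# Hypothetical demand data:
demand = {"SKU-1": 50_000, "SKU-2": 30_000, "SKU-3": 12_000,
          "SKU-4": 5_000, "SKU-5": 2_000, "SKU-6": 1_000}
print(abc_classify(demand))
```

The C items are the natural candidates for aggregation along the other dimensions mentioned above (business rules, bills of materials, affinity).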
Non-Linear Costs
If your problem is anything but toy-sized, attacking non-linear costs directly can be problematic. To keep the model solvable, discretize them with piecewise linear curves. With a small number of breakpoints (five or fewer is a good rule of thumb for most curves), a modeler can often abstract the problem without a significant loss of information.
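To make the idea concrete, here is a small sketch that approximates a hypothetical concave cost curve with five breakpoints and measures the worst-case gap. The cost function and breakpoint locations are invented for illustration; in practice you would fit the breakpoints to your actual rate curves.

```python
# Sketch: piecewise linear approximation of a non-linear cost curve.
import math

def cost(q: float) -> float:
    """Hypothetical non-linear cost with economies of scale."""
    return 100 * math.sqrt(q)

breakpoints = [0, 100, 300, 600, 1000]        # five points, four segments
values = [cost(b) for b in breakpoints]

def pwl_cost(q: float) -> float:
    """Linear interpolation between the sampled breakpoints."""
    for (x0, y0), (x1, y1) in zip(zip(breakpoints, values),
                                  zip(breakpoints[1:], values[1:])):
        if q <= x1:
            return y0 + (y1 - y0) * (q - x0) / (x1 - x0)
    raise ValueError("q outside modeled range")

# Measure the worst-case absolute gap across the modeled range. The gap
# peaks near the origin, where the curve is steepest; add breakpoints
# there if that region matters for your decisions.
err = max(abs(pwl_cost(q) - cost(q)) for q in range(0, 1001))
print(f"max absolute gap: {err:.0f}")
```

The piecewise function is exact at every breakpoint, so checking the gap between breakpoints, as above, is a quick sanity test before handing the curve to a solver.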
Variable Selection
While modeling, it is important to use variables that are representative of the problem; theoretically equivalent reformulations often do not solve as well. There are times when indirect variables outperform the direct formulation, but when in doubt, create variables that mimic real-world decision-making as closely as possible. The algorithm will generally exploit this structure to search and prune the feasible region effectively.
The Art of Modeling and the Modeler
Modeling strategies do not always consider ongoing events in the real world, which means as modelers we must be creative and have a broad perspective.
For a modeler to be successful, they need to understand the limits of the solver and work within them. Getting all the necessary information into a first model is a struggle, so start small. From there, the best practices of software development apply.
Make a plan, start simple, stay feasible, and layer in complexity slowly. As researchers and modelers continue to study the infinity of the multiverse, we will know how to better serve our current singular universe.
Grow Your Knowledge
United Rentals Powers Facility Footprint Optimization with Optilogic
Optilogic interviews Carey Boggess, Director, Footprint Development at United Rentals. Carey shares insight into how the company uses Optilogic to optimize its footprint, make faster decisions on lease renewals, and plan out the market strategy for its branches to provide exceptional service to customers.
Sensitivity at Scale: Run Thousands of Sensitivity Scenarios in Parallel
Learn how to use Cosmic Frog Sensitivity at Scale: Automatically create hundreds or thousands of sensitivity scenarios and rapidly analyze on Optilogic’s SaaS platform to select the most resilient designs.
Optilogic’s Groundbreaking Sensitivity at Scale Enables Large-Scale Sensitivity Analysis With a Single Click
Supply chain network design software innovator, Optilogic, is making true large-scale sensitivity analysis possible with Sensitivity at Scale, which runs hundreds or thousands of sensitivity scenarios in parallel with a single click.