Demand forecasting solution for waste reduction
- A large big-box retailer offers a freshly cooked product as a strategic loss leader.
- The existing demand forecasting process, which determined production quotas, was inaccurate because it was performed manually using only a few historical data points.
- The goal for the new forecasting tool was to reduce waste, increase sales, and decrease customer dissatisfaction with this product.
- The retailer worked with Neal Analytics to build a solution that provided a more accurate sales forecast.
- Neal Analytics’ machine learning-powered Demand Forecasting solution reduced forecasting errors to below 10% per day at a 30-minute granularity level.
The customer is a large big-box retailer. Among other products, it sells freshly cooked, ready-to-eat food products that serve as a strategic loss leader.
Already operating at a loss on this product, the retailer wanted to improve operational efficiency and reduce demand forecasting errors to under 10% at a 30-minute granularity. However, its existing demand forecasting for this product was done manually and wasn't accurate enough.
The retailer's team took the sales data from the same day the prior year (or from the previous few weeks) and added 5%. This simple forecasting model lacked accuracy and led to significant product waste. The retailer's relatively new analytics team needed a more accurate demand forecasting solution.
It was also critical, in the overall business goal-setting process, to select the appropriate target: the models could be optimized for either waste reduction or sales maximization. The former would ensure that all or most of the cooked product sold, so little would have to be scrapped or recycled into even lower-margin products. The latter would have meant consistently cooking enough product to meet most or all consumer demand, but it would have increased waste significantly.
The primary client goal was waste reduction; therefore, the Neal Analytics team built a model with this primary business goal in mind.
The Neal Analytics Advanced Demand Forecasting Solution
Neal Analytics faced several challenges in building an accurate demand forecasting solution. Among them:
- The client did not provide any historical data from its existing demand forecast process that could serve as a benchmark for Neal Analytics' predicted results.
- The client did not have processes or metrics in place to track relevant data at the granularity needed for an accurate forecast.
Neal Analytics’ approach to building an advanced demand forecasting solution followed these steps:
- Assess the currently available data and the limitations of the current forecast method
- Identify the types of data to track in order to build a robust forecasting solution
- Develop a plan for Neal Analytics to deliver a short-term pilot and a long-term scalable solution
The goal was first to build a working pilot, then run it through proprietary test suites, and finally take incremental steps towards a production-ready solution.
The 5 Project Stages
STAGE 1: Assessment and rapid insights
This stage set clear business objectives and performance metrics. The team assessed parameters such as costs, benefits, feasibility, and timelines. It also compiled all available data sources that would be used for the solution.
STAGE 2: Working pilot
The initial model helped identify limitations and estimate the production-level development effort required.
STAGE 3: In-market in four stores – Phase 1
An incremental model with proven results was deployed in the cloud and used to test the forecast in four stores. This model partially consumed real data from the client.
STAGE 4: In-market in four stores – Phase 2
A generalized model infrastructure was deployed on the Microsoft Azure cloud to support the first stores and prepare for scaling to more.
STAGE 5: Functionally optimized
The final production-fit solution was deployed. This solution used automated mechanisms to improve its machine learning algorithms based on new hourly and daily data.
The Solution Breakdown
The team used the forecasting model from the first pilot store as a benchmark for the other three test stores. There, the solution was modified to suit the specific factors that affected each store's demand patterns.
For instance, at a store close to a large college, school breaks played a significant role in sales.
Because of the accuracy demands at both a daily and 30-minute granularity level, as well as the discrepancies in consumer purchasing behavior depending on the store, the Neal Analytics team had to build a multi-tier, multi-level store-specific set of models to achieve an acceptable result.
To achieve this, the demand forecasting solution from Neal Analytics has three steps.
- Step 1: Generate 30-minute forecasts for an entire day (the "sub-daily" model)
- Step 2: Generate the daily forecast by creating an ensemble of models
- Step 3: Scale the 30-minute forecast (1) using the total of the daily forecast (2)
Step 1: Sub-daily model to generate an initial set of 30-minute forecasts (30-min model)
The sub-daily model used FB Prophet, an open-source forecasting algorithm developed by Facebook, to predict the number of products that would sell at a particular time of day.
For example, the table on the right shows the type of prediction for morning sales it would produce.
With the sub-daily model, the retailer was able to identify the times of day at which sales were highest. However, the number of units sold was not accurate enough: even when the model was reasonably precise at the 30-minute level, the errors accumulated over these periods across a full day were too large to be satisfactory.
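The production model used FB Prophet, but the core idea of a sub-daily forecast can be sketched in plain Python: learn a typical intraday profile from historical 30-minute sales and use it to predict each slot. This is a toy stand-in with invented numbers, not the Prophet implementation:

```python
from collections import defaultdict

def fit_intraday_profile(history):
    """history: one dict per historical day mapping a 30-minute slot
    (e.g. "09:00") to units sold. Returns the mean units per slot."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for day in history:
        for slot, units in day.items():
            totals[slot] += units
            counts[slot] += 1
    return {slot: totals[slot] / counts[slot] for slot in totals}

# Toy history: two mornings of 30-minute sales (made-up numbers).
history = [
    {"09:00": 12, "09:30": 18, "10:00": 25},
    {"09:00": 14, "09:30": 20, "10:00": 27},
]
profile = fit_intraday_profile(history)
print(profile)  # {'09:00': 13.0, '09:30': 19.0, '10:00': 26.0}
```

A real Prophet model would additionally capture trend, weekly seasonality, and holiday effects rather than a simple per-slot average.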
Step 2: Generate daily forecasts (daily model)
Neal Analytics built an “ensemble model” for the daily model.
This ensemble model combines the predictions of three separate algorithms: FB Prophet, a statistical algorithm combining historical data across multiple periods, and a simple linear historical model.
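One common way to build such an ensemble is a weighted average of the member models' outputs. The sketch below assumes that approach; the weights are illustrative assumptions, not the engagement's actual values:

```python
def ensemble_daily_forecast(prophet_pred, stat_pred, linear_pred,
                            weights=(0.5, 0.3, 0.2)):
    """Weighted average of the three daily predictions. The weights are
    illustrative assumptions, not the engagement's actual values."""
    preds = (prophet_pred, stat_pred, linear_pred)
    return sum(w * p for w, p in zip(weights, preds))

# Hypothetical daily predictions from the three models for one store.
print(round(ensemble_daily_forecast(480, 510, 530), 1))  # 499.0
```

In practice, the weights would be tuned per store against held-out sales data so that stronger member models contribute more to the final daily number.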
Step 3: Scale 30-minute forecast to daily forecast for more accuracy
The final step scales the initial 30-minute level prediction based on the ensemble daily forecast.
For instance, if the daily forecast from Step 2 was 500 while the Step 1 forecasts summed to 400 (adding all the 30-minute predictions together), each 30-minute forecast would be scaled by 500/400, or 1.25. For example, if the 9:00–9:30 am forecast was 15 products, the final forecast would be 19 products (15 × 1.25, rounded up).
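The scaling step from the example can be written in a few lines. The rounding-up policy follows the 15 → 19 example in the text; the slot numbers are invented:

```python
import math

def scale_subdaily(subdaily, daily_total):
    """Rescale the 30-minute forecasts so they line up with the more
    accurate daily ensemble total, rounding each slot up to whole units."""
    factor = daily_total / sum(subdaily.values())
    return {slot: math.ceil(units * factor) for slot, units in subdaily.items()}

# Toy sub-daily forecasts summing to 400, against a daily forecast of 500.
subdaily = {"09:00": 15, "09:30": 185, "10:00": 200}
print(scale_subdaily(subdaily, 500))  # {'09:00': 19, '09:30': 232, '10:00': 250}
```

Note that rounding each slot up means the scaled slots can sum to slightly more than the daily total, a bias toward cooking enough product rather than running out.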
Factors considered while building the ensemble model
Each of the three models had strengths and weaknesses. There were also certain factors specific to a particular location. For instance, one store was close to the Canadian border. It would have a lot more Canadian customers because of advantageous exchange rates, better prices, and a lower sales tax.
Some factors considered for the machine learning models were:
- Sales data from previous years
- Weather and seasonality
- Time of year
- Local environment
- Day of the week
- Neighboring high schools and colleges
- Local cultural and sporting events, such as an NFL game
- Feedback from store staff, recorded in Excel sheets comparing actual sales with the daily forecasts from the deployed pilot
The staff feedback provided insights into what the model was missing or failing to capture at the most granular level. For example, a school rush would cause an ad-hoc spike in sales at a particular store on a particular day. Such unpredictable factors could not be built into the model.
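One way to picture the factors above is as a feature row per store and day that the models consume. The sketch below is hypothetical; every field name and value is invented for illustration, not the engagement's actual schema:

```python
# Hypothetical feature row for one store-day; names and values are
# illustrative, not the engagement's actual schema.
feature_row = {
    "store_id": "store_04",
    "day_of_week": "Saturday",       # day of the week
    "week_of_year": 47,              # time of year / seasonality
    "avg_temp_f": 41,                # weather
    "is_school_break": False,        # neighboring high schools and colleges
    "local_event": "NFL_home_game",  # local cultural and sporting events
    "prior_year_units": 470,         # sales data from previous years
}
print(len(feature_row))  # 7
```

Store-specific factors such as cross-border shopping would appear as extra fields only in the affected store's model, which is one reason the solution ended up as a store-specific set of models.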
Neal Analytics’ final demand forecasting solution architecture
For this engagement, Neal Analytics built over 20 models, one per store per day of the week. Before these new models, the client estimated actual waste to be well over 10% at all stores across the US. After implementing Neal Analytics' machine learning-powered demand forecasting solution, waste was consistently below the project goal of 10%.
The model also helped the customer to better understand future sales trends and reduce product recycling.