Every time an Amazon customer makes a purchase, the Fulfillment Optimization (FO) Team determines how to fulfill that order in the most cost-effective way while meeting the delivery promise. Our planning inputs — Units per Box (UPB), Destination Demand Forecast (DDF), and Cube per Package (CPP) — are foundational signals consumed by transportation planning, capacity planning, and cost forecasting systems across Amazon's fulfillment network. Getting these forecasts right directly impacts billions of dollars in annual fulfillment cost.
We are part of Amazon's Supply Chain Optimization Technology (SCOT) Group, which develops systems that optimize inventory placement, transportation, and fulfillment plans across marketplaces worldwide.
The FO Planning & Forecasting team is seeking a Business Intelligence Engineer (BIE) who combines deep analytical skills with a builder's mindset — someone who can architect data pipelines, develop automated reporting systems, and apply AI-powered tooling to accelerate insight generation and decision-making at scale.
Key job responsibilities
- Own the data architecture and reporting infrastructure for UPB, DDF, and CPP forecasting inputs across US and international marketplaces
- Build and maintain automated pipelines that produce weekly forecast bridges, variance decompositions, and accuracy tracking consumed by leadership (WBR, QBR, OP cycles)
- Develop AI-assisted analytical workflows that automate recurring analyses, anomaly detection, and root-cause investigation across large-scale forecasting datasets
- Partner with research scientists and economists to validate model outputs, backtest forecast accuracy, and translate model improvements into business impact ($M attribution)
- Design and build self-service dashboards and data products that enable product managers and scientists to independently explore forecast performance without ad-hoc requests
- Mine and integrate data across simulation results, log files, fulfillment systems, and transportation datasets to identify trends, quantify risks, and support planning decisions
- Drive data quality improvement projects — defining data contracts, monitoring freshness/completeness, and building alerting systems that surface issues before they reach downstream consumers
- Collaborate with software development teams to implement analytics systems and data structures that support ML model delivery and large-scale experimentation
A day in the life
Your morning starts with an automated variance report your pipeline generated overnight: UPB missed plan, and the system has already attributed the gap to a drop in inventory availability. You add context and push the summary to leadership before 10am. Midday, you're building backtesting infrastructure for a scientist's new model, then pairing with a partner team to root-cause an unexpected data drift. In the afternoon, you're developing an AI agent that automates a recurring analysis.