Supply chains generate more data than ever. And yet, most analytics programs still struggle to turn that data into decisions. At 205 Data Lab, we’ve long advocated a data-warehouse-first approach: consolidate, clean, and standardize data before layering business intelligence on top.
When we spoke with Matthew Burgos, former Director of Data and Decision Intelligence at fairlife and Senior Manager of Supply Chain Data Strategy at US Foods, he offered a complementary view: a semantics-first approach. Matthew’s point was about meaning. Companies keep adding dashboards and APIs but never agree on what their data means. It immediately resonated with us and made us realize that most of the value of our data-modeling work is rooted in semantics work: understanding the business concept and capturing it in the data transformations we build.
This conversation revealed two perspectives that every data-driven supply-chain team should understand: data warehouse as the enabler, and semantics as the direction-setter for data modeling. Together, they form a structure that makes BI scalable, explainable, and ready for AI.
At 205 Data Lab, our philosophy begins with the practical reality of integration.
You can’t model what you can’t access.
A modern data warehouse—Snowflake, Fabric, or Databricks—provides the infrastructure to ingest and standardize data from the systems that run the supply chain.
By consolidating those source feeds in one governed environment, organizations gain the stability to perform consistent joins, transformations, and quality checks.
This warehouse-first discipline turns scattered operational systems into a unified analytical backbone—an essential first step before visualization or AI.
In business-intelligence terms, the warehouse is the source of truth. It gives analysts clean tables, clear lineage, and repeatable refresh cycles. Without it, reporting devolves into manual reconciliation and version drift.
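To make that concrete, here is a minimal sketch in Python (using pandas) of the kind of standardization and quality check that becomes repeatable once source feeds land in one governed environment. The feed names, column names, and the check itself are hypothetical illustrations, not a prescription.

```python
import pandas as pd

# Hypothetical raw feeds as they might arrive from two operational systems.
# System names, columns, and values are illustrative assumptions.
orders_erp = pd.DataFrame({
    "order_id": ["SO-1001", "SO-1002"],
    "ship_date": ["2024-03-01", "2024-03-02"],
    "qty_ordered": [120, 80],
})
shipments_wms = pd.DataFrame({
    "ORDER_ID": ["SO-1001", "SO-1002"],
    "QTY_SHIPPED": [120, 75],
})

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Apply one naming convention so downstream joins stay consistent."""
    out = df.copy()
    out.columns = [c.strip().lower() for c in out.columns]
    return out

# Consolidate: one governed join instead of ad-hoc spreadsheet matching.
orders = standardize(orders_erp)
shipments = standardize(shipments_wms)
fulfillment = orders.merge(shipments, on="order_id", how="left")

# Quality check: flag lines where shipped quantity falls short of ordered.
fulfillment["short_shipped"] = fulfillment["qty_shipped"] < fulfillment["qty_ordered"]
print(fulfillment)
```

Once the feeds live side by side under one set of conventions, the same join and the same check run identically on every refresh cycle.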

Matthew approaches the same problem from the opposite direction. His experience on both the warehouse floor and the analytics side taught him that integration alone isn’t enough.
What matters most is the shared meaning of the data flowing into that warehouse. In his view, the real foundation isn’t pipelines—it’s the semantic layer that defines how the business measures performance. In the supply chain domain, that means shared metric definitions; the sketch below illustrates the idea.
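As a hypothetical illustration, here is what capturing such definitions explicitly might look like in Python. The metric names, grains, formulas, and owners are assumptions made for the example, not definitions taken from Matthew or from any particular tool.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    """One agreed-upon business definition, owned in one place."""
    name: str
    grain: str     # the level at which the metric is measured
    formula: str   # the single formula every dashboard and AI tool reuses
    owner: str     # the team accountable for the definition

# Illustrative supply-chain metrics; names and formulas are assumptions.
SEMANTIC_LAYER = {
    "otif": MetricDefinition(
        name="On-Time In-Full",
        grain="order line",
        formula="delivered_on_time AND delivered_in_full",
        owner="Logistics",
    ),
    "fill_rate": MetricDefinition(
        name="Fill Rate",
        grain="order line",
        formula="sum(qty_shipped) / sum(qty_ordered)",
        owner="Supply Planning",
    ),
}

def describe(metric_key: str) -> str:
    """What a dashboard tooltip or an AI copilot retrieves: meaning, not guesswork."""
    m = SEMANTIC_LAYER[metric_key]
    return f"{m.name} (grain: {m.grain}) = {m.formula}, owned by {m.owner}"

print(describe("fill_rate"))
```

With definitions held in one place like this, a metric means the same thing in every dashboard query and every AI prompt that references it.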
When semantics come first, every downstream activity—from modeling to dashboard design—has a reference point. Data engineers build pipelines that serve defined entities. Analysts create visuals aligned to common metrics. AI copilots retrieve information through structured meaning, not guesswork.
Matthew’s philosophy reframes the semantic layer as the architectural foundation. It is drawn first, even if it’s implemented last.
The warehouse enables scale and integration; the semantic layer ensures consistency and understanding. You build the warehouse first, but you design it around semantic intent—every table, transformation, and metric should trace back to a defined business concept.
In simple terms, the warehouse defines how data flows, and the semantic layer defines why it flows that way. One is the physical foundation, the other the conceptual blueprint. Get either wrong, and the structure collapses: integrated data without meaning is noise; defined meaning without reliable data is theory.

In practice, all data work culminates at the semantic layer—the point where integrations, transformations, and metric calculations converge into business meaning before reaching BI dashboards or AI tools. It’s where data becomes understandable and actionable.
Importantly, this doesn’t mean the semantic layer sits outside the warehouse. In modern architectures, ingestion, integration, transformation, and even semantic modeling often happen inside the data warehouse. The distinction is logical, not technical: the warehouse is the execution engine; the semantic layer is the interpretive structure.
Think of it as a funnel:
Source systems → Ingestion → Integration → Transformation → Semantic layer → BI / AI consumption.
Each step narrows the gap between operational data and shared understanding, ensuring that what reaches the dashboard already carries consistent business meaning.
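Read as code, the funnel might look something like the sketch below. Every function name and stage boundary here is an assumption made for illustration; the only claim is that each stage hands the next one data that carries more shared meaning.

```python
# A schematic of the funnel; stage boundaries and names are illustrative
# assumptions, not a reference architecture.

def ingest(source_systems):
    """Land raw records from operational systems, unchanged."""
    return [record for system in source_systems for record in system]

def integrate(raw_records):
    """Resolve keys and conventions so records from different systems line up."""
    return [{**r, "order_id": str(r["order_id"]).upper()} for r in raw_records]

def transform(integrated):
    """Derive the fields the business actually reasons about."""
    return [{**r, "in_full": r["qty_shipped"] >= r["qty_ordered"]} for r in integrated]

def apply_semantics(transformed):
    """Attach the governed metric definitions before anything reaches BI or AI."""
    shipped = sum(r["qty_shipped"] for r in transformed)
    ordered = sum(r["qty_ordered"] for r in transformed)
    return {"fill_rate": shipped / ordered if ordered else None}

# Source systems → Ingestion → Integration → Transformation → Semantic layer → consumption
erp = [{"order_id": "so-1001", "qty_ordered": 120, "qty_shipped": 120}]
wms = [{"order_id": "so-1002", "qty_ordered": 80, "qty_shipped": 75}]
print(apply_semantics(transform(integrate(ingest([erp, wms])))))
```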
Business intelligence lives at the intersection of these two philosophies.
When warehouse and semantics are aligned, supply-chain dashboards stop being static scorecards and start acting as operational tools.
With data-warehouse-first discipline, every metric has traceable lineage across systems.
With semantics-first clarity, those metrics carry the same definition from the loading dock to the boardroom.
The results are tangible.
Beyond data accuracy, centralizing metric calculations in the warehouse delivers measurable ROI. BI teams spend less time rebuilding measures for each visualization layer and more time analyzing outcomes. A single, governed metric definition prevents the “five different versions” problem Matthew described—one of the most common sources of confusion in supply-chain reporting.
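A minimal sketch of that centralization, with an in-memory SQLite database standing in for the warehouse: the metric is defined once as a governed view, and every consumer reads that view instead of rebuilding the measure locally. The table, view, and metric here are illustrative assumptions.

```python
import sqlite3

# SQLite stands in for the warehouse; the table, view, and metric are
# illustrative assumptions, not a specific customer's model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE order_lines (order_id TEXT, qty_ordered INTEGER, qty_shipped INTEGER);
    INSERT INTO order_lines VALUES ('SO-1001', 120, 120), ('SO-1002', 80, 75);

    -- One governed definition of fill rate, owned in the warehouse.
    CREATE VIEW fill_rate AS
    SELECT CAST(SUM(qty_shipped) AS REAL) / SUM(qty_ordered) AS value
    FROM order_lines;
""")

# Every consumer (BI dashboard, ad-hoc analysis, AI copilot) reads the same
# view rather than re-deriving the measure in its own layer.
dashboard_value = conn.execute("SELECT value FROM fill_rate").fetchone()[0]
copilot_value = conn.execute("SELECT value FROM fill_rate").fetchone()[0]
assert dashboard_value == copilot_value
print(f"Fill rate: {dashboard_value:.1%}")
```

Because every consumer queries the same governed object, there is simply no place for a second, third, or fifth version of the metric to appear.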
Matthew Burgos and 205 Data Lab meet on the same idea: supply-chain analytics should be built to adapt, not just to launch.
The data warehouse provides the structure that can shift with new systems. The semantic layer provides the shared meaning that travels with them. Together, they create an architecture designed for change—stable in logic, flexible in implementation.
Integrate your systems first (data-warehouse-first). Define shared meaning next (semantics-first). Do both, and you’ll build a BI system that evolves gracefully, no matter what changes next.