Supplier Engagement for Scope 3 Data: From Email Chains to an Actual System

Most companies know they need Scope 3 data from suppliers. Almost none have a process that works beyond 'send an email and wait three months.' Here's how structured supplier engagement moves you from spend-based guesses to primary data — and what still doesn't work.

Denis Kargl · February 27, 2026 · 13 min read
Scope 3 · Supplier Engagement · Supply Chain Emissions · ASRS · Carbon Accounting · Data Quality

We talked to a sustainability manager at a mid-size construction firm last month who described their Scope 3 data collection process as "sending emails into the void." She'd emailed 35 suppliers requesting emissions data. Eight responded. Three sent something usable. The other five sent PDFs that ranged from a single number with no methodology to a 60-page sustainability report where the relevant figure was buried on page 43. That was six months of work.

This is not an unusual story. It's the standard experience for Australian companies trying to collect supplier-specific emissions data for Scope 3 reporting. And with ASRS Group 2 entities starting to report from July 2026 — and Scope 3 becoming mandatory in their second year — the email-into-void approach isn't going to cut it.

The problem isn't that sustainability teams don't know they need Scope 3 data. The problem is that "send an email to procurement, who sends an email to the supplier, who replies three months later with a number nobody can verify" is still the dominant workflow in corporate Australia. There's no tracking. No status visibility. No way to know whether a supplier opened your request, started working on it, or filed it directly in the bin.

We built a supplier engagement module inside Carbonly because we kept watching this exact pattern repeat. Not because the technology is complicated — it isn't. But because nobody was treating supplier data collection as a workflow that needs states, deadlines, and audit trails. They were treating it as correspondence.

The Data Quality Problem You're Actually Solving

Here's what most people miss about Scope 3: the number you report is only as credible as the data behind it. And right now, most Australian companies are reporting Scope 3 using spend-based estimates. That means they took their procurement spend, classified it by industry code, and multiplied by an EEIO emission factor from something like EXIOBASE.

Spend-based gets you a number. But NQA's 2025 analysis of over 50 ISO 14064-1 verified companies found that spend-based factors overestimate actual emissions by an average of 63%. In some sectors, the gap was 79%. The reason is straightforward: industry-average emission factors can't reflect what a specific supplier actually does. A concrete supplier running on renewable electricity looks identical to one running on brown coal when you're using spend-based factors. Both get the same kg CO2-e per dollar multiplier.

That's not a rounding error. It's a structural distortion that makes your Scope 3 number almost useless for decision-making. You can't set a credible SBTi target (which requires coverage of at least 67% of Scope 3 if those emissions exceed 40% of your total) based on numbers that might be off by 60-80%.

The GHG Protocol defines four tiers of data quality for Scope 3: primary (supplier-specific), secondary (industry-average with activity data), estimated (modelled), and spend-based. The goal is simple in theory: move your key suppliers up that hierarchy over time. Year one, everything is spend-based. Year two, your top 20 suppliers provide activity data. Year three, a handful provide verified primary data. In practice, almost nobody gets past year one without a system.
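One practical way to make that hierarchy operational is to track each supplier's current tier and report the distribution year on year. A minimal sketch, with an illustrative tier ordering and made-up supplier names:

```python
from enum import IntEnum
from collections import Counter

class DataQualityTier(IntEnum):
    """Scope 3 data quality tiers, lowest to highest (illustrative ordering)."""
    SPEND_BASED = 1
    ESTIMATED = 2
    SECONDARY = 3   # industry-average factors applied to supplier activity data
    PRIMARY = 4     # supplier-specific, allocated emissions

def tier_coverage(suppliers: dict[str, DataQualityTier]) -> dict[str, float]:
    """Share of suppliers (by count) at each tier: a quick year-on-year progress metric."""
    counts = Counter(suppliers.values())
    total = len(suppliers)
    return {tier.name: counts.get(tier, 0) / total for tier in DataQualityTier}

suppliers = {
    "ConcreteCo": DataQualityTier.SECONDARY,
    "SteelWorks": DataQualityTier.SPEND_BASED,
    "FreightCo": DataQualityTier.SPEND_BASED,
    "EnergyCo": DataQualityTier.PRIMARY,
}
print(tier_coverage(suppliers))
```

Showing this distribution moving over three reporting years is exactly the "upgrade over time" story an auditor or board wants to see.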

Why Email Doesn't Scale

The fundamental issue with email-based supplier engagement is that it has no memory and no status. You send a request. The supplier either replies or doesn't. If they don't, you follow up. If they do, you get data in whatever format they chose — a spreadsheet, a PDF, a number in the body of the email, a link to their sustainability report. You copy it somewhere. Maybe a spreadsheet. Maybe a shared drive. You can't tell your auditor when the data was received, who validated it, or whether it matches what the supplier reported last year.

Consider a company with 200 active suppliers. Even if you only engage the top 50 (which covers maybe 80% of your spend), that's 50 separate email threads. Multiply by follow-ups, clarifications, rejections, resubmissions. You're looking at 200-400 emails across a reporting cycle. All of it unstructured. None of it auditable in any meaningful way.

The ASRS framework is clear that Scope 3 data needs to be traceable. AASB S2 paragraph 29(a) requires disclosure of greenhouse gas emissions across the value chain, and assurance requirements escalate over time — limited assurance from year two, moving toward reasonable assurance. An auditor will ask: where did this number come from? When was it received? Who checked it? What methodology did the supplier use? If your answer is "let me search my inbox," that's a problem.

What a Structured Workflow Actually Looks Like

When we designed the supplier engagement module, we mapped it against what we'd seen go wrong in practice. The core is a status workflow that mirrors real supplier behaviour, not an idealised process.

Every supplier in the system sits in one of five states: not engaged, invited, data requested, data received, or verified. That sounds obvious. But having these states visible — across your entire supplier base, at a glance — changes the conversation entirely. Instead of "I think we emailed the concrete supplier in August," it's "12 suppliers are in data_requested, 8 are in data_received waiting for verification, 23 are still not_engaged." That's something you can report to a board. Something your auditor can examine.

Each data request also has its own lifecycle: draft, sent, viewed, in progress, submitted, verified (or rejected). When a supplier opens the request, you know. When they start entering data, you know. When they submit, the data lands directly in a structured format — not an email attachment you need to manually parse.
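The two workflows above are just small state machines. Here's a sketch of both, with the point being that illegal jumps (say, straight from "not engaged" to "verified") are rejected rather than silently recorded. State names follow the article; the transition rules themselves are illustrative assumptions, not Carbonly's actual schema:

```python
# Allowed transitions for supplier status and for individual data requests.
SUPPLIER_TRANSITIONS = {
    "not_engaged":    {"invited"},
    "invited":        {"data_requested"},
    "data_requested": {"data_received"},
    "data_received":  {"verified", "data_requested"},  # rejection loops back
    "verified":       set(),
}

REQUEST_TRANSITIONS = {
    "draft":       {"sent"},
    "sent":        {"viewed"},
    "viewed":      {"in_progress"},
    "in_progress": {"submitted"},
    "submitted":   {"verified", "rejected"},
    "rejected":    {"in_progress"},  # supplier corrects and resubmits
}

def advance(state: str, new_state: str, transitions: dict[str, set[str]]) -> str:
    """Move to new_state only if the transition is allowed; raise otherwise."""
    if new_state not in transitions.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

state = "draft"
for step in ("sent", "viewed", "in_progress", "submitted", "verified"):
    state = advance(state, step, REQUEST_TRANSITIONS)
print(state)  # verified
```

The audit-trail benefit falls out for free: every call to `advance` is a loggable, timestampable event.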

Suppliers get an engagement score from 0 to 100, calculated from responsiveness, data completeness, and data quality tier. A supplier who responds promptly with primary data scores high. One who ignores three requests scores near zero. Over time, this score becomes genuinely useful — it tells you where to focus effort and where to accept that spend-based estimates are the best you'll get.

Start with Spend, Upgrade by Exception

The practical approach — and the one that actually works — is to start every supplier at the spend-based tier and upgrade selectively. Here's how that looks in practice.

First, you load your procurement data. Every supplier gets a profile: ABN, ANZSIC industry code, which Scope 3 category they fall into (purchased goods and services, capital goods, upstream transport, etc.), and annual spend. The system calculates a spend-based emission estimate for each supplier using EEIO factors. That's your baseline. It's rough. But it's complete — you've got coverage across your entire supply chain from day one.
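The baseline calculation itself is a one-liner per supplier. The ANZSIC-keyed factor values below are placeholder numbers for the sketch, not real EXIOBASE/EEIO factors:

```python
# Placeholder EEIO-style factors, kg CO2-e per dollar of spend, keyed by a
# hypothetical ANZSIC code. Substitute real factors from your chosen database.
EEIO_FACTORS_KG_PER_DOLLAR = {
    "C2031": 0.80,  # e.g. a concrete products classification
    "E3101": 0.45,
    "I4610": 0.30,
}

def spend_based_estimate(spend_dollars: float, anzsic_code: str) -> float:
    """Spend-based Scope 3 estimate in tonnes CO2-e for one supplier."""
    factor = EEIO_FACTORS_KG_PER_DOLLAR[anzsic_code]
    return spend_dollars * factor / 1000  # kg -> tonnes

print(spend_based_estimate(2_000_000, "C2031"))  # ~1600 t CO2-e
```

Rough, as the article says, but it gives every supplier a number on day one, which is what makes the triage step possible.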

Then you triage. Sort by emissions contribution. For most companies, the pattern is stark: Woolworths reports that Scope 3 is 23 times their combined Scope 1 and 2 emissions — 96% of their total footprint. And within that Scope 3, Category 1 (purchased goods and services) typically represents 40-70% alone. Your top 20-30 suppliers by spend probably account for 70-80% of your supply chain emissions. Those are the ones worth engaging.
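The triage step is a straightforward Pareto cut: rank suppliers by estimated emissions and take the smallest set that covers your target share. Supplier names and figures are invented for the sketch:

```python
def triage(estimates: dict[str, float], target_share: float = 0.8) -> list[str]:
    """Smallest set of suppliers (by estimated t CO2-e) covering target_share of the total."""
    total = sum(estimates.values())
    ranked = sorted(estimates.items(), key=lambda kv: kv[1], reverse=True)
    selected, covered = [], 0.0
    for name, t_co2e in ranked:
        if covered / total >= target_share:
            break
        selected.append(name)
        covered += t_co2e
    return selected

estimates = {"ConcreteCo": 5200, "SteelWorks": 3100, "FreightCo": 900,
             "OfficeSupplies": 150, "Landscaping": 50}
print(triage(estimates))  # the two largest cover >80% here
```

Running this against real procurement data usually confirms the article's point: a few dozen suppliers dominate, and the long tail isn't worth chasing for primary data.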

For each of those high-impact suppliers, you send a structured data request through the system. Not a survey. Not a 40-question questionnaire. A specific ask: provide your total electricity consumption, natural gas consumption, diesel consumption, and any other fuel or refrigerant data for the period. Activity data they already have on their bills. You convert it to emissions yourself using NGA Factors. That moves them from spend-based to secondary data quality in one step.
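That conversion is simple multiplication once you have the activity data. The factor values below are placeholders in the shape of the NGA Factors; for a real calculation, use the current published figures for the reporting year and (for electricity) the relevant state:

```python
# Placeholder emission factors. Replace with the current NGA Factors workbook values.
FACTORS = {
    "electricity_kwh": 0.73,   # kg CO2-e per kWh (state-dependent in practice)
    "natural_gas_gj": 51.5,    # kg CO2-e per GJ
    "diesel_litres": 2.70,     # kg CO2-e per litre
}

def activity_to_emissions(activity: dict[str, float]) -> float:
    """Total emissions in tonnes CO2-e from a supplier's activity data."""
    kg = sum(FACTORS[key] * amount for key, amount in activity.items())
    return kg / 1000

print(activity_to_emissions({"electricity_kwh": 120_000, "diesel_litres": 8_000}))
```

Keeping the activity quantities and the factors separate in storage, rather than just the final tonnes, is also what lets you restate prior years when factors are updated.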

The suppliers who can provide their own calculated Scope 1 and 2 emissions — allocated to your purchases — get classified as primary data. That's the top tier. Don't expect many suppliers there in year one. Maybe five or ten out of 200. That's normal.

The Buying Power Question (Honest Take)

Here's something that doesn't get said enough in carbon accounting marketing: supplier engagement success depends almost entirely on your buying power. If you're a $2 billion retailer asking a $50 million supplier for emissions data, they'll respond. They don't want to lose the contract. If you're a $30 million construction firm asking a $500 million concrete supplier for data, you're one of their 800 customers and they have zero incentive to invest time in your request.

We've seen this play out repeatedly. Large companies with concentrated procurement — retailers, miners, government agencies — can run effective supplier engagement programs because they have commercial pull. Mid-market companies with fragmented supply chains genuinely struggle. Their top supplier might represent 4% of that supplier's revenue. That's not enough buying power to get a response, let alone quality data.

This is why spend-based estimates aren't going away any time soon for the long tail of suppliers. And it's why the ASRS modified liability framework exists — the legislation explicitly acknowledges that Scope 3 data quality will be imperfect. For financial years commencing between 1 January 2025 and 31 December 2027, only ASIC can take enforcement action on Scope 3 disclosures. Private litigants can't sue over your Scope 3 numbers during that window. The regulators understood this was going to be messy.

But — and this matters — the modified liability period ends. After December 2027, you're fully exposed. Three years sounds like a lot of runway. It isn't, when you consider that getting primary data from even twenty suppliers takes two full reporting cycles of relationship building.

What "Verified" Actually Means

When a supplier submits data through the system, it doesn't automatically become trusted. The submitted data enters a verification step where your team reviews it against what's plausible.

We flag obvious issues: a supplier whose emissions dropped 80% year-on-year without explanation, electricity consumption figures that don't match the supplier's reported revenue or headcount, emission factors that look like they were pulled from the wrong country's dataset. The system won't catch everything. But it catches the common problems — the data entry errors, the unit mismatches (MWh versus kWh is a perennial favourite), the suppliers who reported total company emissions instead of allocated emissions for your purchases.

Rejected submissions go back to the supplier with specific notes on what needs correcting. The entire exchange is logged. That audit trail — who submitted what, when it was reviewed, why it was rejected or accepted, what methodology the supplier used — is what your auditor actually needs. Not just the final number. The provenance.

What We Haven't Solved (And What Nobody Has)

We're going to be honest about the limitations because we think the carbon accounting industry has a credibility problem with overselling.

Small suppliers can't give you what they don't have. A landscaping company with 15 employees and no energy management system doesn't know their Scope 1 emissions. They don't have a sustainability team. They might not even know what Scope 1 means. For these suppliers — and they'll be 60-70% of your supplier base by count — spend-based is it. Maybe forever. The engagement module tracks them, but you shouldn't expect responses.

Data quality from suppliers is wildly variable. We've seen suppliers report "zero emissions" (they meant they bought offsets, not that they don't emit), suppliers who reported revenue instead of energy consumption, suppliers who sent three-year-old data because they hadn't updated since their last CDP response. Having a system doesn't fix the quality of what goes into it. It just makes the quality visible and auditable.

Cross-border supply chains add a currency and factor complexity that we've written about separately. If 40% of your procurement is from overseas suppliers, the engagement challenge multiplies. Different reporting standards, different emission factor databases, different levels of climate reporting maturity. An Australian supplier probably knows what NGA Factors are. A Vietnamese steel supplier does not.

And finally — we still think Scope 3 methodology is genuinely unsettled. The GHG Protocol is reviewing its standards. Double-counting between a buyer's Scope 3 and a supplier's Scope 1 and 2 is baked into the framework by design. The numbers won't reconcile perfectly at a supply-chain level, and that's an intellectual problem the industry hasn't resolved. We work with the framework as it exists, but we won't pretend it's more precise than it is.

The Year-Two Inflection Point

For ASRS Group 2 entities, here's the timeline that matters: your first reporting year (FY starting July 2026) defers Scope 3. Your second year does not. That means your FY28 report — likely due alongside your annual financials in late 2028 or early 2029 — needs Scope 3 numbers backed by documented methodology and ready for limited assurance.

Working backwards: you need supplier data in hand by mid-2028 at the latest. To have it by then, you need to start engaging suppliers by mid-2027. To know which suppliers to engage, you need a spend-based baseline by early 2027. To build that baseline, you need your procurement data mapped to ANZSIC codes and Scope 3 categories by late 2026.
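That dependency chain is worth writing down as an explicit schedule. A trivial sketch, pinning the article's rough milestones to illustrative month-end dates:

```python
from datetime import date

# The backward-planned milestones from the text, with assumed concrete dates.
MILESTONES = [
    ("Procurement data mapped to ANZSIC codes and Scope 3 categories", date(2026, 12, 31)),
    ("Spend-based baseline complete", date(2027, 3, 31)),
    ("Supplier engagement underway", date(2027, 6, 30)),
    ("Supplier data in hand", date(2028, 6, 30)),
    ("FY28 Scope 3 disclosure ready for limited assurance", date(2028, 12, 31)),
]

def runway(today: date) -> list[tuple[str, int]]:
    """Days remaining to each milestone from today."""
    return [(name, (due - today).days) for name, due in MILESTONES]

for name, days in runway(date(2026, 3, 1)):
    print(f"{days:5d} days  {name}")
```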

That chain of dependencies is why starting now — even if your first reporting year is still months away — isn't early. It's barely on time.

The companies that will report credible Scope 3 numbers in their second year are the ones building a supplier engagement pipeline right now. Not sending bulk emails. Building a system where every supplier has a status, every request has a deadline, every response gets verified, and every decision is documented.

Spend-based gets you through year one. But auditors, investors, and SBTi validators will all expect to see progress toward supplier-specific data by year two. The question isn't whether you need a system for this. It's whether you build one before the deadline forces you to.

