Your Board Asked Why Emissions Went Up 15%. Can You Answer in 30 Seconds?
The CFO sees a quarterly number trending up and asks "why?" The sustainability manager says "I'll get back to you" and disappears into spreadsheets for two days. The problem isn't the maths. It's that most carbon software shows you totals but can't explain what changed or why.
We've watched this play out in boardrooms across Australia. The quarterly emissions number goes up. The CFO turns to the sustainability manager. "Why?" And the sustainability manager, who probably knows the answer intuitively, can't back it up with specific numbers on the spot. So they say, "I'll pull the data and get back to you."
Two days later, they've decomposed the variance manually in a spreadsheet. By then, the board has moved on. The moment for action passed.
This is the gap between having carbon data and actually understanding it. Most carbon accounting software will tell you that your Scope 1 emissions went up 15% quarter-on-quarter. What it won't tell you is why. And the "why" is the only part that matters when a board needs to decide what to do about it.
AASB S2 paragraph 6 requires entities to disclose how the governance body oversees climate-related risks and opportunities, including how and how often the board is informed. That's not a once-a-year requirement. It describes a continuous governance rhythm where directors need to understand emissions trends well enough to exercise real oversight. You can't exercise oversight of a number you can't explain.
A 15% increase could mean five completely different things
Here's what makes carbon variance analysis hard. A single percentage change in total emissions can be driven by entirely different root causes, each requiring a different response.
Consider a construction contractor reporting under NGER with operations across three states. Their Scope 1 and 2 emissions jumped 15%, about 420 tCO2-e, compared to the previous quarter. The sustainability manager opens a spreadsheet. Now what?
That 420 tCO2-e increase could be diesel consumption spiking because two additional excavators mobilised on a highway project in Queensland. It could be a new office tenancy in Melbourne adding electricity load for the first time. It could be the NGA Factors update changing the emission factor for natural gas. It could be a duplicate fuel docket that inflated the diesel total by 30,000 litres. Or it could be seasonal: winter gas consumption at the head office following the same pattern it does every year.
Each cause has a different implication. An equipment mobilisation is expected and operationally justified. A new facility is a scope boundary change that needs documenting. A factor update is a calculation artefact, not an operational change. A duplicate is a data quality error that needs correcting before the number reaches a NGER report or an AASB S2 disclosure. And seasonal variation is noise that shouldn't trigger concern.
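Catching the duplicate-docket case before it reaches a report is the easiest of these to automate. Here is a minimal sketch, in Python, of flagging fuel dockets that share the same docket number, date, and volume. The field names and figures are hypothetical, not Carbonly's actual data model.

```python
# Flag fuel dockets that appear more than once on identical key fields.
# Field names ("docket_no", "date", "litres") are illustrative.
from collections import Counter

dockets = [
    {"docket_no": "D-1041", "date": "2025-02-03", "litres": 30000},
    {"docket_no": "D-1042", "date": "2025-02-10", "litres": 12500},
    {"docket_no": "D-1041", "date": "2025-02-03", "litres": 30000},  # entered twice
]

keys = [(d["docket_no"], d["date"], d["litres"]) for d in dockets]
duplicates = [k for k, n in Counter(keys).items() if n > 1]

for docket_no, date, litres in duplicates:
    print(f"Possible duplicate: {docket_no} on {date} ({litres:,} L)")
```

A real system would also fuzzy-match near-duplicates (same volume, dates one day apart), but even this exact-match pass would have caught the 30,000-litre double entry above.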
The sustainability manager knows all this. But separating these causes from a single aggregate number, manually, takes hours. Sometimes days. You have to compare two periods side by side, break the data down by project, then by fuel type, then by site, identify the largest movers, check each one against operational context, and rule out data errors.
That's not carbon accounting. That's forensic data analysis. And it's happening in Excel.
The board governance problem nobody talks about
Here's where this gets legally interesting. Under AASB S2, governance disclosures aren't optional window dressing. Entities must explain whether the board has the competence to understand climate data, how climate risks factor into strategy and remuneration, and how often the board receives climate-related information.
The AICD's research on boardroom readiness for AASB S2 found that organisations consistently underestimated the complexity and resource demands of climate reporting, with gaps in board-level climate literacy emerging as a recurring challenge. That lines up with what we see. It's not that directors don't care about climate. It's that the data reaches them too slowly, too aggregated, and without enough context to act on.
Think about how financial reporting works. If revenue dropped 15% in a quarter, the CFO wouldn't present a single number and leave. They'd show a waterfall chart: "We lost $2.3M from the Melbourne contract winding down, gained $1.1M from the Perth expansion, and $400K is a timing difference on a delayed invoice." The board gets cause-and-effect in thirty seconds. They can ask intelligent follow-up questions. They can make decisions.
Carbon data gets none of this treatment. The board sees "emissions went up 15%" and either panics or shrugs. Neither response is useful.
PwC's 2025 Global Sustainability Reporting Survey, covering 496 executives across 40 countries, found that more than 60% of organisations increased the amount of senior leadership time spent on sustainability reporting in the past year. That time isn't going toward strategy. It's going toward assembling the numbers. And 90% of organisations not yet reporting still rely on spreadsheet-based sustainability data collection, which means variance analysis is a manual job every single quarter.
For Australian companies reporting under both NGER and AASB S2, this creates a compounding problem. NGER requires annual reporting to the Clean Energy Regulator by 31 October. AASB S2 creates an ongoing board governance obligation. You can't wait until September to work out why your Q1 numbers looked wrong.
What variance decomposition should actually look like
We built Carbonly's period-over-period analysis because we'd spent years watching sustainability teams do this work by hand. The output should look something like this for that hypothetical construction contractor:
Scope 1 and 2 increased 15% (+420 tCO2-e) quarter-on-quarter. Here's the breakdown:
The top contributor was diesel at the Bruce Highway project, up 310 tCO2-e (+34%). This aligns with the mobilisation of two additional excavators in February. The second contributor was natural gas at the head office, up 85 tCO2-e (+22%). This is seasonal and consistent with prior-year winter patterns. Partially offsetting: electricity across all sites dropped 45 tCO2-e (-3%) because the NSW grid emission factor decreased from 0.66 to 0.64 kg CO2-e per kWh in the latest NGA Factors update. The remaining roughly 70 tCO2-e is spread across smaller movements at other sites.
Data quality check: no anomalies detected, no duplicate documents, no missing billing periods. The variance is operationally driven.
That's the answer the board needs. Not "emissions went up 15%," but "here's exactly where, here's exactly why, and here's whether you should be concerned."
The decomposition works by comparing consumption data and emission factors across two user-selected periods, breaking the change down by project, material type, and scope. It identifies the largest absolute contributors to the variance, flags whether changes correlate with known operational events (like equipment mobilisations or new facility additions), checks for data anomalies that might be inflating or deflating the number, and separates emission factor changes from actual consumption changes.
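The core of that decomposition is simple to sketch. The snippet below computes emissions per (project, fuel type) line item for two periods, diffs them, and ranks contributors by absolute size. The consumption figures and emission factors are illustrative, not real NGA values, and the structure is a simplification of what any such system would do.

```python
# Minimal period-over-period decomposition sketch. Emissions are computed
# per (project, fuel) line item, diffed between periods, and ranked by
# absolute contribution to the total change. All figures are illustrative.

def line_emissions(data):
    # tCO2-e per line item = consumption x emission factor
    return {key: qty * factor for key, (qty, factor) in data.items()}

# (project, fuel): (consumption, emission factor in tCO2-e per unit)
prev = {("Bruce Hwy", "diesel"): (330_000, 0.0027),        # litres
        ("Head office", "natural gas"): (7_000, 0.052)}    # GJ
curr = {("Bruce Hwy", "diesel"): (445_000, 0.0027),
        ("Head office", "natural gas"): (8_600, 0.052)}

e_prev, e_curr = line_emissions(prev), line_emissions(curr)
deltas = {k: e_curr[k] - e_prev[k] for k in e_curr}

# Largest absolute contributors first
for key, delta in sorted(deltas.items(), key=lambda kv: -abs(kv[1])):
    print(key, f"{delta:+.1f} tCO2-e")
```

Ranking by absolute rather than percentage change matters: a 200% jump on a tiny line item is usually less important to a board than a 10% move on the largest one.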
That last point matters more than most people realise. When the NGA Factors update annually (the 2025 edition introduced new hydrogen combustion factors and updated state-based grid factors), every company's emissions change even if their operations didn't. If your Scope 2 dropped 3% and you don't know whether that's because you used less electricity or because the grid got cleaner, you can't take credit for a reduction you didn't earn. And if you do claim it in a sustainability report, you're in ACCC greenwashing territory. The ACCC secured an $8.25 million penalty against a consumer goods company in 2025 for misleading environmental claims on product packaging. Accuracy isn't optional.
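Separating the two effects is a standard variance split: since emissions are consumption times factor, the total change decomposes exactly into a consumption effect (volume change priced at the old factor) and a factor effect (factor change applied to the new volume). Here is that split applied to the NSW grid example above; the kWh figures are hypothetical.

```python
# Split an emissions change into consumption effect vs factor effect.
# Grid factor fell from 0.66 to 0.64 kg CO2-e/kWh (per the example);
# the electricity consumption figures are hypothetical.

c1, f1 = 1_200_000, 0.66   # prior period: kWh, kg CO2-e per kWh
c2, f2 = 1_190_000, 0.64   # current period

delta_total = c2 * f2 - c1 * f1        # total change, kg CO2-e
consumption_effect = (c2 - c1) * f1    # volume change at the old factor
factor_effect = c2 * (f2 - f1)         # factor change on the new volume

# The two effects sum exactly to the total change (up to float rounding)
assert round(consumption_effect + factor_effect, 6) == round(delta_total, 6)

print(f"Total: {delta_total/1000:+.1f} t, "
      f"consumption: {consumption_effect/1000:+.1f} t, "
      f"factor: {factor_effect/1000:+.1f} t")
```

In this hypothetical, emissions fall 30.4 t but only 6.6 t of that comes from using less electricity; the other 23.8 t is the grid getting cleaner. That is exactly the reduction you cannot claim credit for.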
The "I'll get back to you" tax
We don't have a precise industry benchmark for how long manual variance analysis takes. But we can estimate from the components. Pulling data from multiple sources: half a day. Reconciling billing periods and units: two to four hours. Building a comparison in Excel: another half day. Checking for anomalies and duplicates: a few hours if you're thorough. Writing up the explanation: an hour or two. Total: roughly two full working days for a moderately complex organisation.
Do that quarterly and you've lost eight working days per year on a task that should take minutes. For a sustainability manager earning $130,000 to $160,000 (a typical range for mid-senior roles in Australian corporates), that's roughly $4,000 to $5,000 in salary cost alone. But the real cost isn't the labour. It's the delay.
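The salary figure is a back-of-envelope calculation, and worth showing. Assuming roughly 250 working days per year and two days per quarterly cycle:

```python
# Back-of-envelope labour cost of manual variance analysis.
# Assumptions: 2 days per quarter, ~250 working days per year.
days_per_year = 2 * 4  # two days, four quarters

for salary in (130_000, 160_000):
    daily_rate = salary / 250
    cost = days_per_year * daily_rate
    print(f"${salary:,} salary: {days_per_year} days = ${cost:,.0f}")
```

That lands at roughly $4,200 to $5,100 per year, consistent with the range above, and it excludes on-costs and the opportunity cost of the delay.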
A board that gets variance explanations two days after asking for them makes slower decisions than one that gets them in real time. And under AASB S2, the quality of board oversight is now a disclosure requirement. If your governance disclosure says the board receives quarterly climate updates, but in practice the sustainability team needs a week to compile the data, that gap between what you disclose and how you operate becomes an assurance risk.
The Clean Energy Regulator's enforceable undertaking against an ASX-listed energy company in July 2025 is a useful example. The company inadvertently misstated components of its NGER reports across multiple reporting periods. The outcome: three years of mandatory reasonable assurance audits and an external consultant to rebuild their data control systems. The regulator didn't allege fraud. The problem was bad data processes. Manual variance analysis, where nobody catches the duplicate or the unit error until the annual report, is exactly the kind of process weakness that leads to these outcomes.
What we're still working on
We should be honest about what's hard here. Variance decomposition for Scope 1 and 2 is relatively straightforward because the data is mostly utility bills and fuel records that we control. You know the consumption, you know the factor, you can compare them period over period.
Scope 3 is messier. If your purchased goods emissions went up 20%, decomposing that into "which suppliers, which materials, which emission factors changed" requires supplier-level data that many organisations simply don't have yet. We're building toward this, but we won't pretend it's a solved problem. Scope 3 variance analysis depends on data quality that most supply chains can't deliver today. We're getting better at it. So is the rest of the industry. But it's not there yet.
For Scope 1 and 2, though, there's no reason a sustainability manager should spend two days answering a question whose answer the data already contains. The consumption figures are in the system. The emission factors are in the system. The project allocations are in the system. The anomaly flags are in the system. The only thing missing is software that connects those pieces and presents the answer in plain language.
The 30-second answer
The next time your board asks why emissions went up, you should be able to answer in the meeting. Not after it. Not two days later. Right there, with project-level detail, material-level breakdowns, factor change impacts, and data quality confirmation.
That's what Carbonly's variance analysis does. It decomposes period-over-period changes into their specific contributing factors so the sustainability manager becomes the person with the answer, not the person who needs to go find it.
If you're facing AASB S2 governance disclosures and your current process for explaining emissions changes involves Excel and two days of detective work, reach out at hello@carbonly.ai. We'll show you what the 30-second version looks like.