
Reflections on GRESB’s Data Quality Initiative

ESG reporting has come a long way in the commercial real estate industry. Perhaps the best proxy for progress is the Global Real Estate Sustainability Benchmark (GRESB), which now boasts over USD 3 trillion of asset value participating in its annual survey. That amounts to the majority of institutional owners with more than USD 1 billion in assets in the Americas and EMEA, and participation in APAC is swiftly gaining parity. This should be celebrated as a triumph of transparency and acknowledged as a major step forward for the practice of sustainability in any asset class.

“But GRESB is a mechanism for ESG reporting, not a driver in and of itself. The drivers are investors, occupiers, and regulators, in that order.”

But GRESB is a mechanism for ESG reporting, not a driver in and of itself. The drivers are investors, occupiers, and regulators, in that order. And each of these groups is asking more frequent and more discerning questions about the ESG claims made by asset owners. These questions are zeroing in on data quality.

The questions become especially fraught when we try to understand data quality at a level as abstracted as a real estate fund or portfolio, which is GRESB’s domain and where most institutional real estate investors operate. Is reported data reflective of the owner’s total financial stake or operational control in all underlying assets? Is utility data coverage representative of the whole building or common areas only? Does CapEx on sustainability projects correspond to all building area(s) affected?

These technicalities matter because they’re where GRESB performance scores and many other fund-level metrics originate. Perversely, it’s also the level at which the truth is most easily muddied.

The tension between accuracy and scale, especially in an environment as dynamic as real estate, where buildings are bought, retrofitted, leased, refinanced, and sold with regularity, leads to the ultimate question: can ESG data be trusted to the same degree as traditional financial data? Unfortunately, the answer from most institutional real estate investors is “not yet”.

Turning that response into an unequivocal and emphatic “yes” is the raison d’être of GRESB’s data quality initiative, and the rationale behind the data quality standard it may yield.

“While it’s difficult to dispute the need for improved data quality, it’s much harder to see what a standard might look like. For a glimpse, we might start by looking at what already exists.”

You may be surprised to find that measures of ESG data quality already exist and are regularly applied. CDP, for example, calculates an “Uncertainty Statistic,” a measure of the ratio of actual versus estimated data in a given report, which gives investors a sense of how much confidence they can place in the claims contained therein. While GRESB doesn’t have the concept of “uncertainty,” it does run an array of checks before and after reports are submitted, including extensive tests for unrealistic utility consumption relative to floor area and asset type, known as “intensity checks.”
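To make those two concepts concrete, here is a minimal Python sketch of an uncertainty ratio and an intensity check. Neither CDP’s exact formula nor GRESB’s validation thresholds are published in this form, so the function names, intensity ranges, and cutoffs below are hypothetical illustrations, not either organization’s methodology.

```python
# Illustrative only: the exact CDP formula and GRESB thresholds are not
# public in this form, so the ranges and names below are assumptions.

def uncertainty_statistic(actual_kwh: float, estimated_kwh: float) -> float:
    """Share of reported consumption that is estimated rather than metered.

    0.0 means fully metered (high confidence); 1.0 means fully estimated.
    """
    total = actual_kwh + estimated_kwh
    return estimated_kwh / total if total else 1.0

# Hypothetical plausible energy-use-intensity ranges (kWh per m2 per year).
PLAUSIBLE_EUI = {
    "office": (50.0, 400.0),
    "retail": (80.0, 600.0),
    "industrial": (20.0, 300.0),
}

def intensity_check(asset_type: str, annual_kwh: float, floor_area_m2: float) -> bool:
    """Flag consumption that is implausible for the floor area and asset type."""
    low, high = PLAUSIBLE_EUI[asset_type]
    return low <= annual_kwh / floor_area_m2 <= high

print(uncertainty_statistic(actual_kwh=900_000, estimated_kwh=100_000))      # 0.1
print(intensity_check("office", annual_kwh=1_000_000, floor_area_m2=5_000))  # True (EUI = 200)
```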

These and many other approaches, from codified standards like ISO to newer techniques within the universe of AI, are examples of available methodologies for addressing data quality. Whatever tests emerge, imagine enforcing them uniformly across every single reported data point, then pushing them down to the asset (or even utility-meter) level. The result might just be a regime trustworthy enough for even the most skeptical investor, and for financial markets broadly.

I do not mean to suggest that goal is necessarily the consensus view of GRESB or the Technical Working Group – it is Measurabl’s. The GRESB Data Quality Technical Working Group has a lot of work in front of it, and it’s too early to say exactly what the Standard might look like. It’s also far too early to say how GRESB may implement it.

In the meantime, the process unfolds: over 30 entities from every corner of the business – prestigious owners like Alexandria REIT, elite service providers like CBRE, and software companies like Measurabl – are engaged in a multi-month, fully documented exercise to deliver something that moves the industry closer to a level of quality that satisfies investors and unlocks the industry’s ability to extract maximum value from ESG in its day-to-day activities.

“GRESB is focused on identifying four ‘intrinsic characteristics’ of data quality.”

What we know after the first meeting of the Technical Working Group on Dec 11 is that GRESB is focused on identifying four “intrinsic characteristics” of data quality. Here’s what they are and the general concept behind each (a rough sketch of how they might be tested follows the list):

  1. Timeliness: The more recent an indicator, the more likely it is to reflect the current state of the asset or fund. Conversely, the older the data, the harder it is for a reasonable investor or lender to trust, since it may no longer reflect the asset or fund under scrutiny.
  2. Accuracy: Statements that appear too good to be true or are out of step with industry expectations should be detectable, flagged, and explained or corrected. There’s enough information available to know reasonable data coverage rates for a given asset class, for example. Using that understanding, accuracy could be defined as falling within some number of standard deviations of the expected mean.
  3. Completeness: Are an asset’s total floorspace and sub-type(s) defined? Are all meter readings from the time period across all floor area(s) accounted for? Gaps in building definition or meter readings are like words omitted from a sentence – drop too many and it’s not at all clear what type of asset we’re evaluating or what meter readings are normal, at which point benchmarking becomes useless.
  4. Lineage: Are the original data point and any subsequent transmissions or transformations documented? If we can’t say where the data came from, nor account for how it has mutated over time, why should we trust the end claim?
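As a thought experiment, and emphatically not the Working Group’s methodology, those four characteristics could be expressed as per-asset tests along these lines. The record fields, thresholds, and scoring below are all assumptions made for illustration.

```python
# Hypothetical per-asset tests for the four characteristics. Field names,
# thresholds, and pass/fail logic are assumptions, not the GRESB standard.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AssetRecord:
    reading_date: date       # most recent meter reading
    reported_eui: float      # this asset's reported intensity (kWh/m2/yr)
    peer_mean_eui: float     # benchmark mean for the asset class
    peer_std_eui: float      # benchmark standard deviation
    months_with_data: int    # meter coverage over the reporting year
    provenance: list = field(default_factory=list)  # documented transformations

def timely(rec: AssetRecord, as_of: date, max_age_days: int = 365) -> bool:
    """Timeliness: is the most recent reading within the allowed age?"""
    return (as_of - rec.reading_date).days <= max_age_days

def accurate(rec: AssetRecord, max_z: float = 2.0) -> bool:
    """Accuracy: flag claims more than max_z standard deviations from peers."""
    return abs(rec.reported_eui - rec.peer_mean_eui) / rec.peer_std_eui <= max_z

def complete(rec: AssetRecord) -> bool:
    """Completeness: every month of the reporting year has a reading."""
    return rec.months_with_data == 12

def lineage_documented(rec: AssetRecord) -> bool:
    """Lineage: at least one documented step from source to claim."""
    return len(rec.provenance) > 0

rec = AssetRecord(reading_date=date(2019, 10, 1), reported_eui=120.0,
                  peer_mean_eui=180.0, peer_std_eui=40.0,
                  months_with_data=12, provenance=["utility bill", "kBtu->kWh"])
print(timely(rec, as_of=date(2020, 1, 1)), accurate(rec),
      complete(rec), lineage_documented(rec))  # True True True True
```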

Irrespective of whether we define data quality along these four vectors of timeliness, accuracy, completeness, and lineage, or along others, the question becomes “how do we know when data is timely?” or “how do we know what data is accurate?” Is there a threshold to meet or exceed? Or is there instead a range of acceptable results? Are these thresholds or ranges absolute, or peer-relative?

Irrespective of what these “tests” are, what they represent is extraordinary: a way to objectively determine the reliability of real estate ESG data. I expect attention to shift to these critical questions of how to measure data quality as soon as the characteristics that define it are cemented by the Technical Working Group and passed on to the GRESB Benchmarking Committee and other governance structures for consideration.

I commonly hear real estate experts ask “what’s next?” when it comes to sustainability. “Resilience” or “health and wellbeing” are routinely proffered. But I suspect it will be something far more basic than any new frontier of sustainability. Instead, it will be a way to know just how reliable the data we’ve been reporting actually is.
