The Real-Estate Density Debate: How Can Technology Help?
A density matrix is an approach used by developers to estimate the number of housing units that can fit on a given area of land, based on the site context, its accessibility, or the average their company typically achieves. For example, the Sustainable Residential Quality (SRQ) density matrix in the current London Plan (2019) gives a density range based on a site’s Public Transport Accessibility Level (PTAL) and its urban setting categorisation. The speed and ease of this method make it easy to adopt, but is this broad-brush approach holding us back from truly maximising sites to deliver homes? Or is it channelling investment into sites whose apparent capacity is not achievable once the site context and constraints are taken into account?
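To make the mechanism concrete, a matrix of this kind is essentially a lookup table keyed on setting and PTAL band, multiplied by site area. The sketch below illustrates the idea only: the bands, categories and units-per-hectare figures are placeholder values, not the actual numbers from the London Plan.

```python
# Illustrative density-matrix lookup, in the spirit of the SRQ matrix.
# All ranges below are placeholder values, NOT the London Plan figures.
DENSITY_MATRIX = {
    # (setting, PTAL band): (min units/ha, max units/ha)
    ("suburban", "0-1"): (35, 75),
    ("suburban", "2-3"): (45, 130),
    ("urban", "2-3"): (70, 170),
    ("urban", "4-6"): (100, 260),
    ("central", "4-6"): (140, 405),
}

def estimate_units(site_area_ha, setting, ptal_band):
    """Return the (min, max) unit count the matrix implies for a site."""
    lo, hi = DENSITY_MATRIX[(setting, ptal_band)]
    return round(lo * site_area_ha), round(hi * site_area_ha)

# A 0.5 ha urban site with PTAL 4-6:
print(estimate_units(0.5, "urban", "4-6"))  # (50, 130)
```

The appeal is obvious: two categorical inputs and a site area produce a defensible-looking range in seconds, which is exactly why the approach is so widely adopted despite its bluntness.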
The current form of the London Plan density matrix has come under frequent scrutiny since its introduction 15 years ago. The plan states that the matrix should not be applied mechanically; however, according to the GLA’s review of the London Plan, these numbers have come to dominate the policy approach. You can see why: the mere existence of the ranges gives objectors to a scheme the basis of an argument, and sets a benchmark for what a site is expected to achieve without considering other factors. Yet another finding of the review was that only 35% of developments fell within the benchmarked range for their location, with 50% exceeding the maximum.
So what other basis can developers use to understand the viability of potential plots of land? The matrix approach offers speed but lacks the accuracy required, especially on more complex parcels of land. A site feasibility study carried out by an architect provides more accuracy, but takes more time and money, so how can companies ensure that investment goes to the sites with the most potential? This poses the question of whether such analysis is something developers should be exploring themselves, or whether the responsibility should lie with landowners and planning authorities to understand their own portfolios.
In our recent mass land viability study for TfL, SiteSolve was used to provide something in the middle: a high-level study of the potential of more than 2,000 plots of land. The generative algorithm takes context and geometric constraints into account when calculating the number of units that could fit on a site, and in many cases produced figures significantly different from the policy-based density result. For approximately 50% of the sites it calculated significantly fewer units, because complex boundary geometry, restricted access or other constraints pushed up the cost of building out the development; for around 30% of the sites it calculated significantly more units once the heights of the surrounding buildings were taken into account. This process does not remove the need for good design; it simply unlocks the sites with the most potential.
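A toy model can show why a geometry-aware figure diverges from a flat policy density in both directions: constraints shrink the developable footprint, while generous context heights allow more storeys. This is a simplified illustration only, not SiteSolve's actual algorithm, and every parameter name and value here is an assumption for the example.

```python
# Toy comparison of a flat policy-density estimate against a
# constraint-aware estimate. Not SiteSolve's algorithm; all values
# are illustrative assumptions.

def policy_estimate(site_area_ha, policy_density_uha):
    """Units implied by applying a single policy density to the whole site."""
    return policy_density_uha * site_area_ha

def constrained_estimate(site_area_ha, constrained_fraction,
                         context_storeys, units_per_storey_per_ha):
    """Units once unusable land is removed and storeys follow context."""
    developable_ha = site_area_ha * (1 - constrained_fraction)
    return developable_ha * context_storeys * units_per_storey_per_ha

# 0.5 ha site at an assumed policy density of 180 units/ha:
print(policy_estimate(0.5, 180))  # 90.0

# Awkward plot: 60% unusable (boundary geometry, access), 4 storeys:
print(constrained_estimate(0.5, 0.6, 4, 40))  # 32.0

# Favourable context: only 20% unusable, surroundings support 8 storeys:
print(constrained_estimate(0.5, 0.2, 8, 40))  # 128.0
```

The same 0.5 ha site swings well below or well above the policy figure depending on its geometry and context, which mirrors the pattern the TfL study found across the portfolio.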
Technology can also be used to test sites further. New scheme-evaluation tools enable analysis to be carried out faster and earlier in the design process, pushing designs further and creating the evidence base for design choices and their evolution.
This capability will be paramount in the coming years, as the density matrix in the new Draft London Plan has been removed and replaced by Policy D6 (Optimising Housing Density), which calls for a more qualitative, design-led approach, with emphasis on scrutinising the design of higher-density developments. This change has sparked debates of its own about how to protect against both over- and under-development. It is mirrored nationally by the announcement, in the draft National Planning Policy Framework, that councils may be given greater powers to refuse sub-optimal development. What is clear is that a balance needs to be struck between densifying areas to create the homes we need and delivering well-designed, liveable places that fit within their context. The challenge we face is ensuring, and proving, that schemes do both, and we believe technology such as SiteSolve can be a way to facilitate that.