The Inman Connect show in NYC this past January included a panel focused on using data to understand and forecast the NYC housing market. Sitting on the panel were two chief economists, a provider of active listings metrics, and yours truly. The room was packed, and Brian Boero of 1000Watt Consulting led us through the presentations. We had each been asked to produce a single slide illustrating how our company looks at the marketplace.

The Panelists
· Mark Fleming – Chief Economist, First American CoreLogic
· Mike Simonsen – Founder, Altos Research
· Stan Humphries – Chief Economist, Zillow.com
· And me – CIO at Onboard, Folklore and Mythology BA and Olympic Curling enthusiast
I’m not an economist.
I am a data junkie and trend analysis enthusiast.
Stepping back, I considered what content the other panelists were likely to focus on. All of us have access to essentially the same types of content, including:
· Public records
· Listings records
· Search metrics
I surmised that Mark and Stan would likely focus on what I consider "near past actuals": sold data trends (volume, price) and perhaps listing volumes and days-on-market information, either as a snapshot or trended over time. This type of content paints an accurate picture of what has just happened in the marketplace, generally reflecting buyer activity on prices agreed sixty to ninety days earlier (the typical interval between agreeing on a price and the transfer and recording of the transaction). The trend lines created are typically projected into the future as a predictive tool.

I also figured Mike would look at listing activity trends such as list prices, price reductions, days on market and volumes. Listing activity is more of a forward-looking indicator, since it describes properties likely to transfer in the next sixty to ninety days, but it relies heavily on pricing data that reflects what the seller thinks – or perhaps would like – their property to sell for. We've seen periods where list and sale prices diverge hugely and other times where they track closely.
The analysis the panelists provided on top of this data was informative, timely and well understood by those familiar with typical housing statistics.
But I felt there were other ways to look at this…
Look at underlying factors, not end point results
At Onboard, we also look at all the numbers, statistics and trends we can find in sold data, tax basis, distressed property volumes, pricing trends, listing activity and construction data. The issue with using these as predictors of the marketplace is that they all represent past activity and are, in turn, the result of buyer/seller decisions, the availability of money and other underlying factors. The key to predicting the market is in accessing the drivers that shape what a buyer or seller will do, rather than looking only at what they already did.
We believe that understanding the underlying factors and then applying local market knowledge is a different and meaningful perspective that can, when combined with a hyperlocal analysis model, provide startling insight into why the market is behaving as it is and how it is likely to behave in the future. This insight must then be checked against actual local market activity (once it occurs) on a continuous basis.
During the initial financial turmoil two years ago, we were approached by a number of private and government concerns regarding how one can identify housing risk at a local level. With Onboard's hyperlocal modeling expertise and access to data, we were able to approach the problem from a number of directions.
Ultimately, we created a forward-looking housing distress index that provides comparative information between local markets. This allowed us to look at the health or deterioration of the underlying housing distress factors of any city, county or neighborhood and identify – relative to other parts of the country – how the area is likely to perform. This was a critical concern for anyone analyzing a portfolio of properties for either investment opportunity or relief direction, as it provides a basis for comparing area risk. Typically this local area risk is then considered against specific property risks (mortgage details, resident credit, etc.). We looked at a large number of data points over time and found that the following – in combination – provided a locally reliable evaluation method:
· Vacancy and occupancy data
· Employment statistics
· Household income
· Change in HPI (home price index) from highest value
· At-risk mortgage origination volumes
In each case, we considered the change in these values over time and the velocity of that change relative to the larger marketplace. The results were normalized to a 1–10 index, with low values signifying indicators of continuing distress and high values indicating little or no such indicators when compared to the national landscape. We found – within reason – that these values forecasted activity in the marketplace so long as both global market factors and local knowledge were applied on top of this analysis. Global factors might currently include the federal home-buying incentive and low interest rates. Local factors might include knowledge of new construction units soon to hit the market or a large factory closing.
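The post doesn't publish the model's math, but the general shape it describes – a change-plus-velocity signal per factor, rescaled to a 1–10 index relative to the peer group – might be sketched like this in Python. The example data, the equal weighting of change and velocity, and the linear rescale are all my assumptions, not Onboard's actual method:

```python
def factor_signal(series):
    """For one factor's time series (higher = healthier, e.g. occupancy rate),
    return total change over the period plus its velocity (change of the change)."""
    change = series[-1] - series[0]
    velocity = (series[-1] - series[-2]) - (series[1] - series[0])
    return change + velocity

def distress_index(area_series):
    """area_series: {area_name: [factor values over time]}.
    Maps each area to a 1-10 index relative to the group:
    1 = strongest distress signal, 10 = little or no distress signal."""
    signals = {area: factor_signal(s) for area, s in area_series.items()}
    lo, hi = min(signals.values()), max(signals.values())
    span = (hi - lo) or 1.0  # avoid division by zero if all areas are identical
    return {area: round(1 + 9 * (v - lo) / span, 1) for area, v in signals.items()}

# Three hypothetical areas tracked over four quarters:
idx = distress_index({
    "Declining":  [10, 9, 8, 6],    # falling, and falling faster
    "Flat":       [10, 10, 10, 10],
    "Improving":  [10, 10, 11, 12],
})
```

A real implementation would combine several factors (vacancy, employment, income, HPI decline, at-risk originations) before rescaling, and would normalize against the national landscape rather than just the areas in hand; this only illustrates the change-and-velocity idea.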
Over the past two years we’ve compared the results of this model statistically to three, six and nine month trailing indicators (sales volume and pricing, days on market, foreclosure volumes) and found a surprisingly tight correlation.
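Validating a leading index against trailing indicators amounts to a lagged correlation: line up the index at time t with the indicator observed three, six or nine months later and measure how tightly they move together. A minimal Pearson-correlation sketch of that check (the data and lag convention here are illustrative, not Onboard's actual backtest):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_correlation(index, indicator, lag):
    """Correlate the index at time t against the indicator at time t + lag,
    i.e. test whether the index leads the indicator by `lag` periods."""
    return pearson(index[:-lag], indicator[lag:])

# Hypothetical series where the indicator simply follows the index 3 periods later:
index_values = [1, 2, 3, 4, 5, 6, 7, 8]
sales_volume = [0, 0, 0, 2, 4, 6, 8, 10]
r = lagged_correlation(index_values, sales_volume, lag=3)  # ≈ 1.0 for this toy data
```

In practice one would run this per area and per indicator (sales volume, pricing, days on market, foreclosure volumes) at each of the three lags and look at the distribution of coefficients rather than a single number.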
If that explanation left you cross-eyed, take a look at the map image below. It represents the underlying Q2 and Q3 2009 factors, which we believe predict conditions for the current and near-term market in NYC. Compared with the previous forecast, the model predicted the uptick in sales and the narrowing of the gap between list price and sale price experienced in much of the market during the recent fourth quarter.
In this map, dark areas indicate a stronger market where houses are likely to hold their value through the sales cycle, inventory is not flooded, and the number of properties in distress relative to overall inventory is likely to remain low. Lighter areas show significant downward pressure on the market. Depending on where the local market sits in the correction cycle, this could mean significant foreclosure activity will continue, or simply that properties will be slow to move without some discount. It is at this point that local knowledge must be applied – something that Onboard believes the local broker and Realtor are uniquely positioned to do.
The level of detail here is to the neighborhood and block-group level – a very fine level of analysis made possible by applying Onboard's geography model to all the underlying data points, supported by specific spatial analysis techniques. The result is that one can see the market differences between Jamaica, Queens and the neighborhoods that border it.
What we don’t know
This model appears to work well now and for this type of volatile marketplace. That same volatility, in our opinion, makes some traditional analysis methods (Case-Shiller, etc.) less reliable. When the market stabilizes, or during a period of rapid price increases, it is unclear whether this model will continue to offer value as a predictive tool. It is likely that we will find additional underlying factors that need to be considered during an up market.
In the meantime, it's fun to look at…
Image Credit: Wikipedia