Big Data, Statistics, and the Appraiser
One of the most popular articles on my blog provides instructions for performing simple linear regression in Excel to justify adjustments to comparable sales. Without question, big data is here, and there is a hunger in the appraisal world to learn more about statistics. Though I only accept commercial assignments, I've been loosely following developments in the residential appraisal world, particularly those related to Collateral Underwriter. My understanding, based on the little I've read, is that FNMA will use an algorithm to identify adjustments that deviate from the norm.
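To make the idea concrete, here is a minimal sketch (in Python rather than Excel, and using entirely hypothetical sales data) of how simple linear regression can indicate a per-square-foot adjustment between comparables. The function name and the figures below are illustrative assumptions, not taken from the original blog post.

```python
def simple_linear_regression(xs, ys):
    """Ordinary least squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx            # slope: indicated price change per extra square foot
    a = mean_y - b * mean_x  # intercept
    return a, b

# Hypothetical comparable sales: (living area in sq ft, sale price in $)
comps = [(1400, 210000), (1550, 225000), (1700, 238000),
         (1850, 252000), (2000, 265000)]
areas = [c[0] for c in comps]
prices = [c[1] for c in comps]

intercept, slope = simple_linear_regression(areas, prices)

# The slope serves as the indicated $/sq-ft adjustment; e.g. for a
# 150 sq-ft difference between a comparable and the subject:
adjustment = slope * 150
print(f"indicated adjustment: ${adjustment:,.0f} for 150 sq ft")
```

The same slope is what Excel's SLOPE() function, or the trendline on a scatter chart, would report for this data. The point of the post, of course, is that such an indication is a starting point for the appraiser's judgment, not a substitute for it.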
Fair enough. I’ve longed for the day when the financial world would recognize the value of statistics as a tool to evaluate property. But I have also long argued that the process of appraising real estate cannot be reduced to a science. I demonstrated in another blog post the limitations of regression analysis, despite its usefulness in recognizing trends and tendencies.
I am now reading articles by ignorant authors who are critical of ANY adjustment not supported by quantifiable market evidence. Let me assure you that the appraisal process did, does, and will ALWAYS rest squarely on an appraiser's sound judgment over quantifiable techniques. Try providing a quantifiable adjustment for the location of a television transmission tower, and I think you'll understand. For many, many years, the federal government and quasi-governmental institutions such as FNMA ignored useful tools such as regression, and we all saw (and felt) the tragic effects in 2008.
Until very recently, I considered the Collateral Underwriter a tempest in a teapot. But I'm afraid that the pendulum may now swing in the opposite direction. Now that the government has FINALLY discovered statistics, it may conclude that only quantifiable adjustments are valid and that there's no room for sound judgment by the appraiser. Let us hope and pray that I'm wrong.