Marketing Darwinism - by Paul Dunay
  • Home
  • Bio
  • Books
  • Press
  • Speaking
  • Webinars
  • Videos
  • Podcasts
  • Photos
  • Awards
  • Abstracts
  • Testimonials
Artificial Intelligence

What does Computer Vision have to do with the Price of a House?

A guest post by: Romi Mahajan, CMO Quantarium

Residential real estate – people's homes – is the world's largest asset class, tipping the scales at almost $200 trillion worldwide. This number is staggering to many, including those in the housing industry. Larger even than the sums involved are the emotions: a family's residence is likely its largest investment and one from which so many other life factors radiate – who your neighbors are, what schools your kids attend, whether you are safe, how close you are to good medical care, and so on. Insofar as this is true, the housing sector can never be given too much attention by Economists, Sociologists, and even Technologists. Still, in many ways the sector has been given short shrift.

Consider a matter at the heart of the industry: the value of a particular house. What appears to be a simple question with a simple answer is not. Sure, one can look at the basics – how big the house is, the year it was built, comparable houses in the neighborhood, and so on. One can even attempt to factor in other variables – school district, crime statistics, proximity to the beach, and a host of other things. When all is said and done, though, these factors are "external" and in many ways "non-specific."

Let’s pause for a moment.  While these factors are indeed external, we have to ask ourselves a basic question – how do I get specific?  How do I assess the value of a particular house, looking beyond these basic factors and in the process taking into account the condition of the house and the nature of its interior landscape?

For most of us this is an obvious question. After all, if you put a lot of money into modernizing or refurbishing a house, you would expect its value to rise, even if your work and effort are not reflected in the external statistics used for valuation. If, on the other hand, you paid no attention to the house and allowed it to atrophy, you would expect its value to diminish.

This issue is often “solved” by Appraisers, who theoretically take into consideration all of these interior and condition-based factors when assessing the value of a house.

Now, we enter a world fraught with problems.

For the purposes of this short piece, we won't get into debates about the objectivity of Appraisers, or about the talent shortage that is delaying closings in many large markets (certainly in the US). Those issues are fertile ground for discussion elsewhere.

No, the main issues we intend to dissect here are scale, speed, and customer experience.

In the US, there are over 100 million residential units. Now imagine you work at a bank or another institution that originates and/or "owns" millions of mortgages, and you want to determine the value of your portfolio in toto. Imagine, further, that you need to do so every month. After all, you need to keep track of your assets, decide which houses to keep and which to sell, and assess the risk of holding these mortgages. The issues of scale and cost are enormous. You certainly can't send an appraiser to each house.

Imagine a different scenario. A consumer lives in a city with a very fast market and needs to decide on the spot whether to buy a house. Waiting even a few hours, not to mention days, can mean losing the house. In this cauldron, determining the true "value" of a home has to be done instantaneously. Here, speed is paramount.

Finally, imagine you are a real estate agent with a demanding (and rightfully so) customer who wants to buy a house. You have visited 10 houses to determine fit and have been disappointed by their dilapidated interiors. You are not paid for your time, only for results. If only there were a way to determine condition – and condition-based value – that was easy for the customer (in this case, you).

Enter technology, specifically AI and its offshoot, Computer Vision. Artificial Intelligence yields a potent set of tools for real estate, starting with valuations. First of all, AI is "better at the basics" than non-AI methodologies. Producing even a basic valuation of 100+ million properties every month is not trivial; with AI, the entire US footprint can be run in hours, not weeks. The idea is simple: computers can learn from data sets of a critical mass, then keep improving their outputs as more data comes in. Machine learning is just that: machines that actually "learn" and can thus offer results and outputs that are neither obvious nor simply the product of brute-force methods. AI can therefore help with the scale and speed components.
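To make the "learn from data, then predict at scale" idea concrete, here is a deliberately tiny sketch of an automated valuation model: a one-feature linear regression fit to hypothetical comparable sales. The figures are illustrative assumptions, not data from any real model; production AVMs use many more features and far larger training sets.

```python
# Toy AVM sketch: fit price = a + b * sqft by ordinary least squares
# on a handful of hypothetical comparable sales, then predict the
# value of an unseen house.  Illustrative only.

def fit_price_per_sqft(sqft, prices):
    """Ordinary least squares for a single feature (square footage)."""
    n = len(sqft)
    mean_x = sum(sqft) / n
    mean_y = sum(prices) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(sqft, prices)) \
        / sum((x - mean_x) ** 2 for x in sqft)
    a = mean_y - b * mean_x
    return a, b

def predict(a, b, sqft):
    """Predicted price for a house of the given size."""
    return a + b * sqft

# Hypothetical comparable sales: (square feet, sale price)
comps_sqft = [1200, 1500, 1800, 2100, 2400]
comps_price = [240_000, 300_000, 360_000, 420_000, 480_000]

a, b = fit_price_per_sqft(comps_sqft, comps_price)
print(round(predict(a, b, 2000)))  # → 400000
```

Because fitting is cheap and prediction is a single arithmetic expression, running a model like this (suitably enriched with more features) over 100+ million properties is an embarrassingly parallel batch job, which is what makes monthly portfolio-wide revaluation feasible.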

Computer Vision comes in here in a delightful way. House listings often come with a multitude of pictures. Computer Vision can analyze and categorize these pictures – with speed and fidelity – thereby assigning "condition" scores to kitchens, bathrooms, and other hotspots in the house. In this way, it can help offer a "condition-adjusted" value.
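The aggregation step described above can be sketched as follows. Here `score_photo` is a hypothetical stand-in for a real image model (e.g. a trained CNN classifier that labels the room and rates its condition); the point of the sketch is only how per-photo outputs roll up into per-room condition scores for a listing.

```python
# Roll per-photo computer-vision output up into room-level condition
# scores for one listing.  score_photo() is a placeholder stub: a real
# system would run an image model here.

from collections import defaultdict
from statistics import mean

def score_photo(photo):
    # Placeholder for an image model; assumed to return
    # (room_type, condition score on a 1-5 scale).
    return photo["room"], photo["condition"]

def condition_scores(listing_photos):
    """Average per-room condition across all photos of a listing."""
    by_room = defaultdict(list)
    for photo in listing_photos:
        room, score = score_photo(photo)
        by_room[room].append(score)
    return {room: mean(scores) for room, scores in by_room.items()}

photos = [
    {"room": "kitchen", "condition": 4},
    {"room": "kitchen", "condition": 5},
    {"room": "bathroom", "condition": 2},
]
print(condition_scores(photos))  # → {'kitchen': 4.5, 'bathroom': 2}
```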

Put all of this together and you get a powerful mix. Automated Valuation Models (AVMs) powered by AI can provide accurate valuations at scale and with enormous breadth. Add condition adjustment, powered by Computer Vision, and you start to see technology giving its due to the vexing problems and incredible opportunities in the real estate industry.
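One hedged way to picture the combination: scale a baseline AVM estimate by how far the overall condition score deviates from average. The 3.0 midpoint and the 10%-per-point adjustment below are illustrative assumptions, not figures from any production model.

```python
# Condition-adjusted valuation sketch: adjust a baseline AVM value
# up or down according to an overall condition score (1-5 scale).
# The midpoint and percentage are illustrative assumptions.

def condition_adjusted_value(base_value, condition,
                             midpoint=3.0, pct_per_point=0.10):
    """Scale a baseline valuation by deviation from average condition."""
    return base_value * (1 + (condition - midpoint) * pct_per_point)

# A house the basic model values at $400,000:
print(round(condition_adjusted_value(400_000, 4.5)))  # renovated → 460000
print(round(condition_adjusted_value(400_000, 1.5)))  # dilapidated → 340000
```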

May 2, 2021 by Paul Dunay
Applications, Big Data, Customer Experience, Data, Data Analytics, Data Mining, Innovation

Getting Towards a Mature Data Infrastructure

Data is the watchword in organizations large and small. In fact, how an organization frames data is the single most important determinant of future success or failure. As some put it, data is the new "oil" – the most valuable commodity of the modern age.

Many business leaders understand this intuitively. As business users in the organization are forced to make a larger number of critical decisions, with larger "payloads," on a more frequent basis, the idea that these decisions must be data-driven is at the fore. Gut instinct is fine, but gut instinct informed by timely, contextual, and comprehensive knowledge of the relevant data is a winning strategy.

While the idea of being "data-driven" is fundamental and powerful, most organizations fall short. Intentions are necessary but not sufficient. For most organizations, the technology and operational infrastructure that defines their "data" is predicated on notions that made sense in an earlier era, when there were simply fewer sources of data and less change to existing sources. The "size" of the data question creates a complexity that is not pre-defined, and therefore the solution to the data problem has to be flexible and adaptive. Data infrastructure maturity is necessary in today's business environment and has four basic qualities: Governance, Security, Agility, and Automation.

Without these four qualities, two core facets of the solution are absent: democratizing access to data, and liberating IT from the backlog and fatigue associated with constantly changing business needs. Business users work in the "now" timeframe, while IT has its own rhythms. To be truly data-driven in a way that scales, organizations must empower business users while simultaneously freeing IT to innovate. While there are cultural hurdles to this state, the biggest blockers are infrastructural.

Until very recently, good enough was, alas, good enough. The internecine conflict between Business and IT was considered a fact of life, a "cost of doing business." With automation technology, business users' data needs can be managed on the fly, without reactive hand-coding – conferring agility on business teams, handing time back to IT teams to innovate, and moving resources from lower-value tasks to higher-value ones. This structural win-win is available today and harmonizes the needs of Business and IT.

If data is the new oil, then an infrastructure to capitalize on it is necessary: an infrastructure that is mature and "hub"-like. While all organizations are different, they are similar in their data needs, and the data platforms that win will accommodate diversity and change inherently.

Guest post by:
Romi Mahajan
Chief Commercial Officer, TimeXtender

March 6, 2017 by Paul Dunay


Welcome to my blog. My name is Paul Dunay and I lead Red Hat's Financial Services Marketing team globally. I am also a Certified Professional Coach, author, and award-winning B2B marketing expert. Any views expressed are my own.
