Google takes real estate seriously

The people who poke and prod Google in the hopes of finding secrets hit a treasure trove of services in Google’s testing area today, and it looks like Google isn’t just dinking around with a crummy Google Base real estate listings mashup anymore. It shouldn’t be much of a surprise: with Paul Rademacher (founder of housingmaps.com) on board, a Google Base full of listings, and a great mapping service, Google is creating real estate search as a distinct service.

If Google real estate search uses the same technology as Froogle, you can expect to see a lot of Seattle-area homes listed for $150,000 with $300,000 in shipping costs shown to you after you try to buy it.

In other news, Trulia is now letting you post their listings on your site. They say it’s for agents and brokers, but do agents and brokers really want to steer people away from their web sites? If a visitor clicks on More details… they are whisked to the listing agent’s website. I predict that it will mostly be used by bloggers and non-real estate people.

Time Travel Transportation Maps

Not like Back To The Future, but pretty cool nonetheless. These transportation maps from the UK are super-sweet – they show the travel time by bus or rail from a few cities in the UK. I imagine we’ll be able to generate maps like this on the fly to show travel time from wherever you are someday. [photopress:transporttion_maps.jpg,thumb,alignright]

Until then, you can always pick two spots and get the travel time via Bus Monster (Seattle) or Google Transit (Portland). I suspect other cities will get Google Transit service soon; a little bird told me that Google Transit is getting data from King County Metro.

(via boingboing.net)

The race for 2nd place has begun

OK, I’m biased and I still believe that “Zearch” is currently King of the Hill of King County home searches. However, I’m willing to give credit where credit is due and say the distance between us and the rest of the pack got smaller today.

[photopress:NewJohnLScott.jpg,full,alignright]

Today, John L Scott and their solution provider, Bellevue-based Real Tech, quietly introduced what they call “Real-Maps 2.0”. Essentially, they are now using Microsoft Virtual Earth instead of the old-school ESRI-based solution. Additionally, it appears they’ve AJAX-ified the search pane on their map page, so when you change search criteria it automatically updates the map and the matching results count (which is pretty slick). It also appears that Real Tech has gone all out: at first glance, they seem to be using the not-quite-released Microsoft Atlas framework (a new development tool that makes “Web 2.0” style applications easier to develop). It appears they are using JSON for the postbacks (most sites use XML; I currently send back Javascript source code). I haven’t spent much time reverse engineering it or learning Atlas yet, so it’s possible they are using a 3rd party AJAX framework. Regardless of the technical details, it raises the bar for everybody else.
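For the non-web-geeks, the JSON-vs-Javascript-source distinction is easier to see in code. Here’s a minimal sketch; the payload shape is invented for illustration, since I don’t actually know what Real-Maps sends over the wire:

```javascript
// Hypothetical AJAX postback payload a map-search page might return.
const jsonPayload =
  '{"matchCount": 42, "listings": [{"id": "26012345", "lat": 47.61, "lng": -122.33}]}';

// JSON style: the server sends pure data; the client parses it and then
// decides what to do with it.
const result = JSON.parse(jsonPayload);
console.log(result.matchCount); // 42

// "Javascript source" style: the server sends executable code that the
// client simply eval()s. Easier to generate server-side, but it couples
// the server to the page's function names.
let matchCount = 0;
function updateCount(n) { matchCount = n; }
eval('updateCount(42);');
console.log(matchCount); // 42
```

Same end result either way; the difference is whether the server sends data or behavior.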

So what does this mean? Here’s my thoughts….

  • Me – Time to install and learn Atlas this weekend. If I’m going to remain competitive with the big boys, I gotta be using the same tools the big boys are using. Besides, doing complex AJAX with ASP.net 2.0’s ICallbackEventHandler is a bit tedious for my liking.
  • Galen – Wondering if he should rewrite ShackPrices so it uses Ruby on Rails instead of PHP?
  • ESRI – Between Google Maps & Microsoft Virtual Earth, this company won’t be serving the real estate mapping market much longer.
  • RedFin – That Flash-based satellite map, though very cool in its day, is increasingly looking like a liability. Better update it, do as Zillow did (partner with GlobeXplorer & Microsoft), or let one of the big boys handle your maps. Any map in which the Issaquah Highlands looks like a polar bear eating vanilla ice cream during a snowstorm doesn’t cut it for me.
  • Zillow – Better do something cool with that MLS data you’ve been collecting. Otherwise, those eyeballs you were counting on will be visiting the big brokers instead. Fortunately for Zillow, they could lose the local battle but still win the national war. The NWMLS is releasing sold listing data in the near future, and I’ll be shocked if the local big brokers don’t add “Zestimate”-like features to their web sites in the next 6-12 months. Hell, Rain City Guide already has one, but you already knew we’re ahead of the curve. 😉
  • Realtor.com / HomeStore / Move – Obi-Wan “Dustin Luther” Kenobi – Are you their only hope? Do something! Add a Rain City RSS feed, if you have to! Anything! 🙂
  • Coldwell Banker Bain – Since they are also Real Tech customers, I suspect they’ll be asking for Real-Maps real soon now.
  • Windermere – They can’t be far behind their arch-rivals, or can they?
  • Other local brokers/agents – Time to re-evaluate your MLS search/IDX vendor? Now that John L Scott’s web site has entered the 21st century, the pressure is building for you to join them.
  • John Q Home Buyer in Seattle/Eastside – The new John L Scott is like RedFin, but with better maps & aerial photos.
  • Everybody else, elsewhere – Consumer expectations are slowly being raised. I believe Seattle is ground zero of Real Estate 2.0. Those of you lucky enough to be living outside of the 206 & 425 area codes (aka the war zone), had better pay attention, because what’s happening here will happen in your neck of the woods, sooner than you think.

So what do our fair Rain City Guide readers think of this development?

SELECT * FROM MLS WHERE Remarks = ‘Whoa’

I thought I’d take a moment to reflect on how Rain City’s favorite MLS Search is implemented. I’m a little tired of thinking in computer languages (mostly T-SQL, C# and Javascript), so I figured I’d blog a bit in Geek English for a little while before I hit the compiler again.

[photopress:matrix1_alt.jpg,full,alignright]

I’m always interested in how web sites & computer software work under the covers, so I thought I’d share some of the more interesting points about how I’ve implemented “Zearch” to date for the “geekier” folks in the blogosphere.

It all began way back in the fall of 2005 shortly after I got my first MLS feed. At the time, Microsoft’s asp.net 2.0 platform was still in beta. However, after learning what Microsoft’s next generation web development tools were going to do (and seeing what Google Maps and Microsoft’s Virtual Earth teams were doing), I saw a great unrealized potential in MLS search tools and decided to do something about it.

Anyway, it’s all built on top of asp.net 2.0 and MS SQL Server 2000 (yeah, I know I’m old school). One of the first things I did was combine all the property types into a VIEW and create a dynamic SQL query when you search for properties. Some search tools only let you search for residential properties or condominiums at one time (which I thought was lame). I originally tried to implement things with a bunch of UNIONs, but keeping track of the schema variations for the different property types eventually drove me nuts, so I encapsulated all that crud into a VIEW.
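My query builder lives in T-SQL, but the idea is language-agnostic. Here’s a rough sketch in Javascript of what “build one parameterized query over a unified VIEW from whatever criteria the user picked” looks like. The AllListings view and column names here are made up for illustration; they aren’t my actual schema:

```javascript
// Build a parameterized search query from user-selected criteria.
// Unset criteria simply don't generate a WHERE clause.
function buildSearchQuery(criteria) {
  const clauses = [];
  const params = [];
  if (criteria.minPrice != null) { clauses.push('price >= ?'); params.push(criteria.minPrice); }
  if (criteria.maxPrice != null) { clauses.push('price <= ?'); params.push(criteria.maxPrice); }
  if (criteria.propertyTypes && criteria.propertyTypes.length) {
    // One placeholder per property type, so residential + condo can be
    // searched in a single query against the unified view.
    clauses.push(`property_type IN (${criteria.propertyTypes.map(() => '?').join(', ')})`);
    params.push(...criteria.propertyTypes);
  }
  const where = clauses.length ? ' WHERE ' + clauses.join(' AND ') : '';
  return { sql: 'SELECT * FROM AllListings' + where, params };
}

const q = buildSearchQuery({ minPrice: 200000, maxPrice: 400000, propertyTypes: ['RESI', 'COND'] });
console.log(q.sql);
// SELECT * FROM AllListings WHERE price >= ? AND price <= ? AND property_type IN (?, ?)
```

The VIEW is what makes this sane: the dynamic part only has to know one schema, not one per property type.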

I also find it a little ironic that I’m not the only one who found the MLS schema differences a PITA to deal with. I’m glad the various MLS software vendors and the CRT are working toward a common industry schema (aka RETS), so we application developers can focus on the real problem (developing compelling & useful software), instead of remembering that the ld column in one table is really the list_date column in another.

Another interesting thing I do on the back end is geocode every listing after each data download. The main reason is that I don’t trust the MLS data, and their bogus geocoding would make my app look bad. I also knew when I started that I’d eventually do maps, so as soon as a new listing hits my database, it gets more accurately geocoded. In case you’re wondering if I’m screen scraping w/ Perl or something else, it’s all done with T-SQL stored procedures. (Well, technically it’s a proc that calls the MSXML2.ServerXMLHTTP COM object to issue an HTTP request against a geocoding web service, and then uses OPENXML on the response’s XML to get the latitude & longitude.)
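The real work happens inside a stored procedure, but the parsing step looks roughly like this when sketched in Javascript. The <Latitude>/<Longitude> element names are hypothetical, since every geocoding service has its own response shape:

```javascript
// Extract a lat/lng pair from a geocoding service's XML response.
// (Illustrative only: element names vary by geocoding provider.)
function parseLatLng(xml) {
  const lat = xml.match(/<Latitude>([-\d.]+)<\/Latitude>/);
  const lng = xml.match(/<Longitude>([-\d.]+)<\/Longitude>/);
  if (!lat || !lng) return null; // geocoder couldn't resolve the address
  return { lat: parseFloat(lat[1]), lng: parseFloat(lng[1]) };
}

const sampleResponse =
  '<GeocodeResult><Latitude>47.6097</Latitude><Longitude>-122.3331</Longitude></GeocodeResult>';
console.log(parseLatLng(sampleResponse)); // { lat: 47.6097, lng: -122.3331 }
```

Whether you do this with OPENXML in T-SQL or a parser in any other language, the flow is the same: request, parse, store the coordinates alongside the listing.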

As you might have guessed, there are also stored procedures and functions to get the distance between two points, do a radius search, and handle other stuff of that ilk. Fortunately, all that math can easily be found using your favorite search engine, so you don’t need to know how the law of cosines works (you just need to know of it).
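For the curious, the distance function and the radius search built on top of it look something like this. This is a Javascript sketch of the spherical law of cosines; my real versions are T-SQL functions:

```javascript
// Great-circle distance via the spherical law of cosines. Returns kilometers.
function distanceKm(lat1, lng1, lat2, lng2) {
  const toRad = (d) => d * Math.PI / 180;
  const R = 6371; // mean Earth radius in km
  return R * Math.acos(
    Math.sin(toRad(lat1)) * Math.sin(toRad(lat2)) +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.cos(toRad(lng2 - lng1))
  );
}

// A radius search is then just a filter over already-geocoded listings.
function withinRadius(listings, centerLat, centerLng, radiusKm) {
  return listings.filter(l => distanceKm(centerLat, centerLng, l.lat, l.lng) <= radiusKm);
}

// Seattle to Portland: about 234 km as the crow flies.
console.log(Math.round(distanceKm(47.6062, -122.3321, 45.5152, -122.6784)));
```

In a real database you’d push this into SQL so the filtering happens server-side instead of hauling every listing over the wire.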

Well, that’s it for the back end. Next time I’ll put on my Web Developer hat and talk about the front end.



RCG’s Zearch is Released!

Robbie has just released what may be the most addictive home search tool I have ever used!

Search Tool Codename: Zearch!

Some obvious highlights include:

  • Dynamic map of color-coded listings
  • Geocoded Rain City Posts

I know the search is addictive because earlier today I showed this to a friend who is in the market to buy a home in Seattle and we couldn’t pull ourselves away from bouncing around the map. To get an idea of what I mean, follow this link to the detail page of this listing in Ballard.

You should see a few things:

  • Photos of the listing
  • Lots of color dots
  • Raindrops

The color dots each represent a home currently on the market in the nearby area. Light blue dots mean the house is far below the average listing price, while dark red means it is far above the average listing for that area. The addicting part is that you can click on any of these dots to bring up the home details (and photos) for that home. With my friend sitting beside me, we kept searching for light blue dots amid lots of red hoping to find a “deal”. Very interesting stuff.
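If you’re curious how a dot might get its color, the idea is just bucketing each listing’s price against the nearby-area average. Here’s a hypothetical sketch; the actual thresholds Zearch uses are Robbie’s, and these cutoffs are invented for illustration:

```javascript
// Map a listing's price, relative to the area average, to a dot color.
// (Illustrative thresholds only; not Zearch's real cutoffs.)
function dotColor(listingPrice, areaAveragePrice) {
  const ratio = listingPrice / areaAveragePrice;
  if (ratio < 0.75) return 'lightblue'; // far below average (the "deal" dots)
  if (ratio < 0.95) return 'blue';
  if (ratio <= 1.05) return 'gray';     // right around the area average
  if (ratio <= 1.25) return 'red';
  return 'darkred';                     // far above average
}

console.log(dotColor(300000, 500000)); // lightblue
console.log(dotColor(800000, 500000)); // darkred
```

The fun of "hunting for light blue amid the red" is really just scanning for outliers on the low side of that ratio.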

You might also notice some raindrops on the map. These represent Rain City Guide blog posts that have been geocoded. This is subtle but very powerful, as it essentially gives Rain City Guide a map-based archive page. The cool part is that as you’re searching for background on a home, you can see what RCG posts have said about the neighborhood! And as we continue to add more neighborhood content on Rain City Guide, I’ll continue to geocode the posts, which will automatically add more background data to the home search tool…

[photopress:zearch_screenshot.jpg,full,centered]

What else has Robbie done?

For starters, he didn’t mess with the stuff that works well. You can still use the site to:

Some other things to notice about the new detail page: whenever you move around on the map, all the nearby active listings show up. More impressively, you can also toggle on the nearby schools, gas stations, grocery stores, and other points of interest associated with everyday living. Again, the color-coded pushpins show that homes in Medina are bright hues of red, many homes in Renton are purple, while most of the homes in this area of Tacoma are blue. So much cool stuff, so little time!

On a side note, today was my last day as a transportation engineer! For the next few weeks I’m unemployed! 🙂

Who is Trulia Serving?

It was wonderful to have some quality time with the Trulia folks last night, and I had a blast chatting about real estate search with them.

Sami and Kelly were great sports and the group ended up spending about an hour and a half discussing Trulia’s business plan and how others (like RCG!) might plug into their “platform”.

Seeing as how I posted the invite to the “chat” a few hours before the event, I didn’t expect too many people and was pleasantly surprised at the turn-out:

Interestingly, we dived into some questions I had about their business model pretty darn quickly. Because I would completely fail if I tried to articulate the opinions of each of the participants, I’ll only give my take on Trulia’s place in the future of real estate search (and welcome other participants to write up “meeting notes” if they are interested).

Most of my questions revolved around how Trulia planned to serve these four main groups (I’ve also included the ways they might serve these groups):

  • Brokerages: with (1) exclusive listings and/or (2) enhanced placement.
  • Agents: with (1) great branding opportunities and/or (2) tools to increase the agent’s internet “presence”.
  • Buyers: with (1) the most comprehensive listings, (2) the cleanest search interface, and/or (3) connections to the best real estate professionals.
  • Sellers: with (1) reasonably priced listings and/or the (2) widest possible exposure.

In talking with Trulia, it became clear that these guys have every intention of serving the brokerage community very well. They’ve opted to sign agreements with many of the largest real estate brokerage firms in order to get a live version of their feeds. In return, they’ve agreed to limit the type of listings they show (read: no FSBOs) and to send people to the listing broker’s website for detailed information on a home that is for sale. From what I can gather, they have definitely won over the largest brokerage firms and seem to be serving them well.

For agents, they offer some tools, like one that allows agents to put a Trulia search on their site and another that allows real estate sites to list the most recently added homes. While the tools are interesting, I do not see anything that would make most agents jump at the opportunity to work with Trulia, especially since the search results will send potential buyers to the listing broker’s website if they find a home they are interested in. While we discussed how a few forward-thinking brokers realize that they best serve their clients when they give them the maximum amount of information, I don’t see that logic prevailing in the industry any time soon.

[Trulia logo]
Example searches: “Palo Alto”, “94114” (but not Seattle yet!) 🙂

It is with buyers that I see the largest weaknesses in their business model. They have no plans to get a complete listing of “MLS” data (actually, it sounds like they may have completely ruled out the opportunity through contracts they’ve signed with large brokerages). Somewhere like NY, where there is no “one” MLS, this strategy allows them to include listing data from a majority of the brokers and will quite possibly allow them to have the largest database of listed homes. However, somewhere like Washington (and Oregon, California, and I think most of the country), this strategy means that they will never have the most complete database of homes (at least not under the current MLS system). Maybe I’m missing something, but I simply cannot see how this is going to win buyers over… Even an ugly search interface is better than a snazzy Trulia search if it includes a larger selection of homes that are on the market. In other words, if I’m interested in a home in Sunset Hill and there are three homes available in my price range, I want to see all three, not just the two that are represented by brokers who have signed agreements with Trulia.


So, that addresses the issue of the most comprehensive listings, but how do they fare on “(2) the cleanest search interface” and “(3) connections to the best real estate professionals”?

They definitely have the cleanest home search interface around, and considering they put a lot of effort into making their searches lightning fast, they definitely serve buyers’ best interests in this respect!

In terms of connecting buyers with the best real estate professionals, I think they are missing the boat. Sure, the listing broker’s website may have the most information available about a listing, but if their set-up encourages buyers to contact the listing agent (as opposed to a buyer’s agent), I do not think they are serving the buyer very well. To give an example, imagine if Google sent everyone who was looking for information on an iPod to Apple’s website (it is an Apple product, after all). Sure, Apple might be the best source of information on an iPod (assuming they share all the good and bad that they know), but is Apple’s website really the best place to research (let alone buy) an iPod? To bring this back to real estate: the best way they could serve buyers would be to hook them up with an agent who has their best interests in mind (as opposed to an agent who has the best interests of the seller in mind).

One of the weaknesses in my Apple example is that Trulia includes a vast amount of awesome stats directly on their website, so a buyer can find out information like comparable homes on the market without ever turning to the listing agent’s site for more detail on the listing itself. However, that doesn’t change the fact that they don’t plan to have the most comprehensive database of homes (at least not in areas with one functioning MLS), and they do not offer a way for a buyer to hook up with a good agent; rather, they guide them to the listing agent.

For sellers, they mentioned that they do not offer much. They are focused on working with brokerages, so the best they offer sellers is that if they use a real estate professional, they will indirectly benefit from the exposure on Trulia.

In reading over my text, I can’t help but notice just how negative I sound on Trulia. In reality, they are a great group of people and I’ve enjoyed EVERY interaction I’ve ever had with them. Along those lines, I decided to go back and remind myself of why I found Trulia so inspiring when they launched:

  • The search interface is as simple as entering a city name or a zip code! The UI is beautiful.
  • The filtering by other features like Price, Bedrooms, and Bathrooms is fast and very intuitive!
  • When clicking on more detail for a listing, you get VERY useful information like the price per square foot and the days on the market, as well as details for other recently sold homes and similar homes in the area!
  • The color coded recently sold homes is awesome!
  • I really like that the location of my search is stored in the URL. This allows me to easily save and/or send an area of interest. For example, here are the homes for sale in the part of Los Angeles where I grew up: http://www.trulia.com/CA/Eagle_Rock/90041/. (Also notice that it has neighborhood facts on this page.)
  • It has RSS feeds so that I can subscribe to my zip code and be updated each time a listing comes on the market.

All of these points are still valid and represent reasons that I still think Trulia is a fascinating and innovative company. My hope is that my comments in this post, especially when negative, will provide some food for thought to the people I’ve grown to really like over at Trulia. Their UI (user interface) is a beautiful thing and blows away the UI of any other home search tool that I’ve seen. It is Trulia beautiful thing! 😉

The Lame List – Real Estate Web Sites that Suck

In a recent post, Galen said “And no, it’s not what Windermere or ZipRealty already do: their sites s-u-c-k compared to true consumer-oriented sites like Amazon.com and Google.”

Now, comparing nearly any web site to Amazon or Google isn’t a fair comparison. Google & Amazon have 1) many of the best software engineers on the planet and 2) thousands of them working on their web sites. Microsoft (which is in the same league as Google & Amazon) is said to spend over $100 million/year on its corporate web site (and I’m sure they spend even more on MSN)!

The only real estate company that I can think of that could afford that level of R&D is Cendant (they own Century 21, Coldwell Banker, and ERA). Ironically enough they also own Orbitz & CheapTickets.com (who are Expedia competitors). The vast majority of brokers are probably smallish companies that under-invest in technology (and Cendant is probably happy enough with the status quo that they aren’t going to rock the boat until the waves of change force it upon them).

Now, I do think ZipRealty’s site is mediocre and Windermere’s site is average. But “suck” is way too strong a word. Could their sites be better? Yeah. But given that they aren’t billion dollar internet/software companies with multi-million dollar R&D budgets, I think the sites are OK. I could do better, but don’t mistake a mediocre site for one that sucks.

What I want you to do is tell me about the WORST agent & broker web sites out there. I only want to hear about the truly awful. Let me give you an example of how bad it can be.

Teri Herrera is a very successful agent at John L. Scott, with whom I purchased my first house. However, her web site makes me cringe in horror. Fortunately, she’s a much better agent than her web site would suggest, but her site is nothing but a Flash link farm: nothing of value other than links to other places, wrapped up as an obnoxious Flash app. At least ZipRealty & Windermere have branded MLS searches, instead of being just a link farm or framing somebody else’s content.

See, ZipRealty & Windermere look pretty good now, don’t they?

Robbie

Where's the Beef?

In my last post, I awakened Ardell from her winter hibernation, for which I feel I should both apologize and take credit. Ardell raised many interesting points that I feel merit a response. First and foremost, I’ll admit I am somewhat biased, since I tend to view things through buyer-colored glasses, and I overlooked the reasons why a seller might be less than completely forthcoming with their listing information.

Admittedly, my gripe about bad zip codes is pretty minor (less than 1% of listings are affected). However, entering an incorrect zip code is like misspelling a street name: it just shows buyers a lack of attention to detail. If I’m a buyer looking for vacant land in an Issaquah zip code (98029), I don’t want to see listings in Bellingham. If I’m looking for rental property in a Redmond zip code (98052), I don’t want to see listings in Mercer Island.

Regarding my beef about school information: since only half the schools surrounding Lake Washington are above average, are only half listed? I wonder if the MLS near Lake Wobegon has this problem. Besides, who decides that the school that serves a property is bad? The buyer might think XYZ school district is great, but because the seller had a differing opinion (and didn’t disclose that information), they just lost a potential buyer who won’t bother looking at a property that they otherwise might have.

Regarding my beef about latitudes and longitudes: OK, you the agent have no control over this. It still doesn’t explain why the MLS does such a bad job of geocoding! Admittedly, most people probably don’t care (unless they use a computer). Unfortunately, since many people use computers to find property information (and that number is only increasing), it’s a problem that will only become more noticeable.

When you combine latitudes and longitudes with free digital maps and inexpensive computer databases, you can see the location of listings in the neighborhood and other points of interest with an ease that was impossible (or at least very expensive) only a few short years ago. As they say, the 3 most important words in real estate are “Location, Location, & Location”, which means the most important part of a real estate listing web site is going to be “Maps, Maps & Maps” (as you can see by the growth of map-based real estate listing web sites this past year). Not having accurate latitudes and longitudes makes it harder for software engineers to develop the features the real-estate-buying, real-estate-selling, & internet-surfing public are increasingly going to demand.

Pop quiz, which house is the better value? This ~$800K house or this ~$800K house? Without knowing how much living space I’m getting for my $800K, it makes my job as a buyer more difficult.

Just because you can’t get an exact measurement doesn’t mean you shouldn’t measure. Most real estate listings & transactions have more legal paperwork & disclaimers than a Microsoft EULA! Furthermore, the NWMLS has the source for the square footage information associated with a listing. Couldn’t an agent argue the source should be liable? I assume the agent only pays if the error was in the seller’s favor? Otherwise, isn’t listing a property with 0 square footage asking for a lawsuit? Granted, I’m not an attorney, but if the risk were meaningful, I’d suspect all properties would have a square foot value of 0! So, if you get 3 different answers, pick the lowest value! Throw out the measurements from the French & Russian judges! Inaccurate data is always better than no data (at least for buyers).

Lastly, as computer-based listing search & analysis tools become easier to use and more sophisticated, bad data is only going to be easier to spot and less tolerated. Missing data makes the buyer’s job harder. Perhaps ironically, it also makes the seller’s job more difficult. How can you draft an accurate competitive market analysis report if you don’t know what the sizes of your competitors are or were? Sure, an agent could do the extra legwork of looking at the county records, but it’ll cost you more time (and time is money).

I guess the moral of this post is caveat emptor. Although buyers may want to trust MLS data, sellers have a motivation to give you a reason not to. Perhaps there is a market for a CarFax-like service that provides better MLS data than the MLS? Despite my complaining, none of these obstacles are going to stop software engineers from giving the internet home-buying public what they want (complete & accurate listing data). The internet has given the buyer more knowledge & more power in the marketplace. Sellers (& their agents) would be wise to embrace this trend instead of avoiding it.

Robbie
Caffeinated Software

PS – This blog posting information is not warranted. Reader should verify all information to their satisfaction. Information is based on data available to the poster, including county records. The information has not been verified by the poster and should be verified by the reader. To the maximum extent permitted by applicable law, in no event shall the poster (Robbie), Caffeinated Software, or its suppliers be liable for any special, incidental, punitive, indirect, or consequential damages whatsoever (including, but not limited to, damages for loss of profits or confidential or other information, for business interruption, for personal injury, for loss of privacy, for failure to meet any duty including of good faith or of reasonable care, for negligence, and for any other pecuniary or other loss whatsoever) arising out of or in any way related to the use of or inability to use this blog post.

Radcribs Stealing Trulia Code for their Real Estate Search?

I received this email from a Rain City Guide reader this morning:

Radcribs had to take down its NYC real estate mashup. Radcribs basically copied every bit of its code from Trulia (all the javascript, all the CSS for layouts, all the HTML templates, etc). Trulia’s lawyers made short work of that. You’ll see that Radcribs is back to just providing what is basically a ripoff of PropertyShark’s service. (PropertyShark actually had to file suit against Radcribs for copyright infringement to get Radcribs to back off.)

Like the top-notch investigative reporter that I am ;), I was able to confirm from a very good source (who I’ll leave anonymous) that Radcribs appeared to be copying code from Trulia.

Without a doubt, I’m going to take RadCribs (and CityCribs, which appears to be run by the same people) off my list of innovative real estate search tools, since they don’t seem to be doing much innovation.

If I hear anything else interesting about this story, I’ll be sure to report it!

UPDATE: Based on some feedback from my source, I decided to tone down the assertion a bit since I may have been overzealous in my reporting…