Death by a thousand paper cuts

[photopress:papercut.jpg,thumb,alignright]Every once in a while a Realtor or broker from out of state will ask me to develop an IDX web site for them. Unfortunately, supporting a new MLS is a lot like supporting a foreign language: it is a large software engineering task that takes a lot of time. Since I don’t already have the code written, and don’t already have access to their MLS’s feed, I inform them that time is money, and the conversation usually ends there. Someday that may not be the case, but I’d rather be small & profitable than large & broke.

The problem is made worse by the fact that many Realtors don’t know what format or protocol their MLS uses for data downloads, or even who to contact at their MLS to get a feed for an IDX vendor. If you ever want to change IDX vendors, hire a software engineer, or are crazy enough to do it yourself, you should know this. Knowing how your MLS distributes your listing data is like knowing how to change the oil in your car or how to defragment your hard drive. You don’t have to know, but it’s good to know. It may seem like I’m ranting about some MLS techie mumbo jumbo again, but this stuff is preventing the industry from taking advantage of low-cost IT innovations that could otherwise flourish. I don’t think folks fully appreciate the challenges that an IDX vendor faces, and how those challenges are holding back the industry’s growth and health.

For example, the NWMLS (Northwest Multiple Listing Service – serves mainly Seattle, WA and western Washington) uses software from Rapattoni. It provides listing data via a proprietary SOAP interface, and all the photos are accessible via an FTP server. Listing data is updated constantly (as I understand it, a new listing usually appears in our feeds about 15-20 minutes after a member enters it into the NWMLS).
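To give a flavor of what that side of my downloader involves, here’s a minimal sketch of polling a SOAP listing service from Python. The actual Rapattoni interface is proprietary, so the WSDL location, method names, fields, and credentials below are all made up; only the overall shape (log in, then ask for everything modified since the last poll) reflects the real workflow:

```python
# Hypothetical sketch of pulling listings over a SOAP interface.
# The real NWMLS/Rapattoni endpoint, WSDL, and method names are
# proprietary -- everything here is a stand-in for illustration.
from zeep import Client

client = Client("https://example-mls.test/ListingService?wsdl")  # hypothetical WSDL

# Typical SOAP workflow: authenticate, then request everything that
# changed since the last poll.
session = client.service.Login(user="idx_vendor", password="secret")
listings = client.service.GetModifiedListings(
    sessionId=session,
    propertyType="RESI",
    modifiedSince="2006-08-01T00:00:00",
)
for listing in listings:
    print(listing.ListingNumber, listing.ListPrice)
```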

By contrast, the EBRD (East Bay Regional Data – serves mainly Oakland, CA and the East Bay area) uses Paragon by Fidelity MLS Systems and provides its listing data via nightly updated CSV text files, downloadable by FTP. Images for new and updated listings are delivered as ZIP files, also via FTP. The photos for active listings which haven’t been recently added or changed are not available (unless you bug the IT dept).
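The EBRD half of the pipeline looks nothing like that. Here’s a minimal sketch of the nightly pull using only Python’s standard library; the host, file paths, and column names are hypothetical stand-ins for whatever the MLS’s documentation actually specifies:

```python
# Sketch of an EBRD-style nightly pull: fetch a CSV of listings and a
# ZIP of fresh photos over FTP. Host, paths, and filenames are made up.
import csv
import io
import zipfile
from ftplib import FTP

ftp = FTP("ftp.example-mls.test")              # hypothetical host
ftp.login(user="idx_vendor", passwd="secret")

# Download last night's listing export into memory and parse it.
buf = io.BytesIO()
ftp.retrbinary("RETR listings/residential.csv", buf.write)
for row in csv.DictReader(io.StringIO(buf.getvalue().decode("latin-1"))):
    print(row["MLSNumber"], row["ListPrice"])  # column names are a guess

# Photos for new/changed listings arrive as one big ZIP file.
photo_buf = io.BytesIO()
ftp.retrbinary("RETR photos/updates.zip", photo_buf.write)
with zipfile.ZipFile(photo_buf) as zf:
    zf.extractall("photos/")

ftp.quit()
```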

The only way they could make their systems more different is if the EBRD encoded their listings in EBCDIC! In order to support both, I need to develop two very different programs that download the listing data onto my server, import it into my database, deal with differences in the listing schema (for example, the EBRD feed doesn’t contain a “Number of Photos” field or a “Community Name” field), and deal with differences in photo delivery (the NWMLS stores all photos uncompressed in one of a thousand subdirectories, while the EBRD just stores the fresh photos in one big ZIP file). So I can spend my limited time improving my software for new markets (that have no customers) or improving my software for my home market (which has paying customers). Unfortunately, given the current market realities, I can only afford to support my home market at this time, since MLS IDX programs can be very different and there is no place like home (as far as I know, anyway).
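Even after both feeds are downloaded, the schema differences still have to be papered over before anything lands in my database. The basic trick is to map each feed’s field names onto one internal schema and fill in whatever a feed doesn’t provide. A minimal sketch, with made-up field names standing in for the real feed schemas:

```python
# Map each feed's field names onto one internal schema. The field
# names on both sides are illustrative, not the real feed schemas.
NWMLS_MAP = {"LN": "mls_number", "LP": "list_price", "PIC_COUNT": "photo_count"}
EBRD_MAP = {"MLSNumber": "mls_number", "ListPrice": "list_price"}

def normalize(raw: dict, field_map: dict) -> dict:
    """Translate one raw feed record into the internal schema."""
    record = {ours: raw.get(theirs) for theirs, ours in field_map.items()}
    # EBRD has no "Number of Photos" field, so default it here and fix
    # it up later by counting the files extracted from the photo ZIP.
    record.setdefault("photo_count", None)
    return record

print(normalize({"MLSNumber": "40123456", "ListPrice": "650000"}, EBRD_MAP))
```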

I keep waiting for RETS to save me from this madness, but until it happens in Seattle or the East Bay, I’m not holding my breath. After all, if two of the larger MLSes in the country, in two of the most tech-savvy areas of the nation, don’t support it yet, I interpret that as a vote of no confidence. I suppose RETS could be going great guns in the rest of the country, but if it were, I’d expect the NWMLS & EBRD to be all over it, the way the establishment is all over Redfin.

The Center for REALTOR® Technology Web Log paints a rosy picture of RETS deployment in the industry. Unfortunately, according to Clareity Consulting, an IT consulting firm that serves MLSes and other parts of the real estate ecosystem, RETS is the NAR’s unfunded mandate. Everybody wants the benefits of RETS, but nobody is willing to pay for it. Furthermore, it appears that back in the days before I got sucked into real estate technology, there was an effort to promote the DxM standard, and that went nowhere (which is a bad omen). What’s worse is that they keep moving the goal posts. We don’t even have widespread RETS 1.0 support, and they’ve already deprecated that standard, going full bore on RETS Lite and RETS 2.0. It seems the biggest problem is one of vision and scope. They keep adding more features to cover more scenarios, when we don’t even have wide deployment of the existing standard (assuming we had standards to begin with at all). It reminds me of the recent software industry debacle known as the “Longhorn reset”. The problem is that RETS is just too complicated, in an environment with too many legacy systems in place, too few resources to support it, and excessive aspirations. The idea of RETS is great; it’s the implementation and deployment that’s disappointing, and at least Microsoft pulled Vista out of its death spiral…

[photopress:pappercutter.jpg,thumb,alignleft]The sad thing is that the computer industry already has great tools for moving data around over the Internet in efficient and well-supported (if sometimes proprietary) ways. They allow you to query, slice, and dice your data in a near-infinite number of ways. They’re called database servers. They are made by multiple software vendors, and there are even some excellent open source ones out there. They let you set permissions on which accounts can see which tables or views (gee, sounds like something an MLS would want). The better ones even enforce this security down to the field level. Even better, most of these so-called database servers can export data into spreadsheets, reporting tools, and even GIS systems. All of them provide a well-defined and oftentimes well-implemented API that software developers can use and exploit to implement what hasn’t been invented yet!
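To make that concrete, here’s a sketch of how an MLS could carve out an IDX-safe slice of its data using nothing but stock RDBMS features. The connection string, table, column, and account names are all hypothetical; the view-plus-GRANT pattern itself is garden-variety SQL that every major database server supports in some dialect:

```python
# Hypothetical sketch: expose only IDX-approved listings and columns
# through a view, then grant vendor accounts access to the view alone.
import pyodbc

conn = pyodbc.connect("DSN=mls;UID=dba;PWD=secret")  # hypothetical DSN
cur = conn.cursor()

cur.execute("""
    CREATE VIEW idx_listings AS
    SELECT mls_number, list_price, address, city, photo_count
    FROM listings
    WHERE idx_ok = 1 AND status = 'Active'
""")
cur.execute("GRANT SELECT ON idx_listings TO idx_vendor")
conn.commit()
```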

Why don’t the NAR & the MLSes save us all the trouble, standardize on a few good database platforms (I’m a fan of MS SQL Server and MySQL, but I’d settle for anything that has ODBC, .net & Java support at this point), and provide everybody RDBMS accounts? It’d lower costs for us IDX vendors (less code to write, since everything is just SQL); it’d lower costs for MLS vendors (since data access, security, programmability, and scalability become the RDBMS vendor’s problem); it’d provide more choices for agents and brokers (since getting Excel talking to MS SQL Server is a cakewalk compared to RETS); and it’d lower IT costs for the MLS (because the MLS vendors don’t need to invent an industry-specific solution to a problem that’s been largely solved already, and I’m betting the MLS vendors already use somebody else’s RDBMS to implement their solutions anyway). Granted, a SQL server won’t enable all the scenarios that RETS wants to enable (if RETS were ever well implemented and widely deployed enough for that to happen). However, I’m of the belief that it’s not going to happen until after Trulia or Google Base becomes the de facto nationwide MLS by providing a single schema with a simple REST-like web services interface.
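Under that scheme, my whole “download client” collapses into an ODBC connection and a query. A sketch, reusing the hypothetical DSN and view from above:

```python
# The IDX vendor's side of the bargain: no RETS, no custom download
# client, just ODBC and plain SQL (names are hypothetical, as above).
import pyodbc

conn = pyodbc.connect("DSN=mls;UID=idx_vendor;PWD=secret")
cur = conn.cursor()
cur.execute(
    "SELECT mls_number, list_price, address FROM idx_listings "
    "WHERE city = ? AND list_price <= ? ORDER BY list_price",
    ("Seattle", 500000),
)
for mls_number, price, address in cur.fetchall():
    print(mls_number, price, address)
```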

So, what does your MLS do to support IDX vendors? Do they provide all the data all the time, or just daily updates? Have they deployed RETS yet? Are they going to? Who is their MLS software vendor, or do they have a home-grown solution? What do you want to do that you can’t do today because the data is in a format you can’t use easily? Would you be willing to pay more in membership dues for better software or better service from your MLS? Are we at the dawn of the RETS revolution, or is it too little, too late?

PS – Anybody know somebody from an MLS / IDX dept or an MLS vendor who blogs? I’d love to know what things are really like on their side of the listing data fence.

Required Reading…

Another list of 10:

  1. Worth reiterating: Polly’s comments should be required reading for all agents (including the comments within the post about her comments! 🙂 ).
  2. Claudia Wicks lets us know about this “genealogy” site geared toward homes instead of people… The site includes maps, photos, etc.
  3. Also, are press releases still valuable? A quick Google search on Claudia shows that a recent press release she put out about being one of the Top Woman Real Estate bloggers dominates the coverage of her name. Fascinating.
  4. Artemi just emailed me to let me know that he released a major upgrade to his real estate search site for England. The features that stick out for me are the simplicity, the tags for each property, and the natural language search (like the fact that the site pre-fills the search box with relevant tags). Great stuff…
  5. Interesting to read Jim’s perspective on the new website he is building with Ubertor. From what I’ve seen, the website definitely suffices as far as websites go, but if I were searching for an agent, I’d say his blog does a much better job of selling him.
  6. Searchlight had a follow-up to their “renting is for suckers” article that describes some reasons a person should not buy a house. I can’t tell if they read my comment, but they clearly addressed some of the issues I brought up.
  7. Joel gives some insight into the art of being good enough.
  8. And then follows it up with news that Prudential is jumping on the Zillow API bandwagon.
  9. My take? Here are the ingredients for housingmaps style publicity: map. geocode. data1. data2.
  10. Jim’s “worth noting” column reminded me that I really wanted to mention DataPlace at some point. I saw a presentation of this tool at Where 2.0 and was very impressed with the massive amount of neighborhood, demographic, socio-economic, etc. data that the Fannie Mae Foundation has managed to squeeze into their interface (and it is all free!). To get an overview, check out the massive amount of mortgage information available for the Seattle-Bellevue area, or better yet, check out the map of home ownership rates in the area that I was able to easily build and post on my site.

Our Home is Now Listed!

And despite the fact that we may not have Ardell’s magic open house touch, we are showing it on Sunday between noon and 3 PM, as described in the open house listing on Trumba.

Update:

I also created an AdWords campaign around our home. If you see the following ad while surfing the web, don’t click on it, because it costs me money and just takes you to this blog post! 🙂
[photopress:beautiful_ballard_home.jpg,thumb,centered]

Funny side note… I decided to try out Google’s option to target ads at specific websites and noticed that Zillow was on the list of real estate related sites. However, in order to see the ad for my home on Zillow, I had to disable the one-two punch of Adblock and Filter.G in my Firefox browser. With those two extensions disabled, so many of the websites I visit on a regular basis looked so much uglier! It was like traveling the web naked! If you’re not using Firefox with these two extensions, then you are almost certainly surfing a web that looks much more annoying than mine!