[photopress:papercut.jpg,thumb,alignright]Every once in a while a realtor or broker from out of state will ask me to develop an IDX web site for them. Unfortunately, supporting a new MLS is very similar to supporting a foreign language. It is a large software engineering task that takes a lot of time, and since I don’t already have the code written and don’t already have access to their MLS’s feed, I inform them that time is money and the conversation usually ends there. Someday, that may not be the case, but I’d rather be small & profitable than large & broke.
The problem is made worse by the fact that many Realtors don’t know what format or protocol their MLS uses for data downloads, or even who to contact at their MLS to get a feed for an IDX vendor. If you ever want to change IDX vendors, hire a software engineer, or are crazy enough to do it yourself, you should know this. Knowing how your MLS distributes your listing data is like knowing how to change the oil in your car or how to defragment your hard drive. You don’t have to know, but it’s good to know. It may seem like I’m ranting about some MLS techie mumbo jumbo again, but this stuff is preventing the industry from taking advantage of low-cost IT innovations that could otherwise flourish. I don’t think folks fully appreciate the challenges that an IDX vendor faces and how those challenges are holding back the industry’s growth and health.
For example, the NWMLS (Northwest Multiple Listing Service, which serves mainly Seattle and western Washington) uses software from Rapattoni. Its system provides listing data via a proprietary SOAP interface, and all the photos are accessible via an FTP server. Listing data is updated constantly (as I understand it, a new listing usually appears in our feeds about 15-20 minutes after a member enters it into the NWMLS).
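For the curious, here is roughly what talking to a SOAP feed looks like from the vendor’s chair. This is only a sketch using the Python zeep library; the endpoint, method name, and field names are invented, since the real NWMLS/Rapattoni interface is proprietary and documented only to members.

```python
# Minimal sketch of pulling listings from a SOAP feed with the zeep library.
# The WSDL URL, method name, arguments, and fields below are hypothetical --
# the actual NWMLS/Rapattoni interface is proprietary.
from zeep import Client

client = Client("https://example-mls.test/ListingService?wsdl")  # hypothetical endpoint

# Ask for everything that changed recently (made-up method and arguments).
result = client.service.GetUpdatedListings(UserName="vendor",
                                           Password="secret",
                                           MinutesBack=30)
for listing in result:
    print(listing.ListingNumber, listing.ListPrice)
```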
By contrast, EBRD (East Bay Regional Data, which serves mainly Oakland, CA and the East Bay area) uses Paragon by Fidelity MLS Systems and provides its listing data via nightly updated CSV text files, downloadable by FTP. Images for new and updated listings are available as ZIP files, also via FTP. Photos for active listings that haven’t recently been added or changed are not available at all (unless you bug the IT department).
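That nightly-CSV style of feed looks completely different in code. Here’s an equally rough sketch; the host name, file name, and column names are made up for illustration:

```python
# Rough sketch of the nightly-CSV style of feed: grab the file over FTP and
# walk the rows. Host, file name, and column names are invented.
import csv
import io
from ftplib import FTP

ftp = FTP("ftp.example-mls.test")                     # hypothetical host
ftp.login("idx_vendor", "secret")
buf = io.BytesIO()
ftp.retrbinary("RETR residential.csv", buf.write)     # hypothetical file name
ftp.quit()

rows = csv.DictReader(io.StringIO(buf.getvalue().decode("latin-1")))
for row in rows:
    # Upsert into the local database here; column names vary from MLS to MLS.
    print(row["MLSNumber"], row["ListPrice"])
```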
The only way they could make their systems more different is if the EBRD encoded their listings in EBCDIC! In order to support both, I need to develop two very different programs that download the listing data onto my server, import it into my database, deal with differences in the listing schema (for example, the EBRD feed doesn’t contain a “Number of Photos” field or a “Community Name” field), and deal with differences in how photos are delivered (the NWMLS stores all photos uncompressed in one of a thousand subdirectories, while the EBRD just puts the fresh photos in one big ZIP file). So I can spend my limited time improving my software for new markets (that have no customers) or improving my software for my home market (which has paying customers). Unfortunately, given the current market realities, I can only afford to support my home market at this time, since MLS IDX programs can be very different and there is no place like home (so far as I know anyway).
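To give a feel for just the photo side of that, here’s a sketch of the two download styles side by side. The directory layout, file naming, and ZIP contents are invented; the point is only that the same job needs two unrelated pieces of code.

```python
# Sketch of the photo-handling difference: individual files scattered across
# many subdirectories versus one big ZIP of fresh images. Paths and naming
# conventions here are hypothetical.
import os
import zipfile
from ftplib import FTP

def fetch_nwmls_style(ftp: FTP, listing_no: str, dest: str) -> None:
    # Hypothetical layout: each photo lives in one of many numbered subdirectories.
    subdir = listing_no[-3:]                      # e.g. /photos/678/12345678_1.jpg
    with open(os.path.join(dest, f"{listing_no}_1.jpg"), "wb") as f:
        ftp.retrbinary(f"RETR /photos/{subdir}/{listing_no}_1.jpg", f.write)

def fetch_ebrd_style(zip_path: str, dest: str) -> None:
    # Hypothetical layout: everything new or changed arrives in one nightly ZIP.
    with zipfile.ZipFile(zip_path) as z:
        z.extractall(dest)
```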
I keep waiting for RETS to save me from this madness, but until it happens in Seattle or the East Bay, I’m not holding my breath. After all, if two of the larger MLSes in the country, in two of the most tech-savvy areas of the nation, don’t support it yet, I interpret that as a vote of no confidence. I suppose RETS could be going great guns in the rest of the country, but if it were, I’d expect the NWMLS & EBRD to be all over it, like the establishment on Redfin.
The Center for REALTOR® Technology Web Log paints a rosy picture of RETS deployment in the industry. Unfortunately, according to Clareity Consulting, an IT consulting firm that serves MLSes and other parts of the real estate ecosystem, RETS is the NAR’s unfunded mandate. Everybody wants the benefits of RETS, but nobody is willing to pay for it. Furthermore, it appears that back in the days before I got sucked into real estate technology, there was an effort to promote the DxM standard, and that went nowhere (which is a bad omen). What’s worse is that they keep moving the goal posts. We don’t even have widespread RETS 1.0 support, and they’ve already deprecated that standard and gone full bore on RETS Lite and RETS 2.0. The biggest problem seems to be one of vision and scope: they keep adding more features to cover more scenarios when we don’t even have wide deployment of the existing standard (assuming we had standards to begin with at all). It reminds me of the recent software industry debacle known as the “Longhorn reset”. The problem is that RETS is just too complicated, in an environment with too many legacy systems in place, too few resources to support it, and excessive aspirations. The idea of RETS is great; it’s the implementation and deployment that are disappointing, and at least Microsoft pulled Vista out of its death spiral…
[photopress:pappercutter.jpg,thumb,alignleft]The sad thing is that the computer industry already has great tools for moving data around over the Internet in efficient and well-supported (if sometimes proprietary) ways. They allow you to query, slice, and dice your data in a near-infinite number of ways. They’re called database servers. They are made by multiple software vendors, and there are even some excellent open source ones out there. They let you set permissions on which accounts can see which tables or views (gee, sounds like something an MLS would want). The better ones even take that security down to the field level. Even better, most of these so-called database servers can export data into spreadsheets, reporting tools, and even GIS systems. All of them provide a well-defined and oftentimes well-implemented API that software developers can use and exploit to implement what hasn’t been invented yet!
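To make that concrete, here is the kind of thing an MLS could do today with any off-the-shelf RDBMS: expose only the IDX-approved columns through a view and grant a vendor account read access to that view alone. This is a sketch with invented table, column, and account names; the SQL is generic enough to work on SQL Server or MySQL with minor tweaks.

```python
# Sketch of field-level access control using plain RDBMS features: a view that
# exposes only IDX-approved columns, and a GRANT limited to that view.
# DSN, table, column, and account names are all hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=example_mls;UID=dba;PWD=secret")   # hypothetical DSN
cur = conn.cursor()
cur.execute("""
    CREATE VIEW idx_active_listings AS
    SELECT mls_number, list_price, bedrooms, bathrooms, city, photo_count
    FROM listings
    WHERE status = 'Active' AND idx_opt_in = 1
""")
cur.execute("GRANT SELECT ON idx_active_listings TO idx_vendor")
conn.commit()
```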
Why don’t the NAR & the MLSes save us all the trouble, standardize on a few good database platforms (I’m a fan of MS SQL Server and MySQL, but I’d settle for anything with ODBC, .NET & Java support at this point), and provide everybody RDBMS accounts? It’d lower the cost for us IDX vendors (less code to write, since everything is just SQL), it’d lower the costs for MLS vendors (since data access, security, programmability, and scalability become the RDBMS vendor’s problem), it’d provide more choices for agents and brokers (since getting Excel talking to MS SQL Server is a cakewalk compared to RETS), and it’d lower IT costs for the MLS (because the MLS vendors don’t need to invent an industry-specific solution to a problem that’s largely been solved already, and I’m betting the MLS vendors already use somebody else’s RDBMS to implement their solutions anyway). Granted, a SQL server won’t enable all the scenarios that RETS wants to enable (if RETS were ever well implemented and widely deployed enough for that to happen). However, I’m of the belief that it’s not going to happen until after Trulia or Google Base becomes the de facto nationwide MLS by providing a single schema with a simple REST-like web services interface.
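And here’s the vendor’s side of that bargain, assuming the hypothetical view sketched above existed: one ODBC connection string and a plain SQL query instead of a custom feed parser for every MLS.

```python
# Sketch of the IDX vendor's side under the RDBMS-account idea: query the
# hypothetical idx_active_listings view over ODBC with ordinary SQL.
import pyodbc

conn = pyodbc.connect("DSN=example_mls;UID=idx_vendor;PWD=secret")  # hypothetical DSN
for row in conn.execute(
    "SELECT mls_number, list_price, city FROM idx_active_listings "
    "WHERE list_price BETWEEN ? AND ? ORDER BY list_price",
    (300000, 500000),
):
    print(row.mls_number, row.city, row.list_price)
```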
So, what does your MLS do to support IDX vendors? Do they provide all the data all the time, or just daily updates? Have they deployed RETS yet? Are they going to? Who is their MLS software vendor, or do they have a home-grown solution? What do you want to do that you can’t do today because the data is in a format you can’t use easily? Would you be willing to pay more in membership dues for better software or better service from your MLS? Are we at the dawning of the RETS revolution, or is it too little, too late?
PS – Anybody know anybody from an MLS / IDX department or an MLS vendor who blogs? I’d love to know what things are really like on their side of the listing data fence.