I thought I’d take a moment to reflect on how Rain City’s favorite MLS Search is implemented. I’m a little tired of thinking in computer languages (mostly T-SQL, C# and JavaScript), so I figured I’d blog a bit in Geek English for a little while before I hit the compiler again.
[photopress:matrix1_alt.jpg,full,alignright]
I’m always interested in how web sites & computer software work under the covers, so I thought I’d share some of the more interesting points about how I’ve implemented “Zearch” to date for the “geekier” folks in the blogosphere.
It all began way back in the fall of 2005 shortly after I got my first MLS feed. At the time, Microsoft’s asp.net 2.0 platform was still in beta. However, after learning what Microsoft’s next generation web development tools were going to do (and seeing what Google Maps and Microsoft’s Virtual Earth teams were doing), I saw a great unrealized potential in MLS search tools and decided to do something about it.
Anyway, it’s all built on top of asp.net 2.0 and MS SQL Server 2000 (yeah, I know I’m old school). One of the first things I did was combine all the property types into a VIEW and create a dynamic SQL query when you search for properties. Some search tools only let you search for residential properties or condominiums at one time (which I thought was lame). I originally tried to implement stuff doing a bunch of UNIONs, but keeping track of the schema variations for the different property types eventually drove me nuts, so I encapsulated all that crud into a VIEW.
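If you’ve never done it, the VIEW itself is nothing fancy; it’s basically a UNION ALL with the schema differences smoothed over by aliasing. A sketch along these lines (the table and column names here are made up for illustration, not my actual schema):

-- Simplified sketch of a unified listings VIEW.
-- Table and column names are hypothetical; real MLS schemas differ.
CREATE VIEW dbo.AllListings
AS
SELECT  mls_number,
        list_price,
        list_date,
        bedrooms,
        bathrooms,
        latitude,
        longitude,
        'RES' AS property_type
FROM    dbo.Residential
UNION ALL
SELECT  mls_number,
        lp    AS list_price,
        ld    AS list_date,   -- same data, different column name
        beds  AS bedrooms,
        baths AS bathrooms,
        latitude,
        longitude,
        'CND' AS property_type
FROM    dbo.Condominium
GO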
I also find it a little ironic that I’m not the only one who found the MLS schema differences a PITA to deal with. I’m glad the various MLS software vendors and the CRT are working toward a common industry schema (aka RETS), so we application developers can focus on the real problem (developing compelling & useful software) instead of remembering that the ld column in one table is really the list_date column in another table.
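That’s really what the VIEW buys you: once the column names are smoothed over, the search page only has to build its dynamic query against one set of them. Something roughly like this (again, the filters and parameter names are just illustrative, not the production code):

-- Sketch of building the search query dynamically against the VIEW.
DECLARE @sql NVARCHAR(4000),
        @minPrice MONEY, @maxPrice MONEY, @minBeds INT

SELECT @minPrice = 200000, @maxPrice = 500000, @minBeds = 3

SET @sql = N'SELECT mls_number, list_price, bedrooms, bathrooms
             FROM dbo.AllListings
             WHERE list_price BETWEEN @minPrice AND @maxPrice'

-- Only tack on the clauses the user actually asked for.
IF @minBeds IS NOT NULL
    SET @sql = @sql + N' AND bedrooms >= @minBeds'

EXEC sp_executesql @sql,
     N'@minPrice MONEY, @maxPrice MONEY, @minBeds INT',
     @minPrice, @maxPrice, @minBeds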
Another interesting thing I do on the back end is geocode every listing after each data download. The main reason is that I don’t trust the MLS data, and its bogus geo-coding would make my app look bad. I also knew when I started that I’d eventually do maps, so as soon as a new listing hits my database, it gets more accurately/correctly geo-coded. In case you’re wondering if I’m screen scraping w/ Perl or something else, it’s all done with T-SQL stored procedures. (Well, technically it’s a proc that calls the MSXML2.ServerXMLHTTP COM object to issue an HTTP request against a geocoding web service, and then uses OPENXML on the response’s XML to get the latitude & longitude.)
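For the curious, the OLE Automation flavor of that looks roughly like the sketch below. The geocoder URL and the shape of its XML response are placeholders, and error handling (plus URL-encoding of the address) is stripped out to keep it short:

-- Rough sketch of geocoding a listing from T-SQL on SQL Server 2000.
-- The web service URL and response shape are placeholders.
CREATE PROCEDURE dbo.GeocodeListing
    @address VARCHAR(255),
    @lat     FLOAT OUTPUT,
    @lon     FLOAT OUTPUT
AS
BEGIN
    DECLARE @http INT, @url VARCHAR(500), @response VARCHAR(8000), @doc INT

    SET @url = 'http://geocoder.example.com/geocode?address=' + @address

    -- Issue the HTTP request via the MSXML2.ServerXMLHTTP COM object.
    EXEC sp_OACreate 'MSXML2.ServerXMLHTTP', @http OUT
    EXEC sp_OAMethod @http, 'open', NULL, 'GET', @url, 'false'
    EXEC sp_OAMethod @http, 'send'
    EXEC sp_OAGetProperty @http, 'responseText', @response OUT
    EXEC sp_OADestroy @http

    -- Shred the XML response with OPENXML to pull out the coordinates.
    EXEC sp_xml_preparedocument @doc OUTPUT, @response
    SELECT  @lat = latitude, @lon = longitude
    FROM    OPENXML(@doc, '/result', 2)
            WITH (latitude FLOAT, longitude FLOAT)
    EXEC sp_xml_removedocument @doc
END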
As you might have guessed, there are also stored procedures and functions to get the distance between two points, do a radius search, and other stuff of that ilk. Fortunately, all that stuff can easily be found using your favorite search engine, so you don’t need to know how all the math in the law of cosines works (you just need to know of it).
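For example, the spherical law of cosines version is only a few lines of T-SQL (this is the textbook formula, not necessarily line-for-line what’s running on my server):

-- Great-circle distance in miles between two lat/long points,
-- using the spherical law of cosines (Earth radius ~3,963 miles).
CREATE FUNCTION dbo.DistanceInMiles
(
    @lat1 FLOAT, @lon1 FLOAT,
    @lat2 FLOAT, @lon2 FLOAT
)
RETURNS FLOAT
AS
BEGIN
    RETURN 3963.0 * ACOS(
        SIN(RADIANS(@lat1)) * SIN(RADIANS(@lat2)) +
        COS(RADIANS(@lat1)) * COS(RADIANS(@lat2)) * COS(RADIANS(@lon2 - @lon1))
    )
END

A radius search is then just a WHERE dbo.DistanceInMiles(latitude, longitude, @centerLat, @centerLon) <= @radius filter; a cheap bounding-box check in front of it keeps you from doing trig on every row in the table.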
Well, that’s it for the back end. Next time I’ll put on my Web Developer hat and talk about the front end.
Did you know:
- Google’s stock is a bubble. Seattle area real estate may or may not be.
- Zillow uses an open source front end but is using MS SQL 2005 on the back end.
- HouseValues is a Microsoft shop and RedFin isn’t.
- The server that runs our home search is a home-built machine running Windows Server 2003 co-located at IsoMedia in Redmond.
- The server that runs Rain City Guide blog is at Inmotion hosting in California (running WordPress on a LAMP stack).
- MySpace is now using asp.net and handles about 1.5 billion page views a day. (2nd only to Yahoo)
- Via Virtual Earth (a blog for Virtual Earth developers) has made Rain City Search a Site Gallery pick. You know you’re doing cool stuff when other geeks like it. 🙂