A look behind the Shack, Part 1: Speed Kills


Really, in the world of the web, slow speed kills. And most people only think about how long a website takes to load when it is taking eons to show up. For static sites, meeting the magic four-second page load time isn’t too hard, but for sites with lots of “dynamic content” (fancy menus and whatnot) and maps, it becomes much trickier.

Many (most?) fancy real estate search sites are plagued by slow load times – see the real estate 2.x person’s site reviews for scathing analyses of how long it takes for many sites to load. In light of this, we took great pains to make our site both feel and be speedy and, if I do say so myself, I think we’ve been pretty successful. On my old-ish computer, the Seattle real estate page typically loads in under 10 seconds (we could still do better on this!) and house detail and nearby pages typically load in under 3 seconds.

One of the tricks we employ is that we don’t actually shuffle visitors from one complete page to a completely new one, which means we don’t have to reload the time-consuming Google Map or any of the stuff on the sidebar. Instead, we load little subpages within the site using AJAX (which is, I believe, a dumb acronym). When you click to see the details for a house, we only load up those details and leave the side and top of the site alone and intact. When you click back to the map tab, it’s already there waiting for you because it was just hiding behind the house information.
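For the geekier readers, the idea can be sketched in a few lines of JavaScript. This is an illustrative sketch, not our actual code: the pane names and the `loadFragment` helper are made up, and in the browser `loadFragment` would be an XMLHttpRequest.

```javascript
// Instead of navigating to a new page, fetch just the house-detail
// fragment and swap panes. The hidden pane keeps all of its state
// (e.g. the already-loaded Google Map), so switching back is instant.
function makePaneSwitcher(panes) {
  let visible = 'map';
  return {
    show(name) {
      // Hide the current pane and reveal the requested one.
      panes[visible].hidden = true;
      panes[name].hidden = false;
      visible = name;
    },
    current() { return visible; }
  };
}

// loadFragment stands in for the AJAX call that fetches only the
// house-detail subpage, leaving the rest of the page untouched.
async function showHouseDetails(switcher, panes, loadFragment, houseId) {
  panes.details.html = await loadFragment('/house/' + houseId);
  switcher.show('details');
}
```

Clicking back to the map tab is then just `switcher.show('map')`: no network request, no map reload.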

There are some other tricks that are much more technical: before we launched in December we did a bunch of optimization to cut down the time it takes our database (with over 30,000 western Washington properties currently for sale) to find and spit out the houses that match each search. Currently it returns the ‘shacks’ that match your search within a second of you dragging the map around – the rest of your wait is the time it takes to actually send and display that information on your screen.

The dynamic updating introduces a can of worms of its own, including longer development time, but we think the tradeoffs were entirely worth it.

This is the first in a series of “Behind the Shack” themed posts. If you are especially interested in one aspect of ShackPrices.com, let me know and I’ll try to write about it!

Gridirons, Grid Controls & Touchdowns!

Every once in a while, Dustin tells me I should blog more. To which I usually reply, “I’m a coder, not a blogger.” Who do I look like, ARDELL? (She is the ultimate blogging machine, isn’t she? Notice how I used bold for the branding.) However, despite my objections, he is correct. So I’m going to try blogging more often in a less time-consuming, off-the-cuff style, instead of the thoughtful essays I usually favor.

The Matt in the Hat

Are you ready for some football?
OK, the 2006 NFL season is upon us and with the awesome success of the Seattle Seahawks (and the less than mediocre success of my 2005 fantasy football team), it’s time to regroup and prepare for my fantasy football draft. So does the blog-o-sphere have any draft advice for who’ll be this year’s new star or big bust? Any good fantasy sites or blogs you guys like? Will St. Reginold (aka The Matrix) be rookie of the year and rescue New Orleans? Will Alexander the Great break the Madden Curse? Will our beloved QB continue his evolution into the next Steve Young or merely join the hair club for men (or both)? Will Arizona be a worthy division rival? (Perhaps “The Swann” would know the answer to that question?) Is there a RedFin / Zillow fantasy league I can sit in on?

ComponentArt’s Grid Control – Oh yeah baby!
I recently purchased a copy of ComponentArt’s Web.UI 2006.1 for ASP.net. So when I get some more free time, my favorite MLS search tool is going to get much better. Just for kicks, I integrated the Grid Control with the search results page on RCG Search and I’m very happy with the results. Anyway, if you’re a professional web/software engineer who writes applications on ASP.net, I highly recommend it. I also considered Telerik’s r.a.d. controls, but I like ComponentArt a little better and they had a 10% off sale earlier this month, so they got my cash (Telerik may still get some though; they also do excellent work). I also looked at eBusiness Applications’ AJAX grid on Dustin’s recommendation, but it didn’t have the ASP.net 2.0 support that I desire and I felt it didn’t compare favorably with the best ASP.net-only toolset vendors. Still, it looks like a great PHP grid control.

PS – I want to thank Gordon Stephenson & Jay Young of RPA for the ton of work they’ve been giving me, so I could afford this awesome addition to my software war-chest.

Live from Redmond
In other news, I decided to drink more Kool-Aid and I created this blog post using the new Windows Live Writer. Everybody knows I love MS tech, but who names these things? Names like that remind me of the Microsoft iPod video and the Office Dinosaur ads (shudder). At any rate, it’s kind of like Word for WordPress. It’s a desktop-based blog posting editor. It appears to support every blog platform that matters (Windows Live Spaces, Blogger, TypePad, WordPress, and many others) and is better than most of the web-based editors out there. Among its cooler features are auto-save (you don’t lose your post in case your web browser or blog posting app crashes) and MS Virtual Earth integration (including Bird’s Eye images). Just find a map or image you want to insert, click OK, and a thumbnail is placed on your blog post (like that lovely photo of Qwest Field you see in this blog post). So to paraphrase Dr. Seuss…

Me: “You do not like MS Live Writer, so you say? Try it, try it, and you may. Try it and you may I say.”
Bloggers: “If you will let us be, We will try it, you will see”

There, that off the cuff post only took 3 hours of editing, revising, linking and tweaking…

Sigh, how do Ardell & Dustin do it? Maybe I need to link less and bold more? I’m going back to my compiler now; I’m a much better coder than a blogger.

Here’s to football! Go Seahawks!

The race for 2nd place has begun

OK, I’m biased and I still believe that “Zearch” is currently King of the Hill of King County home searches. However, I’m willing to give credit where credit is due and say the distance between us and the rest of the pack got smaller today.


Today, John L Scott and their solution provider, Bellevue-based Real Tech, quietly introduced what they call “Real-Maps 2.0”. Essentially, they are now using Microsoft Virtual Earth instead of the old school ESRI-based solution. Additionally, it appears they’ve AJAX-ified the search pane on their map page, so when you change search criteria it automatically updates the map and the matching results count (which is pretty slick). It also appears that Real Tech has gone all out: at first glance, it appears they are using the not-quite-released Microsoft Atlas framework (a new development tool that makes “Web 2.0” style applications easier to develop). It appears they are using JSON for the postbacks (most sites use XML; I currently send back Javascript source code). I haven’t spent much time reverse engineering it or learning Atlas yet, so it’s possible they are using a 3rd party AJAX framework. Regardless of the technical details, it does raise the bar for everybody else.
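To make the postback distinction concrete, here is a toy illustration of the three response styles. The payloads and function names are invented for illustration; they are not Real Tech’s or my actual wire formats.

```javascript
// Style 1: JSON postback. The server returns a data literal and the
// client parses it. (In 2006 most clients just eval'd the string;
// JSON.parse is the modern, safer equivalent.)
const jsonResponse = '{"matchCount": 1234}';
const fromJson = JSON.parse(jsonResponse);

// Style 2: Javascript source postback. The server returns actual code
// and the client simply executes it. This is the "I currently send back
// Javascript source code" approach.
let shownCount = null;
function updateCount(n) { shownCount = n; }
const jsResponse = 'updateCount(1234);';
eval(jsResponse);

// Style 3: XML postback. The server returns markup the client must walk
// (with DOMParser or similar in a browser); sketched here as the raw string.
const xmlResponse = '<result><matchCount>1234</matchCount></result>';
```

The tradeoff: JSON is compact data the client interprets, Javascript source moves the logic to the server, and XML needs the most client-side parsing work.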

So what does this mean? Here are my thoughts….

  • Me – Time to install and learn Atlas this weekend. If I’m going to remain competitive with the big boys, I gotta be using the same tools that the big boys are using. Besides, doing complex AJAX with Asp.net 2.0’s ICallbackEventHandler is a bit tedious for my liking.
  • Galen – Wondering if he should rewrite ShackPrices so it uses Ruby on Rails instead of PHP?
  • ESRI – Between Google Maps & Microsoft Virtual Earth, this company won’t be serving the real estate mapping market much longer.
  • RedFin – That Flash-based satellite map, though very cool in its day, is increasingly looking like a liability. Better update it, do as Zillow did (partner with GlobeXplorer & Microsoft), or let one of the big boys handle your maps. Any map in which the Issaquah Highlands looks like a polar bear eating vanilla ice cream during a snow storm doesn’t cut it for me.
  • Zillow – Better do something cool with that MLS data you’ve been collecting. Otherwise, those eyeballs you were counting on will be visiting the big brokers instead. Fortunately for Zillow, they could lose the local battle but still win the national war. The NWMLS is releasing sold listing data in the near future and I’ll be shocked if the local big brokers don’t add “Zestimate”-like features to their web sites in the next 6-12 months. Hell, Rain City Guide already has one, but you already knew we’re ahead of the curve. 😉
  • Realtor.com / HomeStore / Move – Obi-Wan “Dustin Luther” Kenobi – Are you their only hope? Do something! Add a Rain City RSS feed, if you have to! Anything! 🙂
  • Coldwell Banker Bain – Since they are also Real Tech customers, I suspect they’ll be asking for Real-Maps real soon now.
  • Windermere – They can’t be far behind their arch-rivals, or can they?
  • Other local brokers/agents – Time to re-evaluate your MLS search/IDX vendor? Now that John L Scott’s web site has entered the 21st century, the pressure is building for you to join them.
  • John Q Home Buyer in Seattle/Eastside – The new John L Scott is like RedFin but with better maps & aerial photos.
  • Everybody else, elsewhere – Consumer expectations are slowly being raised. I believe Seattle is ground zero of Real Estate 2.0. Those of you lucky enough to be living outside of the 206 & 425 area codes (aka the war zone) had better pay attention, because what’s happening here will happen in your neck of the woods sooner than you think.

So what do our fair Rain City Guide readers think of this development?

SELECT * FROM MLS WHERE Remarks = ‘Whoa’

I thought I’d take a moment to reflect on how Rain City’s favorite MLS Search is implemented. I’m a little tired of thinking in computer languages (mostly T-SQL, C# and Javascript), so I figured I’d blog a bit in Geek English for a little while before I hit the compiler again.


I’m always interested in how web sites & computer software work under the covers, so I thought I’d share some of the more interesting points about how I’ve implemented “Zearch” to date for the “geekier” folks in the blogosphere.

It all began way back in the fall of 2005 shortly after I got my first MLS feed. At the time, Microsoft’s asp.net 2.0 platform was still in beta. However, after learning what Microsoft’s next generation web development tools were going to do (and seeing what Google Maps and Microsoft’s Virtual Earth teams were doing), I saw a great unrealized potential in MLS search tools and decided to do something about it.

Anyway, it’s all built on top of asp.net 2.0 and MS SQL Server 2000 (yeah, I know I’m old school). One of the first things I did was combine all the property types into a VIEW and generate a dynamic SQL query when you search for properties. Some search tools only let you search for residential properties or condominiums at one time (which I thought was lame). I originally tried to implement things with a bunch of UNIONs, but keeping track of the schema variations for the different property types eventually drove me nuts, so I encapsulated all that crud in a VIEW.

I also find it a little ironic that I’m not the only one who found the MLS schema differences a PITA to deal with. I’m glad the various MLS software vendors and the CRT are working toward a common industry schema (aka RETS), so we application developers can focus on the real problem (developing compelling & useful software), instead of remembering that the ld column in one table is really the list_date column in another table.
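Here is the flavor of that problem in a toy JavaScript sketch. The column maps below are invented, not the real MLS schemas, and on my database the VIEW does this same renaming job in SQL, but the idea is identical: translate each property type’s quirky column names into one common shape before anything else touches the data.

```javascript
// Per-property-type maps from the source column name to a common name.
// These mappings are hypothetical examples of the kind of variation
// described above (ld in one table, list_date in another).
const columnMaps = {
  residential: { ld: 'listDate', lp: 'listPrice' },
  condo:       { list_date: 'listDate', list_price: 'listPrice' }
};

// Normalize one raw row into the common schema.
function normalize(propertyType, row) {
  const map = columnMaps[propertyType];
  const out = { propertyType };
  for (const [column, value] of Object.entries(row)) {
    // Rename mapped columns; pass unmapped ones through untouched.
    out[map[column] || column] = value;
  }
  return out;
}
```

Once every row comes out of `normalize` (or the VIEW) in the same shape, the search code downstream never has to care which table it came from.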

Another interesting thing I do on the back end is geocode every listing after each data download. The main reason is that I don’t trust the MLS data, and their bogus geo-coding would make my app look bad. I also knew when I started that I’d eventually do maps, so as soon as a new listing hits my database, it gets more accurately/correctly geo-coded. In case you’re wondering if I’m screen scraping w/ Perl or something else, it’s all done with T-SQL stored procedures. (Well, technically it’s a proc that calls the MSXML2.ServerXMLHTTP COM object to issue an HTTP request against a geocoding web service, and then uses OPENXML on the response’s XML to get the latitude & longitude.)

As you might have guessed, there are also stored procedures and functions to compute the distance between two points, do radius searches, and handle other stuff of that ilk. Fortunately, all that stuff can easily be found using your favorite search engine, so you don’t need to know how all the math in the law of cosines works (you just need to know of it).
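For the curious, the spherical law of cosines version of that distance math looks like this. This is a JavaScript sketch of the idea (my actual code lives in T-SQL functions, and the helper names here are made up):

```javascript
const EARTH_RADIUS_MILES = 3959;

function toRadians(degrees) {
  return degrees * Math.PI / 180;
}

// Great-circle distance between two lat/long points via the spherical
// law of cosines: d = R * acos(sin p1 * sin p2 + cos p1 * cos p2 * cos(dL))
function distanceMiles(lat1, lon1, lat2, lon2) {
  const phi1 = toRadians(lat1);
  const phi2 = toRadians(lat2);
  const deltaLambda = toRadians(lon2 - lon1);
  // Clamp to [-1, 1] so floating-point drift can't feed acos a bad value.
  const x = Math.min(1, Math.max(-1,
    Math.sin(phi1) * Math.sin(phi2) +
    Math.cos(phi1) * Math.cos(phi2) * Math.cos(deltaLambda)));
  return EARTH_RADIUS_MILES * Math.acos(x);
}

// Radius search: keep only listings within `radius` miles of a center.
function withinRadius(listings, centerLat, centerLon, radius) {
  return listings.filter(l =>
    distanceMiles(centerLat, centerLon, l.lat, l.lon) <= radius);
}
```

In production you’d also pre-filter with a cheap bounding-box check so the trig only runs on nearby candidates, which is what keeps a radius search fast over tens of thousands of rows.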

Well, that’s it for the back end. Next time I’ll put on my Web Developer hat and talk about the front end.

ShackPrices is Filling in the Blank…

The ShackPrices blog (run by RCG contributor Galen) just announced some slick new features for their site:

  • Permalinks. Makes it possible to email/link to specific results
  • Address Search. Easier to zoom into a specific property
  • Condo Information. Not just homes anymore…

So what is ShackPrices? ShackPrices is a King County-specific Home Price Evaluation site. Operating in true Web 2.0 spirit, these guys have taken King County sold home data and mixed it with Google Maps to create a map-based home valuation tool. By focusing on the Seattle area, locals might find that ShackPrices is a more useful tool than the obvious huge white elephant in the room. It is also worth noting that others have had online home valuation tools for a while (and we recently released our own!), so it is nice to see that Galen keeps things in perspective:

So is this the future of real estate search? I sincerely doubt it. I believe that online real estate search is a sliver of what it could be today, let alone what it could be tomorrow. We’re in the “glorified book

How cool is our home search? Ice Cold!

In case you haven’t dropped by our home search tool recently, we’ve made some improvements. Changes include…

Market Analysis Tool Improvements
We thought it would be helpful if you could get a second opinion when you get an estimate. So, we’ve made arrangements with Zillow to use their Zestimate web services on our Market Analysis page. That way, when you type in a property address, we’ll give you our estimate, get your property’s Zestimate (and the link to its page on Zillow), and save you some typing.

Radius Search
Want to find all the houses within 2 miles of your house or office? Now you can here! And yes, the search results pages are Bookmark-able, RSS-able, and Google Earth-able. (I wouldn’t have it any other way.)

Improved Location Search
The list boxes on the location search page are multi-selectable. Big whoop, I hear you say? Well, ours doesn’t refresh the entire page when you change the city or download a big city / community list when you first navigate to the page. Yes, you are seeing AJAX in action. It’s not something most people are going to notice, until they wonder, “Gee, how come your page is so much faster than all the other ones?”
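The pattern behind that speed can be sketched in a few lines of JavaScript (the names and the `fetchCommunities` helper are illustrative, not the site’s actual code): when the city selection changes, fetch just that city’s community list instead of refreshing the whole page or shipping every community list up front.

```javascript
// Cache each city selection's community list so flipping back and
// forth between cities is instant after the first fetch.
const communityCache = {};

// fetchCommunities stands in for the small AJAX request that returns
// only the communities for the selected cities.
async function onCityChange(selectedCities, fetchCommunities, communityListBox) {
  const key = selectedCities.join(',');
  if (!communityCache[key]) {
    // One small request per new selection, not a full-page postback.
    communityCache[key] = await fetchCommunities(selectedCities);
  }
  communityListBox.options = communityCache[key];
}
```

The page stays put, the map stays loaded, and only a tiny list travels over the wire.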

As always, the results from the improved location search are Bookmark-able, RSS-able, and Google Earth-able.

What’s next
Well, it’s a given that at some point I’m going to have to add Virtual Earth or Google Maps integration, instead of static Yahoo Maps. If I’m going to compete with the big boys of real estate search, I gotta do maps. I’m probably also going to have to create profiles, so you can save your searches, favorite properties, favorite places and other stuff that requires server-side persistence.

What features would consumers and realtors like to see next? I’m more interested in hearing what realtors would like to see, because they are the ones who’ll be writing the check when I eventually decide to release this. I have a billion ideas for what I’m going to do, but I’d like to get some more feedback to find out which features I should implement next. Otherwise, I’ll continue to make it up as I go along…