"I am Tiger Woods"

When I was at Inman, I believe it was Dottie Herman (although I realize that Altos Research attributes the quote to Burke Smith) who said “Technology won’t replace agents, agents with technology will replace agents”. Regardless of the source, it’s a great quote! That remark struck a chord with me. Except there’s one small problem… There aren’t enough “real” technology vendors out there! Let me explain further…

OK, at one end of the Real Estate 2.0 spectrum, you have Zillow, Move, & Trulia. They use cool technology to sell advertising in the Real Estate market. Nothing wrong with that. Being a ’softie alum, perhaps I’m a bit too set in my ways to fully appreciate the size of the opportunity these fine companies are going after. After all, MS only has a 10% share of the $500 billion/year enterprise IT market. But Google probably only has a 1% share of the $3 TRILLION/year advertising market. Maybe those numbers are off, but it feels like a good Zestimate to me. Clearly there’s a lot of money to be made from the death of print media, and these guys are at the graveyard with their shovels ready. More power to them, I say.

At the other end of the Real Estate 2.0 spectrum, you have HouseValues, HomeGain & others. They try to use technology to lure in and sell leads. It’s not my cup of tea, and some people don’t like them, but there’s nothing wrong with that business model either.

But where are the companies that use technology to just sell technology? When I look at the MLS search offerings of my future competitors like Birdview, Wolfnet, Superlative, Logical Dog, and literally a cast of thousands more, I just cry and smile. The maps are non-existent or very Web 1.0-ish. RSS or KML? What’s that? Foreign language support? Is English considered a foreign language yet? Data visualization? You gotta be joking. Page speed? Maybe if you measure performance with a calendar. And I haven’t even talked about half of the things I want to see or invent in a world class MLS search tool.

Granted, my game still needs a lot more work as well. Zearch is still English only, there’s more to data visualization than pretty Zillow charts, I really have no idea how badly I scale yet (better than reply.com I hope, otherwise I know I’ll never hear the end of it), and I only support the NWMLS right now, but on the whole I’m feeling pretty optimistic about my chances on the pro tour.

Picture this scenario: here I am, John Q Homebuyer, getting my Zillow fix, Moving around the web, and being Trulia impressed with all this Real Estate 2.0 stuff, and then I click on your ad. I go to your web site, I wanna search for homes (because frankly that’s why people visit your site, unless you’re a famous blogger), and do you know what happens next? It reminds me of the guy going for a test drive in the new Volkswagen radio ads.

“This broker’s web site has 3 speeds, and this one is disappointment. Web site, honk if you suck *honk* Take me to a RealTech or Caffeinated web site.”

FYI – I’m leaving out Redfin because they are an exception to this generalization. They are a broker that has developed great technology in house and they are keeping it all to themselves (punks ;)). So most other brokers can’t really compete with them technologically speaking unless they partner with a technology vendor (like RealTech or myself).

I mean, we have all these “consumer portal” companies doing interesting work, empowering consumers, and then when I visit the broker’s or agent’s web site for the full story, it’s a total and complete letdown. There are over 1 million agents in this country, probably only a thousand of them have web sites worth visiting (I suspect half of whom are regular Rain City Guide readers), and hardly any with compelling MLS search tools. It feels like all the good software engineers involved with this industry want to sell it an ad or a lead, instead of a killer web site. Maybe the industry needed a few well funded and very talented start-ups to smack it around to finally wake up and smell the software? (sniff, aaaahh, Firefox fresh scent, yummy)

Clearly there’s a big opportunity for developing a good MLS search tool for this industry. Maybe not Zillow, Microsoft or Google sized, but it’s big enough to make me interested in going for it. I’m pretty excited at the thought of all the possibilities, personally.

This is why I cry and smile. I cry because I feel my clients’ pain. They just want a cool web site to capture leads so they can get off the advertising & lead buying treadmill, and finding a good one is just about impossible. I cry because I feel the home buyers’ pain. This stuff should be so much better than it is. Many brokers have the money and are willing to do something about it, but it just looks like the current set of vendors serving them are developing products like it’s the web circa 1999. I smile, because I’m in a position to do something about this. I feel like a Tiger. Here’s how I break things down on the links…

Jack Nicklaus is still winning most of the major tournaments these days, but he’s the one whose records I wish to break. RealTech has done some really nice work with John L Scott and CB Bain (did you guys do CatalistHomes? It looks like your work, but I don’t see your brand anywhere?). He has a few years of a head start over me, and is probably in the process of making his other clients very happy. I hope that Zearch will eventually be as well regarded as the work you’ve done.

But after Jack, I can’t see anybody else out on the course improving their game. Maybe they are all down at the clubhouse sipping some buds? Maybe they think the minefield of MLS downloading rules and methods will keep their market shares safe from technology disruptors (and to be honest, they are partly right – I wouldn’t be crazy enough to take this on if I wasn’t so convinced that I could build a much better set of web tools than most of the vendors I’ve discovered). Maybe they’ve never read Andy Grove’s “Only the Paranoid Survive”? But if this industry embraces RETS (or better yet, screw the SOAP and let me get dirty w/ the MLS’s SQL Servers), I suspect a few names on the MLS/IDX web site industry leaderboard will change.

But how can any vendor support all 900+ MLSes in this country? This is a monster challenge, even for a Tiger. We’re talking a 600 yard, Par 3 sized challenge here folks. Sorry, but even Tiger’s Nike golf equipment can’t par that hole. I suspect I’ll just refine my game on the local links until I get really good. (If Dustin would only give me the connection string to Realtor.com’s SQL cluster it would all be so much easier. ;)) Oh well, if I gotta play the game one hole at a time, that’s the way I gotta play. Just keep making pars, make a birdie here or there, no bogeys, and watch the other players fall apart like a Sunday afternoon during a major. I dunno, but it’s starting to feel like the 2nd round of the 2000 US Open at Pebble Beach to me.

I’m working out, I’m going to a swing coach, I’m sinking my putts, I’m killing balls on the driving range, and more importantly, I’m feeling a little faster, stronger, & smarter with each passing week. So all you other players, better step up your game. Tiger’s turning pro soon. Maybe not this year, maybe not next, but soon. And when he does, the game of real estate will not be the same.

Except for Jack, I wouldn’t worry about him too much. We’ve all seen the green jackets in his closet. 🙂

I know Ubertor’s got game, but I consider them more of a Michael Jordan type player. Great stuff, but he plays a different sport than we do. So Mr. Agent & Ms. Broker, are there any good MLS/IDX vendors out there whose game impresses you? (Other than Jack’s & Tiger’s of course?)

"Frank"-speak on "broads" and the mls' lame 24 hour rule

[photopress:bgrable.jpg,thumb,alignright] This broad is lonely. You know, the kind of lonely that just has to be scratched…like an itch. So she says to me, she says, Frank, I just gotta get out there and get a man, and she starts putting on her coat. I say Whoa there…hold up, Babe. You ain’t gonna get the right man that way. Go in the toilet there and fix yourself up a bit first. Pin up your hair like Betty does, and put on that fancy armored under-stuff that makes every woman look like a Betty Grable pinup at first look, till you get up real close and see the spines on ’em. That’s right. Take your time, honey. The better you look, the more successful your gonna be at this venture. Now put on that lipstick and pull down that thing over there and hike up that and stand up real tall, like you don’t need nobody! That’s better. If youse goes out lookin’ like a loser, that’s exactly what you will end up bein’ at the end of the night…a loser!

Now someone please tell me who came up with that stoopid mls rule that says a seller HAS-TA put his house up for sale within 24 hours after he signs a listing contract? Hell man, da ink isn’t even dry yet! Do yus really think an owner can get his sh-t together that fast and spif up da joint in 24 hours?! Heck…he barely has time to pull up his pants and douse his tonsils with his “Jack”. Are you guys kiddin’ me??

And what agent worth his weight in peanuts is gonna have a fancy, glossy, snappy flyer ready in 24 hours? Heck, can ya even get a sign up that fast? Who made up that rule anyways? You guys better stop drinkin’ that NAR Koolaid and get da F outta here pronto! Cause I ain’t puttin’ my house on the internet for everyone to peek at until it’s good and ready! Ya hear dat! GOOD and ready. And youse guys can go pound sand with your lame mls rule dat says udderwise. I don’t give a rat’s behind if you guys stab each other in da back or in da front or any other ways. Dat ain’t MY problem! You can’t tell me what I HAVE to do and you sure as hell can’t tell me that I ain’t got no say in it. So what that I can give you some letter saying “hold up, pal”, what about da poor slobs dat don’t know any better and have their underwear hanging over the shower curtain in the mls photo. Scratch that rule…scratch it now…cause it’s lame. Like my pal Robbie says…Lame, Lame, Lame MLS Rule! Trash it, cause it’s garbage and its stinkin’ up da place!”

I’m signing this here contract to get me the best damn agent there is and I’m sure as hell payin’ him plenty of dough! So’s he better damn well work his butt off for a whole lot more than TWENTY FOUR HOURS before HE pushes that button that sends me LIVE into the cyberworld.

I thought it was jes dose NAR Koolaid drinkin fools spoutin’ this garbage. But when I saw David Barry sayin’ his NEW Consumer Oriented Free Public mls was gonna go and copy that same stoopid lame mls rule, I said dat’s IT. I’ve HAD it. Jilly…go knock some sense into dos guys before the whole planet goes koo-koo on dat koolaid! Knock some heads together if ya has to…but don’t let the poor public be fooled into thinkin that they have to go out in public lookin’ like a LOSER! It ain’t dare right to say so…cause I say so and it just ain’t the right way to treat people and their most valuable asset. Heck, a girl takes longer than that to spruce herself up to find a guy. How da heck is a fella supposed to spruce up his whole house in 24 hours! It ain’t right…it just ain’t right.

Zillow vs. “average” agent

When I wrote my “Baby Takes a Bow” piece, which took about 30 seconds to write, I knew I was opening Pandora’s Box, and would have to back up my one liners with some extensive writing on each topic outlined therein.

My definition of Pandora’s Box is the one that attributes “the box” to a “woman’s womb” from which new life springs forth. While I do not necessarily agree with Inman’s new three part series on negating the mls offering, which started yesterday, or all of David Barry’s undertakings around the country, clearly I am not the only one trying to pry open Pandora’s Box. The box WILL be opened! Whether the DOJ or David Barry choose in the end to take the ultimate credit, truth is, it is just simply time for the box to be broken open by everyone at once.

If we all take out our respective crowbars, the box will open. Who takes the ultimate credit for having opened it is irrelevant (and clearly David Eraker and those who came before him will deserve some of that credit as well). In fact the DOJ is my best hope for getting the credit, so that the “new life that springs forth” will be on a national scale, as only the DOJ can do best.

In this quote from my most recent beginnings of a very long explanation, you will quickly see just WHO Zillow can replace, which by current accounts and statistics may be up to 90% of the industry as it exists today.

“If you stand up from the computer with the value in your hand before you go to the house, and you stand by that value after you arrive at the house, because the computer “SET” the value…you are giving the seller the equivalent of a Zillow produced valuation…which is FREE. Any agent who thinks a computer spits out a home “value” via a CMA Program, is easily replaced by Zillow.”

To some extent, those who wrote those great CMA programs, like IRIS/Lightning and Top Producer and way back to Coldwell Banker’s very first CMA software which predated them all, are responsible for agents believing that a computer can value a home.

To a greater extent, large brokers and local mls classes that mislead new agents into thinking they can “value a home” on day ONE after they receive their license, by using these programs, are even more responsible for this thinking.

When the Pareto Principle changes from an 80/20 rule to a 90/10 rule, as was told to me in Real Estate Broker Classes, with only 10% of agents being competent, then it is time. It is time for Pandora’s Box to be opened. It is time to stop that snowball from rolling down the hill, it is time to stop that train that doesn’t seem to have brakes. It is time to roll back the clock and begin again.

Contrary to Inman’s new series, we do not have to roll the clock back 35 years to the beginnings of the mls. We only have to back up to the day that buyers were supposed to become “1st class citizens”, and begin anew from that point. Because an agent who cannot value a home for a seller, cannot with any sense of credibility, value one for a buyer either.

The MLS and Creative Sales

It’s amazing the stories I hear about the self-policing that occurs on the MLS (some of it seems more like snitching, in my opinion). One of the big no-no’s is listing any reference to an open house in a listing’s marketing remarks. The reason for this rule seems clear enough; with so many of the big guys subscribing to feeds of all the listings, the MLS doesn’t want to bite the hand that feeds it. The Windermeres and John L Scotts don’t want prospective clients to go to open houses on their own, they want their agents to represent these prospects.

On the more creative side, an agent I know had her hand slapped after she tried to emulate an auction situation, using her MLS listing to attract buyers. She is also an investor, having bought the home in question on a lease-option. The previous owner did a beautiful rehab job, but ran out of money and needed out fast. She stepped in, leasing the place (the lease payment covering holding costs for the owner), with a year option to buy for what was owed by the seller. She also took a chance and put another $15K into the home to finish it out (a risk since she didn’t own the home). With her large equity position given the market value of the home, she decided to test the waters, and create a situation that would attract more offers.

First, she offered a buyer’s agent commission of 4%. Then, in her creativity, she made her first mistake. In the agent-only remarks, she explained that the 4% would be added to the highest bid price. This was also clearly stated in the auction rules posted at the property during the open houses, available to all potential buyers. Her thinking was to put walk-in buyers on the same footing as buyers with agents (there were two open houses over the course of a weekend, then bids were accepted – no time for lockbox enabled agent visits), thereby creating an apples-to-apples comparison when reviewing competing offers. In other words, the final bid price would be the net (of commission) price. If a buyer with an agent offered the highest bid at (for example) $100,000, then the price would be grossed up to $104,000, with the investor/agent/seller paying out the $4,000 commission to the agent at closing. If a walk-in buyer offered the highest bid, then no commission would be necessary (since she is the owner of the house, there’s no need to pay herself a commission…she’s getting all the profits anyhow). I can understand why she got in trouble for this. The MLS is for agents, not Joe Consumer. Agents would naturally prefer that the buyer’s agent commission is included in the final cost of the property. With this gross-up method, savvy buyers (even dumb buyers) would realize that they would have saved $4,000 had they gone directly to the open house without an agent.

The other creative step she took was to include the word ‘auction’ in the agent remarks. She listed the home at her acquisition cost, knowing that its true market value would grab the attention of agents. Her objective was not to deceive, thus the reference to an auction (all bids would be reviewed at the end of the second open house).

However, on both counts, other agents snitched her out. I think each of her creative marketing ideas went against the traditional thinking of the industry.

In trying to offer a grossed up commission (and super-sized at that!), she was bringing the bright light of transparency to the transaction. Though her intent was not to ‘out’ agents who don’t discuss compensation with their clients, this is clearly the first thing many agents thought about. The agents would publicly claim that ‘it isn’t fair’ to raise the price 4% after a final bid has been made. On the contrary, as long as this is known from the beginning, there should be no problem. If buyer’s agents openly discussed compensation with their clients, then clients would have a better chance of understanding that grossing up the offer to capture commission is not a big deal.

As for trying to market an auction, I think agents didn’t understand what the investor/agent/seller was trying to accomplish, or they were thinking of a Sotheby’s style of auction, which would be uncomfortable for agents and buyers alike. In either case, it was just strange enough to be deemed out of place on the MLS.

I’m guessing that some agents reading this are thinking, “yeah, this seller/agent should burn in hell for what she tried”. However, being an investor myself, I applaud her creative approach to trying to maximize the price she can fetch for her property, and for trying to structure the commission so that agent represented buyers and walk-ins are treated equally.

Future of the Real Estate Industry?

Hint: It is being discussed and decided this week, but not in San Francisco.

John Cook picks up this quote from Glenn Kelman of Redfin on what he will testify about when placed in front of the U.S. House of Representatives’ Subcommittee on Housing and Community Opportunity:

“I am going to say how much friction there is in the business,” said Kelman, adding that as one of the first online brokers Redfin has been “kicked and spat on” by the Multiple Listing Services in California and Washington.

Ouch! I’m sure that once he testifies, relationships with the local MLS organizations will quickly be healed. 🙂

If this topic interests you, then definitely check out John’s column because he provides some great links.

“Disguised” FSBO Market Share

Some big news happened last week in Texas which I discuss on my blog [link removed]. In a nutshell, the FTC obtained a Consent Order from the Austin Board of Realtors to eliminate a rule that treated Exclusive Agency Listings differently from Exclusive Right to Sell Listings, at least with respect to the publishing of those listings on public web sites. Rules like these have been adopted to deal with flat fee listing brokers who did nothing more than insert the listing into the MLS database. In other words, these are “disguised” FSBOs where the owner has agreed to pay some selling office commission but usually receives little or no additional help from the listing broker.

In its investigation, the FTC found that, prior to the adoption of the rule, 18% of the listings in the Austin MLS were Exclusive Agency Listings. Once the rule was adopted, the number of Exclusive Agency Listings dropped to 2.5% of the total.

I have always heard that the FSBO rate was somewhere around 10-15% nationally. Since the 18% figure does not include what I might call “pure” FSBOs where the seller basically hammers up a sign and calls it good, the actual FSBO rate in Austin (before the rule adoption) was probably greater than 20%. Is this surprising? Do you think it reflects historical numbers or is some kind of trend? Any thoughts on where the 15.5% went after the rule was adopted?

There you go again – the MLS doesn’t scale

[photopress:Reagan.jpg,thumb,alignright]Ever since Zearch, I’ve been bombarded with work to update or create MLS search web sites for various brokers & agents across the country. Because of this, I’ve had the opportunity to deal with another MLS in the Bay Area (EBRDI) and one in Central Virginia (CAARMLS). Before I begin another MLS rant (and cause the ghost of the Gipper to quip one of his more famous lines), I want to say the IT staff at both EBRDI & the NWMLS have been helpful whenever I’ve had issues, and the primary purpose of this post is to shine a light on the IT challenges that an MLS has (and the hoops that application engineers have to jump through to address them).

After working with EBRDI and the NWMLS, I can safely say the industry faces some interesting technical challenges ahead. Both MLSes have major bandwidth issues, and the download times of data from their servers can be so slow it makes me wonder if they’re using Atari 830 acoustic modems instead of network cards.

The EBRDI provides data to members via ftp downloads. They provide a zip file of text files for all the listing data (which appears to be updated twice daily), and a separate file for all the images for that day’s listings (updated nightly). You can request a DVD-R of all the images to get started, but there is no online mechanism to get all older images. This system is frustrating because if you miss a day’s worth of image downloads, there’s no way to recover other than bothering the EBRDI’s IT staff. If the zip file gets corrupted or otherwise terminated during download, you get to download the multi-megabyte monstrosity again (killing any benefit that zipping the data might have had). Furthermore, zip file compression of images offers no major benefit; the 2-3% size savings is offset by the inconvenience of dealing with large files. The nightly data file averages about 5 MB (big but manageable), but the nightly image file averages about 130 MB (a bit big for my liking considering the bandwidth constraints that the EBRDI is operating under).
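For anyone curious what consuming a feed like this actually looks like, here’s a rough sketch of the nightly pull with a blunt retry loop. The host name, file names, and credentials are made up for illustration; the point is that with no resume support, a failed or corrupted transfer means starting the whole download over.

```csharp
using System;
using System.IO;
using System.Net;

class NightlyFeedPull
{
    static void Main()
    {
        // Hypothetical host and file layout; the real MLS ftp site differs.
        string listingsUrl = "ftp://ftp.example-mls.org/nightly/listings.zip";
        string localPath = @"C:\mlsdata\listings_" +
                           DateTime.Today.ToString("yyyyMMdd") + ".zip";

        // No resume: if a transfer dies partway through, we re-download the whole thing.
        for (int attempt = 1; attempt <= 3; attempt++)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    client.Credentials = new NetworkCredential("memberId", "password");
                    client.DownloadFile(listingsUrl, localPath);
                }
                Console.WriteLine("Downloaded " + new FileInfo(localPath).Length + " bytes");
                break;
            }
            catch (WebException ex)
            {
                Console.WriteLine("Attempt " + attempt + " failed: " + ex.Message);
                if (attempt == 3) throw;
            }
        }
    }
}
```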

As much as I complain about the NWMLS, I have to admit they probably have the toughest information distribution challenge. The NWMLS is probably the busiest MLS in the country (and probably one of the largest as well). According to Alexa.com, their servers get more traffic than Redfin or John L Scott. If that wasn’t load enough, the NWMLS is the only MLS that I’m aware of that offers sold listing data [link removed]. If that wasn’t load enough, they offer access to live MLS data (via a SOAP based web service) instead of the daily downloads that the EBRDI & CAARMLS offer their members. If that wasn’t load enough, I believe they allow up to 16 or 20 photos per active listing (which seems to be more than the typical MLS supports). So, you have a database with over 30,000 active listings & 300,000 sold listings, all being consumed by over 1,000 offices and 15,000 agents (and their vendors or consultants). The NWMLS also uses F5 Networks’ BigIP products, so they are obviously attempting to address the challenges of their overloaded information infrastructure. Unfortunately, by all appearances it doesn’t seem to be enough to handle the load that brokers & their application engineers are creating.

Interestingly, the other MLS I’ve had the opportunity to deal with (the CAARMLS in Central Virginia) doesn’t appear to have a bandwidth problem. It stores its data in a manner similar to the EBRDI. However, it’s a small MLS (only 2400-ish residential listings) and I suspect the reason it doesn’t have a bandwidth problem is that it has fewer members to support and less data to distribute than the larger MLSes do. Either that, or the larger MLSes have seriously under-invested in technology infrastructure.

So what can be done to help out the large MLSes with their bandwidth woes? Here’s some wild ideas…

Provide data via DB servers. The problem is that as an application developer, you only really want the differences between your copy of the data and the MLS data. Unfortunately, providing a copy of the entire database every day is not the most efficient way of doing this. I think the NWMLS has the right idea with what is essentially a SOAP front end for their listing database. Unfortunately, writing code to talk SOAP, do a data compare, and download is a much bigger pain than writing a SQL stored proc to do the same thing or using a product like RedGate’s SQLCompare. Furthermore, SOAP is a lot more verbose than the proprietary protocols database servers use to talk to each other. Setting up security might be tricky, but modern DB servers allow you to have view, table, and column permissions, so I suspect that’s not a major problem. Perhaps a bigger problem is that every app developer probably uses a different back-end, and getting heterogeneous SQL servers talking to each other is probably as big a headache as SOAP is. Maybe using REST instead of SOAP would accomplish the same result?
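To make the “just give me the differences” idea concrete, here’s a minimal sketch of what a delta pull might look like from the application developer’s side if an MLS exposed a REST-ish endpoint. The URL, query parameter, and XML shape are all hypothetical; no MLS that I know of exposes anything like this today.

```csharp
using System;
using System.IO;
using System.Net;
using System.Xml;

class DeltaPullSketch
{
    static void Main()
    {
        // Hypothetical: ask the MLS only for listings modified since our last sync.
        // In practice this timestamp would come from your own listings table.
        DateTime lastSync = new DateTime(2006, 10, 1, 3, 0, 0);

        string url = "https://mls.example.com/listings/changes?since=" +
                     lastSync.ToString("s"); // e.g. 2006-10-01T03:00:00

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Credentials = new NetworkCredential("officeId", "password");

        using (WebResponse response = request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        {
            XmlDocument doc = new XmlDocument();
            doc.Load(stream);

            // Each <Listing> node in the (made-up) response would be upserted locally.
            foreach (XmlNode listing in doc.SelectNodes("//Listing"))
            {
                string listingNumber = listing.Attributes["ListingNumber"].Value;
                Console.WriteLine("Would upsert listing " + listingNumber);
            }
        }
    }
}
```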

Provide images as individually downloadable files (preferably over HTTP). I think HTTP would scale better than FTP for many reasons. HTTP is a less chatty protocol than FTP, so there’s a lot less back & forth data exchange between the client & server. Also, there’s a lot more tech industry investment in the ongoing Apache & IIS web server war than in improving ftp servers (and I don’t see that changing anytime soon).

Another advantage is that most modern web development frameworks have a means of easily making HTTP requests and generating dynamic images at run time. These features mean a web application could create a custom image page that downloads the image file on the fly at run-time from the MLS server and caches it on the file system when it’s first requested. Then all subsequent image requests would be fast since they are locally accessed and more importantly, the app would only download images for properties that were searched for. Since nearly all searches are restricted somehow (show all homes in Redmond under $800K, show all homes with at least 3 bedrooms, etc), and paged (show only 10, 20, etc. listings at a time), an app developer’s/broker’s servers wouldn’t download images from the MLS that nobody was looking at.
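A minimal sketch of that download-and-cache-on-first-request idea as an ASP.NET handler is below. The MLS image URL pattern, cache folder, and query string parameters are invented for the example; a real version would also need error handling, authorization checks, and some cache expiry policy.

```csharp
using System.IO;
using System.Net;
using System.Web;

// Minimal image proxy/cache sketch: serve from local disk if we already have the photo,
// otherwise fetch it once from the (hypothetical) MLS image server and keep a copy.
public class ListingImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string listingId = context.Request.QueryString["id"];     // e.g. "26123456"
        string photoNum  = context.Request.QueryString["photo"];  // e.g. "1"

        string cachePath = context.Server.MapPath(
            "~/ImageCache/" + listingId + "_" + photoNum + ".jpg");

        if (!File.Exists(cachePath))
        {
            // First request for this photo: pull it from the MLS and cache it locally.
            string mlsUrl = "http://images.mls.example.com/" +
                            listingId + "/" + photoNum + ".jpg";
            using (WebClient client = new WebClient())
            {
                client.DownloadFile(mlsUrl, cachePath);
            }
        }

        context.Response.ContentType = "image/jpeg";
        context.Response.WriteFile(cachePath);
    }
}
```

Wired up as a generic handler, the listing pages would reference images through it instead of hitting the MLS directly, so only photos someone actually views ever cross the MLS’s pipe.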

Data push instead of pull. Instead of all the brokers constantly bombarding the MLS servers, maybe the MLS could upload data to broker servers at predefined intervals and in random order. This would prevent certain brokers from being bandwidth hogs, and perhaps it might encourage brokers to share MLS data with each other (easing the MLS bandwidth crunch) and leading to my next idea.

BitTorrents? To quote a popular BitTorrent FAQ – “BitTorrent is a protocol designed for transferring files. It is peer-to-peer in nature, as users connect to each other directly to send and receive portions of the file. However, there is a central server (called a tracker) which coordinates the action of all such peers. The tracker only manages connections, it does not have any knowledge of the contents of the files being distributed, and therefore a large number of users can be supported with relatively limited tracker bandwidth. The key philosophy of BitTorrent is that users should upload (transmit outbound) at the same time they are downloading (receiving inbound.) In this manner, network bandwidth is utilized as efficiently as possible. BitTorrent is designed to work better as the number of people interested in a certain file increases, in contrast to other file transfer protocols.”

Obviously, MLS data distribution matches this usage pattern. The trick would be getting brokers to agree to it and doing it in a way that’s secure enough to prevent unauthorized people from getting at the data. At any rate, the current way of distributing data doesn’t scale. As the public’s and industry’s appetite for web access to MLS data grows, and as MLSes across the country merge and consolidate, this problem is only going to get worse. If you ran a large MLS, what would you try (other than writing big checks for more hardware)?

Our Home is Now Listed!

And despite the fact that we may not have Ardell’s magic open house touch, we are showing it on Sunday between 12 and 3PM as described in the open house listing on Trumba.

Update:

I also created an adword campaign around our home. If you see the following ad while surfing the web, don’t click on it because it costs me money and just takes you to this blog post! 🙂
[photopress:beautiful_ballard_home.jpg,thumb,centered]

Funny side note… I decided to try out Google’s option to target ads at specific websites and noticed that Zillow was on the list of real estate related sites. However, in order to see the ad for my home on Zillow, I had to disable the one-two punch of Adblock and Filter.G on my Firefox browser. With these two extensions disabled, so many websites that I visit on a regular basis looked so much uglier! It was like traveling the web naked! If you’re not using the Firefox browser with these two extensions, then you are almost definitely surfing a web that looks much more annoying than mine!

Lame MLS Data Again!

[photopress:mea_culpa.jpg,thumb,alignright]Looks like I need to get down on bended knee and beg Robbie’s forgiveness, pounding my chest and saying “Mea Culpa!” Robbie, I try and try to give you accurate data, honestly I do. But it just is not always within my power to do so. I am totally stumped on this one.

If anyone out there can help me get the accurate data for Robbie, PLEASE point me in the right direction.

I listed a property in Mount Baker. Checked the tax record and it said “Year built 1900”, so I entered that “data”. This is a “man in the bushes” listing, so I already know who is likely going to buy it. But still, for Robbie’s sake, I would like the data to be accurate. Out of curiosity, I wondered if there were any houses older than this “Grand Olde Dame” of Mount Baker.[photopress:mt_baker.jpg,thumb,alignleft] I did a General Query in the tax database for homes built between 1800 and 1900, and guess what?! Anything OLDER than 1900, shows AS 1900 in the tax records!

So here I am realizing that I put “Year Built 1900” in the mls, and maybe it is really older than that. Of course my first thought is about Robbie and his “Cries Against Lame Data”. Tell me please, what’s a girl to do when the tax records won’t take me back further? So I contact the Title Company and they say they can only do what I did, which gives them 1900 also. They then go a step further, and now we do know that “the original plat declaration” for that section of Seattle was in 1888. So maybe that tells us that the house was built in that 12 year window, between 1888 and 1900.

Could “the original plat declaration of 1888” be filed AFTER the house was built there? Enquiring minds want to know! Sorry Robbie, I aim to please; and yet again disappoint. MLS has no way to put a “date range” for year built, or “older than” 1900. So 1900 it stays. Though I did try to account for that in the remarks section.

Am I forgiven, or do I end up on “Robbie’s Lame List” with a “lazy agent” dunce cap on my head? Oh well, “Wednesday’s Child is full of Woe”, as the Nursery Rhyme goes. Some days I wish I were born on a Sunday.

SELECT * FROM MLS WHERE Remarks = ‘Whoa’

I thought I’d take a moment to reflect on how Rain City’s favorite MLS Search is implemented. I’m a little tired of thinking in computer languages (mostly T-SQL, C# and Javascript), so I figured I’d blog a bit in Geek English for a little while before I hit the compiler again.

[photopress:matrix1_alt.jpg,full,alignright]

I’m always interested in how web sites & computer software work under the covers, so I thought I’d share some of the more interesting points about how I’ve implemented “Zearch” to date for the “geekier” folks in the blogosphere.

It all began way back in the fall of 2005 shortly after I got my first MLS feed. At the time, Microsoft’s asp.net 2.0 platform was still in beta. However, after learning what Microsoft’s next generation web development tools were going to do (and seeing what Google Maps and Microsoft’s Virtual Earth teams were doing), I saw a great unrealized potential in MLS search tools and decided to do something about it.

Anyway, it’s all built on top of asp.net 2.0 and MS SQL Server 2000 (yeah, I know I’m old school). One of the first things I did was combine all the property types into a VIEW and create a dynamic SQL query when you search for properties. Some search tools only let you search for residential properties or condominiums at one time (which I thought was lame). I originally tried to implement stuff doing a bunch of UNIONs, but keeping track of the schema variations for the different property types eventually drove me nuts, so I encapsulated all that crud into a VIEW.
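To illustrate the approach (not the actual Zearch code), the search page side of it looks roughly like this: one parameterized query built up against a hypothetical AllListings view, with the view hiding the per-property-type schema differences. The view and column names here are made up for the sketch.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Text;

// Sketch: build one parameterized dynamic query against a hypothetical AllListings VIEW
// that already unions residential, condo, etc. into a common set of columns.
public class ListingSearch
{
    public static DataTable Search(string connectionString,
                                   string city, decimal? maxPrice, int? minBeds)
    {
        StringBuilder sql = new StringBuilder(
            "SELECT ListingNumber, Address, City, ListPrice, Bedrooms " +
            "FROM AllListings WHERE 1 = 1");
        List<SqlParameter> parameters = new List<SqlParameter>();

        if (!string.IsNullOrEmpty(city))
        {
            sql.Append(" AND City = @City");
            parameters.Add(new SqlParameter("@City", city));
        }
        if (maxPrice.HasValue)
        {
            sql.Append(" AND ListPrice <= @MaxPrice");
            parameters.Add(new SqlParameter("@MaxPrice", maxPrice.Value));
        }
        if (minBeds.HasValue)
        {
            sql.Append(" AND Bedrooms >= @MinBeds");
            parameters.Add(new SqlParameter("@MinBeds", minBeds.Value));
        }

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql.ToString(), connection))
        {
            command.Parameters.AddRange(parameters.ToArray());
            DataTable results = new DataTable();
            new SqlDataAdapter(command).Fill(results); // opens/closes the connection itself
            return results;
        }
    }
}
```

The nice part of pushing the UNION mess into the view is that the query builder above never has to care which property type a row came from.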

I also find it a little ironic that I’m not the only one who found the MLS schema differences a PITA to deal with. I’m glad the various MLS software vendors and the CRT are working toward a common industry schema (aka RETS), so we application developers can focus on the real problem (developing compelling & useful software), instead of remembering that the ld column in one table is really the list_date column in another table.

Another interesting thing I do on the back end is geocode every listing after a data download. The main reason is that I don’t trust the MLS data, and their bogus geo-coding would make my app look bad. I also knew when I started that I’d eventually do maps, so as soon as a new listing hits my database, it gets more accurately/correctly geo-coded. In case you’re wondering if I’m screen scraping w/ Perl or something else, it’s all done with T-SQL stored procedures. (Well, technically it’s a proc that calls the MSXML2.ServerXMLHTTP COM object to issue an HTTP request against a geocoding web service, and then uses OPENXML on the response’s XML to get the latitude & longitude.)
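The real version is a T-SQL proc, but the flow is easier to follow in C#: send the listing address to a geocoding web service, parse the XML that comes back, and save the coordinates. The service URL and response format below are placeholders, not any real geocoder’s API.

```csharp
using System;
using System.Globalization;
using System.Net;
using System.Xml;

// Sketch of the geocode-on-import step (the real thing is a T-SQL proc calling
// MSXML2.ServerXMLHTTP; this is the same idea in C# with a made-up service).
public class ListingGeocoder
{
    public static bool TryGeocode(string address, string city, string zip,
                                  out double latitude, out double longitude)
    {
        latitude = 0;
        longitude = 0;

        string url = "http://geocoder.example.com/geocode?address=" +
                     Uri.EscapeDataString(address + ", " + city + " " + zip);

        using (WebClient client = new WebClient())
        {
            XmlDocument doc = new XmlDocument();
            doc.LoadXml(client.DownloadString(url));

            // Hypothetical response: <result><lat>47.66</lat><lng>-122.38</lng></result>
            XmlNode lat = doc.SelectSingleNode("//lat");
            XmlNode lng = doc.SelectSingleNode("//lng");
            if (lat == null || lng == null)
                return false;

            latitude = double.Parse(lat.InnerText, CultureInfo.InvariantCulture);
            longitude = double.Parse(lng.InnerText, CultureInfo.InvariantCulture);
            return true;
        }
    }
}
```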

As you might have guessed, there are also stored procedures and functions to get the distance between two points, do a radius search, and other stuff of that ilk. Fortunately, all that stuff can easily be found using your favorite search engine, so you don’t need to know how all the math in the law of cosines works (you just need to know of it).
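For the curious, the “law of cosines” bit boils down to the standard great-circle distance formula. Here’s a small sketch of it; the Earth radius is an approximation, and a real radius search would typically pre-filter with a bounding box before computing exact distances.

```csharp
using System;

public static class GeoMath
{
    const double EarthRadiusMiles = 3959.0; // rough average Earth radius

    // Great-circle distance via the spherical law of cosines:
    // d = R * acos(sin(lat1)sin(lat2) + cos(lat1)cos(lat2)cos(lon2 - lon1))
    public static double DistanceMiles(double lat1, double lon1,
                                       double lat2, double lon2)
    {
        double lat1Rad = lat1 * Math.PI / 180.0;
        double lat2Rad = lat2 * Math.PI / 180.0;
        double deltaLonRad = (lon2 - lon1) * Math.PI / 180.0;

        double cosAngle = Math.Sin(lat1Rad) * Math.Sin(lat2Rad) +
                          Math.Cos(lat1Rad) * Math.Cos(lat2Rad) * Math.Cos(deltaLonRad);

        // Guard against floating point drift pushing the value outside [-1, 1].
        cosAngle = Math.Max(-1.0, Math.Min(1.0, cosAngle));

        return EarthRadiusMiles * Math.Acos(cosAngle);
    }
}
```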

Well, that’s it for the back end. Next time I’ll put on my Web Developer hat and talk about the front end.


Did you know: