Go Ahead, Make My Day


You know, some days I can really relate to Inspector Harry Callahan. Some days, it feels like being an IDX vendor is a job so dirty and so thankless that only Dirty Harry could fully appreciate it.

Recently, I’ve heard that the NWMLS decided to enact a few more rule changes. Needless to say, I’m all broken up about the new NWMLS rules. The good news is that there will no longer be a 3 download agreement limit. This should allow members to more easily work with multiple vendors, and perhaps better allow members to easily find cost-effective solutions to their IT problems. I think it’s a good idea because it could create more demand for the services I can provide.

The bad news is that starting in October, the NWMLS will charge each entity downloading the IDX data (i.e. the consultant or the broker for an in-house data feed) $30 per month, per agreement. For example, if a vendor has download agreements with offices A, B, and C, then the vendor will be charged $90 per month. Needless to say, this new rule will seriously hinder your vendor’s ability to inexpensively host web sites or otherwise develop applications with NWMLS listings on them.

Sometimes, I’ve got to wonder what the jive turkeys at the NWMLS are thinking. So now I either have to eat an unwanted (and probably unnecessary) cost or I have to pass the increase on to my customers? Neither scenario really appeals to me (and probably won’t appeal to my customers either). I’d rather increase my costs by buying more servers, going to Inman SF Connect, buying iPhone app development tools & books, or anything else that would ultimately improve end-user satisfaction with the applications I build. But now I have to pay a tax for merely trying to serve my clients? Gee, it isn’t like developing an Evernet XML download is already as much fun as doing my taxes.

My clients are hard-working real estate professionals; they are not professional software engineers. They know about as much about creating Zillow XML feeds or developing real-estate-based Google Maps mash-ups as I know about selling a home with a troublesome neighbor, or whether a property next to a graveyard is worth less because it’s creepy or more because it’s quiet. Unfortunately, the nature of the world today requires real estate professionals to partner with vendors and/or consultants, because real estate consumers increasingly demand high-tech services from their agents & brokers, and you can’t provide that service without high-tech experts working on your behalf.

I’m not opposed to higher taxes if I know it’s going for a good cause. But what is this extra $30/month per agreement going to buy me or my clients? Is the NWMLS going to buy faster servers? Hire more IDX support staff? Throw a big party and spend the money on booze and strippers? Is the NWMLS running in the red and needs a bailout? Seriously, I’d like to know what I’m about to pay for.

Also, wouldn’t it make more sense to charge per vendor instead of per agreement? A vendor needs the same amount of NWMLS IT resources regardless of whether they serve one member or ten. Typically, a vendor downloads the NWMLS data only once, and uses the same copy of the NWMLS database for all their clients. It’s not like a vendor with 10 clients incurs 10 times the CPU & bandwidth costs that a smaller vendor does.
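To put numbers on the difference, here’s a quick back-of-envelope sketch in Python (the $30 figure is from the announcement; the idea of a flat per-vendor fee is purely my own hypothetical):

```python
# Compare the announced per-agreement fee against a hypothetical flat
# per-vendor fee, for a vendor serving multiple member offices.

PER_AGREEMENT_FEE = 30  # dollars/month per the new NWMLS rule

def monthly_fee_per_agreement(num_agreements):
    """Fee under the new rule: $30/month for every download agreement,
    even though the vendor downloads and stores the data only once."""
    return PER_AGREEMENT_FEE * num_agreements

# A vendor with agreements for offices A, B, and C pays $90/month;
# a hypothetical per-vendor fee would stay at $30 no matter how many
# members the vendor serves.
print(monthly_fee_per_agreement(3))  # 90
```

The point of the sketch: the vendor’s NWMLS-side costs don’t scale with client count, but under the new rule the fee does.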

Allowing multiple feeds per broker could encourage more competition between vendors, but increasing vendor costs certainly won’t make things cheaper for members in the long run.

I can see a future phone call from the NWMLS enforcement division going something like this…

I know what you’re thinking, punk. You’re thinking “Did I sign six download agreements or only five?” Well to tell you the truth, in all this excitement I kind of lost track myself. But being as this is the NWMLS, the most powerful MLS in the greater Puget Sound area, and could blow your web site clean off, you’ve got to ask yourself one question: Do I feel lucky? Well, do ya, punk?

Has anybody else heard any details on these new policy changes? Do you know what the new IDX feed fees are for? Do you think the repeal of the download rule will help you? Do other MLS’s do this kind of thing? Why do I feel like I forgot my fortune cookie and it says I’m {bleep} out of luck?

Twitter: Is it for Twits or Twitteratti?

A week ago, I decided to finally sign up for a Twitter account, probably becoming the last person in Seattle to join. I’m still trying to figure out if it’s a revolutionary new communication medium or merely the CB radio of the early 21st century. I think one MSNBC writer summed it up quite nicely when she stated Twitter is the Snuggie of social networking. I’m not sure if Twitter is truly useful, but there is no denying it’s the hot thing at the moment.

After all, the Washington State Department of Transportation has multiple Twitter accounts to broadcast updates on mountain passes and Seattle-area traffic events. Many of the local TV news networks & anchors have Twitter accounts (Jenni Hogan, Bill Wixey, Jesse Jones, & KOMO News are all there). If that wasn’t enough chatter, even the U.S. President is on Twitter (although he doesn’t tweet now that he’s a President instead of a candidate). Heck, even Senator John McCain is on Twitter. I think Twitter hype hit an all-time high this week, when Google’s CEO declared Twitter a ‘poor man’s email system’ and the geeky gurus in Silicon Alley seemed ready to anoint Twitter the new Google killer. Ironically, Twitter has proven itself a great tool for breaking news about Gmail outages and downtime.

It’s all very interesting, but is it useful? Well, since I’ve been on Twitter, I’ve discovered lots of interesting blog posts & news articles on topics of interest. I’ve learned that tinyurl.com is bloated compared to tr.im. I’ve even played with the Twitter APIs and tweeted myself. But I’m still perplexed about the best way to use this new social networking tool in my arsenal.

For example, would potential home buyers / sellers prefer to get updates on market changes via tweets instead of e-mail messages, text messages, or RSS feeds? If so, should you be sending direct messages to your clients (which seems like a poor man’s e-mail system to me)? I suppose one could update their status instead, but if one updated it every time a new listing hit the market, one’s followers would probably get follower fatigue.
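If you did push new listings into a status stream, you’d at least want to respect the 140-character ceiling. A minimal sketch in Python (the listing fields and the tr.im-style short URL are made-up examples, not a real feed):

```python
TWEET_LIMIT = 140  # Twitter's character ceiling

def listing_tweet(address, price, short_url):
    """Build a status update for a new listing, truncating the address
    if needed so the whole tweet stays within 140 characters."""
    prefix = "New listing: "
    tail = " for ${:,} {}".format(price, short_url)
    room = TWEET_LIMIT - len(prefix) - len(tail)
    return prefix + address[:room] + tail

tweet = listing_tweet("123 Example Ave NE, Redmond, WA", 450000, "http://tr.im/abc123")
print(tweet)
print(len(tweet) <= TWEET_LIMIT)  # True
```

Even with a helper like this, the follower-fatigue problem remains: the constraint is the reader’s attention, not the character count.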

Suppose you have multiple clients who all want the same information; should you create an account that they all follow instead of sending direct messages to each? What if you have multiple clients that want different information; should you create multiple Twitter accounts, each of which publishes a certain type of information (say, homes in Redmond, Medina mansions, or condos in Renton)?

Of course, if that becomes popular, then Twitter account names may become as valuable and as scarce as internet domain names are today. (BTW – SeattleHomes is already taken, although it doesn’t have any followers yet.) Or perhaps everybody will use URL-shortening services like is.gd instead of domain names, and SEO & the names/brands of the actual Twitter accounts won’t matter?

I’m not sure I’ll shake my fist at Twitter like Jon Stewart did, but I can’t help but wonder if micro-blogging will beget a generation of people who can only communicate in phrases of 140 characters or fewer. I’m already growing nostalgic for thoughtful articles written by people in the news industry. Maybe I just need to read more NikNik & Tyr until I get it?

Perhaps Twitter is best used to convey the daily minutiae of our digital lives to interested parties and shouldn’t be taken seriously? In any event, I’m enjoying my time tweeting (or is it twittering?) like everybody else apparently is.

OpenSearch is beyond cool – it’s the new cold

I was reading Redfin’s Developer Blog and the IE blog a few months ago and I got this desire to write my own OpenSearch provider. OpenSearch was originally created by A9.com (an Amazon.com company) and was primarily designed as a way for web developers to publish search results in a standard and accessible format. This turns out to be a good idea because different types of content require different types of search engines. The best search engine for a particular type of content is frequently the search engine written by the people that know the content the best. Google is great at searching unstructured content on the internet, but when it comes to structured search on a single web site there are much better options (Endeca, FAST, Autonomy, Solr, my favorite SQL database, etc). The other benefit of OpenSearch providers is that it shifts the balance of power away from Google and back toward web browser vendors & web site developers.

Both of the major web browsers support the OpenSearch Referrer extension. IE 7+, Firefox 2+ & Chrome allow you to add search engines to your browser without leaving the web page. The best place to get started is with the browser vendors themselves. You can add search providers from Microsoft’s site or from Firefox’s add-ons site. In the interest of full disclosure, Opera allows you to add search engines manually, and Safari currently does not support this feature in any form (unless you count using vi to edit the Safari executable or changing your OS’s hosts file as support, which I do not recommend).

Anyway, our developer friends at Redfin wrote a blog post about their OpenSearch provider on their dev blog some time ago. Of course, they took the easy way out by not developing an OpenSearch Suggestions extension (slackers). I decided that a search provider without suggestion support is lame, so I took a stab at creating one. What inspired me was that the IE 8 team blogged about their new Visual Search feature (which embraces & extends the OpenSearch suggestions work that Firefox pioneered), so I could leverage the work to improve the search experience for both IE 8 & Firefox 2+ users. (And the satisfaction of having a cool feature that Redfin & Estately haven’t implemented yet was probably another factor.)

This functionality is typically exposed to users via the search bar next to the address bar in your web browser. So in your page markup, you’ll add something like this to tell the browser that your web site has a search service.

<link title="RPA Real Estate Search" type="application/opensearchdescription+xml" rel="search" href="http://www.seattlehouses.com/Feeds/OpenSearch.ashx"/>

The above element points to your site’s OpenSearch Description XML file, which describes your search service in a way the browser can understand. When you visit RPA’s site, the browser will read RPA’s OpenSearch Description file located here and unobtrusively let you add the site’s search providers.
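For reference, the description file itself is a small XML document along these lines. This is a minimal sketch following the OpenSearch 1.1 spec; the search page and suggestions endpoint URLs are made-up placeholders, not RPA’s actual ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>RPA Real Estate Search</ShortName>
  <Description>Search listings by city, school district, or neighborhood</Description>
  <!-- The {searchTerms} token is replaced with whatever the user types -->
  <Url type="text/html"
       template="http://www.seattlehouses.com/Search.aspx?q={searchTerms}"/>
  <!-- Optional: a suggestions endpoint enables type-ahead in the search bar -->
  <Url type="application/x-suggestions+json"
       template="http://www.seattlehouses.com/Feeds/Suggest.ashx?q={searchTerms}"/>
</OpenSearchDescription>
```

The first Url element is all a basic provider needs; the second is what turns it into a suggestions provider.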

Assuming everything is working correctly, the user should be able to visit RPA’s web site and click on the browser’s search bar to add our search provider like so… (IE’s screen captures are on the left, Firefox’s on the right).

I’ve also added a button in RPA’s search bar (see above right) in case site visitors don’t discover our search provider via the browser (I suspect most users would miss it otherwise).

After you’ve registered RPA’s search provider with your web browser, you can select it and just start typing. Since I’ve implemented a suggestions service, it will auto-complete cities, school districts & neighborhoods as you type them (didn’t I say this was cool?). I should note that although IE 7 & Chrome support OpenSearch, only IE 8 and Firefox currently support suggestions providers. Anyway, if you wanted to look for listings in Bellevue, here’s what it currently looks like.
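Under the hood, the plain-text suggestions format is just a JSON array: the query, followed by a list of completions. Here’s a sketch of what a suggestions endpoint computes (Python; the place names are illustrative samples, not RPA’s real data, and a real service would query a database instead of a list):

```python
import json

# Hypothetical sample of searchable place names (cities, school
# districts & neighborhoods).
PLACES = ["Bellevue", "Bellingham", "Redmond", "Renton", "Riverview"]

def suggestions(query):
    """Return the OpenSearch suggestions payload for a typed prefix:
    a JSON array of [query, [completions...]]."""
    matches = [p for p in PLACES if p.lower().startswith(query.lower())]
    return json.dumps([query, matches])

print(suggestions("bel"))  # ["bel", ["Bellevue", "Bellingham"]]
```

The browser calls this endpoint as the user types and renders the returned list under the search bar.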

As you’ll notice, IE 8 & Firefox 3 display suggestions differently on RPA’s site. This is intentional, because IE 8 supports a newer version of the OpenSearch standards (Microsoft calls it Visual Search) and I designed RPA’s search provider to exploit this fact. In Firefox, the browser can only handle plain-text suggestions, which can lead to ambiguous searches. For example, let’s say you search for Riverview. Riverview is both a neighborhood in Kent and a school district in Carnation / Duvall, so in Firefox there is no way for the user to tell the web site which of the two they meant when they typed in Riverview. I suppose one could create a “Did you mean” results page for cases like this, but I think that somewhat defeats the purpose of having suggestions support.

However, in IE 8, if a term has multiple contexts, the search provider can display them all and the user can select the one they meant. Also in IE 8, the search provider can display thumbnails next to the suggestions, which further helps the user quickly find what they are looking for. Although I haven’t implemented that feature yet (mostly because I wasn’t sure what picture to put up there for search terms that return multiple results), other web sites have. For example, if you wanted to buy a movie from Amazon or learn more about our 16th president from Wikipedia, the IE 8 search provider experience looks like this…

As the Redfin developers stated, implementing the OpenSearch Referrer extension is surprisingly easy (so I think users will soon request it from all web sites once the word gets out). The OpenSearch Suggestions extension is more difficult to implement, because every single keystroke is essentially a REST web service call. If you aren’t careful, you could bring your web server to its knees real quick. However, given all the AJAX map-based tricks today’s real estate web sites perform, this isn’t anything a professional software engineer can’t handle.
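One cheap way to keep keystroke traffic from hammering your database is to cache recent prefixes. A sketch, with Python’s functools.lru_cache standing in for whatever caching layer your stack actually provides (the place names and the call counter are illustrative):

```python
from functools import lru_cache

CALLS = {"db": 0}  # crude counter showing how often we hit the "database"

@lru_cache(maxsize=1024)
def suggest(prefix):
    """Look up completions for a prefix, caching results so repeated
    keystrokes (and backspaces) don't re-query the database."""
    CALLS["db"] += 1
    places = ["Bellevue", "Redmond", "Renton"]  # stand-in for a real query
    return tuple(p for p in places if p.lower().startswith(prefix.lower()))

suggest("re"); suggest("re"); suggest("re")
print(CALLS["db"])  # 1 -- only the first keystroke hit the database
```

Since popular prefixes repeat constantly across users, even a small cache absorbs most of the keystroke load.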

Call me crazy, but I think OpenSearch providers are going to become bigger than RSS feeds over the next year. If IE 8’s forthcoming release doesn’t launch them into the mainstream, I think future releases of Firefox & Chrome will improve upon IE 8’s good ideas. Maybe you should think of it as browser favorites on steroids? If search is sticky, then OpenSearch is superglue and duct tape. If Firefox’s suggestions support was the tip of the iceberg, then IE 8’s implementation is cooler than Barrow, Alaska. The future of OpenSearch looks bright, even if it’s cold outside.

Windermere’s Web Site Strikes Back

I’ve been way too busy at my current day job during the past year to play real estate mash up games at the level Galen has been playing at. However, it appears Windermere has decided to up their game and yesterday they released an improved & simplified property search feature on their web site.

On the plus side, I like the improved site’s ability to show multiple photos of listings alongside the Virtual Earth map-based interface. It addresses one of my persistent complaints about most map-based real estate search sites. I also like how they embraced what appears to be a trend of starting a property search with a textbox for a city name (à la Redfin & Estately) instead of a byzantine array of list boxes & check boxes.

On the minus side, the site only showed me properties when my search returned between 1 and 100 matches. I hate limits, especially small ones. I have a big monitor and a pretty fast net connection. My hardware could handle a thousand pushpins on the map if you let it. To channel Jerry Maguire – show me the listings! I have Windermere’s competitors open in other browser tabs – John L Scott’s limit is 300 (good), Redfin’s limit is 500 (better), and Estately shows me 100 at a time, but with no upper limit (I like the no-upper-limit part). I also missed the wide array of features & data that I’ve come to expect from Redfin or Estately. However, given that Windermere’s design priorities for this release were simplicity rather than power & flexibility, I can’t fault them too much for accomplishing their goals.

In any event, if you write real estate web apps for fun and/or profit, you owe it to yourself to read the Windermere Tech Blog. If you merely use real estate web apps, you should check out the new Windermere.com.

Goodbye Yellow Brick Road

The yellow brick road heading into the Emerald City has finally lost its shine and turned to stone this past week. If the combined dreariness of fall in the Northwest, WaMu’s funeral, and the Seahawks’ SLOOOOW start wasn’t enough to make us ask for Lexapro during our next doctor’s visit, the latest batch of bad news certainly is.

No sooner do the venture capitalists sound alarm bells and implore startup CEOs to save cash, slash costs, & stay alive than two of Seattle’s Real Estate 2.0 giants, Redfin & Zillow, announce workforce reductions.

Zillow announced a 25% cut on Friday, while Redfin announced that it had laid off 20% of its employees last Monday. Granted, it wasn’t like things were much better in the Real Estate 1.0 world. My NWMLS database’s member table has about 3,000 fewer records than a backup from last year did. But it’s another sign that everybody expects a long & cold winter ahead.

If you were one of the few that got axed, I wish you luck finding your future life beyond the yellow brick road.

Another one bites the dust…

It’s pretty amazing that in the span of just under 3 weeks…

  • Fannie Mae & Freddie Mac were seized by the government
  • Lehman Brothers went into bankruptcy
  • Bank of America bought Merrill Lynch before it could go into bankruptcy
  • The Federal Reserve gave AIG an $85 billion loan
  • President Bush sought a $700 billion bailout
  • Goldman Sachs & Morgan Stanley turned into regulated commercial banks
  • Warren Buffett bought $5 billion of Goldman Sachs stock
  • Washington Mutual was seized by the government & sold to JP Morgan Chase

I’m just in a state of shock and near disbelief witnessing the carnage unfold on Wall Street so fast.

    

Attack of the Killer Assessments

It was a warm & lovely summer evening… Our hapless hero goes through his nightly ritual of sorting the junk mail from the bills when he stumbles upon his annual “Official property value notice” postcard from the King County Assessor.

Before I actually looked at the card, I thought this shouldn’t be too bad. The local real estate market has cooled down a lot in the past year. My appraised value should be flat (maybe even lower). Zillow thinks my house’s value has fallen by about 10% this past year. Cyberhomes thinks it’s fallen by about 9%. Eppraisal & Realtor.com don’t give me a historical chart, but their value ranges are realistic.

So I gaze upon my white post card of doom and see the following numbers…

APPRAISED VALUE    OLD VALUE    NEW VALUE
LAND                 123,000      230,000
BLDGS, ETC           413,000      360,000
TOTAL                536,000      590,000

I then think, WTF? Why in the world has my land value gone up nearly 90%? Why is my total property value 10% higher than last year, despite the fact we are in a down market? Is the assessor catching up to the market? Did the assessor really blow it this badly in years past? Is this a work of comedy & horror to rival the cult classic of good garden vegetables gone bad?

So, I call the King County Assessor’s office, and they explain to me that the market sells a property as one piece, but the assessor must value the land as if it were vacant. After the land value is determined, they determine the total value of the property. Then the land’s worth is subtracted from the total, and the remainder becomes the value of the house. They tell me where to view the area report for the Issaquah Highlands if I want to find out more about how they determined my property’s value.
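Applying that method to the numbers on my postcard checks out arithmetically, even if the land figure itself remains a mystery. A quick sanity check in Python (values straight from the card):

```python
# Numbers from the "Official property value notice" postcard.
old_land, new_land = 123_000, 230_000
old_total, new_total = 536_000, 590_000

# The assessor values the land as if vacant, then values the whole
# property; the building is whatever is left over.
new_building = new_total - new_land
print(new_building)  # 360000 -- matches the BLDGS, ETC line on the card

land_increase = (new_land - old_land) / old_land
total_increase = (new_total - old_total) / old_total
print(f"land up {land_increase:.0%}, total up {total_increase:.0%}")
# land up 87%, total up 10%
```

So the card is internally consistent; the open question is where that 87% land jump came from in the first place.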

I read the report and discover that the base land value of a single family home in the Issaquah Highlands is $240,000 and that the appraised land value for Area 75 is about 56.7% higher than it was last year. OK, but that still doesn’t explain why my land value is nearly 90% higher than last year. Unless weeds are considered a land improvement or the definition of a square foot has changed in the past year, I still have no idea how they came up with that figure.

I usually read the Seattle Times, not the Seattle PI, so I didn’t see this coming! However, it’s nice to know I’m not the only one confused about the crazy assessments this year. I haven’t decided whether I’m going to get out my pitchfork and storm the assessor’s office yet, but I do feel the need to understand how they came up with their numbers. I’m sure it doesn’t help that Probability & Statistics for Engineers wasn’t among the classes that improved my GPA when I was going to Cal Poly.

And if any program managers from Zillow are reading this blog post – there has to be a cool new feature idea in this experience somewhere. Your web site is very useful for helping me buy or sell a home, but I really have no idea if land values are what the county says they are. Besides, I pay property taxes twice a year, but I’ve only sold a home once in the past 10 years. Every time somebody’s assessment changes, you could get more site traffic. Why can’t generating a Z-assessment petition be as easy as getting a Z-estimate? Just saying, there’s an opportunity here…

Geekier than Geek Estate & Sweeter than Sweet Digs

Sometimes, you find something in your own back yard that’s an unexpected & pleasant surprise. Like that hole-in-the-wall teriyaki restaurant right by the office, I recently stumbled upon Redfin’s Developer Blog. And since that day of first discovery, I’ve come back often, yearning for more.

I just wanted to thank the engineers at Redfin for blogging about their day-to-day life as web software engineers. As a fellow software engineer, I always like knowing “how they did that”, “what are they up to now”, or even “WTF were they thinking” (just kidding on that last one, guys).

Even though I tend to prefer SQL Server for my RE.net apps (I freely admit that I am biased), I really enjoyed their MySQL to Postgres & Elephant vs Dolphin posts (perhaps I have a database fetish?). I also learned something new & valuable from their CSS Sprites + Firefox Content Preferences = Site Go Boom post. Even folks without software engineering degrees would probably enjoy their How to search Redfin directly from IE and Firefox & Syndicate Redfin Listings in WordPress posts.

Anyway, if you’re developing a RE.net web site, (or even if you aren’t), I think their developers blog feed belongs in your feed reader. Then again, I’m biased.

PS – I can’t believe you guys don’t have Coding Horror on your blogs you like list yet.

Flying under the radar with the stealthy SecondSpace

In Seattle, the real estate technology scene is pretty crowded. There’s the big 3: Zillow, Redfin, HouseValues. And then, there’s the cool 3: ActiveRain, Estately, and RealTech. Well, there’s another company in town, which will soon be joining the party.

Bellevue-based SecondSpace was founded by Classmates.com executive Anil Pereira and former Microsoftie Alok Sinha, and landed $6.6 million in venture funding from Ignition Partners over a year ago. Alok (their CTO) & Delane Hewett (their software architect) both had stints on the MSN HomeAdvisor team (back in the Web 1.0 days), so they know the internet real estate space better than most newcomers.

Pat Kitano of Transparent RE talked about them 6 months ago, when they came out of stealth mode and started flying under the radar. The most interesting thing about the company’s business plan is that they are attacking what appears to be a small vertical niche. However, one does not talk venture capitalists into writing big checks by thinking small. Their sites, ResortScape.com and LandWatch.com, currently target consumers looking for vacation homes and vacant land. In the future, they’ll probably target time shares, vacation homes in foreign countries, non-regional visitors, and other second-home ownership opportunities with additional sites targeted at those niches.

Technologically speaking, they have some very compelling technology under development and a very talented technical staff. On their blog, they’ve talked about using Solr & Lucene as the basis for their search engine, which should give them a near-term advantage until somebody does the same thing or writes a check for Endeca. The neural-network-based learning they employ should help visitors find interesting properties more easily (think of a Google-like search experience tuned for real estate), and it allows them to distribute more qualified traffic to their customers (brokers, developers, etc.) than traditional means would allow. They also have even more interesting features on the drawing board that I’m probably not at liberty to discuss, but I can say their job posting on Craigslist drops some big hints.

However, given that they only have 70,000 listings at the moment, it’s difficult to fully appreciate the impressive technical infrastructure they are building until they have more data to test it with. It’s kind of like test driving a Corvette on a short, pothole-filled road. You just know it performs better than the conditions will allow. The problem is that until the test track improves, you don’t really know how much faster the car really is.

The real question is: is there a market for a second-home or vacation-home real estate web portal when the first-home real estate market is struggling? And is that opportunity worth the millions Ignition Partners is investing? For comparison’s sake, a typical NWMLS IDX web site has about 56,000 listings right now, and popular real estate blogs currently have a larger reach than LandWatch.com (their largest site). Even with hockey stick growth of 10,000 new listings a month, it’ll be nearly another 8 years before they hit the million mark (which I think is the magic number of listings you need to be taken seriously if you have big aspirations). I think the only people that read their blog are their employees, their VCs, and I. They need a LOT more listings and a LOT more traffic before they are taken seriously by the general population. That hockey stick growth had better turn exponential, or they had better have very patient investors.
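The back-of-envelope math behind that estimate (Python; the one-million threshold is my own yardstick from above, and the linear 10,000-a-month growth is the optimistic assumption, not their actual trajectory):

```python
current_listings = 70_000   # their inventory today
monthly_growth = 10_000     # optimistic "hockey stick" assumption
target = 1_000_000          # my magic number for being taken seriously

months_needed = (target - current_listings) / monthly_growth
print(months_needed / 12)   # 7.75 -- years at a linear 10k/month
```

That’s why linear growth won’t cut it: only an exponential curve gets them there on a venture timescale.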

Perhaps most disconcerting, they have no visible marketing push and no real buzz in the RE.net blogosphere. Maybe they are just flying under the radar of the public eye until their technological terror is fully operational? Maybe their business model and the community they serve are so different from the ones the titans of Web 2.0 real estate are currently serving that they don’t need to play by the same rules? Maybe developers don’t feel the need to read or comment on blogs? Maybe their business development leaders need to read Seth Godin or Dustin Luther?

All I know is that sooner or later, they’ll need to soar above the clouds with afterburners at full throttle or crash back to earth. They can’t fly under the radar forever with the firepower they are packing… Anyway, I’m going to be watching this company very closely. The technology under development is too compelling and the business plan is too interesting to stay under the radar at cruising speed for much longer. Will 2008 be the year SecondSpace goes supersonic?