Predictions: helpful or counterproductive?

The dialogue between commenters has been interesting to read on Ardell’s most recent post about predictions.

Is this a helpful or counterproductive prediction?

“So, don’t be swayed by media reports of a ‘disastrous housing economy.’ Take the long-term view and be confident that your home will continue to appreciate in value. And know that if you buy a home today, in seven years it will be worth a lot more.”

– Geoff Wood, CEO Windermere Real Estate
(Quote taken from the Spring Quarter 2008 issue of ‘Inhabit,’ The Best of the Pacific Northwest magazine, published by the Seattle Times.)

I don't necessarily disagree with Geoff's sentiment, and I understand his overall point within the larger context of the quote** (I took only the last paragraph from his ad, titled "Gaining Perspective on the Real Estate Market"). Seven years is a long time. But this is one heck of a prediction, no bones about it.

**his use of a “Casino” and “gambling” analogy was terribly ironic, intended or not.

I have no idea if this was solely a local ad, or if they ran it, or a similar one, in other markets where Windermere has offices.

Agents are resources for their clients in so many ways. Consumers turn to them for valuable feedback, for information, data, suggestions AND ADVICE, which includes feedback on where the real estate market is trending. In so many ways, they are advocates whom consumers want to trust but, for a variety of reasons, find it difficult to. Building and gaining trust does not begin with "I don't know where the market is heading, beats me."

You have to give Geoff Wood credit. At least you know where he stands.

The MLS of the future

[photopress:futurama_bender.jpg,thumb,alignright]Recently, the Center for Realtor Technology and Jim Duncan's Real Central VA had blog posts on the desire to have MLSs add another column to their schema indicating the broadband access status of a property. I think this is an idea that has been a long time coming. When I moved from my old home in Carnation to my new home in Issaquah, the new owner of my old house wanted to know everything I could tell him about the home's local ISP (I believe he was a network engineer). Similarly, one of the major reasons I moved into my current home was that it had bandwidth to spare (my ISP's top of the line plan is currently 8 Mbps download / 2 Mbps upload). In the Emerald City or the Bay Area, this information is probably second in importance only to the list price of a home or its location. Simply put, a home's high-speed internet capability is an increasingly important factor in a purchasing decision.

However, as long as the MLS DBA is mucking around with the database schema and typing in ALTER TABLE Residential ADD Internet varchar(50) and other SQL DDL commands, why should we stop there? Here's what I'd like to see when the MLS gets around to enhancing its database schema (there's a rough sketch of what these additions might look like after the list).

Use Links. Why not enhance school, local government, builder & utility information in the MLS to include both names and URLs? When I move to a new home, usually the first thing I need to do is contact all the local utilities and let them know I'm in a new place. Having links to Puget Sound Energy, the Issaquah School District, Specialized Homes, and King County Government in the MLS would save me time. Finding contact information and phone numbers is a much bigger pain than it should be at times.

Cell phone reception information. If you don't have good cable or DSL internet access, it would be nice to know how strong Sprint's or Clearwire's signal is. I suspect real estate agents and other professionals who increasingly depend on wireless internet access would find this information very helpful.

More accuracy and fewer errors. OK, I've complained about this before. Still, is it really too much to ask? If a property doesn't geocode, somebody may not find it when they use a popular map-based real estate search engine.

Embrace RETS. Enough said.

Richer media. OK, so the MLS allows you to upload 10 or 20 small photos (or whatever the number is). Why not allow larger photos, MP3 files, video files or PDF flyers? As broadband takes over the world, this stuff becomes a lot more practical. Although the idea sounds nice in theory, I'm not sure agents are ready to hire professional audio engineers or videographers when many haven't learned the value of high quality photography yet. I also think the MLS IT infrastructure isn't ready for this kind of load (frankly, if you can't handle the bandwidth demands of digital photography, you should probably outsource to Amazon S3 or Flickr Pro before it's too late), and it's going to make life a lot more interesting for us IDX vendors.
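As promised above, here's a rough sketch of what those schema additions and the resulting buyer search might look like. Every table and column name below is invented for illustration (this is not the actual NWMLS schema), and I'm using SQLite only to keep the example self-contained:

```python
import sqlite3

# Illustrative only: the table and column names are hypothetical,
# not the actual NWMLS (or any MLS) schema.
conn = sqlite3.connect("mls_sandbox.db")
cur = conn.cursor()

cur.execute("""CREATE TABLE IF NOT EXISTS Residential (
    ListingNumber TEXT PRIMARY KEY,
    ListPrice     INTEGER,
    City          TEXT)""")

# The wish-list columns: broadband, cell coverage, and URLs for
# schools, utilities, and local government.
for ddl in [
    "ALTER TABLE Residential ADD COLUMN Internet VARCHAR(50)",        # e.g. 'Cable 8M/2M'
    "ALTER TABLE Residential ADD COLUMN CellCoverage VARCHAR(50)",    # e.g. 'Sprint: good'
    "ALTER TABLE Residential ADD COLUMN SchoolDistrictUrl VARCHAR(255)",
    "ALTER TABLE Residential ADD COLUMN UtilityUrl VARCHAR(255)",
]:
    try:
        cur.execute(ddl)
    except sqlite3.OperationalError:
        pass  # column already exists on a re-run

# What a buyer (or an IDX site) could then do: search on bandwidth
# the same way they already search on price.
cur.execute("""SELECT ListingNumber, ListPrice, Internet
               FROM Residential
               WHERE City = ? AND Internet LIKE ?""",
            ("Issaquah", "Cable%"))
print(cur.fetchall())
conn.commit()
```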

So, if you could change the MLS database, what would you like to add or change? What information do you wish was there, but isn't? Is built-green home information, or information on low-flow toilets, something today's home buyer wants to be able to search for? Do you think more information would pose an undue burden on agents or brokers (those MLS listing forms are one step removed from a tax return), or do you want more, more, more? What would you like IDX vendors to do differently, regardless of whether the MLS changes or not?

Seattle Area Appreciation

Brian Brady asked: “Off topic but I wanted to ask you a question, Ardell. Has Seattle been a rising market from Feb, 2005 through today?”

It would have been a lot easier to answer if you hadn't said February 2005 🙂 I could have just said yes. But I remember the day. It was June 15, 2005. I could feel it. I could taste it. I could smell it. The ground was swelling. You could put your ear on the ground and hear it coming! LOL I happened to be in a complex called Sixty-01, which has its own idiosyncrasies that I won't go into since you are out of State, Brian. But here are some stats to prove my blood boiling was on target. Hindsight is easy. Feeling it coming is an art form. I'm using Sixty-01 because I was there that day and also because it has a lot of "same product"/apples-to-apples for straight appreciation comparisons. They are all practically identical 2 bedroom – 1.5 bath townhomes in the stats below.
07/09/03 – $100,000

11/24/03 – $ 95,000

08/12/04 – $128,950

08/24/04 – $129,500

02/18/05 – $128,950

05/03/05 – $123,000

06/20/05 – $131,450

07/07/05 – $127,000

All of those were in contract before June 15. On June 15th one came on market with an asking price of $137,950. I practically begged a poor woman to get an offer in within an hour of it hitting the market, to grab it at full price. I could feel it in my bones! The prices were going to move right now! She could get it at full price today! But she couldn’t get her brain around it. She wanted to make an offer based on the average of the comps at $127,000. I was beside myself. I knew getting that townhome at $137,950 on that first day was going to be the best move she ever made. But I couldn’t convince her. Five days later it bid out and sold at $148,000. And here’s what happened after that.

07/19/05 – $148,000

07/22/05 – $167,950

07/29/05 – $166,000

11/29/05 – $178,950

03/30/06 – $177,000

06/07/06 – $205,450 (list at $199,900)

07/11/06 – $205,000

08/25/06 – $227,500

09/13/06 – $235,000

11/01/06 – $245,000

01/17/07 – $252,500

New on Market $269,900

So Brian, rephrase the question and ask me if it has been going up since 6/15/05, and I can answer yes. February '05 through June '05, not as much. I'll have to do a new townhome comparison in Ballard to confirm Eastside vs. Seattle proper. It's hard to find "like kind" in Seattle, as there are very few "like kind" comparisons except splits and townhomes. Many of the homes were built in the early 1900s through 1930, and all are unique structures with massive modifications since 1905. But I'm pretty sure the stats will be about the same. Kirkland condos are the same story, but "like kind" is harder to find these days, since newer construction means higher ceilings.

Hope that answers your question.

Death by a thousand paper cuts

[photopress:papercut.jpg,thumb,alignright]Every once in a while a realtor or broker from out of state will ask me to develop an IDX web site for them. Unfortunately, supporting a new MLS is very similar to supporting a foreign language. It is a large software engineering task that takes a lot of time, and since I don’t already have the code written and don’t already have access to their MLS’s feed, I inform them that time is money and the conversation usually ends there. Someday, that may not be the case, but I’d rather be small & profitable than large & broke.

The problem is made worse by the fact that many Realtors don't know what format or protocol their MLS uses for data downloads, or even whom to contact at their MLS to get a feed for an IDX vendor. If you ever want to change IDX vendors, hire a software engineer, or are crazy enough to do it yourself, you should know this. Knowing how your MLS distributes your listing data is like knowing how to change the oil in your car or how to defragment your hard drive. You don't have to know, but it's good to know. It may seem like I'm ranting about some MLS techie mumbo jumbo again, but this is preventing the industry from taking advantage of the low-cost IT innovations that could otherwise exist. I don't think folks fully appreciate the challenges that an IDX vendor faces and how those challenges are holding back the industry's growth and health.

For example, the NWMLS (Northwest Multiple Listing Service – serves mainly Seattle, WA and western Washington) uses software from Rapattoni. It provides listing data via a proprietary SOAP interface and all the photos are accessible via an FTP server. Listing data is updated constantly (a new listing usually appears in our feeds about 15-20 minutes after it’s been entered into NWMLS by a member as I understand it).

By contrast, EBRD (East Bay Regional Data – serves mainly Oakland, CA and the East Bay area) uses Paragon by Fidelity MLS Systems and provides its listing data via nightly updated CSV text files, downloadable by FTP. The images for new and updated listings are accessible as ZIP files via FTP. The photos for active listings which haven't been recently added or changed are not available (unless you bug the IT dept).
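To give a flavor of what that EBRD-style plumbing looks like, here's a minimal sketch of the nightly pull in Python. The host name, directory paths, file names, and credentials are all placeholders, not the real EBRD server details:

```python
import csv
import ftplib
import io
import zipfile

# Placeholders -- not the real EBRD host, paths, or credentials.
FTP_HOST = "ftp.example-mls.com"
DATA_ZIP = "listings/residential.zip"   # nightly CSV dump, zipped
PHOTO_ZIP = "photos/photos_today.zip"   # only photos added/changed today

def fetch(ftp, remote_path):
    """Download one remote file into memory and return its bytes."""
    buf = io.BytesIO()
    ftp.retrbinary("RETR " + remote_path, buf.write)
    return buf.getvalue()

with ftplib.FTP(FTP_HOST) as ftp:
    ftp.login(user="idx_vendor", passwd="secret")

    # 1. Listing data: a zip containing delimited text files.
    with zipfile.ZipFile(io.BytesIO(fetch(ftp, DATA_ZIP))) as z:
        with z.open(z.namelist()[0]) as f:
            rows = list(csv.DictReader(io.TextIOWrapper(f, "latin-1")))
    print(f"{len(rows)} listings in tonight's file")

    # 2. Photos: one big zip of JPEGs for today's new/changed listings.
    #    Miss a night and those images are simply gone from the feed.
    with zipfile.ZipFile(io.BytesIO(fetch(ftp, PHOTO_ZIP))) as z:
        z.extractall("photo_cache/")
```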

The only way they could make their systems more different is if the EBRD encoded its listings in EBCDIC! In order to support both, I need to develop two very different programs for downloading the listing data onto my server, importing it into my database, dealing with differences in the listing schema (for example, the EBRD feed doesn't contain a "Number of Photos" field or a "Community Name" field), and dealing with differences in photo distribution (the NWMLS stores all photos uncompressed in one of a thousand subdirectories, while the EBRD just stores the fresh photos in one big zip file). So I can spend my limited time improving my software for new markets (that have no customers) or improving my software for my home market (which has paying customers). Unfortunately, given the current market realities, I can only afford to support my home market at this time, since MLS IDX programs can be very different and there is no place like home (so far as I know anyway).
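Most of that duplicated work boils down to mapping each MLS's schema onto a single internal one. A simplified sketch of the approach (the column names below are representative stand-ins, not the exact NWMLS or EBRD field names):

```python
# Per-MLS field maps: my internal name -> that MLS's column name.
# The source column names below are representative, not the exact feeds.
FIELD_MAPS = {
    "nwmls": {"mls_number": "LN", "price": "LP",
              "community": "CommunityName", "photo_count": "PhotoCount"},
    "ebrd":  {"mls_number": "MLSNum", "price": "ListPrice",
              "community": None,          # EBRD has no community field
              "photo_count": None},       # ...or photo count
}

def normalize(raw_row: dict, mls: str) -> dict:
    """Translate one raw feed row into my internal listing record."""
    out = {}
    for internal, source in FIELD_MAPS[mls].items():
        out[internal] = raw_row.get(source) if source else None
    return out

# Example: the same internal record built from two very different feeds.
print(normalize({"LN": "27123456", "LP": "800000",
                 "CommunityName": "Issaquah Highlands", "PhotoCount": "16"}, "nwmls"))
print(normalize({"MLSNum": "40123456", "ListPrice": "650000"}, "ebrd"))
```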

I keep waiting for RETS to save me from this madness, but until it happens in Seattle or the East Bay, I'm not holding my breath. After all, if two of the larger MLSes in the country, in two of the most tech-savvy areas of the nation, don't support it yet, I interpret it as a vote of no confidence. I suppose RETS could be going great guns in the rest of the country, but if it were, I'd expect the NWMLS & EBRD to be all over it, like the establishment on Redfin.

The Center for REALTOR® Technology Web Log paints a rosy picture regarding RETS deployment in the industry. Unfortunately, according to Clareity Consulting, an IT consulting firm that serves MLSes and other parts of the real estate ecosystem, RETS is the NAR's unfunded mandate. Everybody wants the benefits of RETS, but nobody is willing to pay for it. Furthermore, it appears that back in the days before I got sucked into real estate technology, there was an effort to promote the DxM standard, and that went nowhere (which is a bad omen). What's worse is that they keep moving the goal posts. We don't even have widespread RETS 1.0 support, and they've already deprecated that standard, going full bore on RETS Lite and RETS 2.0. It seems the biggest problem is one of vision and scope. They keep adding more features to cover more scenarios when we don't even have wide deployment of the existing standard (assuming we had standards to begin with at all). It reminds me of the recent software industry debacle known as the "Longhorn reset". The problem is that RETS is just too complicated, in an environment with too many legacy systems in place, too few resources to support it, and excessive aspirations. The idea of RETS is great; it's the implementation and deployment that's disappointing, and at least Microsoft pulled Vista out of its death spiral…

[photopress:pappercutter.jpg,thumb,alignleft]The sad thing is that the computer industry already has great tools for moving data around over the Internet in efficient and well supported (if sometimes proprietary) ways. They allow you to query, slice, and dice your data in a near infinite number of ways. They're called database servers. They are made by multiple software vendors, and there are even some excellent open source ones out there. They let you set permissions on which accounts can see which tables or views (gee, sounds like something an MLS would want). The better ones even enforce this security down to the field level. Even better, most of these so-called database servers can export data into spreadsheets, reporting tools, and even GIS systems. All of them provide a well defined and oftentimes well implemented API that software developers can use and exploit to implement what hasn't been invented yet!
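To illustrate what "permissions on views" would buy an MLS, here's a sketch of how vendor accounts could be limited to IDX-legal fields using nothing but stock RDBMS features. The table, view, and account names are invented for the example, and the connection string is a placeholder:

```python
# Sketch: an MLS exposing only IDX-legal fields to vendors using
# plain views + GRANTs. Names and connection string are invented.
import pyodbc

PROVISIONING_SQL = [
    # A view that hides the non-IDX columns (seller name, lockbox code, etc.)
    """CREATE VIEW idx.ActiveResidential AS
       SELECT ListingNumber, ListPrice, Bedrooms, Baths, City, PhotoCount
       FROM dbo.Residential
       WHERE Status = 'Active'""",
    # The vendor account can read the view and nothing else.
    "GRANT SELECT ON idx.ActiveResidential TO idx_vendor",
]

conn = pyodbc.connect("DSN=MlsSandbox;UID=dba;PWD=secret", autocommit=True)
cur = conn.cursor()
for stmt in PROVISIONING_SQL:
    cur.execute(stmt)
```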

Why don't the NAR & the MLSes save us all the trouble, standardize on a few good database platforms (I'm a fan of MS SQL Server and MySQL, but I'd settle for anything that has ODBC, .NET & Java support at this point), and provide everybody RDBMS accounts? It'd lower the cost for us IDX vendors (less code to write, since everything is just SQL), it'd lower the costs for MLS vendors (since data access, security, programmability, and scalability become the RDBMS vendor's problem), it'd provide more choices for agents and brokers (since getting Excel talking to MS SQL Server is a cakewalk compared to RETS), and it would lower IT costs for the MLS (because the MLS vendors don't need to invent an industry-specific solution to a problem that's been largely solved already, and I'm betting the MLS vendors already use somebody else's RDBMS to implement their solutions anyway). Granted, a SQL server won't enable all the scenarios that RETS wants to enable (if RETS were ever well implemented and widely deployed enough for that to happen). However, I'm of the belief that it's not going to happen until after Trulia or Google Base becomes the de facto nationwide MLS by providing a single schema with a simple REST-like web services interface.
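And from the IDX vendor's side of the fence, the payoff would look something like this: with an ordinary ODBC account, pulling listings becomes a handful of lines instead of a custom download program. The server, database, view, and column names here are placeholders, since no such public MLS server exists (yet):

```python
import pyodbc

# Placeholders throughout -- there is no such public MLS server today.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=mls.example.org;"
    "DATABASE=Listings;UID=idx_vendor;PWD=secret")

cur = conn.cursor()
cur.execute("""SELECT ListingNumber, ListPrice, City
               FROM idx.ActiveResidential
               WHERE City = ? AND ListPrice <= ?""", ("Redmond", 800000))
for row in cur.fetchall():
    print(row.ListingNumber, row.City, row.ListPrice)
```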

So, what does your MLS do to support IDX vendors? Do they provide all the data all the time, or just daily updates? Have they deployed RETS yet? Are they going to? Who is their MLS software vendor, or do they have a home-grown solution? What do you want to do that you can't do today because the data is in a format you can't use easily? Would you be willing to pay more in membership dues for better software or better service from your MLS? Are we at the dawning of the RETS revolution, or is it too little, too late?

PS – Anybody know anybody from an MLS / IDX department or an MLS vendor who blogs? I'd love to know what things are really like on their side of the listing data fence.

2006 Statistical Review and Highlights

Straight out of the horse's mouth. I noticed these stats posted by the NWMLS today. I found a similar post on their public site, nwrealestate.com. You can see the detailed story here.

During 2006, members of NWMLS. . .

  • Reported more than 96,000 closed sales with a combined value of more than $35 billion
  • Experienced a 6.7% drop in number of units sold compared to 2005, but an increase of about 5% in the dollar volume of closed transactions
  • Reported 1,951 sales of single family homes priced at $1 million or more (up from 1,521 during 2005) and 859 sales of condominiums priced at $500,000 or more (up from 623 during 2005).
  • The MLS area covering Bellevue/West of 405 had the highest number of million dollar-plus sales with 219, followed by Central Seattle/Madison Park with 165. For high-end condos ($500,000-plus), west Bellevue had the largest number (183), followed by Belltown/downtown Seattle (130) and Kirkland (117); 145 condos sold for more than $1 million
  • Among the 19 counties in the MLS service area, San Juan claimed the highest median price ($539,500) for single family homes that sold last year; King County followed at $425,000
  • Maintained a high ratio of cross-sales: more than three of every four transactions were listed by one office and sold by a different office
  • Added 139,814 new listings of SFH and condos to inventory, with the highest volume (14,541) added during June
  • Represented more than 30,000 home sellers, on average, each month
  • Reported double-digit price gains for SFH compared to 2005 in all but one county
  • In the four-county Puget Sound region (King, Snohomish, Pierce and Kitsap), only about 6% of single family homes sold for under $200,000
  • Sold more than 15,000 condominiums, about the same as during 2005; approximately 63% of all condos that sold system-wide were in King County.
  • Found wide variation in prices of 3-bedroom homes. For pre-owned homes (built 2004 or earlier), the median sales price ranged from $124,900 in Grant Co. to $508,000 in San Juan Co.
  • In King County, the average price of a single family home that sold in 2006 was about 2.9 times higher than the price in 1990 (up from $178,187 to $518,108).

NWMLS at a Glance (December 2006)

  • Member Brokerages: 2,075
  • Sales Associates: 26,183
  • Counties included in Summary Report: 17

Top 10 List of Real Estate Lists

That’s right, I’m going meta-meta. Or better yet, I’m going mega meta (unlike Greg who went mini meta! 🙂 ).

  1. Hanan's irregular list of new real estate blogs. Beautiful idea, perfectly executed. It is interesting to note that almost none of the blogs from his first installment are still around writing interesting content…
  2. 10 Best Women Bloggers. Because it matters.
  3. 3 Easy Steps to Stop Zillow from Publishing the Zestimate of your Home… Because it doesn’t matter (and it still generates a ton of hits).
  4. Curbed’s Broker Boys and Babes Contest. No one else could have done this right.
  5. Curb Appeal Enthusiasm. Simple. Relevant. Useful. Interesting.
  6. 21 reasons to bank on the Phoenix real estate market… Should serve as a great warning to agents writing about the bubble… Be prepared to take the issue on like Greg or don’t even go there… (and I simply can’t ignore his list of blogs that feed a hungry mind.)
  7. The consistently growing list of neighborhood videos from TurnHere… I’m addicted.
  8. The PMI Group's list of cities with the riskiest housing markets. (This is a personal favorite since they traditionally rank Seattle as one of the least risky places to invest in real estate.)
  9. Another ego item for the list… I check out the Technorati site multiple times a day to find out if anyone is linking to RCG. But Technorati provides so much more, like keyword searches of blog posts and keyword searches of blogs. A similar argument could be made for del.icio.us, since it is so darn useful for finding good content!
  10. Ardell's lists of posts for buyers and for sellers make up an incredible, wild, colorful, useful list of content.

And the worst real estate lists?

  1. I have only one: PubSub. This great concept is in desperate need of some algorithm love. For starters, if they are not going to count blogrolls each day, then they have to be consistent. For example, the Seattle PI Real Estate blog shows up #1 day in and day out because PubSub thinks that all the PI blogs are giving a fresh link to PI Real Estate blog every time they post a new blog entry. In reality, it is simply a function of their blog being on the blogroll of all the PI blogs. In addition, many features (like their URL detail page) have been broken for most of their existence. Lazy-coding issues like this make their tool nearly useless.

20 million reasons to cancel AOL

Update: You can now search the AOL data from your web browser.
As promised earlier, I did some scans through the massive privacy invasion from AOL for some real estate search insight. I'll leave it to other sites to tell you about the disgusting things people search for.
Not many AOL searchers are looking for “seattle real estate” in those words – in fact only 21 of the 20 million queries contained that text and those users largely went to the big Google-optimized sites like Seattle Power Search (the number one result) or the Seattle Times (the number 3 result).
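For anyone who wants to repeat the exercise, the scan itself is only a few lines of script. Here's a sketch assuming the tab-separated layout of the released files (AnonID, Query, QueryTime, ItemRank, ClickURL); the file names are approximate:

```python
import csv
import glob
from collections import Counter

phrase = "seattle real estate"
hits = 0
click_targets = Counter()

# The released collection is a set of tab-separated text files; adjust
# the glob to wherever you unpacked them.
for path in glob.glob("AOL-user-ct-collection/user-ct-test-collection-*.txt"):
    with open(path, encoding="latin-1") as f:
        reader = csv.DictReader(f, delimiter="\t")
        for row in reader:
            if phrase in (row["Query"] or "").lower():
                hits += 1
                if row.get("ClickURL"):
                    click_targets[row["ClickURL"]] += 1

print(hits, "queries contained the phrase")
print(click_targets.most_common(5))
```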

AOL users found Rain City Guide through many long-tailish routes, with relevant keywords like "zilllow" (sic), "small condos," "seattle real estate," "earnest money recipt" (sic), and "five factors that determine if an idea is a good investment opportunity." Guilty-conscience user 917673 came to us while searching for "sellers disclosure for condominium complex." And User 1636230, who came to Rain City Guide after searching for "seattle real estate," also visited Windermere and Seattle Power Search. One searcher also found us when they searched for "dustin dustin."
Here’s where this data goes beyond our own site (and where it gets creepier): what preceded and followed those inquisitive searches? Can we tell something about these people? Well, User 1636230’s interest in real estate was passing. They searched around for five minutes in March:

  • www.happydogtoys.com 2006-03-21 14:12:20 2 http://www.happydogtoys.com
  • www.happydogtoys.com 2006-03-21 14:12:20 2 http://www.happydogtoys.com
  • hometown realty executives 2006-03-21 15:30:32
  • hometown realty executives seattle 2006-03-21 15:30:41
  • hometown r.e. executives seattle 2006-03-21 15:31:05
  • hometown real estate executives seattle 2006-03-21 15:31:24
  • seattle real estate 2006-03-21 15:33:18 1 http://www.seattlepowersearch.com
  • seattle real estate 2006-03-21 15:33:18 8 http://raincityguide.com
  • seattle real estate 2006-03-21 15:33:18 9 http://www.windermere.com
  • locks for love 2006-03-22 10:52:35 1 http://www.locksoflove.org

Then they decided to look into building their own home a week later:

  • lux homes 2006-03-28 15:33:40 1 http://www.luxhomesllc.com
  • woodenville builders 2006-03-28 15:34:56 (4 more identical searches)

User 1636230 then went on to search for approximately 10,000 pet-related items and for much sadder subjects, including cancer drugs and incontinence.

What of guilty-conscience-User 917673? They were clearly concerned about their condo and they didn’t want to tell the buyer. Here are three of their searches (of over forty):

  • condominium disclosure by seller in los angeles
  • arbitration for selling or buying a condo
  • consequences of no disclosure from seller

Sounds like someone got a bum deal on that condo!

In looking at the other Seattle Real Estate searches, it seems that the adage that buyers and sellers go with the first agent they talk to does not apply to searches (no big surprise here). Searchers go all over the internet and leave and come back to the same search repeatedly. If they’re as committed as User 917673, they use lots of slightly different word combinations. What I found interesting was watching users hit a site that they are interested in, then go on to search for that company’s or person’s name to see if they can find some background (so it is good to be on a first name basis with the search engines).

I only found 668 occurrences of “cancel AOL.” I suspect and hope there will be a lot more this week.

Real estate search patterns and AOL users

Yesterday AOL proudly announced the release of 20 million web queries from 650,000 users (screenshot), with each user “anonymized,” but identified by a unique ID. This is appalling – it means that potentially thousands of social security numbers and email addresses are now free for spammers and thieves to harvest, along with a lot of other personally identifying information. Think about what you search for – email addresses, people’s addresses, business secrets and even social security numbers come to mind. AOL quickly realized their mistake and pulled the plug, but not before the dataset had taken on a life of its own.

So, spammers and thieves are having a field day, but now that it’s out, we might as well use it for educational purposes. It’s a big, unwieldy file, but I’ll try to post some real estate search patterns by tomorrow. If you’re hoping to do your own analysis on this dataset, I wager that there will be a nice web interface for you to use within a week (Consumerist thinks so too). I’ll let you know when it pops up.
More on the ramifications of the release at TechCrunch. If you’re going to cancel your AOL account, good luck.

There you go again – the MLS doesn’t scale

[photopress:Reagan.jpg,thumb,alignright]Ever since Zearch, I've been bombarded with work to update or create MLS search web sites for various brokers & agents across the country. Because of this, I've had the opportunity to deal with another MLS in the Bay Area (EBRDI) and one in Central Virginia (CAARMLS). Before I begin another MLS rant (and cause the ghost of the Gipper to quip one of his more famous lines), I want to say the IT staff at both EBRDI & the NWMLS have been helpful whenever I've had issues, and the primary purpose of this post is to shine a light on the IT challenges that an MLS has (and the hoops that application engineers have to jump through to address them).

After working with EBRDI and the NWMLS, I can safely say the industry faces some interesting technical challenges ahead. Both MLSes have major bandwidth issues, and the download times of data from their servers can be so slow it makes me wonder if they're using Atari 830 acoustic modems instead of network cards.

The EBRDI provides data to members via FTP downloads. They provide a zip file of text files for all the listing data (which appears to be updated twice daily), and a separate file of all the images for that day's listings (updated nightly). You can request a DVD-R of all the images to get started, but there is no online mechanism to get older images. This system is frustrating because if you miss a day's worth of image downloads, there's no way to recover other than bothering the EBRDI's IT staff. If the zip file gets corrupted or the download is otherwise terminated, you get to download the multi-megabyte monstrosity again (killing any benefit that zipping the data might have had). Furthermore, zip compression of images offers no major benefit: the 2-3% size savings is offset by the inconvenience of dealing with large files. The nightly data file averages about 5 MB (big but manageable), but the nightly image file averages about 130 MB (a bit big for my liking, considering the bandwidth constraints the EBRDI is operating under).

As much as I complain about the NWMLS, I have to admit they probably have the toughest information distribution challenge. The NWMLS is probably the busiest MLS in the country (and probably one of the largest as well). According to Alexa.com, their servers get more traffic than Redfin or John L. Scott. If that wasn't load enough, the NWMLS is the only MLS that I'm aware of that offers sold listing data [link removed]. If that wasn't load enough, they offer access to live MLS data (via a SOAP based web service) instead of the daily downloads that the EBRDI & CAARMLS offer their members. If that wasn't load enough, I believe they allow up to 16 or 20 photos per active listing (which seems to be more than the typical MLS supports). So, you have a database with over 30,000 active listings & 300,000 sold listings, all being consumed by over 1,000 offices and 15,000 agents (and their vendors or consultants). The NWMLS also uses F5 Networks' BigIP products, so they are obviously attempting to address the challenges of their overloaded information infrastructure. Unfortunately, by all appearances it doesn't seem to be enough to handle the load that brokers & their application engineers are creating.

Interestingly, the other MLS I've had the opportunity to deal with (the CAARMLS in Central Virginia) doesn't appear to have a bandwidth problem. It stores its data in a manner similar to the EBRDI. However, it's a small MLS (only 2,400-ish residential listings), and I suspect the reason it doesn't have a bandwidth problem is that it has fewer members to support and less data to distribute than the larger MLSes do. Either that, or the larger MLSes have seriously underinvested in technology infrastructure.

So what can be done to help out the large MLSes with their bandwidth woes? Here are some wild ideas…

Provide data via DB servers. The problem is that as an application developer, you only really want the differences between your copy of the data and the MLS data. Unfortunately, providing a copy of the entire database every day is not the most efficient way of doing this. I think the NWMLS has the right idea with what is essentially a SOAP front end for their listing database. Unfortunately, writing code to talk SOAP, do a data compare, and download the differences is a much bigger pain than writing a SQL stored proc to do the same thing or using a product like Red Gate's SQL Compare. Furthermore, SOAP is a lot more verbose than the proprietary protocols database servers use to talk to each other. Setting up security might be tricky, but modern DB servers allow you to have view, table, and column permissions, so I suspect that's not a major problem. Perhaps a bigger problem is that every app developer probably uses a different back end, and getting heterogeneous SQL servers talking to each other is probably as big a headache as SOAP is. Maybe using REST instead of SOAP would accomplish the same result?
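Here's the sort of "give me only what changed" pull I'm talking about; against a plain database connection it's a single query plus a bookmark. The table and column names are invented for the sketch, and a real MLS would define its own last-modified timestamp column:

```python
import pyodbc

# Sketch of an incremental pull: ask only for rows modified since the
# last successful sync. Table and column names are invented.
def pull_changes(conn, last_sync):
    cur = conn.cursor()
    cur.execute("""SELECT ListingNumber, Status, ListPrice, ModifiedDate
                   FROM idx.ActiveResidential
                   WHERE ModifiedDate > ?
                   ORDER BY ModifiedDate""", (last_sync,))
    rows = cur.fetchall()
    new_bookmark = rows[-1].ModifiedDate if rows else last_sync
    return rows, new_bookmark

# Hypothetical usage (connection string is a placeholder):
# conn = pyodbc.connect("DSN=MlsSandbox;UID=idx_vendor;PWD=secret")
# rows, bookmark = pull_changes(conn, last_sync="2007-06-15 00:00:00")
```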

Provide images as individually downloadable files (preferably over HTTP). I think HTTP would scale better than FTP for many reasons. HTTP is a less chatty protocol than FTP, so there's a lot less back & forth data exchange between the client & server. Also, there's a lot more tech industry investment in the ongoing Apache & IIS web server war than in improving FTP servers (and I don't see that changing anytime soon).

Another advantage is that most modern web development frameworks have a means of easily making HTTP requests and generating dynamic images at run time. These features mean a web application could create a custom image page that downloads the image file on the fly from the MLS server and caches it on the file system when it's first requested. Then all subsequent image requests would be fast, since they are served locally, and more importantly, the app would only download images for properties that were actually searched for. Since nearly all searches are restricted somehow (show all homes in Redmond under $800K, show all homes with at least 3 bedrooms, etc.) and paged (show only 10, 20, etc. listings at a time), an app developer's/broker's servers wouldn't download images from the MLS that nobody was looking at.
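A minimal sketch of that cache-on-first-request idea; the photo URL pattern here is made up, and any stable per-listing URL scheme the MLS exposed would do:

```python
import os
import urllib.request

CACHE_DIR = "photo_cache"
# Hypothetical URL pattern -- substitute whatever the MLS actually exposes.
MLS_PHOTO_URL = "http://photos.example-mls.com/{listing}/{index}.jpg"

def photo_path(listing_number: str, index: int) -> str:
    """Return a local path for the photo, downloading it on first request."""
    local = os.path.join(CACHE_DIR, f"{listing_number}_{index}.jpg")
    if not os.path.exists(local):
        os.makedirs(CACHE_DIR, exist_ok=True)
        url = MLS_PHOTO_URL.format(listing=listing_number, index=index)
        urllib.request.urlretrieve(url, local)  # only fetched if someone asks
    return local

# A search results page would call photo_path() only for the 10-20
# listings it is actually displaying, so images nobody ever looks at
# are never pulled from the MLS at all.
```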

Data push instead of pull. Instead of all the brokers constantly bombarding the MLS servers, maybe the MLS could upload data to broker servers at predefined intervals and in random order. This would prevent certain brokers from being bandwidth hogs, and perhaps it might encourage brokers to share MLS data with each other (easing the MLS bandwidth crunch), which leads to my next idea.

BitTorrents? To quote a popular BitTorrent FAQ – "BitTorrent is a protocol designed for transferring files. It is peer-to-peer in nature, as users connect to each other directly to send and receive portions of the file. However, there is a central server (called a tracker) which coordinates the action of all such peers. The tracker only manages connections, it does not have any knowledge of the contents of the files being distributed, and therefore a large number of users can be supported with relatively limited tracker bandwidth. The key philosophy of BitTorrent is that users should upload (transmit outbound) at the same time they are downloading (receiving inbound.) In this manner, network bandwidth is utilized as efficiently as possible. BitTorrent is designed to work better as the number of people interested in a certain file increases, in contrast to other file transfer protocols."

Obviously, MLS downloads match this usage pattern. The trick would be getting brokers to agree to it and doing it in a way that's secure enough to keep unauthorized people from getting at the data. At any rate, the current way of distributing data doesn't scale. As the public's and the industry's appetite for web access to MLS data grows, and as MLSes across the country merge and consolidate, this problem is only going to get worse. If you ran a large MLS, what would you try (other than writing big checks for more hardware)?

Lame MLS Data Again!

[photopress:mea_culpa.jpg,thumb,alignright]Looks like I need to get down on bended knee and beg Robbie’s forgiveness, pounding my chest and saying “Mea Culpa!” Robbie, I try and try to give you accurate data, honestly I do. But it just is not always within my power to do so. I am totally stumped on this one.

If anyone out there can help me get the accurate data for Robbie, PLEASE point me in the right direction.

I listed a property in Mount Baker. I checked the tax record and it said "Year built 1900", so I entered that "data". This is a "man in the bushes" listing, so I already know who is likely going to buy it. But still, for Robbie's sake, I would like the data to be accurate. Out of curiosity, I wondered if there were any houses older than this "Grand Olde Dame" of Mount Baker.[photopress:mt_baker.jpg,thumb,alignleft] I did a General Query in the tax database for homes built between 1800 and 1900, and guess what?! Anything OLDER than 1900 shows AS 1900 in the tax records!

So here I am, realizing that I put "Year Built 1900" in the MLS, and maybe it is really older than that. Of course my first thought is about Robbie and his "Cries Against Lame Data". Tell me please, what's a girl to do when the tax records won't take me back further? So I contacted the Title Company, and they said they can only do what I did, which gives them 1900 also. They then went a step further, and now we do know that "the original plat declaration" for that section of Seattle was in 1888. So maybe that tells us that the house was built in that 12-year window, between 1888 and 1900.

Could "the original plat declaration of 1888" have been filed AFTER the house was built there? Enquiring minds want to know! Sorry Robbie, I aim to please and yet again disappoint. The MLS has no way to enter a "date range" for year built, or "older than 1900". So 1900 it stays, though I did try to account for that in the remarks section.

Am I forgiven, or do I end up on “Robbie’s Lame List” with a “lazy agent” dunce cap on my head? Oh well, “Wednesday’s Child is full of Woe”, as the Nursery Rhyme goes. Some days I wish I were born on a Sunday.