Am I watching the end of cable TV as we know it?

No surprise that I don’t watch much TV (and take time away from reading blogs?), which makes it all the more surprising that I’ve become addicted to Joost.

Joost is a free TV service/technology from the wickedly smart guys who brought you both distributed music sharing (Kazaa) and distributed telephone service (Skype). It is “cable-style” in the sense that there are a bunch of channels dedicated to specific themes from groups like MTV, National Geographic and Comedy Central.

Just based on their reputations, if I were a Comcast executive, I’d be thinking about how we could buy these guys out before I have to pay a Skype-like price for a service that Niklas and Janus have made (essentially) free… again. Any resemblance to dis-intermediation efforts in real estate is entirely coincidental! 🙂

My guess (although I’ve not read this anywhere) is that the backbone of this technology is similar to their earlier products, in that users share the data with each other as opposed to having any one set of servers stream it. The beauty of this set-up is that after seeding the correct data, it scales extremely well, and service actually improves with more users: the more users there are, the more likely there is someone local to you who can stream you the necessary bits.

[photopress:greenday.jpg,full,centered]

Niklas and Janus fascinate me in that they continue to pump out products that can only exist in the world of internet technology, which lets them avoid all the pesky hardware issues that dog their competition. Consequently, they build stuff that consumers can (and do) love.

Nonetheless, I agree with the folks from Lifehacker that the quality is not up to cable standards (yet), but I happily overlook this issue thanks to the extreme usability (e.g. fast-forward, multiple programs on any given channel, etc.) as well as the social-networking aspects (e.g. movie chat!). If you’re interested in peeking into the future of TV, I still have two invites to give out! 🙂

Amazing Access

My husband was helping our son remodel in Portland this weekend. I had a lot of work to do, keeping up with new classwork and working on business. I was going to stay home, but the weather was lovely, so I decided to ride along with him and see my kids, too.

Having been in college when learning programming meant using keypunch cards (a keypunch is a device for entering data into punch cards by precisely punching holes at designated locations :)), I have to constantly remind myself how mobile my work now is. There was nothing I had to do at the office that I couldn’t do sitting in the car with my computer on my lap, including this post!

I remember the first time I left the office with my cell phone and talked to a customer in Nordstrom.  I pretended I was in the office and that I had time to talk. I remember the feeling that I could really do almost anything as long as I had my cell phone with me and no one would even know I wasn’t at work.

The craziest was when I got an offer on a listing about 10 minutes before pulling out of Miami on a cruise. The deal came together, the seller never knew I was out of town, and I handled the whole thing from the ship’s computers. I don’t turn my business over to an associate unless I have to, and I don’t tell my clients where I am unless it’s necessary. So for a workaholic like me, it’s amazing how much free time I have now with mobile phones and, now, the mobile office.

[photopress:Copy_of_P1010174.JPG,thumb,alignleft]So, Randy is waiting for me to get back on the road, where I can get back to listening to my lecture series online and typing a blog post on my laptop, plugged into the battery and connected with my really slick Sprint card. I’ve already answered several emails, searched for a handyman in Vaughn for a client, done a CMA for another client and emailed all the results, while driving up I-5!

It may be that I never really get away from the office, but at least by bringing the office along with me, I can now even enjoy the hours I put in on the job.

BTW, my son is moving from his 2,100 sq ft 1910 bungalow in NE Portland, worth $600,000, to a 3,600 sq ft new home in Vancouver on a half acre, with more bells and whistles than I knew existed, for $650,000. And it’s only 20 minutes away. Location, location, location.

History of Realtor.com?

Since I know there is more than a little bit of interest among RCG readers with regard to Realtor.com, I thought I’d point people to this video interview I just posted with Phil Dawley, Move’s Chief Technical Officer and one of the company’s first employees. (direct link to video)

BTW, I’m up for more of these interviews, so please feel free to suggest people/topics/questions…

Welcome to Seattle, we'll get to that in 8 years

New Orleans is halfway done with a wireless network in less than a year (and while cleaning up after a hurricane no less!), but Seattle is thinking long haul. We’re discussing a city-wide high speed broadband network by 2015. Doesn’t it seem like such a techie city would have started on this a few years ago?

We’ve only been discussing the plan for the viaduct, our 1-in-20-odds-of-collapsing-in-the-next-ten-years waterfront highway, for 5 years now. I expect a draft viaduct replacement plan to be ready for high speed download in 2015.

There you go again – the MLS doesn’t scale

[photopress:Reagan.jpg,thumb,alignright]Ever since Zearch, I’ve been bombarded with work to update or create MLS search web sites for various brokers & agents across the country. Because of this, I’ve had the opportunity to deal with another MLS in the Bay Area (EBRDI) and one in Central Virginia (CAARMLS). Before I begin another MLS rant (and cause the ghost of the Gipper to quip one of his more famous lines), I want to say the IT staff at both EBRDI & the NWMLS have been helpful whenever I’ve had issues, and the primary purpose of this post is to shine a light on the IT challenges that an MLS has (and the hoops that application engineers have to jump through to address them).

After working with EBRDI and the NWMLS, I can safely say the industry faces some interesting technical challenges ahead. Both MLSes have major bandwidth issues, and the download times of data from their servers can be so slow it makes me wonder if they’re using Atari 830 acoustic modems instead of network cards.

The EBRDI provides data to members via FTP downloads. They provide a zip file of text files for all the listing data (which appears to be updated twice daily), and a separate file of all the images for that day’s listings (updated nightly). You can request a DVD-R of all the images to get started, but there is no online mechanism to get older images. This system is frustrating because if you miss a day’s worth of image downloads, there’s no way to recover other than bothering the EBRDI’s IT staff. If the zip file gets corrupted or otherwise terminated during download, you get to download the multi-megabyte monstrosity again (killing any benefit that zipping the data might have had). Furthermore, zip compression of images offers no major benefit: the 2-3% size savings is offset by the inconvenience of dealing with large files. The nightly data file averages about 5 MB (big but manageable), but the nightly image file averages about 130 MB (a bit big for my liking, considering the bandwidth constraints the EBRDI is operating under).
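
For fellow geeks, here’s roughly what consuming a feed like this looks like; a minimal C# sketch, assuming a made-up host, file name and credentials (the real EBRDI details differ). Notice there’s no resume logic, which is exactly why a dropped connection on that 130 MB image file hurts:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Net;

class DailyFeedDownload
{
    static void Main()
    {
        // Hypothetical host, path and credentials.
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.example-mls.com/daily/listings.zip");
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("member", "secret");

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var ftpStream = response.GetResponseStream())
        using (var file = File.Create("listings.zip"))
        {
            // A dropped connection here means re-downloading the whole
            // archive -- there's no partial recovery.
            ftpStream.CopyTo(file);
        }

        // If the archive arrived corrupted, this throws and you get to
        // start the download over again.
        ZipFile.ExtractToDirectory("listings.zip", "listings");
    }
}
```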

As much as I complain about the NWMLS, I have to admit they probably have the toughest information distribution challenge. The NWMLS is probably the busiest MLS in the country (and probably one of the largest as well). According to Alexa.com, their servers get more traffic than Redfin or John L. Scott. If that wasn’t load enough, the NWMLS is the only MLS that I’m aware of that offers sold listing data [link removed]. If that wasn’t load enough, they offer access to live MLS data (via a SOAP-based web service) instead of the daily downloads that the EBRDI & CAARMLS offer their members. If that wasn’t load enough, I believe they allow up to 16 or 20 photos per active listing (which seems to be more than the typical MLS supports). So, you have a database with over 30,000 active listings & 300,000 sold listings, all being consumed by over 1,000 offices and 15,000 agents (and their vendors or consultants). The NWMLS also uses F5 Networks’ BigIP products, so they are obviously attempting to address the challenges of their overloaded information infrastructure. Unfortunately, by all appearances it doesn’t seem to be enough to handle the load that brokers & their application engineers are creating.

Interestingly, the other MLS I’ve had the opportunity to deal with (the CAARMLS in Central Virginia) doesn’t appear to have a bandwidth problem. It stores its data in a manner similar to the EBRDI. However, it’s a small MLS (only 2,400-ish residential listings), and I suspect the reason it doesn’t have a bandwidth problem is that it has fewer members to support and less data to distribute than the larger MLSes do. Either that, or the larger MLSes have seriously under-invested in technology infrastructure.

So what can be done to help out the large MLSes with their bandwidth woes? Here are some wild ideas…

Provide data via DB servers. The problem is that as an application developer, you only really want the differences between your copy of the data and the MLS data. Unfortunately, providing a copy of the entire database every day is not the most efficient way of doing this. I think the NWMLS has the right idea with what is essentially a SOAP front end for their listing database. Unfortunately, writing code to talk SOAP, do a data compare and download is a much bigger pain than writing a SQL stored proc to do the same thing, or using a product like Red Gate’s SQL Compare. Furthermore, SOAP is a lot more verbose than the proprietary protocols database servers use to talk to each other. Setting up security might be tricky, but modern DB servers allow view, table, and column permissions, so I suspect that’s not a major problem. Perhaps a bigger problem is that every app developer probably uses a different back end, and getting heterogeneous SQL servers talking to each other is probably as big a headache as SOAP is. Maybe using REST instead of SOAP would accomplish the same result?
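
To make the “I only want the differences” point concrete, here’s the kind of incremental pull I’d rather be writing; a sketch in C#/ADO.NET against a hypothetical Listings table with an update_date column (real MLS schemas and column names vary):

```csharp
using System;
using System.Data.SqlClient;

class IncrementalPull
{
    static void PullChangesSince(DateTime lastSync, string connectionString)
    {
        // Only rows modified since the last sync ever cross the wire.
        const string sql =
            @"SELECT listing_id, status, list_price, update_date
              FROM Listings
              WHERE update_date > @lastSync
              ORDER BY update_date";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@lastSync", lastSync);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Upsert each changed row into the local copy here.
                    Console.WriteLine("{0} changed at {1}",
                        reader["listing_id"], reader["update_date"]);
                }
            }
        }
    }
}
```

Compare that to hand-rolling a SOAP envelope, parsing the response, and then diffing the result yourself.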

Provide images as individually downloadable files (preferably over HTTP). I think HTTP would scale better than FTP for many reasons. HTTP is a less chatty protocol than FTP, so there’s a lot less back & forth data exchange between the client & server. Also, there’s a lot more tech industry investment in the ongoing Apache & IIS web server war than in improving FTP servers (and I don’t see that changing anytime soon).

Another advantage is that most modern web development frameworks have a means of easily making HTTP requests and generating dynamic images at run time. These features mean a web application could create a custom image page that downloads the image file on the fly from the MLS server and caches it on the file system when it’s first requested. All subsequent requests for that image would then be fast, since it’s accessed locally; more importantly, the app would only download images for properties that were actually searched for. Since nearly all searches are restricted somehow (show all homes in Redmond under $800K, show all homes with at least 3 bedrooms, etc.) and paged (show only 10, 20, etc. listings at a time), an app developer’s/broker’s servers wouldn’t download images from the MLS that nobody was looking at.
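
Here’s a rough sketch of that lazy image cache in C#. The URL pattern and cache directory are invented, but the flow (check disk, fetch once over HTTP, serve locally ever after) is the whole idea:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class ListingPhotoCache
{
    static readonly HttpClient Http = new HttpClient();

    static async Task<byte[]> GetPhotoAsync(string listingId, int photoNo)
    {
        string cachePath = Path.Combine("photo-cache",
            listingId + "_" + photoNo + ".jpg");

        // Cache hit: serve from local disk, no MLS bandwidth consumed.
        if (File.Exists(cachePath))
            return File.ReadAllBytes(cachePath);

        // Cache miss: pay the download cost exactly once per photo.
        // Hypothetical URL pattern -- each MLS would have its own.
        string url = "http://images.example-mls.com/" + listingId +
                     "/" + photoNo + ".jpg";
        byte[] bytes = await Http.GetByteArrayAsync(url);

        Directory.CreateDirectory("photo-cache");
        File.WriteAllBytes(cachePath, bytes);
        return bytes;
    }
}
```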

Data push instead of pull. Instead of all the brokers constantly bombarding the MLS servers, maybe the MLS could upload data to broker servers at predefined intervals and in random order. This would prevent certain brokers from being bandwidth hogs, and perhaps it might encourage brokers to share MLS data with each other (easing the MLS bandwidth crunch), which leads to my next idea.

BitTorrents? To quote a popular BitTorrent FAQ – “BitTorrent is a protocol designed for transferring files. It is peer-to-peer in nature, as users connect to each other directly to send and receive portions of the file. However, there is a central server (called a tracker) which coordinates the action of all such peers. The tracker only manages connections, it does not have any knowledge of the contents of the files being distributed, and therefore a large number of users can be supported with relatively limited tracker bandwidth. The key philosophy of BitTorrent is that users should upload (transmit outbound) at the same time they are downloading (receiving inbound.) In this manner, network bandwidth is utilized as efficiently as possible. BitTorrent is designed to work better as the number of people interested in a certain file increases, in contrast to other file transfer protocols.”

Obviously, MLS download usage patterns match this pattern of downloading. The trick would be getting brokers to agree to it, and doing it in a way that’s secure enough to keep unauthorized people from getting at the data. At any rate, the current way of distributing data doesn’t scale. As the public’s and the industry’s appetite for web access to MLS data grows, and as MLSes across the country merge and consolidate, this problem is only going to get worse. If you ran a large MLS, what would you try (other than writing big checks for more hardware)?

SELECT * FROM MLS WHERE Remarks = ‘Whoa’

I thought I’d take a moment to reflect on how Rain City’s favorite MLS Search is implemented. I’m a little tired of thinking in computer languages (mostly T-SQL, C# and Javascript), so I figured I’d blog a bit in Geek English for a little while before I hit the compiler again.

[photopress:matrix1_alt.jpg,full,alignright]

I’m always interested in how web sites & computer software work under the covers, so I thought I’d share some of the more interesting points about how I’ve implemented “Zearch” to date for the “geekier” folks in the blogosphere.

It all began way back in the fall of 2005 shortly after I got my first MLS feed. At the time, Microsoft’s asp.net 2.0 platform was still in beta. However, after learning what Microsoft’s next generation web development tools were going to do (and seeing what Google Maps and Microsoft’s Virtual Earth teams were doing), I saw a great unrealized potential in MLS search tools and decided to do something about it.

Anyway, it’s all built on top of asp.net 2.0 and MS SQL Server 2000 (yeah, I know I’m old school). One of the first things I did was combine all the property types into a VIEW and create a dynamic SQL query when you search for properties. Some search tools only let you search for residential properties or condominiums at one time (which I thought was lame). I originally tried to implement this with a bunch of UNIONs, but keeping track of the schema variations for the different property types eventually drove me nuts, so I encapsulated all that crud in a VIEW.
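
To illustrate (with hypothetical table and column names, not the actual MLS schema), the VIEW essentially papers over the per-property-type naming differences so the search code only ever deals with one shape of data:

```csharp
using System.Data.SqlClient;

class SearchSchema
{
    // Note the condo table calling list_date "ld" -- exactly the sort
    // of variation the VIEW hides from the rest of the application.
    const string CreateAllListingsView = @"
        CREATE VIEW AllListings AS
        SELECT mls_no, list_price, list_date, 'RES' AS prop_type
        FROM Residential
        UNION ALL
        SELECT mls_no, list_price, ld AS list_date, 'COND' AS prop_type
        FROM Condominium";

    static void CreateView(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(CreateAllListingsView, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The dynamic search query then targets AllListings alone, instead of UNIONing a per-property-type query on every request.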

I also find it a little ironic that I’m not the only one who found the MLS schema differences a PITA to deal with. I’m glad the various MLS software vendors and the CRT are working toward a common industry schema (aka RETS), so we application developers can focus on the real problem (developing compelling & useful software), instead of remembering that the ld column in one table is really the list_date column in another.

Another interesting thing I do on the back end is geocode every listing after each data download. The main reason is that I don’t trust the MLS data, and their bogus geocoding would make my app look bad. I also knew when I started that I’d eventually do maps, so as soon as a new listing hits my database, it gets more accurately/correctly geocoded. In case you’re wondering if I’m screen scraping with Perl or something else, it’s all done with T-SQL stored procedures. (Well, technically it’s a proc that calls the MSXML2.ServerXMLHTTP COM object to issue an HTTP request against a geocoding web service, and then uses OPENXML on the response’s XML to get the latitude & longitude.)
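
For readers who’d rather not squint at COM calls from T-SQL, here’s the same request-and-parse flow sketched in C# instead; the geocoder URL and the response element names are placeholders, not a real service:

```csharp
using System;
using System.Globalization;
using System.Net.Http;
using System.Threading.Tasks;
using System.Xml;

class ListingGeocoder
{
    static readonly HttpClient Http = new HttpClient();

    static async Task<(double lat, double lng)> GeocodeAsync(string address)
    {
        // Placeholder geocoding service URL.
        string url = "http://geocoder.example.com/geocode?q=" +
                     Uri.EscapeDataString(address);
        string xml = await Http.GetStringAsync(url);

        // The post uses OPENXML server-side; XmlDocument does the
        // equivalent parsing here. Element names are assumptions.
        var doc = new XmlDocument();
        doc.LoadXml(xml);
        double lat = double.Parse(doc.SelectSingleNode("//latitude").InnerText,
                                  CultureInfo.InvariantCulture);
        double lng = double.Parse(doc.SelectSingleNode("//longitude").InnerText,
                                  CultureInfo.InvariantCulture);
        return (lat, lng);
    }
}
```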

As you might have guessed, there are also stored procedures and functions to get the distance between two points, do a radius search, and other stuff of that ilk. Fortunately, all that stuff can easily be found using your favorite search engine, so you don’t need to know how all the math in the law of cosines works (you just need to know of it).
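
For the curious, the law-of-cosines distance boils down to a few lines; here’s a C# version of the kind of function those stored procs implement (Earth radius in statute miles):

```csharp
using System;

static class GeoMath
{
    const double EarthRadiusMiles = 3958.8;

    // Spherical law of cosines:
    // d = R * acos(sin(a1)sin(a2) + cos(a1)cos(a2)cos(b2 - b1))
    public static double DistanceMiles(double lat1, double lng1,
                                       double lat2, double lng2)
    {
        double a1 = lat1 * Math.PI / 180.0;  // degrees -> radians
        double a2 = lat2 * Math.PI / 180.0;
        double db = (lng2 - lng1) * Math.PI / 180.0;

        double cosAngle = Math.Sin(a1) * Math.Sin(a2) +
                          Math.Cos(a1) * Math.Cos(a2) * Math.Cos(db);
        // Clamp to guard against floating-point drift past +/-1.
        cosAngle = Math.Max(-1.0, Math.Min(1.0, cosAngle));
        return EarthRadiusMiles * Math.Acos(cosAngle);
    }
}
```

A radius search is then just filtering on DistanceMiles(...) <= radius, ideally after a cheap bounding-box check so you’re not doing trig on every row.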

Well, that’s it for the back end. Next time I’ll put on my Web Developer hat and talk about the front end.



Psst – Want a free copy of Windows?

Granted, this post is not real estate related. However, since Rain City Guide is at the intersection of Real Estate Ave & Technology Blvd, I figured the goings-on at Microsoft Way might interest some Rain City readers.

For good or ill, Microsoft is still the 800 lb gorilla of the technology industry. Because of this, the success of the company has a sizable impact on the price & availability of housing in the region. Therefore, it’s fair to say I’m hoping that Windows Vista won’t suck, and that its success will lead to another growth spurt at the company, which will increase the value of everybody’s home in the greater Redmond/Seattle area. (That way both the geeks & realtors will be happy.)

Anyway, the purpose of this post is to inform the geeky among us that Microsoft is having a Windows Vista Install Fair this weekend. Here’s what you need to do in order to get a free copy of Windows Vista.

  • Have a computer that you’re willing to sacrifice and that meets the following criteria:
    • Your important data has been backed up! (This is pre-release software after all)
    • Operating System is Windows XP SP2 (Home or Professional) English x86.
    • Intel/AMD Processor running at 1 GHz or higher.
    • Minimum of 512MB of RAM.
    • Video adapter is AGP4x/8x or PCIe with a minimum of 64MB of RAM.
    • Hard Disk has a minimum of 10GB of free space to allow for upgrade.
    • Has a DVD-ROM Drive.
  • Send mail to Windows Vista Install Fair Registration (wvifr@microsoft.com) and inform them that you wish to participate in either the Saturday (March 18th 2006) 1:00 PM to 4:00 PM session or the Sunday (March 19th 2006) 1:00 PM to 4:00 PM session.
  • Bring yourself and your computer over to Building 27 on the Microsoft Campus at the scheduled time.
  • Enjoy your free copy of Windows Vista Ultimate Edition (February 2006 CTP Version) 

Well, I’m looking forward to playing around with all the cool new features and using an OS built for the 21st century. Here’s to hoping my laptop enjoys the experience!

Using the Internet to Buy Your New Home

I have recently been enlightened as to how grossly inadequate many of the home viewing sites are and how misleading they can be.

Maybe I should have known this before, but frankly, those sites rarely come into play in my everyday life. I use the MLS, and clients use me. I truly hadn’t considered until recently how people use the internet in the home buying process, and why.

Now that I am viewing the world through your eyes a bit, with the help of my most recent clients, I would like to “give back