- Google is doing a major update on their backlink calculator. One of the updated datacenters is showing over 1,600 backlinks to RCG, while regular search is still only showing 733. This is great news! The more often Google re-indexes backlinks, the better, because we get so many more (recognized) backlinks than the typical agent website, and I’ve noticed that each time Google updates these backlinks (they only do it every 3 to 6 months), we place much better in organic search results shortly thereafter. Yum!
- Talking about organic search results, I let Greg know that I thought he was potentially hurting himself in Google by posting identical articles on both his regular blog and his ActiveRain blog (no longer available). Put very simply (and definitely an oversimplification), when Google sees two identical articles, they are forced to make a choice in determining which article is “good” and which one is “spam”. Assuming you don’t want either of your sites to be labeled “spam”, don’t have identical content floating around in full. (When a spam site copies your articles in full, you just have to trust that Google will figure it all out!) If you’re going to put articles on more than one site, make sure that you change things up a bit, or better yet, summarize the article and link to your main site where the full article can be found. I would point out some of the other people besides Greg who are doing this same thing on ActiveRain, but it appears that word travels fast via email and most of the guilty have taken their ActiveRain blogs down (Joel being the only exception I’ve found at this point… and he really should not be doubling up his content at this point considering he’s still in the process of “teaching” Google about his new domain).
- However, all this makes me feel bad… Matt, I promise my intentions were good and I wasn’t looking to get people to drop their ActiveRain blogs. I think you’ve got a great platform and others should definitely consider blogging on your site. I just wanted to warn people that they might be committing googlecide (a great phrase coined by Greg!) if they post identical content in both places! For everyone’s benefit, Matt Cutts gives a comprehensive explanation of how to get re-included in Google searches should your site ever be listed as spam, but I don’t think that should be necessary, as the re-inclusion request is typically for sites that have actively tried to trick Google in ways much more devious than duplicate content.
- Steve Hurley let me know about his new blog for the Tacoma area (South Sound) and he asked for some advice on how to get more readers. My advice: start linking to other real estate blogs! There are a lot (a ton!) of real estate blogs with good content that will never get “discovered” because they live in their own bubble (yes, real estate has lots of bubbles!). I think a lot of real estate agents have a view that they are smart enough to be the one and only resource of real estate information. Even if that held water, very few agents are good enough to break out of the mold without some major help from other real estate bloggers. So, regardless of how good your stuff is, find someone else to link to in every post! Really, every post!
- Another way to drive traffic is to leave comments on other people’s blogs. The nice part about leaving a comment is that you’ll get a link back to your blog with each and every comment. However, that won’t generate traffic nearly as effectively as if other bloggers are linking to you within their posts. What is the most effective way to get the attention of other bloggers so that they will link to you? Link to them! Want more? Here are the three most important elements of real estate… blogging: Linkation, Linkation, Linkation.
- Greg: Ardell’s going to kill me for that title. I promise I wrote it before I became a believer in the church of Ardell! 🙂 I really wish I could give you an “on a related note” to this story, but I simply can’t blog about a meeting I had last week with the master of real estate marketing…
- I agree with Chris Pirillo that social bookmarking buttons have gotten out of hand. I’ve not added any to RCG because it seemed like it took up valuable real estate and I’m not sure it provided a valuable service to our readers. The only one I’ve considered adding is del.icio.us, but considering most del.icio.us users have a button installed on their browser (they tend to be a tech-savvy bunch), I’ve never bothered. Adding a button for a site like digg (let alone sites like reddit) seems pointless for a real estate blog since I’ve never seen one real estate article promoted by those communities. (In other words, why would I give them an ad (i.e. their logo) on every one of my posts if they are never going to send me traffic?)
- I want one… Sony is preparing to introduce a lightweight geocoder with software to make geocoding photos easy. Although I wish geocoding photos were easier than dragging along another device…
- Taken one step further (and two steps too far): Wouldn’t it be great if you could search for an item based on where you were when you were working on the file? As in, “I remember taking those notes while in San Francisco…” and then have a document filter based on where you were when you made those edits (obviously, this only makes sense if you’re working on a laptop or mobile device). The secret weapon in this idea would be taking advantage of the wifi positioning from Loki so that you don’t have to lug around another device…
- Everyone knows that Loki was the god of mischief, right? (Due to a simple twist of fate, I know a lot more about Nordic gods than I do Bible stories, but I can’t go there because I’ll get too sidetracked…) Well, the mischievous people over at Trulia have blocked Move’s IP address, so I couldn’t read what Greg liked so much about their post until I got home. (I know I could have proxied in, but I didn’t bother.) Anyway, the article is hilarious and definitely shows the benefit of not taking yourself too seriously. Tell your kids: real estate is fun!!!
In my last post, I was asked what the accuracy of the locations in our generated Google Earth files is. Before I divulge that information, I’d like to explain some of the challenges of getting accurately geocoded data. (I’ll get on my soapbox and complain about the state of NWMLS data in my next post.)
Now, in partial defense of realtors and the MLS, it is unrealistic to expect perfect data. For example, consumer-level GPS receivers aren’t always as accurate as one might think. This weekend I loaded up Microsoft Streets & Trips 2006 on my desktop computer, hooked up my GPS receiver, turned on GPS tracking, created a GPS trail, and walked away for an hour. An hour later, my map had a line drawing that resembled the type my 3-year-old son likes to create. So even if a realtor were to use a GPS receiver to get a latitude & longitude reading, it’s entirely possible that the measurement would be off by a house or two (or four).
Another problem is that most digital maps are created with data sold by companies like TeleAtlas or NAVTEQ. These companies compile their data by driving around previously unmapped streets & neighborhoods with computers & GPS receivers (kinda like how that annoying guy in the Verizon ads tests their network). I should note that in-vehicle navigation systems are more accurate than GPS receivers alone, because the vehicle’s navigation system can also use the steering wheel position and the speedometer to determine your location.
Unfortunately, by the time the Microsofts, Yahoos, and Googles of the world get their hands on the data, it is at least 3-6 months out of date (and probably closer to 12-18 months out of date by the time it gets on the web or published on a CD). This is a problem because about 25% of the properties in the NWMLS are new construction (where new construction is defined as a property built in 2005 or later). Since new construction is often located near new roads, the giants of digital mapping may be unable to help and are always playing catch-up.
Then, when the companies convert the raw data into digital maps, they end up using multiple sources of data and interpolating them into the one data set they are going to use for a map. However, the data sources don’t always agree on where a point of interest is.
For example, Google Earth thinks the top of the Seattle Space Needle is at 47.620367° north latitude & 122.349005° west longitude. Meanwhile, Microsoft’s Virtual Earth seems to think it’s located at 47.620336° north latitude & 122.348515° west longitude. Now, a few ten-thousandths of a degree means the difference between the tip of the needle & one of the air conditioning units on the roof (a few yards). But if they can’t agree on where the top of the Space Needle is, it’s likely they aren’t going to agree on where 742 Evergreen Terrace is either. However, a few yards of error is better than a few miles of error (which is what can happen when I use raw NWMLS data).
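For the curious, you can measure how far apart those two readings really are with the standard haversine (great-circle distance) formula. This is just a sketch using the coordinates quoted above (remember that west longitude is negative):

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# The two vendors' readings for the top of the Space Needle
google_earth = (47.620367, -122.349005)
virtual_earth = (47.620336, -122.348515)
print(round(distance_m(*google_earth, *virtual_earth)), "meters apart")
```

The two fixes come out a few dozen meters apart, almost all of it in the longitude term: the latitude difference is tiny, while the half-millidegree longitude difference matters more at this latitude.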
Because of this, I have to geocode every single property in the database because I don’t trust the NWMLS data. So I call the Yahoo! Maps Web Services – Geocoding API to get a latitude & longitude for everything. Although Yahoo is far from perfect, at least it’s free and tries harder than the MLS. So without further delay, here is the current geocoding precision of the points on our generated maps.
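The logic boils down to a fallback: trust the geocoding service when it returns a reasonably precise match, and only fall back to the raw MLS coordinates when it can’t do better. Here is a minimal sketch of that idea; `api_geocode` stands in for whatever geocoding service you call, and the precision labels and field names are illustrative, not any particular API’s actual vocabulary:

```python
# Rank the hypothetical precision labels from coarse to fine.
PRECISION_RANK = {"city": 0, "zip": 1, "street": 2, "address": 3}

def best_location(listing, api_geocode):
    """Return (lat, lon, precision) for a listing, preferring the API result."""
    result = api_geocode(listing["address"])
    if result and PRECISION_RANK.get(result["precision"], -1) >= PRECISION_RANK["street"]:
        return result["lat"], result["lon"], result["precision"]
    # Raw MLS coordinates can be miles off, so tag them accordingly.
    return listing["mls_lat"], listing["mls_lon"], "mls-raw"

# Example with a stubbed-out geocoder:
listing = {"address": "400 Broad St, Seattle, WA",
           "mls_lat": 47.62, "mls_lon": -122.35}
stub = lambda addr: {"lat": 47.6205, "lon": -122.3493, "precision": "address"}
print(best_location(listing, stub))
```

Tracking the precision label alongside each point is what makes a breakdown table like the one below possible in the first place.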
| Geocoding Precision | No. of properties | Percentage |
|---------------------|-------------------|------------|
In closing, I’d like to ask real estate professionals to be as complete and as accurate as possible when submitting listing data to their local MLS. I’d also like to note that even if the MLS data were accurate, it’s unrealistic to expect perfect geocoding from imperfect data. If digital mapping companies and GPS technology can’t get it exactly right, a house or two off is probably as accurate as you can realistically hope for given the current state of the art.