Having just read a couple of very insightful and thoughtful pieces on Google Base by Bill Burnham, I’ve completely changed my position on structured blogging, and I’m now convinced we’re set to see a two-tier publishing model arise very quickly, one that separates the “data” from the presentation mechanism.
In brief, Burnham argued in his original piece that Google Base represents the big G’s desire to have people feed content directly into its database via RSS. What Google Base really amounts to, then, is the world’s biggest XML database, one that will enable the company to launch a whole series of vertical search services in the not-too-distant future.
His follow-up piece posits that this represents a massive threat to “walled garden” recruitment, real estate, auction and dating sites. I think Burnham is very much on the money here.
However, I think this is much more than a Google issue. A lot has been made of the fact that Google doesn’t allow other robots to spider its site, making it in effect a walled garden itself, as one of the comments on Burnham’s blog points out. If you follow Burnham’s theory, the real achievement and goal behind Google Base is the act of aggregating millions of users’ RSS feeds. The fact that Google is hosting the data as well as aggregating it, on one level, just means there are fewer RSS feeds it needs to ping.
On another level, though, the data hosting represents Google’s attempt to control the content and give itself a leg-up on the other companies that will no doubt want to build the same kind of vertical search services in the future.
Attempting to wall off that data is surely doomed to failure. If ever there was a need for open standards, it’s here, and the structured blogging initiative is probably where we have to look. As an aside, I think the people behind Structured Blogging made a dreadful mistake in choosing that name. Tying the idea so tightly to what people understand blogging to be really limits how they approach the whole concept (it certainly limited my original understanding of what was possible). Surely “structured syndication” or “structured feeds” would have encouraged people to think about what’s being attempted in far more general, wider-reaching terms.
But back to the main gist. Where I think we’re heading is that, in the not-too-distant future, every user will own an online XML data store that adheres to standard XML schemas for the many different information types (classified ad, blog post, contact information, calendar entry, etc.) they may want to publish.
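To make the idea concrete, here’s a minimal sketch of what one entry in such a data store might look like: an ordinary RSS item carrying machine-readable fields in their own XML namespace, so aggregators can read the data without scraping the presentation. The `example.org` namespace and the field names are hypothetical stand-ins, not any actual structured-blogging or Google Base schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace for a "classified listing" information type;
# real schemas (structured blogging, Google Base item types) would differ.
LISTING_NS = "http://example.org/schemas/listing"

def build_listing_item(title: str, price: str, location: str) -> ET.Element:
    """Build an RSS <item> that carries structured listing data
    alongside the usual presentation-oriented title."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    # Structured fields live in a separate namespace, keeping the
    # "data" cleanly distinct from the presentation layer.
    ET.SubElement(item, f"{{{LISTING_NS}}}price").text = price
    ET.SubElement(item, f"{{{LISTING_NS}}}location").text = location
    return item

item = build_listing_item("Mountain bike for sale", "150 AUD", "Sydney")
print(ET.tostring(item, encoding="unicode"))
```

Any aggregator that knows the shared schema can then query the namespaced fields directly, which is exactly what would let a thousand small vertical-search players consume the same feeds Google does.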
These will almost certainly be free, ad-supported data stores that show you ads while you submit your information. Power users and businesses, though, will be willing to pay a subscription to a provider that can ensure their data and the feeds built on it are always available. Uptime will be everything on this front.
From there, we come to the presentation layer. As a blogger, I’ll be able to shift the presentation of my blog to whichever provider I choose, probably the one that offers the most revenue-generating widgets for me to plug into. Then, of course, you’ll have the aggregators who take my content, combine it with other like content and build vertical search, content or e-commerce businesses on top of it. (I’d suggest all bloggers will be both publishers and aggregators, because those widgets I mentioned will do things like aggregating related items for sale.)
In this world, pretty much anyone could build their very own Monster.com. The much-vaunted and prized “network effect” is nullified. I’m still not exactly sure who wins in this scenario. Certainly, the bikkies get broken up into much tinier pieces, probably into crumbs, which can only be a good thing for bloggers and independent content producers.