Towards the Skeletal Web
The blogosphere went a little crazy last week with the announcement of the Google Sitemaps initiative, but in the true spirit of scholarly research I thought I'd jot down my own thoughts before seeing what everyone else has written...
My immediate impression, and one that I fired off to a couple of OU conferences the day the initiative was released, was that this seemed to me like Google trying to get webmasters to do a lot of the grunt work with regard to site mapping. (For readers who haven't come across the thread before now: Google are proposing that webmasters publish an XML sitemap somewhere obvious, and upload a copy to Google.)
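For reference, the proposed sitemap is just a simple XML file along these lines - this is a sketch using the 0.84 schema namespace Google announced, so treat the details as provisional (and the URL is obviously made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://wherever.com/</loc>
    <lastmod>2005-06-03</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- ...one <url> entry per page on the site... -->
</urlset>
```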
There's nothing really new about this, I guess - after all, Amazon has long been using the co-developer/user community to build Amazon storefronts with the Amazon e-commerce API...
The second thing that came to mind was - will all my separate Blogger pages now be described in a blog sitemap? (That would be handy...)
The third thought that struck me was what this might mean in terms of republishing the web...
Let me explain: one way of categorising websites is to look at them as: 1) content pages, 2) presented with some styling/branding (and extraneous advertising!), 3) organised according to the site architecture.
It's easy enough to lose the style gubbins by simply republishing each page as unstyled XML, or even RSS, but there is then a problem for the user wishing to navigate your site (other than by maintaining a sitewide RSS link'n'highlights feed). However, if the sitemap becomes widely used, as it may with the weight of Google behind it, and probably will if the other search engines get into it too, then it will provide a way of navigating the feed web easily.
So imagine - I stumble across a web page (a real web page, one day) whose content I like, and which suggests there's a good site behind it. Considerate as the designer is, the page is <link /> tagged with references to the RSS feed for the page (in a standardised form easily derived from the URI of the (X)HTML page) and also to the sitemap (e.g. <link rel="sitemap" href="http://wherever.com/sitemap.xml" />)...
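Auto-discovery of that sort is trivial for a tool to implement - here's a rough sketch in Python (bearing in mind that rel="sitemap" is just my suggested convention above, not anything standardised, and the URLs are placeholders):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect <link rel="..." href="..."> references from a page's head."""
    def __init__(self):
        super().__init__()
        self.links = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if "rel" in a and "href" in a:
                self.links[a["rel"]] = a["href"]

# A page marked up the considerate way described above
page = """<html><head>
<link rel="alternate" type="application/rss+xml" href="http://wherever.com/page.rss" />
<link rel="sitemap" href="http://wherever.com/sitemap.xml" />
</head><body>...</body></html>"""

finder = LinkFinder()
finder.feed(page)
print(finder.links["sitemap"])  # the sitemap location, ready to fetch
```

A browser extension or aggregator could run something like this over any page you land on and offer the whole site up for remixing.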
...and now the site is mine...I can provide my own navigation, style the content pages as I wish, and dump the ads and extra pages...
The scene is now set for the fourth thought - suppose that del.icio.us+ gets on the case, and people start tagging up sitemaps. Rather than being forced to navigate the site via the sitemap hierarchy, perhaps new structures will emerge based on individual interpretations of tag sets and clusters. Could it be that site remixes will become more popular than the sites themselves (as has historically been the case with accessible versions of complex, media-rich sites)?
It will be fun to see...
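To see how little machinery that navigation layer needs, here's a sketch of pulling the page list out of a sitemap with Python's standard library (again assuming the 0.84 schema and invented URLs):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.google.com/schemas/sitemap/0.84"}

# A fetched sitemap, inlined here for the sake of the example
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url><loc>http://wherever.com/</loc></url>
  <url><loc>http://wherever.com/articles/one.html</loc></url>
  <url><loc>http://wherever.com/articles/two.html</loc></url>
</urlset>"""

root = ET.fromstring(sitemap)
# One <loc> per page - enough to build your own menu over the feed versions
pages = [url.findtext("sm:loc", namespaces=NS) for url in root.findall("sm:url", NS)]
for page in pages:
    print(page)
```

From that flat list you could derive a navigation tree by splitting on URL path segments - which is exactly the structure the styled site was hiding behind its menus.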