The Rules, 2007

Web 2.0 has matured to the point where even those who endorse the moniker are beginning to cringe at its use. Still, it gave me pause the other day when Cliff (a sysop) began a sentence with “Web 2.0 standards require….”

Web 2.0 is now coherent enough to have standards? We used to joke about rounded corners and gradient blends being the rule, but something more has indeed emerged. O’Reilly defined Web 2.0 by example, and Time Magazine echoed Kevin Kelly’s assertion when it named You as its Person of the Year: Web 2.0 is about people. And “the rules” are emerging as a matter of market forces and natural selection.

Open Source

No matter your position on the Free Software Foundation’s philosophy, open source development reduces costs, improves quality, and gets new ideas to market faster.

Flickr is among those that have been rather public about their use of the LAMP stack, though Google and others have quietly built their businesses on it too. WordPress, a rare example of a downloadable Web 2.0 application, has enjoyed active development (and even a resurrection) thanks to its GNU General Public License.

Still other Web 2.0 applications extend the open source model further: open source content, the user’s ability to declare a Creative Commons license on their own contributions, is becoming common (and demanded by some). We may argue about the accuracy of Wikipedia, but the fact is that it’s among the most likely sites to appear in a web search and it’s consistently ranked among the top sites for traffic.

Wikipedia’s early contributors, looking at a young site with an unclear value proposition, could trust that their work would be protected by license (specifically, the GNU Free Documentation License).

Built for Remixing

Amazon reports that almost a third of its sales are attributable to remixers and boasts 180,000 registered developers of its API.

Google Maps didn’t include a public API when it first launched, but the community responded with enthusiasm, quickly reverse engineering the JavaScript to build new applications. Google answered by releasing a public API, making internet mapping and Google almost synonymous.

Dan Catt mashed up flickr and Google Maps on his own before Yahoo!/flickr snatched him up to build those features into flickr’s own site. But the company still enjoys the efforts of developers building applications against the flickr API, independently developing new features and adding value to the service.

Like open source, remixability and APIs engage a larger pool of talent than is available inside any company and serve two very important audiences: those who want features and those who care about their exit strategy. Neither group is remarkably large, but both are made up of influential, passionate users. (More: Usability, Findability, and Remixability, Especially Remixability.)
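To make the remix point concrete, here’s a minimal sketch of what building against an API like flickr’s can look like. It assumes Python, flickr’s public REST endpoint and its flickr.photos.search method, and a placeholder API key (YOUR_API_KEY) that you’d replace with your own:

```python
# A minimal remix against the flickr REST API: search public photos by tag and
# print each title plus a static image URL. YOUR_API_KEY is a placeholder; the
# endpoint, method, and parameters are flickr's documented ones.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: register for a key at flickr.com/services/api/

def search_photos(tag, per_page=5):
    """Call flickr.photos.search and return the parsed JSON response."""
    params = urllib.parse.urlencode({
        "method": "flickr.photos.search",
        "api_key": API_KEY,
        "tags": tag,
        "per_page": per_page,
        "format": "json",
        "nojsoncallback": 1,  # plain JSON rather than a JSONP wrapper
    })
    with urllib.request.urlopen("https://api.flickr.com/services/rest/?" + params) as resp:
        return json.load(resp)

if __name__ == "__main__":
    for photo in search_photos("sunset")["photos"]["photo"]:
        # Each result carries the pieces of flickr's documented static-image URL.
        url = "https://farm{farm}.staticflickr.com/{server}/{id}_{secret}.jpg".format(**photo)
        print(photo["title"], url)
```

That handful of lines is the seed of most mashups: fetch somebody else’s data over a documented interface, then combine it with something of your own.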

Well Behaved and Social

Predictable and reliable URLs are essential to allowing users to bookmark and link to your site; well-formed semantic markup makes it easier for screen readers and search engines to make sense of the content. Semantic markup and microformats aid in remixability, contribute greatly to the Semantic Web, make site redesigns easier, and generally display better in a broader variety of formats and clients (think HTML vs. RSS).
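To make the microformats point concrete, here’s an illustrative sketch (not a complete microformats parser). An hCard is ordinary semantic HTML whose agreed-upon class names (vcard, fn, org, url) double as a data format, so a few lines of standard-library Python can lift structured data out of a page without any dedicated API; the sample markup below follows the hCard convention, and everything else is hypothetical:

```python
# An hCard is plain HTML whose class names act as a data format, so structured
# data can be pulled out with nothing but the standard library.
# Illustrative sketch only: real microformats parsers handle far more cases.
from html.parser import HTMLParser

SAMPLE = """
<div class="vcard">
  <a class="url fn" href="http://example.com/">Jane Example</a>
  <span class="org">Example Industries</span>
</div>
"""

class HCardSketch(HTMLParser):
    """Collect a few hCard properties from class attributes and element text."""
    PROPERTIES = {"fn", "org", "url"}

    def __init__(self):
        super().__init__()
        self.open_props = []  # properties of the element currently being read
        self.card = {}        # extracted property -> value

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        classes = (attrs.get("class") or "").split()
        self.open_props = [c for c in classes if c in self.PROPERTIES]
        # For the url property, the value is the href rather than the link text.
        if "url" in self.open_props and "href" in attrs:
            self.card["url"] = attrs["href"]
            self.open_props.remove("url")

    def handle_data(self, data):
        if data.strip():
            for prop in self.open_props:
                self.card[prop] = data.strip()

    def handle_endtag(self, tag):
        self.open_props = []

parser = HCardSketch()
parser.feed(SAMPLE)
print(parser.card)  # {'url': 'http://example.com/', 'fn': 'Jane Example', 'org': 'Example Industries'}
```

That’s the remixability argument in miniature: because the markup says what the content is, other people’s tools can reuse it without asking.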

People are anxious to leave comments telling us how right or wrong we are, so a site without comments/trackbacks/pingbacks is turning its back on its users. Good sites recognize the value of their users and cultivate the community. Caterina Fake did a lot of that for flickr (see her comments on my first photos there), while MetaFilter exists entirely as a community.

That doesn’t mean users are itching to build somebody else’s site; the lesson is that personal value precedes network value. Good sites make it easier for people to do what they want to do, not what their boss or the site’s creator wants.

If it isn’t obvious already: empower the user to achieve their own goals and control their experience.