This month, our good friend Wil Arndt of mod7 provides a guest editorial/rant on all of the hype surrounding a white elephant that goes by the moniker “Web 2.0”. (By the way, if you are in Vancouver on June 16th, you can catch him and Eric at VIDFEST 2006, where they will be presenting at the Creative Exchange sessions.) So, without further ado, I leave you with Wil and his article: Web 2.0 is Bull, or: How I Learned to Stop Worrying and Love the Web 2.0
I tend to involuntarily wince a little every time I hear that shallow buzzword these days. At the risk of sounding like a dick (or, worse, like a Gore), let me get one thing straight right off the bat: I was into the idea of Web 2.0 waaaay before you were.
Ok — more like I thought about something that was kind-of Web 2.0-like, but let’s not get caught up in semantics.
The Web 2.0-like Dream
Way back in 1998, as I was steeped in transarchitecture, pop culture, Las Vegas, emergent networks, and my favorite non-academic academian, Jaron Lanier (and, oh, yes, I was so into that dude before you were), I was developing this concept for a physical interface that would dynamically structure and personalize the media landscape. I called this interface, interestingly enough, the “media room” (yes, my flair for dramatic wordsmithing was quite under-developed back then). Anyway, this project was part of this bigger concept I was working on called a(n) advertisement for living (exhibited at an architecture conference in Italy in 2000), which examined, among other things, how we as consumers can co-opt advertisement and even use it to redefine our sphere of influence and sense of identity (but that’s not necessarily where the “advertisement” part came from). But more on all that in another article.
The point here is not so much the media room itself, but rather what the media room connected to — the internet, email, telephone, television, and the rest of the mediascape. I posited that one’s personal information (and, far more deeply, one’s own electronic identity) would consist of more than just “local” data (physically stored within the user’s local machine), but it would also be represented by “dispersed” data (that data which is stored non-centrally throughout the web). And it was the job of the media room to help capture, evaluate, and catalog this data and convert that into something understandable and personalized. Think of it as sort of a MySpace that you can live in (along with all the good and bad that implies).
I was stoked about the idea. Think of it: your identity is wrapped up in the sum of everything you’ve done, are doing, contribute to within your “community”, create, hope, dream and bitch about. You create and access your “stuff” non-centrally from any location (assuming you have access to similar tech). And, coolest of all, you create your “space” and inhabit that data like a house. A house that comes with you everywhere you go. And a house that everyone in the network (or the neighborhood) can visit and contribute to in visible, or invisible, ways.
Plus, if your computer crashes, then your stuff is still “out there” in the web ether. It’s like a kind of immortality for your digital self. There were some other facets I was dealing with at the time, too, like visualizing data that modulates with time, wearable computer interfaces, seeing and predicting patterns in meta data (like the stock market, another thing I was a trend-setter on), and architecture that acts like software. You get the picture. I was a frickin’ visionary genius.
The Promise of a Next-Gen Web
So, when I first heard about this thing called Web 2.0, you can imagine my reaction. I was like, “No way! It’s starting to happen! Wicked!” I mean, sure, the form was a little different (mine, theirs), but it was the same concept in principle.
But now that Web 2.0 is being hawked as a critical business advantage for everything from the publishing business to the travel industry, I think it’s time to evaluate both what this was all about at the beginning, and what it’s starting to become. Let’s review the alleged tenets of Web 2.0, according to O’Reilly Media (presented along with a checklist that evaluates my 1998 premonitions):
- the Web as platform – mine? check;
- data as the driving force – check;
- network effects created by an “architecture of participation” – check;
- innovation in assembly of systems and sites composed by pulling together features from distributed, independent developers (a kind of “open source” development) – check;
- lightweight business models enabled by content and service syndication – indirect check;
- the end of the software adoption cycle (“the perpetual beta”) – uh, no, didn’t think of that in 1998, but still cool, I guess;
- software above the level of a single device – and… check.
Exhibit A: Google Maps
We all know that Google Maps totally, totally, deeply, and truly, rocks. If you’re a detractor, then may I humbly offer my estimation of your character: you’re a grump-ass. I mean, if there’s anything else that gets us this much closer to living in Star Trek la-la world (and I mean something that’s this cheap, accessible and easy to use), I’d like to see it.
Then again, evaluated against our benchmark 2.0 criteria, it’s a bit lacking in the “participation” department, but one might argue that that’s remedied by the novel uses (ok, hacks) that users have found for Google Maps. In a sense, Maps is pure 2.0 in that it takes a sort of open source approach to content and builds itself as an application that acts like a building block for other applications.
Exhibit B: Flickr
Next up is everyone’s favorite place to look at banal photos of people you don’t know or care about: Flickr. It’s also a great online hard drive. And a place for inane chatter about the stupid things that too many people seem to get really passionate about.
Regardless of how one may feel about Flickr (and blogs in general, to which Flickr is kindred), it’s an indisputable killer app for its 1.2 million or so content creators. It’s simple, smart, kind of chic to be a part of, and I can store my photos there. I can also have some chit-chat with like-minded people. Not bad.
It’s an online, seemingly indestructible place in the cyber-techno-virtua-2.0-sphere where I can store and share my personal stuff. My digital immortality begins.
Exhibit C: MySpace
If there’s anywhere that a great big middle finger is given to visual design, it’s certainly MySpace. As my buddy, Jeff Weir, so eloquently opines on his blog, “myspace really shows that kids don’t give a shit about a well designed site. Give them personalization (and a place where their parents can’t find them), and you’ve got a killer app.”
Beyond the sheer ugliness of it, there are some other rather interesting aspects to MySpace. Even though I have a MySpace account and sometimes use it for connecting with people with similar interests I would otherwise never have communicated with, I still don’t understand the phenomenon fully. I know of people so into it that they communicate through MySpace mail exclusively (like, they use it instead of email or the telephone). I know people that spend days each week addictively searching for more heads to add to their MySpace friends list. And they’re addicted to voyeuristically browsing people’s profiles.
MySpace is, in many ways, a microcosm of what Web 2.0 could be. But, as a contributor to the Web 2.0 philosophy, it falls short in its isolation from other technologies and communities (more on that later).
It Smells Nice, But The Thorns Sure Hurt
I must admit, you’re witnessing (right now, in realtime) a shift in my attitude towards the idea of Web 2.0 in the very writing of this article. I started out a cynic of the current hype and batch of development. As I work through the current state of the online medium, I’m getting excited again, and I’m almost a believer. Almost. But now that we’ve taken the briefest of tours through the 2.0 landscape, allow me to level my own (though, I’m sure, not completely original) criticisms of “Web 2.0,” the term and movement.
Not Fundamentally Different
First, unlike the potential applications afforded by the Semantic Web, Web 2.0 is not an “improved form” of the World Wide Web. Let’s not kid ourselves—no one is changing the essential manner in which the internet works when they’re blogging about their trip to Rio. At best (and, granted, this is still a big deal), Web 2.0 is a world-view, or attitude, that is influenced by the nature of the internet as medium.
It’s significant that we are finally allowing our attitudes and perspectives to be changed by the inherent quality of decentralized communication and data. Just like early photographs were “paintings in light,” early films were essentially “theater on screen,” and early television was “radio with pictures,” so our first staggering forays into the online world were crude attempts to cram in our current understanding of advertising, television, CD-ROMs, movies, newspapers, and print. The internet was, and still is, a hideous car wreck. But, while Web 2.0 is changing the way in which we think about interaction and the online medium, I take great issue with the choice of terminology here. Maybe the Semantic Web should be called “Web 2.0,” and we can think of a term that’s less pretentious for our dear Web 2.0.
It Doesn’t “Mean” Anything
Speaking of the Semantic Web naturally leads me to my second criticism: namely, that Web 2.0 is missing that meta view, a sense of meaning, and the standard, unifying framework that the Semantic Web aspires to bring about. Sure, at a low level, the technical details of 2.0 seem to be incorporating a rudimentary sort of semantics (through RSS feeds, Google Maps, XML tags and attributes, cross-media blogging, blog aggregators). But the current state of affairs creates a condition whereby millions of loosely-connected, or even outright-isolated, pools of data are floating around out there.
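To make that “rudimentary semantics” point concrete, here’s a minimal sketch (using Python’s standard library, and a made-up feed for illustration) of what an RSS item actually gives software: each part of a post is labeled (title, link, pubDate), so an aggregator can pull posts from thousands of sources without scraping anyone’s HTML. That labeling is real machine-readable meaning, just a very shallow kind compared to what the Semantic Web promises.

```python
# Sketch: the shallow "semantics" an RSS 2.0 feed provides.
# The feed below is invented for illustration, not a real blog.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Some Blog</title>
    <item>
      <title>My Trip to Rio</title>
      <link>http://example.com/rio</link>
      <pubDate>Mon, 05 Jun 2006 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    # Each field is explicitly tagged, so no guessing at page layout is needed.
    return [(i.findtext("title"), i.findtext("link"))
            for i in root.iter("item")]

print(items(FEED))  # [('My Trip to Rio', 'http://example.com/rio')]
```

Useful, but notice what’s missing: nothing here says who the author is across feeds, or that two items from different sites are by the same person. That gap is exactly the “unifying framework” problem described below.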
I have a Flickr account, a Blogger account, my own personal blog, I leave posts on various industry forums, I have a Hotmail account, a Gmail account, various accounts on friends’ blogs and forums, an account for my Emily Carr forum, a page on MySpace, and certainly far more community-centered depots than I can remember right now. All these bits of me are floating around without a unifying framework. That’s unless you consider a Google search on “Wil Arndt” to be a unifying framework, which, although very useful, is still essentially like ripping the pages out of a book and then reading them in a random order. Some would say that Technorati does this to an extent, and I would agree. To an extent. A very limited extent that fails to be truly “meta.”
Everything Looks Like a Blog
My third criticism may not be of importance to the vast majority of web users. But, as a visual designer, and someone that believes in the power of visual design as a means to effectively communicate, I am annoyed (or, grieved?) by the Web 2.0 design aesthetic that’s emerged. Just because we’re learning to think like a decentralized database doesn’t mean we have to create things that look like one. It surprises me that the drive for experimentation in user interface on the web has nearly disappeared in the past few years. It’s like people just gave up and decided that it’s ok that everything looks like a blog. Yes, I’m all for common standards in UI (maybe not to the extreme of some), but I don’t think we’re there yet. Soon, but not yet.
Let me take a clear stand here for the record, in case you missed the subtext: it’s not ok that everything looks like a blog. That’s a form that’s appropriate to a certain message. We can do better. Perhaps we just lack the imagination.
And despite the claim that Web 2.0 is about community, about the people, for the people and by the people, let me assure you that it’s not. Not fully. Yes, content publishing is easier than ever before, but when’s the last time you heard of Jane Average building her own blog, from the ground up, the way she made her own homepage in 1996? The “normal people” just aren’t in control of the form anymore. It’s imposed upon us.
It’s About the Money
The final criticism I have of Web 2.0 today (I do have more criticisms, but this is all I’ll get into here) is people’s insistence on trying to find viable business and/or marketing models for this “technology” while clinging to existing notions of ownership and intellectual property. “How can we make money exploiting this” is a question (paraphrased) I often hear from clients and colleagues. The subtext is usually “(without losing control of our intellectual property)” or “(without taking a risk).”
What’s the Solution?
Well, this might sound like a cop-out, but I can’t go into a fully-formed set of solutions in one article. Nor should I. It wouldn’t be very “Web 2.0” of me to propose a hierarchically-imposed, top-down solution. No, the solution should present itself through that survival-of-the-fittest process that the web is just so good at fostering. The crowd will decide (and it may be a painful process). But, of course, I do have some suggestions for your consideration, dear crowd. And consider these more as guidelines for moving forward, rather than as concrete solutions.
No Magic Bullet
First, let’s acknowledge that, while the philosophy of Web 2.0 is certainly driven by a newer and more mature understanding of the online medium, it’s not the magic bullet for every content and communication situation and challenge. And those of you that say it is the magic bullet need to tap back into that sense of openness and eureka-ness you experienced when you first embraced 2.0 as a leap forward.
Viva La Revolution
It’s exciting to me that the internet is challenging, and will continue to challenge, accepted notions of ownership and “usership”. So let’s stop trying to see how we can fit intellectual property law and traditional content distribution into the Web 2.0 model and instead focus on using our new 2.0-informed perspective to radically change and revolutionize how we approach these issues.
Dear Browser: It’s Not Me, It’s You
Third—and this may seem like it’s coming out of nowhere (but I’m the one writing the article and am thus entitled to inject my wish list)—let’s put a greater emphasis on web-enabled desktop applications. Even though today’s web-enabled apps may seem trivial, they are, indeed, the future. This will especially be the case when we inevitably move beyond the desktop, and, thus, beyond the browser.
You’re Looking at the Trees Again, Poindexter
Next, let’s stop geeking out on the minute details of a set of technologies. Stop using technology implementation details as the very definition for Web 2.0. Look at the forest, right?
Web 2.0 should never be defined by XML, RSS, APIs, CMSs, SOAP, XHTML, CSS, PHP, blah, blah, blah. All that geek talk leads to deadlocks where there should be none (“My CMS is better than yours,” “Oh, yeah? Well at least my code is standards-compliant!”). Screw you. Get with the program, people. You want to evangelize a new way of thinking to the masses? Step one: stop eating your young.
Who Cares, Anyway—Web 2.0 Will Die by 2008
Maybe it’s just me, but I would love it if we just stopped using the term “Web 2.0” altogether. We designers, web elites and technology pundits are all getting on the same page (slowly), anyway, so why confuse issues with a meaningless term? Plus, so many companies are riding the Web 2.0 wave as a marketing gimmick that it denigrates and weakens the core of what it’s all really about. But, of course, that won’t happen, so I guess I should just shut up and be happy that my 1998 prophecies are finally manifesting themselves in unexpected ways.
In the end, it doesn’t matter. The term, “Web 2.0” will eventually go away and die—drunk and in a self-pitying corner of obscurity (Las Vegas)—while the rest of us will enjoy the wonderful legacies it leaves behind: a better way of thinking about the web and, at the very least, better ways to find out what kinds of bags people around the globe prefer. The world will be better for it.
Wil Arndt is the Creative Director and Founding Principal of mod7. When not designing, teaching design at the Emily Carr Institute, thinking about digital living, kickin’ it on the drums, creating industrial metal music, or blowing bubbles with his kids, Wil obsessively toils away at his latest graphic novel (which is about stuff in space 10,000 years into the future), hopefully to be released sometime before the events in that story are due to unfold. He does not own a mobile phone or BlackBerry. Yet.