<videovortex> web 2.0 criticism (from mute)
Geert Lovink
geert at xs4all.nl
Wed Feb 14 14:11:42 CET 2007
InfoEnclosure 2.0
Mute, 29 January 2007
http://www.metamute.org/en/InfoEnclosure-2.0
By Dmytri Kleiner & Brian Wyrick
The hype surrounding Web 2.0’s ability to democratise content
production obscures its centralisation of ownership and the means of
sharing. Dmytri Kleiner & Brian Wyrick expose Web 2.0 as a venture
capitalist’s paradise where investors pocket the value produced by
unpaid users, ride on the technical innovations of the free software
movement and kill off the decentralising potential of peer-to-peer
production
Wikipedia says that ‘Web 2.0, a phrase coined by O’Reilly Media in
2004, refers to a supposed second generation of internet-based services
– such as social networking sites, wikis, communication tools, and
folksonomies – that emphasise online collaboration and sharing among
users.’
The use of the word ‘supposed’ is noteworthy. As probably the largest
collaboratively authored work in history, and one of the current
darlings of the internet community, Wikipedia ought to know. Unlike
most of the members of the Web 2.0 generation, Wikipedia is controlled
by a non-profit foundation, earns income only by donation and releases
its content under the copyleft GNU Free Documentation License. It is
telling that Wikipedia goes on to say ‘[Web 2.0] has become a popular
(though ill-defined and often criticised) buzzword among certain
technical and marketing communities.’
The free software community has tended to be suspicious, if not
outright dismissive, of the Web 2.0 moniker. Tim Berners-Lee dismissed
the term, saying ‘Web 2.0 is of course a piece of jargon, nobody even
knows what it means.’ He goes on to note that ‘it means using the
standards which have been produced by all these people working on Web
1.0.’
In reality there is neither a Web 1.0 nor a Web 2.0, there is an
ongoing development of online applications that cannot be cleanly
divided.
In trying to define what Web 2.0 is, it is safe to say that most of the
important developments have been aimed at enabling the community to
create, modify and share content in ways that were previously available
only to centralised organisations that bought expensive software
packages, paid staff to handle the technical aspects of the site, and
paid staff to create content, which was generally published only on
that organisation’s own site.
A Web 2.0 company fundamentally changes the mode of production of
internet content. Web applications and services have become cheaper and
easier to implement, and by allowing the end users access to these
applications, a company can effectively outsource the creation and the
organisation of its content to the end users themselves. Instead of
the traditional model of a content provider publishing their own
content and the end user consuming it, the new model allows the
company’s site to act as the centralised portal between the users who
are both creators and consumers.
For users, access to these applications empowers them to create and
publish content that would previously have required purchasing desktop
software and possessing a greater technological skill set. For
example, two of the primary means of text-based content production in
Web 2.0 are blogs and wikis, which allow users to create and publish
content directly from their browser without any real need for knowledge
of markup language, file transfer or syndication protocols, and all
without the need to purchase any software.
The use of the web application to replace desktop software is even more
significant for the user when it comes to content that is not merely
textual. Not only can web pages be created and edited in the browser
without purchasing HTML editing software, but photographs can also be
uploaded and manipulated online through the browser without expensive
desktop image manipulation applications. A video shot on a
consumer camcorder can be submitted to a video hosting site, uploaded,
encoded, embedded into an HTML page, published, tagged, and syndicated
across the web all through the user’s browser.
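To give a rough sense of how little machinery this takes on the hosting
side, the sketch below shows roughly what the ‘encode and embed’ step
might look like. It is purely illustrative: it assumes the ffmpeg tool
is installed, and the paths, video id and embed address are invented
rather than taken from any actual service.

import subprocess
import uuid

def process_upload(source_path):
    """Illustrative sketch of a hosting site's server side: transcode an
    uploaded video and hand back an embed snippet for the uploader."""
    video_id = uuid.uuid4().hex[:8]                 # invented id scheme
    output_path = f"/var/media/{video_id}.mp4"      # invented path
    # Re-encode whatever the camcorder produced into one web-friendly format.
    subprocess.run(["ffmpeg", "-i", source_path, output_path], check=True)
    # The 'embed' step: a snippet the uploader can paste into any page,
    # pointing back at the host's own player (the address is made up).
    return (f'<iframe src="https://video.example.com/embed/{video_id}" '
            f'width="480" height="360"></iframe>')

if __name__ == "__main__":
    print(process_upload("/tmp/holiday.avi"))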
In his article on Web 2.0, Paul Graham breaks the community of users
down into more specific roles: the Professional, the Amateur, and the
User (more specifically, the end user). The roles of the Professional
and the User were, according to
Graham, well understood in Web 1.0, but the Amateur didn’t have a very
well defined place. As Graham describes it in ‘What Business Can Learn
From Open Source’, the Amateur just loves to work, with no concern for
compensation or ownership of that work; in development, the Amateur
contributes to open source software whereas the Professional gets paid
for their proprietary work.
Graham’s characterisation of the ‘Amateur’ reminds one of If I Ran The
Circus by Dr. Seuss, where young Morris McGurk says of the staff of his
imaginary Circus McGurkus:
My workers love work. They say, ‘Work us! Please work us!
We’ll work and we’ll work up so many surprises
You’d never see half if you had forty eyses!’
And while ‘Web 2.0’ may mean nothing to Tim Berners-Lee, who sees
recent innovations as no more than the continued development of the
web, to venture capitalists, who like Morris McGurk daydream of
tireless workers producing endless content and not demanding a pay
cheque for it, it sounds stupendous. And indeed, from YouTube to Flickr
to Wikipedia, you’d truly never see half if you had forty eyses.
Tim Berners-Lee is correct. There is nothing from a technical or user
point of view in Web 2.0 which does not have its roots in, and is not a
natural development from, Web 1.0. The technology associated with the
Web 2.0 banner was possible and in some cases readily available before,
but the hype surrounding the term has certainly fuelled the growth of
Web 2.0 internet sites.
The internet (which is more than the web, actually) has always been
about sharing between users. In fact, Usenet, a distributed messaging
system, has been operating since 1979! Since long before even Web 1.0,
Usenet has been hosting discussions, ‘amateur’ journalism, and enabling
photo and file sharing. Like the internet, it is a distributed system
not owned or controlled by anyone. It is this quality, a lack of
central ownership and control, that differentiates services such as
Usenet from Web 2.0.
If Web 2.0 means anything at all, its meaning lies in the rationale of
venture capital. Web 2.0 represents the return of investment in
internet startups. After the dotcom bust (the real end of Web 1.0)
those wooing investment dollars needed a new rationale for investing in
online ventures. ‘Build it and they will come’, the dominant attitude
of the ’90s dotcom boom, along with the delusional ‘new economy’, was
no longer attractive after so many online ventures failed. Building
infrastructure and financing real capitalisation was no longer what
investors were looking for. Capturing value created by others, however,
proved to be a more attractive proposition.
Web 2.0 is Internet Investment Boom 2.0. Web 2.0 is a business model:
it means private capture of community-created value. No one denies that
the technology of sites like YouTube, for instance, is trivial. This is
more than evidenced by the large number of identical services such as
DailyMotion. The real value of YouTube is not created by the developers
of the site, but rather it is created by the people who upload videos
to the site. Yet, when YouTube was bought for over a billion dollars
worth of Google stock, how much of this stock was acquired by those
that made all these videos? Zero. Zilch. Nada. Great deal if you are an
owner of a Web 2.0 company.
The value produced by users of Web 2.0 services such as YouTube is
captured by capitalist investors. In some cases, the actual content
they contribute winds up as the property of the site’s owners. Private
appropriation of community created value is a betrayal of the promise
of sharing technology and free cooperation.
Unlike Web 1.0, where investors often financed expensive capital
acquisition, software development and content creation, a Web 2.0
investor mainly needs to finance hype-generation, marketing and buzz.
The infrastructure is widely available for cheap, the content is free
and the cost of the software, at least the part of it that is not also
free, is negligible. Basically, by providing some bandwidth and disk
space, you can run a successful internet site if you can market it
effectively.
The principal success of a Web 2.0 company comes from its relationship
to the community, more specifically, the ability of the company to
‘harness collective intelligence’, as O’Reilly puts it. Web 1.0
companies were too monolithic and unilateral in their approach to
content. Success stories of the transition from Web 1.0 to Web 2.0 were
based on the ability for a company to remain monolithic in its brand of
content, or better yet, its outright ownership of that content, while
opening up the method of that content’s creation to the community.
Yahoo! created a portal to community content while remaining the
centralised location to find that content. eBay allows the community to
sell its goods while owning the marketplace for those goods. Amazon,
selling the same products as many other sites, succeeded by allowing
the community to participate in the ‘flow’ around their products.
Because the capitalists who invest in Web 2.0 startups do not often
fund early capitalisation, their behaviour is markedly more parasitic
as well. They often arrive late in the game when value creation already
has good momentum, swoop in to take ownership and use their financial
power to promote the service, often within the context of a hegemonic
network of major, well financed partners. This means that companies
that are not acquired by venture capital end up cash-starved and
squeezed out of the club.
In all these cases, the value of the internet site is created not by
the paid staff of the company that runs it, but by the users who use
it. With all of the emphasis on community created content and sharing,
it’s easy to overlook the other side of the Web 2.0 experience:
ownership of all this content and the ability to monetise its value. To
the user, this doesn’t come up that often; it’s only part of the fine
print in their MySpace Terms of Service agreement, or it’s the Flickr.com
in the URL of their photos. It doesn’t usually seem like an issue to the
community; it’s a small price to pay for the use of these wonderful
applications and for the impressive effect on search engine results
when one queries one’s own name. Since most users do not have access to
alternative means to produce and publish their own content, they are
attracted to sites like MySpace and Flickr.
Meanwhile, the corporate world was pushing a whole different idea of
the Information Superhighway, producing monolithic, centralised ‘online
services’ like CompuServe, Prodigy and AOL. What separated these from
the internet is that they were centralised systems to which all users
connected directly, while the internet is a peer-to-peer network: every
device with a public internet address can communicate directly with any
other device. This is what makes peer-to-peer technology possible; it is
also what makes independent internet service providers possible.
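As a minimal sketch of what that peer-to-peer property means in
practice (illustrative only; the port number and addresses below are
arbitrary), any machine can listen on a socket while any other connects
to it directly, with no portal in between:

import socket
import threading
import time

PORT = 9000  # arbitrary port chosen for the sketch

def listen():
    # One peer simply listens on its public address.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("0.0.0.0", PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            msg = conn.recv(1024).decode()
            conn.sendall(f"received '{msg}' directly from {addr[0]}".encode())

def connect(host):
    # Any other peer connects straight to that address.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, PORT))
        cli.sendall(b"hello from another peer")
        print(cli.recv(1024).decode())

if __name__ == "__main__":
    threading.Thread(target=listen, daemon=True).start()
    time.sleep(0.5)        # give the listener a moment to start
    connect("127.0.0.1")   # on the real internet: the other peer's address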
It should be added that many open source projects can be cited as the
key innovations in the development of Web 2.0: free software like
Linux, Apache, PHP, MySQL, Python, etc. are the backbone of Web 2.0,
and the web itself. But there is a fundamental flaw with all of these
projects in terms of what O’Reilly refers to as the Core Competencies
of Web 2.0 Companies, namely control over unique, hard-to-recreate data
sources that get richer as more people use them – the harnessing of the
collective intelligence they attract. Allowing the community to
contribute openly and to utilise that contribution within the context
of a proprietary system where the proprietor owns the content is a
characteristic of a successful Web 2.0 company. Allowing the community
to own what it creates, though, is not. Thus, to be successful and
create profits for investors, a Web 2.0 company needs to create
mechanisms for sharing and collaboration that are centrally controlled.
The lack of central control that characterises Usenet and other
peer-controlled technologies is the fundamental flaw. They only benefit
their users; they do not benefit absentee investors, as they are not
‘owned’.
Thus, because Web 2.0 is funded by Capitalism 2006, Usenet is mostly
forgotten. While everybody uses Digg and Flickr, and YouTube is worth a
billion dollars, PeerCast, an innovative peer-to-peer live video
streaming network that has been in existence for several years longer
than YouTube, is virtually unknown.
From a technological standpoint, distributed and peer-to-peer (P2P)
technologies are far more efficient than Web 2.0 systems. Making better
use of network resources by using the computers and network connections
of users, P2P avoids the bottlenecks created by centralised systems and
allows content to be published with less infrastructure, often no more
than a computer and a consumer internet connection. P2P systems do not
require the massive data centres of sites such as YouTube. The lack of
central infrastructure also comes with a lack of central control,
meaning that censorship, often a problem with privately-owned
‘communities’ that frequently bend to private and public pressure
groups and enforce limitations on the kinds of content they allow, is
far harder to impose. Also, the lack of large central cross-referencing
databases of user information is a strong advantage in terms of
privacy.
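A toy simulation (not any real protocol; all the numbers are invented)
makes the point about bottlenecks concrete: once peers hold pieces of a
file they serve each other, so the original publisher’s upload burden
barely grows as the swarm does.

import random

PIECES = set(range(20))                    # a file split into 20 pieces
peers = {"publisher": set(PIECES)}         # at first only the publisher has it
peers.update((f"peer{i}", set()) for i in range(50))
uploads = {name: 0 for name in peers}      # how many pieces each node served

# Each round, every peer asks one random other peer for a piece it lacks.
for _ in range(300):
    for name, have in peers.items():
        missing = PIECES - have
        if not missing:
            continue
        other = random.choice([p for p in peers if p != name])
        available = missing & peers[other]
        if available:
            have.add(available.pop())
            uploads[other] += 1

complete = sum(have == PIECES for have in peers.values())
print(f"peers with the whole file: {complete} of {len(peers)}")
print(f"pieces served by the publisher: {uploads['publisher']}")
print(f"pieces served by the other peers: {sum(uploads.values()) - uploads['publisher']}")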
From this perspective, it can be said that Web 2.0 is capitalism’s
preemptive attack against P2P systems. Despite its many disadvantages
in comparison to P2P systems, Web 2.0 is more attractive to investors, and
thus has more money to fund and promote centralised solutions. The end
result of this is that capitalist investment flowed into centralised
solutions making them easy and cheap or free for non-technical
information producers to adopt. Thus, this ease of access compared to
the more technically challenging and expensive undertaking of owning
your own means of information production created a ‘landless’
information proletariat ready to provide alienated content-creating
labour for the new info-landlords of Web 2.0.
It is often said that the internet took the corporate world by
surprise, coming as it did out of publicly funded university and
military research. It was promoted by way of a cottage industry of
small independent internet service providers who were able to squeeze a
buck out of providing access to the state-built and financed network.
The internet seemed anathema to the capitalist imagination. Web 1.0,
the original dotcom boom, was characterised by a rush to own the
infrastructure, to consolidate the independent internet service
providers. While money was thrown around quite randomly as investors
struggled to understand what this medium would actually be used for,
the overall mission was largely successful. If you had an internet
account in 1996 it was likely provided by some small local company. Ten
years later, while some of the smaller companies have survived, most
people get their internet access from gigantic telecommunications
corporations. The mission of Internet Investment Boom 1.0 was to
destroy the independent service provider and put large, well-financed
corporations back in the driving seat.
The mission of Web 2.0 is to destroy the P2P aspect of the internet. To
make you, your computer, and your internet connection dependent on
connecting to a centralised service that controls your ability to
communicate. Web 2.0 is the ruin of free, peer-to-peer systems and the
return of monolithic ‘online services’. A telling detail here is that
most home or office internet connections in the ’90s, modem and ISDN
connections, were symmetric – equal in their ability to send and
receive data. By design, your connection enabled you to be equally a
producer and a consumer of information. On the other hand, modern DSL
and cable-modem connections are asymmetric, allowing you to download
information quickly, but upload slowly. Not to mention the fact that
many user agreements for internet service forbid you to run servers on
your consumer circuit, and may cut off your service if you do.
Capitalism, rooted in the idea of earning income by way of idle share
ownership, requires centralised control, without which peer producers
have no reason to share their income with outside shareholders.
Capitalism, therefore, is incompatible with free P2P networks, and
thus, so long as the financing of internet development comes from
private shareholders looking to capture value by owning internet
resources, the network will only become more restricted and
centralised.
It should be noted that even in the case of commons-based peer
production, so long as the commons and membership in the peer group are
limited, and inputs such as food for the producers and the computers
that they use are acquired from outside the commons-based peer group,
then the peer producers themselves may be complicit in the exploitative
capturing of this labour value. Thus in order to really address the
unjust capture of alienated labour value, access to the commons and
membership in the peer group must be extended as far as possible toward
the inclusion of a total system of goods and services. Only when all
productive goods are available from commons-based producers can all
producers retain the value of the product of their labour.
And while the information commons may have the possibility of playing a
role in moving society toward more inclusive modes of production, any
real hope for a genuine, community enriching, next generation of
internet-based services is not rooted in creating privately owned,
centralised resources, but rather in creating cooperative, P2P and
commons-based systems, owned by everybody and nobody. Although small
and obscure by today’s standards, with its focus on peer-to-peer
applications such as Usenet and email, the early internet was very much
a common, shared resource. Along with the commercialisation of the
internet and the emergence of capitalist financing comes the enclosure
of this information commons, translating public wealth into private
profit. Thus Web 2.0 is not to be thought of as a second-generation of
either the technical or social development of the internet, but rather
as the second wave of capitalist enclosure of the Information Commons.
Virtually all of the most used internet resources could be replaced by
P2P alternatives. Google could be replaced by a P2P search system,
where every browser and every webserver were active nodes in the search
process; Flickr and YouTube could also be replaced by PeerCast and
eDonkey-type applications, which allow users to use their own computers
and internet connections to collaboratively share their pictures and
videos. However, developing internet resources requires the application
of wealth, and so long as the source of this wealth is finance capital,
the great peer-to-peer potential of the internet will remain
unrealised.
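As a back-of-the-envelope sketch of the kind of search this implies
(Gnutella-style query flooding rather than any particular project’s
design; all the node names and pages below are invented), each node
would index only the documents it hosts itself and forward queries to
its neighbours:

class Node:
    # A node indexes only its own documents and knows a few neighbours;
    # there is no central index anywhere in the network.
    def __init__(self, name, documents):
        self.name = name
        self.documents = documents      # {url: text} hosted on this node
        self.neighbours = []

    def search(self, term, ttl=3, seen=None):
        # Answer from the local index, then flood the query onward.
        seen = seen if seen is not None else set()
        if self.name in seen or ttl == 0:
            return []
        seen.add(self.name)
        hits = [url for url, text in self.documents.items() if term in text]
        for peer in self.neighbours:
            hits.extend(peer.search(term, ttl - 1, seen))
        return hits

# A tiny invented network of three nodes.
a = Node("a", {"http://a.example/post": "peer to peer video streaming"})
b = Node("b", {"http://b.example/wiki": "notes on free software and copyleft"})
c = Node("c", {"http://c.example/blog": "copyleft licences explained"})
a.neighbours, b.neighbours, c.neighbours = [b], [a, c], [b]

print(a.search("copyleft"))   # finds the pages on b and c, no central index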
--
Dmytri Kleiner <dk AT haagenti.com> is an anarchist hacker and a
co-founder of Telekommunisten, a worker-owned technology company
specialising in telephone systems. Dmytri is a USSR-born Canadian,
currently living in Berlin with his wife Franziska and his daughter
Henriette.
Brian Wyrick <brian AT pseudoscope.com> is an artist, film maker and
web developer working in Berlin and Chicago. He also co-founded Group
312 Films, a Chicago-based film group, and posts updates regarding his
projects and adventures at http://www.pseudoscope.com