Ana Viseu on Mon, 20 Aug 2001 04:30:35 +0200 (CEST)



<nettime> geography, control and hackers - Taming the Web


[Apparently we are in an age of reflection. In the last week or so the 
trend has been to reflect on the internet--both on the technology and its 
accompanying culture and beliefs. So, we now know that the internet is not 
a-spatial or a-material, that information does not want to be free, that 
the new economy is as much about property as the old one, that the new 
(lack of) nature of the internet is its main feature, etc.

In this article Charles C. Mann (Technology Review) reviews this period of 
reflection, summarizing and criticizing, with empirical examples, three Net 
myths that have proved long-lived in this 'information age'. These are: 
Myth #1: The Internet Is Too International to Be Controlled; Myth #2: The 
Net Is Too Interconnected to Control; Myth #3: The Net Is Too Filled with 
Hackers to Control.

Mann's article closes with a critique of hacker culture: by not recognizing 
that the Net is in fact controllable, it excludes itself from the process 
of making the rules, thus allowing others to make them instead. This 
echoes both Lessig's plea that the 'internet does not take care of itself' 
and that it is everyone's 'job' to get involved in the process, and 
Borsook's critique of libertarian culture. In 'Cyberselfish' Borsook argues, 
among other things, that one of the main faults of libertarian culture is 
that, by denying that politics is useful for anything, its proponents do not 
know how to use it for their own ends and end up being left out of the loop.

Best. Ana]




http://www.techreview.com/magazine/sep01/mann.asp
Technology Review, September 2001
By Charles C. Mann

Taming the Web


"Information wants to be free." "The Internet can't be controlled." We've 
heard it so often that we sometimes take for granted that it's true. But 
THE INTERNET CAN BE CONTROLLED, and those who argue otherwise are hastening 
the day when it will be controlled too much, by the wrong people, and for 
the wrong reasons.

Last December, Vincent Falco, a 28-year-old game programmer in West Palm 
Beach, FL, released version 1.0 of a pet project he called BearShare. 
BearShare is decentralized file-sharing software; that is, it allows 
thousands of Internet users to search each other's hard drives for files 
and exchange them without any supervision or monitoring. Released free of 
charge, downloaded millions of times, BearShare is a raspberry in the face 
of the music, film and publishing industries: six months after the release 
of version 1.0, tens of thousands of songs, movies, videos and texts were 
coursing through the network every day. Because the software links together 
a constantly changing, ad hoc collection of users, Falco says, "there's no 
central point for the industries to attack." BearShare, in other words, is 
unstoppable.

Which, to Falco's way of thinking, is entirely unsurprising, almost a matter 
of course. BearShare is just one more example, in his view, of the way that 
digital technology inevitably sweeps aside any attempt to regulate 
information. "You can't stop people from putting stuff on the Net," Falco 
says. "And once something is on the Net you can't stop it from spreading 
everywhere."

The Internet is unstoppable! The flow of data can never be blocked! These 
libertarian claims, exemplified by software like BearShare, have become 
dogma to a surprisingly large number of Internet users. Governments and 
corporations may try to rein in digital technology, these people say, but 
it simply will never happen because...information wants to be free. 
Because, in a phrase attributed to Internet activist John Gilmore, the Net 
treats censorship as damage and routes around it. Laws, police, governments 
and corporations: all are helpless before the continually changing, endlessly 
branching, infinitely long river of data that is the Net.

To the generations nurtured on 1984, Cointelpro and The Matrix, the image 
of a global free-thought zone where people will always be able to say and 
do what they like has obvious emotional appeal. Little wonder that the 
notion of the Net's inherent uncontrollability has migrated to the 
mainstream media from the cyberpunk novels and technoanarchist screeds 
where it was first articulated in the late 1980s. A leitmotif in the 
discussion of the Napster case, for example, was the claim that it was 
futile for the recording industry to sue the file-swapping company because 
an even more troublesome file-swapping system would inevitably emerge. And 
the rapid appearance of BearShare (along with LimeWire, Audiogalaxy, Aimster 
and a plethora of other file-swapping programs) seemed to bear this out.

Nonetheless, the claim that the Internet is ungovernable by its nature is 
more of a hope than a fact. It rests on three widely accepted beliefs, each 
of which has become dogma to webheads. First, the Net is said to be too 
international to oversee: there will always be some place where people can 
set up a server and distribute whatever they want. Second, the Net is too 
interconnected to fence in: if a single person has something, he or she can 
instantly make it available to millions of others. Third, the Net is too 
full of hackers: any effort at control will invariably be circumvented by 
the world's army of amateur tinkerers, who will then spread the workaround 
everywhere.

Unfortunately, current evidence suggests that two of the three arguments 
for the Net's uncontrollability are simply wrong; the third, though probably 
correct, is likely to be irrelevant. In consequence, the world may 
well be on the path to a more orderly electronic future, one in which the 
Internet can and will be controlled. If so, the important question is not 
whether the Net can be regulated and monitored, but how and by whom.

The potential consequences are enormous. Soon, it is widely believed, the 
Internet will become a universal library/movie theater/voting 
booth/shopping mall/newspaper/museum/concert hall: a 21st-century version of 
the ancient Greek agora, the commons where all the commercial, political 
and cultural functions of a democratic society took place. By insisting 
that digital technology is ineluctably beyond the reach of authority, Falco 
and others like him are inadvertently making it far more likely that the 
rules of operation of the worldwide intellectual commons that is the 
Internet will be established not through the messy but open processes of 
democracy but by private negotiations among large corporations. One doesn't 
need to be a fan of BearShare to find this prospect dismaying.

Myth #1: The Internet Is Too International to Be Controlled

At first glance, Swaptor seems like something out of a cyberpunk novel. A 
secretive music-swapping service much like Napster, it seems specifically 
designed to avoid attacks from the record labels. The company is 
headquartered in the Caribbean island nation of St. Kitts and Nevis. Its 
founders are deliberately anonymous to the public; its sole address is a 
post-office box in the small town of Charlestown, Nevis. Swaptor's creators 
seem confident that the company can survive beyond national laws; after all, 
the Internet is too spread across the world to control, right?

Indeed, Swaptor does seem protected. Nevis, according to company 
representative John Simpson, "has excellent corporate laws for conducting 
international business." He is apparently referring to the happy fact that 
Nevis has not ratified either the World Intellectual Property Organization 
Copyright Treaty or the WIPO Performances and Phonograms Treaty, both of 
which extend international copyright rules to the Internet. As a result, 
Swaptor appears not to be breaking local or international law.

The founders of Swaptor "wish to remain anonymous at this time," according 
to Simpson. They won't need to reveal themselves to raise money: the 
company is headquartered in an offshore bank called the Nevis International 
Trust. Affiliated with the bank is a successful online gambling concern, 
Online Wagering Systems. Supported by advertising, Simpson claims, Swaptor 
has been profitable since its launch in February.

In the imagination of Net enthusiasts, offshore havens like Nevis are 
fervid greenhouses in which this kind of suspect operation can flower. But 
can it?

Napster at its peak had a million and a half simultaneous users, generating 
a huge amount of data traffic; the company established itself in Silicon 
Valley, where it could gain access to the infrastructure it needed to 
handle this barrage of connections. Swaptor, in contrast, is headquartered 
in Nevis. The sole high-capacity Net pipeline to Nevis is provided by the 
Eastern Caribbean Fibre-Optic System, which snakes through 14 island 
nations between Trinidad, off the Venezuelan coast, and Tortola, near 
Puerto Rico. Yet this recently installed system, though it is being 
upgraded, has a limited capacity: not enough to push through the wash of 
zeroes and ones generated by a large file-swapping service. Which, one 
assumes, is why the "offshore" service of Swaptor is actually situated 
in...Virginia.

Should the recording industry decide to sue Swaptor, it wouldn't need to 
rely on the company or on Technology Review to get this information; widely 
available software can trace Swaptor traffic and discover that Swaptor's 
central index of available files is located on five servers that sit just a 
few miles from the Washington, DC, headquarters of the Recording Industry 
Association of America. (Two common monitoring programs, Traceroute and 
Sniffer, can be downloaded gratis from thousands of Web sites.) Not only 
that, Swaptor's Web site (the site from which users download the program) is 
hosted by a Malaysian company with an explicit policy against encouraging 
copyright infringement.

As Swaptor shows, the Net can be accessed from anywhere in theory, but as a 
practical matter, most out-of-the-way places don't have the requisite 
equipment. And even if people do actually locate their services in a remote 
land, they can be easily discovered. "I don't think most people realize how 
findable things are on the Net," says David Weekly, the software engineer 
and Net-music veteran who tracked down Swaptor's servers for this magazine 
in a few minutes. "With simple software...you can find out huge amounts of 
information about what people are doing in almost no time."
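
To make the point concrete, here is a minimal sketch of the kind of lookup 
Weekly describes, written in Python against an entirely hypothetical 
hostname: it resolves the name to an IP address and then hands that address 
to the system's traceroute utility, whose final hops usually betray the 
hosting provider and its rough location.

    # Minimal sketch: how "findable" a Net service is.  The hostname below is
    # hypothetical; the technique is ordinary DNS resolution plus traceroute.
    import socket
    import subprocess

    def locate(hostname):
        ip = socket.gethostbyname(hostname)            # DNS lookup
        print("%s resolves to %s" % (hostname, ip))
        # traceroute prints every router hop between here and the target;
        # the final hops usually reveal the hosting provider and region.
        subprocess.run(["traceroute", ip])

    locate("files.example-swap-service.net")           # hypothetical name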

Once international miscreants are discovered, companies and governments 
already have a variety of weapons against them, and soon will have more.
According to Ian Ballon of the Silicon Valley law firm Manatt, Phelps and 
Phillips, who serves on the American Bar Association committee on 
cyberspace law, even if offshore firms are legal in their home bases, their 
owners "have to be willing to not come back to the United States." Not only 
do they risk losing any assets in this country, but U.S. businesses that 
deal with them will also be at risk. "Any revenue the offshore business 
sends to them could be subject to attachment," says Ballon.

In the future, moreover, the reach of national law will increase. The Hague 
Conference on Private International Law is developing an international 
treaty explicitly intended to make outfits like Swaptor more vulnerable to 
legal pressure"a bold set of rules that will profoundly change the 
Internet," in the phrase of James Love, director of the activist Consumer 
Project on Technology. (The draft treaty will be discussed at a diplomatic 
meeting next year.) By making it possible to apply the laws of any one 
country to any Internet site available in that country, the draft treaty 
will, Love warns, "lead to a great reduction in freedom, shrink the public 
domain, and diminish national sovereignty."

Rather than being a guarantee of liberty, in other words, the global nature 
of the Net is just as likely to lead to more governmental and corporate 
control.

Myth #2: The Net Is Too Interconnected to Control

Before BearShare came Gnutella, a program written by Justin Frankel and Tom 
Pepper. Frankel and Pepper were the two lead figures in Nullsoft, a tiny 
software firm that America Online purchased in June 1999 for stock then 
worth about $80 million. Rather than resting on their laurels after the 
buyout, Frankel and Pepper became intrigued by the possibilities of file 
swapping that arose in the wake of Napster. When college network 
administrators tried to block Napster use on their campuses, Frankel and 
Pepper spent two weeks throwing together Gnutella, file-swapping software 
that they thought would be impossible to block. They released an 
experimental, unfinished version on March 14, 2000. To their surprise, 
demand was so immediate and explosive that it forced the unprepared Pepper 
to shut down the Web site almost as soon as it was launched. Within hours 
of Gnutella's release, an embarrassed AOL pulled the plug on what it 
characterized as an "unauthorized freelance project."

It was too late. In an example of the seeming impossibility of stuffing the 
Internet cat back into the bag, thousands of people had already downloaded 
Gnutella. Amateur programmers promptly reverse-engineered the code and 
posted non-AOL versions of Gnutella on dozens of new Gnutella Web sites.

Unlike Napster or Swaptor, Gnutella lets every user directly search every 
other user's hard drive in real time. With member computers connecting 
directly to each other, rather than linking through powerful central 
servers, these "peer-to-peer" networks have no main hub, at least in 
theory. As a result, there is no focal point, no single point of failure, 
no Gnutella world headquarters to sue or unplug. "Gnutella can withstand a 
band of hungry lawyers," crows the Gnutella News Web site. "It is 
absolutely unstoppable."

Peer-to-peer networks have a number of important advantages, such as the 
ability to search for documents in real time, as opposed to looking for 
them in the slowly compiled indexes of search engines such as Google and 
HotBot. Excited by these possibilities, such mainstream firms as Intel and 
Sun Microsystems have embraced peer-to-peer network technology. But the 
focus of interest, among both the proponents and critics of peer-to-peer 
networks, has been the purported impossibility of blocking them. "The only 
way to stop [Gnutella]," declared Thomas Hale, former CEO of the Web-music 
firm WiredPlanet, "is to turn off the Internet."

Such arguments have been repeated thousands of times in Internet mailing 
lists, Web logs and the press. But the claims for peer-to-peer's 
uncontrollability don't take into consideration how computers interact in 
the real world; a network that is absolutely decentralized is also 
absolutely dysfunctional. In consequence, the way today's Gnutella networks 
actually work is quite different from the way they have been presented in 
theory.

To begin, each Gnutella user isn't literally connected to every other 
user; that would place impossibly high demands on home computers. Instead, 
Gnutellites are directly connected to a few other machines on the network, 
each of which in turn is connected to several more machines, and so on. In 
this way, the whole network consists of hundreds or thousands of 
overlapping local clusters. When users look for a file, whether it is a 
copy of the Bible, a bootleg of A.I. or smuggled documents on the Tiananmen 
massacre, they pass the request to their neighbors, who search through the 
portion of their hard drives that they have made available for sharing. If 
the neighbors find what is being looked for, they send the good news back 
to the first machine. At the same time, they pass on the search request to 
the next computer clusters in the Gnutella network, which repeat the process.
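
The mechanics of that flooding search are simple enough to sketch in a few 
lines of Python. The model below is a deliberate simplification: the network 
is just a dictionary of peers and their neighbors, and the time-to-live 
counter that real Gnutella messages carry to keep queries from circulating 
forever is reduced to a plain hop budget.

    # Simplified model of a Gnutella-style flooded search.
    # peers maps a peer id to (set of shared file names, list of neighbor ids).
    def flood_search(peers, start, filename, ttl=7):
        hits = []
        visited = {start}
        frontier = [(start, ttl)]
        while frontier:
            node, hops_left = frontier.pop(0)
            shared, neighbors = peers[node]
            if filename in shared:             # peer checks its shared files
                hits.append(node)              # the "good news" goes back to the asker
            if hops_left == 0:                 # hop budget exhausted: stop forwarding
                continue
            for n in neighbors:                # pass the request on to the next cluster
                if n not in visited:
                    visited.add(n)
                    frontier.append((n, hops_left - 1))
        return hits

    # A tiny, entirely hypothetical network:
    peers = {
        "a": ({"song.mp3"}, ["b", "c"]),
        "b": (set(), ["a", "c"]),
        "c": ({"song.mp3", "doc.txt"}, ["a", "b"]),
    }
    print(flood_search(peers, "b", "song.mp3"))    # -> ['a', 'c']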

Hopping through the network, the search is repeated on thousands of 
machines, which leads to big problems. According to a report in December by 
Kelly Truelove of Clip2, a Palo Alto, CA-based consulting group that 
specializes in network-performance analysis, a typical Gnutella search 
query is 70 bytes long, equivalent to a very small computer file. But there 
are a great many of them: as many as 10 per second from each machine to which 
the user is connected.
In addition, there is a constant flow of "ping" messages: the digital 
equivalent of "are you there?" Inundated by these short messages, the 56 
kilobit-per-second modems through which most people connect to the Net are 
quickly overwhelmed by Gnutella. Broadband connections help surprisingly 
little; the speed with which the network processes requests is governed by 
the rate at which its slowest members can pass data through.
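
Truelove's figures make the arithmetic easy to check. In the 
back-of-the-envelope calculation below, the number of neighbor connections 
and the extra ping/pong chatter are illustrative assumptions; the query size 
and rate come from the Clip2 report cited above.

    # Rough arithmetic on Gnutella query overhead.
    QUERY_BYTES   = 70       # typical query size (Clip2 report)
    QUERIES_PER_S = 10       # queries per second per connected machine (Clip2 report)
    CONNECTIONS   = 4        # neighbors per node -- an illustrative assumption
    PING_FACTOR   = 2.0      # assume ping/pong chatter roughly doubles the load

    bits_per_second = QUERY_BYTES * 8 * QUERIES_PER_S * CONNECTIONS * PING_FACTOR
    modem_bps = 56000        # a 56 kilobit-per-second modem

    print("overhead: about %d bits per second" % bits_per_second)              # 44800
    print("share of a 56k modem: %d%%" % (100 * bits_per_second / modem_bps))  # 80%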

With BearShare, Vinnie Falco developed one potential fix. BearShare, like 
other new Gnutella software, automatically groups users by their ability to 
respond to queries, ensuring that most network traffic is routed through 
faster, more responsive machines. These big servers are linked into 
"backbone" chains that speed along most Gnutella search requests. Further 
unclogging the network, Clip2 has developed "reflectors": large servers, 
constantly plugged into the Gnutella network, that maintain indexes of the 
files stored on adjacent machines. When reflectors receive search queries, 
they don't pass them on to their neighbors. Instead they simply answer from 
their own memories: "yes, computer X has this file." Finally, to speed the 
process of connecting to Gnutella, several groups have created "host 
caches," servers that maintain lists of the computers that are on the 
Gnutella network at a given time. When users want to log on, they simply 
connect with these host caches and select from the list of connected 
machines, thus avoiding the slow, frustrating process of trying to 
determine who else is online.
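
A reflector is, at bottom, just an index kept next to the network. The sketch 
below (names and data are purely illustrative) shows the essential behavior: 
adjacent machines register what they share, and queries are answered from 
memory instead of being flooded onward.

    # Simplified model of a Clip2-style "reflector".
    class Reflector:
        def __init__(self):
            self.index = {}                    # filename -> set of peer ids

        def register(self, peer_id, filenames):
            """Adjacent machines report what they are sharing."""
            for name in filenames:
                self.index.setdefault(name, set()).add(peer_id)

        def query(self, filename):
            """Answer from memory: 'yes, computer X has this file.'"""
            return sorted(self.index.get(filename, set()))

    r = Reflector()
    r.register("computer-x", ["song.mp3", "backup.txt"])
    r.register("computer-y", ["song.mp3"])
    print(r.query("song.mp3"))                 # -> ['computer-x', 'computer-y']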

As their capacity improved, Gnutella-like networks soared in popularity. 
Napster, buffeted by legal problems, saw traffic decline 87 percent between 
January and May, according to the consulting firm Webnoize. Meanwhile, 
LimeWire, another Gnutella company, reported that the number of Gnutella 
users increased by a factor of 10 in the same period. "The networks are 
unclogging, and as a result they're growing," Truelove says. "And the 
content industries should be concerned about that."

But the problem with these fixes is that they reintroduce hierarchy. 
Gnutella, once decentralized, now has an essential backbone of important 
computers, Napster-like central indexes and permanent entryway servers. 
"We've put back almost everything that people think we've left out," says 
Gene Kan, a programmer who is leading a peer-to-peer project at Sun. "Ease 
of use always comes at some expense, and in this case the expense is that 
you do have a few points of failure that critically affect the ability to 
use the network."

Rather than being composed of an uncontrollable, shapeless mass of 
individual rebels, Gnutella-type networks have identifiable, centralized 
targets that can easily be challenged, shut down or sued. Obvious targets 
are the large backbone machines, which, according to peer-to-peer 
developers, can be identified by sending out multiple searches and 
requests. By tracking the answers and the number of hops they take between 
computers, it is possible not only to identify the Internet addresses of 
important sites but also to pinpoint their locations within the network.
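
A crude model of that hub-spotting technique: gather the answers that come 
back from many searches, note which hosts respond and over how few hops, and 
rank them. Hosts that answer often from only a hop or two away are the 
backbone candidates. (The data below is entirely illustrative.)

    # Crude model of identifying backbone machines from search answers.
    from collections import defaultdict

    def rank_hubs(query_hits):
        """query_hits: list of (responder address, hops) pairs from many searches."""
        seen = defaultdict(list)
        for addr, hops in query_hits:
            seen[addr].append(hops)
        # Rank by how often a host answers, breaking ties by average hop count.
        return sorted(seen, key=lambda a: (-len(seen[a]), sum(seen[a]) / len(seen[a])))

    hits = [("10.0.0.5", 1), ("10.0.0.5", 2), ("10.0.0.9", 4), ("10.0.0.5", 1)]
    print(rank_hubs(hits))                     # -> ['10.0.0.5', '10.0.0.9']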

Once central machines have been identified, companies and governments have 
a potent legal weapon against them: their Internet service providers.
"Internet service providers enjoy limitations on liability for their users' 
actions if they do certain things specified by law," says Jule Sigall, an 
Arnold and Porter lawyer who represents copyright owners. "If you tell them 
that their users are doing something illegal, they can limit their exposure 
to money damages if they do something about it when they are notified." 
Internet service providers, he says, do not want to threaten their 
customers, "but they like not being sued even more, so they've been 
cooperating pretty wholeheartedly" with content owners.

As Ballon of Manatt, Phelps and Phillips notes, Gnutella traffic has a 
distinctive digital "signature." (More technically, the packets of Gnutella 
data are identified in their headers.) Content companies are also learning 
how to "tag" digital files. The result, in Ballon's view, is easy to 
foresee: "At a certain point, the studios and labels and publishers will 
send over lists of things to block to America Online, and 40 percent of the 
country's Net users will no longer be able to participate in Gnutella. Do 
the same thing for EarthLink and MSN, and you're drastically shrinking the 
pool of available users." Indeed, the governments of China and Saudi Arabia 
have successfully pursued a similar strategy for political ends.
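
The "signature" is not hard to picture. Early Gnutella clients open every 
session with a recognizable plain-text handshake, so a provider-side filter 
needs only to look at the first bytes of a connection. The sketch below 
assumes the Gnutella 0.4 handshake string; the filtering machinery around 
such a check is, of course, far more involved in practice.

    # Minimal sketch of signature-based blocking of Gnutella sessions.
    # Gnutella 0.4 clients begin a connection with a plain-text handshake.
    GNUTELLA_HANDSHAKE = b"GNUTELLA CONNECT/"

    def is_gnutella(first_bytes):
        """True if the opening bytes of a session look like a Gnutella handshake."""
        return first_bytes.startswith(GNUTELLA_HANDSHAKE)

    def filter_session(first_bytes):
        # A real provider-side filter would drop or reset the connection here;
        # this sketch only reports the decision.
        return "block" if is_gnutella(first_bytes) else "allow"

    print(filter_session(b"GNUTELLA CONNECT/0.4\n\n"))    # -> block
    print(filter_session(b"GET /index.html HTTP/1.0"))    # -> allow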

Perhaps sensing that Gnutella cannot escape the eye of authority, 
bleeding-edge hackers have searched for still better solutions. Determined 
to create a free-speech haven, a Scottish activist/programmer named Ian 
Clarke in 1999 began work on a Gnutella-like network called Freenet that 
would be even more difficult to control, because it would encrypt all files 
and distribute them in chunks that constantly shifted location. 
Unsurprisingly, it has attracted enormous media attention. But the system 
is so incomplete (searchability is an issue) that one cannot judge whether it 
will ever be widely used. (A small number of people are already using 
Freenet. Most of them are pornography fans, but a few, according to Clarke, 
are Chinese dissidents who employ Freenet to escape official scrutiny.) 
Even if Freenet does not end up in the crowded graveyard of vaporware, 
Internet service providers can always pull the plug, treating Freenet, in 
essence, as an unsupported feature, in the way that many providers today do 
not support telnet, Usenet and other less popular services.

Myth #3: The Net Is Too Filled with Hackers to Control

It was a classic act of hubris. The Secure Digital Music Initiative, a 
consortium of nearly 200 technology firms and record labels, thought the 
software it had developed to block illegal copying of music was so good 
that last September it issued an "open letter to the digital community" 
daring hackers to try their best to break it. The result was a fiasco. 
Within three weeks, at least four teams broke the code, and hacks were soon 
distributed widely across the Internet. In the folklore of the Net, the 
initiative's challenge became one more example of a general truth: any 
method of controlling digital information will fail, because someone will 
always find a way around it, and spread the hack around the Internet.

"There are no technical fixes," says Bruce Schneier, cofounder of 
Counterpane Internet Security. "People have tried to lock up digital 
information for as long as computer networks have existed, and they have 
never succeeded. Sooner or later, somebody has always figured out how to 
pick the locks."

But software is not the only means of controlling digital information: it's 
also possible to build such controls into hardware itself, and there are 
technical means available today to make hardware controls so difficult to 
crack that it will not be practical to even try. "I can write a program 
that lets you break the copy protection on a music file," says Dan Farmer, 
an independent computer security consultant in San Francisco. "But I can't 
write a program that solders new connections onto a chip for you."

In other words, those who claim that the Net cannot be controlled because 
the world's hackers will inevitably break any protection scheme are not 
taking into account that the Internet runs on hardware, and that this 
hardware is, in large part, the product of marketing decisions, not 
technological givens. Take, for example, Content Protection for Recordable 
Media, a proposal issued late last year by IBM, Intel, Toshiba and 
Matsushita Electric (see "The End of Free Music?" TR April 2001). The four 
companies developed a way to extend an identification system already used 
in DVDs and DVD players to memory chips, portable storage devices and, 
conceivably, computer hard drives. Under this identification scheme, people 
who downloaded music, videos, or other copyrighted material would be able 
to play it only on devices with the proper identification codes.

In addition to restricting unauthorized copies, it was widely reported that 
the technology also had the potential to interfere with other, less 
controversial practices, such as backing up files from one hard drive onto 
another. In part because of controversy surrounding the technology, the 
companies withdrew the plan from consideration as an industrywide standard 
in February. But the point is clear: the technology has been tabled because 
its promoters believed it wasn't profitable, not because it would not work. 
This and other hardware schemes have the potential to radically limit what 
people can do with networking technology.

Some hardware protection methods already exist. Stephen King released his 
e-book Riding the Bullet in March 2000, in what were effectively two 
different versions: a file that could be read only on specialized 
electronic devices (electronic books) and a file that could be read on computer 
monitors. Even though the text was available for free at Amazon.com, some 
people went to the trouble of breaking the encryption on the computer file 
anyway; distributed from Switzerland, it was available on the Internet 
within three days. But the electronic-book version was never cracked, 
because e-books, unlike computers, cannot do two things at once. "On a 
computer, you can always run one program to circumvent another," says 
Martin Eberhard, former head of NuvoMedia, the developer of the Rocket 
eBook. "If a book is on a computer screen, it exists in video memory 
somewhere, and someone will always be able to figure out how to get at it."

Eberhard's e-books, by contrast, were deliberately designed to make 
multitasking impossible. True, future e-books could, like computers, 
perform two tasks simultaneously, but publishers could refuse to license 
electronic books to their manufacturers, in much the same way that film 
studios refuse to allow their content to be used on DVD machines that don't 
follow certain rules. And even computers themselves, in Eberhard's view, 
could be "rearchitected," with added hardware that performs specific, 
controlling tasks. "If people have to rip up their motherboards to send 
around free music," he says, "there will be a lot less free music on the 
Net....It would be an ugly solution, but it would work."

Of course, consumers will avoid products that are inconvenient. A leading 
example is digital audio tape recorders, which by law are burdened with so 
many copy protection features that consumers generally have rejected them. 
But to assume that companies involved with digital media cannot come up 
with an acceptable and effective means of control is to commit, in reverse, 
the same act of hubris that the Secure Digital Music Initiative did, when 
it assumed that clever people couldn't break its software. And if the 
hardware industry resists making copy-protected devices, says Justin 
Hughes, an Internet-law specialist at the University of California, Los 
Angeles, an appeal to Congress may be "just a matter of time." If the 
Internet proves difficult to control, he says, "you will see legislation 
mandating that hardware adhere to certain standard rules, just like we 
insist that cars have certain antipollution methods."

"To say that a particular technology guarantees a kind of anarchic utopia 
is just technological determinism," he says. "This argument should be 
ignored, because the real question is not whether the Net will be tamed, 
but why and how we tame it."

We are in the beginning stages of the transfer of most of society's 
functions (working, socializing, shopping, acting politically) from what 
Internet denizens jokingly call "meatspace" into the virtual domain. In the 
real world, these functions are wrapped in a thicket of regulations and 
cultural norms that are, for the most part, accepted. Some free-speech 
absolutists dislike libel laws, but it is generally believed that the 
chilling effect on discourse they exert is balanced by their ability to 
punish gratuitous false attacks on private individuals. Regulations on the 
Net need not be any more obnoxious. "If the whole neighborhood's online, 
it's okay to have a cop on the beat," says Schneier.

The risk, of course, is overreaching: of using law and technology to make the 
Internet a locus of near absolute control, rather than near absolute freedom. 
Paradoxically, the myth of unfettered online liberty may help bring this 
undesirable prospect closer to reality. "Governments are going to set down 
rules," says Hughes, "and if you spend all your time fighting the existence 
of rules you won't have much chance to make sure the rules are good ones."

In other words, hackers may be their own worst enemies. By claiming that 
the Net is inherently uncontrollable, they are absenting themselves from 
the inevitable process of creating the system that will control it. Having 
given up any attempt to set the rules, they are unavoidably allowing the 
rules to be set for them, largely by business. Corporations are by no means 
intrinsically malign, but it is folly to think that their interests will 
always dovetail with those of the public. The best way to counterbalance 
Big Money's inevitable, even understandable, efforts to shape the Net into 
an environment of its liking is through the untidy, squabbling process of 
democratic governance, the exact process rejected by those who place their 
faith in the endless ability of anonymous hackers to circumvent any 
controls. An important step toward creating the kind of online future we 
want is to abandon the persistent myth that information wants to be free.


Charles Mann has written for Technology Review about the free software 
movement (January/February 1999) and the use of genetic engineering in 
agriculture (July/August 1999).


----++++----++++----
Everything is worthwhile if the soul is not small.
http://fcis.oise.utoronto.ca/~aviseu

#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net