

From quill to cursor
OSCE, April 2003


[This article is the introduction to the April 2003 book published by the OSCE (the Organization for Security and Co-operation in Europe) in preparation for its June 2003 conference on freedom of the media and the internet. The full book, containing the papers presented at the preparatory workshop held in Vienna in November 2002, can be downloaded at http://www.osce.org/documents/rfm/2003/04/41_en.pdf.] The resulting recommendations, as adopted by the Representative on Freedom of the Media, are now available as the Amsterdam Recommendations.


New technologies raise new perspectives - and new questions

"Technology is not just a value-neutral set of tools," as the Electronic Frontier Foundation (EFF) correctly observed. (1) New technologies (especially those concerning communication and the distribution of information) invariably bring about new political and social relations. Precisely by doing that, they actually change the world, and our minds.

Gutenberg

Once knowledge, ideas, thoughts, insights and theories started being written down and put into books, they suddenly acquired a less transient and volatile form. For the first time in the history of humankind, one could literally hand down a body of knowledge and preserve it in its original form, not only for oneself and for others, but also for future generations. While we now routinely award prizes to books, in the pre-Gutenberg era books themselves were the prize. Yet, in part because of their sheer cost - books had to be manually copied, one by one, at a painstakingly slow rate - books were in practice monopolised and only accessible to a very small elite: some scientists, some of the higher clergy, and, of course, the rich.

It was the invention of the printing press that made books more affordable and somewhat more generally accessible. When the excruciating process of manually copying manuscripts was supplanted by the mechanical printing press, books could finally start spreading amongst bigger groups within society. A remarkable interaction was initiated: with the appearance of books, the wish to become literate became general, and the ability to read spread outside the elites that had previously had access to handwritten books. Increasing literacy in turn promoted the publication of more books, thus creating a spiral of increasing abilities and knowledge for the masses.

The Gutenberg revolution allowed individuals to educate themselves and made them irrevocably less dependent upon what others - their superiors, the clergy, or even the travelling troubadours - would relay to them or were willing to summarise for them. After the onset of printing, ideas could spread faster and further, whilst retaining their original form, without intervention or mediation (or "filtering", as one would now say), and one no longer had to rely on experts or the elite to disseminate or interpret these ideas. In short, the printing press had a tremendous liberating effect on people in general. It enabled them to become more knowledgeable and informed, more independent, and better equipped. One could even argue that the invention of the printing press was one of the founding elements of democratic society.

Publishing explosion

Currently, we are in the middle - or perhaps only at the onset? - of the digital revolution. Both the variety and the total volume of available texts have increased manifold since articles and books began to be published on the internet. Some statistics may serve to illustrate the gargantuan volume of this digital explosion.

Usenet is the name for the collected newsgroups, which are organised by subject. Usenet was developed in the nineteen-eighties for scientists to exchange information and discuss ideas. By now, there are newsgroups about almost everything: from hobbies, lifestyle and sports to culture, science and politics. In the early stages of usenet, it was possible to follow more or less all existing newsgroups and read all postings in them. Nowadays, that is utterly impossible. By 18 December 2002, Newszilla - one of the biggest news servers in the world, run by the Dutch ISP XS4all - carried no fewer than 46,181 different newsgroups.

The total volume of usenet postings is called the 'newsfeed'. In March 2001, the global newsfeed was measured at 2 Gb of text (2) and 220 Gb of binaries (pictures, sound files, executables, etc.) per day. Less than one and a half years later, in October 2002, the global newsfeed had doubled, to 400 to 450 Gb daily. (3)

The number of web sites has increased even more spectacularly. The World Wide Web (WWW) took off in 1993, when Mosaic - the first browser to reach a wide audience - made it possible to present structured, hyperlinked text. Already that same year, images could be integrated into the text. Much later, around 1996, sound and moving images (film) could be added to web pages.

The following table presents the increase of web sites in the first three years of the WWW, in half-year steps. (4)

 Year - Month        No. of web sites
 1993 - June                     130
 1993 - December                 623
 1994 - June                   2,738
 1994 - December              10,022
 1995 - June                  23,500
 1995 - December              90,000

By the end of October 2002, the number of individual pages indexed by Google, currently the prime search engine, had reached almost 2.5 billion. Three months later, in January 2003 - that is, just under ten years after the web took off - that number had reached almost 3.1 billion pages. (5) And although Google indexes a lot, it doesn't index everything.

Distributed and shared information makes the net robust

The internet is not only a publishing tool, it is also a distribution technology. Servers all over the world keep polling one another. News servers, for instance, exchange articles with one another: "I have this bunch of new newsgroup articles, here ya go." "Thanks. Oh, I already have some of them. I'll reject those and take the rest."
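
In essence, the receiving server keeps track of the article IDs it has already seen and accepts only the rest. The following toy sketch illustrates the idea; the message-IDs are invented, and real news servers of course speak the full NNTP protocol rather than a few lines of Python:

    # Toy illustration: a news server accepts only the articles it has not seen.
    def offer_articles(receiver_seen, offered):
        """The offering server lists its articles; the receiver takes the rest."""
        accepted = {}
        for message_id, article in offered.items():
            if message_id in receiver_seen:
                continue  # "I already have that one. I'll reject it."
            receiver_seen.add(message_id)
            accepted[message_id] = article  # "...and take the rest."
        return accepted

    server_b_seen = {"<a1@example.net>"}
    new_batch = {"<a1@example.net>": "old post", "<a2@example.net>": "new post"}
    print(offer_articles(server_b_seen, new_batch))  # only <a2@example.net> gets through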

In a similar way, routers (which forward traffic between networks and keep track of which web site is located where) assist one another in finding the shortest route to a website and in figuring out a way to get there anyway when a part of the network is down. "I can't reach www.osce.org. Can't reach their hosting provider, a link in the chain from me to them is missing. Do you know how to get there?" To which another router might reply: "Sure, I know another way. I'll point you. Go to that machine, then hop onto that one, and then..." Or: "I can't get there either. But I can get a few steps closer than you can. What if I ask somebody in that neighbourhood?" Or they'd say: "It takes me 42 hops to reach X. How many hops does it take you? 180? Well, here's how I do it."

All this filling in of the gaps, finding shortcuts, balancing traffic, sharing information and circumventing fall-out is built into the system. It is an integral part of the underlying structure of the internet - which is designed to work in such a way that if a connection between two distribution points breaks down, both servers will find new routes to reach one another. (6)
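
A minimal sketch may make this concrete. Real routers use routing protocols such as BGP and OSPF rather than a textbook search, and the little network below is invented, but the core idea is the same: when a link disappears, simply search for a fresh path.

    # A made-up network: each machine lists the machines it can reach directly.
    from collections import deque

    def find_route(links, src, dst):
        """Breadth-first search for a path of hops from src to dst."""
        queue, visited = deque([[src]]), {src}
        while queue:
            path = queue.popleft()
            if path[-1] == dst:
                return path
            for neighbour in links.get(path[-1], []):
                if neighbour not in visited:
                    visited.add(neighbour)
                    queue.append(path + [neighbour])
        return None  # no route at all

    links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
    print(find_route(links, "A", "D"))  # ['A', 'B', 'D']
    links["B"].remove("D")              # a connection breaks down...
    print(find_route(links, "A", "D"))  # ['A', 'C', 'D'] - routed around the damage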

This basic structure is the backdrop of the famous adage that the internet perceives censorship as damage [to the system] and routes around it. It means that a disruption in the system - no matter whether it is caused by accident, error or malice - will never bring down the system as a whole.

Adopting routing relations

A fascinating phenomenon, especially from the point of view of censorship and prevention or circumvention of it, is that internet users have adopted this principle of routing around damage as a model to base their own behaviour upon. When a web page - or worse, a whole site - is under threat of censorship, (7) website owners often utilise their human networks to route around the impending damage. They ask their contacts to take copies of the besieged pages and to publish them elsewhere, in less dangerous places. This phenomenon is known as mirroring.
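
The net effect is that a reader (or a simple script) can try each known copy in turn until one answers. A sketch, with hypothetical URLs:

    # Try the original page first, then its mirrors. The URLs are hypothetical.
    import urllib.request

    MIRRORS = [
        "http://www.example.org/report.html",      # the original, besieged page
        "http://mirror1.example.net/report.html",  # copies published elsewhere
        "http://mirror2.example.com/report.html",
    ]

    def fetch_first_available(urls):
        for url in urls:
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    return response.read()
            except OSError:
                continue  # this copy is unreachable or blocked; try the next
        raise RuntimeError("all copies are down or blocked")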

And just as routers assist one another in finding the easiest way to reach a certain machine, internet users help one another - dutifully assisted by the courts, one might add, tongue in cheek - to find the locus of least resistance: places where certain material will be the least challenged and the most secure. Because neonazi sites will generally be prosecuted in most West European countries, such groups have wised up and have taken their information out of that continent. Since the USA provides a much broader level of protection for speech, that country has become the international host for sites like this; there, they have found political asylum, so to speak. Conversely, Singaporean pages about sexuality and religion and US sites about drugs are often hosted in Europe, where such pages meet with less resistance than in their country of origin.

Internationalised information

While governments and authorities may on the one hand dislike this internationalisation of information and routing around censorship, on the other hand they sometimes hope to reap a benefit by adopting the same practice themselves. The Netherlands, for instance, is not very pleased by 'its' neonazis seeking digital refuge in the United States, yet it gladly hosts sites aimed at a US audience craving more objective information about recreational drugs. Similarly, CNN hopes to reach and inform the citizens of Middle-Eastern countries and to circumvent the media restrictions imposed upon citizens of this region by their own governments. Indeed, the CIA has funded software that circumvents certain types of national censorship and re-enables foreign citizens to tune in to the web version of The Voice of America. (8)

Various authorities try to limit the impact of this phenomenon by attempting to curtail the net in a variety of manners. Until now, most of these censorship efforts have failed. (9) In part this is because computer experts are technically more proficient than their policy-making opponents and can, with some effort, figure out a way around restrictions and prohibitions and develop new protocols to share and access information; in part it is because censorship is by definition reactive, a response to a newly created technical reality, and will thus always be lagging behind. (10) Yet, over the years, attempts at curtailing the net have been getting increasingly sophisticated and more difficult to circumvent. And while computer experts often do find loopholes in censoring software or policies, and can write software or protocols that circumvent such measures, that in turn usually means that programs become increasingly complicated and thus difficult to use for the uninitiated. (11)

This shifting around of information - either politically and geographically, by finding the locus of least resistance, or technically, by distributing information via rapidly evolving new protocols - is part and parcel of the net, just as integral and fundamental to it as it is on the structural level of the internet. Indeed: censorship is perceived as damage, and not only does the net itself try to route around it, but net users and developers attempt to route around it as well.

This phenomenon drastically changes the effect of national law and politics. Information that is not legally available in a country can readily be served to a citizen of that country from a website located at the other side of the globe. And interestingly, the person requesting that information won't even notice that it is coming from elsewhere. This means that the internet has made national borders and political boundaries more diffuse than they were and has lessened their importance, or at least has undermined their strictness. National laws curtailing information are simply not as effective as they used to be. The internationalisation of information allows citizens to partake of (or distribute) facts, knowledge, accounts and experiences that they would otherwise not have been able to access or share. The analogy with the effect of the Gutenberg press is evident.

New technologies create novel needs and responses

With the assistance of a cheap computer and a modem, or with an internet café in the vicinity, anybody anywhere (12) can access foreign newspapers, start publishing their own magazine, make their ideas and knowledge available to the world, read or publish stories which otherwise would never cross the border, exchange and discuss ideas with people at the other end of the globe.

With this abundance, a new wish has arisen: to block some of these pages. Not only do some governments do so - both China and Saudi Arabia are rather effective at it - but companies, libraries and schools do it as well, the latter two usually at the government's behest. People at home do it too.

There are two basic strategies for blocking pages: imposed blocking and requested blocking. With imposed blocking, a government orders that certain pages be blocked nation-wide. This can be done via a national proxy (a proxy is a machine that handles all requests for web pages; it acts as an intermediary between a personal web browser and web servers on the net). The proxy is in such cases configured to refuse requests for any page that matches certain (blacklisted) criteria. Saudi Arabia, for instance, routinely blocks all foreign pages relating to sex and politics. (13) China, on the other hand, blocks pages based on their IP number (crudely: their internet address). (14) Australia has implemented a system blocking specific national pages, but does not block international ones. (15) In Germany, the state of North Rhine-Westphalia has ordered ISPs and universities to block access to certain sites (mostly neonazi sites, but not only those). (16) In the US, publicly funded schools and libraries are obliged to block pages deemed unfit for children and teens. (17)
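
Both styles boil down to a simple test applied to every request that passes through the proxy. The sketch below uses invented blacklists and addresses; it illustrates the logic, not any country's actual system:

    # A national proxy's decision, reduced to its core. All names are invented.
    BLOCKED_PATTERNS = ["sex", "politics"]  # keyword-style blocking
    BLOCKED_IPS = {"203.0.113.7"}           # IP-style blocking

    def allowed(url, server_ip):
        """Would the proxy let this request through?"""
        if any(pattern in url.lower() for pattern in BLOCKED_PATTERNS):
            return False  # the page matches a blacklisted criterion
        if server_ip in BLOCKED_IPS:
            # Collateral damage: every site that happens to share this IP
            # becomes unreachable, not just the offending pages.
            return False
        return True

    print(allowed("http://news.example.org/politics/", "198.51.100.4"))  # False
    print(allowed("http://stamps.example.org/", "203.0.113.7"))          # False (shared IP)
    print(allowed("http://stamps.example.org/", "198.51.100.4"))         # True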

A nasty characteristic of this type of blocking is that almost invariably, it censors more than it purports to do. Especially blocking based on IP has dire consequences: it affects entire sites or sometimes even a series of web servers sharing the same IP, while actually only a few pages on those sites or servers fall within the scope of the prohibition. Blockages of this type are very difficult to overcome. Additionally, a shared characteristic of imposed blocking is that the citizen has no choice whatsoever in the matter.

Requested blocking on the other hand is voluntary. It takes place at the user's instigation and on the user's computer only. It is usually done to protect children and to present them with a customised, sanitised version of the net. This type of blocking is done via commercial so-called censorware. (It is this kind of software that US public libraries and schools use, but in their case, it is mandatory.)

An interesting hybrid variant was used by Scientology. The cult developed its own version of a censorware package which, at the user's end, blocks all pages critical of Scientology. Critics of the cult dubbed this censorware Scienositter, after the package from which it was derived: Cybersitter. Scienositter had one unexpected characteristic: it was imposed upon unwitting cult members. Scientology sold them a package to create instant "individualised" proselytising web pages, and this package installed the Scienositter on the sly, thus preventing these members - without their consent - from seeing pages that Scientology deems inappropriate because they oppose the organisation's own views. (18)

While voluntarily blocking pages on one's own computer is an inalienable right, blocking pages at schools or libraries is an altogether different matter. Civil liberties organisations have made a pretty good case that censorware programs actually filter out more than they profess to do and have - in the USA - started lawsuits asserting that such blocking of information is in fact unconstitutional. Indeed, on May 31, 2002, a federal court agreed with that criticism. The US government has appealed the ruling, and the case is currently being reviewed by the US Supreme Court. (19)

Fashionable self-regulation

Not only is information shifting from place to place to find the locus of least resistance; measures to contain information are doing exactly the same. Slowly, the governments of industrialised countries have arrived at the general agreement that discussions about, and measures against, disputable web sites should not be carried out by themselves, but by others. In a few years' time, so-called self-regulation of the net has become one of the policy makers' new buzzwords.

Self-regulation means that the industry is assigned a major role in policing content that is not clearly illegal, and therefore not directly punishable, and is expected to reach agreement upon how to act when complaints are received about web pages. The industry is asked to develop, possibly in co-operation with users' interest groups (such as civil liberties and privacy organisations), rules for acceptable use, codes of conduct and procedures for the removal or suspension of disputed pages. The standard categories mentioned as areas where self-regulation should be promoted are child pornography, pornography, violence, racism and 'hate speech'. (20)

Almost all European national and supra-national governmental and official advisory bodies nowadays promote such self-regulation. The Council of Europe, for instance, states: "International co-ordination should also involve the industry, which should be encouraged to develop codes of conduct and self-regulatory schemes. This co-ordination is also essential to guarantee the protection of minors against content which is not strictly illegal, but may be harmful and detrimental to their personal development, in an environment where traditional ways of controlling access (for example watershed rules) do not work." (21) This proposal is a rephrasing of Recommendation Rec(2001)8 of the Committee of Ministers to member states on self-regulation concerning cyber content. (22)

Self-regulation will of course never be able to solve the fundamental problem of information travelling to the locus of least resistance. After all, when such information has found a safe haven, that new abode is by definition a country where that information is fully legal. Policing legal material is of course redundant, politically adverse and blatant nonsense. Taking this into account, the only area where general agreement is at all possible is child pornography: it is forbidden in practically all countries. Then again, precisely because the trafficking in or displaying of child pornography is illegal everywhere, self-regulation is not necessary. There are solid laws against it. And policing illegal acts should not be relegated to private parties. All other contested material - from racism to depiction of sex - is legal in one country or another, and can thus never be banned from the net.

Moving censorship out of the public realm

Apart from that, there are quite a number of drawbacks to and loopholes in self-regulatory systems. (23) First of all - and it really begs the question - why should "content which is not strictly illegal" (to quote the Council of Europe) be subjected to any kind of regulatory measure or process? Parents are after all free - and encouraged - to install censorware if they want to protect their children. Any industry self-regulatory practice, on the other hand, affects the rights of mature internet users to access material that is "not strictly illegal" - a rather obscure way of saying that it might be in bad taste but is actually legal, in which case the industry has no right to prevent access to it.

Conversely, where it concerns material which is illegal in country A, but fully legal in country B where it is hosted, the industry of country A has no right to prevent anyone from accessing it. The government of country A can indeed decide to outlaw such material, but then they should take that responsibility upon themselves and not dump it on the industry.

Secondly, some material might indeed be on the edge, but countries have a fully qualified system for making decisions about the acceptability and legality of specific material: the courts. By moving the assessment of the legality of such material out of the courtroom, not only do the processes and the criteria by which material is judged become opaque, they also run a high risk of becoming arbitrary. Indeed, existing self-regulatory bodies - such as the censoring authority in Australia and the child pornography hotline in the UK - have already gained themselves a reputation for being remarkably furtive about their own proceedings. Needless to say, unlike in court, the accused have no lawyers and no possibility of appealing decisions.

Thirdly, the term "self-regulation" is highly deceptive. As the above shows, it is not about the industry regulating itself, but about the industry regulating its customers. In other words, self-regulation primarily concerns itself with regulating others.

Fourthly, the industry is supposed to regulate something in which it itself has a stake, that stake being - amongst others - to have a blooming business without too many hassles. ISPs weren't started to defend users' rights, nor are they generally willing to stand up for free speech when the effects of that speech might damage their reputation or their revenues. In that sense, ISPs and their clients become opponents whenever an ISP receives a complaint about one of its customers. Legally, the customer might be fully within his rights to publish the disputed material, but the ISP - facing a complaint - is not likely to assist him in finding arguments for publication. In other words, one of the parties involved in the conflict has been assigned complete responsibility for deciding upon the fine line between legality and illegality. The protection of the constitutional right to express one's opinion is being put in the hands of the industry. (24)

Finally, what is fundamentally wrong with self-regulation is that it allows governments to refuse to set rules and limits and delegate the matter to a private body, hoping to thus solve the issue. In doing so, governments are privatising censorship, without assuming responsibility and accountability for it themselves, and without offering legal redress for either those censored or for those robbed of access to the censored content.

Susceptible search engine

The gargantuan number of web pages resulting from the new digital publishing-and-distribution technique has created a demand for a new meta-technology: that of indexing and retrieving web pages and newsgroup postings. After all, what would be the use of publishing something in a newsgroup or on a web page if nobody were able to find it in this vast sea of exponentially increasing information? Without retrieval technologies, the only way to learn of the existence of specific pages would be by word of mouth, which of course defeats the purpose. It would make the sharing and distribution of information a local matter once again.

Search engines provide precisely this service. When you key in a few search terms, search engines will point you to pages or newsgroup postings that contain these terms. Some search engines additionally rate pages by assumed relevance, some search engines organise them in coherent groups, some do nothing but present long lists of so-called 'hits'. What all search engines have in common is that they guide you through the web and usenet, and assist you in finding what you were looking for. Search engines have become pivotal to the net, to the degree that without them, there is no way to find your way around.
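
The core data structure behind this service is the inverted index: a map from each word to the set of pages containing it, so that a query becomes an intersection of sets. A toy version follows, with invented pages; real engines add crawling, ranking and vastly more, but the principle is this:

    # A toy inverted index: map each word to the pages that contain it.
    from collections import defaultdict

    pages = {
        "http://example.org/a": "freedom of the media",
        "http://example.org/b": "freedom of expression on the net",
    }
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)

    def lookup(*terms):
        """Pages containing all search terms: intersect the postings sets."""
        results = [index.get(term, set()) for term in terms]
        return set.intersection(*results) if results else set()

    print(lookup("freedom", "net"))  # {'http://example.org/b'}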

Their particular strength makes them susceptible to censorship attacks, all the more so because the ability to access pages and the ability to censor them are at heart two sides of the same coin. Indeed, without the facility to sift through indices using search terms, censors wouldn't even be able to decide what pages to block in the first place... While finding a page is the result of filtering massive amounts of data in order to select a set based on specific criteria, blocking is filtering that information in order to suppress that same set.
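
A toy example makes the symmetry visible: the very same matching test, applied to the same miniature set of invented pages, either selects the set (searching) or suppresses it (blocking):

    # One filter, two uses: the searcher keeps the matches, the censor drops them.
    PAGES = {
        "http://example.org/a": "tulip growing in cold climates",
        "http://example.org/b": "banned book reviews and censorship news",
        "http://example.org/c": "censorship and free speech on the net",
    }

    def matches(text, terms):
        return all(term in text for term in terms)

    def search(terms):  # select the matching set
        return [url for url, text in PAGES.items() if matches(text, terms)]

    def censor(terms):  # suppress that very same set
        return [url for url, text in PAGES.items() if not matches(text, terms)]

    print(search(["censorship"]))  # the pages a user finds...
    print(censor(["censorship"]))  # ...are exactly the pages a censor would drop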

In the past few years, search engine censorship has been on the rise. Through a number of French court cases, the US-based Yahoo auction site was forced to prevent French users from perusing nazi memorabilia; while the relevant court decisions have since been overturned in the US, (25) by now a number of search engines have voluntarily adapted their local, nationalised versions, so that people consulting the German, French and Swiss versions of Google will not be able to find neonazi or white supremacy content, even though these pages are quite legal in the countries where they are hosted and indexed. (26) (However, the mother of all Googles, the international Google at http://www.google.com, still lists them and is readily accessible to German, French and Swiss users.) Also, the precise criteria for dropping pages from the nationalised search indices are completely unclear. Who decides which page is labelled as nazi-like or neonazi? On what grounds? (27) And what are the chances that these pages, after due process, would be deemed on the verge but acceptable anyway in either Germany, Switzerland or France?

The reason why search engines engage in such censorship is obvious: to prevent lawsuits. The big question for the future is: how far will this local censoring go? Will the universal index - in this case, the one at google.com - remain generally accessible? And if not, will other, by then more daring, search engines supplant them?

New monopolies: connectivity/bandwidth

So far, we have looked at new ways of distributing and new methods of censorship. Another important issue is ownership.

Many media watchdogs are worried about concentration of ownership leading to media monopolies; the Italian case (Berlusconi) springs to mind. However, there is no similar worry over monopolies on the internet.

In some countries, access to the net is completely controlled by the government. In China for instance, one needs to register before one can buy a modem; in some countries, the only body providing internet access is the government-owned telco. In countries where such a monopoly exists, it is comparatively easy to censor users: since the government controls access, it can restrict users or impose a national proxy blocking specific pages (Saudi Arabia does so, just like Dubai and Singapore).

In industrialised Western countries, monopolies are on the rise. While a country may have many ISPs providing internet access, the ISPs themselves need to buy bandwidth and connectivity from so-called upstream providers. These upstream providers in turn have their own upstream providers where they buy bandwidth and connectivity. Worldwide, that specific market is owned by four or five companies.
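
The pattern is easy to make visible: follow each ISP's chain of upstream providers and see where it ends. In the sketch below all names are invented, but it mirrors the Swedish case described next, where every chain converged on a single company:

    # Each provider buys bandwidth from the next one up. Follow the chain.
    UPSTREAM = {
        "isp-north": "carrier-a",
        "isp-south": "carrier-b",
        "carrier-a": "backbone-x",
        "carrier-b": "backbone-x",  # every chain ends at the same company
    }

    def ultimate_provider(isp):
        while isp in UPSTREAM:
            isp = UPSTREAM[isp]
        return isp

    print({isp: ultimate_provider(isp) for isp in ("isp-north", "isp-south")})
    # {'isp-north': 'backbone-x', 'isp-south': 'backbone-x'} - one chokepoint

If that single chokepoint tells all its clients to refuse a customer, that customer has nowhere left to buy connectivity.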

If you follow the line upwards in Sweden, you will discover that all ISPs depend for their bandwidth and connectivity upon one single US-based multinational: MCI/Worldcom. In an infamous case, where a user's page repeatedly garnered complaints and the user's provider, Flashback, refused to block the page - the prosecutor had investigated the page earlier and found it legal - a complainant went to Flashback's upstream provider, who then decided to block the whole of Flashback. When Flashback tried to buy connectivity and bandwidth elsewhere, they discovered that MCI/Worldcom was the final upstream provider in the whole country - and MCI/Worldcom had told all their clients to refuse Flashback. All this happened over one single user's page that the prosecutor had deemed legal. (28)

Currently, wireless internet is on the rise. (29) Wireless internet operates in an unregulated and unlicensed band of spectrum that is shared and available for use by anyone. Until now, this band was most commonly used by personal appliances, such as microwave ovens or cordless home telephones, and even by the radar 'gun' used by law enforcement to read the speed of a moving vehicle.

Unlike today's wired network, a wireless network requires little more than an access point (abbreviated as AP). Access to a wireless-based service doesn't require an expensive connection to each user - there is no need for running wires to each building, or for the installation of a satellite dish. Wireless technology is also far less expensive to deploy than the limited wireless technologies of existing cellular service providers. And, because in most countries it operates in an unregulated spectrum, anyone can deploy a wireless access point. Basically, a wireless access point is nothing less than a broadband network in its own right.

Wireless connectivity might in the near future become one of the prime means of offering connectivity to countries lacking a telephone or cable infrastructure. There are great concerns that the "big players" in the connectivity market will attempt to block this development, fearing that wireless connectivity will lose them part of the market (a part which they themselves have not yet deemed interesting enough to explore).

What is journalism, and who qualifies as a journalist?

One last question to ponder is what, in this post-Gutenberg era, qualifies as journalism and who qualifies as a journalist. This question is of special importance for the OSCE/FOM, since its tasks include providing early warning on censorship and other violations of freedom of expression, responding to obstruction of media activities, and acting against unfavourable working conditions for journalists.

It used to be rather straightforward: journalism was what the media engaged in, and a journalist was anybody who would work for such a medium. With the rise of the net, that definition has become too strict. After all, one need no longer be employed by a radio or television station, magazine or newspaper. Any individual with internet access can start a news magazine of his own - he is suddenly equipped with a Gutenberg press and a distribution center.

While people making a web site about their private hobby - collecting stamps, or gossiping about pop idols - are not likely to be put through any political hassle, there is nothing to say that others, who engage in more political content, will not suffer the same repression that 'classic' journalists experience. And if they do, there is no reason whatsoever to withhold from them the kind of support that their colleagues working for traditional media hope to get from the OSCE/FOM. In fact, they might need it all the more, because there are not many organisations standing up for them.

Apart from that, internet journalists increasingly face a new problem. Several governments - to wit: Italy, Spain and Turkey, while Finland is proposing to do the same - have imposed existing media laws upon internet publications, demanding that each and every web site owner register himself and inform a designated authority of any updates or changes, a demand which is clearly impossible to live up to on the net. (30) Until now, laws like this have been invoked only to penalise web sites and internet journalists that do not concur with official policies.

Conclusions and recommendations

The rise of the internet creates all kinds of new questions pertaining to freedom of the media and the freedom to access media. We have seen a number of these questions in the above, and there are questions this article hasn't even touched upon. How are texts distributed and accessed nowadays? Will ownership of texts (copyright) remain the same? (31) What is the place and meaning of internet journalism, as compared to broadcast or written journalism? How are media laws applied to the net? What does censorship look like in a post-Gutenberg epoch, and what do media monopolies look like?

In answering these questions we should at least take the following points into account:


  1. Censorship on the net does not merely copy censorship of the classic or traditional media: it is more diffuse, less centralised, more widespread, and far less tangible than older forms of censorship, and it is becoming more and more common in industrialised countries.

  2. The OSCE/FOM will have to adjust its working definition of censorship to encompass imposed content filtering and limiting or denying bandwidth.

  3. The implementation of censorship is slowly being delegated from governments to the internet industry, whereby the latter is given considerable power to wield over citizens' (constitutional) rights.

  4. Journalism is changing face. People who are persecuted over their web pages need our support just as much as people who are prosecuted over their work in the traditional media. After all, one type of journalism merits as much protection as the other.

  5. The political risks of internet monopolies need to be taken into account, and varied means of access need to be promoted and supported.

  6. Finally, we need to take into account the vulnerability of certain pivotal services on the net, par excellence that of search engines, and support them when they are under siege.


Karin Spaink,
for the OSCE/FOM
January 2003


Footnotes:


  1. Electronic Frontier Foundation: Building People In. Architecture Is Policy, http://www.eff.org/buildin.html.
  2. Gb means 'gigabyte'. One gigabyte equals 1024 megabytes (Mb); a megabyte contains 1024 kilobytes (Kb); a kilobyte, finally, consists of 1024 bytes. Thus, a gigabyte is 1,073,741,824 bytes. For comparison's sake: this document is 152,064 bytes long, so it would take 7061 times this text to make one gigabyte of data.
  3. Erik Hensema, FAQ/VVV: De XS4ALL newsservers, December 14, 2002, http://groups.google.com/groups?selm=newsservers-1039935300-3268@hensema.net
  4. Information taken from Net Genesis (now defunct), quoted in A Profile of the Internet, http://www.cwrl.utexas.edu/~tonya/309m/class/internet.html, section 2. I proudly testify that my web site was one of the 23,500 counted in June 1995.
  5. On its front page, Google states the exact number of indexed pages; the count is updated automatically.
  6. This is true only for the structure of the net, not for its content. When the server hosting my web site is down, nobody can reach my website. However, as far as routers are concerned - the machines that point the way on the internet by making data packets hop from one place to another - a failed router's tasks will readily be taken up by other routers. Thus, the damage is minimised.
  7. "Censorship" in this respect should be taken broadly. It does not only refer to pages being yanked by governrnents or by other authorities, but also to pages being closed over libel or copyright law.
  8. In December 2001, Safeweb received 7 million dollars from IN-Q-Tel, the CIA's venture capital fund, and from a second investor for the development of this software, which was dubbed Triangle Boy. The product seems to have been discontinued. Safeweb is however still in the business of thwarting those trying to police the net. Recently, they released SEA Tsunami for secure remote access: once you log in to Safeweb's SEA, your activities on the net can no longer be logged (see http://www.safeweb.com/sea_tsunami_features.html). Western states are more and more interested in such logs, and the EU is currently discussing laws that make retention of user logs by ISPs mandatory. The UK already has such a law, called the RIP Act. For more information about Safeweb and SEA, see Thomas C. Greene, "US company defeats Brit RIP Act", The Register, 17 January 2003, http://www.theregister.co.uk/content/8/18017.html.
  9. For an extensive overview of these censorship efforts, their downsides and their workarounds, see Felipe Rodriquez, Burning the village to roast the pig, November 30, 2002 OSCE Vienna workshop. Bennett Haselton however points out that circumvention software is prone to fail in the long run; see Bennett Haselton, "List of possible weaknesses in systems to circumvent Internet censorship", http://www.peacefire.org/circumventor/list-of-possible-weaknesses.html.
  10. Historically, censorship is always either circumvented through hi-tech (which is too difficult for censors), or by reverting to low-tech (which is too common for it to be censored). When the USSR made printing difficult, dissidents fell back upon the manual carbon-copying of books and articles (Samizdat); B92, the censored Serbian news broadcast, used both hi- and low-tech and eventually depended upon a combination of internet cable connections for uploading broadcasts and foreign radio stations broadcasting them via medium wave.
  11. PGP (encryption) is a good example. While PGP provides an almost foolproof method of rendering e-mail communications illegible to all outsiders (and thus to snooping governments), many people find it very difficult to use, although PGP nowadays comes with a broad variety of good and easy-to-use interfaces. Somehow, PGP still has too technical a 'feel' for most people, which in itself works as a rather effective deterrent against using it, no matter how useful or necessary PGP might be.
  12. In principle, at least. Some countries are badly connected, and the increasing habit of Western websites to use lots of Flash and other heavy applications makes it very difficult to view these web pages: it takes ages for them to load - and thus viewing them becomes very expensive. However, web space - thus, publishing - is getting cheaper by the day, and there are quite a number of places where one can get free web space (Lycos, Geocities, Tripod etc.). Postings on usenet (newsgroups) are archived automatically; one doesn't even need to have a website to store them.
  13. See Harvard researchers Jonathan Zittrain and Benjamin Edelman, Documentation of Internet Filtering in Saudi Arabia, http://cyber.law.harvard.edu/filtering/saudiarabia/
  14. Felipe Rodriquez describes the principle in his essay Burning the village to roast the pig, November 30, 2002 OSCE Vienna workshop.
  15. See the Electronic Frontier Australia's paper Internet Censorship in Australia, December 20, 2002, http://www.efa.org.au/Issues/Censor/cens1.html, and Felipe Rodriquez, Burning the village to roast the pig, November 30, 2002 OSCE Vienna workshop, chapter 2.
  16. Alexander J. Kleinjung, "Vom Daten-Highway auf die Straße" in the German edition of C'T, September 2002. The law is heavily criticised by, amongst others, Initiative für ein freies Internet; Plattform zur Veranstaltung von Online-Demonstrationen (ODEM), http://odem.org/, and the Chaos Computer Club (CCC), at http://www.ccc.de/censorship/.
  17. The US Congress passed the Children's Internet Protection Act (CIPA) on 15 December 2000. The full text of the act is at http://www.ifea.net/cipa.html.
  18. See Scientology censors WWW for members, http://scn.martinobrien.com/ABUSE/KRASEL/COS/FILTER/FILTER1.HTM.
  19. The US Children's Internet Protection Act is being fought by the ACLU, the American Civil Liberties Union. The case is being reported on as it develops at http://archive.aclu.org/features/f032001a.html.
  20. "Hate speech" is a loosely defined lump term used to denote the propagation of hatred against ethnic minorities. Sometimes they border on or are neonazi kind of pages. The amount of attention such pages get from both other media and policy makers, is however somewhat disproportional. To wit:
    In 2000, Hatewatch.org counted between 450 and 500 "hard core" hate sites and circa 1750 sites that it deemed "problematic". (Source: QuickFacts: Hate and Hate Crimes, http://www.media-awareness.ca/eng/issues/stats/isshate.htm.) Let's be very pessimistic and set their number at 50,000 pages all in all. Let's then set the amount of all existing pages in 2000 at 1 billion, a rather high number. Basic math tells us that even with these exaggerated figures, "hate pages" make up a mere 0.05% of the total amount of pages. One would wish that there was as little racism and hatred in the analogue world.
  21. Páll Thorhallsson, Freedom of the media and the Internet, November 30, 2002 OSCE Vienna workshop.
  22. Recommendation Rec(2001)8 of the Committee of Ministers to member states on self-regulation concerning cyber content, adopted by the Committee of Ministers on 5 September 2001. For the full text, see http://cm.coe.int/ta/rec/2001/2001r8.htm.
  23. Sandy Starr elaborates on the theme in The diminishing importance of constitutional rights in the internet age, his contribution to the November 30, 2002 OSCE Vienna workshop.
  24. A Swedish case (Flashback facing MCI/Worldcom over one of its users) and a Dutch/US case (Xtended Internet facing Scientology over one of its users) are described at large in Christiane Hardy and Karin Spaink, "Freedom of the internet. Our new challenge", OSCE Yearbook 2002, Vienna 2002, also at http://www.spaink.net/english/osce_internetfreedom.html. In both cases, the providers stood up for their clients; in both cases, their upstream providers simply cut the ISP's connection without redress being possible.
  25. A summary of the ruling is given in U.S. Court Releases Yahoo! Inc. From Compliance With French Court Order, http://www.ffhsj.com/bancmail/pdf/011120.pdf. For a more elaborate description, see Christiane Hardy and Karin Spaink, op. cit.
  26. Researchers Jonathan Zittrain and Benjamin Edelman from Harvard Law School have investigated Google at large and discovered that Google has dropped 113 sites, in whole or in part, from its localised versions for French, German and Swiss users (http://www.google.fr, http://www.google.de and http://www.google.ch respectively). See Jonathan Zittrain and Benjamin Edelman, Localized Google search result exclusions. Statement of issues and call for data, http://cyber.law.harvard.edu/filtering/google/. The paper gives a brief overview of other attempts to censor search engines.
  27. Zittrain and Edelman (op. cit.) stress that while "many such sites seem to offer Neo-Nazi, white supremacy, or other content objectionable or illegal in France and Germany, [..] other affected sites are more difficult to cleanly categorize."
  28. Flashback (http://www.flashback.se) is currently up again, but now only as a news agency; it has abolished its user pages. A list of news articles about the shutdown is available through Flashback's mirror at http://fb.provocation.net/www.flashback.se/. The case is described in detail in Christiane Hardy and Karin Spaink, op. cit.
  29. The following description is taken from Alan Levy's Matching new wifi technology with virtual private networks to create affordable universal internet access, at http://www.centerdigitalgov.com/international/story.php?docid=36757.
  30. For Italy, see Manlio Cammarata, "Qui succede un 'quarantotto'", in Interlex, April 4, 2001, http://www.interlex.it/stampa/48.htm and Snafu, Re: <nettime> Indymedia Italy under attack, at http://amsterdam.nettime.org/Lists-Archives/nettime-l-0202/msg00109.html. For Spain, see Julia Scheeres, "Fears of a Website Inquisition", in Wired, May 29, 2001, http://www.wired.com/news/business/0,1367,44110,00.html and Steve Kettmann, "Spanish Web Law Sparks Debate", also in Wired, May 1, 2002, http://www.wired.com/news/print/0,1294,52201,00.html. For Turkey, see Kemal Altintas, Tolga Aydin, and Varol Akman, "Censoring the Internet: The Situation in Turkey", published in First Monday, http://www.firstmonday.org/issues/issue7_6/altinta/index.html. For Finland, see Electronic Frontier Finland, Freedom of expression: The law on liabilities in public communications, at http://www.effi.org/sananvapaus/index.en.html.
  31. For that question, see Jennifer Jenkins' contribution to the November 30, 2002 OSCE Vienna Workshop, entitled The importance of the public domain for creativity, innovation and culture.

Copyright Karin Spaink.
This text is offered for private use only. Any
other use requires the author's written permission.