"We explore... and you call us criminals. We seek after knowledge... and you call us criminals. We exist without skin color, without nationality, without religious bias... and you call us criminals... Yes, I am a criminal. My crime is that of curiosity. My crime is that of judging people by what they say and think, not what they look like. My crime is that of outsmarting you, something that you will never forgive me for. I am a hacker, and this is my manifesto." - The Mentor
Any newbie hacker will be able to tell you that hacking relies on "exploits," preexisting bugs that are leveraged by the hacker to gain access to a computer. Burglars know that houses have their own exploits. Locks may be picked, windows broken, doors jimmied. The same can be done to a computer: buffers may be overflowed, trapdoors sprung, Trojan Horses deployed. Yet, while a burglar's exploits often rely on physical force to gain entry, a hacker's exploits generally rely on logical force. That is, hackers generally hack the possible bugs and loopholes of a machine's logical code base (rather than taking a crowbar to its hardware).
Hackers don't care about rules, or feelings, or opinions. They care about what is true and what is possible. And in the logical world of computers, if it is possible then it is real. Can you break into a computer?--not "should you" or "is it right to." When poured into a vessel, water will fill the vessel completely; when poured into a computer network, the hacker will enter any space available to him/her. In fact, possibility often erases the unethical in the mind of the hacker. An anecdote from the legendary hacker Acid Phreak illustrates this well. After being told certain personal details of his rhetorical opponent John Perry Barlow, information which he would later use to obtain Barlow's credit history, Acid Phreak screamed, "Mr. Barlow: Thank you for posting all I need to know to get your credit information and a whole lot more! Now, who is to blame? ME for getting it or YOU for being such an idiot?!" Most hackers would answer: You, for being such an idiot.
Fredric Jameson said somewhere that one of the most difficult things to do under contemporary capitalism is to envision utopia. This is precisely why possibility is important. Deciding (and often struggling for) what is possible is the first step toward a utopian vision based in our desires, based in what we *want*. Hackers are machines for the identification of possibility.
The relationship between utopia and possibility is a close one. It is necessary to know what one wants, to know what is possible to want, before a true utopia may be envisioned. "When computers become available to everybody the hackers take over," wrote Stewart Brand in 1972. "We are all Computer Bums, all more empowered as individuals and as cooperators." The hacker's unique connection to the realm of the possible, via computer technologies which structure themselves on precisely that threshold of possibility, gives the hacker special insight into the nature of utopia, or what we want out of computers.
One of the most important signs of this utopian instinct is the hacking community's anti-commercial bent. Software products have long been developed and released into the public domain, with seemingly no profit motive on the side of the authors, simply for the higher glory of the code itself. "Spacewar was not sold," Steven Levy writes, referring to the video game developed by several early computer enthusiasts at MIT. "Like any other program, it was placed in the drawer for anyone to access, look at, and rewrite as they saw fit." The limits of personal behavior become the limits of possibility to the hacker. Thus, it is obvious to the hacker that one's personal investment in a specific piece of code can do nothing but hinder that code's overall development. Code does not reach its apotheosis for people, but exists within its own dimension of perfection. The hacker feels obligated to remove all impediments, all inefficiencies that might stunt this quasi-aesthetic growth. "In its basic assembly structure," writes Andrew Ross, "information technology involves processing, copying, replication, and simulation, and therefore does not recognize the concept of private information property." Commercial ownership of software is the primary impediment hated by all hackers because it means that code is limited--limited by intellectual property laws, limited by the profit motive, limited by corporate "lamers." Even Kevin Mitnick, a hacker maligned by some for his often unsavory motivations, admits that the code itself is of higher priority than any commercial motivation "such as trying to get any type of monetary gain."
British hacker Dr-K claims that "corporations and government cannot be trusted to use computer technology for the benefit of ordinary people." It is for this reason that the Free Software Foundation was established in 1985. It is for this reason that so much of the non-PC computer community is dominated by free, or otherwise de-commercialized software. The hacker ethic thus begets a utopianism simply through its (anachronistic) rejection of all commercial mandates.
However, greater than this anti-commercialism is a pro-protocolism. A protocol is a set of guidelines that govern how computers communicate over networks. Protocol, by definition, is open source. That is to say, protocol is nothing but an elaborate instruction list of how a given technology should work, from the inside out, from the top to the bottom, as exemplified in the RFCs, or "Request For Comments," documents freely available from the Net. While many closed source technologies may appear to be protocological due to their often monopolistic position in the marketplace, a true protocol cannot be closed or proprietary. It must be paraded into full view before all, and agreed to by all. It benefits over time through its own technological development in the public sphere. It must exist as pure, transparent code (or a pure description of how to fashion code). As concerned protocological actors, hackers have often called attention to commercial or governmental actions that impede protocol by making certain technologies proprietary or opaque. One such impediment is the Digital Millennium Copyright Act (DMCA) of 1998. The hacker magazine 2600 has pointed out that the DMCA "basically makes it illegal to reverse engineer technology," reverse engineering being the term used to describe the inference of source code through an examination of the results of that code. "This means that you're not allowed to take things apart and figure out how they work if the corporate entities involved don't want you to." This certainly is a pity for those of us wishing free use of commercial technology products we have purchased; however, it is a greater pity for possibility. For if technology is proprietary it does nothing but limit what is possible.
The synonym for "possibility" most commonly used in today's technospeak is "access." On the Net, something is possible only if it is accessible. Hackers reject situations where access to technology is limited. Purveyors of proprietary technology "want to be able to dictate how, when, and where you can access content," complain the editors of 2600 over a lawsuit filed by the Motion Picture Association of America against hackers who had cracked the proprietary limitations of the DVD media format. 2600 writes, correctly, that the real issue here is one of control over a specific technical knowledge, not potential piracy of DVD media. "The Motion Picture Association of America wanted to make sure they had control and that nobody--not hackers, not civil libertarians, not ordinary people in the street--dared to figure out how to challenge that control. Selling a pirated movie is nothing to them. But telling people how the technology works is the real threat."
The question of access is a top priority today: access to physical space, and access to psychic space. We can learn from the hackers, those machines of possibility, to help carve out our possible futures.