This article appears in the August 10, 2007 issue of Executive Intelligence Review.
ARTIFICIAL INTELLIGENCE
'Spacewar': Welcome to the
'Post-Human' Era
by Gabriela Arroyo Reyes, LaRouche Youth Movement
Ready or not, computers are coming to the people.
That's good news, maybe the best since psychedelics. It's way off the track of the "Computers: Threat or menace?" school of liberal criticism, but surprisingly in line with the romantic fantasies of the forefathers of the science such as Norbert Wiener, Warren McCulloch, J.C.R. Licklider, John von Neumann, and Vannevar Bush.
The trend owes its health to an odd array of influences: the youthful fervor and firm dis-Establishmentarianism of the freaks who design computer science; an astonishingly enlightened research program from the very top of the Defense Department; an unexpected market-banking movement by the manufacturers of small calculating machines; and an irrepressible midnight phenomenon known as Spacewar.
Reliably, at any night-time moment (i.e., non-business hours) in North America, hundreds of computer technicians are effectively out of their bodies, locked in life-or-death space combat, computer-projected onto cathode ray tube display screens, for hours at a time, ruining their eyes, numbing their fingers in frenzied mashing of control buttons, joyously slaying their friends and wasting their employers' valuable computer time. Something basic is going on.
"Spacewar," by Stewart Brand
(Rolling Stone, Dec. 7, 1972)
Within the remote confines of Stanford's Artificial Intelligence Laboratory in Palo Alto, California, something big was brewing, the implications of which would make the likes of Bertrand Russell, Norbert Wiener, and Mephistopheles himself cackle.
In all their righteous, scraggly glory, the self-proclaimed "enlightened" hippies, from New York City to Haight-Ashbury, who had "turned on, tuned in, and dropped out" to the point of dullness, were immersing themselves in the writings of Norbert Wiener, Buckminster Fuller, and Marshall McLuhan. It was through these New Age visionaries that they could vicariously envision themselves in a cyberuniverse, one in which they could leave behind any semblance of responsibility for the past, present, or future, and in which material reality could be wholly imagined as an information system.
The mysterious but long-awaited "Internet" was about to be unveiled upon an unsuspecting world, like a Pandora's box, and there were high hopes everywhere that, as MIT's Nicholas Negroponte put it, it would "flatten organizations, globalize society, decentralize control, and help harmonize people." Long gone would be the days of dirigistic economies and industry; the faint sounds of spinning lathes, milling cutters, dirigibles and gliders, cranes and tractors, would inevitably fall into an eerie silence. In their place, the Internet would usher in an unprecedented era, as it paved the way for a "digital generation." But not merely digital in the conventional sense: Dr. Timothy Leary (not one to jump on this cataclysmic bandwagon too late), having reached the profound realization that psychedelia as a radical new religion attracted too few followers, opted instead to crown himself the new high priest of cyberculture, prophesying that virtual reality was the new and improved "electronic LSD."
In a cultural landscape such as this one, where it can be said with certainty that the fate of entire language-cultures teeters on the edge of a slippery precipice, it becomes difficult to ignore the debris of a civilization that had once produced minds of impressive caliber and moral fiber, minds which laid the very foundations of our culture, science, and epistemology, and of the maxim that man bears a divine spark of reason. Whoever would be so naive and gullible as to be seduced by this "technetronic" symbiotic union of gadgetry with the perverse would be, wittingly or unwittingly, giving in to the tried and true methods of the Luciferian Venetian Empire; the ancient hands of Time bear witness to the fact that Venice would rather kiss the hand it could not sever.
From the Counterculture to Cyberculture
The two pillars of the assault on the American Intellectual Tradition, although cloaked in what appeared to be antithetical garments, were cybernetics and the drug counterculture. In the same way that the youth who came into their adolescent years during the Vietnam War were corrupted in the aftermath of World War II, today there is a culture of rabid existentialism, ahistoric by its very nature, which, because of the multiplicity of options available to it, does not know which reality to choose to make its own. The preconditions are being set by the modern-day descendants of the aforementioned Wieners and Russellites to ensure a new artificial paradigm-shift into a culture that would bring about its own destruction, and with it, the destruction of the most advanced ideas that civilization has produced to date: a culture that only a cyberculture could offer.
The Advanced Research Projects Agency (ARPA) was established on Feb. 7, 1958 (by DOD Directive 5105.15), in response to the Soviet Union's launching of Sputnik, to ensure that missile response capability in the United States would be adequate. Under the Defense Department, ARPA was bequeathed $520 million by Congress, and with it, sole responsibility "for the direction or performance of such advanced projects in the field of research and development as the Secretary of Defense shall, from time to time, designate by individual project or by category" (DOD Directive 5105.15).
The National Aeronautics and Space Administration (NASA) had meanwhile been created as a separate civilian entity, and by 1960 the portion of ARPA's research dealing with space and missile technology had been moved out of military jurisdiction and over to it, at which point ARPA was left with nothing but a large budget. The morbidly astute behavioral psychologist J.C.R. Licklider (who would later run the Command and Control Research division of ARPA) was quick to suggest that ARPA, which would, in 1972, change its name to the Defense Advanced Research Projects Agency (DARPA), should invest heavily in computer and artificial intelligence research. As the Cold War intensified, ARPA became a willing vessel for Norbert Wiener's ideas of cybernetic unmanned warfare, which relied on computers built on the logic designs of John von Neumann.
An overwhelming number of research and development initiatives and disciplines under the rubric of "interactive computing systems," associated with Human Systems Integration (HSI), and dealing on one level or another with human brain-machine interfaces, the Internet, or the gaming industry, have a genesis which can be traced back to the earliest days of ARPA. It was through ARPA that the cybernetic blueprint for the human-machine interface would be unveiled. Today, DARPA proudly carries on the cybernetic torch with the AugCog (Augmented Cognition) program, which, through its ongoing research and development for the Army, Navy, Marines, and Air Force, seeks to develop a computational system that, with the aid of prosthetic technologies such as cued memory retrieval, would enhance the overall effectiveness and performance of the warfighter.
"The newly emerging field of AugCog is aimed at revolutionizing the way humans interact with computer-based systems by coupling traditional electromechanical interaction devices (such as a mouse or a joystick) with psychophysiological methods (respiration, heart rate, EG, functional optical imaging), where human physiological indicators can be used in real time to drive system adaptation or a priori assess potential design issues which may induce information overload or inefficient decision making" (DARPA). This is the beginning of what some hope will be the next big paradigm-shift, not only in interactive computing, but that it will come to define new parameters for what it means to be human.
The barely recognizable remnants of the military-industrial complex have been transformed into the military-entertainment complex; this is the training ground for what is now infamously called "post-human" warfare: a realm in which the unyieldingly faithful, self-avowed worshippers of Wiener and von Neumann, the fathers of cybernetics and of the digital computer, have tirelessly dedicated themselves to the propagation of a "Renaissance" in which there exists a seamless fusion between the digital, cybernetic machine and the human being. It is a grave error, according to them, to assume that cognition is an occurrence that takes place in the human mind. Instead, the high priests of post-humanism audaciously preach that cognition is a logical, systemic activity distributed throughout the environments in which human entities just happen to move and work.
Can Machines Supersede Man?
We need first to understand that the human form, including human desire and all its external representations, may be changing radically, and thus must be re-visioned. We need to understand that five hundred years of humanism may be coming to an end as humanism transforms itself into something that we must helplessly call post-humanism.
Ihab Hassan, "Prometheus as Performer:
Toward a Posthuman Culture?"[1]
The litmus test for the age-old question of whether or not machines could supersede man's intellect was the "imitation game," now widely known as the Turing test, described by Prof. Alan Turing in his 1950 paper, "Computing Machinery and Intelligence." It consists of the following procedure: A human judge engages in a conversation with two other parties, one a human, and the other a machine; based on the responses from each of them, the judge, who does not know which is which, must figure out which is the human and which is the machine. It is presumed that both the human and the machine will try to mislead the judge as to their real identities, each posing as the "most human." If an intelligent being cannot tell the intelligent machine from the intelligent human, this failure, according to Turing, would be the final and necessary proof that machines can think, and would draw a clean distinction between the intellectual and the physical capacities of the thinking human being.[2]
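Since so much of the argument turns on this procedure, here is a minimal sketch in Python of one round of the imitation game as just described; the "machine" is a toy canned-response program of our own invention, and nothing in the sketch comes from Turing's paper itself:

# Hypothetical sketch of one round of the imitation game: a judge converses
# with two unseen parties, 'A' and 'B', and must decide which is the machine.
import random

def machine_reply(prompt):
    """A trivial stand-in for the machine contestant."""
    canned = {
        "are you human?": "Of course I am. Are you?",
        "what is 2 + 2?": "4, though I had to think about it.",
    }
    return canned.get(prompt.lower(), "An interesting question. What do you think?")

def human_reply(prompt):
    """A stand-in for the human contestant (scripted here for the demo)."""
    return "Honestly, I am not sure how to answer that."

def imitation_game(questions):
    """One session: the judge sees answers from 'A' and 'B', roles hidden."""
    contestants = {"A": machine_reply, "B": human_reply}
    if random.random() < 0.5:          # conceal which label is the machine
        contestants = {"A": human_reply, "B": machine_reply}
    for q in questions:
        print("Judge: %s" % q)
        for label in ("A", "B"):
            print("    %s: %s" % (label, contestants[label](q)))
    # The judge must now guess which label hides the machine; on Turing's
    # criterion, the machine "passes" if the judge can do no better than chance.

imitation_game(["Are you human?", "What is 2 + 2?"])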
It was not until 1974, at a meeting of the American Society for Cybernetics in Philadelphia, that the phrase "second order" cybernetics was officially coined by Heinz von Foerster. Three main "waves" of cybernetics are distinguishable today: "first order" cybernetics, which Wiener helped engineer, and which lasted from the mid-1940s to the mid-1970s; "second order" cybernetics, which lasted from the mid-1970s until the mid-1990s; and "third order" cybernetics, also known as the period of social cybernetics (with which the futurists and humanist educators of today seem to preoccupy themselves the most), which began in the mid-1990s.[3]
Carrying on where Turing left off, Hayles, Hassan, and Hans Moravec propose in their rehashed theories that human identity is essentially an informational pattern, and that it has become increasingly "disembodied." Moravec even goes so far as to make the modest proposal that, in the not-too-distant future, human consciousness will itself be downloadable into a computer.
"We are cyborgs not in the merely superficial sense of combining flesh and wires, but in the more profound sense of being human-technology symbiots: thinking and reasoning systems whose minds and selves are spread across biological brain and non-biological circuitry."[4]
The fusion between the biological and technological domains has created what academicians and scholars are likening to a "cognitive machinery," which they predict will inevitably evolve into a self-perpetuating process. To begin to unravel the convoluted phenomenon they describe, one need only assess with a clinical eye the woes, curses, and bizarre sentimentality that pour forth from the mouths of mesmerized computer video-game players, affirming that they are merely projecting their proprioceptive senses into the simulation that is the gaming world. As though in a trance produced by the flashing graphics of the technicolor LCD screen, many devout gamers find themselves locked in the same positions for countless hours, the left hand tapping away mindlessly on the keyboard, the itchy "trigger-happy" finger nervously and repetitively guiding the mouse up and down. Entrenched, to the point of exaggeration, in the simulated space of the virtual world; indulged, to the point of complete oblivion to the real world around them: there is a fluid intermingling of flesh and metal, in which there seemingly exist no physical boundaries between their fleshy bodies and the joystick that has become an unconscious extension of their hands.
Welcome to the era of disembodied information, where flesh and metal become one. But before the preconditions of a post-human future are fully comprehended, the question must be posed: Who are the agents of this degrading misnomer that passes for "human" science?
N. Katherine Hayles, Professor of English at UCLA, and author of the cult classic of cyberneticists and futurists alike, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics, speaks for an emerging breed of academician determined to keep this odiously entropic and venomous dogma alive. Hayles describes the kooky "research" of Kevin Warwick, Professor of Cybernetics at the University of Reading in England, who inserted an implant into his arm: the first implant was a passive device, communicating only with sensors embedded in the environment. He went on from this first attempt to a second implant that also sends signals to his nervous system, creating an integrated circuit linking his evolving neural patterns directly with sensors and computer chips embedded in the external environment. Such are the depths to which these engineers of the Apocalypse will plunge in their promotion of a science (by nomenclature only) devoid of profound and impassioned ideas.
Hayles describes what she sees as the promising future of the post-human vision, which, despite its remaining problems and dangers, makes an otherwise meaningless and miserable existence quite bearable.
The Merger of Defense and Entertainment
The other leading propagandists of this perverse social fusion between man and machine include the Institute for Creative Technologies in Marina del Rey, California. In December 1996, the National Academy of Sciences hosted a workshop on the common and organized aims of the defense and entertainment industries in modeling and simulation. The report that emerged in the aftermath of this workshop, at the request of Prof. Michael Zyda (a computer science specialist in artificial intelligence at the Naval Postgraduate School in Monterey, California, and director of the MOVES Institute, which spawned the game "America's Army"), prompted the U.S. Army, three years later, to fund the University of Southern California with a $45 million budget to create a research center that would develop and advance military simulations, reflecting the overlap between the Pentagon and Hollywood. Also on this growing list of propagandists is the Institute for the Future (IFTF), which was founded in 1968 by former RAND Corporation researchers, and today claims to forecast the future.
Another incubator for the continued creation of explicitly anti-human ideas goes by the name of HASTAC (Humanities, Arts, Science, and Technology Advanced Collaboratory). The conception of the HASTAC consortium came in 2003, at a meeting of humanities leaders sponsored by the Mellon Foundation. Founder Cathy N. Davidson (vice provost for Interdisciplinary Studies, and co-founder of the John Hope Franklin Humanities Institute at Duke University) and co-founder David Theo Goldberg (University of California Humanities Research Institute, Irvine) had already envisioned a plethora of projects that would expand innovative uses of technology to create an unparalleled cyberinfrastructure. Included in the core leadership of HASTAC are Jeffrey Schnapp (director of the Stanford University Humanities Lab); Ruzena Bajcsy (director of the Center for Information Technology Research in the Interest of Society at the University of California, Berkeley); Hadass Scheffer (director of fellowship programs at the Woodrow Wilson National Fellowship Foundation); and Henry Lowood (curator for Germanic Collections and History of Science and Technology Collections at Stanford University Libraries).
HASTAC describes itself as "a voluntary consortium of leading researchers from dozens of institutions, who have been co-developing software, hardware, and cyberinfrastructure. Legal, ethical, social, historical, and aesthetic issues must also be carefully considered as we expand our capacities for accumulating and analyzing data and as we push the boundaries of science and what it means to be human." From among its ranks, HASTAC seeks to create a new generation of scholars in the humanities with expertise in the most advanced work in creating leading-edge information technologies, and to transform institutions in the process of spreading its cyber-humanities vision.
Only in its fourth year of existence, HASTAC already commands "academic attention" and has more than 70 institutions under its umbrella, including Wayne State, Duke, Boston, Cornell, George Mason, Rice, and Stanford Universities; the University of California at Irvine; the Universities of Michigan, Southern California, and Washington State; and last, but not at all least, the Woodrow Wilson National Fellowship Foundation, Digital Promise, and the John D. and Catherine T. MacArthur Foundation (the country's largest private grant-making institution, with assets of $6 billion). Two of the most ambitious projects under the HASTAC umbrella are "The global body and the virtual Cyborg," already underway through programs at Duke University, and "How they got game: Cultural implications of interactive simulations and video games," from the Stanford Humanities Lab, one of HASTAC's founding members.
It is from these pitiable echelons that the morally repugnant Timothy Lenoir crawls out. With funding from the Alfred P. Sloan Foundation, Lenoir's past endeavors included investigative projects launched from the Stanford humanities departments while he was teaching the history of science there. Today, Lenoir holds the Kimberly Jenkins Chair for New Technologies and Society at Duke University, where he continues research on the introduction of virtual reality into biomedicine and other humanities fields. The Jenkins "Collaboratory" exists for the sole purpose of investigating and pushing the limits of "transformative processes" in fields such as "cultural production" and human-machine engineering, as well as biotechnology.
Today, the "Game" project is housed at Duke University, and focuses on the development of "industrial-strength" simulations that are the product of the military's relationship with Hollywood and the gaming industry. It is here that project leaders from PEO STRICOM (the Army's Program Executive Office for Simulation, Training and Instrumentation Command); the Institute for the Future; Science Applications International Corporation (SAIC); the MIT Media Lab; and SIMNET (Simulator Networking) interface, like so many slime molds, in a "collaboratory" of simulation, first-person shooters, strategy, and storytelling.
These are not merely colorful concoctions springing forth from the fertile imaginations of mad scientists and pedagogues of calamity. This is a heinous attempt to create, from among the ranks of this emerging generation, a class of desensitized drones who will conform to the absurdity of a society in which nothing is held to be true, and everything is permitted. Reminiscent of the dark ages in science, when knowledge was suppressed, today it is not a question of annihilating science, but of controlling it. These are, and always have been, the preconditions for controlling a society. From the pits of the aforementioned nexus have sprung the seeds that were the necessary predecessors of the modern-day Darwinian globalized market and cyberculture, which have spawned a population on the verge of willingly surrendering that which renders them superior to apes, bacteria, and computers: their humanity.
[1] "Prometheus as Performer: Toward a Postmodern Culture?" Georgia Review 31, 4 (Winter 1977-78). In Performance in Postmodern Culture, Michel Benamou and Charles Caramello, eds. (Madison, Wisconsin: Coda Press, 1977).
[2] Alan Turing, "Computing Machinery and Intelligence," Mind, vol. LIX, no. 236, October 1950.
[3] Stuart Umpleby, "The Science of Cybernetics and the Cybernetics of Science," Cybernetics and Systems, vol. 21, no. 1, 1990.
[4] Andy Clark, Being There: Putting Brain, Body, and World Together Again (Cambridge, Mass.: MIT Press, 1998).