from: Friedrich=Kittler AT rz.hu-berlin.de
to: nettime-l AT desk.nl
subject: On the Implementation of Knowledge - Toward a Theory of Hardware

For David Hauptmann, SysOp of my professorship,
laid off by the Berlin Senate

The world in which we have lived for the last forty years is no longer broken up into stones, plants and animals but into the unholy trinity of hardware, software and wetware. Since computer technology (according to the heretical words of its inventor) is at the point of "taking control," the term hardware no longer refers to building and gardening tools but to the repetition, a million times over, of tiny silicon transistors. Wetware, on the other hand, is the remainder that is left of the human race when hardware relentlessly uncovers all our faults, errors and inaccuracies. The billion-dollar business called software is nothing more than that which the wetware makes out of hardware: a logical abstraction which, in theory - but only in theory - fundamentally disregards the time and space frameworks of machines in order to rule them.

In other words, the relationship between hardware, wetware and software remains a paradox. Either machines or humans are in control. However, since the latter possibility is just as obvious as it is trivial, everything depends on how the former is played out. We must be able to pass on to the coming generations - if not as the legacy of these times then as a kind of message in a bottle - what computer technology meant to the first generation it affected. In opposition to this, though, is the fact that theories from the outset turn everything they are at all able to describe into software, that they are already beyond hardware. There exists no word in any ordinary language which does what it says. No description of a machine sets the machine into motion. It is true that implementation, in the old Scottish double meaning of the word - at once a making into an implement and a completion or deployment - is indeed the thing which gives plans or theories their efficiency, but at the price of forcing them into silence. In this crisis, the only remaining remedy is also just as obvious as it is trivial. This essay, instead of attempting a general theory of hardware, which cannot be accomplished, turns first of all to history, in order to take the measure of what computer technology calls innovation, with the aid of a familiar hardware: writing. For reasons which are connected to the city of Berlin in this year, I further focus on one single hardware: the implementation of the knowledge produced by universities. With the double prerequisites of high technology and the scarcity of finances, a kind of knowledge that knows about knowledge hardware can probably do no harm.

1

Ernst Robert Curtius, who knew what he was talking about, called universities "an original creation of the [European] Middle Ages." Even this great medievalist, however, did not bother to clarify the kind of material basis this creation was founded upon. The academies of antiquity, the only comparable institutions, got by with hardware that was more modest and more plentifully available. In Nietzsche's wicked phrasing, Plato himself, in all his Greek "innocence," made it clear "that there wouldn't even be a Platonic philosophy if there hadn't been so many lovely young boys in Athens, the sight of whom was what first set the soul of the philosopher into an erotic ecstasy, leaving his soul no peace until he had planted the seed of all high things in that beautiful soil." The cultural legacy of a time in which the free citizens and the working slaves remained strictly separated coincided, then, with biological heredity. The youths who attended the Early Medieval universities, on the other hand, were monks. Their task involved neither procreation nor beauty, but work. Since the time of Cassiodor and Benedict, when it was allowed to fall to the level of a lowly craft or trade, this has consisted of writing. Every stroke of the quill on parchment, even if its meaning was lost to the writer, still as such delivered a flesh wound to Satan. Thus it came to be that monasteries, cathedral schools and universities began to produce books incessantly. Unlike the academies or schools of philosophy in antiquity, they were founded on a material basis which cast the transfer of knowledge between the generations in a form of hardware. In place of an amorous rapture between philosophers and young boys, an Arabic import came between professors and students: plain paper. In the writing rooms maintained by every university, under the direction of lecturers, the old books multiplied into a mass of copies. Hardly had the new university been founded when these copies, for their part, forced the founding of a university library. The newly acquired knowledge was multiplied in letters which were sent from scholar to scholar, soon demanding the founding of a university postal system. Long before modern territorial states or nation states nationalized the universities, the dark Middle Ages had already truly implemented this knowledge. It is well known that, as a legacy of this time when every university had at its disposal its own medium of storage (a library) and its own medium of transmission (a postal system), only the libraries remain. It is possible that the universitas litterarum, the community of those versed in writing, was a bit too proud of its literacy to keep it secret as the cleverer professions did. The fastest and largest pre-modern postal system, reaching all across Europe, is, after all, thought to have been maintained by the butchers. Whenever the butchers had to appear before the court, however, they would strategically deny their ability to read and write. It then came to be that, without much ado, the university postal service was merged with the state post upon which Kaiser Maximilian and his royal rivals founded their states. The abolition of the butcher post, however, was only achieved much later by the same kaisers and kings. Bans and prohibitions which were just as draconian as they were repetitive helped spark the Thirty Years War.

In much the same way as the university postal services, which perished due to the vanity of those trained in writing, the university writing rooms have also disappeared. For Gutenberg's invention of movable type was not aimed at the multiplication of books but at their beautification. Everything which had previously flowed, with the sweat of calligraphers who could never entirely avoid copying mistakes, into handwritten texts and miniatures was to become standardized, free of errors, and reproducible. Precisely this new beauty, however, made it possible to break knowledge down into software and hardware. Universities appeared, on the one hand, whose equally slow and unstoppable nationalization replaced the production of books with that of writers, readers and bureaucrats. On the other hand, that Tower of Babel of books also emerged, whose thousands of identical copies all carried the same page numbers, and whose equally un-falsifiable illustrations put before the eyes that which the pages described. Once Leibniz subjected the ordering of authors and titles to the simple ABCs, entire state and national libraries (such as those here in Berlin) were founded upon this addressability. At the same time, this alliance between text and image, book printing and perspective, gave rise to technical knowledge per se.

It is no accident that Gutenberg's movable letters have been called history's first assembly line. For it was the compiling of drawings and lettering, and of construction plans and instruction manuals, which first made it possible for engineers to build further and further on the shoulders - or rather on the books - of their predecessors, without being in any way dependent on oral tradition. Beyond the universities and their lecturing operations, going all the way back to the succession model of masters and journeymen, technical drawings and mathematical equations promoted a kind of knowledge which could even take book printing as its own basis. Even the aesthetic-mathematical revolutions, bearing fruit in Brunelleschi's linear perspective and Bach's well-tempered clavier, were based upon measuring devices like the camera obscura or the clock, whose complex construction plans could first be handed down through printed matter. The fact that Vasari placed the invention of the camera obscura, that technically implemented perspective, in the same year as Gutenberg's book printing was, of course, a mistake - but it was significant. In technical media, such as photography or the phonograph, precisely the same discoveries are at work, but with the difference that no hand, and thus no artistry, is any longer necessary to mediate between the algorithm and the machine. Perspective has its origin in the beam path of the lens; frequency analysis in the needle's cutting process. Instead of monks, scholars or artists, with analog media (in the lovely words of photography pioneer Henry Fox Talbot) "nature" itself guides "the pencil." However, the analog media of the greater 19th century pay a price for this self-sufficiency. The more algorithmic the transmission of their input data, the more chaotic is the storage of their output data. The immense storage facilities, holding in images and sounds that which was once known as history, replace history with real time, but they also replace addressability with sheer quantity. In spite of film philology (to use Munich University's bold neologism), no one can skim through celluloid or vinyl as they can through the philologist's books. For this reason, it is precisely the act of implementing optical and acoustic knowledge in Europe which has resulted in boundless ignorance. At the same historical moment that nation states were giving their populations democratic law in the form of general obligatory schooling, the people themselves saw writing fade away into high-tech arcana. Their unreadable power, systematically drifting away from the populations, has passed from the First World War's military telegraph system to the expanded directional radio of World War II and, finally, to the computer networks of today. The father of all transmission-technological innovations, however, has been war itself. In a strategic chain of escalation, the telegraph appeared in order to surpass the speed of messenger postal services; radio was developed to solve the problem of vulnerable undersea cables; and the computer emerged to make possible the decoding of secret - and interceptable - radio communications. Since then, all knowledge which gives power is technology.

2

Weighed on a moral scale, the legacy of this time may therefore appear as a complete catastrophe. From a more knowledge-technological estimation, it is, rather, a quantum leap. This strategic escalation has led to the fact that today a historically incredible line of succession holds sway. Living beings transmitted their hereditary information further and further, until millions of years later a mutation interrupted them. Cultures transmitted acquired, and thus not quite hereditary, information ever further with the help of their storage media, until centuries later a technical innovation revolutionized the storage media themselves. Computers, on the other hand, make it truly possible to optimize storage and transmission in all their parameters for the first time. As a legacy of the Cold War, which coupled the mathematical problems of data processing with the telecommunication problems of data transmission, they have produced rates of innovation which irrevocably surpass those of nature and cultures. Computing capacities of computer generations double, not over the course of millions of years, and not over hundreds of years, but every eighteen months (according to Moore's so-called law, empirical and as yet only confirmed). It is an implementation of knowledge which has already surpassed every attempt at its retelling. Nevertheless, three points can perhaps be emphasized. First, all the man-years of engineering work possible will no longer suffice for the designing of new computer architectures. Only the machines of the most up-to-date generation are at all capable of sketching out the hardware of the coming generations as a circuit diagram or transistor design. Second, all of the hardware to which such designs refer is further stored in software libraries, which themselves indicate or display not merely its electronic data and boundaries but even its production process. Technical drawing is no longer a drifting abstraction, as once in printed books, referring to devices whose possibility or impossibility (in the case of the perpetual motion machine) must first be proven in the process of building. It now indistinguishably coincides with a machine which itself is a technical drawing, in microscopic layers of silicon and silicon dioxide. However, third and lastly, the hardware of today thereby brings together two previously separated knowledge systems: media technology and the library. On the one hand, computer hardware functions like a library, making possible the storage and retrieval of data under definable addresses. On the other hand, it makes possible the same mathematical operations with these data that have been part of technical analog media since the 19th century, operations which, however, have fundamentally vanished from traditional libraries. From this combination, the management of knowledge derives a double gain in efficiency. To the same extent that the analog media appear one after another in the Universal Discrete Machine, their former chaos also falls under an ordering of universal addressing which first truly enables the knowing of images or sounds. Or the other way round, to the degree that it appears in binary code, writing gains the enormous power to do what it says. It is no accident that what we call in ordinary speech a statement is called, in programming language, a command. Whatever technical drawing simply puts before the eyes effectively takes place.
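To make the last point tangible - in a minimal sketch, written in Python, whose file name and quoted line are invented for illustration and claim nothing beyond running - storage under a definable address and writing that does what it says lie only a few lines apart:

    # a dictionary standing in for the library: storage and retrieval under definable addresses
    library = {}
    library["faust_i.txt"] = "Habe nun, ach! Philosophie ..."   # store under an address
    page = library["faust_i.txt"]                               # retrieve by the same address

    # the line below does not describe printing; once executed, it prints -
    # what ordinary speech calls a statement, the programming language treats as a command
    print(page)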

3

It is possible that from this short sketch, which does not even come close to doing justice to the complexity of today's hardware, the vast migration which knowledge has experienced and will yet continue to experience does indeed emerge. Michael Giesecke, in his study on book printing of the Early Modern era, was able to use the triumphal procession of electronic information technology as a methodological model in order to estimate Gutenberg's leap of innovation quantitatively. On the other hand, such a process does not work in reverse. No past leap of innovation can provide the measure for that which is currently occurring. If so-called intellectual work on the one side and its objects of study on the other are as a whole transferred to machines, the self-definition of European modernity, understanding thought as an attribute of subjectivity, is at stake. This is not the time or place to discuss in detail the results of this occurrence for a society which blithely banishes machines and programs out of its consciousness and must be immediately retrained. Because it is about implemented knowledge, and not implemented strategy, the results of that migration for universities as institutionalized places of knowledge remain urgent. At first glance, there are reasons for the university to be satisfied. First of all, the principal circuit diagram of the Universal Discrete Machine appeared in an unprepossessing dissertation which counted human beings and machines, regardless of any differences, as paper machines. Secondly, the implementation of this simple and useless paper machine, first put into operation using tubes, later with transistors, also took place at that elite American university which decided the Second World War as a sorcerer's war. Thirdly, the circumstances of this birth already ensured that the Pentagon, in order to be equipped for the case of an atomic attack, not only dispersed its command centers over numerous states, but also had to link to them the elite colleges from which the hardware and software employed had first originated. Long before the Internet was promoted as the utopia of radical democrats and the delight of features editors, it was already a university postal system in precisely the historical sense of the Early Modern coupling of state and university postal systems, such as in the France of Henry III. The difference being that in the Internet, in defiance of all those utopias, scholars do not exchange their findings or documents, but computers transmit their bits and bytes. (Which is not even to speak of the radical democratic forums of discussion.) Every knowledge system has its corresponding medium of transmission, which is why the electronic networks are best understood first of all as the emanation of the silicon hardware itself, as the planetary expansion and spread of - of all things - the epitome of miniaturized technology. In this respect, universities had better chances under high-tech conditions precisely because their origins are older, more mobile, and more integrated than those of territorial or national states. It is precisely their proximity to computer technology, however, which makes it difficult for universities to equip themselves. Wholly apart from the economic shifts which, in the meantime, have made the design of new hardware generations into a billion-dollar business for a few companies, established academic knowledge, along with its implementation, also has theoretical deficiencies.
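The "paper machine" of that dissertation can be indicated in a few lines - a minimal sketch, in Python, whose rule table is a toy example of my own and not Turing's: a finite table of rules read against a tape of symbols, indifferent to whether a clerk with pencil and eraser or a circuit carries them out.

    # (state, symbol read) -> (symbol written, head movement, next state)
    rules = {
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", " "): (" ",  0, "halt"),
    }

    tape, head, state = list("0110 "), 0, "start"
    while state != "halt":
        symbol, move, state = rules[(state, tape[head])]
        tape[head] = symbol     # write the symbol
        head += move            # move the head
    print("".join(tape))        # prints "1001 ": the input, inverted bit by bit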
In the pattern of the four faculties which still survives its many reformers, there was from the very beginning no place for media technologies, which explicitly arose out of the modern alphabet and number systems. For this reason, technical knowledge, after a long path through royal societies, royal academies and military engineering schools, all of which circumvented the universities, finally reached the technical colleges, the prototypes of which at the time of the French Revolution were not accidentally called schools for powder and saltpeter. This odor of sulphur frightened the old universities so much that they wanted to refuse the technical colleges the right to confer doctoral degrees. And it was only the life's work of the great mathematician Felix Klein, who compensated for his extinguished genius with organizational talent, which in the German Reich prevented science and technology, universities and schools of engineering, from taking fully separate ways. In the garden of the Mathematical Institute at Goettingen there appeared, as the first physics laboratory in the history of German universities, a couple of cheap sheds, out of which emerged all of quantum mechanics and the atomic bombs. David Hilbert, Klein's successor to the professorship, was thus doubly refuted. His theory that no hostility exists between mathematicians and engineers simply because there is no relationship between them at all was overshadowed by world developments, and his hypothesis that all mathematical problems can be decided was pushed aside by Alan Turing's computer prototype. Since then, all knowledge, even the mathematician's most abstract, is technically implemented. If "the 19th century," to use Nietzsche's wicked phrasing, was a "victory of the scientific method over science," then our century will be the one that saw the victory of scientific technology over science. In exactly this way, years ago, the physicist Peter Mittelstaedt described it as the state of the art, though not without experiencing the passionate animosity of his colleagues. Even in the 19th century, according to Mittelstaedt, every experimental scientist worked like a transcendental apperception, in the Kantian sense, incarnate. The data of the sensory impression (to stay with Kant's phraseology) flowed to the senses, whereupon the understanding and the faculty of judgement could synthesize this flow of data into a generally valid natural law. In contrast, today's experimental physics claims that stochastic processes which occur far beneath any threshold of perception are received, first of all, by sensors which digitize them and transmit them to high-performance computers. What the physicist achieves, finally, with this human-machine interface, is scarcely "nature" anymore, but a "system of information," the "ordering" and mathematical modelling of which has itself been taken over by computer technology. The result of this is Mittelstaedt's compelling conclusion that transcendental apperception, also referred to as knowledge, has simply abdicated. With this abdication, physics, in part because with solid-state physics it made possible the hardware of today, really takes on merely the role of a forerunner. If the spirit of the philosophers itself, in Hegel's great words, is "only as deep as it dares to spread and to lose itself in its interpretation," and if this explicit interpretation would be unthinkable without a storage medium, then the formerly so-called humanities (Geisteswissenschaften) are no less affected.
The fact that they show a readiness to drop their old name and in its place to take on the name of cultural sciences (Kulturwissenschaften) appears to encompass a renunciation of transcendental apperception, namely the equally hermeneutic and recursive "knowing of that which is known." Cultural science, in case this term doesn't remain a fashionable word, can surely only mean that the facts which make up integral cultures, the investigation of which is therefore its task, are in and of themselves technologies; they are, furthermore - in the harsh words of Marcel Mauss - cultural technologies. When texts, images, and sounds are no longer considered the impulses of brilliant individuals but are seen as the output of historically specified writing, reading, and computing technologies, much will already have been gained. Only when the cultural sciences, over and above this, begin to use contemporary algorithms to coordinate all the writing, reading and computing that history has seen will they have proved the truth of their renaming. The legacy of these times is certainly to be found not only in archives and data records, which every age inherits, but also in those which it passes down to coming generations. If the knowledge that is handed down, then, does not become recoded and made compatible with the universal medium of the computer, it will be threatened by a foreseeable oblivion. It is quite possible that Goethe, that totem animal of all the German literary sciences, has long since ceased to be at home in Weimar archives, but has taken residence at the American university which has most exhaustively scanned in his writings - an institution that, not for nothing, was founded by Mormons, and so for the eternity of the resurrected. The apocatastasis panton need not hurry, as silicon-based calculation and transmission still lack sufficient storage. Even now, physical parameters are not capable of authenticating the event of the recording per se. That which is valid for archives and storage facilities is, for that reason, all the more valid for the knowledge technologies and categories. In Gutenberg's time there were French monasteries in which handwriting was so deeply rooted that they searched through all three hundred copies of their first printed missal book for copying mistakes. In Fichte's time, and much to his derision, there were professors whose lectures would "re-compose the world's store of book knowledge" although it was clearly to be found "already printed before the eyes of everyone." Knowledge practices which even today, in computer illiteracy, adhere to book knowledge and misuse a technology that sits on every writing desk as merely a better kind of typewriter are no less anachronistic. Indeed, even the lectures in video conferences and Internet seminars, currently being attempted in many places, presumably bring necessary but still insufficient changes. Only when the categories which are implemented in computers, meaning the algorithms and data structures, are put to use as guidelines for - precisely - culture-scientific research will their relationship to the hard sciences be anything more than the shock-absorber or compensation for the evil results of technology that has been favored since the time of Odo Marquard. The unique opportunity to bridge the chasm between the two cultures stems from technology itself.
For the first time since the differentiation of libraries and laboratories, the natural sciences again work, insofar as they have become technical sciences, in one and the same medium as the cultural sciences. Soon, the network of machines will have filed texts and formulas, past and future projects, catalogues and hardware libraries in a uniform format under uniform addresses. If, from that point, the university succeeds in articulating the cultural and natural sciences with one another, it will have a future.

4

This articulation, perhaps, can be expressed with the formula that the cultural sciences will no longer be able to exclude calculation in the name of their timeless truth, and the natural sciences will no longer be able to exclude memory in the name of their timeless logic or efficiency. They must learn from one another in ways that are precisely reversed: the one to make use of calculation, the other of memory. Only if that which is to be passed down historically is so formalized that it remains capable of being handed down even under high-tech conditions does it produce an archive of possibilities which may be able to claim, in its great variety, no less a protection of species than that of plants or animals. The other way round, the technological implementations in which formerly so-called nature crystallizes are more than ever in danger of forgetting, along with their origins, their reason for being. Even now there are vast quantities of data which are simply unreadable because the computers which once wrote them can no longer be made to run. Without memory - and this means without a history which also explicitly places machines under the protection of species - the legacy of this time in history, then, cannot be passed on to the coming generations. Only when the natural sciences stop dismissing their history as that of a mere forerunner will that same history begin to appear as a scattering of alternatives. The fact that even Stanford University is preparing to collect the half-forgotten private archives of all the Silicon Valley companies could very soon have a rescuing effect - if not for human lives, then certainly for programs upon which human lives (not only in the Airbus) increasingly depend. The historicity of technologies does not encompass, but rather excludes, sticking to the saddest legacy of all so-called intellectual history. Knowledge can exist without the copyright. When Goethe, in January of 1825, urged upon a "high" German "national assembly" the "favorable conclusion" that he be able "to draw mercantile advantage" "from his intellectual production" "for himself and those of his dependents," a privatization was set in motion that in the meantime has spread even to formulas and equations. Gene-technological and related computer-supported procedures are patented, while the currently fastest prime number algorithm - in contrast with four centuries of free mathematics - remains an operational secret of the Pentagon. Turing's proof that everything which humans can compute can also be taken over by machines has up to now had little effect on an economy of knowledge which, not only to the detriment of its transmission capacity, systematically disables more than just the universities. Clearly, our inherited ideas are a long way from reaching the level of today's hardware, the manufacturing equipment of which costs billions, and the manufacturing price of which, in contrast, crashes downward. It can be expected of hardware, and only of hardware, that it will one day drive out the apparition of the copyright. That, however, is bitterly necessary. All of the myths that are constantly conjured up, which like the copyright or creativity define knowledge as the immaterial act of a subject, as the software of a wetware, do nothing more than hinder its implementation. It may be the case that, in past times when the infrastructure of knowledge lay in books, they even had a function.
Jean Paul's brilliant but dirt-poor Wutz, in any case, who could not afford to pay for any books, could write his library himself. Today such ruses would be condemned to failure. Computer technology offers not merely an infrastructure for knowledge, which could be replaced by other, more costly or time-consuming procedures. Rather, computer technology provides a hardware whose efficiency itself earns the name software compatibility. It is, then, in contrast to all the current theories which have only pictured technology as a prosthesis or tool, an inevitability. This may not please nation states and scientists. The doctrine, particularly favored in Germany, that communicative reason, formerly also called the peace of God, is higher than the instrumental, in the end costs much less. It is probably for this reason that the siren songs of a discourse theory which has no concepts at all of time and archive meet such open ears in high offices. As places of communicative reason, universities would not have the slightest need for hardware. They would get along with just that garden on the north edge of Athens, where Plato once dropped the seed of all higher things in the soil of his young boys. The short history of European universities should have shown, on the other hand, that knowledge is not to be had without technology, and that technology is not to be reduced to instruments. Moreover, the anonymity of knowledge, for which Alan Turing gave his life, makes it ever more impossible to decide whether major states will continue as before to be responsible for knowledge institutions such as universities. One thing is certain, however: it will be decided, regarding the legacy of this time, who set up which hardware when.

--- # distributed via nettime-l : no commercial use without permission