A Short History of Computing

1. Slouching to Bethlehem

In 1816, on the shores of Lake Geneva, Mary Shelley had a nightmare about a soulless artificial monster given shape and identity by the ingenuity of science. It was a creation that embodied everything antithetical to the romanticism she and her circle so fervently believed in, a spirit of passion, sublimity, intuitiveness and the elevation of the human spirit. It has proved an enduring vision, achieving more longevity in the public’s imagination than any work by her contemporaries.

But there are other dreams, other visions. In 1802 a young boy was taken by his mother to see the work of an engineer who called himself Merlin, a maker of the delicate and intricate automata so popular at the turn of the century. One of these figures, about twelve inches high and made of silver, was of a dancer with a bird on her finger who “attitudinised in a most fascinating manner. Her eyes were full of imagination, and irresistible.”

So struck was Charles Babbage by the figure that, when he grew up, he bought it and installed it in his salon as a conversation piece, where it continued to draw murmurs of admiration from his guests.

If these examples typify the two most human reactions to science’s sublimation of nature, repulsion and fascination, then the history of computers is a replay of such encounters on a gathering breadth and scale. As the issues and paradoxes associated with each encounter have multiplied, so those primal reactions have intensified. Every fresh invasion of our lives by advancing technologies elicits the same mixed responses and the same drawing of lines by the current crop of philosophers, moralists and savants. Just as inevitably, the repercussions are unpredictable: evading, and yet exceeding, the most apocalyptic scenarios.

As for the monster and the lady, in 1927 the German filmmaker Fritz Lang fused these visions in “Metropolis”. But that was after a century of industrialisation and the entry of a new word into our language: “robot”, derived from the Czech term for forced labour.

2. Steam Intellect

If the invention of early mechanical computational devices was not a localised phenomenon (Pascal in France produced such a device as early as 1642, and Leibniz in Germany followed some three decades later), it was driven by a common factor: the desire to automate grinding and “mechanical” mathematical chores. In essence this is the objective that has propelled the development of computers ever since. The precept was defined by Arthur C. Clarke: “if any intellectual activity can be precisely described, then a machine can, in principle, be designed to carry it out.”

It implies two tasks: firstly, an accurate description of a problem; secondly, a design solution to match it. The first is a necessary precursor of the second, and that is why most “quantum leaps” in computer design have come from the inspired input of brilliant mathematicians (e.g. Turing and von Neumann) into the work of gifted engineers (e.g. Eckert and Mauchly).

So it was a commonplace frustration that in 1812 found expression during a laborious task being performed by two students at Cambridge. One of these, Charles Babbage, a brilliant young mathematician, was checking a lengthy table of calculations for errors when he exclaimed in exasperation: “I wish to God these calculations had been executed by steam!” At that moment, so legend has it, a life-long obsession was born.

Nineteenth-century England was a significant place for such obsessions in more senses than one. The rise of national industrial capitalism had transfigured both the physical and psychological landscape in an unprecedented fashion. Science in particular was achieving a new status that eschewed the rarefied climate of 18th-century enlightenment for a Samuel Smiles pragmatism that bent to serve the increasing demands and pressures of privatised industrial expansion.

As capitalism took on more active forms and an era of cheap mass production was ushered in, the politics of functionality worked changes upon the social fabric that would gather impetus in the century that followed. The most obvious outward sign of this was a new spirit of egalitarianism founded upon the rapid commercial and imperial expansion taking place.

(Less visible was a psychic dislocation produced by this process: one in which the public life of individuals acquired an anonymous “respectability” while the private world of the family became a refuge from the factory. For some theorists, notably Richard Sennett, this is where the narcissism and alienation of contemporary culture are rooted.)

Babbage, in many ways, exemplified his age. Born into an affluent merchant family, he was to become a member of a new meritocracy that would eventually eclipse the power of the landowning classes. Essentially a moneyed “dabbler” in the fields of science and technology, he nevertheless possessed extraordinary gifts and a restless entrepreneurial spirit.

As a child he almost drowned when testing an invention for walking on water. As an adult, he once walked into an oven to test the effects of a temperature of 265 degrees Fahrenheit on the human body. However, he also made highly significant contributions towards the development of the railway and postal service, and his original inventions included the ophthalmoscope and the occulting lighthouse.

But the invention that was to occupy most of his working life was first presented to the Royal Astronomical Society in 1822. Babbage’s idea was to build a machine that performed more complex calculations than any predecessor and printed out the results at the end. He called it the Difference Engine. The Royal Society was impressed and the Exchequer agreed to give him £1,500 towards its development.
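As an illustrative aside (not part of the original account), the Engine’s name points to its working principle, the method of finite differences: once a table’s opening value and differences have been set up, every further entry can be produced by addition alone, exactly the sort of grinding chore Babbage wished to see done “by steam”. The following minimal Python sketch of that principle uses an invented helper, difference_table, and an arbitrary cubic chosen purely for illustration.

```python
# Illustrative sketch of the method of finite differences, the principle
# behind Babbage's Difference Engine: once the initial value and differences
# are set up, every further table entry needs only additions.

def difference_table(f, start, count, order):
    """Tabulate f at start, start+1, ... using repeated addition only."""
    # Seed the machine: the initial value and the initial differences of each
    # order, computed once (in Babbage's scheme, worked out by hand beforehand).
    seed = [f(start + i) for i in range(order + 1)]
    diffs = []
    row = seed[:]
    for _ in range(order + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # From here on, only addition is used -- the "steam" part of the job.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(order):
            diffs[i] += diffs[i + 1]
    return table

# Example: tabulate a cubic (an order-3 polynomial), a typical target table.
poly = lambda x: x**3 + 2*x + 1
print(difference_table(poly, start=0, count=8, order=3))
# -> [1, 4, 13, 34, 73, 136, 229, 358], matching poly(0)..poly(7)
```

The appeal of the design is that addition is easy to mechanise; the intellectual labour is confined to setting up the initial differences, after which the machine can simply grind.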

Encouraged, Babbage set to work, but he had vastly underestimated the resources he would need, the time it would take, and the limitations of the still-developing technology of precision engineering. In the event, he used up vast amounts of his own capital, became estranged from the scientific establishment, and was refused sufficient funds by the government to complete his work.

The factor that finally scuppered his project, however, was a breakdown in labour relations. His workforce consisted of mechanical engineers, veterans of the cotton industry, and they saw themselves not as mere employees but as skilled artisans with an “equal” stake in what was still essentially artisanal labour. Thus, when funds ran short, his chief engineer walked out, taking the tools that Babbage had designed with him.

In a post-Fordist era this may seem a peculiar mind-set, but it illustrates a grey area of demarcation that then (temporarily) existed in the relationship between capital and labour. Babbage’s response was to publish a book, “On the Economy of Machinery and Manufactures”, which sought, by means of an early “time and motion study”, to categorise the functions and roles involved in the manufacturing process.

This was, probably unintentionally, a dangerously radical piece of writing that provided an influential model for both proto-Marxist and Fordist thinking. The most radical aspect of it was the distinction Babbage made between “intellectual” and “physical” labour, coupled with the idea of a management hierarchy. It might even be said that Babbage had predicted the advent of a “middle management class” whose descendants would use the computer as their primary tool.

The irony is that this same “modular approach” was the chief virtue of Babbage’s successor machine, the Analytical Engine, something he had started to work on while the other was still in its death throes. This was to be far more powerful than its predecessor and to employ an architecture involving a “memory” and “processor” that uncannily predicted the internal hardware structure of a modern computer. Amongst its many innovations was the concept of “operation cards”, literally punched cards, which fed instructions to the processor and directed the movement of numbers to and from the memory.
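To make the architectural parallel concrete, here is a deliberately toy sketch, in Python, of a card-driven machine with a separate “store” (memory) and “mill” (processor), Babbage’s own terms for the two parts. The card format, variable names and mill operations below are invented for illustration and are not a reconstruction of Babbage’s actual card system.

```python
# Toy model of the Analytical Engine's separation of "store" (memory) and
# "mill" (processor), driven by a sequence of operation cards.
# The card format and names here are invented for illustration only.

store = {"V0": 7, "V1": 5, "V2": 0}   # the store: numbered variable columns

def mill(op, a, b):
    """The mill: performs one arithmetic operation on two values."""
    return {"ADD": a + b, "SUB": a - b, "MUL": a * b}[op]

# A chain of operation cards: (operation, source column, source column, destination)
cards = [
    ("MUL", "V0", "V1", "V2"),   # V2 <- V0 * V1
    ("ADD", "V2", "V0", "V2"),   # V2 <- V2 + V0
]

for op, src1, src2, dest in cards:
    store[dest] = mill(op, store[src1], store[src2])   # fetch, execute, write back

print(store["V2"])   # -> 42
```

Reading a card, fetching operands from the store, executing in the mill and writing the result back is, in miniature, the fetch-and-execute cycle of a modern processor, which is why the Analytical Engine is so often described as a computer a century ahead of its time.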

In 1827 Babbage lost his wife and two of his children to illness. He was an isolated figure; few people were able to understand, let alone appreciate, his ideas. In 1833, however, he was to find an unexpected ally in the figure of a young woman as prodigious in mathematics as himself.

Ada Byron was the daughter of Lord Byron, the proposer of the story-writing contest that gave birth to “Frankenstein”, so there was an appropriate symmetry in their meeting. Ada’s gift for numbers was equal to her father’s gift for words, and she devoted her short life to helping Babbage in the development of his second machine. She was largely responsible for the perfection and application of his punched-card system, in effect making her the first computer programmer in history.

Despite her help, Babbage died without completing either of his machines (although working versions of them were produced later by other hands). Famous in his day, he was mostly forgotten for over half a century until Ada’s writings brought him back into the public eye. Many of his discoveries, as we will see, had to be rediscovered by others, his originality as a precursor becoming the real source of his stature in the history of computing.

This article continues on subsequent pages

© 1998 David Clough

