About a success that didn’t want to be one

By Dr. Lorenz Steinke
A wooden mouse, steam power, music boxes, and a vacuum cleaner motor – the history of the computer on its way to becoming an integral component of modern life is brimming with unusual inventions and turning points – and with predictions that never came true.
© ClassicStock/Alamy

How do you tell a story that has no clear beginning and is far from having ended?

Did tinkerer Herman Hollerith mark the beginning of the story? The mining engineer and son of German immigrants was employed by the U.S. Census Bureau in the 1880s. Because the work there involved a great many repetitive, identical calculations, Hollerith designed a machine that was supposed to automate the recording and storing of demographic data and, above all, speed up the tabulation. He used punched cards as the storage medium for his invention.

That idea had been inspired by the mechanical Jacquard looms, which received their weaving patterns from punched continuous tapes. They, in turn, were modeled on music boxes that “store” tunes on small cylindrical rollers. In France, Jacquard looms were initially controversial because weavers were afraid of losing their jobs to the machines. Hollerith, however, recognized and exploited the great potential of punched cards.

In 1890, Hollerith machines – their inventor had since gone into business for himself – enabled the U.S. Census Bureau to count the entire country in just three years for the first time. Up until the turn of the millennium, some U.S. voting machines were still operating with standardized 80-column punched cards. Did that mark the birth of digital data processing?

Or did the history of the computer start 50 years earlier with Charles Babbage? He devised a steam-powered computer for which the brilliant mathematician Ada Lovelace developed computing programs – both of them working purely on paper, because nobody was able to build such a machine at the time. Even so, that Analytical Engine – as we know today – would in fact have worked. Many computer pioneers would subsequently adopt ideas from the Babbage machine, and a programming language was later named after Ada Lovelace.

5,000 computing operations per second: that’s what the ENIAC computer, weighing 27 metric tons (30 short tons), achieved in the 1940s. Today’s mainframes manage 2.8 quadrillion operations per second.

© Science History Images/Alamy

Yet what would the history of computerization be without Konrad Zuse? In 1937, he developed the Z1, the first functional binary computer which, like all modern computers, operated strictly with zeros and ones. He received the money for his development from a benevolent factory owner – even though the latter told Zuse that everything in that field had in fact already been invented.

Just like today’s microprocessors, the Z1 had an input and output device, an arithmetic unit, and a memory – only a million times bigger in terms of the space it required, and built with mechanical metal slides that would often jam. The whole machine was powered by a vacuum cleaner motor.

The battle against bugs

The actual miniaturization of computers took place on the other side of the Atlantic. Early calculating machines such as the ENIAC, built for the U.S. Army starting in 1942, still used fist-sized electron tubes in their arithmetic units. Some 20,000 of them were embedded in the bowels of the mammoth machine – and every day some of them had to be replaced. Just keeping it at operating temperature cost the behemoth, which weighed 27 metric tons (30 short tons), 150 kilowatts – nearly 1,000 times the power a current office computer consumes – while it performed just 5,000 computing operations per second.
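
That factor of 1,000 is easy to verify, assuming – as a rough illustrative figure, not one taken from this article – that a modern office computer draws about 150 watts:

\[
\frac{150\ \text{kW}}{150\ \text{W}} \;=\; \frac{150{,}000\ \text{W}}{150\ \text{W}} \;=\; 1{,}000
\]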

Calamitously, the heat attracted insects that would die in the switching circuits of those early computers, turning a zero into a one and vice versa. The first logbook entry about such a computer “bug” was made in 1947; the insect was taped in right next to it. Even so, “bug” isn’t an expression coined in the computer age: long before, Thomas Edison had already complained about bugs in his Morse telegraph.

All-purpose element silicon

In 1948, John Bardeen, Walter Brattain, and William Shockley at telephone giant AT&T’s Bell Labs invented the transistor to replace vacuum tubes, thus catapulting humanity – after the Stone, Bronze, and Iron Ages – into the Silicon Age. That incidentally solved the bug problem as well.

Silicon is a metalloid and could be called the Swiss Army knife among the elements. It can conduct current in one direction or the other, amplify the flow of electrons, or, as in the case of the transistor, act as an electronic on/off switch – depending on which tiny amounts of other elements it is doped with. Today, a single microprocessor contains more than 100 billion of those tiny switching transistors.

How Schaeffler became increasingly digital

As a company with an affinity for technology, Schaeffler recognized the potential of modern IT solutions early on and started becoming “digital” in the 1960s. As early as 1961, Schaeffler received its first electronic computer, called 14-01. It had an 8-kilobyte main memory, which made it a very powerful system in those days. Including card readers and high-speed printers, the leasing fee for the IT system amounted to DM 29,000 – per month. The next milestone, in 1969, was the opening of the newly erected INA data center, which, with its uninterruptible power supply, ionization early-warning system, and special fire-protection rooms for magnetic tapes, was the most modern facility on the market.

Today, Schaeffler is at home in all digital worlds – up to and including virtual applications. The most recent milestone is the integration of the innovative AI assistant “Siemens Industrial Copilot” on the factory floor. The AI solution, for instance, creates complex programming code for manufacturing processes, thus reducing the effort required of machine operators. In addition, the “copilot” has access to relevant documentation, guidelines, and handbooks to support employees in identifying potential sources of errors. The AI-supported assistant offers further potential in areas such as machine communication and validation.

  • The digital applications Schaeffler uses include AI-supported machine assistants © Schaeffler
  • The most advanced technology available: Schaeffler’s data centers in the 1960s ... © Schaeffler
  • ... and 1970s © Schaeffler

Herman Hollerith’s company – which, following several mergers, became known as the International Business Machines Corporation, or IBM for short – began selling its first transistor computers for civilian use in 1960. Just a few years earlier, IBM’s CEO Thomas Watson had purportedly stated that there was a world market for merely five computers, but the transistor changed all that.

Swedish top seller: The ABC 80 was one of the first home computers in the 1970s, but was also used in schools – which has contributed to Sweden’s strong affinity for all things digital to this day © Classic Picture Library/Alamy

Starting in the 1960s, insurance companies, utility providers, and commercial banks of high repute would establish at their headquarters one of those air-conditioned, neon-lit data centers in which IBM System/360 and /370 computers incessantly wrote data to huge magnetic tapes while specialists stood at dot matrix printers tearing sheets off continuous paper. Usable monitors had not yet been invented, and the paperless office was a distant dream.

An astronaut’s anxiety

Women can rarely be seen in pictures from those days, even though it was specialists like Katherine G. Johnson and Judith Love Cohen, the mother of actor Jack Black, who calculated the trajectories for the Mercury space program and the Apollo lunar mission on NASA computers. Astronaut John Glenn purportedly demanded that Johnson personally recalculate his data for orbiting the Earth. Only after she did would he climb into the space capsule.

  • Paul Allen (left) and Bill Gates were exposed to the programming language BASIC in computer courses at their school – and subsequently revolutionized the PC world with Microsoft © Lakeside School Archives
  • The “mouse” unveiled in 1963 would show its potential only 20 years later with the advent of graphic user interfaces © SRI International
  • 1972: Atari’s co-founder Ted Dabney made computer games popular with the tennis-like “Pong” © YouTube/The Game Scholar
  • As early as the 1960s, computers were mass-produced at IBM © RBM Vintage Images/Alamy

The first portable IBM computer, unveiled in 1975, was not nearly as great a success as the lunar landing. One reason may have been its weight of 25 kilos (55 lbs.). Yet the development was going in the right direction: computers became smaller and more powerful. As early as the 1960s, Intel co-founder Gordon Moore had predicted that the number of transistors on a chip would double every 12 to 18 months – a prediction since referred to as Moore’s Law.
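
Written as a formula (a sketch, taking the doubling period \(T\) at the upper end of that range, i.e. 18 months), a chip that starts with \(N_0\) transistors would hold

\[
N(t) \;=\; N_0 \cdot 2^{\,t/T}
\]

transistors after \(t\) months – roughly a hundredfold increase per decade, since \(2^{120/18} \approx 100\).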

Starting in the 1960s, mainframes were joined by so-called minicomputers such as the PDP models from DEC. The prefix “mini” should be seen as relative in that case. Those computers were still the size of closets and cost five- to six-digit sums. Only microcomputers would change that.

“I think there’s a world market for maybe five computers”

IBM’s CEO Thomas Watson, 1943

In 1981, following just one year of development time, IBM presented its “Personal Computer” – the VW Golf among computers, so to speak. The PC consisted of standard parts available on the market. Originally, it was merely intended to round out IBM’s portfolio at the lower end and whet customers’ appetite for more expensive product lines. Inside, a low-priced 8088 processor from Intel was at work – which is why IBM engineers would dub their company’s PC division the “toy department.” Even so, the machine went on to sell by the millions. IBM had forecast sales of merely 250,000 units – across five years!

Before computers ended up in office settings, their deployment on factory floors as control units for machines and robots resulted in major efficiency increases © Dino Fracchia/Alamy

While the transistor had rung in the computer age, the PC ushered in the computer’s democratization. Soon it was sitting on nearly every office desk. Many employees were worried about losing their jobs to the PC – data typists and stenographers were no longer needed. But computerization also created many new jobs and opened up new business fields and services. In the end, it was the companies that failed to keep pace with technological change – clinging to slide rules, manila envelopes, and dictaphones – that disappeared.

A lot changed for IBM as well. As late as the 1970s, the dictum of “IBM and the Seven Dwarfs” applied to the computer market. Now, competitors such as Dell, Compaq, Olivetti, and Wang secured market share with partly better, partly less expensive IBM “clones.”

No demand for private computers?

In addition to the successful IBM PC and its many replicas, there were other personal and home computers, including the Apple I, of which merely 200 units were produced. The predecessor of the globally successful Apple II sold for the whimsical price of 666.66 U.S. dollars – as a bare circuit board without a case. Today, any owner of one of the eight known functional units holds a classic that’s worth millions.

Computers in advertising – a historic journey
  • 10 MB for 3,398 U.S. dollars: In the 1980s, storage space was a pure luxury © Vendor
  • Recommended by Captain Kirk: Actor William Shatner praises the popular Commodore home computer as a “wonder” © Vendor
  • Laptop dino: In 1981, the Osborne 1, weighing eleven kilograms (24 lbs.), marked the market launch of the first commercially available portable computer © Vendor
  • Everything’s relative: IBM touts this IT ensemble as a “small computer.” From today’s perspective, that fits neither the size nor the price of the device © Vendor
  • It’s magic: Electronics corporation Honeywell promotes the then-new “electronic mail” and, by claiming that email will become an integral component of the automated offices of tomorrow, demonstrates considerable foresight © Vendor

As late as 1977, Ken Olsen, CEO of computer manufacturer DEC, had stated that he couldn’t imagine any private citizen wanting to own a computer. He, too, was proven wrong, and DEC would eventually disappear from the market. Soon, more and more home users could and would afford their own hardware – initially for the price of a second-hand car or, as an assembly kit, considerably less. The DIY computer “Altair 8800” sold for just 400 dollars and became a top seller among early nerds.

In the early 1980s, home computers such as the Commodore 64 and the Sinclair ZX Spectrum hit the market. They were slower than a PC but came with color graphics and sound on board – features PC owners had to retrofit at considerable expense. Cassette recorders served as data storage devices. A whole generation gained its first computer experience that way.

Without realizing it, IBM also relinquished control of its greatest asset with the PC: the software. A then still unknown, up-and-coming company called Microsoft was awarded the contract for the operating system. Without further ado, its founders, Bill Gates and Paul Allen, acquired the largely finished operating system of another developer and turned it into PC-DOS, which would subsequently become the basis for Windows as well.

The XY position indicator revolutionizes the computer world

Arguably, Windows would never have become a reality without another pioneer: Douglas Engelbart. As a soldier in the Second World War, the computer engineer from Portland had read about a fictitious computer for everyone named Memex (memory extender). Ever since, he had been looking for an intuitive human-machine interface with which such a computer could be operated. Engelbart considered punched cards a bad idea: laborious to punch and requiring costly read-in hardware. While conducting research at the Stanford Research Institute, he experimented with light pens, but holding them for long periods would tire the arms. Control using a toggle underneath the keyboard? Far too inaccurate.

Analog high tech in 1937: The data volume of the world’s then largest vertical file archive at the Prague Social Authority would easily have fit into a smartphone’s memory © rarehistoricalphotos

Finally, Engelbart turned to the planimeter – a wheel that traces and measures distances on a drawing board. He combined it with a second wheel placed at a right angle to the first and installed both in a wooden housing equipped with a button on top: the computer mouse was born – in 1963 still clumsily called an “X-Y position indicator for a display system.” The mouse achieved its breakthrough only when graphical user interfaces increasingly emerged starting in the 1980s: with the Apple Lisa, Windows, and GEOS. Engelbart was involved in that research as well.
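
In modern terms, the two wheels simply deliver the two coordinates of a relative movement. A minimal sketch of the idea (with an assumed wheel circumference \(c\), not a value from Engelbart’s design): if the horizontal wheel turns by \(t_x\) revolutions and the vertical wheel by \(t_y\), the pointer moves from \((x, y)\) to

\[
(x, y) \;\longrightarrow\; \bigl(x + c \cdot t_x,\; y + c \cdot t_y\bigr)
\]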

And then there was: the network of networks

Around the same time that Neil Armstrong set foot on the Moon in 1969, the internet’s predecessor Arpanet was launched, largely unnoticed by the global public – initially with only a handful of computers. Future generations may yet argue which of the two events was the more significant one. Twenty years later, Tim Berners-Lee gave the internet a dedicated language called HTML, thus creating the World Wide Web. Today, the network of networks links more than 20 billion machines. Two thirds of humanity are now online.

Less chaos, more output: This ad from the 1980s illustrates the change that resulted from the integration of PCs into daily work © Vendor

The more powerful computers and the internet have become, the more of the world has migrated into cyberspace – initially often in playful ways: the first webcam (1991) showed merely the fill level of a coffee machine, and the first email (1971) was sent across a distance of just three meters (10 feet).

Today, humans generate an incredible 150 zettabytes of data per year and store a major part of it online. Punched into Hollerith cards, the stack would reach from the Earth to the Sun more than one million times over. That’s rather remarkable for a technical revolution that once upon a time began with music boxes and steam power.
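
A rough plausibility check, assuming 80 bytes per card and a card thickness of about 0.18 millimeters (both assumptions, not figures from this article), with the Earth–Sun distance at roughly 150 million kilometers:

\[
\frac{150 \times 10^{21}\ \text{bytes}}{80\ \text{bytes per card}} \approx 1.9 \times 10^{21}\ \text{cards}, \qquad
1.9 \times 10^{21} \times 0.18\ \text{mm} \approx 3.4 \times 10^{14}\ \text{km} \approx 2.3 \times 10^{6} \times 1.5 \times 10^{8}\ \text{km}
\]

That is more than two million Earth–Sun distances – comfortably above the “more than one million times” in the text.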

Author: Dr. Lorenz Steinke
As an IT journalist, Dr. Lorenz Steinke has been following the evolution of the computer industry for many years. The computer he used at university in the early 1990s was a second-hand PC – or rather a clone made by another manufacturer – with a then impressive 40 megabytes of hard drive space.