Computing Versus Computers – a book review

By Jim Scheef

The book being reviewed is The Universal History of Computing: From the Abacus to the Quantum Computer, by Georges Ifrah, 2001, published by John Wiley & Sons, Inc. That title is more than just a mouthful; it is a real clue that this is not your ordinary computer history book. This is an academic work written in the "English style," organized in disjointed blocks that might make sense to the author but are hard for the reader to follow. Before we go much further, let me tell you that this is not a book you are likely to read. First, it's impossible to find in Borders or any top-tier bookstore (though it is available online). Second, it is awkward to read, and last, it's not really about computers!

So what is it about and why am I writing this review? Because I found the concepts presented to be fascinating and I plan to tell you everything you need to know right here. What I present here are facts as contained in the book. Certainly, I have not independently verified any of this, so if you think something is wrong, I suggest that you get the book and check the source identified by the author. The bibliography takes 16 pages.

The book doesn't start with the integrated circuit, or the transistor, or even, as the title suggests, with the abacus. No, this book starts with the invention of writing and numbers! Skipping the first 20-30 thousand years of this process, we arrive at the first number system, developed in the 4th millennium BCE by "the people of Sumer" (the Sumerians). This was an oral system using base 60, passed on to us by the Babylonians, the Greeks, and the Arabs. We still use it today for minutes and seconds of time and angles. Near the end of the 3rd millennium BCE, the Semites of Mesopotamia adopted a cuneiform decimal notation. Thank God that all humans have ten fingers, because fingers are very handy for counting, so a decimal, base-10 number system does seem a natural byproduct. The people of Sumer developed the abacus during this era, abandoning an older system of rods.
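If you want to see that base-60 inheritance in action, here is a tiny Python sketch (my own illustration, not anything from the book) that splits a count of seconds into the hours-minutes-seconds positions we still use:

```python
def to_sexagesimal(total_seconds):
    """Express a count of seconds in base-60 positions: hours, minutes, seconds."""
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds

print(to_sexagesimal(7384))  # (2, 3, 4) -> 2 h 3 min 4 s
```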

By the end of the next millennium, a positional number notation had developed. Positional notation is one of the fundamental concepts that enable everything else. The other really fundamental concept is zero. The first known use of a zero was in Babylonia (where else?) in the 3rd century BCE; however, they did not recognize it as a number. It took many hundreds of years before mathematicians in India pulled together a numerical notation using nine digits plus the zero to hold empty places. This occurred from 400 to 900 CE! In other words, it happened while the Romans were hobbled by their arcane system of numerals (which, for some inexplicable reason, we retain for large clocks, cornerstones of buildings, and movie copyrights). The Arabs, a nomadic people engaged in trade, recognized a superior system when they saw it and spread the system we call "Arabic numerals" from India to the western world. During this period, the Indian number system also spread to Southeast Asia. In the twelfth century, Arabic numerals with the zero were introduced in Europe, where the new system was vigorously resisted. In 1654, Blaise Pascal defined a general number system to any base m, where m is greater than or equal to 2.
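To make the positional idea concrete, here is a minimal Python sketch (my own illustration, not the book's) that writes an integer in Pascal's general base m, for any m of at least 2, with zero marking the empty positions:

```python
def to_base(n, m):
    """Write a non-negative integer n in positional notation to base m (m >= 2),
    returning its digits most-significant first."""
    if m < 2:
        raise ValueError("the base must be at least 2")
    if n == 0:
        return [0]
    digits = []
    while n:
        n, r = divmod(n, m)   # peel off the least-significant digit
        digits.append(r)
    return digits[::-1]

print(to_base(1999, 10))  # [1, 9, 9, 9]
print(to_base(1999, 60))  # [33, 19]  -> 33*60 + 19
print(to_base(1999, 2))   # [1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1]
```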

From here the book moves into the history of algebra, calculus, and the developments that eventually made computers possible. For example, in 1679 Leibniz proposed a binary calculating machine using moving balls; he never built it. The book then shifts gears and discusses the development of all sorts of automata, machines ranging from clockworks to mechanical animals, starting in ancient Greece and evolving in Renaissance Europe. People all over the world realized that calculation was an arduous task fraught with error, and just about any aid to this process was welcomed. So we get to examine the progress in everything from clocks to electricity. Remember, many analog computers (calculators, really) were powered by electric motors, and Babbage's goal was to power calculation with steam. The earliest "supercomputers" were people; calculating prodigies have been known throughout history. In the modern era, calculating tools began with Napier's Bones, a set of ten rods calibrated to form multiplication tables. Of course, these and similar tools were aids, not mechanical calculators.
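Out of curiosity, here is a short Python sketch (my own reconstruction, not from the book) of how Napier's Bones work: each rod carries the multiples of one digit, split into a tens part and a units part, and a product by a single digit is read off by summing along the diagonals of the laid-out rods.

```python
def bone(d):
    """One Napier rod: the multiples 1*d .. 9*d, each split into (tens, units)."""
    return [divmod(d * k, 10) for k in range(1, 10)]

def multiply_by_digit(n, k):
    """Multiply n by a single digit k the way the rods are read:
    lay out one rod per digit of n, take row k, sum along the diagonals."""
    digits = [int(c) for c in str(n)]
    row = [bone(d)[k - 1] for d in digits]        # the (tens, units) cell on each rod
    result, carry, prev_tens = [], 0, 0
    for tens, units in reversed(row):             # right to left, one diagonal at a time
        s = units + prev_tens + carry             # this cell's units + neighbor's tens
        result.append(s % 10)
        carry = s // 10
        prev_tens = tens
    s = prev_tens + carry                         # leftmost diagonal
    if s:
        result.append(s)
    return int("".join(str(d) for d in reversed(result)))

print(multiply_by_digit(425, 6))  # 2550
```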

Why did it take so long to develop mechanical calculators? Well, ask Babbage, who tried in 1834 to design and build his "difference engine." This device was intended to automate the production of mathematical tables such as logarithms. Babbage needed hundreds of identical gears, and in the 1830's, and even into the 1850's, there was no way to produce them accurately in sufficient quantities; the accurate machine tools needed did not exist until later in the industrial revolution. Of course, there were other impediments going back into antiquity. Religion has been a force on both sides: the Arabs needed to determine which way to face Mecca, so calculation was important; on the other hand, we all know how well the Church took Galileo's news!

So let's get on with it. Mechanical calculators developed first to count and compute money. The first widely used calculating devices were tables used by money changers from before Roman times into the Renaissance. The tables were marked with aids to help the person doing the computing. Since few citizens could even count, accuracy was incidental. The first mechanical calculating machine was constructed in 1623 by Wilhelm Schickard, a German astronomer. Destroyed by fire only a year later, the machine he called a "calculating clock" had no impact. The book then covers Pascal, Leibniz, and others who did matter. As business developed, the need for computation led to the rise of the computer, the job title, of course. From the 18th century right through WWII, these were the people who performed the calculations needed by commercial businesses. As the cost of this labor became significant, businesses sought ways to make computation more efficient. The modern era saw cash registers become a key segment of an industry that began in the late 1800's alongside adding machines. Eventually came the mechanical desk calculators from Monroe and Burroughs that we all remember in offices, from the 1920's until four-function electronic calculators took over in the 1970's.

Now the book really falls apart and becomes totally unreadable as the author tries to cover the development of computing devices into modern electronic computers. If you are interested in computer history, there are many better books, like A History of Modern Computing, 2nd Edition, by Paul E. Ceruzzi, or Computer: A History of the Information Machine, by Martin Campbell-Kelly and William Aspray. For Internet history, read Where Wizards Stay Up Late, by Katie Hafner and Matthew Lyon. There are also many good books on the personal computer industry, like Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can't Get a Date, by Robert X. Cringely. Happy reading!


© Danbury Area Computer Society, Inc. All Rights Reserved.