A marvellous device of many applications: A short history of the computer (Book Review)

IANS
Last Updated : Dec 09 2017 | 11:40 AM IST

Title: Turing and the Universal Machine; Author: Jon Agar; Publisher: Icon Science/Icon Books; Pages: 141; Price: Rs 399

In our workplaces, our homes, in the smartphones we flash around and everywhere else, computers, in all shapes and sizes, have become such an integral and ubiquitous part of our lives that some cannot visualise a world without them. But do we ever wonder what compulsions and leaps in human intellect and ingenuity led to their genesis?

Or for that matter, why businesses and bureaucracies find them so useful, or how their concept and working relate to the organisation of human society?

Providing answers to these and other pertinent but perplexing questions, in this installment of Icon Science's special series chronicling a dozen ground-breaking moments in scientific history, author Jon Agar shows how and why the development of computers qualifies among them.

Agar, Professor of Science and Technology Studies at University College London, begins his exposition of the computer's complexity -- and versatility -- with a comparison to the handy Swiss Army knife, the lawn-mower and even a sharpened stick, to show how it is the "universal machine" with diverse applications.

But that is not the issue, for "computers present a strange case in the history of technology".

"They are machines of apparently limitless activity, yet they are also the drudges of the modern world. Numbering millions, they have a typical working day made up of repetition, repetition, repetition. How can the invention of this remarkable device be explained?"

Or as he puts it: "The question is the same as asking: What sort of society would ever need such a thing?"

Towards answering this, he takes us on a whirlwind voyage from explorations into mathematics from the Ancient Greeks to the cloisters of 20th century Cambridge, from the requirements and challenges of government and business in 19th century Britain and America to the universities and scientists drafted by both sides in the Second World War, and from analog to digital thinking and processing.

The journey also seeks to show how the invention of computers was not some spontaneous stroke, but a sustained and collective effort involving a host of gifted minds over at least two centuries, catalysed by requirements of the world.

And these minds range from Charles Babbage, perceived as the "father of the computer", to the brilliant but persecuted Alan Turing, whose contributions in mathematics facilitated computing. They also include many other significant but not-so-well-known figures -- American mechanical engineer Herman Hollerith, applied mathematician Howard Aiken and German aeronautical engineer Konrad Zuse, whose contribution has remained mostly obscure.

The forerunners of the modern computer are then dealt with, in their historical context and requirements -- Babbage's Difference Engine (which faced enormous cost-overruns and was never built) and the more ambitious Analytical Engine (again only completed, in a fashion, by his son Henry Babbage four decades after his death); Hollerith's punched cards and pin-box-type tabulator; Zuse's home-built Z3 and Z4; Aiken's Automatic Sequence Controlled Calculator; the Electronic Numerical Integrator and Computer (ENIAC); and so on.

One of the best sections is on the code-breaking efforts, in and before the Second World War, especially at Bletchley Park, with Agar drawing a fine analogy of this vital centre as a computer itself, with its various huts as distributed processors.

Interspersed with the fascinating bits of technological history is a discussion of the underlying theory of computing, and how it reflects our increasingly complicated world. The treatment is mostly lucid, except for the part on the mathematics, which requires some specialised background to comprehend.

Though the story finishes soon after the Second World War, with the first computer in Britain, rather strangely called "Blue Pig", and Turing's tragic fate, it also includes quite a bit of the debate on artificial intelligence, specifically whether computers could be made to think like humans.

Given its length and style, this is by no means a comprehensive history, but rather an overview that informs, and also inspires us to think about making the kind of leap that makes marvellous inventions possible. In that alone lies our salvation.

(Vikas Datta can be contacted at vikas.d@ians.in)

--IANS


Disclaimer: No Business Standard Journalist was involved in creation of this content


First Published: Dec 09 2017 | 11:34 AM IST
