
Exactly: How Precision Engineers Created the Modern World

Simon Winchester

SHORTLISTED FOR THE ROYAL SOCIETY SCIENCE BOOK PRIZE 2018

Bestselling author Simon Winchester writes a magnificent history of the pioneering engineers who developed precision machinery to allow us to see as far as the moon and as close as the Higgs boson.

Precision is the key to everything. It is an integral, unchallenged and essential component of our modern social, mercantile, scientific, mechanical and intellectual landscapes. The items we value in our daily lives – a camera, phone, computer, bicycle, car, a dishwasher perhaps – all sport components that fit together with precision and operate with near perfection. We also assume that the more precise a device is, the better it is. And yet whilst we live lives peppered and larded with precision, we are not, when we come to think about it, entirely sure what precision is, or what it means. How and when did it begin to build the modern world?

Simon Winchester seeks to answer these questions through stories of precision’s pioneers. Exactly takes us back to the origins of the Industrial Age, to Britain, where he introduces the scientific minds that helped usher in modern production: John ‘Iron-Mad’ Wilkinson, Henry Maudslay, Joseph Bramah, Jesse Ramsden, and Joseph Whitworth. Thomas Jefferson exported their discoveries to the United States, and the story continues into the early twentieth century, with Britain’s Henry Royce developing the Rolls-Royce, Henry Ford mass-producing cars, Hattori’s Seiko watches and Leica’s lenses, and on to today’s cutting-edge developments from Europe, Asia and North America.

As he introduces the minds and methods that have changed the modern world, Winchester explores fundamental questions. Why is precision important? What are the different tools we use to measure it? Who has invented and perfected it? Has the pursuit of the ultra-precise in so many facets of human life blinded us to other things of equal value, such as an appreciation for the age-old traditions of craftsmanship, art, and high culture? Are we missing something that reflects the world as it is, rather than the world as we think we would wish it to be? And can the precise and the natural co-exist in society?

Copyright

William Collins
An imprint of HarperCollinsPublishers
1 London Bridge Street
London SE1 9GF
www.WilliamCollinsBooks.com

This eBook first published in Great Britain by William Collins in 2018

Copyright © Simon Winchester 2018
Cover images © Getty Images

Simon Winchester asserts the moral right to be identified as the author of this work.

Much of the material here relating to the Tohoku Tsunami of March 2011 is taken with permission from an essay by Simon Winchester in the New York Review of Books, November 9, 2017.

Image of space on title page by Yuriy Mazur/Shutterstock, Inc.

A catalogue record for this book is available from the British Library.

All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the non-exclusive, non-transferable right to access and read the text of this e-book on-screen.
No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins.

Source ISBN: 9780008241766
Ebook Edition © May 2018 ISBN: 9780008241797
Version: 2018-05-01

Dedication

For Setsuko
And in loving memory of my father, Bernard Austin William Winchester, 1921–2011, a most meticulous man

Epigraph

These brief passages from works by the writer Lewis Mumford (1895–1990) might usefully be borne in mind while reading the pages that follow.

The cycle of the machine is now coming to an end. Man has learned much in the hard discipline and the shrewd, unflinching grasp of practical possibilities that the machine has provided in the last three centuries: but we can no more continue to live in the world of the machine than we could live successfully on the barren surface of the moon.
—THE CULTURE OF CITIES (1938)

We must give as much weight to the arousal of the emotions and to the expression of moral and esthetic values as we now give to science, to invention, to practical organization. One without the other is impotent.
—VALUES FOR SURVIVAL (1946)

Forget the damned motor car and build the cities for lovers and friends.
—MY WORKS AND DAYS (1979)

Contents

Cover
Title Page
Copyright
Dedication
Epigraph
List of Illustrations
Prologue
Chapter 1: Stars, Seconds, Cylinders, and Steam
Chapter 2: Extremely Flat and Incredibly Close
Chapter 3: A Gun in Every Home, a Clock in Every Cabin
Chapter 4: On the Verge of a More Perfect World
Chapter 5: The Irresistible Lure of the Highway
Chapter 6: Precision and Peril, Six Miles High
Chapter 7: Through a Glass, Distinctly
Chapter 8: Where Am I, and What Is the Time?
Chapter 9: Squeezing Beyond Boundaries
Chapter 10: On the Necessity for Equipoise
Afterword: The Measure of All Things
Acknowledgments
A Glossary of Possibly Unfamiliar Terms
Bibliography
Index
About the Author
Also by Simon Winchester
About the Publisher

List of Illustrations

Unless otherwise noted, all images are in the public domain.
Difference between Accuracy and Precision
John Wilkinson
Boulton and Watt steam engine
Joseph Bramah
Henry Maudslay
Maudslay’s “Lord Chancellor” bench micrometer (courtesy of the Science Museum Group Collection)
Flintlock on a rifle
Thomas Jefferson
Springfield Armory “organ of muskets”
Joseph Whitworth
Crystal Palace
Whitworth screws (courtesy of Christoph Roser at AllAboutLean.com)
“Unpickable” Bramah lock
Henry Royce
Rolls-Royce Silver Ghost (courtesy of Malcolm Asquith)
Ford Model T
Ford Model T (exploded)
Henry Ford
Ford assembly line
Box of gauge blocks
Qantas Flight 32 (2010 incident) (courtesy of Australian Transport Safety Bureau)
Frank Whittle (courtesy of University of Cambridge)
Turbine blades (courtesy of Michael Pätzold/Creative Commons BY-SA-3.0 de)
Rolls-Royce Trent engine
Qantas Flight 32 failed stub pipe diagram (courtesy of Australian Transport Safety Bureau)
Early Leica camera
Leica IIIcs
Hubble Space Telescope
Hubble mirror being polished
Null corrector
Jim Crocker (courtesy of NG Images)
Roger Easton (courtesy of the U.S. Naval Research Laboratory)
Transit-system satellite (courtesy of the National Air and Space Museum, Smithsonian Institution)
Bradford Parkinson
Schriever Air Force Base, Colorado (courtesy of Schriever Air Force Base, U.S. Air Force)
Ops room of Second Space Operations Squadron
ASML EUV photolithography machine (courtesy of ASML)
Gordon Moore (courtesy of Intel Free Press)
John Bardeen, William Shockley, and Walter Brattain
First Bell Labs transistor (courtesy of Windell H. Oskay, www.evilmadscientist.com)
Chart showing progress from Intel 4004 to Skylake (courtesy of Max Roser/Creative Commons BY-SA-2.0)
Main mirror for James Webb Space Telescope
Aerial view of LIGO Hanford Observatory
LIGO test mass (courtesy of Caltech/MIT/LIGO Lab)
Seiko Building with clock in Ginza (courtesy of Oleksiy Maksymenko Photography)
Quartz watch (courtesy of Museumsfoto/Creative Commons BY-SA-3.0 de)
Makers of Grand Seiko mechanical watch
Bamboo creation from Met exhibit (courtesy of Metropolitan Museum of Art)
Example of fine urushi work (courtesy of the Japan Folk-Craft Museum)

Prologue

The aim of science is not to open the door to infinite wisdom, but to set a limit to infinite error.
—BERTOLT BRECHT, LIFE OF GALILEO (1939)

We were just about to sit down to dinner when my father, a conspiratorial twinkle in his eye, said that he had something to show me. He opened his briefcase and from it drew a large and evidently very heavy wooden box. It was a London winter evening in the mid-1950s, almost certainly wretched, with cold and yellowish smog. I was about ten years old, home from boarding school for the Christmas holidays. My father had come in from his factory in North London, brushing flecks of gray industrial sleet from the shoulders of his army officer’s greatcoat. He was standing in front of the coal fire to warm himself, his pipe between his teeth. My mother was bustling about in the kitchen, and in time she carried the dishes into the dining room. But first there was the matter of the box.

I remember the box very well, even at this remove of more than sixty years. It was about ten inches square and three deep, about the size of a biscuit tin. It was evidently an object of some quality, well worn and cared for, and made of varnished oak. My father’s name and initials and style of address, B. A. W. WINCHESTER ESQ., were engraved on a brass plate on the top. 
Just like the much humbler pinewood case in which I kept my pencils and crayons, his box had a sliding top secured with a small brass hasp, and there was a recess to allow you to open it with a single finger. This my father did, to reveal inside a thick lining of deep red velvet with a series of wide valleys, or grooves. Firmly secured within the grooves were a large number of highly polished pieces of metal, some of them cubes, most of them rectangles, like tiny tablets, dominoes, or billets. I could see that each had a number etched in its surface, almost all the numbers preceded by or including a decimal point—numbers such as .175 or .735 or 1.300. My father set the box down carefully and lit his pipe: the mysterious pieces, more than a hundred of them, glinted from the coal fire’s flames. He took out two of the largest pieces and laid them on the linen tablecloth. My mother, rightly suspecting that, like so many of the items my father brought home from the shop floor to show me, they would be covered with a thin film of machine oil, gave a little cry of exasperation and ran back into the kitchen. She was a fastidious Belgian lady from Ghent, a woman very much of her time, and spotless linen and lace therefore meant much to her. My father held the metal tiles out for me to inspect. He remarked that they were made of high-carbon stainless steel, or at least another alloy, with some chromium and maybe a little tungsten to render them especially hard. They were not at all magnetic, he added, and to make his point, he pushed them toward one another on the tablecloth—leaving a telltale oil trail to further upset my mother. He was right: the metal tiles showed no inclination to bond with each other, or to be repelled. Pick them up, my father said, take one in each hand. I took one in each palm and made as if to measure them. They were cold, heavy. They had heft, and were rather beautiful in the exactness of their making. He then took the pieces from me and promptly placed them back on the table, one of them on top of the other. Now, he said, pick up the top one. Just the top one. And so, with one hand, I did as I was told—except that upon my picking up the topmost piece, the other one came along with it. My father grinned. Pull them apart, he said. I grasped the lower piece and pulled. It would not budge. Harder, he said. I tried again. Nothing. No movement at all. The two rectangular steel tiles appeared to be stuck fast, as if they were glued or welded or had become one—for I could no longer see a line where one tile ended and the other began. It seemed as though one piece of steel had quite simply melted itself into the structure of the other. I tried again, and again. By now I was perspiring from the effort, and my mother, back from the kitchen, was getting impatient, and so my father set his pipe aside and took off his jacket and began to dish out the food. The tiles were beside his water glass, symbols of my muscular impoverishment, my defeat. Could I have another try? I asked at dinner. No need, he said, and he picked them up and with a flick of his wrist simply slid one off the other, sideways. They came apart instantly, with ease and grace. I was openmouthed at something that, viewed from a schoolboy’s perspective, seemed much like magic. No magic, my father said. All six of the sides, he explained, are just perfectly, impeccably, exactly flat. 
They had been machined with such precision that there were no asperities whatsoever on their surfaces that might allow air to get between and form a point of weakness. They were so perfectly flat that the molecules of their faces bonded with one another when they were joined together, and it became well-nigh impossible to break them apart from one another, though no one knows exactly why. They could only be slid apart; that was the only way. There was a word for this: wringing. My father started to talk animatedly, excitedly, with a passionate intensity that I always liked. Metal tiles like these, he said, and with a very evident pride, are probably the most precise things that are ever made. They are called gauge blocks, or Jo blocks, after the man who invented them, Carl Edvard Johansson, and they are used for measuring things to the most extreme of tolerances—and the people who produce them work at the very summit of mechanical engineering. These are precious things, and I wanted you to see them, since they are so important to my life. And with that said, he fell quiet, carefully put the gauge blocks back in their velvet-lined wooden box, finished his dinner, lit his pipe once more, and fell asleep by the fire. MY FATHER WAS for all his working life a precision engineer. In the closing years of his career, he designed and made minute electric motors for the guidance systems of torpedoes. Most of this work was secret, but once in a while he would smuggle me into one of his factories and I would gaze in either admiration or puzzlement at machines that cut and notched the teeth for tiny brass gearwheels, or that polished steel spindles that seemed no thicker than a human hair, or that wound copper coils around magnets that seemed no bigger than the head of a pipe smoker’s vesta. I remember with great fondness spending time with one of my father’s favored workers, an elderly man in a brown lab coat who, like my father, clasped a pipe between his teeth, leaving it unlit all the time he worked. He wore a permanently incised frown as he sat before the business end of a special lathe—German, my father said; very expensive—watching the cutting edge of a notching tool as it whirled at invisible speed, cooled by a constant stream of a cream-like oil-and-water mixture. The machine hunted and pecked at a small brass dowel, skimming as it did so microscopic coils of yellow metal from its edges as the rod was slowly rotated. I watched intently as, by some curiously magical process, an array of newly cut tiny teeth steadily appeared incised into the metal’s outer margins. The machine stopped for a moment; there was a sudden silence—and then, as I squinted into the moving mass of confusion around the workpiece, a gathering of separate and more delicate tungsten carbide tools moved into view and were promptly engaged, and the spindles began to turn and cut, such that the teeth that had so far been created were now being shaped and curved and notched and chamfered, the machine’s magnifying glass showing just how the patterns of their edges evolved as they passed beneath the blades, until, with a whisper of disengagement, the spinning stopped, the dowel was sliced as a side of ham might be, the clamp was released, and out of a filter lifted from the cream-oil bath rose a dripping confection of impossibly shiny finished gearwheels, maybe twenty of them, each no more than a millimeter thick and perhaps a centimeter in diameter. 
They were all flipped by an unseen lever out of the lathe and onto a tray, where they would lie ready to be slipped onto spindles and then attached in mysterious fashion to the motors that turned a fin here or varied the pitch of a screw there, with the gyroscopically ordered intention of keeping a high-explosive submarine weapon running straight and true toward its enemy target through the unpredictable movements of a cold and heaving sea. Except that, in this case, the elderly craftsman decided that the Royal Navy could easily spare one from this fresh batch of wheels. He took a pair of steel needle-nose tweezers and picked a sample out of the creamy bath, washed it under a gush of clear water, and handed it to me with an expression of pride and triumph. He sat back, smiled broadly at a job well done, and lit a satisfying pipe. The tiny gearwheel was a gift, my father would say, a reminder of your visit. As precise a gearwheel as you’ll ever see. JUST LIKE HIS star employee, my father took singular pride in his profession. He regarded as profound and significant and worthy the business of turning shapeless slugs of hard metal into objects of beauty and utility, each of them finely turned and neatly finished and fitted for purposes of all imaginable kinds, prosaic and exotic—for as well as weaponry, my father’s plants built devices that went into motorcars and heating fans and down mineshafts; motors that cut diamonds and crushed coffee beans and sat deep inside microscopes, barographs, cameras, and clocks. Not watches, he said ruefully, but table clocks and ships’ chronometers and long-case grandfather clocks, where his gearwheels kept patient time to the phases of the moon and displayed it on the clock dials high up in a thousand hallways. He would sometimes bring home pieces even more elaborate than but perhaps not quite as magical as the gauge blocks, with their ultra-flat, machined faces. He brought them primarily to amuse me, unveiling them at the dinner table, always to my mother’s chagrin, as they were invariably wrapped in oily brown wax paper that marked the tablecloth. Will you put that on a piece of newspaper? she’d cry, usually in vain, as by then the piece was out, shining in the dining room lights, its wheels ready to spin, its arms ready to be cranked, its glassware (for often there was a lens or two or a small mirror attached to the device) ready to be demonstrated. My father had a great fascination with and reverence for well-made cars, most especially those made by Rolls-Royce. This at a time, long past, when these haughty machines represented not so much the caste of their owners as the craft of their makers. My father had once been granted a tour of the assembly line in Crewe and had spent a while with the team who made the engine crankshafts. What impressed him most was that these shafts, which weighed many scores of pounds, had been finished by hand and were so finely balanced that, once set spinning on a test bench, they had no inclination to stop spinning, since no one side was even fractionally heavier than another. Had there been no such phenomenon as friction, my father said, a Phantom V’s crankshaft, once set spinning, could run in perpetuity. 
As a result of that conversation, he had me try to design a perpetual motion machine of my own, a dream on which I wasted (given my then only very vague understanding of the first two laws of thermodynamics, and thus the impossibility of ever meeting the challenge) many hours of spare time and many hundreds of sheets of writing paper. Though more than a half century has elapsed since those machine-happy days of my childhood, the memory still exerts a pull—and never more so than one afternoon in the spring of 2011, when I received, quite unexpectedly, an e-mail from a complete stranger in the town of Clearwater, Florida. It was headed simply “A Suggestion,” and its first paragraph (of three) started without frill or demur: “Why not write a book on the History of Precision?” My correspondent was a man named Colin Povey, whose principal career had been as a scientific glassblower. The argument he put forward was persuasive in its simplicity: precision, he said, is an essential component of the modern world, yet is invisible, hidden in plain sight. We all know that machines have to be precise; we all recognize that items that are of importance to us (our camera, our cellphone, our computer, our bicycle, our car, our dishwasher, our ballpoint pen) have to sport components that fit together with precision and operate with near perfection; and we all probably suppose that the more precise things are, the better they are. At the same time, this phenomenon of precision, like oxygen or the English language, is something we take for granted, is largely unseen, can seldom be fully imagined, and is rarely properly discussed, at least by those of us in the laity. Yet it is always there, an essential aspect of modernity that makes the modern possible. Yet it hasn’t always been so. Precision has a beginning. Precision has a definite and probably unassailable date of birth. Precision is something that developed over time, it has grown and changed and evolved, and it has a future that is to some quite obvious and to others, puzzlingly, somewhat uncertain. Precision’s existence, in other words, enjoys the trajectory of a narrative, though it might well be that the shape of that trajectory will turn out to be more a parabola than a linear excursion into the infinite. In whichever manner precision developed, though, there was a story; there was, as they say in the moviemaking world, a through line. That, said Mr. Povey, was his understanding of the theory of the thing. Yet he also had a personal reason for suggesting the idea, and to illustrate it, he told me the following tale, which I offer here in summary, a mix of precision and concision: Mr. Povey Sr., my correspondent’s father, was a British soldier, a somewhat eccentric figure by all accounts who, among other things, classified himself as a Hindu so that he would not be obliged to attend the normally compulsory Sunday Anglican service. Not wishing to fight in the trenches, he joined the Royal Army Ordnance Corps, the body that has the responsibility of supplying weapons, ammunition, and armored vehicles to those soldiers who used such things in battle. (The RAOC’s functions have since expanded, and now, less glamorously, it also runs the army’s laundry and mobile baths and does the official photography.) 
During training, he learned the rudiments of bomb disposal and other technical matters, excelling at the engineering aspects of the craft, and thus qualified, he was sent in 1940 to the British embassy in Washington, DC (in secret, and wearing civilian clothes, as the United States had so far not joined the war). His duties were mainly to liaise with American ammunition makers to create ordnance that would fit into British-issued weapons. In 1942, he was given a special mission: to work out just why some American antitank ammunition was jamming, randomly, when fired from British guns. He promptly took a train to the manufacturers in Detroit and spent weeks at the factory painstakingly measuring batches of ammunition, finding, to his chagrin, that every single round fitted perfectly in the weapon for which it was destined, meeting the specifications with absolute precision. The problem, he told his superiors back in London, did not lie with the plant. So London told him to follow the ammunition all the way to where the commanders were experiencing the vexing misfires, and that was in the battlefields of the North African desert. Mr. Povey, lugging along his giant leather case of measuring equipment, promptly lit out for the East Coast. He first traveled on a variety of ammunition trains, passing slowly across the mountains and rivers of eastern America, all the way to Philadelphia, whence the ordnance was to be shipped. Each day, he measured the shells, and found that they and their casings retained their design integrity perfectly, fitting the gun barrels just as well at each of the railway depots as they had when they left the production lines. Then he boarded the cargo ship. It turned into something of a testing journey: the vessel broke down, was abandoned by its convoy and its destroyer escort, became frighteningly vulnerable to attack by U-boats, and was trapped in a mid-ocean storm that left all of the crew wretchedly seasick. But, as it happened, it was this deeply testing environment that allowed Mr. Povey finally to solve the puzzle. For it turned out that the severe rocking of the ship damaged some of the shells. They were stacked in crates deep in the ship’s hold. As the vessel rocked and heeled in the storm, those crates on the outer edges of the stacks, and only those, would crash into the sides of the ship. If they hit repeatedly, and if when they hit they were configured in such a way that the tip of the ammunition struck the wall of the hold, the whole of the metal projectile at the front end of each shell—the bullet, to put it simply—would be shoved backward, by perhaps no more than the tiniest fraction of an inch, into its brass cartridge case. This collision, if repeated many times, caused the cartridge case to distort, its lip to swell up, very slightly, by a near-invisible amount that was measurable only by the more sensitive of Povey’s collection of micrometers and gauges. The shells that endured this beating—and they would be randomly distributed, for once the ship had docked and the stevedores had unloaded the crates and the ammunition had been broken down and sent out to the various regiments, no one knew what order the shells would be in—would, as a result, not fit into the gun barrels out on the battlefield. There would, in consequence, be (and entirely randomly) a spate of misfires of the guns. 
It was an elegant diagnosis, with a simple recommended cure: it was necessary only for the factory back in Detroit to reinforce the cardboard and wood of the ammunition crates and—presto!—the shell casings would all emerge from the ship unbruised and undistorted, and the jamming problem with the antitank rifles would be solved. Povey telegraphed his news and his suggestion back to London, was immediately declared a hero, and then, in classic army style, was equally immediately forgotten about, in the desert, without orders, but with, as he had been away from his office in Washington for so long, a considerable amount of back pay. Hot work in the Sahara it must have been, for at this point the story wavers a little: Mr. Povey Sr. seems to have gone on some kind of long-drawn-out desert bender. But after enjoying the sunshine for an indecent number of weeks, he decided that he did in fact need to return to America, so he bribed his way back there with bottles of Scotch whisky. It took him eleven bottles of Johnnie Walker to get from Cairo (via a temporary aerodrome in no less exotic a wartime stopover than Timbuktu) to Miami, after which it was but a short and easy hop up to Washington. There he found dismaying news. It turned out that he had been away in Africa for so long without any communication that he had been declared missing and presumed dead. His mess privileges had been revoked, his cupboard closed, and all his clothes altered to fit a much smaller man. It took a while for this discomfiting mess to be sorted out, and when eventually everything was more or less back to normal, he discovered that his entire ordnance unit had been transferred to Philadelphia—to which he promptly went as well. There he met and fell in love with the unit’s American secretary. The pair got married, and Mr. Povey, never apparently practicing the Hinduism that had been engraved on his army dog tag, remained blamelessly in the United States for the rest of his days. And, as my correspondent then wrote, with a flourish, “the lady in question was my mother, and so I exist—and I exist entirely because of precision.” This is why, he then added, “you must write this book.” BEFORE WE DELVE too deeply into its history, two particular aspects of precision need to be addressed. First, its ubiquity in the contemporary conversation—the fact that precision is an integral, unchallenged, and seemingly essential component of our modern social, mercantile, scientific, mechanical, and intellectual landscapes. It pervades our lives entirely, comprehensively, wholly. Yet, the second thing to note—and it is a simple irony—is that most of us whose lives are peppered and larded and salted and perfumed with precision are not, when we come to think about it, entirely sure what precision is, what it means, or how it differs from similar-sounding concepts—accuracy most obviously, or its lexical kissing cousins of perfection and exactitude and of being just right, exactly! Precision’s omnipresence is the simplest to illustrate. A cursory look around makes the point. Consider, for example, the magazines on your coffee table, in particular the advertising pages. In a scant few minutes you could, for instance, construct from them a rough timetable for enjoying a precision-filled day. 
You would begin your morning by first using a Colgate Precision Toothbrush; if you were clever enough to keep up with Gillette’s many product lines, you could enjoy less “tug and pull” on your cheek and chin by shaving with the “five precision blades” in its new Fusion5 ProShield Chill cartridge, and then tidying up your goatee and mustache with a Braun Precision Trimmer. Before the first meeting with a new acquaintance, be sure to have any former-girlfriend-related body art painlessly removed from your biceps with an advertised machine that offers patented “precision laser tattoo removal.” Once thus purified and presentable, serenade your new girlfriend by playing her a tune on a Fender Precision bass guitar; maybe take her for a safe wintertime spin after fitting your car with a new set of guaranteed-in-writing Firestone Precision radial snow tires; impress her with your driving skills first out on the highway and then at the curb with adroit use of the patented Volkswagen Precision parking-assist technology; take her upstairs and listen to soft music played on a Scott Precision radio (a device that will add “laurels of magnificent dignity to those of the world-record achievements” of the Chicago-based Scott Transformer Company—not all the magazines on an average coffee table are necessarily current). Then, if the snow has eased, prepare dinner in the back garden with a Big Green Egg outdoor stove equipped with “precision temperature control”; gaze dreamily over nearby fields newly sown with Johnson Precision corn; and finally, take comfort from the knowledge that if, after the stresses of the evening, you awake hungover or unwell, you can take advantage of the precision medicine that is newly available at NewYork-Presbyterian Hospital.

It took no time at all to tease out these particular examples from one randomly selected coffee-table pile. There are all too many others. I see, for instance, that the English novelist Hilary Mantel recently described the future British queen, née Kate Middleton, as being so outwardly perfect as to appear “precision-made, machine-made.” This went down well with neither royalists nor engineers, as what is perfect about the Duchess of Cambridge, and indeed with any human being, is the very imprecision that is necessarily endowed by genes and upbringing.

Precision appears in pejorative form, as here. It is also enshrined elsewhere and everywhere in the names of products, is listed among the main qualities of the function or the form of these products, is all too often one of the names of companies that produce such products. It is also used to describe how one uses the language; how one marshals one’s thoughts; how one dresses, writes by hand, ties ties, makes clothing, creates cocktails; how one carves, slices, and dices food—a sushi master is revered for the precise manner in which he shaves his toro—how cleverly one throws a football, applies makeup, drops bombs, solves puzzles, fires guns, paints portraits, types, wins arguments, and advances propositions. QED, one might say. Precisely.

Precision is a much better word, a more apposite choice in all the examples just given, than is its closest rival, accuracy. “Accurate Laser Tattoo Removal” sounds not nearly as convincing or effective; a car with merely “Accurate Parking Technology” might well be assumed to bump occasional fenders with another; “Accurate Corn” sounds, at best, a little dull. 
And it surely would be both damning and condescending to say that you tie your tie accurately—to knot it precisely is much more suggestive of élan and style.

THE WORD precision, an attractive and mildly seductive noun (made so largely by the sibilance at the beginning of its third syllable), is Latin in origin, was French in early wide usage, and was first introduced into the English lexicon early in the sixteenth century. Its initial sense, that of “an act of separation or cutting off”—think of another word for the act of trimming, précis—is seldom used today: the sense employed so often these days that it has become a near cliché has to do, as the Oxford English Dictionary has it, “with exactness and accuracy.” In the following account, the words precision and accuracy will be employed almost but not quite interchangeably, as by common consent they mean just about the same thing, but not exactly the same thing—not precisely.

Given the particular subject of this book, it is important that the distinction be explained, because to the true practitioners of precision in engineering, the difference between the two words is an important one, a reminder of how it is that the English language has virtually no synonyms, that all English words are specific, fit for purpose by their often very narrow sense and meaning. Precision and accuracy have, to some users, a significant variation in sense. The Latin derivation of the two words is suggestive of this fundamental variance. Accuracy’s etymology has much to do with Latin words that mean “care and attention”; precision, for its part, originates from a cascade of ancient meanings involving separation. “Care and attention” can seem at first to have something, but only something rather little, to do with the act of slicing off. Precision, though, enjoys a rather closer association with later meanings of minuteness and detail. If you describe something with great accuracy, you describe it as closely as you possibly can to what it is, to its true value. If you describe something with great precision, you do so in the greatest possible detail, even though that detail may not necessarily be the true value of the thing being described.

You can describe the constant ratio between the circumference and the diameter of a circle, pi, with a very great degree of precision, as, say, 3.14159265 358979323846. Or pi can happily be expressed with accuracy to just seven decimal places as 3.1415927—this being strictly accurate because the last number, 7, is the mathematically acceptable way to round up a number whose true value ends (as I have just written, and noted before the gap I have placed in it) in 65.

A somewhat simpler means of explaining much the same thing is with a three-ring target for pistol shooting. Let us say you shoot six shots at the target, and all six shots hit wide of the mark, don’t even graze the target—you are shooting here with neither accuracy nor precision. Maybe your shots are all within the inner ring but are widely dispersed around the target. Here you have great accuracy, being close to the bull’s-eye, but little precision, in that your shots all fall in different places on the target. Perhaps your shots all fall between the inner and outer rings and are all very close to one another. Here you have great precision but not sufficient accuracy.

The image of a target offers an easy means of differentiating precision and accuracy. 
In A, the shots are close and clustered around the bull: there is both precision and accuracy. In B, there is precision, yes, but insofar as the shots miss the bull, they are inaccurate. C, with the shots widely dispersed, shows neither precision nor accuracy. And in D, with some clustering and some proximity to the bull, there is moderate accuracy and moderate precision—but very moderate. Finally, the most desired case, the drumroll result: your shots are all clustered together and have all hit the bull’s-eye. Here you have performed ideally in that you have achieved both great accuracy and great precision. In each of these cases, whether writing the value of pi or shooting at a target, you achieve accuracy when the accumulation of results is close to the desired value, which in these examples is either the true value of the constant or the center of the target. Precision, by contrast, is attained when the accumulated results are similar to one another, when the shooting attempt is achieved many times with exactly the same outcome—even though that outcome may not necessarily reflect the true value of the desired end. In summary, accuracy is true to the intention; precision is true to itself. One last definition needs to be added to this mass of confusion: the concept of tolerance. Tolerance is an especially important concept here for reasons both philosophical and organizational, the latter because it forms the simple organizing principle of this book. Because an ever-increasing desire for ever-higher precision seems to be a leitmotif of modern society, I have arranged the chapters that follow in ascending order of tolerance, with low tolerances of 0.1 and 0.01 starting the story and the absurdly, near-impossibly high tolerances to which some scientists work today—claims of measurements of differences of as little as 0.000 000 000 000 000 000 000 000 000 01 grams, 10 to the -28th grams, have recently been made, for example—toward the end. Yet this principle also prompts a more general philosophical question: why? Why the need for such tolerances? Does a race for the ever-increasing precision suggested by these measurements actually offer any real benefit to human society? Is there perhaps a risk that we are somehow fetishizing precision, making things to ever-more-extraordinary tolerances simply because we can, or because we believe we should be able to? These are questions for later, but they nonetheless prompt a need here to define tolerance, so that we know as much about this singular aspect of precision as about precision itself. Although I have mentioned that one may be precise in the way one uses language, or accurate in the painting of a picture, most of this book will examine these properties as far as they apply to manufactured objects, and in most cases to objects that are manufactured by the machining of hard substances: metal, glass, ceramics, and so forth. Not wood, though. For while it can be tempting to look at an exquisite piece of wooden furniture or temple architecture and to admire the accuracy of the planing and the precision of the joints, the concepts of precision and accuracy can never be strictly applied to objects made of wood—because wood is flexible; it swells and contracts in unpredictable ways; it can never be truly of a fixed dimension because by its very nature it is a substance still fixed in the natural world. Whether planed or jointed, lapped or milled, or varnished to a brilliant luster, it is fundamentally inherently imprecise. 
A piece of highly machined metal, however, or a lens of polished glass, an edge of fired ceramic—these can be made with true and lasting precision, and if the manufacturing process is impeccable, they can be made time and time again, each one the same, each one potentially interchangeable for any other. Any piece of manufactured metal (or glass or ceramic) must have chemical and physical properties: it must have mass, density, a coefficient of expansion, a degree of hardness, specific heat, and so on. It must also have dimensions: length, height, and width. It must possess geometric characteristics: it must have measurable degrees of straightness, of flatness, of circularity, cylindricity, perpendicularity, symmetry, parallelism, and position—among a mesmerizing host of other qualities even more arcane and obscure. And for all these dimensions and geometries, the piece of machined metal must have a degree of what has come to be known as tolerance. It has to have a tolerance of some degree if it is to fit in some way in a machine, whether that machine is a clock, a ballpoint pen, a jet engine, a telescope, or a guidance system for a torpedo. There is precious little point in tolerance if the machined object is simply to stand upright and alone in the middle of a desert. But to fit with another equally finely machined piece of metal, the piece in question must have an agreed or stated amount of permissible variation in its dimensions or geometry that will allow it to fit. That allowable variation is the tolerance, and the more precise the manufactured piece, the greater the tolerance that will be needed and specified. A shoe, for instance, is invariably a thing of very low tolerance: on the one hand, a poorly made slipper may have “an agreed or stated amount of allowable variation in its dimensions” (which is the engineer’s formal definition of tolerance) of half an inch, with so generous an amount of wiggle room between foot and lining as to make the notion of precision almost irrelevant. A handmade brogue shoe by Lobb of London, on the other hand (or foot), may seem to fit snugly, perfectly, precisely even, but it will still have a tolerance of maybe an eighth of an inch—and in a shoe, such a tolerance would be acceptable, and the shoe indeed worn with pride. Yet, in terms of precision engineering, it is anything but precisely made; nor is it even accurately so. ONE OF THE two most precise measuring instruments ever built by human agency stands in America’s Pacific Northwest, far away from everything, in the arid middle of Washington State. It was built just outside the top-secret nuclear installation where the United States created the first supplies of plutonium for the bomb that destroyed Nagasaki, for decades the material at the heart of much of the nation’s arsenal of atomic weapons. The years of nuclear activity there have left an unimaginably large legacy of dangerously irradiated substances, from old fuel rods to contaminated items of clothing, which are only now, and after a loud public outcry, being remedied—or remediated, the term environmentalists prefer. Today, the Hanford site, as it is known, is officially the largest environmental cleanup site in the world, with decontamination bills reaching the tens of billions of dollars and the necessary remedial work likely to last until the middle of the twenty-first century. I first passed by the site very late one night, after a long drive from Seattle. 
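To put those two definitions, and the engineer’s notion of tolerance, into concrete modern terms, here is a brief Python sketch. It is not drawn from the book: the shot coordinates, the part dimensions and the tolerance figures are invented purely for illustration, and the scoring simply restates the distinction made above, that accuracy measures how close a group’s centre lies to the intended value while precision measures how tightly the results cluster around one another.

import math
import statistics

# Hypothetical shot groups on a target whose bull sits at (0, 0); the four
# groups loosely mirror the panels A-D described in the text.
groups = {
    "A (accurate and precise)":  [(0.1, 0.0), (-0.1, 0.1), (0.0, -0.1), (0.1, 0.1)],
    "B (precise, not accurate)": [(2.0, 2.1), (2.1, 2.0), (1.9, 2.0), (2.0, 1.9)],
    "C (neither)":               [(3.0, 1.0), (-2.5, 3.0), (2.0, -3.0), (4.0, 4.0)],
    "D (moderately both)":       [(0.6, 0.4), (-0.8, 0.7), (0.9, -0.5), (0.5, 1.0)],
}

def accuracy_and_precision(shots):
    """Accuracy: distance from the group's centre to the bull (smaller is better).
    Precision: average scatter of the shots about their own centre."""
    cx = statistics.mean(x for x, _ in shots)
    cy = statistics.mean(y for _, y in shots)
    accuracy_error = math.hypot(cx, cy)
    precision_spread = statistics.mean(math.hypot(x - cx, y - cy) for x, y in shots)
    return accuracy_error, precision_spread

for name, shots in groups.items():
    acc, prec = accuracy_and_precision(shots)
    print(f"{name:27s} accuracy error = {acc:.2f}  precision spread = {prec:.2f}")

# Tolerance in the engineer's sense used above: an agreed amount of allowable
# variation in a dimension. A machined part is accepted only if its measured
# size stays within that allowance of the nominal size.
def within_tolerance(measured, nominal, tolerance):
    return abs(measured - nominal) <= tolerance

print(within_tolerance(measured=25.403, nominal=25.400, tolerance=0.01))  # True
print(within_tolerance(measured=25.450, nominal=25.400, tolerance=0.01))  # False

Run as written, group A scores well on both counts, group B clusters tightly around the wrong spot, and only the first of the two imagined parts falls inside its stated tolerance, which is the whole of what an inspector’s gauge is asked to decide.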
From my southbound speeding car, I could see the glimmer of lights in the far distance. Behind razor-wire security fences and warning signs and under the protection of armed guards, some eleven thousand workers are now laboring night and day to cleanse the earth and waters of the poisonous radioactivity that so dangerously suffuses it. Some suppose it is a task so vast that it may never be properly completed. To the south of the main cleanup site, just outside the razor-wire fence but within sight of the still-standing towers of the remaining atomic piles, one of present-day science’s most remarkable experiments is being conducted. It is not secret at all, is unlikely to leave a legacy of any danger whatsoever, and requires the making and employment of an array of the most precise machines and instruments that humankind has ever attempted to construct. It is an unassuming place, easily missed. I arrived for my appointment in morning daylight, weary after the long nighttime drive. It was cold; the road was quite empty, the main turnoff unmarked. A small notice on the left pointed to a cluster of low white buildings a hundred yards off the road. “LIGO,” the sign read. “WELCOME.” And that was about all. Welcome to the current cathedral, it might also have said, to the worship of ultraprecision. It has taken decades to design the scientific instruments that are secreted out in the middle of this dust-dry nowhere. “We maintain our security by our obscurity” is the motto for those who fret about the costly experiments sited there, all without a fragment of barbed wire or chain link to protect them. The tolerances of the machines at the LIGO site are almost unimaginably huge, and the consequent precision of its components is of a level and nature neither known nor achieved anywhere else on Earth. LIGO is an observatory, the Laser Interferometer Gravitational-Wave Observatory. The purpose of this extraordinarily sensitive, complex, and costly piece of equipment is to try to detect the passage through the fabric of space-time of those brief disruptions and distortions and ripples known as gravitational waves, phenomena that in 1916 Albert Einstein predicted, as part of his general theory of relativity, should occur. If Einstein was right, then once every so often, when huge events occur far out in deep space (the collision of a pair of black holes, for instance), the spreading fan of interstellar ripples, all moving at the speed of light, should eventually hit and pass through the Earth and, in doing so, cause the entire planet to change shape, by an infinitesimal amount and for just the briefest moment of time. No sentient being would ever feel such a thing; and the slight squeezing would be so minute and momentary and harmless that not a trace could ever be recorded by any machine or device known—except, in theory, by LIGO. And after decades of experiments with instruments that were being ever more refined to greater and greater degrees of sensitivity, the devices now running in the high northwest desert of Washington State and down in the bayous of Louisiana, where the second such observatory has been built, have indeed brought home the bacon. 
For, in September 2015, almost a century after Einstein’s theory was first published, and then again on Christmas Eve that same year and then again in 2016, LIGO’s instruments showed without doubt that a series of gravitational waves, arriving after billions of years of travel from the universe’s outer edges, had passed by and through Earth and, for the fleeting moment of their passage, changed our planet’s shape.

TO DETECT THIS, the LIGO machines had to be constructed to standards of mechanical perfection that only a few years before were well-nigh inconceivable and that, before then, were neither imaginable nor even achievable. For it was not always so, this delicacy, this sensitivity, this ultraprecise manner of doing things. Precision was not always there, waiting in the shadows, needing to be found and then exploited for what its early admirers believed would be the common good. Far from it. Precision was a concept that was invented, quite deliberately, out of a single and well-recognized historic need. It was brought into being for severely practical reasons—reasons that had much to do not with any dreamy twenty-first-century wish to confirm (or otherwise) the existence of vibrations from the collisions of distant stars. Rather, it had to do with a down-to-earth eighteenth-century realization of what was then a pressing matter of physics, and which was related to the potentially awesome power of that high-temperature form of water that since the century before had been known as and defined by the word steam. Precision’s birth derives from the then-imagined possibility of maybe holding and managing and directing this steam, this invisible gaseous form of boiling water, so as to create power from it, and to demand that by the employment of this power, it perform useful work for the good (perhaps, and with luck) of all humankind.

And all that, what turned out to be one of the most singular of engineering epiphanies, took place in North Wales on a cool May day in 1776—by coincidence, within weeks of the founding of the United States of America, which would eventually make such use of the precision techniques that duly evolved. That spring day is now generally (though not unanimously) agreed to mark the birth date for the making of the first construction possessed of a degree of real and reproducible mechanical precision—precision that was measurable, recordable, repeatable, and, in this case, created to the tolerance of one-tenth of an inch, or, as it was put at the time, the thickness of an English silver coin with a value or worth of just one shilling.

Chapter 1
(TOLERANCE: 0.1)
Stars, Seconds, Cylinders, and Steam

It is the mark of an instructed mind to rest assured with that degree of precision that the nature of the subject admits, and not to seek exactness when only an approximation of the truth is possible.
—ARISTOTLE (384–322 BC), NICOMACHEAN ETHICS

The man who by the common consent of the engineering fraternity is regarded as the father of true precision was an eighteenth-century Englishman named John Wilkinson, who was denounced sardonically as lovably mad, and especially so because of his passion for and obsession with metallic iron. 
He made an iron boat, worked at an iron desk, built an iron pulpit, ordered that he be buried in an iron coffin, which he kept in his workshop (and out of which he would jump to amuse his comely female visitors), and is memorialized by an iron pillar he had erected in advance of his passing in a remote village in south Lancashire. Still, a case can also be made that “Iron-Mad Wilkinson,” as he was widely known, had predecessors who can lay near-equal claim to parenthood. One of them was a luckless clockmaker from Yorkshire named John Harrison, who worked just a few decades earlier to create devices that kept near-perfect time; the other, rather unexpectedly to those who suppose precision to be more or less a modern creation, was a nameless craftsman who worked in Ancient Greece some two thousand years before Harrison, and whose triumph of precise craftsmanship was discovered deep in the Mediterranean at the turn of the last century by a group of fishermen out diving for sponges. The Greek team, diving in the warm waters south of the Peloponnese, close to the small island of Antikythera, found sponges in abundance, as they usually did. Yet this time they found something else: the spars and tumbled beams of a wrecked ship, most probably a Roman-era cargo vessel. Among all the broken wood, they came upon a diver’s dream: a massive trove of marvels of art and luxury, along with, more mysteriously, a telephone directory–size lump of corroded and calcified bronze and wood that was initially discounted and almost discarded as being of little archaeological significance. Except that after sitting for two years in a drawer in Athens, overlooked and yet all the while patiently drying itself out, the sorry-looking lump fell apart. It sundered itself into three pieces, revealing within, and to the astonishment of all, a mess of more than thirty metallic and cleverly meshing gearwheels. One of these wheels had a diameter almost as wide as the object itself; others were no wider than a centimeter. All had hand-cut triangular teeth—the tiniest wheels had as few as 15; the enormous one had a then-inexplicable 223. It looked as though all the wheels had been cut from a single plate of bronze. Astonishment at this discovery quickly turned to disbelief, to skepticism, to a kind of puzzled fearfulness among scientists who simply could not believe that even the most sophisticated of Hellenistic engineers had ever been capable of making such a thing. So, for almost half a century, this most intimidating machine—if that is what it was—was locked away again, secured and contained like a deadly pathogen. It was given a name, the Antikythera mechanism, for the island, halfway between Crete and the southern tendrils of mainland Greece, off which it was found. It was then quietly and casually all but erased from a Greek archaeological history that was much more comfortable dealing with the more customary fare of vases and jewelry, amphorae and coins, and statues of marble or the most lustrous bronze. A handful of slim books and pamphlets were published, declaring the device to be some kind of astrolabe or planetarium, but otherwise, there was a near-universal lack of interest in the find. 
It was not until 1951 that Derek Price, a young British student of the history and social impact of science, won permission to take a closer look at the Antikythera mechanism, and for the next two decades he subjected the shattered relic, with a total of now more than eighty additionally found bits and pieces as well as the three main fragments, to blizzards of X-rays and wafts of gamma radiation, probing secrets that had been hidden for two thousand years. Eventually, Price decided the work was much more complex and important than a mere astrolabe—it was in fact more likely to be the once-beating heart of a mysterious computing device of unimagined mechanical complexity, one that had evidently been made in the second century BC and was clearly a work of staggering genius. Price’s work in the 1950s was limited by his technology’s inability properly to peer inside the device. All this changed with the invention twenty years later of magnetic resonance imaging, or MRI, which led in 2006, more than a century after the sponge seekers made their first find, to the publication in Nature of a profoundly more detailed and sophisticated analysis. The world-scattered team of specialist researchers who produced the Nature article concluded that what the Greek divers had pulled to the surface were the remains of a miniaturized and neatly boxed mechanical device, an analog computer, essentially, with dials and pointers and rudimentary instructions for how to use it. It was a device that “calculated and displayed celestial information, particularly cycles such as the phases of the moon and a luni-solar calendar.” Moreover, minuscule inscribed lettering in Corinthian Greek chased into the machine’s brass work—a total of 3,400 letters, all millimeter-size, have been found thus far—suggested that the gearwheels, once fully engaged with one another with the turning of a crank on the side of the box, could also predict the movement of the five other planets then known to the Ancient Greeks. Enthusiasts, a small but fervent corps of devotees of this extraordinary little instrument, have since built working models of the mechanism, in wood and brass and, in one instance, with its bronze innards expanded and exploded as in a 3-D checkers game, between layers of transparent Perspex. It was the numbers of teeth on the various wheels that offered the first clues as to how they might have been employed by the machine’s makers. The fact that there were 223 teeth on the largest of the gearwheels, for example, provided a eureka moment for the investigators, who remembered that Babylonian astronomers, who were the most astonishingly able watchers of the skies, had calculated that lunar eclipses were usually separated by 223 full moons. Use of this particular wheel, then, would have enabled the user to predict the timing of eclipses of the moon (just as other wheels and combinations of wheels would have turned pointers on dials to display phases and planetary perturbations) and the dates, more trivially, of upcoming public sporting events, most notably the ancient Olympic Games. Modern investigators have concluded that the device was very well made, “with some parts constructed to accuracies of a few tenths of a millimeter.” By that measure alone, it would seem that the Antikythera mechanism can lay claim to being a most precise instrument—and, crucially for this introduction to the story, maybe the first precision instrument ever made. Except that there is an inherent flaw in this claim. 
The device, as model-tested by the legions of fascinated modern analysts, turns out to be woefully, shamefully, uselessly inaccurate. One of the pointers, which supposedly indicates the position of Mars, is on many occasions thirty-eight degrees out of true. Alexander Jones, the New York University antiquities professor who has perhaps written most extensively about the Antikythera mechanism, speaks of its sophistication as being that only “of a young and rapidly developing craft tradition,” and of “questionable design choices” by its makers, who, in summary, produced a device that was “a remarkable creation, but not a miracle of perfection.” There is one additional puzzling aspect of the mechanism that still intrigues historians of science to this day: while it was filled to bursting with what is self-evidently a complicated assemblage of clockwork, none of its assemblers apparently ever thought of using it as a clock. Hindsight permits us to be puzzled, of course, and persuades us to want to reach back to the Greeks and shake them a little for ignoring what to us seems obvious. For time was already being measured in Ancient Greece with the help of all manner of other devices, most popularly with sundials, dripping water, hourglasses (as in egg timers), oil lamps with time-graduated fuel holders, and slow-burning candles with time graduations on the stick. And though the Greeks possessed (as we now know from the existence of the mechanism) the wherewithal to harness clockwork gears and make them into timekeepers, they never did so. The penny never dropped. It never dropped for the Greeks or, subsequently, for the Arabs or, even beforehand, for the much more venerable civilizations of the East. It would take many more centuries for mechanical clocks to be invented anywhere, but once they were, they would have precision as their most essential component. Though the eventual function of the mechanical clock, brought into being by a variety of claimants during the fourteenth century, was to display the hours and minutes of the passing days, it remains one of the eccentricities of the period (from our current viewpoint, that is) that time itself first played in these mechanisms a somewhat subordinate role. In their earliest medieval incarnations, clockwork clocks, through their employment of complex Antikythera-style gear trains and florid and beautifully crafted decorations and dials, displayed astronomical information at least as an equal to the presentation of time. It was almost as though the passage of celestial bodies across the heavens was considered more significant than the restless ticking of the passage of moments, of that one-way arrow of time that Newton so famously called “duration.” There was a reason for this. Nature’s offerings of dawn, midday, and dusk already provided the temporal framework—the mundane business of when it was time to rise and work, when came the time to rest, to mop the brow and take a drink, and when the time to take nourishment and prepare for sleep. The more finicky details of time (a man-made matter, after all), of whether it was 6:15 a.m. or ten minutes to midnight, were necessarily of lesser importance. The behavior of the heavenly bodies was ordained by gods, and therefore was a matter of spiritual significance. As such, it was far worthier of human consideration than our numerical constructions of hours and minutes, and was thus more amply deserving of flamboyant mechanical display. 
Eventually, though, the reputation and standing of the hours and minutes themselves did manage to rise through the ranks, did come to dominate the usage of the clockwork mechanisms that became known, generically, as timekeepers. The Ancients may have looked upward to the skies to gather what time it was, but once machinery began to perform the same task, a vast range of devices took over the duty, and has done so ever since. Monasteries were the first to employ timekeepers, the monks having a need to awaken and observe in some detail the canonical hours, from Matins to Compline by way of Terce, None, and Vespers. And as various other professions and callings started to appear in society (shopkeepers, clerks, men of affairs bent on holding meetings, schoolteachers due to instruct to a rigid schedule, workers on shifts), the need for a more measured knowledge of numerical time pressed ever more firmly. Toilers in the fields could always see or hear the hour on the distant church clock, but city dwellers late for a meeting needed to know how many minutes remained until the “appointed hour” (a phrase that gained currency only in the sixteenth century, by which time public mechanical clocks were widely on display). On land, it was the railways that most prolifically showed—one might say defined—the employment of time. The enormous station clock was more glanced at than any other feature of the building; the image of the conductor consulting his (Elgin, Hamilton, Ball, or Waltham) pocket watch remains iconic. The timetable became a biblically important volume in all libraries and some households; the concept of time zones and its application to cartography both stemmed from railways’ imprint of timekeeping on human society. Yet, before the chronological influence of railways, there was one other profession that above all others truly needed the most precise timekeeping. It was that which had been developing fast since the European discovery of the Americas in the fifteenth century and the subsequent consolidation of trade routes to the Orient: the shipping industry. Navigation across vast and trackless expanses of ocean was essential to maritime business. Getting lost at sea could be costly at best, fatal at worst. Also, because the exact determination of where a ship might be at any one moment was essential to the navigation of a route, and because one part of that determination depended, crucially, on knowing the exact time aboard the ship and, even more crucially, the exact time at some other stable reference point on the globe, maritime clockmakers were charged with making the most precise of clocks. And none was more sedulously dedicated to achieving this degree of exactitude than the Yorkshire carpenter and joiner who later became England’s, perhaps the world’s, most revered horologist: John Harrison, the man who most famously gave mariners a sure means of determining a vessel’s longitude. This he did by painstakingly constructing a family of extraordinarily precise clocks and watches, each accurate to within a few seconds over many weeks at sea, no matter how punishing its travels aboard ship. An official Board of Longitude was set up in London in 1714, and a prize of twenty thousand pounds offered to anyone who could determine longitude with an accuracy of thirty miles. It was John Harrison who, eventually and after a lifetime of heroic work on five timekeeper designs, would claim the bulk of the prize. Harrison’s legacy is much treasured.
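For readers who like to see the arithmetic spelled out, here is a back-of-the-envelope sketch of why a seagoing clock is, in effect, a longitude instrument. The sketch and its assumptions are mine, not Harrison’s: I take the prize’s thirty miles to be nautical miles measured at the equator, and the Earth to turn a uniform fifteen degrees of longitude per hour.

```python
# A back-of-the-envelope illustration: how an error in a ship's clock
# becomes an error in its computed longitude. The Earth turns 360 degrees
# in 24 hours, so every hour of clock error corresponds to 15 degrees of
# longitude, and every 4 seconds to one minute of arc -- roughly one
# nautical mile at the equator.

DEGREES_PER_HOUR = 360 / 24       # 15 degrees of longitude per hour of time
NM_PER_DEGREE = 60                # nautical miles per degree, at the equator

def longitude_error_nm(clock_error_seconds: float) -> float:
    """Equatorial distance error produced by a given clock error."""
    degrees = (clock_error_seconds / 3600.0) * DEGREES_PER_HOUR
    return degrees * NM_PER_DEGREE

# The thirty-mile standard of the 1714 prize therefore amounts to a clock
# error of about two minutes over the whole of an Atlantic crossing:
budget_seconds = 30 / NM_PER_DEGREE / DEGREES_PER_HOUR * 3600
print(f"30 nautical miles ~ {budget_seconds:.0f} seconds of accumulated clock error")
print(f"A clock losing 3 s/day for 40 days -> {longitude_error_nm(3 * 40):.0f} nm adrift")
```

Roughly two minutes of accumulated error over a six-week crossing, in other words—no more than a few seconds a day—which is the scale of accuracy the Board was asking for.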
The curator of the Greenwich Royal Observatory, high on its all-seeing hill above the Maritime Museum to the east of London, comes in each dawn to wind the three great clocks that he and his staff are disposed simply to call “the Harrisons.” He stands on much ceremony to wind them, well aware of the immense historical importance invested in the three timepieces and their one unwound sibling. Each was a prototype of the modern marine chronometer, which, in allowing ships to fix their positions at sea with accuracy, has since saved countless sailors’ lives. (Before the existence of the marine chronometer, before ships’ masters had the ability to determine exactly where they were, vessels tended to collide with importunate frequency into islands and headlands that loomed up unexpectedly before their bows. Indeed, it was the catastrophic collision off the Cornish coast of Admiral Sir Cloudesley Shovell’s squadron of warships in 1707 [which drowned him and two thousand of his sailors] that compelled the British government to think seriously about the means of figuring out longitude—setting up a Board of Longitude and offering prize money—which led, ultimately, to the making of the small family of clocks that are wound each dawn at Greenwich.) There were other reasons for the vast importance of the Harrisons. By allowing ships to know their positions and plot their voyages with efficiency, accuracy, and precision, these clocks and their successors enabled the making of untold trading fortunes. And though it may no longer be wholly respectable to say so, the fact that the Harrison clocks were British-invented and their successor clocks firstly British-made allowed Britain in the heyday of her empire to become for more than a century the undisputed ruler of all the world’s oceans and seas. Precise-running clockwork made for precise navigation; precise navigation made for maritime knowledge, control, and power. And so the curator pulls on his white curatorial gloves and, using in each case a unique double set of brass keys, unlocks the tall glass-sided cabinets in which the great timekeepers stand. Each of the three is on near-permanent loan from Britain’s Ministry of Defence. The earliest made, finished in 1735 and known these days as H1, the curator can wind with a single strong downward pull on a chain made of brass links. The later pair, the midcentury H2 and H3, require simply a swift turn of a key. The final device, the magnificent H4 “sea watch” with which Harrison eventually won his prize money, remains unwound and silent. Housed in a five-inch-diameter silver case that makes it look rather like an enlarged and biscuit-thick version of grandfather’s pocket watch, it requires lubrication and, if it runs, will become less precise with time as the oil thickens; it will lose rate, as horologists say. Moreover, if H4 were kept running, only its second hand would be seen to be moving, and so, as spectacle, it would be somewhat uninteresting—and as a trade-off for the inevitable wear and tear of the movement beneath, the sight of a moving second hand made no sense. So the decision of the observatory principals over the years has been to keep this one masterpiece in its quasi-virgin state, much like the unplayed Stradivarius violin at the Ashmolean Museum in Oxford, as an untouched testament to its maker’s art. And what sublime pieces of mechanical art John Harrison made! 
By the time he decided to throw his hat in the ring for the longitude prize, he had already constructed a number of fine and highly accurate timekeepers—most of them pendulum clocks for use on land, many of them long-case clocks, each one more refined than the last. Harrison’s skills lay in the imaginative improvement of his timekeepers, rather than in the decorative embellishment that many of his eighteenth-century contemporaries were known for. He was fascinated, for instance, with the problem of friction, and in a radical departure from the norm, he made all his early clocks with wooden gearwheels, which needed none of the lubricant oils of the day, oils that became notoriously more viscous with age and had the trying effect of slowing down most clockwork movements. To solve this problem, he made all his gear trains first of boxwood and then of the dense, nonfloating Caribbean hardwood Lignum vitae, combined in both cases with pivots made of brass. He also designed an extraordinary escapement mechanism, the ticking heart of the clock, that had no sliding parts (and hence no friction, either) and that is still known as a grasshopper escapement because one of the components jumps out of engagement with the escape wheel, just as a grasshopper jumps suddenly out of the grass. A portable precision clock designed for use on a rolling ship cannot easily use a long gravity-driven pendulum, however, and the first three timepieces Harrison designed for the contest were powered by systems of weights that look very different from the heavy plumb bobs that hang in a conventional long-case clock. They are instead brass bar balances that look like a pair of dumbbells, both placed vertically on the outer edges of the mechanism and its wheel trains, and connected at their tops and bottoms by pairs of springs, which provide the mechanism with a form of artificial gravity, as Harrison wrote. These springs allow the two balance beams to swing back and forth, back and forth, nodding toward and away from each other endlessly (provided that the white-gloved curator, successor on land to the ship’s master at sea, winds the mechanism daily) as the clock ticks on. H1, H2, and H3, each clock a subtle improvement upon its predecessor, each the work of years of patient experimentation—H3 took Harrison fully nineteen years to build—employ essentially the same bar balance principle, and when they are working, they are machines of an astonishing, hypnotic beauty and seemingly bewildering complexity. Many of the improvements that this former carpenter and viola player, bell tuner and choirmaster—for eighteenth-century polymaths were polymaths indeed—included in each have gone on to become essential components of modern precision machinery: Harrison created the encaged roller bearing, for example, which became the predecessor to the ball bearing and led to the founding of huge modern corporations such as Timken and SKF. And the bimetallic strip, invented solely by Harrison in an attempt to compensate for changes in temperature in his H3 timekeeper, is still employed in scores of mundane essentials: in thermostats, toasters, electric kettles, and their like. As it happened, none of these three fantastical contraptions, however beautiful in appearance and revolutionary in design they may have been, turned out to be a success. 
Each was taken to a ship and used by the crew as a timekeeper, and each time, though the clock offered an improvement on the mere surmising of the ship’s position, the accuracy of the vessel’s clock-derived longitude was wildly at variance with what the Board of Longitude demanded—and so no prize was awarded. Harrison’s genius and determination were recognized, though, and hefty grants continued to come his way in the hope that he would, in time, make a horological breakthrough. And this he did, at last, when, in the four years between 1755 and 1759, he made not another clock, but a watch, a watch that has been known, since it was cleaned and restored in the 1930s, simply as H4. The watch was a technical triumph in every sense. After thirty-one years of near-obsessive work, Harrison managed to squeeze almost all the improvements he had engineered in his large sea clocks into this single five-inch silver case, and add some others, to make certain that his timekeeper was as close to chronological infallibility as was humanly possible. In place of the oscillating beam balances that made the magic madness of his large clocks so spectacular to see, he substituted a temperature-compensated spiral balance spring, together with a fast-beating balance wheel that spun back and forth at the hitherto unprecedented rate of some eighteen thousand times an hour. He also included a so-called automatic remontoire, which topped up the driving force eight times a minute, keeping the tension constant, the beats unvarying. There was a downside, though: this watch needed oiling, and so, in an effort to reduce friction and keep the needed application of oil to a minimum, Harrison introduced, where possible, bearings made of diamond, one of the early instances of a jeweled escapement. It remains a mystery just how, without the use of precision machine tools—the development of which will be central to the story that follows—Harrison was able to accomplish all this. Certainly, all those who have made copies of H4 and its successor, K1 (which was used on Captain James Cook’s later voyages), have had to use machine tools to fashion the more delicate parts of the watches: the notion that such work could possibly be done by the hand of a sixty-six-year-old John Harrison still beggars belief. Once his task was completed, he handed the finished watch over to the Admiralty for its crucial test. The instrument (in the care of Harrison’s son William, who acted as its chaperone) was taken aboard HMS Deptford, a fifty-gun fourth-rate ship of the line, and sent out on a five-thousand-mile voyage from Portsmouth to Jamaica. Careful observation at the end of the trip showed the watch to have accumulated a timekeeping error of only 5.1 seconds, well within the limits of the longitude prize. Over the entire 147 days of a voyage that involved a complex and unsettlingly stormy return journey (in which William Harrison had to swaddle the timekeeper in blankets), the watch error was just 1 minute 54.5 seconds, a level of accuracy never imagined for a seaborne timekeeping instrument. And while it would be agreeable to report that John Harrison then won the prize for his marvelous creation, much has been made of the fact that he did not. The Board of Longitude prevaricated for years, the Astronomer Royal of the day declaring that a much better way of determining longitude, known as the lunar distance method, was being perfected, and that there was therefore no need for sea clocks to be made.
Poor John Harrison, therefore, had to visit King George III (a great admirer, as it happens) to ask him to intercede on his behalf. A series of humiliations followed. H4 was forced to be tested once again, and recorded an error of 39.2 seconds over a forty-seven-day voyage—once again, well within the limits set by the Board of Longitude. Harrison then had to dismantle the watch in front of a panel of observers and hand his precious instrument to the Royal Observatory for a ten-month running trial to check (once again, but this time on a stable site) its accuracy. It was torturous and vexing for the now-elderly Harrison, who at seventy-nine was becoming increasingly and understandably embittered by the whole procedure. Finally, and thanks in large part to King George’s intervention, Harrison did get almost all his money. The popular memory of him, though, is of a genius hard done by, and his great clocks and the two sea watches, H4 and K1, remain the most potent memorials, three of them beating out the time steadily and ceaselessly as a reminder of how their maker, with his devotion to precision and accuracy in his craft work, helped so profoundly to change the world.
THE ANTIKYTHERA MECHANISM, then, was a device remarkable and precise in its making and aspect, but its inaccuracy and understandably amateurish construction rendered it unreliable and, in practical terms, well-nigh useless. John Harrison’s timekeepers, though, were both precise and accurate, but given that they took years to make and perfect, and were the result of hugely costly craftsmanship, it would be idle to declare them candidates for, let alone the fountainhead of, true and world-changing precision. Also, though intending no disrespect to an indelible technical achievement, it is worth noting that John Harrison’s clockworks enjoyed perhaps only three centuries’ worth of practical usefulness. Nowadays, the brassbound chronometer in a ship’s chart room, just like the sextant kept in its watertight morocco box, is a thing more decorative than essential. Time signals of impeccable accuracy now come across the radio. The digital readout of longitude and latitude coordinates comes to a ship’s bridge from the Global Positioning System’s (GPS) interrogation of faraway satellites. Clockwork machines, however beautifully their gears may be cut and enclosed in casings, however precious and intricately engraved, are a creation of yesterday’s technology, and are retained nowadays by and large for their precautionary value only: if a seagoing vessel loses all power, or if the master is a purist with a disdain for technology, then John Harrison’s works have real practical worth. Otherwise, his clocks gather dust and salt, or are kept in glass cases, and his name will begin to slip gradually astern, to vanish inevitably and soon in a sea fret of history—his clocks mere way stations at the beginning of the voyage. For precision to be a phenomenon that would entirely alter human society, as it undeniably has done and will do for the foreseeable future, it has to be expressed in a form that is duplicable; it has to be possible for the same precise artifact to be made again and again with comparative ease and at a reasonable frequency and cost. Any true and knowledgeable craftsman (just like John Harrison) may be able, if equipped with sufficient skill, ample time, and tools and material of quality, to make one thing of elegance and evident precision. He may even make three or four or five of the same thing.
And all will be beautiful, and most will inspire awe. Large cabinets in museums devoted to the history of science (most notably at Oxford and Cambridge and Yale) are today filled with such objects. There are astrolabes and orreries, armillary spheres and astraria, octants and quadrants, and formidably elaborate sextants, both mural and framed, which are to be seen in particular abundance, most of them utterly exquisite, intricate, and assembled with a jeweler’s care. At the same time, all of each instrument was perforce made by hand. Every gear was hand-cut, as was every component part (every mater and rete and tympan and alidade, for example; astrolabes have their own quite large vocabulary), every tangent screw and index mirror (words relating to sextants are similarly various). Also, the assembly of each part to every other and the adjustment of the assembled whole—all had to be accomplished with, quite literally, fingertip care. Such an arrangement produced fine and impressive instruments, without a doubt, but given the manner in which they were made and how they were put together, they could necessarily have been available only in rather limited numbers and to a small corps d’élite of customers. They may have been precise, but their precision was very much for the few. It was only when precision was created for the many that precision as a concept began to have the profound impact on society as a whole that it does today. And the man who accomplished that single feat, of creating something with great exactitude and making it not by hand but with a machine, and, moreover, with a machine that was specifically created to create it—and I repeat the word created quite deliberately, because a machine that makes machines, known today as a “machine tool,” was, is, and will long remain an essential part of the precision story—was the eighteenth-century Englishman denounced for his supposed lunacy because of his passion for iron, the then-uniquely suitable metal from which all his remarkable new devices could be made.
IN 1776, THE forty-eight-year-old John Wilkinson, who would make a singular fortune during his eighty years of life, had his portrait painted by Thomas Gainsborough, so he is far from an uncelebrated figure—but if not uncelebrated, then not exactly celebrated, either. It is notable that his handsome society portrait has for decades hung not in prominence in London or Cumbria, where he was born in 1728, but in a quiet gallery in a museum far away in Berlin, along with four other Gainsboroughs, one of them a study of a bulldog. The distance suggests a certain lack of yearning for him back in his native England. And the New Testament remark about a prophet being without honor in his own country would seem to apply in his case, as Wilkinson is today rather little remembered. He is overshadowed quite comprehensively by his much-better-known colleague and customer, the Scotsman James Watt, whose early steam engines came into being, essentially, by way of John Wilkinson’s exceptional technical skills. History will show that the story of such engines, which were so central to the mechanics of the following century’s Industrial Revolution, is inextricably entwined with that of the manufacture of cannons, and not simply because both men used components made from heavy hunks of iron.
A further link can be made, between the thus gun-connected Wilkinson and Watt on the one hand and the clockmaker John Harrison on the other, as it will be remembered that Harrison’s early sea clock trials were made on Royal Naval warships of the day, warships that carried cannon in large numbers. Those cannons were made by English ironmasters, of whom John Wilkinson was among the most prominent and, as it turned out, the most inventive, too. So the story properly begins there, with the making of the kind of large weapons used by Britain’s navy during the mid-eighteenth century, a time when the nation’s sailors and soldiers were being kept exceptionally busy.
John “Iron-Mad” Wilkinson, whose patent for boring cannon barrels for James Watt marked both the beginning of the concept of precision and the birth of the Industrial Revolution.
John Wilkinson was born into the iron trade. His father, Isaac, originally a Lakeland shepherd, discovered by fortuitous chance the presence of both ore and coal on his pastures and so became in time an ironmaster, a trade very much of its time. The word describes the owner of a family of furnaces, and one who used them to smelt and forge iron from its ore with either charcoal (which stripped England of too-large tracts of forest) or (as an environmentally responsible alternative) coal that had been half burned and transmuted into coke. John himself, uncomfortably born, it was said, bumping along in a market cart while his mother was en route to a country fair, became fascinated by white heat and molten metal and the whole process of taking mere rocks that lay underground and creating useful things simply by violently heating and hammering them. He learned the trade at the various places in the English Midlands and the Welsh Marches where his father settled down, and was sufficiently adept that by the early 1760s, by now married into money and owning a considerable foundry in the Welsh-English borderland village of Bersham, he began in earnest the production, according to the firm’s first ledger, of “calendar-rolls, malt mill rolls, sugar rolls, pipes, shells, grenades and guns.” It was the final item on the list that would give the tiny village of Bersham, along with the man who would become its most prosperous resident and its largest employer, a unique place in world history. Bersham, which lies in the valley of the River Clywedog, enjoys an indisputable though half-forgotten role both in the founding of the Industrial Revolution and in the story of precision. For it is here that on January 27, 1774, John Wilkinson, whose local furnaces, all fired by coal, were producing a healthy twenty tons of good-quality iron a week, invented a technique for the manufacture of guns. The technique had an immediate cascade effect very much more profound than any he ever imagined, and of greater long-term importance, I would argue, than the much more famed legacies of his friend and rival Abraham Darby III, who threw up the still-standing great Iron Bridge of Coalbrookdale that attracts tourists by the million today, and is regarded by most modern Britons as the Industrial Revolution’s most potent and recognizable symbol.
Wilkinson filed a patent, Number 1063—it was still quite early in the history of British patents, which were first issued in 1617—with the title “A New Method of Casting and Boring Iron Guns or Cannon.” By today’s standards, his “new method” seems almost pedestrian and an all-too-obvious improvement in cannon making. In 1774, however, a time when naval gunnery all over Europe was enjoying a period of sudden scientific improvement in both technique and equipment, Wilkinson’s ideas came as a godsend. Up until then, naval cannons (most particularly the thirty-two-pound long gun, a standard on first-rate ships of the line in the Royal Navy, often ordered a hundred at a time when a new vessel was launched) were cast hollow, with the interior tube through which the powder and projectile were pushed and fired preformed as the iron was cooling in its mold. The cannon was then mounted on a block and a sharp cutting tool advanced into it at the end of a long rod, with the idea of smoothing out any imperfections on the tube’s inner surface. The problem with this technique was that the cutting tool would naturally follow the passage of the tube, which might well not have been cast perfectly straight in the first place. This would then cause the finished and polished tube to have eccentricities, and the inner wall of the cannon to have thin spots where the tool wandered off track. And thin spots were dangerous—they meant explosions and bursting tubes and destroyed cannon and injuries to the sailors who manned the notoriously dangerous gun decks. The poor quality of early eighteenth-century naval artillery pieces led to failure rates that decidedly alarmed the sea lords at Admiralty headquarters in London. Then came John Wilkinson and his new idea. He decided that he would cast the iron cannon not hollow but solid. This, for a start, had the effect of guaranteeing the integrity of the iron itself—there were fewer parts that cooled early, for example, as would happen if there were a form installed to create the inner tube. A solid cylindrical chunk of iron, heavy though it might have been, could, if carefully made, come out of the Bersham furnaces without the air bubbles and spongy sections (“honeycomb problems,” as they were called) for which hollow-cast cannon were then notorious. Yet the true secret was in the boring of the cannon hole. Both ends of the operation, the part that did the boring and the part to be bored, had to be held in place, rigid and immovable. That was a canonical truth, as true today as it was in the eighteenth century, for to cut or polish something into dimensions that are fully precise, both tool and workpiece have to be clasped and clamped as tightly as possible to secure immobility. Moreover, in the specific case of gun barrels, there could be no allowable temptation for the boring tool to wander while the bore was being made. This was the reason the cannons were cast solid rather than hollow. To do otherwise was to risk explosive catastrophe. In the first iteration of Wilkinson’s patented process, this solid cannon cylinder was set to rotating (a chain was wrapped around it and connected to a waterwheel) and a razor-sharp iron-boring tool, fixed onto the tip of a rigid base, was advanced directly into the face of the rotating cylindrical workpiece. This created a brand-new hole, straight and precise, as the boring tool was pushed directly into the iron.
“With a rigid boring bar and the bearing true,” wrote a recent biographer of Wilkinson’s, somewhat poetically, “accuracy was bound to ensue.” In later versions, it was the cannon that remained fixed and the tool, itself now connected to the waterwheel, that was turned. In theory, and provided that the turning bar itself was rigid; that it was supported at both ends and so maintained its rigidity; and that, as it was advanced into the hole it was boring into the cylinder face, it did not bend or turn or hesitate or waver in any way, a hole of great accuracy could be created. Indeed, that is just what was obtained. Cannon after cannon tumbled from the mill, each accurate to the measurements the navy demanded, each one, once unbolted from the mill, identical to its predecessor, each one certain to be the same as the successor that would next be bolted on in its place. The new system worked impeccably from the very start, encouraging Wilkinson to apply for and indeed receive his famous patent. Instead of an eccentrically drilled-out version of a previously cast hole in a cannon barrel that was already peppered with flaws and weak spots, and which, if it fired at all, would hurl the ball or chain shot or shell wildly through the air, the Royal Navy now received from the Bersham works wagonloads of guns that had a much longer service life and would fire their grapeshot or canister shot or explosive shells exactly at their intended target. The improvements were all thanks to the efforts of John Wilkinson, ironmaster. Already a wealthy man, Wilkinson prospered mightily as a result: his reputation soared, and new orders flooded in. Soon, his ironworks alone were producing fully one-eighth of all the iron made in the country, and Bersham was firmly set to be a village for the ages. Yet what would elevate Wilkinson’s new method to the status of a world-changing invention, and Bersham with it from the local to the world stage, came the following year, 1775, when he started to do serious business with James Watt. He would then marry his new cannon-making technique—this time, incautiously, without a brand-new patent—with the invention that Watt was just then in the throes of completing, the invention that would ensure that the Industrial Revolution and much else besides and beyond were powered by the cleverly harnessed power of steam. The principle of a steam engine is familiar, and is based on the simple physical fact that when liquid water is heated to its boiling point it becomes a gas. Because the gas occupies some 1,700 times greater volume than the original water, it can be made to perform work. Many early experimenters realized this. An ironmonger from Devon named Thomas Newcomen was the first to turn the principle into a product: he connected a boiler, via a tube with a valve, to a cylinder with a piston, and the piston to a beam on a rocker. Each time steam from the boiler entered the cylinder, the piston was pushed upward, the beam tilted, and a small amount of work (a very small amount) could be performed by whatever was on the far end of the beam. Newcomen then realized he could increase the work by injecting cold water into the steam-filled cylinder, condensing the steam and bringing it back to 1/1,700 of its volume—creating, in essence, a vacuum, which enabled the pressure of the atmosphere to force the piston back down again. This downstroke could then lift the far end of the rocker beam and, in doing so, perform real work.
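The two numbers doing the work in that description—the roughly 1,700-fold expansion of water into steam, and the push of the atmosphere on a piston above a vacuum—can be checked with a few lines of schoolbook physics. The sketch below is mine, not the book’s, and the 30-inch bore in it is purely an illustrative assumption, not a dimension quoted in the text.

```python
# A rough, illustrative check (not from the book) of the Newcomen arithmetic.
import math

R = 8.314        # gas constant, J/(mol*K)
T = 373.15       # boiling point of water, K
P = 101_325      # one atmosphere, Pa

# 1. One cubic centimetre (one gram) of water, boiled into steam at
#    atmospheric pressure and treated as an ideal gas:
moles = 1.0 / 18.0
steam_cm3 = moles * R * T / P * 1e6
print(f"1 cm^3 of water -> about {steam_cm3:.0f} cm^3 of steam")   # roughly 1,700

# 2. The force of the atmosphere on a piston once the steam beneath it has
#    been condensed away. The 30-inch bore is an assumed, illustrative figure.
bore_m = 30 * 0.0254
area_m2 = math.pi * (bore_m / 2) ** 2
force_n = P * area_m2
print(f"Atmosphere on a 30-inch piston: about {force_n/1000:.0f} kN, "
      f"or {force_n/9.81/1000:.1f} tonnes-force per stroke")
```

Several tonnes of downward push on the piston per stroke, in other words, conjured from nothing more exotic than boiling water and then chilling it.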
The beam could lift floodwater, say, out of a waterlogged tin mine. Thus was born a very rudimentary kind of steam engine, almost useless for any application beyond pumping water, but given that early eighteenth-century England was awash with shallow mines that were themselves awash with water, the mechanism proved popular and useful to the colliery community. The Newcomen engine and its like remained in production for more than seventy years, its popularity beginning to lessen only in the mid-1760s, when James Watt, who was then employed making and repairing scientific instruments six hundred miles away at the University of Glasgow, studied a model of its workings closely and decided, in a series of moments of the purest genius, that it could be markedly improved. It could be made efficient, he thought. It could possibly be made extremely powerful. And it was John Wilkinson who helped to make it so—once, that is, Watt had had his strokes of the purest genius. These can be summed up simply enough. For weeks, alone in his rooms in Glasgow, Watt puzzled over a model of the Newcomen engine, a machine famed for being so woefully inadequate, so inefficient, so wasteful of all the heat and energy expended upon it. Watt, patiently trying out various ways to improve on Newcomen’s invention, is reported to have remarked wearily that “Nature has a weak side, if only we can find it out.” He finally did so, according to legend, one Sunday in 1765, as he was taking a restorative walk through a park in central Glasgow. He realized that the central inefficiency of the engine he was examining was that the cooling water injected into the cylinder to condense the steam and produce the vacuum also managed to cool the cylinder itself. To keep the engine running efficiently, though, the cylinder needed to be kept as hot as possible at all times, so the cooling water should perhaps condense the steam not in the cylinder but in a separate vessel, keeping the vacuum in the main cylinder, which would thus retain the cylinder’s heat and allow it to take on steam once more. Moreover, to make matters even more efficient, the fresh steam could be introduced at the top of the piston rather than the bottom, with stuffing of some sort placed and packed into the cylinder around the piston rod to prevent any steam from leaking out in the process. These two improvements (the inclusion of a separate steam condenser and the changing of the inlet pipes to allow for the injection of new steam into the upper rather than the lower part of the main cylinder)—improvements so simple that, at this remove, they seem obvious, even though, to James Watt in 1765, they were anything but—changed Newcomen’s so-called fire-engine into a proper and fully functioning steam-powered machine. It became in an instant a device that in theory could produce almost limitless amounts of power.
A cross section of a late eighteenth-century Boulton and Watt steam engine. The main cylinder, C, would have been bored by John Wilkinson, the piston, P, fitting snugly inside it to the thickness of an English shilling, a tenth of an inch.
As he began what would be a full decade of testing and prototype building and demonstrating and seeking funds (during which time he moved south from Scotland to the vibrantly industrializing purlieus of the English Midlands), Watt sought and was swiftly awarded a patent: Number 913 of January 1769.
It had a deceptively innocuous title: “A New Invented Method of Lessening the Consumption of Steam and Fuel in Fire-Engines.” The modest wording belies the invention’s importance: once perfected, it was to be the central power source for almost all factories and foundries and transportation systems in Britain and around the world for the next century and more. What is especially and additionally noteworthy, though, is that a historic convergence was in the making. For, living and working nearby in the Midlands, and soon to produce a patent himself (the already noted Number 1063 of January 1774, an exact one hundred fifty patents and exactly five years later than James Watt’s), was no less an inventor than John Wilkinson, ironmaster. By then, Wilkinson’s amiable madness was making itself felt throughout the ferrous community: all came to learn that he had made an iron pulpit from which he lectured, an iron boat he floated on various rivers, an iron desk, and an iron coffin in which he would occasionally lie and make his frightening mischief. (Women were in plentiful attendance, despite his being a somewhat unattractive man with a massively pockmarked face. He had a vigorous sex drive, fathering a child at seventy-eight by way of a maidservant, a calling of which he was inordinately fond. He kept a seraglio of three such women at one time, each one unaware of the others.) Still, Wilkinson could and would free himself from these distractions, and by 1775, he and Watt, though of very different temperaments, had met and befriended each other, though it was a friendship based more on commerce than affection. Before long, their two inventions were, and to their mutual commercial benefit, commingled. Wilkinson’s “New Method of Casting and Boring Iron Guns or Cannon” was married to Watt’s “New Invented Method of Lessening the Consumption of Steam and Fuel in Fire-Engines.” It was a marriage, it turned out, of both convenience and necessity. James Watt, a Scotsman renowned for being pessimistic in outlook, pedantic in manner, scrupulous in affect, and Calvinist in calling, was obsessed with getting his machinery as right as it could possibly be. While he was making and repairing and improving the scientific instruments in his workshop in Glasgow, he became well-nigh immured by his passion for exactitude, to much the same degree as had John Harrison in his clock-making workshop in Lincolnshire. Watt was quite familiar with the early dividing engines and screw thread cutters and lathes and other instruments that were then helping engineers take their first tentative steps toward machine perfection. He was accustomed to instruments that were carefully built and properly maintained, and that worked as they were intended to. He was mortally offended, then, when things went wrong, when inefficiencies were compounded, and when the monster iron engines he was now trying to build in the giant Boulton and Watt factory in Soho performed less well than the brass-and-glass models on which he had experimented back up in Scotland. His first prototype large engines were spectacular behemoths: thirty feet tall, with a main steam cylinder four feet in diameter and six feet long, a coal-fired boiler, and a separate steam condenser, all massive. All the working parts were connected by a convoluted spiderweb of brass pipes and well-oiled valves and levers, with a spinning two-ball governor that prevented runaways. 
Above it all was a heavy wooden beam that rocked back and forth with metronomic regularity, turning a huge iron flywheel that in turn worked a pump that gushed water or compressed air or performed other tasks fifteen times a minute. Once at full power, the engine produced a concatenation of noise and heat and a juddering, thudding, stomach-churning intensity that somehow seemed an impossible consequence of merely heating water up to its natural boiling point. Yet everywhere, perpetually enveloping his engine in a damp, hot, opaque gray fog, were billowing clouds of steam. It was this, this scorching miasma of invisibility, that incensed the scrupulous and pedantic James Watt. Try as he might, do as he could, steam always seemed to be leaking, and doing so not stealthily but in prodigious gushes, and most impudently of all, it was doing so from the engine’s enormous main cylinder. He tried blocking the leak with all kinds of devices, things, and substances. The gap between the piston’s outer surface and the cylinder’s inner wall should, in theory, have been minimal, and more or less the same wherever it was measured. But because the cylinders were made of iron sheets hammered and forged into a circle, and their edges then sealed together, the gap actually varied enormously from place to place. In some places, piston and cylinder touched, causing friction and wear. In other places, as much as half an inch separated them, and each injection of steam was followed by an immediate eruption from the gap. This is where the blocking came in: Watt tried tucking in pieces of linseed oil–soaked leather; stuffing the gap with a paste made from soaked paper and flour; hammering in corkboard shims, pieces of rubber, even dollops of half-dried horse dung. A solution of sorts came when he decided to wrap the piston with a rope and tighten what he called a “junk ring” around the compressible rope. Then, by the purest accident, John Wilkinson, in Bersham, asked for an engine to be built for him, to act as a bellows for one of his iron forges—and in an instant, he saw and recognized Watt’s steam-leaking problem, and in an equal instant, he knew he had the solution: he would apply his cannon-boring technique to the making of cylinders for steam engines. So, without taking the precautionary step of filing a new patent for this entirely new application of his method, he proceeded to do with the Watt cylinders exactly what he had done with the naval guns. He had Watt’s workmen haul a solid iron cylinder blank the seventy miles across to Bersham. He then strapped the blank (in this case, for the very engine that he, as customer, eventually wanted, so six feet long and thirty-eight inches in diameter) onto a firmly fixed stage, and then secured it with heavy chains to make certain it did not move by so much as a fraction of an inch. He then fashioned a massive cutting tool of ultrahard iron that was three feet across (which should in theory have produced a cut that left a thirty-eight-inch-diameter cylinder with one-inch-thick walls) and bolted it securely to the end of a stiff iron rod eight feet long. This he supported at both ends and mounted onto a heavy iron sleigh that could be ratcheted slowly and steadily into the huge iron workpiece. 
As soon as he was ready to begin working the piece, he directed, through a hose, a water-and-vegetable-oil mixture both to cool the thrashing metals and to wash away any fragments of cut iron; opened the water valve for the millrace and wheel that would set the rod and its cutting tool turning; and slowly and steadily, notch by notch by notch, set the rod moving forward until its cutting edge began chewing away at the face of the iron billet. After just half an hour of searing heat and grinding din, the cylinder was cut. The tool, hot but barely blunted, was withdrawn. The hole, three feet in diameter, looked smooth and clean, straight and true. Using a set of chains and blocks, he stood the heavy cylinder (now rather less heavy, as so much iron had been bored away) upright, on its end. The piston, fractionally less than three feet in diameter itself and smeared with lubricating grease, was carefully lifted up and over the lip of the cylinder and down into its depths. There was, I like to think, a round of cheers, for the piston slipped noiselessly and snugly into the cylinder and could be lifted up and down with ease and without any apparent leakage of air, of grease, of anything. It then took Watt just a few days, once the disassembled pieces were brought back to his works in Soho, to mount the cylinder in pride of place in what would now be his, and the world’s, first working full-scale single-acting engine. He and his engineers then added all the supplementary parts (the pipes, the separate condenser, the boiler, the rocking arm, the governor, the water tank, the flywheel) and then loaded the firebox with coal, added a primer, lit the fire, and, once the water was hot enough to set steam pouring from the safety valve, opened the main valve. With an enormous chuff-chuff-chuff, the piston began to move up and down, up and down, within the newly machined cylinder. The rocking beam above then began to oscillate up and back; the connecting rod on the far side started to move up and down, up and down; the sun-and-planet gears on the flywheel started to move; and then the huge wheel itself, tons of solid iron that would in effect store the engine’s power, started to turn. Within moments, with the governor’s shiny couplet of balls spinning merrily to keep matters in check, the engine was roaring along at full power, thumping and thudding and whirring and chuffing—and now all perfectly visibly because, for the first time since Watt had begun his experiments, there was no leaking steam. The engine was working at maximum efficiency: it was fast, it was powerful, and it was doing just what was demanded of it. Watt beamed with delight. Wilkinson had solved his problem, and the Industrial Revolution—we can say now what those two never imagined—could now formally begin. And so came the number, the crucial number, the figure that is central to this story, that which appears at the head of this chapter and which will be refined in its exactitude in all the chapters that follow. This is the figure of 0.1—one-tenth of an inch. For, as James Watt later put it, “Mr. Wilkinson has bored us several cylinders almost without error, that of 50 inch diameter … does not err the thickness of an old shilling at any part.” An old English shilling had a thickness of a tenth of an inch. This was the tolerance to which John Wilkinson had bored out his first cylinder. He might in fact have done even better than that.
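As a modern footnote of my own—the unit conversion is the only thing added here, the two figures are Watt’s—the shilling tolerance can be put into today’s units and expressed as a proportion of the bore he mentions:

```python
# What "the thickness of an old shilling" means as a machining tolerance,
# using only the two figures quoted in the text.
shilling_inches = 0.1     # thickness of an old English shilling
bore_inches = 50          # the cylinder diameter in Watt's letter

print(f"0.1 inch is about {shilling_inches * 25.4:.1f} mm")
print(f"On a 50-inch bore that is a relative error of {shilling_inches / bore_inches:.1%}")
```

A fifth of one percent, in other words—crude by the standards of the chapters that follow, but in 1776 quite without precedent.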
In another letter, written rather later—by which time Wilkinson had bored no fewer than five hundred cylinders for Watt’s engines, which were being snapped up by factories and mills and mines all over the country and beyond—the Scotsman boasted that Wilkinson had “improved the art of boring cylinders so that I promise upon a seventy two inch cylinder being not farther distant from absolute truth than the thickness of an old sixpence at the worst part.” An old English sixpence was even slighter: half of a tenth of an inch, or 0.05 inches. Yet this is a quibble. Whether the thickness of a shilling coin or the thinness of an old sixpence, it does not really matter. The fact is that a whole new world was being created. Machines had now been made that would make other machines, and make them with accuracy, with precision. All of a sudden, there was an interest in tolerance, in the clearance by which one part was made to fit with or into another. This was something quite new, and it begins, essentially, with the delivery of that first machine on May 4, 1776. The central functioning part of the steam engine was possessed of a mechanical tolerance never before either imagined or achieved, a tolerance of 0.1 inch, and maybe even better.
ON THE FAR side of the Atlantic Ocean, and precisely two months after the culmination of these events, on July 4, 1776, a whole new political world was to be created. The United States of America was born, with implications unimagined by all. It was very shortly thereafter that the new nation’s principal representative in Europe, Thomas Jefferson, heard tell of these miraculous mechanical advances and started to ponder how his own faraway country might well take advantage of developments that appeared to him to have the very greatest potential. Maybe, Jefferson declared, they could form the basis for a new trade well suited to his new country. Maybe, the engineers replied, we can do better than we have done already, and using their own arcane language of numbers, they translated their ambitions: maybe we can make and machine and manufacture metal pieces in America to a tolerance much finer than John Wilkinson’s 0.1. Maybe we can be adroit enough to reach down to 0.01. Maybe better than that—maybe to 0.001. Who could possibly know? As with the new nation, these visionary engineers wondered, so perhaps with the new machines. As it happened, the engineers—in England, mainly, but also, and most significantly for the next part of the story, in France—would do a great deal better than they ever supposed. The genie of accuracy was now out of the bottle. True precision was now out of the gate, and moving fast.
Chapter 2
(TOLERANCE: 0.0001)
Extremely Flat and Incredibly Close
It is to the exactitude and accuracy of our machine tools that our machinery of the present time owes its smoothness of motion and certainty of action.
—SIR WILLIAM FAIRBAIRN, BT. (1862), REPORT OF THE BRITISH ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE
On the north side of London’s Piccadilly, overlooking Green Park and sandwiched between the quarters of the aged and imperturbable Cavalry Club to the west and a rather more ephemeral Peruvian-style ceviche restaurant on its eastern side, stands Number 124, these days an elegant but somewhat anonymous structure providing offices for the discreet and service apartments for the wealthy.
Since 1784, when this far-western end of the great boulevard was still ripe for development, the address had been the home and atelier of a cabinet, engine, and lock maker named Joseph Bramah. On fair-weather days some six years after its opening, when Bramah and Company was an established and familiar small firm, modest gatherings of curious passersby would assemble outside to peer into the front bow window, puzzling at a challenge so difficult that it went unanswered for more than sixty years. There was just a single object on view in the window, placed on a velvet cushion like a religious icon. It was a padlock, oval shaped, of modest size, and with a smooth and uncomplicated external appearance. On its face was written, in a small script legible only to those who pressed their faces close to the window glass, the following words:
THE ARTIST WHO CAN MAKE AN INSTRUMENT THAT WILL PICK OR OPEN THIS LOCK SHALL RECEIVE 200 GUINEAS THE MOMENT IT IS PRODUCED.
Joseph Bramah, locksmith extraordinaire, also invented the fountain pen, a device for keeping beer cool and under pressure in a pub basement, and a machine for counting banknotes.
The designer of this boastfully unbreakable lock was the firm’s principal, Joseph Bramah. Its maker, however, was not Bramah but a then-nineteen-year-old former blacksmith’s apprentice named Henry Maudslay, whom Bramah had taken on the previous year, entirely because of Maudslay’s reputation for having a formidable skill in delicate machining. It would not be until 1851 that the Bramah lock was successfully—although, as we shall see in a later chapter, controversially—picked and the very handsome pledge redeemed. And in the years leading up to this event (which neither man would live to witness), these two men, Bramah and Maudslay, proved themselves to be engineers supreme. They invented all manner of intriguing new devices, and they effectively and independently wrote the rule books for the precise world that was beginning to emerge as a consequence of (or, at least, in the wake of) John Wilkinson’s achievements with his cylinder-boring machine at Bersham. Some of the two men’s inventions have faded away into history; some others, however, have survived as the foundations on which many of today’s most sophisticated engineering achievements have been built. Though Maudslay remains today the better-known figure, with a legacy recognized by most engineers, Bramah was at the time perhaps the more showily ingenious of the pair. His first invention was dreamed up while he lay in bed after a fall, and must rank as the least romantic: for a London population that sorely needed an improvement in public hygiene, he built water closets, and he patented his ideas for a system of flaps and a float and valves and pipes that made the device both self-cleansing (flushing, indeed, for the first time) and free from the usual risk of freezing in winter that created unpleasant results for all. He made a small fortune from this creation, selling six thousand in the first twenty years of production, and a Bramah WC was still the centerpiece of the civilized English middle-class bathroom right up until Victoria’s Jubilee, a hundred years later.
Bramah’s interest in locks, which required far more intricacy and precise workmanship than a toilet, of course, seems to have started when he was elected in 1783 a member of the newly formed (and still there, in its original home) Royal Society for the Encouragement of Arts, Manufactures and Commerce. What is now simply the Royal Society of Arts, the RSA, back in the eighteenth century had six divisions: Agriculture, Chemistry, Colonies and Trade, Manufactures, Mechanicks (spelled thus), and most quaintly, the Polite Arts. Bramah not unnaturally opted to attend most of the Mechanicks meetings and, soon after joining, rocketed to prominence by the simple act of picking a lock. Not so simply, actually: in September 1783, a Mr. Marshall had submitted for consideration what he declared was a formidably unpickable lock, and had a local expert named Truelove worry away at it with a quiverful of special tools for an hour and a half, before accepting defeat. Then, from the back of the audience stepped Joseph Bramah, who quickly fashioned a pair of instruments and opened the lock in fifteen minutes flat. A buzz of excitement went around the room: they were clearly in the presence of a most Mechanickal man. Locks were a British obsession at the time. The social and legislative changes that were sweeping the country in the late eighteenth century were having the undesirable effect of dividing society quite brutally: while the landed aristocracy had for centuries protected itself in grand houses behind walls and parks and ha-has, and with resident staff to keep mischief at bay, the enriched beneficiaries of the new business climate were much more accessible to the persistent poor. They and their possessions were generally both visible and, especially in the fast-growing cities, nearby; they tended to live in houses and on streets within earshot and slingshot of the vast armies of the impoverished. Envy was abroad. Robbery was frequent. Fear was in the air. Doors and windows needed to be bolted. Locks had to be made, and made well. A lock such as Mr. Marshall’s, pickable in fifteen minutes by a skilled man, and by a desperate and hungry man maybe in ten, was clearly not good enough. Joseph Bramah decided he would design and make a better one. He did so in 1784, less than a year after picking the Marshall lock. His patent made it almost impossible for a burglar with a wax-covered key blank, the tool most favored by the criminals who could use it to work out the position of the various levers and tumblers inside a lock, to divine what was beyond the keyhole, inside the workings. Bramah’s design, which he patented that August, had the various levers inside a lock rise or fall to different positions when the key was inserted and turned to release the bolt, but then had those same levers return to their initial positions once the bolt had been shot. The effect of this was to make the device almost burglar-proof, for no amount of foraging with a wax key blank would ever allow a picklock to work out where the levers needed to be (as they weren’t there anymore) in order to free the bolt. 
Once Bramah had come up with this basic mechanical premise, it remained for him, with great cleverness and elegance, to form the entire lock into a cylindrical shape, with its levers not so much rising and falling under the influence of gravity as moving in and out along the radii of the cylinder under the impress of the key's various teeth, and then moving back to their original positions with the aid of a spring, one for each lever. The entire lock could thus be rendered as a small tube-shaped brass barrel, which could be easily fitted into a tube-shaped cavity in a wooden door or an iron safe, and with the deadbolt flush to the door's outer edge (when the lock was open) or settled into its brass cavity in the door frame (when securely closed). Joseph Bramah would go on to invent many more contraptions and concepts during his life, many of them having nothing to do with locks, but involving his particular other fascination with the behavior of liquids when subjected to pressure. He invented the hydraulic press, for example, with its vast importance in industry worldwide. More trivially, he launched onto the market a primitive form of fountain pen and drew designs for a propelling pencil; more lastingly, he made the beer engine, which is still employed by the more traditionally minded innkeepers, and which would allow beer kept cool in a cellar to be pressure-delivered to thirsty customers in the bar above. (This invention obviated the need for the bartender to stagger up and down the cellar stairs, lugging fresh barrels of ale.) Draft beer drinkers today have little cause to remember the name "Bramah," though there is a pub in Lancashire named for him. Likewise, few banknote printers know that it was Joseph Bramah who made the first machine that could cleverly ensure that their thousands of identical bills each bore a different sequential number. He also made an engine for planing large wooden planks, another for making paper, and he forecast that, one day, large screws would be used to propel big ships through the water. Yet it is really only by way of his lock making that Bramah's name has now formally entered the English language. True, one can still find in literature references to a Bramah pen and a Bramah lock—the Duke of Wellington wrote admiringly of each, as did Walter Scott and Bernard Shaw. Yet when the word is used alone—and Dickens used it thus on numberless occasions, in The Pickwick Papers, in Sketches by Boz, in The Uncommercial Traveller—it is a reminder that at least for the Victorian citizenry, his name had become an eponym: one used a Bramah to open a Bramah, one's home was secured with a Bramah, one gave a Bramah to a favored friend so he or she might visit at all hours, come what may. Only when Mr. Chubb and Mr. Yale arrived on the scene (noted by the Oxford English Dictionary as first making it into the language in 1833 and 1869, respectively) did Joseph Bramah's lexical monopoly hit the buffers. What made a Bramah lock so good was its vastly complicated internal design, of course, but what made it so lastingly good was the precision of its manufacture. And that was less the work of its inventor than of the man—the boy, really—whom Bramah hired to make copious numbers of his device and to make them well, to make them fast, and to make them economically.
Henry Maudslay was eighteen years old when Bramah lured him away as an apprentice: he would go on to become one of the most influential figures in the early days of precision engineering, his influence being felt to this day both in his native Britain and around the world. The very young Maudslay, "a tall, comely young fellow" by the time Bramah hired him, cut his teeth in the Woolwich Royal Arsenal in southeast London. Working first as a twelve-year-old powder monkey—small boys, fleet of foot, were used by the Royal Navy to bring gunpowder up from the ships' magazines to the gun deck—he was then moved to the carpenter's shop, only to pronounce himself bored by the inaccuracy of wood. It was starkly clear to all who employed him that the youngster much preferred metal. They looked away when he smuggled himself into the dockyard smithy, and they said nothing when he developed a sideline in making a range of useful and very handsome trivets out of cast-off iron bolts. IN 1789, JOSEPH Bramah cut an anxious figure. The political situation across the Channel was causing an influx of terrified French refugees, most of them bound for London, where the more nervously xenophobic residents of England's capital suddenly started to demand ever more security for their homes and businesses. Bramah, with his patent-protected monopoly, was caught in a bind: he alone could make his locks, but neither he nor any engineer he could find had the ability to make them in sufficient numbers at a low enough price. Most men who called themselves engineers may have been adept at the cruder crafts—at thumping ingots of heat-softened iron with heavy hammers and then working to shape the crudely formed results with anvils, chisels, and, most especially, files—but few had a great feel for delicacy, for the construction of (and the word had only recently been adopted) mechanisms. Change was coming, though. Workers at the smithies of eighteenth-century London were a close-knit group, and word eventually did reach Bramah that a particular youngster at Woolwich was startlingly unlike his older peers and, rather than bashing hunks of iron, was apparently crafting metal pieces of an unusual, fastidious daintiness. Bramah interviewed the teenage Maudslay. Though Bramah took to him immediately, he was only too well aware that the custom was for any would-be entrant to the trade to serve a seven-year apprenticeship. However, commercial need trumped custom: with would-be patrons beating down his door back on Piccadilly, Bramah had no time to spare for the niceties, decided to take a chance, and hired the youngster on the spot. His decision was to change history. Henry Maudslay turned out to be a transformative figure. First of all, he solved Bramah's supply problems almost overnight—but not by the conventional means of hiring workers who would make the locks one by one through the means of their own craftsmanship. Instead, and just like John Wilkinson two hundred miles west and thirteen years earlier, Maudslay created a machine to make them. He made a machine tool: in other words, a machine to make a machine (or, in this case, a mechanism). He built a whole family of machine tools, in fact, that would each make, or help to make, the various parts of the fantastically complicated locks Joseph Bramah had designed. They would make the parts, they would make them fast and well and cheaply, and they would make them without the errors that handcrafting and the use of hand tools inevitably bring in their train.
The machines that Maudslay made would, in other words, make the necessary parts with precision. Three of his lock-making devices can be seen today in the Science Museum in London. One is a saw that cut the slots in the barrels; another—perhaps less a machine tool than a means of ensuring that production went along at high speed, with every part made exactly the same—is a quick-grip, quick-release vise, a fixture that would hold the bolt steady while it was milled by a series of cutters mounted on a lathe; and the third is a particularly clever device, powered by a foot-operated treadle, that would wind the lock's internal springs and hold them under tension as they were positioned and secured in place until the outer cover, a well-shined brass plate with the flamboyant signature of the Bramah Lock Company of 124 Piccadilly, London, inscribed on its face, was bolted on to finish the job. A fourth and, some would argue, most supremely important machine tool component also started to make its widespread appearance around this time. It would shortly become an integral part of the lathe, a turning device that, much like a potter's wheel, has been a mechanical aid to the betterment of human life since its invention in pharaonic Egypt. Lathes evolved very slowly indeed over the centuries. Perhaps the biggest improvement came in the sixteenth century, with the concept of the leadscrew. This was a long and (most often, in early times) wooden screw that was mounted under the main frame of the lathe and could be turned by hand to advance the movable end of the lathe toward or away from the fixed end. It could do so with a degree of precision; one turn of the handle might advance the movable part of the lathe by an inch, say, depending on the pitch of the leadscrew. It gave wood turners working on a lathe a much greater degree of control, and allowed them to produce things (chair legs, chess pieces, handles) of great decorative beauty, symmetric loveliness, and baroque complexity. Henry Maudslay then improved the lathe itself by many orders of magnitude—first by making it of iron, forging its structure stoutly and heavily, and at a stroke allowing it not merely to machine wooden items, but also to create symmetry out of shapeless billets of hard metal, which the flimsy lathes of old were incapable of doing. This alone might have been sufficient for us to remember the man, but then Maudslay employed one further component on his working lathes, a component whose origins are still debated, in an argument that continues to complicate the historiography of precision and precision engineering. Henry Maudslay, once a "tall, comely fellow," machined the innards of Bramah's locks and went on to become the founding father of precision toolmaking, mass production, and the key engineering concept of achieving perfect flatness. Specifically, the device in question mounted on Maudslay's lathes is known as a slide rest, a part that is massive, strongly made, and securely held but movable by way of screws, and is intended to hold any and all of the cutting tools. It is filled with gears that allow for the adjustment of the tool or tools to tiny fractions of an inch, to permit the exact machining of the parts to be cut.
The slide rest is necessarily placed between the lathe's headstock (which incorporates the motor and the mandrel that spins the workpiece around) and the tailstock (which keeps the other end of the workpiece secure). The leadscrew—Maudslay's was made of metal, not wood, and with threads much closer together and with a more delicate pitch than was possible for a wooden version—advances the slide rest and its tools along the length of the spinning workpiece. The tools held on the slide rest can then be moved across the path of travel dictated by the leadscrew, thereby allowing the tools to make holes in the workpiece, or to chamfer it or (in due course, once milling had been invented, a process related in the next chapter) mill it or otherwise shape it to the degree that the lathe operator demands. So the leadscrew carries the cutting tools longitudinally along the workpiece, and the slide rest that holds those tools moves them transversely, or in any number of directions across the path made by the leadscrew. Metal pieces can be machined into a range of shapes and sizes and configurations, and provided that the settings of the leadscrew and the slide rest are the same for every procedure, and the lathe operator can record these positions and make certain they are the same, time after time, then every machined piece will be the same—will look the same, measure the same, weigh the same (if of the same density of metal) as every other. The pieces are all replicable. They are, crucially, interchangeable. If the machined pieces are to be the parts of a further machine—if they are gearwheels, say, or triggers, or handgrips, or barrels—then they will be interchangeable parts, the ultimate cornerstone components of modern manufacturing. Of equally fundamental importance, a lathe so abundantly equipped as Maudslay's was also able to make that most essential component of the industrialized world, the screw. Over the centuries, there were many incremental advances in screw making, as we shall see, but it was Henry Maudslay (once he had invented or mastered or improved or in some other manner become intimately associated with the slide rest on his lathe) who then devised a means of cutting metal screws, efficiently, precisely, and fast. Much as Bramah had a lock in his workshop window on Piccadilly, for reasons of pride as much as for his famous challenge, so Maudslay, Sons and Field placed in the bow window of the firm's first little workshop, on Margaret Street in Marylebone, a single item of which the principal was most proud—and that was a five-foot-long, exactly made, and perfectly straight industrial screw made of brass. Technically, Maudslay was not the first to perfect a screw-making lathe. Twenty-five years earlier, in 1775, Jesse Ramsden, a scientific instrument maker from Yorkshire who was funded by the same Board of Longitude for which the clockmaker John Harrison had labored, and who was not allowed to patent his invention, had made a small and exquisite screw-cutting lathe. This could cut tiny screws with as many as one hundred twenty-five turns to the inch—meaning it would take one hundred twenty-five turns to advance the screw by one inch—and so would allow the tiniest adjustments to any device to which the screw was harnessed.
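The arithmetic of thread pitch is worth spelling out. For a single-start screw, one full turn advances the mating part by the pitch, which is simply one inch divided by the number of threads per inch. Here is a minimal sketch, using Ramsden's quoted figure of one hundred twenty-five threads to the inch and an invented coarse leadscrew for comparison; the function names are mine, not period terminology:

```python
# Thread-pitch arithmetic, sketched in modern terms. For a single-start screw,
# one full turn advances the mating part by the pitch: 1 / (threads per inch).

def advance_per_turn(threads_per_inch: float) -> float:
    """Linear travel, in inches, produced by one full turn of the screw."""
    return 1.0 / threads_per_inch

def travel(threads_per_inch: float, turns: float) -> float:
    """Total linear travel, in inches, after a given number of turns."""
    return turns * advance_per_turn(threads_per_inch)

# Ramsden's screw, as quoted above: 125 threads to the inch.
print(advance_per_turn(125))   # 0.008 inch per turn
print(travel(125, 125))        # 1.0 inch after 125 full turns

# An assumed coarse wooden leadscrew of 1 thread per inch, for comparison:
print(advance_per_turn(1))     # 1.0 inch per turn, as in the earlier example
```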
But Ramsden's was effectively a one-off machine, as delicate as a watch, meant for work with telescopes and navigational instruments, and in no way destined for the making of large-scale devices, made of much metal, that could work at great speed and maintain accuracy and be durable. What Maudslay had done with his fully equipped lathe was to create an engine that, in the words of one historian, would become "the mother tool of the industrial age." Moreover, with a screw that was made using his slide rest and his technique, and with a lathe constructed of iron and not with the wooden frame he and Bramah had used initially, he could machine things to a tolerance of one ten-thousandth of an inch. Precision was being born before all London's eyes. So, whoever did invent the slide rest can take the credit for the later precise manufacture of countless components of every conceivable size and shape and relevance to a million and one machined objects. The slide rest would allow for the making of myriad items, from door hinges to jet engines to cylinder blocks, pistons, and the deadly plutonium cores of atomic bombs—as well as, of course, the screw. But just who did invent it? Not a few say Henry Maudslay, and that he did so in Joseph Bramah's "secret workplace [which] contained several curious machines … constructed by Mr. Maudslay with his own hands." Others say it was Bramah. Still others reject the idea of Maudslay's involvement entirely, saying definitively that he did not invent it, nor ever claimed to have done so. Encyclopedias say the first slide rest was actually German, having been seen illustrated in a manuscript in 1480. Andrey Nartov, the Russian scientist who had the eighteenth-century title of personal craftsman to Tsar Peter the Great, was revered as the greatest teacher of lathe operation in Europe (and taught the methods to the then-king of Prussia) and is said to have made a working slide rest (and taken it to London to show it off) as early as 1718. And just in case the story from St. Petersburg has any doubters, a Frenchman named Jacques de Vaucanson quite provably made one in 1745. Chris Evans, a professor in North Carolina who has written extensively about the early years of precision engineering, notes the competing claims, and cautions against the "heroic inventor" treatment of the story. Far better to acknowledge, he says, that precision is a child of many parents, that its advances invariably overlap, that there are a great many indeterminate boundaries between the various disciplines to which the word precision can be attached, and that it was, in its early days, a phenomenon that evolved steadily over three centuries of ever-lessening bewilderment. It is, in other words, a story far less precise than its subject. That being said, Henry Maudslay's principal legacy is a wholly memorable one, for other inventions and involvements followed his association with Joseph Bramah, whose employ he left, in a huff, after his request for a raise—he was making thirty shillings a week in 1797—was turned down too curtly for his taste. MAUDSLAY PROMPTLY PROCEEDED to free himself from the circumscribed world of West London lock making, and he entered—one might say, he inaugurated—the very different world of mass production. He created in the process the wherewithal for making, in truly massive numbers, a vital component for British sailing ships.
He built the wondrously complicated machines that would, for the next one hundred fifty years, make ships' pulley blocks, the essential parts of a sailing ship's rigging that helped give the Royal Navy its ability to travel, police, and, for a while, rule the world's oceans. This all came about in a moment of the happiest chance, and just as with Bramah and the lock in Piccadilly, it involved a shopwindow (Henry Maudslay's) and the proud public showroom display of the five-foot-long brass screw Maudslay had made on his lathe and which he had placed there, center stage, as an advertisement of his skills. Soon after he set up the screw display, so naval legend has it, came the serendipitous moment. It involved the two figures who were going to create the pulley block factory, and who vowed to do so properly, to fill an urgent and growing need. A block-making factory of sorts had already been set up in the southern dock city of Southampton in the mid-eighteenth century, performing some of the sawing and morticing of the wooden parts, but much of the finishing work still had to be done by hand, and in consequence, the supply chain remained unreliable at best. And a reliable supply chain was seen to be vital for England's survival. Britain had been at war with France, on and off, for much of the late eighteenth century, and the arrival on the scene of Napoleon Bonaparte in the aftermath of the French Revolution convinced London that her forces needed to be at the ready for much of the early nineteenth century, too. Of the two British fighting forces, the army and the Royal Navy, it was the admirals who took the lion's share of the war budget, and Britain's docks were soon bristling with big ships ready to cast off at a moment's notice to give any French opponents, Napoleon's especially, a taste of the lash. Shipyards were busy building, dry docks were busy repairing, and the seas from the Channel to the Nile, from the Barbary Coast to Coromandel, were alive with great British men-o'-war, powerful and watchful, ceaselessly on the prowl. These were, of course, all sailing vessels. Mostly they were enormous craft with wooden hulls and copper-sheathed keels, with three decks ranged with cannon, with enormous masts of Norfolk Island pine supporting equally vast acreages of canvas sailcloth. And all this canvas was suspended, supported, and controlled by way of endless miles of rigging, of stays and yards and shrouds and footropes, most of which had to pass through systems of tough wooden pulleys that were known simply to navy men as blocks—pulley blocks, part of a warship's arrangements known within and beyond the maritime world as block and tackle. A large ship might have as many as fourteen hundred pulley blocks, which were of varying types and sizes depending on the task required. A block with a single pulley might be all that was needed to allow a sailor to hoist a topsail, say, or move a single spar from one location to another. The lifting of a very heavy object (an anchor, for example) might need an arrangement of six blocks, each with three sheaves, or pulleys, and with a rope passing through all six such that a single sailor might exert a comparatively modest pull in order to lift an anchor weighing half a ton. Block-and-tackle physics, taught still in some good primary schools, shows how even the most rudimentary pulley system can offer the greatest of mechanical advantage, and combines this power with an equally great degree of simplicity and elegance.
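The underlying arithmetic is simple enough to sketch. In an idealized, frictionless tackle, the effort needed equals the load divided by the number of rope parts supporting the moving block, and the price paid is that the rope must be hauled that many times farther than the load rises. The eighteen-part rigging and the 1,120-pound half ton below are assumptions drawn loosely from the description above, not naval records:

```python
# Idealized, frictionless block-and-tackle arithmetic: a sketch only.

def effort_required(load_lb: float, supporting_parts: int) -> float:
    """Pull, in pounds, needed to hoist the load, ignoring all friction."""
    return load_lb / supporting_parts

def rope_hauled(lift_ft: float, supporting_parts: int) -> float:
    """Feet of rope that must be hauled to raise the load by lift_ft."""
    return lift_ft * supporting_parts

# A single-sheave block changes the direction of pull but gives no advantage:
print(effort_required(100, 1))    # 100 lb of pull for a 100 lb spar

# Assume the six three-sheave blocks are rove so that 18 parts of rope support
# a half-ton (taken here as 1,120 lb) anchor; friction would raise the figure.
print(effort_required(1120, 18))  # roughly 62 lb of pull
print(rope_hauled(10, 18))        # 180 ft of rope hauled to lift it 10 ft
```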
Blocks for use on a ship are traditionally exceptionally strong, having to endure years of pounding water, freezing winds, tropical humidity, searing doldrums heat, salt spray, heavy duties, and careless handling by brutish seamen. Back in sailing ship days, they were made principally of elm, with iron plates bolted onto their sides, iron hooks securely attached to their upper and lower ends, and with their sheaves, or pulleys, sandwiched between their cheeks, and around which ropes would be threaded. The sheaves themselves were often made of Lignum vitae, the very same hard and self-lubricating wood that John Harrison used for the gear trains of some of his clocks: most modern blocks have aluminum or steel sheaves and are themselves made of metal, except where the desired look of the boat is old-fashioned, in which case there is much showy brassware and varnished oak. Hence the early nineteenth-century Royal Navy's acute concern. An increasingly fractious Napoleonic France lay just twenty miles away across the Channel, and countless problems were demanding Britain's maritime attention elsewhere: what principally concerned the admirals was not so much the building of enough ships as the supply of the vital blocks that would allow the sailing ships, to put it bluntly, to sail. The Admiralty needed one hundred thirty thousand of them every year, of three principal sizes, and for years past, the complexity of their construction meant that they could be fashioned only by hand. Scores of artisanal woodworkers in and around southern England were originally bent to the task, a supply system that proved notoriously unreliable. As hostilities at sea became ever more commonplace, as more and more ships were ordered, the drumbeat for a more efficient system became ever louder. The then–inspector general of naval works, Sir Samuel Bentham, finally decided he would act; he would sort things out. And in 1801, Bentham was approached by a figure named Sir Marc Brunel, who said he had in mind a specific scheme for doing so. Brunel, a royalist refugee from the very French instability currently so vexing the Lords of the Admiralty—though he had first immigrated to America and become New York's chief engineer before returning to England to marry—had sized up the mechanics of the block-making problem. He knew the various operations that were necessary to make a finished block—there were at least sixteen of them; a block, simple though it might have looked, was in fact as complex to make as it was essential to employ—and he had roughed out designs for machines that he thought could perform them. He sought and, in 1801, won a patent: "A New and Useful Machine for Cutting One or More Mortices Forming the Sides of and Cutting the Pin-Hole of the Shells of Blocks, and for Turning and Boring the Shivers, and Fitting and Fixing the Coak Therein." His design was, in more ways than one, revolutionary. He had one machine perform two separate functions—a circular saw, for example, could perform the duties of a mortice cutter as well. He had the surplus motion of one machine drive its neighbor, maintaining a kind of mechanical lockstep. The necessary coordination of the machines one with the other required that the work each machine performed be accomplished with the greatest precision, for a wrong dimension passed into the system by one wrongly set machine would act much as a computer virus does today, amplifying and worsening by the minute, ultimately infecting the entire system, and forcing it to shut down.
And rebooting a system of enormous iron-made steam-powered machines with flailing arms and whirling straps and thundering flywheels is not just a matter of pressing a button and waiting half a minute. Given the complexity of the system he had sold to the navy, it remained only for Brunel to find an engineer who would and could construct such a set of never-before-made machines, and ensure that they were capable of the repetitious making, with great precision, of the scores of thousands of the wooden pulley blocks the navy so keenly needed. This is where Henry Maudslay's window comes in. An old friend of Brunel's from his French days, another migrant, named M. de Bacquancourt, happened to pass by the Maudslay workshop on Margaret Street and saw, prominent in the bow window, the famed five-foot-long brass screw that Maudslay himself had made on his lathe. The Frenchman went inside, spoke to some of the eighty employees in the machine shop, and then to the principal himself, and came away firm in the belief that if one man in England could do the work Brunel needed, here he was. So Bacquancourt told Brunel, and Brunel met Maudslay out at Woolwich. As part of the interview, Brunel then showed the youngster an engineering drawing of one of his proposed machines—whereupon Maudslay, who could read engineering drawings with the same facility that musicians read sheet music, recognized it in an instant as a means of making blocks. Models of the proposed engines were constructed to show the Admiralty just what was envisioned, and Maudslay set to work, with a formal government commission. He was to devise and build, as specified in Brunel's drawings, the first precision-made machines in the world that would be established for the sole purpose of manufacturing items. In this case it was pulley blocks, but the items could just as well have been guns, or clocks, or, in time to come, cotton gins or motorcars—en masse. The project took him six years. The navy built an enormous brick structure in its dockyard at Portsmouth to accommodate the armada of engines it knew was coming. And one by one, first from his workshop back up on London's Margaret Street and then, as the company expanded, from a site in Lambeth, south of the River Thames, Maudslay's epoch-making machines started to arrive. There would be forty-three of them in total, each performing one or another of the sixteen separate tasks that transformed a felled elm tree into a pulley block to be sent to the naval warehouse. Each machine was built of iron, to keep it solid and sturdy and able to perform its allotted task with the kind of accuracy the navy contract demanded. So there were machines that sawed wood, that clamped wood, that morticed wood, that drilled holes and tinned pins of iron and polished surfaces and grooved and trimmed and scored and otherwise shaped and smoothed the blocks' way to completion. A whole new vocabulary was suddenly born: there were ratchets and cams, shafts and shapers, bevels and worm gears, formers and crown wheels, coaxial drills and burnishing engines. And all inside the Block Mills, as the structure was named in 1808, which was soon set to thundering activity.
Each of Maudslay's machines was powered by ever-rotating, flapping leather belts, themselves driven by long iron axles mounted on the ceiling, which were in turn set spinning by an enormous thirty-two-horsepower Boulton and Watt steam engine that roared and steamed and smoked outside the building, in its own noisy and dangerous three-story lair. The Block Mills still stand as testament to many things, most famously to the sheer perfection of each and every one of the hand-built iron machines housed inside. So well were they made—they were masterpieces, most modern engineers agree—that most were still working a century and a half later; the Royal Navy made its last pulley blocks in 1965. And the fact that many of the parts—the iron pins, for example—were all made by Maudslay and his workers to exactly the same dimensions meant that they were interchangeable, which had implications for the future of manufacturing more generally—as we shall soon see, when the concept of interchangeability was recognized by a future American president. But the Block Mills are famous for another reason, one with profound social consequences. They constituted the first factory in the world to be run entirely on the output of a steam engine. True, earlier machines had been driven by water, and so the concept of mechanization itself was not entirely new. But the scale and the might of what had been built in Portsmouth were different, and stemmed from a source of power not dependent on season or weather or on any external whim. Provided there were coal and water, and an engine built to the most exacting specifications, the factory it powered would run. The saws and the morticing devices and the drills of the future would thus be powered by engines. Such machines would (both here in Portsmouth and then very soon thereafter in a thousand other factories elsewhere, making other things by other means) no longer be turned and powered and manipulated by men. The workers who in their various wood shops had hitherto cut and assembled and finished the navy's pulley blocks had now become the first victims of machinery's cool indifference. Where more than a hundred skilled craftsmen had once worked, and had only just managed to fill the navy's insatiable appetite, now this thundering factory could feed it with ease, without ever breaking a sweat: the Portsmouth Block Mills would turn out the required one hundred thirty thousand blocks each year, one finished block every minute of every working day, and yet they required a crew of just ten men to operate them. Precision had created its first casualties, for the men who remained needed no special skills. They did no more than feed logs into the slicing machine hoppers and, eventually, take the finished blocks away and stack them inside the storehouses; or else they took their oilcans and their bunches of cotton waste and set to greasing and lubricating and polishing and keeping a weather eye on the clanging and clattering maelstrom of black-and-green and brass-trimmed behemoths, all endlessly mocking them, by revolving and spinning and belching and rocking and lifting and splitting and sawing and drilling, an immense orchestra of machinery that was crammed into the massive new building. The social consequences were immediate. On the plus side of the ledger, the machines were precise; the machines did accurate work. The Lords of the Admiralty declared themselves content.
Brunel received a check for the money saved in one year: £17,093. Maudslay received £12,000 and the acclaim of the public and of the engineering fraternity, and became generally regarded as one of the most important figures in the early days of precision engineering and one of the prime movers of the Industrial Revolution. The Royal Navy shipbuilding program would now go ahead as planned, and with the new squadrons and flotillas and fleets that were able to be created so swiftly, the British saw to it that the wars with France were duly ended, and to Britain's advantage. Napoleon was finally defeated, and was shipped off to Saint Helena in exile, traveling aboard a seventy-four-gun third-rate ship of the line, HMS Northumberland, with the smaller sixth-rate, twenty-gun HMS Myrmidon as escort. The rigging and other rope work of these two vessels were secured with about sixteen hundred wooden pulley blocks, almost all of them made in the Portsmouth Block Mills, sawed and drilled and milled with Henry Maudslay's iron engines, all operating under the supervision of ten unskilled navy contract workers. Still, the ledger had two sides, and on the minus side, a hundred skilled Portsmouth men had been thrown out of work. One imagines that over the days and weeks after they were handed their final pay and told to go, they and their families wondered just why this had happened, why it was that as the need for products demonstrably increased, the need for workers to construct these products began to shrink swiftly away. To this scattering of Portsmouth men, and to those who relied upon these men for security and sustenance, a sum total rather too few for any serious political consideration, the arrival of precision was not altogether welcome. It seemed to benefit those with power; it was a troubling puzzlement to those without. There was a social consequence, a reaction, although the best known, mainly because of its intermittent and spectacular violence, took place some hundreds of miles to the north of Portsmouth and involved another industry altogether. Luddism, as it is known today, was a short-lived backlash—it started in the northern Midlands in 1811—against the mechanization of the textile industry, with stocking frames being destroyed and mobs of masked men breaking into factories to stop the production of lace and other fine fabrics. The government of the day was spooked, and briefly introduced the death penalty for anyone convicted of frame breaking; some seventy Luddites were hanged, though usually for breach of other laws against riot and criminal damage. By 1816, the steam had gone out of the rioters, and the movement generally subsided. It never entirely died, though, and the word Luddite (from the movement's presumed leader, Ned Ludd) remains very much in today's lexicon, mainly as a pejorative term for anyone who resists the siren song of technology. That it does so serves as a reminder that, from its very beginnings, the world of precision-based engineering had social implications that were neither necessarily accepted nor welcomed by all. It had its critics and its Cassandras then; it has them still today, as we shall see. Henry Maudslay was by no means done with inventing.
Once his forty-three block-making machines were all thrumming along merrily down in Portsmouth, once his contract with the navy had been completed, once his reputation ("the creator of the industrial age") was secure, he came up with two further contributions to the universe of intricacy and perfection. One of them was a concept, the other a device. Both are essentials, even at this remove of two centuries, the concept most especially so. It involves the notion of flatness. It involves the notion that a surface may be created that is, as the Oxford English Dictionary has it, "without curvature, indentation or protuberance." It involves the creation of a base from which all precise measurement and manufacture can be originated. For, as Maudslay realized, a machine tool can make an accurate machine only if the surface on which the tool is mounted is perfectly flat, is perfectly plane, exactly level, its geometry entirely exact. An engineer's need for a standard plane surface is much the same as a navigator's need for a precise timekeeper, such as John Harrison's, or a surveyor's need for a precise meridian, such as that drawn in Ohio in 1786 to start the proper mapping of the central United States. The more prosaic matter of the making of a perfectly flat surface, a critical part of the machine-made world, required only a little ingenuity and a sudden leap of intuition—both these gifts combining in the late eighteenth century in the workshop of Henry Maudslay. The process is simplicity itself, and the logic behind it flawless. The Oxford English Dictionary illustrates it nicely with a quotation from the James Smith classic Panorama of Science and Art, first published in 1815, that "to grind one surface perfectly flat, it is … necessary to grind three at the same time." While it has to be assumed that this basic principle had been known for centuries, it is commonly believed that Henry Maudslay was the first to put it into practice, and create thereby an engineering standard that exists to this day. So accurate was Henry Maudslay's bench micrometer that it was nicknamed "the Lord Chancellor," as no one would dare have argued with it. Photograph courtesy of the Science Museum Group Collection. Three is the crucial number. You can take two steel plates and grind them and smooth them to what is believed to be perfect flatness—and then, by smearing each with a colored paste and rubbing the two surfaces together and seeing where the color rubs off and where it doesn't, as at a dentist's, an engineer can compare the flatness of one plate with that of the other. Yet this is a less than wholly useful comparison—there is no guarantee that they will both be perfectly flat, because the errors in one plate can be accommodated by errors in the other. Let us say that one plate is slightly convex, that it bulges out by a millimeter or so in its middle. It may well be that the other plate is concave in just the same place, and that the two plates then fit together neatly—giving the impression that the flatness of one is the same as the flatness of the other. Only by testing both these planes against a third, and by performing more grinding and planing and smoothing to remove all the high spots, can absolute flatness (with the kind of near-magical properties displayed by my father's gauge blocks) be made certain. AND THEN THERE was the measuring machine, the micrometer.
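Before coming to that micrometer, the three-plate logic just described can be made explicit with a small numerical sketch. It is a simplified, one-dimensional argument with invented error profiles, and it ignores the flipping and rotating that a real shop procedure requires:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)      # positions along a one-unit plate

# Two plates only: a convex error on one can nest into a matching concave
# error on the other, so the pair mates perfectly although neither is flat.
a = 0.001 * np.sin(np.pi * x)       # plate A bulges upward in the middle
b = -a                              # plate B is hollowed by the same amount
print(np.allclose(a + b, 0.0))      # True: A and B fit face to face

# Three plates: demand that every pair mates, so that a+b, b+c, and a+c are
# all zero everywhere. From a+b = 0 and b+c = 0 it follows that a = c; with
# a+c = 0 that forces a = 0, hence b = 0 and c = 0: all three must be flat.
```

If every one of the three possible pairs mates perfectly, each plate's deviation from flatness is forced to zero, a guarantee no two-plate test can give.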
Henry Maudslay is generally also credited with making the first of this kind of instrument, most particularly one that had the look and feel of a modern device. In fairness, it must be said that a seventeenth-century astronomer, William Gascoigne, had already built a very different-looking instrument that did much the same thing. He had embedded a pair of calipers in the eyeglass of a telescope. With a fine-threaded screw, the user was able to close the needles around each side of the image of the celestial body (the moon, most often) as it appeared in the eyepiece. A quick calculation, involving the pitch of the screw in inches, the number of turns needed for the caliper to fully enclose the object, and the exact focal length of the telescope lens, would enable the viewer to work out the “size” of the moon in seconds of arc. A bench micrometer, on the other hand, would measure the actual dimension of a physical object—which was exactly what Maudslay and his colleagues would need to do, time and again. They needed to be sure the components of the machines they were constructing would all fit together, would be made with exact tolerances, would be precise for each machine and accurate to the design standard. As with Gascoigne’s invention of a century before, the bench micrometer’s measurement was based on the use of a long and skillfully made screw. It employed the basic principle of a lathe, except that instead of having a slide rest with cutting or boring tools mounted upon it, there would be two perfectly flat blocks, one attached to the headstock, the other to the tailstock, and with the gap between them opened or closed with a turn of the leadscrew. And the width of that gap, and of any object that fitted snugly between the two flat blocks, could be measured—the more precisely if the leadscrew was itself made with consistency along its length, and the more accurately if the leadscrew was very finely cut and could advance the blocks toward one another slowly, in the tiniest increments of measurable movement. Maudslay tested his own five-foot brass screw with his new micrometer and found it wanting: in some places, it had fifty threads to the inch; in others, fifty-one; elsewhere, forty-nine. Overall, the variations canceled one another out, and so it was useful as a leadscrew, but because Maudslay was so obsessive a perfectionist, he cut and recut it scores of times until, finally, it was deemed to be wholly without error, good and consistent all along its massive length. The micrometer that performed all these measurements turned out to be so accurate and consistent that someone—Maudslay himself, perhaps, or one of his small army of employees—gave it a name: the Lord Chancellor. It was pure nineteenth-century drollery: no one would ever dare argue with or challenge the Lord Chancellor. It was a drily amusing way to suggest that Maudslay’s was the last word in precision: this invention of his could measure down to one one-thousandth of an inch and, according to some, maybe even one ten-thousandth of an inch: to a tolerance of 0.0001. In fact, with the device’s newly consistent leadscrew sporting one hundred threads per inch, numbers hitherto undreamed of could be achieved. Indeed, according to the ever-enthusiastic colleague and engineer-writer James Nasmyth, who so worshipped Maudslay that he eventually wrote a rather too admiring biography, the fabled micrometer could probably measure with accuracy down to one one-millionth of an inch. This was a bit of a stretch. 
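The arithmetic behind such claims is easy to sketch, and it suggests why the one-millionth figure strains belief. Only the hundred-threads-per-inch leadscrew comes from the account above; the readable fractions of a turn are assumptions:

```python
# Micrometer resolution as a function of leadscrew pitch and of how finely a
# fraction of one turn can be read off: assumed figures, a sketch only.

def resolution(threads_per_inch: float, readable_fraction_of_turn: float) -> float:
    """Smallest distinguishable change of gap, in inches."""
    return (1.0 / threads_per_inch) * readable_fraction_of_turn

# A 100-threads-per-inch screw advances 0.01 inch per full turn. Reading a
# hundredth of a turn, plausible with a large graduated wheel, gives:
print(f"{resolution(100, 1 / 100):.6f}")     # 0.000100 inch
# Claiming one millionth of an inch would mean reading 1/10,000 of a turn:
print(f"{resolution(100, 1 / 10_000):.6f}")  # 0.000001 inch
```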
A more dispassionate analysis performed much later by the Science Museum in London goes no further than the claim of one ten-thousandth. And this was only 1805. Things made and measured were only going to become more precise in the years ahead, and they would do so to a degree that Maudslay (for whom an abstraction, the ideal of precision, was perhaps the greatest of his inventions) and his colleagues could never have imagined. Yet there was some hesitancy. A short-lived hostility to machines—which is at least a part of what the Luddite movement represented, a mood of suspicion, of skepticism—briefly gave pause to some engineers and their customers. And then there was that other familiar human failing, greed. It was greed that in the early part of the nineteenth century played havoc with precision's halting beginnings across the water, to where this story now moves: America.
Chapter 3 (TOLERANCE: 0.000 01)
A Gun in Every Home, a Clock in Every Cabin
To-day we have naming of parts. Yesterday,
We had daily cleaning. And to-morrow morning,
We shall have what to do after firing. But to-day,
To-day we have naming of parts.
—HENRY REED, "NAMING OF PARTS" (1942)
He was a soldier, his name unknown or long forgotten, a lowly young volunteer in Joseph Sterrett's Fifth Baltimore Regiment. It was August 24, 1814, and I imagine the youngster was probably sweating heavily, his secondhand wool uniform patched and ill fitting and hardly suitable for the blazing late-summer sun. He was waiting for the fighting to begin, for battle to be joined. He was hiding behind a tumbled stone wall outside a cornfield, not entirely certain where he was, though his sergeant had suggested he was in a small port city named Bladensburg, connected to the sea by a branch of the Potomac that led into the Chesapeake Bay. British forces, the word went, had landed there from ships and were now rapidly advancing from the east. Washington, the capital of his country, a country now not even forty years old as an independent nation, was eight miles to the west behind him, and he was part of a force of six thousand that had been deployed to protect it. Whispers along the line held that President James Madison himself was on the Bladensburg battlefield, determined to make sure the Britons were made to run back to their vessels and flee for their lives. The young man doubted he would be of much use in the coming battle, for he had no gun—not a gun that worked, anyway. His musket, a new-enough Springfield 1795 model, had a broken trigger. He had fractured it, cracked the guard, and so ruined the trigger during a previous battle, an earlier skirmish of what they were starting to call the War of 1812. In all other ways he was well enough equipped. He had an ample supply of black powder paper cartridges, a pouch full of roundball ammunition. But the regimental armorer had told him it would be at least three days before they could forge a new trigger for him, and that he had best do all he could with his bayonet, which he had sharpened that very night, before the sun rose. Otherwise, the armorer had said with a grin, just hit the enemy hard with the gun's oakwood stock—it should give him a black eye at the very least. That turned out not to be at all funny.
The British were close by, on the left bank of the East Branch of the Potomac, when their artillery opened up later that morning, first with a deafening volley of Congreve rockets, a terrifying technique they had learned from fighting in India. It was at that moment, as massive divots of torn earth and stones clattered down around him, that the young man decided his life was more valuable than the winning of this particular battle, and that if the army couldn't be bothered to fix his musket, then he was going to run. So he turned and plunged into the high corn, heading back home to Baltimore. He soon understood he was not alone. Through the stands of corn he could see at least five, ten, dozens of other men who were doing just the same, streaming away from the fight. Some he knew, young lads from Annapolis and the Washington Navy Yard and the Light Dragoons, all of them apparently believing that the defense of Bladensburg was hopeless. He ran and ran and ran, and they ran, too, and all of them were still running when they crossed the line marking the District of Columbia, and they continued running, loping breathlessly in many cases, when, half an hour later, there rose before him some of the mighty structures of his capital, great buildings from where his country's government was dealing with the incomprehensible vastness of America. He slowed to a walk. He felt he was safe now. His city was not. Before the night was out, the pursuing British troops had sacked it, more or less entirely. He found out later that the British told some of the city folk they were acting so cruelly because American forces some weeks before had had the temerity to wreck and damage buildings in the city of York, in Upper Canada. So here they burned out of revenge. They torched the half-built Capitol. They gutted the Library of Congress, and its three thousand books, and they sacked the House of Representatives. British officers dined that evening on the food Madison had been planning to eat at his Presidential Mansion, and then, after wreaking that domestic indignity, they burned his house down, too, until a ferocious rainstorm—some say a tornado—blew in and doused the flames. The date, August 24, 1814, would be remembered for centuries to come. The Battle of Bladensburg, the last stand before the Burning of Washington and of the White House, that most potent of incendiary symbols, had been one of the most infamous routs in all American history, a shameful and sorry episode indeed. The imagined account of this one soldier at war was typical of what happened that day, with battle lines being broken and troops running away in panic before the advancing enemy. There were many reasons for the defeat, and they would be debated by clubbable old soldiers for many years. Inept leadership, ill-preparedness, insufficient numbers—the usual excuses for substantial loss have all been offered down the years. Yet one shortcoming of the American forces (who, after all, had fought little in the years since the War of Independence) was especially notorious: the muskets with which their infantrymen had been equipped were unreliable. More important, when they failed, they were fiendishly difficult to repair. When any part of a gun failed, another part had to be handmade by an army blacksmith, a process that, with an inevitable backlog caused by other failures, could take days.
As a soldier, you then went into battle without an effective gun, or waited for someone to die and took his, or did your impotent best with your bayonet, or else, as the young man of Sterrett's regiment did, you ran. The problem with gun supply was twofold. The U.S. Army's standard long gun of the time was a smooth-bored flintlock musket based on a model first built in France and known as the Charleville. The first of these weapons had been imported into the newly independent United States directly from France; they were then manufactured by agreement at the newly built U.S. government armory in Springfield, Massachusetts. Both models had worked adequately, though all flintlocks had misfiring problems and suffered all the simple physical shortcomings that afflicted handmade weapons that were pressed into continuous service—they overheated; their barrels became clogged with powder residue; or the metal parts broke, snapped, got bent, unscrewed, or were simply lost. This led to the second problem—because once a gun had been physically damaged in some way, the entire weapon had to be returned to its maker or to a competent gunsmith to be remade or else replaced. It was not possible, incredible though this might seem at the remove of a quarter millennium, simply to identify the broken part and replace it with another from the armory stores. No one had ever thought to make a gun from component parts that were each so precisely constructed that they were identical one with another. Had this step been taken, a broken part could have been replaced, swapped for another, because thanks to the precision of its making, it would have been interchangeable. Break a trigger in battle, and all one would have to do was fall back and get the armorer at the rear of the line to reach into his tin box marked "Triggers" and get another, ease it into place, secure it, and be back on the firing line as a fully armed and effective infantryman within minutes. Yet no one had thought of such a thing—except that they had. Thirty years before the humiliating debacle at Bladensburg, a new manufacturing process had been created that, had it been in operation in the United States in 1814, might well have staved off a defeat occasioned by the failure of the soldiers' guns. The new thinking about the principles of gun making, thinking that, if put into practice, might perhaps have kept Washington from being put to the torch, began not in Washington, nor in the two federal armories at Springfield and down at Harpers Ferry, Virginia, nor in any of the stripling gun-making factories that had sprung up during and immediately after the Revolutionary War. The idea was actually born three thousand miles away, in Paris. BACK IN THE late eighteenth century, no one spoke about "the dark side." The phrase is modern, too new for the OED. In almost all the interviews for this book, about the ultrahigh-precision instruments, devices, and experiments that indicate where the precision that originates here is likely to be going, engineers and scientists referred frequently, and usually obliquely, to what "the dark side" might be doing. Once in a while, I would meet someone who admitted to having security clearance, and would thus in theory be able to discuss in greater detail what this experiment was leading to, how this device might be constructed, what the future of such-and-such a project might be—but he would invariably grin and say that, no, he couldn't discuss what "the dark side" was doing.
"The dark side" is the American military, and in terms of new weaponry or research into the unimaginably precise, that tends to mean the U.S. Air Force. Area 51 is the dark side. DARPA is the dark side. The NSA is the dark side. The role of the dark side in this story is immense, but in today's world, it can, for the most part, only be alluded to. Lewis Mumford, the historian and philosopher of technology, was one of the earliest to recognize the major role played by the military in the advancement of technology, in the dissemination of precision-based standardization, in the making of innumerable copies of the same and usually deadly thing, all iterations of which must be identical to the tiniest measure, in nanometers or better. The stories that follow, in which standardization and precision-based manufacturing are shown to become crucial ambitions of armies on both sides of the Atlantic, serve both to confirm Mumford's prescience and to underline the role that the military plays in the evolution of precision. The examples from the early days of the science are of course far from secret; those from today, and that might otherwise be described in full to illustrate today's very much more precise and precision-obsessed world, are among the most secure and confidential topics of research on the planet—kept in permanent shadow, as the dark side necessarily has to be. IT WAS IN the French capital in 1785 that the idea of producing interchangeable parts for guns was first properly realized, and the precision manufacturing processes that allowed for it were ordered to be first put into operation. Still, it is reasonable to ask why, if the process was dreamed up in 1785, it was not being applied to the American musketry in official use in 1814, twenty-nine years later. Men were running, battles were being lost, great cities were being burned—and in part because the army's guns were not being made as they should have been made. There is an answer, and it is not a pretty one. TWO LITTLE-REMEMBERED FRENCHMEN got the honor of first introducing the system that, had it been implemented in time and implemented properly, would have given America the guns it should have had. The first, the less familiar of the pair, despite the evidently superior nature of his name, was Jean-Baptiste Vaquette de Gribeauval, a wellborn and amply connected figure who specialized in designing cannons for the French artillery. He supposedly came up with a scheme, in 1776, for boring out cannons using almost exactly the same technique that John Wilkinson had invented across the Channel in England, that of moving a rotating drill into a solid cannon-size and cannon-shaped slug of iron. Wilkinson had patented his precisely similar system two years earlier, in 1774, but nonetheless, the French system, the système Gribeauval, as it came to be known for the next three decades, long dominated French artillery making. It gave the French armies access to a range of highly efficient and lightweight, but manifestly not entirely originally conceived, field pieces. (Gribeauval did employ what were called go and no-go gauges as a means of ensuring that cannonballs fitted properly inside his cannons, but this was hardly revolutionary engineering, and it had been around in principle for five centuries.) The second figure, the man who did the most to bring the system of interchangeable parts to the making of guns, and whose technique was, unlike Gribeauval's, unchallengeable, was Honoré Blanc.
He was not a soldier but a gunsmith, and during his apprenticeship he became well aware of the Gribeauval system. He decided early in his career that he could bring a similar standardization to the flintlock musket, for the benefit of the man on the battlefield. Yet there was a difference. A cannon was big and heavy and crude—a gunner simply touched his linstock, with its attached lighted match, to the vent, and the cannon fired—and so such parts as there were proved easily amenable to standardization. With the flintlock, however, the lock (that part of a musket that delivered the spark that exploded the priming powder that ignited the main charge and drove the ball down the barrel) was a fairly delicate and complex piece of engineering, made of many oddly shaped parts and liable to all kinds of failure. To the uninitiated, the names of the bits and pieces of a flintlock alone are bewildering: a lock has parts that are variously known as the bridle, the sear, the frizzle, the pan, and any number of springs and screws and bolts and plates as well as, of course, the spark-producing (when struck by the aforementioned metal frizzle) piece of flint. To render the lock into a standard piece of military equipment, with all its parts made exactly the same for each lock, was going to be a tall order. The many component parts of the flintlock on a late eighteenth-century rifle were each made by hand, and had to be filed to fit. Cost, rather than the well-being of the infantryman or the conduct of the battle, was the prime motive. The French government declared in the mid-1780s that the country's gunsmiths were charging too much for their craftsmanship, and demanded they improve their manufacturing process or lower their prices. The smiths not unnaturally balked at the impertinence of the suggestion, and promptly tried selling their products to the new armories and gun makers across the Atlantic in America, a move that alarmed the French government, as it imagined it might well run out of weaponry as a result. It was at this point that Honoré Blanc entered the picture, taking a civilian job as the army's quality-control inspector. His brother gunsmiths expressed their dismay over the fact that one of their number was going over to the other side, was a poacher turning gamekeeper. Blanc dismissed the criticism and got on with his job, his own motivation being the welfare of the soldier out in the field rather than allowing the government to cut costs. He was greatly influenced by M. de Gribeauval, and decided he could ape his system of standardization, ensuring that all the component parts of a flintlock be made as exact and faithful copies of one perfectly made master. This master he made himself, carefully and with great precision, and with all the specifications laid down as precisely as possible (using the arcane system of the Ancien Régime, which still employed dimensional measures such as the pointe, the ligne, and the pouce) to tolerances of about what today we would recognize as 0.02 millimeters. He then made a series of jigs and gauges to ensure that all the locks made subsequently were faithful to this first perfect master, by the judicious use of files and such lathes as were available. The gunsmiths hired by Blanc to perform this task—by hand, still—made each lock exactly as the original.
Provided that they did so, exactly, all the pieces would then fit perfectly together, and the whole assembled lock would fit equally perfectly into each completed weapon. Yet only a small number of gunsmiths were willing to work under these stringent new conditions. Most balked. Making guns simply by copying parts reduced the value of the gunsmith’s craftsmanship to near insignificance, they argued. Unskilled drones could do their work instead. By arguing this, the French smiths were voicing much the same complaints that the Luddites would later grumble over in England: that precision was stripping their skills of worth. This argument would be heard many times in the future as the steady march of precision engineering advanced across Europe, the Americas, the world. The kind of mutinous sentiments heard in the English Midlands half a century before were now being muttered in northern France, as precision started to become an international phenomenon, its consequences rippling into the beyond. Such was the hostility in France to Honoré Blanc, in fact, that the government had to offer him protection, and so sequestered him and his small but faithful crew of precision gun makers in the basement dungeons of the great Château de Vincennes, east of Paris. At the time, the great structure (much of it still standing, and much visited) was in use as a prison: Diderot had been incarcerated there, and the Marquis de Sade. In the relative peace of what would, within thirty years, become one of postrevolutionary France’s greatest arsenals, Blanc and his team worked away producing his locks, all of them supposedly identical. Blanc made all the necessary tools and jigs to help in his efforts—according to one source, hardening the metal pieces by burying them for weeks in the copious leavings of manure from the castle stables. By July of 1785, Blanc was ready to offer a demonstration. He sent out invitations to the capital’s nabobs and military flag officers and to his still-hostile colleague gunsmiths, to show them what he had achieved. Many officials came, but few of the smiths, who were still seething. Yet one person of great future significance did present himself at the donjon’s fortified gates: the minister to France of the United States of America, Thomas Jefferson. Jefferson had arrived in France the year before, to work as official emissary of the new American government alongside Benjamin Franklin and John Adams. By chance, both these men left Paris that July (Adams for London, Franklin for Philadelphia), leaving the intellectually curious and polymathic Jefferson alone in the ferment of prerevolutionary France. A demonstration of something scientific, with possible application for his own fledgling arms industry across the ocean, sounded like an ideal way to spend a hot Friday afternoon. Besides, it was pleasantly cool down in the château’s dungeons, while up above in the Paris of July 8, 1785, it sweltered. Thomas Jefferson, while U.S. minister to France, observed the early work on creating interchangeable parts for flintlock muskets, and told his superiors in Washington that American smiths should follow the French practice. Honoré Blanc had arranged before him a collection of fifty locks, each gleaming in such daylight as filtered through the slit windows.
Once everyone was settled on the bleachers, with onlookers paying close attention, he quickly disassembled half of them, throwing the various components of the twenty-five randomly selected locks into trays: twenty-five frizzle springs here, twenty-five faceplates there, twenty-five bridles there, twenty-five pans in another box. He shook each box so that the pieces were as disarranged as possible—and then, with a calm and an aplomb born of his supreme confidence in his method, he quickly reassembled out of this confusion of components twenty-five brand-new musket locks. Each one of these was made of parts that had never been joined together before—but it made no difference. Everything fitted to everything, for the simple reason that, with the great precision of their making and their faithful adherence to the dimensions of the master lock, each part was identical to every other. The parts were all, in other words, exactly interchangeable. The French officials were at first vastly impressed. The army set Blanc up in an officially sponsored workshop, he began producing inexpensive flintlock parts for the military and profits for himself, and for four further years all seemed fine. Then came 1789 and the unholy trinity of the Revolution, Gribeauval’s death, and the Terror. The château was stormed, and Blanc’s workshop was sacked by the rioters. His sponsor was suddenly no longer there to protect him, and there was a fast-growing, eventually fanatical, opposition among the sans-culottes toward mechanization, toward efficiencies that favored the middle classes, toward techniques that put the honest work of artisans and craftsmen at a disadvantage. By the turn of the century, the idea of interchangeable parts had withered and died in France—and some say to this day that the survival of craftsmanship and the reluctance entirely to embrace the modern has helped preserve the reputation of France as something of a haven for the romantic delight of the Old Ways. In America, though, the reaction was very different, and all thanks to the prescient eye of Thomas Jefferson. The first time he described what he had seen was on August 30, in a long letter to John Jay, the then–secretary of foreign affairs. He began with the customary flourish of logistical explanation regarding the routes by which his letters would reach Jay, an inconvenience unknown today, when postal services are such a commonplace. I had the honor of writing to you on the 14th. inst. by a Mr. Cannon of Connecticut who was to sail in the packet. Since that date yours of July 13 is come to hand. The times for the sailing of the packets being somewhat deranged, I avail myself of a conveiance [sic] of the present by the Mr. Fitzhughs of Virginia who expect to land at Philadelphia … … An improvement is made here in the construction of the musket which it may be interesting to Congress to know, should they at any time propose to procure any. It consists in the making every part of them so exactly alike that what belongs to any one, may be used for every other musket in the magazine. The government here has examined and approved the method, and is establishing a large manufactory for the purpose. As yet the inventor [Blanc] has only completed the lock of the musket on this plan. He will proceed immediately to have the barrel, stock, and their parts executed in the same way. Supposing it might be useful to the U.S., I went to the workman, he presented me the parts of 50 locks taken to pieces and arranged in compartments.
I put several together myself taking pieces at hazard as they came to hand, and they fitted in the most perfect manner. The advantages of this, when arms need repair, are evident. He effects it by tools of his own contrivance which at the same time abridge the work so that he thinks he shall be able to furnish the musket two livres cheaper than the common price. But it will be two or three years before he will be able to furnish any quantity. I mention it now, as it may have influence on the plan for furnishing our magazines with this arm. Jefferson was indeed seriously impressed with Blanc’s system, and wrote further to friends and colleagues back in Washington, and in Virginia several times, to underline his belief that American gunsmiths should be encouraged to adopt the new French system. And in due course, the makers began to get the message, most especially in New England, where most gunsmiths were to be found. If skepticism lingered back in Europe, America proved herself, quite literally, to have the mind-set of the New World, any reluctance being swiftly dispelled by the U.S. government’s decision to place enormous orders for new muskets, so long as their parts were, in line with Jefferson’s thinking, interchangeable. Two firms of private gunsmiths led the bidding for this government contract to make the first batch of muskets: ten thousand by one account, fifteen thousand by others. The winner of the contract, which meant an immediate cash payment of the not insignificant sum of five thousand dollars, was one Eli Whitney, of Massachusetts. Whitney remains a man of great fame, still known to most in America today as he has been for two centuries. His face appears on a postage stamp. He is part of the educational curriculum. He ranks alongside inventors and businessmen—Edison, Ford, John D. Rockefeller. To any schoolchild today, his name means just one thing: the cotton gin. This New Englander, at the age of just twenty-nine, had invented the device that removed the seeds from cotton bolls, and thus made the harvesting of cotton the foundation of a highly profitable Southern states economy—but only if slaves were used to perform the work, an important caveat. To any informed engineer, however, the name Eli Whitney signifies something very different: confidence man, trickster, fraud, charlatan. And his alleged charlatanry derives almost wholly from his association with the gun trade, with precision manufacturing, and with the promise of being able to deliver weapons assembled from interchangeable parts. “I am persuaded,” he declared with a flourish of elaborate solemnity in his bid to make a cache of guns for the U.S. government, “to make the same parts of different guns, as the lock for example, as much like each other as the successive impressions of a copperplate engraving.” It was the utmost piffle. When Whitney won the commission and signed the government contract in 1798, he knew nothing about muskets and even less about their components: he won the order largely because of his Yale connections and the old alumni network that, even then, flourished in the corridors of power in Washington, DC. Once he had the contract in hand, he put up a small factory outside New Haven and promptly claimed to be manufacturing muskets there, weapons based, as were all smooth-bore American guns of the time, on the French Charleville design. He took an unconscionable time to produce any weapons, however. 
The contract specified a delivery of at least some of the muskets by 1800, but there were only a handful of finished guns, and all Whitney could offer as a salve by that due date was a demonstration of the quality, as he claimed, of the guns that his new factory was now notionally in the process of making. Whitney performed his now-notorious demonstration in January 1801—a supposed confidence-building exercise, it would be called today—before a distinguished audience that included the then-president, John Adams, and his vice president, soon to become president, Thomas Jefferson, the man who had started the ball rolling fifteen years before. There were also dozens of congressmen and soldiers and senior bureaucrats, all men who needed to be convinced that public treasure was going to be expended on what would be a truly worthwhile venture. They had been told they were there to witness Whitney demonstrating, with the use of a single screwdriver, how his musket locks were properly interchangeable. Everyone in the room was ready to believe him, Whitney’s cotton-gin-based reputation having long preceded him. It seemed to be of no great moment to anyone in the room, however, that the man didn’t even bother to disassemble the locks he had on show. Instead, he merely took a number of finished muskets, used his screwdriver to detach the locks from their wooden gunstocks, then slipped them whole into slots on other gunstocks, and so made it appear to the guileless visitors as though his parts were, as promised, truly interchangeable. He explained as he went along what he was doing, and not even Jefferson, who had seen Blanc’s demonstration at Vincennes in 1785 and might have had sufficient knowledge to splutter, “Hold on a minute!” had the temerity to challenge him, to express even the smallest measure of skepticism. Quite the reverse: the president-elect bought Whitney’s explanation in its entirety, and wrote enthusiastically to the then-governor of Virginia, saying that Whitney had “invented moulds and machines for making all the pieces of his locks so exactly equal, that take 100 locks to pieces and mingle their parts, and the hundred locks may be put together as well by taking the first pieces that comes to hand.” The truth is that Jefferson had been hoodwinked, as had everyone else present that day. For there had been no molds, no machines for making all the parts “so exactly equal.” Whitney’s new-made factory, powered by water, not yet by steam (even though engines were readily available), had neither the tools nor the capacity to make precision-engineered pieces. Realizing this, he had instead hired a clutch of artisans and craftsmen, and told them to make the flintlock components with their own files and saws and polishers, and make them one by one, by hand—and not necessarily all the same, either, for the way he had planned his show did not allow for anyone to inspect the locks themselves, only that they fitted into the stocks.