J. Storrs Hall

born in 1954; still living (age ~70)

Quotes (Authored)

The optimism and progress of the first half of the century was due to the rapidly expanding supply of energy, as was the optimism and progress of the Victorian era, and, indeed, of the Industrial Revolution as a whole.

At a rough average, consumer retail energy runs about five times the commodity price of the raw fuel. (The exception is that gasoline costs less than twice the price of the crude oil it's made from.)

Most electricity in the U.S. comes from coal or natural gas. With uranium, since it involves a costly isotopic enrichment step, the markup is more like a factor of 30. Even so, the cost of the uranium going into power generation is trivial compared to the other costs.


The average price of a year's energy, in chemical fuels: $6,553. In nuclear: $5.80.

The cost of actual electricity delivered to your home is due to the cost of the capital—the reactor and generating plant, but also the transmission lines, substations, power grid control centers, and so forth—and overhead, including maintenance and a truly staggering regulatory burden, which multiplies the cost by an order of magnitude.

The cost structure of nuclear power is much more like that of solar or wind power than that of fossil fuels. The cost of "renewables" is essentially all capital and overhead.

Nuclear fuels produce 1 million to 10 million times the energy, per weight, of chemical ones—and thus require the extraction of a million times less raw material, and the production of a million times less ash, than fossil fuel for the same amount of energy.
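The arithmetic behind this energy-per-mass claim can be sketched with round illustrative numbers (my assumptions, not figures from the quote): coal delivers roughly 24 MJ/kg thermal, while complete fission of U-235 delivers roughly 8 × 10⁷ MJ/kg.

```python
# Sketch of the energy-per-mass claim above, using illustrative round
# numbers (assumptions, not figures from the quote).
COAL_MJ_PER_KG = 24.0    # typical thermal energy of bituminous coal
U235_MJ_PER_KG = 8.0e7   # thermal energy from complete fission of U-235

# Energy-per-mass ratio: lands inside the quoted 1-10 million range.
ratio = U235_MJ_PER_KG / COAL_MJ_PER_KG
print(f"fission/coal energy density ratio: {ratio:,.0f}x")  # ~3.3 million

# Fuel mass for an assumed 300 GJ (3.0e5 MJ) of primary energy per
# person-year: tonnes of coal versus grams of uranium.
annual_mj = 3.0e5
print(f"coal:  {annual_mj / COAL_MJ_PER_KG:,.0f} kg")
print(f"U-235: {annual_mj / U235_MJ_PER_KG * 1e3:.2f} g")
```

The per-capita energy figure is a placeholder; the point is only that the ratio, not the absolute numbers, drives the "million times less raw material" conclusion.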

A wind turbine uses up more lubricating oil than a nuclear plant uses uranium, per kilowatt-hour generated.

Molten salt reactors using thorium and integral fast reactors using uranium-plutonium alloys could achieve a 99 percent fuel burn-up, improving both fuel efficiency and waste production by a couple of orders of magnitude over the 1960s designs that we're still using.

Tennessee's Oak Ridge facility [operated] a molten salt reactor (but without the full thorium cycle) for a year or so, before the project was canceled by Richard Nixon.

[Richard Nixon] had banished a [molten salt] reactor that was virtually meltdown-proof, left comparatively little long-lived waste, made it more difficult to fashion a bomb from the waste, ran at friendlier atmospheric pressure instead of the potentially explosive pressurized environments of conventional reactors, and ran at much higher temperatures, making it more cost-effective as an electricity generator.

There is as much headroom in physics and engineering for energy as there is in computation; what is stopping us is not lack of technology but lack of will and good sense.

As a result of the earthquake and tsunami, about 16,000 people died (most drowned), 6,000 were injured, and 2,500 went missing... It was a disaster of epic proportions. As part of all this devastation, the Fukushima power plant was damaged and some radioactive materials were released into the local environment. How much danger did this release add to the overall cataclysm? Zero. No one was killed by radiation exposure, and projections for excess cancer from accumulated exposure are negligible.

It is worth pointing out just how much harm and suffering the nuclear fear industry has caused in human terms. On the order of 1 percent of the evacuees from both Fukushima and the 1986 Chernobyl disaster committed suicide.

One example of a valuable and sensible piece of regulation was the Clean Air Act. By reducing coal-fired air pollution, it saved over 50,000 lives per year in the U.S. This was a very good thing—I remember, at age 12, being the youngest pallbearer for my grandfather, who had died of emphysema.

Two-thirds of all the new generating plants started in 1966 were nuclear.

[Nuclear power plants] cannot explode. An explosive chain reaction requires an unmoderated critical mass with 80 percent or more enrichment. Power reactors use a moderated reaction in uranium that is only enriched 3 percent. Yes, they are both fission. But then the detonation of a stick of dynamite and the digestion of a stick of chocolate are both oxidation.

It is radiophobia, not properly engineered nuclear power, that is the silent unseen killer. Energy poverty is estimated to kill roughly 28,000 U.S. residents annually from cold alone, a toll that falls almost entirely on the poor. In the last 25 years, we could well have saved a million lives.

There is no evidence of a carcinogenic effect in humans for acute irradiation at doses less than 100 millisieverts (mSv) and for protracted irradiation at doses less than 500 mSv. Even the pusillanimous Nuclear Regulatory Commission (NRC) admits, and I quote, "But there are no data to establish a firm link between cancer and doses below about 10,000 mrem (100 mSv)." And then they proceed to point out that this level is 100 times greater than what they allow for public radiation exposure. This is like setting a speed limit of one mile per hour because people have been injured doing 100. You can imagine what regulations made on the basis of this kind of thinking did to the cost of a nuclear power plant.

Yes, nuclear power today is expensive. Shipping would be expensive, too, if trucks had to operate with a speed limit of one mile per hour.

A study by economist Peter Lang found that during the 1950s and '60s the cost of nuclear plants was decreasing by about 25 percent for each doubling of capacity. And then the trend utterly reversed. He calculates that, had the earlier trend continued, the price of power would have fallen to 10 percent of what it is in 2020, and "the extra nuclear generation could have exceeded the extra generation from coal by year 2000 (assuming electricity demand did not change)." And that's not all. "If the extra nuclear generated electricity had substituted for coal and gas generation, about 9.5 million deaths and 174 Gt CO2 may have been avoided."
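Lang's trend is an experience curve (Wright's law): unit cost falls by a fixed fraction with each doubling of cumulative capacity. A minimal sketch with the quoted 25 percent learning rate (everything else here is illustrative, not Lang's actual data):

```python
import math

# Wright's-law sketch of the quoted trend: unit cost falls 25% for each
# doubling of cumulative capacity. Illustrative, not Lang's actual data.
LEARNING_RATE = 0.25
b = math.log2(1 - LEARNING_RATE)  # progress exponent, about -0.415

def unit_cost(cumulative, c0=1.0, n0=1.0):
    """Unit cost after building `cumulative` capacity, relative to cost c0 at n0."""
    return c0 * (cumulative / n0) ** b

# Eight doublings at this rate cut cost to 0.75**8, about 10% of the
# starting cost -- consistent with the "10 percent" figure in the quote.
print(unit_cost(2 ** 8))
```

The exponent form makes the reversal easy to state: after 1970 the effective learning rate went negative, so each doubling raised cost instead of lowering it.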

A clear case that bureaucracy was the main factor in choking the advance of nuclear power in the United States can be found in the one place where it didn't happen, the U.S. Navy. Former Navy secretary John Lehman wrote, "The reason for Navy nuclear success is because there has always been one strong experienced person in charge and accountable, standing like a stone wall against the bureaucratic onslaught." The Navy has over 6,000 reactor-years of accident-free operation. It has built 526 reactor cores (for comparison, there are 99 civilian power reactors in the U.S.), with 86 nuclear-powered vessels in current use.

We could have had flying cars in 1940, but for World War II. We could have started on nanotech in 1960. We could have started on solar-powered satellites in the '70s. The low-hanging fruit of nuclear power was left to rot on the ground, and it was sprayed with the kerosene of hysteria and ignorance.

In the 1970s, when the Department of Energy was created and energy per capita flatlined, one heard a lot of well-meaning people decry the space program and say how, instead, we should pay attention to the problems we had right here on Earth. But most of the problems we have here are due to poverty. Poverty is ameliorated by cheap energy.

The Great Stagnation and the Henry Adams flatline were not merely coincidental; they were synonymous.

If you had told a farmer in 1900 that his descendants a century thence would spend their time mostly inside heated and air-conditioned (air what?) buildings, sitting on cushioned seats, talking, reading, and writing, he might have believed you if he was a believer in Progress. If you had told him they would call this "working," he might have laughed in your face.

Scientists, in general, make lousy futurists. The job of a scientist is to establish facts by careful experimentation, documentation, and repeated testing until it is as verified as human inquiry can make it. If futurists waited for the kind of verification scientists require, they'd be historians.

Drexler coined the word nanotechnology by analogy to the already common microtechnology, which mostly applied to photolithographic chip-making techniques but was broadly applied to any technology that manipulated matter at the micron scale. Since the new technology would manipulate matter at the nanometer scale, the extension seemed straightforward.

By the mid-1990s, nanotechnology had picked up the cachet of a popular buzzword, and started showing up in science fiction, popular scientific, and technical publications—and grant proposals. After all, if nanotechnology means dealing with matter on the scale of nanometers, then a huge amount of existing science and engineering was, by definition, nanotechnology: chemistry, molecular biology, surface physics, thin films, ultrafine powders, and so forth. By the turn of the century, computer chips, the original microtechnology, had features small enough that they could reasonably be measured in nanometers.

The world of science, no less than the world of politics, is driven by fads and buzzwords. And where the two intersect—the science and technology funding agencies—you need a buzzsaw to get through the buzzwords. So the word nanotechnology was rapidly adopted by researchers attempting to get just that one extra edge over their peers in the highly competitive, not to say cutthroat, business of getting government money.

Every general futurist gets the future wrong to a significant extent more than a couple of decades ahead. (I'm sure I'm no exception.) The most common failing, by far, is the simplest one: to assume the future will be like the present.

There's a big difference between science and engineering. It's a distinction often lost on journalists, so when they need a pronouncement on the likelihood of some technological advance or another, they have a tendency to ask a scientist. This is often a mistake.

Science, historically, tends to lag behind engineering.

Both science and engineering involve the discovery of natural law and its application to the real world. The difference is that the scientist is trying to discover the most general law possible, applicable to any situation, while the engineer is trying to discover recipes to build machines that work in exactly one situation: when built as specified.

Imagine it's the 1600s and we are comparing Sir Isaac Newton with a military engineer of the day. Newton's law of gravity, one of the towering achievements of the human intellect, is applicable to all masses anywhere in the universe. The engineer is interested in a cannonball of exactly ten pounds traveling up to a mile from his gun. He can tell you how much powder to put in the gun and what elevation to give the barrel—because he's tried it. Newton's theory can't do that, because nobody knows how much energy the exploding powder produces, or how efficiently that energy is transferred to the cannonball, or how much the ball is slowed by friction with the barrel or the atmosphere on its flight. The engineer doesn't know these things, either, but he does know how much powder to use to shoot a mile. The chemistry and aerodynamics (supersonic aerodynamics!) necessary to figure this all out from first principles didn't come along for centuries.

The road to nanotechnology is like a five-mile race. You have to swim the first mile. Then you can run the second. At that point you find a bicycle. At the end of the third mile is a car, and at the end of the fourth is an airplane... If someone gets a significant lead, he'll be hard to catch.

A historian, particularly a historian of technology, makes a better futurist than a scientist. Beyond the immediate horizon of what's in the lab today, the future is shaped by the same forces that shaped the past, and the past is one of our best guides for prediction.

Future technology is shaped by what people want. It's also shaped, of course, by what is possible. But a lot more is possible than people, particularly scientists, realize. The reason is that there are usually many different ways of getting a given result. A scientist can tell you, quite correctly, that a specific approach using a specific phenomenon won't work. But he can't know whether some other way, outside his narrow field of expertise, would reach the same goal.

Nanotechnology is not a set of particular techniques, devices, or products. It is, rather, the set of capabilities that we will have when our technology gets near the limits set by atomic physics. We can make predictions for such a technology without knowing the specifics of how it will be achieved.

The laws of physics don't tell us directly how strong a material can be; they tell us how strong a particular one will be. It's similar with other things of interest to a technologist. Physical law doesn't tell us how powerful a motor can be; it lets us say how powerful a specific motor will be. So we can only get a grasp of the outlines of the capabilities of nanotechnology by analyzing a set of designs.