I’d like to contrast two recent contributions to the nuclear energy discussion. The first is the IAEA Climate Change and Nuclear Power report, a straightforward, fully referenced resource that spells out the considerable past and future capacity of nuclear as a non-emitting, scalable source of electricity. It summarises the sort of information that has lately seen a growing number of critical thinkers reassess their positions on nuclear. And despite less than four years having passed since a multiple-reactor accident that was arguably deathless, the IPCC has progressed from popularising a doubtfully simplistic majority-renewable plan from Greenpeace to calling for at least a tripling of nuclear generation in response to climate change.
If only that yellow had kept expanding.
The other is a very recent survey that reinforces the unsurprising popularity of solar and wind in Australia. Nuclear’s unpopularity remains a factor that probably dwarfs the quite tractable practical issues surrounding its serious consideration in Australia – especially among the clear majority of women who don’t like the idea at all, and who seemingly prefer coal to some extent. Indeed, the authors report that:
Forty-six per cent of people surveyed don’t agree that coal is good for humanity but forty-five percent agree that it is an essential part of our economic future. Forty-one percent also accept that coal is the world’s principal energy source and is likely to remain so for decades to come…
Popular intermittent non-fossil generators certainly abate a proportion of emissions. But I don’t see the response to climate change and emissions pollution as a popularity contest. If we can also largely agree that reliable electricity is important, that’s a great start. We each have to honestly consider what our chosen position on this matter means to us.
Even a short evening of calm discussion will help interested people.
Or to put it another way…
Ever had a best friend who was there when you needed them, every week/month/year, day & night? That’s my idea of a truly valuable friend. What about a friend who only answered your calls on nice days? Or one who blew in maybe a third of the time, regardless of time of day or whether their help was even needed? Sure, those two can still make a contribution, but your best friend is who you need if you want to get anything useful achieved.
Now, would you rather that best friend was the sort of dude who slowly, nonchalantly poisons you, your children, and everyone you know? Or, instead, sorts all the garbage and recycling to the point of obsessive-compulsion, and drives really, really carefully?
As highlighted at Brave New Climate a few months ago, there are journalists who opt to report on matters of radiological hazard without providing context, or necessarily understanding what they are saying. I was glad to recently see an article on tsunami risk that resisted mentioning nuclear catastrophe, but the fact remains that the general public ambivalence to precise measurements and estimates – and what they tell us about absolute and relative hazards – goes largely unchallenged, and is often exploited.
Except it’s really not that hard, I promise! Most people can imagine a lump of stuff sitting there, radiating alphas, betas and/or gammas. Which of these predominates, and how hazardous their energy is, depends on the isotope. The amount of radioactivity is measured in becquerels – one decay per second – and for a given radioisotope it scales directly with the mass present, so you can loosely think of it like the weight of that specific isotope.
The common ionisation smoke alarms in your house contain 37 000 becquerels each of americium-241. I know there are non-ionising (photoelectric) models – I used to sell them occasionally in a previous life, often to customers who loudly proclaimed they were safer. Am-241 has a half-life of 432 years, and releases mostly alpha particles. You can read more here, but the point is we don’t have to imagine stuff any more; we can think of a smoke alarm. Ever wondered what’s inside one? I did. I also have a personal dosimeter, so it’s experiment time.
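For a sense of scale, the activity-to-mass relationship is easy to sketch. This is a back-of-envelope calculation: the 37 000 Bq activity and 432-year half-life are the figures above, and everything else is a standard physical constant.

```python
import math

AVOGADRO = 6.022e23           # atoms per mole
SECONDS_PER_YEAR = 3.156e7

def mass_grams(activity_bq, half_life_years, molar_mass_g):
    """Mass of a pure radioisotope exhibiting the given activity."""
    decay_constant = math.log(2) / (half_life_years * SECONDS_PER_YEAR)  # decays per atom per second
    atoms = activity_bq / decay_constant
    return atoms / AVOGADRO * molar_mass_g

# Am-241: 37 000 Bq, 432-year half-life, 241 g/mol
print(mass_grams(37_000, 432, 241))   # ~2.9e-7 g
```

In other words, the “lump of stuff” in a smoke alarm is a speck weighing roughly a third of a microgram.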
Pretty normal background level.
Microsieverts per hour (μSv/h) express a dose rate – the background radiation we’re all constantly receiving. Despite my grasp of the metrics, principles and cellular biochemistry involved, I’m not an expert, but 0.1 μSv/h? I wouldn’t worry.
The hatch for the Geiger–Müller tube is off (closed, it stops most beta particles), so this reading captures everything emitted in normal operation.
Controversy! Ubiquitous household item blasts five times the normal radiation at your family every hour! Well, of course such a sensationalised yet technically true statement seems like hyperbole now. Still, I admit that I’d forgotten I’d set the threshold at 0.3 and when the alarm screamed, I jumped! The round yellow shroud in the top photo can be convinced away from the circuit board, and-
Oh dear god oh the humanity
After a minute it settled at around 1.8 μSv/h. Spectacular! If you remember the figures from Geoff’s article, Catalyst was worried about 7 μSv/h in Fukushima prefecture, but they didn’t provide context. Is this context? This is a ubiquitous domestic safety device which dramatically reduces the chance of death in the event of a house fire. Apart from the odd customer, everyone’s used to them. Could they be causing cancer anyway? I don’t think so at all, and here’s why.
I trust everyone’s heard of the Radium Girls. Many of the workplace protections we take for granted originated with their fight for justice. Initially, they were given no guidance regarding the safety of pointing their dial-painting brushes with their lips, ingesting concentrated radium-226 and radium-228 in the process. Bioaccumulation above a threshold resulted in serious health impacts. 100 “microcuries” equals 3.7 million becquerels – two orders of magnitude higher than in my smoke alarm, which I managed not to swallow. The study showed that the significant population of workers who absorbed a dose below this threshold avoided the horrendous bone malignancies characteristic of radium exposure. This was achieved through safer work guidelines, not by removing the isotopes. The far smaller amount of Am-241 is safer still as a labelled, discrete button in its tight housing – a form that cannot realistically enter a human body, especially while properly encased and screwed to your hallway ceiling.
But no one’s worried about radium – or americium – at Fukushima! It’s the cesium, right? Well, the particular isotope has little relevance at low exposures,* but the substantial exposures from an unsecured medical supply of cesium-137 at the centre of the 1987 Goiânia accident led to five deaths. How many becquerels? 7 000 000 000 000 were thought to have been spread throughout the environment, from an initial 50.9 trillion. A substantial population – who pointedly didn’t die of radiation syndrome or cancer – received doses many orders of magnitude larger than anything my eviscerated smoke alarm could deliver, and, indeed, than one can conceivably absorb by eating the feared matsutake mushrooms of Minamisoma City. 102 900 becquerels per kilogram! shrieked one hysterical website just last week. Would I eat them? Not a whole kilogram, just as I wouldn’t drink the seawater downhill from the Fukushima Daiichi plant, tritium or no tritium. Of course no one should be expected to eat contaminated fungus, but no one should be perpetuating context-free agitation about how dangerous it is either, when we have history as such a stark guide.
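For scale, here is a quick sketch comparing those activity figures. All numbers are the ones quoted above; the curie conversion is the standard definition of the unit.

```python
CURIE_TO_BQ = 3.7e10            # 1 curie = 37 billion becquerels, by definition

goiania_initial_bq = 50.9e12    # initial activity of the Goiânia source
goiania_spread_bq = 7.0e12      # activity estimated to have been spread into the environment
smoke_alarm_bq = 37_000         # Am-241 in a domestic smoke alarm

print(goiania_initial_bq / CURIE_TO_BQ)      # ~1376 curies in the original capsule
print(goiania_spread_bq / smoke_alarm_bq)    # ~1.9e8 – eight orders of magnitude above one alarm
```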
It is quite clear that there is a threshold below which radiation, isotopes and decays do no harm. What might that threshold be, though? There’s much healthy discussion, but one source recently stated:
For all of the above reasons, it is recommended that use of the linear no-threshold (LNT) model be abandoned and replaced by a more realistic approach to the estimation of radiological risks. A new model to replace LNT should be based on thresholds below which risks are considered to be zero. In accordance with present knowledge and data, thresholds are considered to be within the following ranges depending on circumstances. These figures are proposed as a basis for further discussion:
– Within the range 50-300 mSv for acute single doses to adults;
– Within the range 100-700 mSv per year for continuous chronic exposures; and
– Within the range 50-200 Bq/m³ for naturally occurring radon in the air breathed in confined spaces, which causes about half the exposure to background radiation for many people.
Thresholds also need to be developed for the sum totals per year, per month or per week of intermittent and protracted exposures, and for acute single doses to embryos, foetuses and infants.
Risks might be assumed to depend on, or be proportional to, the incremental dose or dose rate over limited ranges above the relevant threshold. Simple explanations of the meaning and level of actual risk and benefits should be developed.
That 700 mSv/yr equates to just shy of 80 μSv/h. Even the lower end of that range is 11.4 μSv/h – granting considerable breathing room for any visitors to Fukushima. So what is the opposition actually afraid of? Do they have better empirical information than such experts? What might it be, and how might it account for the few examples I’ve provided, and the numerous others that exist, of the benignity of low-dose radiation? And, importantly, how is it of sufficient gravity to justify existing stringent limitations that have caused far more harm than benefit? No, seriously – if I’ve erred here, I want to know. I’ll remove this article if the evidence shows it’s wrong. But as things stand, a triple-reactor disaster three and a half years ago has resulted in contamination that will not harm the population it continues to terrify. If no evidence is forthcoming, then those vocal nuclear critics have a whole lot of explaining to do.
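The conversion between an annual dose and a continuous hourly dose rate is simple arithmetic, and worth checking for yourself (this sketch assumes 365.25 days per year):

```python
HOURS_PER_YEAR = 8766   # 365.25 days x 24 hours

def usv_per_hour(msv_per_year):
    """Convert an annual dose in mSv/yr to a continuous dose rate in microsieverts per hour."""
    return msv_per_year * 1000 / HOURS_PER_YEAR

print(usv_per_hour(700))   # ~79.9 μSv/h – "just shy of 80"
print(usv_per_hour(100))   # ~11.4 μSv/h – the lower end of the proposed chronic range
```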
To their credit, we do have journalists who approach these matters with very open minds when honestly exploring the potential for nuclear energy in Australia, new technology or the prospects of an interim waste facility. To my mind, the crucial point on which this all hinges is the willingness of people to hear what our experts have to say, and to approach the opposing messages of fear and indolent bias with razor-sharp skepticism.
*Except possibly for special cases like iodine-131, which concentrates in and attacks the thyroid unless the dose is medically controlled, one isotope is effectively as harmless as another, contamination-wise, at such low exposures. Why? Look at that radium study again:
Ra-226, half-life 1620 years, emits an alpha particle, and is transformed into radon (half-life 3.8 days); eight more radioactive decays, which emit either alpha or beta particles, take place before it becomes an atom of non-radioactive lead.
All of those workers who didn’t develop symptoms had all these various mixed-up isotopic decays occurring in their bodies, of different energies and in different organs. It’s hard to be more precise about it, but it happened.
What is it about energy storage that excites people?
There are many answers to this question, but for opponents of nuclear energy it is predominantly seen as the most promising way to overcome the intermittency and unreliability of those most photogenic of renewable energy technologies: photovoltaics and wind turbines. In fanciful majority solar/wind future scenarios, without storage to back up low-production times – windless nights, or dark and very stormy days – a substantial overbuild is needed, leaving much capacity spare and idle at optimal production times; otherwise, other technologies like hydropower, biomass combustion or (predominantly) natural gas turbines are unavoidable.
Biomass combustion may constitute up to 60% of renewable energy by 2030, followed by a stagnant share of hydropower. Why only panels and turbines on the posters?
The first major irony is that backed-up or overbuilt renewable models try to overlap a raft of technologies to meet baseload demand – conventionally met by just one or two main dispatchable technologies (like coal or nuclear), and erroneously dismissed as a myth. The second is that in a nation-scale, storage-reliant grid, solar and wind capacity would require a comparable magnitude of overbuild anyway – both to meet demand and to charge the batteries enough* on good days.
It’s not that I oppose storage – I love batteries, I’m pro-EVs, I’m excited by technological advancements and I support hydropower (as well as solar and wind, all where appropriate) in our future mix – or other measures like increased efficiency and sensible demand management. I just refuse to delude myself that storage, with its intrinsic limitations, can fully enable any clean technologies that cannot manage the decarbonisation job themselves, at the required scale and in the dictated time frame (if at all).
Tom Murphy dealt with pumped storage – still the most economical form – years ago, and anyone interested in energy needs to have read his analysis. Why is it the most economical form? All you need is suitable land and accessible water, and you build the structure once to last a century or more. As Murphy demonstrates, relying on it at national scale is physically unachievable in a country like the US.
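A back-of-envelope version of Murphy’s arithmetic shows why. Gravitational storage obeys E = m·g·h, and the figure below (the 500 m head is an assumption on the generous side) illustrates how much water a single gigawatt-hour demands, before any pump or turbine losses:

```python
G = 9.81   # gravitational acceleration, m/s²

def water_tonnes_per_gwh(head_metres):
    """Tonnes of water that must be raised head_metres to store one GWh, lossless."""
    joules = 3.6e12                 # 1 GWh in joules
    kilograms = joules / (G * head_metres)
    return kilograms / 1000

print(water_tonnes_per_gwh(500))   # ~734 000 tonnes of water per GWh at a 500 m head
```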
But let’s say a pumped storage site were commissioned to back up some considerable solar and wind capacity. The storage operator will derive income from flowing water through turbines and selling power to retailers, and the operating costs will mainly be buying power to pump that water back up to the reservoir. They will want to sell at times of high demand to maximise return on investment, and to store during low demand (cheap power) – and this is exactly how such capacity is currently used. But high demand generally coincides with sunny afternoons, when PV capacity is being used directly and the price is high. Wind will blow, or not, largely regardless of demand or lack thereof. The storage operator will be expected to pay a premium to store power during the times when it makes most sense to generate, and to hope for windy, overproducing nights. Who will compensate the operator for this forgone income? Who will invest in constructing such a facility on this basis?
The most ubiquitous chemical storage technology in the world today is based on the oxidation states of lead. Lead is inexpensive and common, and the vast bulk used in batteries is economically recycled. Emerging technologies like liquid metal batteries are exciting and deserve substantial support to rapidly penetrate the market and contribute as best they can, but the energy is still limited by the magnitude of change in the oxidation states of the metals, and this can’t be increased.
Net energy release is only part of the story, but it is sufficient to illustrate the quantities we must accept. The oxidation of lead provides the equivalent of 34.35 kJ/mole, where kJ are the same metric kilojoules we count for food, and a mole is a chemist’s way of levelising between different elements and compounds. When coal is burned, the energy is overwhelmingly provided by the oxidation of carbon, yielding 153 kJ/mole. Again, don’t directly compare these values – one is a reversible electrochemical half-reaction, the other is energy liberated by combustion, and both are subject to the vagaries and efficiencies of technology design – but instead accept that they fundamentally frame the scale at which chemical energy is stored. Tom Murphy also dealt with what happens when we want to scale this up. The point is that the fundamental coulombic interactions being manipulated through combustion or electrochemical storage are immutable, and despite likely improvements in the future, there will be, and cannot be, any Moore’s Law of battery storage.
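To put the per-mole figures on a more familiar footing, they can be converted to specific energy per kilogram. This rough sketch uses the article’s own per-mole numbers plus standard molar masses; real devices deliver less than these idealised values.

```python
def wh_per_kg(kj_per_mol, molar_mass_g):
    """Convert an energy per mole into watt-hours per kilogram of the element."""
    joules_per_kg = kj_per_mol * 1000 / molar_mass_g * 1000
    return joules_per_kg / 3600

print(wh_per_kg(34.35, 207))   # lead, 207 g/mol: ~46 Wh/kg – the order of practical lead-acid cells
print(wh_per_kg(153, 12))      # carbon, 12 g/mol: ~3 500 Wh/kg – nearly two orders of magnitude more
```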
And guess what – the same economic problems facing pumped hydro would also apply to operators of grid-scale battery storage, but without the extended infrastructure lifetime. Remember, pumped hydro is still the most economical form of storage.
The Best Storage of All
Everyone’s favourite nuclear-powered space science tank.
The rough comparison above alludes to treating coal like a form of storage – popularly, stored sunlight from millions of years ago. To the extent that this describes fossil fuels, it also describes nuclear fuels. The steady decay of 18.89 moles (4.8 kg) of plutonium-238 oxide in Curiosity’s RTG provides over 118 kJ every minute (1968 W). The fissioning of uranium-235 liberates 20 000 000 000 kJ/mole, energy stored by the strong nuclear force in the hearts of the primordial supernovae that seeded our little region of the galaxy with plentiful heavy elements billions of years ago. I can see how that image may be a bit daunting. But given nuclear power’s proven carbon-mitigating capacity, dismissing it in favour of an inferior energy storage fantasy is provably irresponsible. We can now build reactors that release comparable energy from all U-238, and even from the plutonium in used nuclear fuel as well as dismantled weapons.
SN-1987A: natural actinide production.
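That 20-billion figure falls straight out of the physics: each U-235 fission releases roughly 200 MeV, so per mole it is a standard textbook calculation.

```python
AVOGADRO = 6.022e23      # atoms per mole
EV_TO_JOULES = 1.602e-19

# ~200 MeV is released per U-235 fission
kj_per_mol = 200e6 * EV_TO_JOULES * AVOGADRO / 1000
print(kj_per_mol)        # ~1.93e10 kJ/mol – the "20 000 000 000 kJ/mole" quoted above
```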
The irony of ironies is that the ultra-low carbon, dispatchable power provided by nuclear fits the economic operation of pumped hydro far better. The addition of even modest conventional storage to the fast reactors described above results in formidable clean electricity generation indeed.
I think the reality is that some significant storage will be integrated into grids (where technically feasible) of developed countries (who can afford it). It will certainly play a small-scale role in domestic systems, and odds are you know someone (who knows someone) who already operates batteries connected to rooftop PV. But that * up at the top of this article is crucial – how much is enough? If rejection of dispatchable electricity capacity – from fossil fuels as well as ultra-low emission nuclear – is the goal… rather than rapid and effective decarbonisation of reliable supply, how can anyone know how much impossible storage would be enough?