One scenario widely regarded as a menace that might destroy civilization, or at least do enormous damage, is a repeat of the "Carrington Event", the September 1859 geomagnetic storm. Back then there was hardly any electrical infrastructure, but there are stories of telegraph offices catching fire and telegraph operators receiving electric shocks. Now, it is said, with all our electronics, we'd be devastated. Even NASA has gotten into the act, forecasting trillions of dollars in damage from a repeat of such an event and talking about it "disabling everything that plugs into a wall socket". But take a hard look at the mechanism for such harm, and the danger turns out to be quite small.

The way solar flares cause damage is by changing Earth's magnetic field. A flare on the Sun releases a burst of charged particles, a "coronal mass ejection"; these travel outward until they hit Earth's magnetic field (or, more likely, miss it entirely; but this is about the ones that do hit).

The Earth's magnetic field acts as something of a shield, repelling those particles, but as it does so it distorts. (Contrary to another apocalyptic myth, we ground-dwellers don't need that shielding; the atmosphere is a much better shield. It's only in low Earth orbit that the magnetic field plays a big role in shielding against charged-particle radiation.) The distortion in the field is what causes the danger: a changing magnetic field induces electrical currents, which can be damaging.

But the size of the change in magnetic field is minuscule. That NASA web page speaks of the Carrington event as producing anywhere from -800 to "a staggering -1750" nanotesla. (The numbers are negative because it's usually a diminution of the field rather than an increase.) For those unfamiliar with the units here: a one-tesla magnet is quite a strong magnet; you can hold a magnet that produces a field of that strength in your hand, but it's a dangerous object which can easily crush fingers. And 1750 nanotesla is less than two millionths of a tesla. Earth's normal magnetic field is between 25 and 65 microtesla, so we're talking about a variation of less than ten percent of its normal strength. Even that normal strength is so weak that compass needles have to be mounted on low-friction pivots to get them to point north-south.
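A quick back-of-envelope check of that "less than ten percent" figure, using the numbers above:

```python
# How big is a 1750 nT disturbance relative to Earth's normal field,
# which ranges from roughly 25 to 65 microtesla?
disturbance = 1750e-9       # 1750 nanotesla, expressed in tesla
field_weak = 25e-6          # weak end of Earth's normal field (tesla)
field_strong = 65e-6        # strong end of Earth's normal field (tesla)

print(disturbance / field_strong)  # ~0.027: under 3% of a strong field
print(disturbance / field_weak)    # 0.07: 7% of even a weak field
```

So even at the weak end of Earth's field, the "staggering" disturbance is a single-digit-percentage wiggle.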

So we see instantly that talk of a Carrington event killing every electronic device is just nonsense: a cell phone, for instance, experiences far more magnetic disturbance from simply being flipped over, which instantly reverses the polarity of the field applied to it, a 200% change. And even that doesn't come close to affecting it, let alone damaging it. Even taking one of those one-tesla magnets and quickly running it over your cell phone is unlikely to do much. (I just tried it with my own cell phone, with no evident effect, though I wouldn't have been too surprised to see some minor effect.) Magnetic induction is weak; motors, generators, and transformers usually have not just one loop of wire but hundreds of loops to amplify the effect.

Another thing about magnetic induction is that it isn't the total change in the field that determines the induced current; it's the speed of the change. The nanotesla figures above are the total change; they say nothing about how fast it happens. It's slow. When studying these events, researchers commonly graph the magnetic field with the horizontal axis in days. Plotted on that scale it looks like it varies quite fast, but it's not varying anywhere near as fast as the current in an ordinary 60Hz or 50Hz electric motor; its timescale is maybe a thousand times slower than that.
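To put rough numbers on that rate comparison: the 20-second storm timescale below is an illustrative assumption (chosen to match the "thousand times slower" claim), and the same 1750 nT swing is used on both sides purely to make the rates comparable.

```python
# Compare the rate of field change in a geomagnetic storm versus 50 Hz AC.
# Both swings are taken as 1750 nT purely for an apples-to-apples comparison;
# the ~20 s storm timescale is an assumption, not a measurement.
swing = 1750e-9              # field swing in tesla
storm_period = 20.0          # assumed storm variation timescale (seconds)
mains_period = 1.0 / 50.0    # one cycle of 50 Hz mains (seconds)

storm_rate = swing / storm_period   # tesla per second
mains_rate = swing / mains_period   # tesla per second

print(mains_rate / storm_rate)      # 1000.0: mains fields change 1000x faster
```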

So with this "staggering" effect really only being slow single-digit percentage changes in an already weak magnetic field, and with the effect being inherently incapable of being much larger (it can only distort Earth's natural magnetic field, not create a much larger field), how is there any risk at all?

Well, if you have an electrical loop whose size is measured in miles, or (better) hundreds of miles, you can pick up enough current to matter. The telegraph stations were like that. At a telegraph station, there might be one long line going off to, say, an office in Chicago, and another long line going off to an office in Dayton. There might also be a line straight from the Chicago office to the Dayton office. Take those three lines together, and they form a large loop; add a large solar event, which shifts the Earth's magnetic field, and you get induction into that loop.

Of course there were breaks in the loop: the line going to Chicago wasn't normally electrically connected to the line going to Dayton. But the wires often ran next to each other as they passed through buildings, providing places where they could arc over to each other and start a fire. With loops encompassing thousands of square miles, there was plenty of voltage to drive that arcing. Still, it wasn't a matter of raw power from space making humans look puny; rather, the telegraph companies had strung some really big magnetic-loop antennas with spark gaps in just the right places to set fire to their buildings or shock their operators. (Note that accounts of the event speak of telegraph operators being "shocked", not "incinerated".)
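Faraday's law makes the loop argument concrete: the induced EMF is the loop area times the rate of field change. The numbers below are illustrative assumptions (a 10,000-square-mile loop, the full 1750 nT swing over one minute), not historical measurements:

```python
# EMF induced in a single large conducting loop: emf = area * dB/dt.
SQ_MILE_M2 = 2.59e6              # square meters in one square mile

area = 10_000 * SQ_MILE_M2       # assumed loop area: 10,000 square miles
db_dt = 1750e-9 / 60.0           # assumed: full 1750 nT swing over one minute

emf = area * db_dt
print(round(emf))                # ~755 volts around the loop
```

A few hundred volts is nothing to high-voltage equipment, but it's plenty to drive an arc between closely spaced telegraph wires.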

For context, that was the era that gave rise to the phrase "shouting fire in a crowded theater". If someone shouted fire in a crowded theater today, people would look around and vaguely wonder what was wrong. Things were very different back then: theaters were tinderboxes which notoriously burned down and trapped people, and shouting fire could cause a stampede. Today, even on the rare occasions when there really is a lethal fire trapping lots of people -- the 2003 Station nightclub fire during a Great White concert, for instance -- people don't panic at the mere suggestion of fire. In that fire the crowd did eventually panic and stampede, but only when the fire quite obviously started threatening their lives. Today's construction standards are very different: we expect sprinklers, multiple exits, and so forth, even when they're actually absent. By our standards those 1859 telegraph stations were wooden shacks that could easily burn down.

These days we don't have the telegraph, but we do have the electrical power grid. Local distribution of electric power is mostly a tree structure (power simply flows from one place out to houses and businesses), but at the national level power is connected in large "grids" spanning hundreds or thousands of miles. Power doesn't just flow one way in those grids; it may go from wherever there's generation capacity available at the moment to wherever there's demand at the moment. These grids include loops whose area is measured in tens of thousands of square miles.

Even so, the power grid operates at very high voltages: hundreds of kilovolts. Equipment operating at hundreds of kilovolts basically laughs at the sorts of voltages that caused sparking in those old telegraph offices. Instead the risk arises from a peculiar weakness of high-voltage transformers. Transformers operate on AC. The currents induced by solar flares change so slowly that from a transformer's point of view they're DC. High-voltage transformers are very unsuited to having DC added to the normal AC. For one thing, the normal AC currents are rather low; that's the whole point of using high voltage in the first place: to reduce the current needed to transmit a given amount of power. At 200 kilovolts, a ten-megawatt transformer only needs to carry 50 amps. If that transformer is already operating at 35 amps AC, and a solar flare adds another 20 amps of DC, it will exceed its rating and might overheat and burn out. This despite the fact that the amount of energy driving that DC is minuscule compared to the amount driving the AC: the DC isn't transformed, so there's no counter-EMF opposing it. Still, its mere presence adds to the current through the transformer. Besides simply adding to the current and thus increasing resistive heating, it also produces a phenomenon known as "core saturation", where the iron core reaches the limit of how magnetized it can get. That causes large increases in the AC part of the current, and thus even faster overheating.
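The transformer arithmetic above, spelled out. (The simple addition of AC and DC amps mirrors the text's reasoning; real heating depends on the full combined waveform, so treat this as a sketch.)

```python
# Rated current of a 10 MW transformer at 200 kV, and the effect of
# adding a quasi-DC geomagnetically induced current on top of the AC load.
power = 10e6                  # transformer rating: 10 megawatts
voltage = 200e3               # transmission voltage: 200 kilovolts

rated_current = power / voltage
print(rated_current)          # 50.0 amps

ac_load = 35.0                # normal AC operating current (amps)
gic_dc = 20.0                 # quasi-DC induced by the storm (amps)

print(ac_load + gic_dc > rated_current)  # True: 55 A exceeds the 50 A rating
```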

Now, this isn't hard to stop: just turn the transformer off when you notice it happening. Transformers commonly come with automated systems to do exactly that. (The electrical grid in general has quite a lot of shut-offs, many of them automatic; that's how we get cascading blackouts like the 2003 Northeast blackout: equipment protecting itself and shutting down.) Turning the transformer off may cause a blackout, but the transformer remains intact and can be turned back on after the event. The apocalyptic scenarios (and the near-apocalyptic dollar figures) rely on transformers burning out and taking months or years to replace; a one-day blackout is nothing in comparison. Besides the low-level automated shut-offs, there are centralized monitoring stations for the grid that track induced DC currents and can shut things down. In addition there are people (including at NASA) watching the Sun for solar flares and giving warnings days before the actual impact. For there to be a real problem, all of those would have to fail. The sun-watchers and grid monitors get regular practice from ordinary solar flares, so failure is unlikely.

The risk from monster solar flares is often compared to the risk from nuclear electromagnetic pulse. There is a great deal of similarity, but solar flares lack the "pulse" part. A high-altitude nuclear burst creates lots of charged particles which distort the Earth's magnetic field in much the same way solar flares do, and thereby poses the same sort of risk to the power grid; but nuclear EMP also generates a lot of high-frequency electromagnetic energy. The above-quoted line about solar flares "disabling everything that plugs into a wall socket" is, in context, merely talking about what happens when the power goes off and stays off. But with nuclear EMP, enough energy might come roaring down the power lines (collected by all those miles of wire acting as an antenna) to actually fry devices that are plugged into them. It still would have a hard time frying devices that aren't plugged into anything, though. Despite much talk of it killing the electronics in cars, when the federal EMP Commission did tests on cars in the early 2000s, nothing particularly dire happened to them: the worst was that they had to be restarted. But there still is quite a lot that is plugged in, and the risk to it is worth respecting.

Among the things that are plugged in are the aforementioned safety devices that monitor the DC current through big high-voltage transformers and shut them off if it goes out of bounds. That's how the EMP Commission posited a serious risk to the power grid: first the high-frequency pulse destroys the electronics in the automatic shutoffs, and then the unprotected transformers get fried by the low-frequency fluctuations in the Earth's magnetic field. This is to be taken seriously (such as by designing such shutoffs to resist EMP).

Because if that fails, we are indeed in trouble. Large high-voltage transformers are not things that normally wear out. Since they have long service lives, their production is quite slow. People speak of lead times of months or years for ordering one. I have not looked into this aspect in detail, but expect that production could be sped up considerably if need be: there's nothing fundamentally difficult about manufacturing a transformer, even a high voltage one. Whether, in such an event, it actually would be sped up is another matter; highly competent people would have to be in charge of the effort, and both our political parties are weak on technology.

As for the wider topic of protecting from EMP, that's too much to address in this post. The subject is quite complicated. Indeed, one could write a book on it. On one level it can be simple: putting something inside a Faraday cage protects it completely. But inside a Faraday cage it cannot communicate with the outside world. Pass wires into the cage for communications (or even for power), and you've introduced a vulnerability. However, it's a vulnerability that people already are dealing with every day. Surges on power lines and telecommunication lines are present even in ordinary circumstances, and the surge suppressors people install are also useful against EMP -- though not necessarily enough, depending on the strength of the EMP. (They may not even be enough for ordinary surges; there's a lot of junk out there.) The subject just of proper grounding of surge suppressors has considerable complexity in itself; a grounding scheme that works at low frequencies often utterly fails at high frequencies.

Protecting high-bandwidth communications lines from EMP is another whole topic in itself. With power lines, you can just filter out all the high frequencies, via a combination of inductors and capacitors. But communications lines have to let high frequencies through. Again, though, a large part of the job is already done, since it has had to be done: unprotected lines not only would pick up EMP but would pick up other signals and would broadcast the signals traveling down them. (There is a reciprocity principle at work here: an antenna that is good at transmitting is also always good at receiving, and vice versa.) The business of preventing unwanted interference is called "electromagnetic compatibility"; books are written on how to achieve it, and government regulations demand it. The wires themselves are chosen for this purpose: wires that transmit high-bandwidth signals are generally coaxial cable or twisted pair, which minimize unwanted transmission and reception. Optical fiber, which is increasingly replacing them, is immune to EMP. So it's not right to paint the world as altogether naked to EMP, but also not right to paint it as completely protected.
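As a sketch of the power-line filtering idea mentioned above: a simple LC low-pass section passes 50/60 Hz power while attenuating the fast content of a pulse. The component values here are made up for illustration; a real EMP filter is a much more carefully engineered device.

```python
import math

# Cutoff frequency of a basic LC low-pass filter: f_c = 1 / (2*pi*sqrt(L*C)).
L = 1e-3    # 1 mH series inductance (illustrative value)
C = 10e-6   # 10 uF shunt capacitance (illustrative value)

f_c = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(round(f_c))  # ~1592 Hz: far above 50/60 Hz power, far below a pulse's
                   # megahertz-range content
```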

Solar flares, though, are simpler. We're basically protected against those. It's not a giant electrical fist from the sun smashing civilization, it's a narrow vulnerability in high voltage transformers, and its danger can be averted with even modest care.