
Monday, September 6, 2010

NASA's warnings on the dangers of severe space storms

Back in June I blogged about the potential dangers arising from space storms that could spawn devastating solar flares. This is no joke, nor is it part of the laughable (but conveniently coincidental) 2012 doomsday nonsense. There's actual science involved here; NASA issued a solar storm warning back in 2006 in which it predicted that the worst of it could come sometime between 2011 and 2012. Last year they slightly downgraded their warning, while extending their forecast to 2013—May 2013 to be exact, which sounds eerily specific.

According to NASA, we are heading into a solar maximum, the stretch of the solar cycle most likely to spawn severe space storms—the kind known as "Carrington Events," named after astronomer Richard Carrington, who witnessed a particularly nasty solar flare back in 1859. The flare he documented resulted in electrified transmission cables, fires in telegraph offices, and Northern Lights so bright that people could read newspapers by their red and green glow...in Mexico.

If this is what happened in 1859, imagine what would happen today. Well, we're starting to have some idea—and the news is pretty bad.

A recent report by the National Academy of Sciences found that if a similar storm occurred today, it could cause $1 trillion to $2 trillion in damage to society's high-tech infrastructure and require four to ten years for complete recovery. It could knock out everything from emergency services' systems, hospital equipment, banking systems and air traffic control, through to everyday items such as home computers, iPods and GPS units. Because we rely so heavily on electronics that are vulnerable to geomagnetic disturbances, such a storm would leave governments with a damage bill in the trillions and problems on a cataclysmic scale.

Worse than this, however, would be the potential length of blackouts. According to a Metatech Corporation study, an event like the 1921 geomagnetic storm would result in large-scale blackouts affecting more than 130 million people and would expose more than 350 transformers to the risk of permanent damage. It could take months—if not years—to put everybody back on the grid.

For more reading, I recommend the NASA report, "Severe Space Weather Events--Understanding Societal and Economic Impacts: A Workshop Report" (2008). Excerpt:
Modern society depends heavily on a variety of technologies that are susceptible to the extremes of space weather—severe disturbances of the upper atmosphere and of the near-Earth space environment that are driven by the magnetic activity of the Sun. Strong auroral currents can disrupt and damage modern electric power grids and may contribute to the corrosion of oil and gas pipelines. Magnetic storm-driven ionospheric density disturbances interfere with high-frequency (HF) radio communications and navigation signals from Global Positioning System (GPS) satellites, while polar cap absorption (PCA) events can degrade—and, during severe events, completely black out—HF communications along transpolar aviation routes, requiring aircraft flying these routes to be diverted to lower latitudes. Exposure of spacecraft to energetic particles during solar energetic particle events and radiation belt enhancements can cause temporary operational anomalies, damage critical electronics, degrade solar arrays, and blind optical systems such as imagers and star trackers.

The effects of space weather on modern technological systems are well documented in both the technical literature and popular accounts. Most often cited perhaps is the collapse within 90 seconds of northeastern Canada’s Hydro-Quebec power grid during the great geomagnetic storm of March 1989, which left millions of people without electricity for up to 9 hours. This event exemplifies the dramatic impact that extreme space weather can have on a technology upon which modern society in all of its manifold and interconnected activities and functions critically depends.

Nearly two decades have passed since the March 1989 event. During that time, awareness of the risks of extreme space weather has increased among the affected industries, mitigation strategies have been developed, new sources of data have become available (e.g., the upstream solar wind measurements from the Advanced Composition Explorer), new models of the space environment have been created, and a national space weather infrastructure has evolved to provide data, alerts, and forecasts to an increasing number of users.

Now, 20 years later and approaching a new interval of increased solar activity, how well equipped are we to manage the effects of space weather? Have recent technological developments made our critical technologies more or less vulnerable? How well do we understand the broader societal and economic impacts of extreme space weather events? Are our institutions prepared to cope with the effects of a “space weather Katrina,” a rare, but according to the historical record, not inconceivable eventuality?
Read more.

Saturday, September 4, 2010

Anissimov: Beware botulinum and EMP attacks

Michael Anissimov of Accelerating Future is feeling a bit doomy these days—and for good reason. He argues that we're collectively understating and underreporting non-conventional but thoroughly viable catastrophic risks, including the deliberate spread of botulinum toxin and an EMP attack.

On the latter risk, Anissimov writes:
If an EMP attack came, cars and trucks would just stop. Factories, controlled by computers, would stop. Molten steel on the assembly line would cool and solidify in place due to failure of the heating elements. The vast majority of tractors, combines, and other heavy machinery would become useless. Transformers and other electrical elements, large and small, would be fried. The largest transformers have to be ordered from China and are generally ordered with a year of lead time.

An effective EMP attack on the US would cause tens of trillions of dollars of damage. Cities would run out of food in a few days. The US grain stockpile only has about a million bushels of wheat. Wheat is the only common grain with enough nutrients to sustain someone on an all-grain diet. A bushel is only 60 pounds, and someone needs about a pound of wheat a day to avoid hunger pangs. Ideally two pounds if you are doing manual labor. 60 million man-days of food is not a lot. The population of the United States is 300 million. That means our grain stockpiles are enough food for everyone to eat a fifth of a pound and then they’re gone.
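
Incidentally, the stockpile arithmetic in that passage checks out. Here is the same back-of-the-envelope calculation spelled out in a few lines of Python, using his round figures rather than any official USDA numbers:

```python
# Back-of-the-envelope check of the grain-stockpile figures quoted above.
# All numbers are Anissimov's round estimates, not official USDA data.

BUSHELS_IN_STOCKPILE = 1_000_000      # "about a million bushels of wheat"
POUNDS_PER_BUSHEL = 60                # a bushel of wheat weighs roughly 60 lb
POUNDS_PER_PERSON_PER_DAY = 1         # bare-minimum ration to avoid hunger pangs
US_POPULATION = 300_000_000           # roughly 300 million people

total_pounds = BUSHELS_IN_STOCKPILE * POUNDS_PER_BUSHEL    # 60 million lb
man_days = total_pounds / POUNDS_PER_PERSON_PER_DAY        # 60 million man-days
pounds_per_person = total_pounds / US_POPULATION           # 0.2 lb per person

print(f"Total wheat: {total_pounds:,.0f} lb")
print(f"Man-days of food: {man_days:,.0f}")
print(f"Share per person: {pounds_per_person:.1f} lb (a fifth of a pound)")
```
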
By the way, if you're particularly paranoid about this, you can always convert your house into a Faraday Cage. I'm just not sure how useful all your electronics will be given that everyone else's will be fried.

And in regards to the botulinum risk, he writes, "99.9% of the population will dismiss [it] as not a big deal, due to wishful thinking. It’s all just words on the page, until people start dying."

Tuesday, December 2, 2008

Commission warns of nuclear or biological weapons attack by 2013

A U.S. bipartisan commission is warning that the world will "more likely than not" face a terrorist attack using nuclear or biological weapons by 2013 if governments fail to undertake major security and prevention measures.

The report, titled World at Risk, recommends that the Obama administration appoint a national security aide devoted exclusively to coordinating U.S. intelligence, military and political efforts to curb weapons proliferation.

Other recommendations include:
  • improve safeguards for uranium and plutonium stockpiles and step up measures against nuclear smuggling rings
  • toughen the nuclear Non-Proliferation Treaty
  • ensure access to nuclear fuel for countries committed to developing only peaceful atomic technology
  • prevent new countries, including Iran and North Korea, from possessing uranium enrichment or plutonium reprocessing capabilities
  • urgently tighten security at domestic biological research institutes and laboratories
  • call for an international conference of countries with major biotechnology industries
  • secure nuclear and biological materials in Pakistan
  • constrain a growing Asian arms race
  • agree with Russia on extending essential monitoring provisions of the Strategic Arms Reduction Treaty due to expire in 2009
  • create a White House advisory post on weapons of mass destruction proliferation
The report went on to describe Pakistan as the weakest link in world security, noting that terrorists are more likely to obtain biological weapons than nuclear ones, with anthrax a particular concern.

These threats are "evolving faster than our multi-layered response," the commission warns. "Our margin of safety is shrinking, not growing."

Saturday, March 29, 2008

Large Hadron Collider accused of being an existential threat

Walter L. Wagner and Luis Sancho are pursuing a lawsuit in a U.S. federal court to prevent the Large Hadron Collider, located near Geneva, from being switched on later this summer.

They're afraid that the new giant particle accelerator could destroy the entire planet.

That is, to state the obvious, a rather extraordinary claim; it doesn't get much more serious than that.

Wagner and Sancho argue that scientists at the European Organization for Nuclear Research (CERN) have played down the chances that the collider could produce a tiny black hole or a strangelet that would convert the Earth into a shrunken mass of strange matter.

They also claim that CERN has failed to provide an environmental impact statement as required under the National Environmental Policy Act.

This case illustrates a disturbing new trend -- one that started with the development of the atomic bomb: we are increasingly coming into the possession of technologies that could cause the complete extinction of the human species.

Or, at the very least, technologies that we think might destroy us.

Memories of the Manhattan Project

We don't know for certain that the collider will create a black hole or cause some other unpredictable disaster.

But we suspect that it might. Thus, it has to be considered an existential risk.

This is now the second time this has happened to us.

Back during the early days of the Manhattan Project, a number of scientists voiced their concern that the explosion might start a runaway chain reaction by "igniting" the atmosphere. It was decided that the threat was very low and, as we all know, the United States went ahead and detonated the first bomb on July 16, 1945.

But for a brief moment 63 years ago, some concerned observers held their breath and nervously watched as the bomb lit up the New Mexico sky.

And now we have a new contender for the perceived existential threat du jour.

Let science be our guide

Is the Large Hadron Collider an existential risk? Well, based on our current understanding of physics, we have to conclude that there is a non-zero probability that the collider will act in a way that could destroy the planet.

Just how non-zero is the next big question.

Three years ago, Max Tegmark and Nick Bostrom wrote a piece for Nature in which they took a stab at the question. They warned that humanity has been lulled into a false sense of security and that "the fact that the Earth has survived for so long does not necessarily mean that such disasters are unlikely, because observers are, by definition, in places that have avoided destruction."
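
Their anthropic caveat is worth pausing on. Here's a toy calculation of my own (a rough sketch, not anything from their paper) showing why a long, quiet track record is weaker evidence than it feels: even with a surprisingly high annual risk, a sizable fraction of hypothetical worlds sail through a thousand safe years, and any observers asking the question necessarily live in one of them.

```python
# Toy illustration of the observation-selection point (my own sketch, not
# Tegmark and Bostrom's model): the probability that a world with a given
# annual catastrophe risk gets through N consecutive years unscathed.

def survival_probability(annual_risk: float, years: int) -> float:
    """Probability of zero catastrophes over `years` independent years."""
    return (1.0 - annual_risk) ** years

for annual_risk in (1e-9, 1e-6, 1e-3):
    p = survival_probability(annual_risk, years=1_000)
    print(f"annual risk {annual_risk:.0e}: {p:.1%} of worlds enjoy 1,000 "
          f"catastrophe-free years, and every observer lives in one of them")
```

Even a one-in-a-thousand annual risk leaves more than a third of worlds with a spotless millennium, so a clean history on its own can't rule it out.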

To reach an answer, they combined physics, philosophy and probability theory (and most assuredly a hefty dose of wild-ass guessing) and concluded that the risk of a civilization destroying itself with a particle accelerator experiment is, at most, about one in a billion per year.

Admittedly, one in a billion seems excruciatingly improbable.

So let's have some fun and smash those particles together at extreme velocities.

But I have to wonder: what if they had concluded a one in a million chance? Is that sufficiently low? Remember, we're talking about the fate of all human life here.

What about one in a hundred thousand?

At what probability point do we call it all off?

How will we ever be able to agree? And would our conclusions cause us to become cripplingly risk averse?

I have no good answers to these questions; suffice it to say that we need to keep developing our scientific sensibilities so that we can engage in risk assessment with facts instead of conjecture and hysteria.
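
Still, one crude way to make those thresholds concrete is an expected-loss calculation: multiply each probability by the number of lives at stake. The sketch below assumes a world population of roughly 6.7 billion and ignores future generations entirely, which, if anything, understates the loss.

```python
# Crude expected-loss comparison for the probability thresholds discussed above.
# Assumes roughly 6.7 billion people alive today; future generations are ignored.

WORLD_POPULATION = 6_700_000_000

thresholds = [
    ("one in a billion", 1e-9),
    ("one in a million", 1e-6),
    ("one in a hundred thousand", 1e-5),
]

for label, probability in thresholds:
    expected_deaths = probability * WORLD_POPULATION
    print(f"{label}: roughly {expected_deaths:,.0f} lives lost in expectation")
```

By that yardstick, a one-in-a-million chance of extinction is, in expectation, comparable to several thousand certain deaths, which is part of why "sufficiently low" is so hard to pin down.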

The new normal

Moving forward, we can expect the sorts of objections being made by Wagner and Sancho to become more frequent. Today it's particle accelerator experiments. Tomorrow it will be molecular nanotechnology. The day after tomorrow it will be artificial intelligence.

And then there are all those things we haven't even thought of yet.

The trick for human civilization will be to figure out how to assess tangible threats, determine the level of risk they pose, and decide what action to take.

But it doesn't end there. Inevitably, we will develop technologies that have great humanitarian potential, but are like double-edged swords. Molecular nanotechnology certainly comes to mind.

Consequently, we also have to figure out how to manage our possession of an ever-increasing arsenal of doomsday weapons.

It will be a juggling act in which a single mistake could mean the end of all human life.

Not a good proposition.

Friday, February 22, 2008

A prosthetic that alerts us to the real dangers in life

Humans have an extremely distorted view of risk. We tend to be preoccupied by low-probability but high-profile threats, while remaining largely oblivious to the risks that could actually harm us. Check out this chart:

[Chart: Susanna Hertrich's comparison of perceived threats versus actual hazards]

According to artist Susanna Hertrich, this is because humans have lost their natural instinct for sensing genuine dangers. Her solution? A prosthetic device for lost instincts that literally makes your hair stand on end.

She calls it the Alertness Enhancing Device. It's an art piece, a thesis, and a human enhancement device in one: it stimulates goosebumps, sends shivers down your spine, and makes your neck hair stand up, "waking up the alert animal inside." According to Hertrich, the AED helps you become more alert and ready for the real dangers in life. More here.

Wednesday, January 31, 2007

Why there should be an X Prize for an artificial biosphere

Conventional futurist wisdom suggests that if our atmosphere should completely go to pot -- which it certainly appears to be doing -- humans could still eke out an existence living in self-sustaining biospheres. This would hardly represent a desirable outcome, but hey, it would certainly beat extinction. Moreover, a successful biosphere would prove to be an important step in the direction of space colonization, terraforming and remedial ecology.

But there is one major problem with this suggestion: we have yet to create a closed ecosystem that can support human life for the long term. This revelation seems strange at first, but it's true. We can send men to the moon, but we can't sustain an artificial ecosystem. The fact that we haven't been able to do so needs to be taken much more seriously. The Earth's natural biosphere is still the only functioning one we have; all our eggs are currently residing in one basket.

It's time to revive the biosphere projects of the early 1990s. Given the private sector's recent enthusiasm to develop space tourism technologies, perhaps another X Prize is in order.

BIOS-3 and Biosphere 2

Our inability to create a closed ecosystem is not for a lack of trying. To date there have been two major biosphere projects, both of them failures.

The Soviets conducted a number of experiments in BIOS-3 from 1972 to 1984. Technically speaking it was not a completely isolated biosphere as it pulled energy from a nearby power source and dried meat was imported into the facility. BIOS-3 facilities were used to conduct 10 manned closure experiments with the longest experiment lasting for 180 days. Among its successes, the Soviets were able to produce oxygen from chlorella algae and recycle up to 85% of their water.

More recently there was the Biosphere 2 project in Oracle, Arizona. At a cost of US$200 million, Biosphere 2 was an attempt to create a closed artificial ecological system to test whether and how people could live and work in an independent biosphere. It was a three-acre Earth in miniature, complete with a desert, a rainforest and an ocean. Organizers conducted two sealed missions: the first for two years, from 1991 to 1993, and the second for six months in 1994.

The failure of Biosphere 2

Setting up and managing the parameters that drive a functioning ecosystem proved to be exceptionally difficult. Soon after the launch of the first mission, oxygen levels started to decline at a rate of about 0.3 percentage points per month. They eventually settled at a dangerously low 14% (rather than the nominal 21% of Earth's atmosphere), giving the crew roughly as little breathable oxygen as they would have at an elevation of over 13,000 feet (4,000 m), and team members started to become ill.
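
Taking those figures at face value (a steady decline of 0.3 percentage points per month from the normal 21%), a quick extrapolation shows how little margin the crew had. The real decline wasn't perfectly linear, and the organizers intervened before it could run its course, but the timescale is telling:

```python
# Quick extrapolation from the figures quoted above; the real decline was not
# perfectly linear, and oxygen was eventually pumped in from outside.

NORMAL_O2 = 21.0          # percent oxygen in Earth's atmosphere
DANGEROUS_O2 = 14.0       # percent, roughly where the crew began to suffer
DECLINE_PER_MONTH = 0.3   # percentage points lost per month inside Biosphere 2

months_to_danger = (NORMAL_O2 - DANGEROUS_O2) / DECLINE_PER_MONTH
print(f"About {months_to_danger:.0f} months to fall from {NORMAL_O2:.0f}% to "
      f"{DANGEROUS_O2:.0f}% oxygen")   # roughly 23 months, nearly the full two-year mission
```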

Organizers had no choice but to start pumping in pure oxygen and bring in other supplies from the outside. Biosphere 2 ceased to be a closed system (as much as it could be given the circumstances) and was branded a failure.

As it turns out, oxygen was hardly the only problem. Biosphere 2 also suffered from wildly fluctuating CO2 levels. Most of the vertebrate species and all of the pollinating insects died, while cockroaches and ants started to take over the place. The ocean eventually became too acidic and the ambient temperature became impossible to control (biospheres don't come with thermostats).

Compounding the environmental problems were health and psychological issues that affected the team. After two years of relative isolation, the four men and four women left Biosphere 2 depressed and malnourished. Interpersonal relationships had deteriorated over the course of the two years, creating what the biospherians called a 'dysfunctional family.'

After the first experiment, the Biosphere 2 organizers conducted the shorter, six-month mission that ended in 1994. Once this more focused experiment was complete, the owners decided to change direction and asked Columbia University for advice. Today the facility is largely a place where students can conduct experiments and tourists can loiter.

Lessons learned, lessons not learned

Consequently, the Biosphere 2 project has been considered a big joke. If it's a joke, however, I'm not amused. Biosphere 2 was an important and eye-opening project. It revealed not only the difficulty of managing a closed ecosystem and the fragility of human psychology, but also how challenging it will be for us to manage Biosphere 1 -- the Earth's biosphere -- should things really start to get out of whack.

In this sense, Biosphere 2 should not be considered a failure, but rather a wake-up call to scientists, environmentalists, politicians and the general public. It should have resulted in the immediate creation of similar projects and related research.

Unfortunately, the impetus these days from the private sector is towards the development of space tourism technologies like space planes and space hotels. Perhaps some entrepreneur should start an X Prize for the first viable, long-term biosphere. It is the space tourism industry, after all, that would benefit most from the creation of a working biosphere; humans will not go very far in space without a self-sustaining ecosystem around them.

Moreover, given the rate of global warming and the ongoing depletion of the ozone layer, our atmosphere may start to turn on us. In the more distant future there will be such risks as global ecophagy. In our desperation, we may have no choice but to dwell in temporary biospheres until we learn to fix our broken planet.
