Wednesday, 24 January 2024

A moment of historic danger: It is still 90 seconds to midnight


Founded in 1945 by Albert Einstein, J. Robert Oppenheimer, and University of Chicago scientists who helped develop the first atomic weapons in the Manhattan Project, the Bulletin of the Atomic Scientists created the Doomsday Clock two years later, using the imagery of apocalypse (midnight) and the contemporary idiom of nuclear explosion (countdown to zero) to convey threats to humanity and the planet. The Doomsday Clock is set every year by the Bulletin’s Science and Security Board in consultation with its Board of Sponsors, which includes nine Nobel laureates. The Clock has become a universally recognized indicator of the world’s vulnerability to global catastrophe caused by man-made technologies.

The Bulletin's Science and Security Board (SASB) is composed of a select group of globally recognized leaders with a specific focus on nuclear risk, climate change, and disruptive technologies.


Ominous trends continue to point the world toward global catastrophe. The war in Ukraine and the widespread and growing reliance on nuclear weapons increase the risk of nuclear escalation. China, Russia, and the United States are all spending huge sums to expand or modernize their nuclear arsenals, adding to the ever-present danger of nuclear war through mistake or miscalculation.

In 2023, Earth experienced its hottest year on record, and massive floods, wildfires, and other climate-related disasters affected millions of people around the world. Meanwhile, rapid and worrisome developments in the life sciences and other disruptive technologies accelerated, while governments made only feeble efforts to control them.

The members of the Science and Security Board have been deeply worried about the deteriorating state of the world. That is why we set the Doomsday Clock at two minutes to midnight in 2018 and at 100 seconds to midnight in 2020. Last year, we expressed our heightened concern by moving the Clock to 90 seconds to midnight—the closest to global catastrophe it has ever been—in large part because of Russian threats to use nuclear weapons in the war in Ukraine.

Today, we once again set the Doomsday Clock at 90 seconds to midnight because humanity continues to face an unprecedented level of danger. Our decision should not be taken as a sign that the international security situation has eased. Instead, leaders and citizens around the world should take this statement as a stark warning and respond urgently, as if today were the most dangerous moment in modern history. Because it may well be.

But the world can be made safer. The Clock can move away from midnight. As we wrote last year, “In this time of unprecedented global danger, concerted action is required, and every second counts.” That is just as true today.

The many dimensions of nuclear threat

A durable end to Russia’s war in Ukraine seems distant, and the use of nuclear weapons by Russia in that conflict remains a serious possibility. In February 2023, Russian President Vladimir Putin announced his decision to “suspend” the New Strategic Arms Reduction Treaty (New START). In March, he announced the deployment of tactical nuclear weapons in Belarus. In June, Sergei Karaganov, an advisor to Putin, urged Moscow to consider launching limited nuclear strikes on Western Europe as a way to bring the war in Ukraine to a favorable conclusion. In October, Russia’s Duma voted to withdraw Moscow’s ratification of the Comprehensive Nuclear-Test-Ban Treaty, as the US Senate continued to refuse even to debate ratification.

Nuclear spending programs in the three largest nuclear powers—China, Russia, and the United States—threaten to trigger a three-way nuclear arms race as the world’s arms control architecture collapses. Russia and China are expanding their nuclear capabilities, and pressure mounts in Washington for the United States to respond in kind.

Meanwhile, other potential nuclear crises fester. Iran continues to enrich uranium to close to weapons grade while stonewalling the International Atomic Energy Agency on key issues. Efforts to reinstate an Iran nuclear deal appear unlikely to succeed, and North Korea continues building nuclear weapons and long-range missiles. Nuclear expansion in Pakistan and India continues without pause or restraint.

And the war in Gaza between Israel and Hamas has the potential to escalate into a wider Middle Eastern conflict that could pose unpredictable threats, regionally and globally.

An ominous climate change outlook

The world in 2023 entered uncharted territory as it suffered its hottest year on record and global greenhouse gas emissions continued to rise. Both global and North Atlantic sea-surface temperatures broke records, and Antarctic sea ice reached its lowest daily extent since the advent of satellite data. The world already risks exceeding a goal of the Paris climate agreement—a temperature increase of no more than 1.5 degrees Celsius above pre-industrial levels—because of insufficient commitments to reduce greenhouse gas emissions and insufficient implementation of commitments already made. To halt further warming, the world must achieve net zero carbon dioxide emissions.

The world invested a record-breaking $1.7 trillion in clean energy in 2023, and countries representing half the world’s gross domestic product pledged to triple their renewable energy capacity by 2030. Offsetting this, however, were fossil fuel investments of nearly $1 trillion. In short, current efforts to reduce greenhouse gas emissions are grossly insufficient to avoid dangerous human and economic impacts from climate change, which disproportionately affect the poorest people in the world. Barring a marked increase in efforts, the toll of human suffering from climate disruption will inexorably mount.

Evolving biological threats

The revolution in life sciences and associated technologies continued to expand in scope last year, including, especially, the increased sophistication and efficiency of genetic engineering technologies. We highlight one issue of special concern: The convergence of emerging artificial intelligence tools and biological technologies may radically empower individuals to misuse biology.

In October, US President Joe Biden signed an executive order on “safe, secure, and trustworthy AI” that calls for protection “against the risks of using AI to engineer dangerous biological materials by developing strong new standards for biological synthesis screening.” Though a useful step, the order is not legally binding. The concern is that large language models enable individuals who otherwise lack sufficient know-how to identify, acquire, and deploy biological agents that would harm large numbers of humans, animals, plants, and other elements of the environment. Reinvigorated efforts this past year in the United States to revise and strengthen oversight of risky life science research are useful, but much more is needed.

The dangers of AI

One of the most significant technological developments in the last year involved the dramatic advance of generative artificial intelligence. The apparent sophistication of chatbots based on large language models, such as ChatGPT, led some respected experts to express concern about existential risks arising from further rapid advancements in the field. But others argue that claims about existential risk distract from the real and immediate threats that AI poses today (see, for example, “Evolving biological threats” above). Regardless, AI is a paradigmatic disruptive technology; recent efforts at global governance of AI should be expanded.

AI has great potential to magnify disinformation and corrupt the information environment on which democracy depends. AI-enabled disinformation efforts could be a factor that prevents the world from dealing effectively with nuclear risks, pandemics, and climate change.

Military uses of AI are accelerating. Extensive use of AI is already occurring in intelligence, surveillance, reconnaissance, simulation, and training. Of particular concern are lethal autonomous weapons, which identify and destroy targets without human intervention. Decisions to put AI in control of important physical systems—in particular, nuclear weapons—could indeed pose a direct existential threat to humanity.

Fortunately, many countries are recognizing the importance of regulating AI and are beginning to take steps to reduce the potential for harm. These initial steps include a proposed regulatory framework by the European Union, an executive order by President Biden, an international declaration to address AI risks, and the formation of a new UN advisory body. But these are only tiny steps; much more must be done to institute effective rules and norms, despite the daunting challenges involved in governing artificial intelligence.

How to turn back the Clock

Everyone on Earth has an interest in reducing the likelihood of global catastrophe from nuclear weapons, climate change, advances in the life sciences, disruptive technologies, and the widespread corruption of the world’s information ecosystem. These threats, singularly and as they interact, are of such a character and magnitude that no one nation or leader can bring them under control. That is the task of leaders and nations working together in the shared belief that common threats demand common action. As the first step, and despite their profound disagreements, three of the world’s leading powers—the United States, China, and Russia—should commence serious dialogue about each of the global threats outlined here. At the highest levels, these three countries need to take responsibility for the existential danger the world now faces. They have the capacity to pull the world back from the brink of catastrophe. They should do so, with clarity and courage, and without delay.

It’s 90 seconds to midnight.


Editor’s note: Additional information on the threats posed by nuclear weapons, climate change, biological events, and the misuse of other disruptive technologies can be found elsewhere on this page and in the full PDF/print version of the Doomsday Clock statement.

