What just happened? More than 100 global figures from the worlds of business, politics, and other fields have signed an open letter urging global leaders to address current and future existential threats, including AI, climate change, pandemics, and nuclear war.

The letter, published Thursday, comes from The Elders, a nongovernmental organization set up by former South African President Nelson Mandela, and the Future of Life Institute, a nonprofit that aims to steer transformative technology towards benefiting life and away from large-scale risks.

Signatories of the letter include billionaire Virgin Group founder Richard Branson, former United Nations Secretary-General Ban Ki-moon, and Charles Oppenheimer (grandson of J. Robert Oppenheimer). It's also signed by several former presidents and prime ministers, activists, CEOs, founders, and professors.

"Our world is in grave danger," the letter starts. "We face a set of threats that put all humanity at risk. Our leaders are not responding with the wisdom and urgency required."

The letter cites the changing climate, the pandemic, and wars in which the use of nuclear weapons has been openly raised as examples of current threats. It warns that worse could come, especially as we still don't know just how significant the emerging threats associated with ungoverned AI will prove.

"Long-view leadership means showing the determination to resolve intractable problems not just manage them, the wisdom to make decisions based on scientific evidence and reason, and the humility to listen to all those affected."

The letter calls on governments to agree on a set of measures: financing the transition away from fossil fuels and toward clean energy, relaunching arms control talks to reduce the risk of nuclear war, and creating an equitable pandemic treaty. On AI, it urges building the governance needed to make the technology a force for good rather than a runaway risk.

MIT cosmologist Max Tegmark, who set up the Future of Life Institute alongside Skype co-founder Jaan Tallinn, told CNBC that The Elders and his organization do not see AI as "evil," but fear it could be used as a destructive tool if it advances rapidly in the hands of the wrong people.

The Future of Life Institute also published an open letter last year calling for a six-month pause on advanced AI development. It was signed by 1,100 people, including Apple co-founder Steve Wozniak, Elon Musk, and Pinterest co-founder Evan Sharp. That letter didn't have its intended effect: far from slowing down, many AI companies actually accelerated their efforts to develop advanced AI.

Comparisons between AI and nuclear war aren't new. Experts and CEOs warned of the extinction risk posed by the technology last May. And even ChatGPT creator OpenAI has said that an AI smarter than people could cause the extinction of the human race.