PauseAI US

About

Our proposal is simple:

Don’t build powerful AI systems until we know how to keep them safe. Pause AI.

Surveys of thousands of AI researchers estimate a 1 in 6 chance of human extinction from uncontrollable, superintelligent AI. Uncontrollable, superintelligent AI is exactly where we are headed if the AI industry gets its way. There are almost no guardrails on the development of frontier AI. AI companies are allowed to train models of any size, regardless of danger level. Some AI company CEOs have admitted there's a chance their technology destroys humanity – but they're willing to roll the dice anyway.

Our proposal is simple: Don’t build powerful AI systems until we know how to keep them safe. Pause AI. We advocate government regulation of the chips that are necessary to train and run the most powerful AI, pausing training runs above a threshold of computing power, and international cooperation to ensure that no company or country builds unsafe AI.

If we Pause AI, we can enjoy the benefits of safe AI systems below the Pause threshold while we take the time to ensure that it's safe to build even more powerful systems, which could be even more beneficial.

Volunteer Opportunities