Open Letter Calls for Prohibition on Superintelligent AI

Source: CyberScoop

An open letter released by the Future of Life Institute calls for a halt to the development of superintelligent AI until its safety can be assured and public approval secured. Signed by over 700 individuals, including prominent figures such as Prince Harry and technology experts, the letter voices growing concern over the implications of AI surpassing human intelligence, citing job losses from automation, threats to national security, and potential societal harm.

The letter urges that development of superintelligent AI should not proceed without a strong foundation of scientific consensus and public acceptance. The institute previously called for a pause in AI development in 2023, but major tech firms did not comply. With public opinion polls reflecting increasing urgency, the need for effective oversight and control of AI advancements remains critical, prompting ongoing discussions about governance and corporate responsibility.

👉 Read the original: CyberScoop