An artificial intelligence expert has issued the stark warning that all of humanity will die at the hands of a "hostile superhuman" AI.
Eliezer Yudkowsky, an American decision theory and artificial intelligence theorist and writer, made the comments writing for Time in response to an open letter that called for “all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4”.
GPT-4, or Generative Pre-trained Transformer 4, is a multimodal large language model created by the artificial intelligence research laboratory OpenAI – and the fourth in its GPT series.
"I refrained from signing because I think the letter is understating the seriousness of the situation and asking for too little to solve it," Yudkowsky wrote, arguing that pausing development isn't enough; only halting it entirely will do.
It comes after the Future of Life Institute published an open letter about the technology, signed by Elon Musk and hundreds of other prominent AI researchers and commentators.
The group called for a pause in the development of the potentially dangerous tech, fearing that it will surpass and outsmart its creators.
"The key issue is not 'human-competitive' intelligence (as the open letter puts it); it’s what happens after AI gets to smarter-than-human intelligence," Yudkowsky explained.
He said he is afraid that "a research lab would cross critical lines without noticing."
"Without that precision and preparation, the most likely outcome is AI that does not do what we want, and does not care for us nor for sentient life in general," he added.
Driving home his point, he said: "Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die."
He said the "hostile superhuman AI" would manifest itself like an "entire alien civilization, thinking at millions of times human speeds" and "won’t stay confined to computers for long."
"If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter."