But facing a hostile superhuman AI would be like facing "an entire alien civilization, thinking at millions of times human speeds" and viewing humans as "very stupid," Yudkowsky writes.
Source: https://www.newser.com/story/333379/halt-ai-learning-indefinitely-or-everyone-will-die.html
In March, TIME published an essay by AI safety advocate Eliezer Yudkowsky that prompted discussion in the White House press briefing room about the Biden Administration's plans for AI.