2021-10-02

Stephen Hawking

Hawking warned that superintelligent artificial intelligence could be pivotal in steering humanity's fate, stating that "the potential benefits are huge... Success in creating AI would be the biggest event in human history. It might also be the last, unless we learn how to avoid the risks."[364][365] However, he argued that we should be more frightened of capitalism exacerbating economic inequality than robots.[366]

Hawking was concerned about the future emergence of a race of "superhumans" able to design their own evolution.[351] He also argued that computer viruses should be considered a new form of life, stating that "maybe it says something about human nature, that the only form of life we have created so far is purely destructive. Talk about creating life in our own image."
