The next 100 years will be among the most perilous for the human race, the renowned theoretical physicist and cosmologist Stephen Hawking cautioned in a radio interview earlier this week, as advances in science and technology will outpace our ability to escape Earth in the event of an apocalyptic emergency.
Hawking, speaking to Radio Times ahead of a scheduled BBC Reith Lecture on black holes, said that advances in research are set to create “new ways things can go wrong” in the years ahead, while it will be at least a century before we can establish a human colony in space.
As a result, he said, chances are that a global disaster will strike before we have a home among the stars to which we can flee. Such a disaster, Hawking explained, would most likely be caused by nuclear weapons or genetically engineered viruses, The Guardian reported.
“Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next 1,000 or 10,000 years,” he said, according to BBC News. “By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race.”
“However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period,” Hawking added. “We are not going to stop making progress, or reverse it, so we have to recognize the dangers and control them.”
Despite a decade of warnings, Hawking isn’t all doom-and-gloom
Hawking went on to call himself “an optimist” and said he was confident that humanity could make the changes needed to keep from destroying itself with its own inventions. Still, as Gizmodo noted, this is far from the first time the 74-year-old professor and author has warned of impending doom brought on by our own creations.
In 2006, he posted a question online asking how the human race could survive another 100 years “in a world that is in chaos politically, socially and environmentally,” later admitting that he did not know the answer himself, which was why he had asked. The following year, speaking in Hong Kong, he warned of global nuclear war and other man-made disasters.
Since then, he has cautioned that artificial intelligence could be the “worst mistake” humanity has ever made: while AI offers tremendous benefits, the behavior of such a system would become unpredictable once it grows exponentially more powerful. Hawking has called for increased oversight of AI and for a ban on “autonomous weapons beyond meaningful human control.” He has also expressed concern over our attempts to contact aliens.
In spite of such concerns, Hawking also said that it was “a glorious time to be alive and doing research in theoretical physics” and that nothing could match “the Eureka moment of discovering something that no one knew before.” That said, he added, “it’s important to ensure that these changes are heading in the right directions,” according to BBC News.