Everyone knows of Stephen Hawking. Some know him as the man whose work helped establish the Big Bang theory; others know him as the author of ‘A Brief History of Time’, still one of the most popular science books ever written; and some just know him as that clever man in a wheelchair who needs a computer to speak, a result of his motor neurone disease. But one thing is certain about Stephen Hawking: when he speaks on matters of science, people listen.
You may have seen some headlines last month about a prediction Hawking made concerning the ‘God particle’. The ‘God particle’ is, of course, another name for the Higgs boson, finally found last year after nearly 50 years of searching, and the particle believed to give everything its mass. Without it nothing would exist. Ironically, the discovery cost Hawking a £100 bet on whether it existed, but that has not stopped him predicting what its future may hold.
In the foreword to the new book ‘Starmus’, Hawking paints a bleak picture of what this ‘God particle’ might do. He says:
“The Higgs potential has the worrisome feature that it might become metastable at energies above 100bn gigaelectronvolts (GeV).
This could mean that the universe could undergo catastrophic vacuum decay, with a bubble of the true vacuum expanding at the speed of light. This could happen at any time and we wouldn’t see it coming.”
So the ‘God particle’ could wipe out the entire universe at any moment if it were ever to reach an energy of 100bn GeV. Time to get out of the universe, I think.
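To get a feel for how big that number is (a back-of-envelope comparison, not a figure from Hawking's book): the Large Hadron Collider, the most powerful accelerator ever built, has a design collision energy of about 14 TeV, while the threshold Hawking quotes is 100 billion GeV. A quick calculation shows the gap:

```python
# Back-of-envelope scale comparison (illustrative, assumed figures).
# Hawking's quoted danger threshold for the Higgs potential:
instability_threshold_gev = 100e9  # 100 billion GeV = 1e11 GeV

# The LHC's design collision energy, ~14 TeV (1 TeV = 1,000 GeV):
lhc_design_energy_gev = 14e3

ratio = instability_threshold_gev / lhc_design_energy_gev
print(f"Threshold is ~{ratio:.1e} times the LHC's design energy")
# roughly seven million times beyond the most powerful machine ever built
```

In other words, we are nowhere near being able to reach the energies Hawking is talking about, which is exactly the point he makes next.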
He does however put his statement into context, when he says:
“A particle accelerator that reaches 100bn GeV would be larger than Earth, and is unlikely to be funded in the present economic climate.”
Well, thank God for the current recession! However, if and when the economy does recover, we will need to keep an eye on any evil geniuses who might be ‘experimenting’ in space.
Obviously this scenario is unlikely to play out, even if Hawking is theoretically correct, but it is not the first time he has turned heads with a prediction.
In 2010, Hawking warned that perhaps we were making a bit too much noise from our planet, and could be drawing the attention of undesirable extra-terrestrials. He said:
“If aliens ever visit us, I think the outcome would be much as when Christopher Columbus first landed in America, which didn’t turn out very well for the Native Americans.”
It is a fair enough prediction: you would imagine that any alien life form capable of visiting us would most likely have technology superior to our own, having travelled potentially thousands of light years across the galaxy to get here. Hawking goes on to say:
“I imagine they might exist in massive ships, having used up all the resources from their home planet. Such advanced aliens would perhaps become nomads, looking to conquer and colonize whatever planets they can reach.”
Even though that is pretty much the plot of Independence Day, it is still a valid point. Is it not for the same reason that we are currently searching for Earth-like planets in our galaxy? Hawking reckons we have about 1,000 years of use left in this planet, and when that is all done, what next? Watch out, Universe, the humans are coming!
This prediction also inspired a film, ‘Skyline’, in which aliens pick up a distant communication from Earth and decide to pay us a visit, hoovering up all the humans as they go.
It is not just particles and aliens that Hawking thinks could be the end of the human race; he has also weighed in on artificial intelligence. He refers to a more recent film, ‘Transcendence’, in which Johnny Depp plays a scientist who uploads his consciousness into a computer to try to advance mankind, but ultimately loses control in the process, resulting in a full technology blackout.
On the subject of AI, Hawking states:
“Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.
So, facing possible futures of incalculable benefits and risks, the experts are surely doing everything possible to ensure the best outcome, right? Wrong. If a superior alien civilization sent us a message saying: ‘We’ll arrive in a few decades,’ would we just reply: ‘OK, call us when you get here — we’ll leave the lights on’? Probably not — but this is more or less what is happening with AI.”
What Hawking is saying is that the commitment to engineering seems to supersede any threat the end product might pose to humanity, self-driving cars being an obvious example. Engineers don’t seem to care much that people actually enjoy driving.
He suggests that the creation of AI would be the biggest event in human history, but could also be its last. Finally, he states that “Humanity has a tendency to fall in love with its own cleverness and somehow never consider that something might go wrong.”
So there we go: a vacuum-destroying particle, invading aliens, or an uprising of artificial intelligence. Which one will be the end?