Its purpose will be to study the four greatest threats to the human species: artificial intelligence, climate change, nuclear war and rogue biotechnology.
The Centre for the Study of Existential Risk (CSER) will be co-launched by Lord Rees, the Astronomer Royal and one of the world's top cosmologists, the Daily Mail reported.
Rees's 2003 book 'Our Final Century' warned that humanity's destructive capacity meant the species could wipe itself out by 2100.
The idea that machines might one day take over from humanity has featured in many science fiction books and films, including The Terminator, in which Arnold Schwarzenegger stars as a homicidal robot.
In 1965, Irving John 'Jack' Good published a paper called 'Speculations Concerning the First Ultraintelligent Machine'.
Good, a Cambridge-trained mathematician, Bletchley Park cryptographer, pioneering computer scientist and friend of Alan Turing, wrote that in the near future an ultra-intelligent machine would be built.
This machine, he continued, would be the 'last invention' that mankind would ever need to make, leading to an 'intelligence explosion'.
Huw Price, Bertrand Russell Professor of Philosophy and another of the centre's three founders, said such an 'ultra-intelligent machine, or artificial general intelligence (AGI)' could have very serious consequences.
"Nature didn't anticipate us, and we in our turn shouldn't take AGI for granted. We need to take seriously the possibility that there might be a 'Pandora's box' moment with AGI that, if missed, could be disastrous," Price said.
"I don't mean that we can predict this with certainty, no one is presently in a position to do that, but that's the point. With so much at stake, we need to do a better job of understanding the risks of potentially catastrophic technologies," he said.