Stephen Hawking: The implications of artificial intelligence - are we taking AI seriously enough?
Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks, says a group of leading scientists Ref. Source 5
Stephen Hawking Says He “Would Consider” Killing Himself in an Assisted Suicide
In England, assisted suicide proponent Stephen Hawking said he would consider taking his own life if he felt he was no longer living a productive life or became too much trouble. According to the Daily Telegraph, Hawking said, “I would consider assisted suicide only if I were in great pain or felt I had nothing more to contribute but was just a burden to those around me.” Ref. Source 7q
Nothing except ourselves, and if we simply spread en masse to another place, that is likely exactly what will happen.
The thing is, if you get the chance for a more controlled exodus, then when designing a colony isolated from Earth, you would have an opportunity to engineer a society built from the ground up to avoid the mistakes humanity has made and continues to make. It won't be perfect, of course; nothing is. But with some forethought, one should be able not only to give the colonists a strong shot at lasting far longer than we have on Earth, but also at attaining the wisdom needed to avoid those problems entirely.
This might seem hurtful, but it isn't meant to be; it is just showing how messed up life can be. Just imagine having all that intelligence to talk about the stars, the universe, and the future, and yet not being able to get your own body up to walk. Life can suck badly.