A typical coffee-break question at our institute: what probability of dying would you accept if, should you survive, you would become an immensely powerful and enlightened posthuman? The fun thing is that even among the fairly radical transhumanists I know, the majority accept at most somewhere around 10%, even though they believe a posthuman life could be *amazingly* much better than our current one. However, a few take their estimates seriously and would accept something like a 99% chance of dying, since the reward outweighs the risk by so much. In EP, exhumans are likely making the same estimate, and accepting enormous risks for a shot at becoming gods. Of course, with backups you can even try again and again if you fail - so what if a hundred copies of you die in agony or are driven permanently mad, if one of you gets to transcend all human limitations?

A related question: what risk of wiping out mankind is acceptable for learning something interesting in your favourite field? From a friend who had been arguing with him about AGI-related existential risks, I learned that a world-famous AI researcher agreed that even if he believed there was a 10% chance of his research leading to the end of the world, he would still do it. Not the one-in-a-billion (or less) chance that LHC physicists seem to accept as the existential-risk threshold for when to be concerned, but a 10% chance. Maybe the researcher was not taking his numbers seriously, but it is still intriguing.
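To make the trade-off concrete, here is a back-of-the-envelope expected-utility sketch (the symbols $u$, $U$ and $p$ are mine, purely for illustration): let $u$ be the value of an ordinary human life and $U$ the value you assign to a posthuman one. Accepting a death probability $p$ looks rational whenever

$$(1 - p)\,U > u \quad\Longleftrightarrow\quad p < 1 - \frac{u}{U},$$

so if you genuinely believe $U = 100\,u$, anything up to a 99% chance of dying comes out as acceptable. Presumably the same arithmetic lies behind the researcher's 10% figure: a large enough expected payoff swamps almost any risk, at least on paper.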