Gary Gutting, a professor of philosophy at the University of Notre Dame, argues that seeking contact with extraterrestrials could put the human race in harm's way, and that the SETI program therefore isn't worth the risk.
As many may recall, Stephen Hawking also took a stance against seeking contact last year.
Here’s some of Gary Gutting’s argument against SETI:
What is likely to happen if we make contact with ETI? Given the size of astronomical distances and assuming the speed of light as the maximum possible velocity, the most likely outcome is not real contact but merely an exchange of messages, perhaps at very long intervals. Little chance of harm there.
But there is still non-zero probability of real contact. Since we have no way of predicting with any certainty the outcome of such contact, it might seem that we have no reason to assume a bad rather than a good result. From this we might conclude that there is no objection to pursuing SETI, if only to satisfy our curiosity.
But we do know this: for the foreseeable future, contact with ETI would have to result from their coming here, which would in all likelihood mean that they far surpassed us technologically. They would be able to enslave us, hunt us as prey, torture us as objects of scientific experiments, or even exterminate us and leave no trace of our civilization. They would, in other words, be able to treat us as we treat animals — or as our technologically more advanced societies have often treated less advanced ones.
This suggests an argument against SETI that is the reverse of Pascal’s famous wager argument for believing in God. Pascal’s idea was that even a small probability of bringing about an enormous good (without risking unacceptable evil) was good reason for acting. This is a reasonable principle: even a small prospect of enormous good can swamp the prospect of more probable but much lesser goods. Pascal’s argument runs into trouble not because of this principle but because of worries about, for example, which God we ought to believe in. (There is also, as William James pointed out, the disconcerting possibility that God might be particularly ill-disposed to people who believe in him through the calculating reasoning of the wager argument.)
The swamping principle also applies to a small possibility of an enormous evil, which can provide a good reason for not acting. This would seem to be the case with ETI. Since there’s at least a small (and perhaps a not so small) probability that they will bring us catastrophic evil, why should we risk such an outcome?
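At bottom, the swamping principle is an expected-value argument: a very small probability multiplied by an enormously negative outcome can outweigh a near-certain modest gain. A toy calculation makes this concrete. The probabilities and utilities below are purely hypothetical numbers chosen for illustration; they come from nowhere in Gutting's piece:

```python
# A toy expected-value comparison illustrating the "swamping principle":
# a tiny probability of an enormous loss can dominate the calculation.
# All probabilities and utilities are hypothetical, for illustration only.

def expected_value(outcomes):
    """Sum of probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Pursuing contact: almost certainly a modest gain (satisfied curiosity,
# new knowledge), but a small chance of civilizational catastrophe.
pursue = [(0.999, 10), (0.001, -1_000_000)]

# Abstaining: nothing gained, nothing risked.
abstain = [(1.0, 0)]

print(expected_value(pursue))   # roughly -990: the catastrophe term swamps the gain
print(expected_value(abstain))  # 0.0
```

On these (made-up) numbers, the 0.1% chance of catastrophe drags the expected value of pursuing contact far below zero, which is the shape of Gutting's reverse-Pascal argument. The calculation is only as good as the inputs, of course, and we have no principled way to estimate either the probability or the magnitude of the harm.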
Read his full argument at The New York Times: “Will the Aliens Be Nice? Don’t Bet On It”