Ted Peters of PLTS, GTU, and CTNS has had a paper published in the Philosophical Transactions of the Royal Society. It is a shortened version of a longer paper you can read here at Counterbalance (which has lots of cool stuff). The paper discusses how individuals of various religious groups (and non-religious people too) think they might react to the discovery of extraterrestrials. The interesting finding is that most religious people do not think the discovery of extraterrestrials would cause them to lose their faith. I think that is a good finding.
For the sake of full disclosure: I participated in Dr. Peters’ research, filling out one of the 1,300 surveys on which he based his data. It was fun. And I have heard him give his talk on this subject before as well.
Besides the fact that most believers don’t see themselves being adversely affected by aliens, there is some interesting data from the “non-religious” side of things.
1) The non-religious are overconfident about the robustness of their worldview relative to others when faced with aliens as a “meaning-shock” (a discovery that shocks your cultural meaning-system… e.g., for Christians the ultimate meaning-shock might be finding Jesus’ bones). The religious just do not see it that way.
2) The non-religious tend to equate technological and moral progress (Question 6 here). This is a very interesting philosophical and ethical issue, and one worthy of further inquiry.
Does advanced technology lead to advanced morals? If so, why and how? Harvard professor Steven Pinker (among others; he lists a few) thinks morality does progress, but he is not sure how. Does advanced technology allow more advanced social structures for the policing of morality? That is a possibility. Are people simply nicer now because of technology? I cannot see any reason why that would be so (and internet comments certainly do not indicate it). What mechanism could account for it?
Aristotle did not consider morality and technology to be progressively connected; they involve different virtues, one dealing with action, the other with production. Hans Jonas agrees, saying “the more closely a phenomenon of collective life is related to morality, the less certain is progress in it” (The Imperative of Responsibility, 169).
Morality has to do with the habits of individuals, who die and lose all their good (and bad) habits. Science and technology are properties of societies and cultures – they operate on a longer time-frame and are cumulative. Science does not die with a scientist; what dies with a scientist are his or her particular virtues and vices. For science and technology, every generation builds on the previous one. For morality, every generation starts anew.
So, lastly, what do I think about ET? I think extraterrestrials are no problem for religion (unless your religion has specifically excluded the possibility of ET), and they might actually help some religions, like ET cults.
But I have another concern. While there might be other life out there, extraterrestrial intelligence (ETI) might not exist. I think the Rare Earth hypothesis may be correct. The lack of contact (the Fermi Paradox) might just mean that there are no aliens out there, either because they never evolved or because, if they do evolve, they invariably kill themselves. This is the “Great Filter,” as some have called it: the unwise shall not pass. Both options are of unimaginable importance.
If we are the only intelligent civilization around because we are the only one yet to evolve, we are utterly unique and inconceivably important. That is a weight we must bear with extreme responsibility.
And what if civilizations do evolve, but they invariably kill themselves? I know some people at SETI and NASA who take this as a serious possibility. Maybe I listened to too much Carl Sagan as a kid. But this ties in directly to my work on human nature and the ethics of technology. As we collectively, as a civilization, gain power, we will find it easier and easier to annihilate ourselves. This much is certain: we are not losing destructive power; we are rapidly gaining it. Without proper controls, annihilation could happen, maybe just by lab accident (as with self-replicating nanotechnology) – “Whoops,” as Bill Joy once described it. Ethics is therefore of paramount importance. We need not only to possess power but to know how to use it rightly, or we may all end up dead.
So on that cheery note, I bid you look at Ted Peters’ paper! It’s more fun than either extinction or work.