Wednesday, December 17, 2008


Morality Without God

I don't think a lot of people watch the Terminator show on FOX, but the last episode introduced an interesting dilemma: how do you teach a robot to value human life? In the show, of course, they use the existence of God to justify morality and ethics. If God created life in his image, then life must be valuable, with all the ethical ramifications that follow. But how would an atheist approach the problem?

It's a common fallacy among Christians to assume that morality and ethics cannot arise in an atheist worldview. On the contrary, given what we know about how humans and their brains operate, atheists can readily construct a coherent moral system. Humans are social creatures, and society relies on give-and-take, which leads to a "Golden Rule." However, this is based on human-to-human interactions. We know our own value and can understand the value of others. We could even extend this reasoning to animals or aliens, since we are all carbon-based life forms (and we hope the aliens agree, so they won't destroy us with their superior technology). But robots and AI are completely different entities. How do we convince them that human life is special without positing the existence of God?