Originally posted by Kolar
I just want to make the point that everything isn't black and white just because the law or the majority of a society's population says it is. The right to bear arms is a wonderful example. It is a fundamental right for Americans, but I can't see how the majority of people believe this right improves the quality of life in America. If it doesn't, then it is immoral, isn't it? More or less immoral than torturing prisoners? More or less immoral than executing mass murderers? More or less immoral than anything I have proposed? I wouldn't presume to say, but I believe it's something to think about.