Herophant
People often explain morals from a Christian point of view, claiming that Western morals and success are based upon Christianity. Personally, much of what I find wrong with Western society has Christian roots, and many of the "good" effects of Christian norms are certainly not exclusively Christian. Hell, I have even seen people trying to pass off liberal and democratic ideals as Christian ideas. So what is Christian culture? What are the norms and ideals that can be identified as Christian morals?
Secondly, why do people fear that the loss of a hegemonic Christian morality will be the end of society? This is especially puzzling when the more religious Western democracies have considerably more murders, not to mention more teen abortions, which is exactly the kind of thing religious people claim will result from a less religious society.