American Nihilism
Nihilism is a philosophical and theological view that human life has no ultimate meaning: no God, no morality, no sense. The word derives from the Latin nihil, literally "nothing," which also appears in the English word annihilate. Signs of nihilism in America include a great decline in belief in God and religious practice, including…