a Christian America?

1. DeusExLibrus
Mar 10, 2009, 2:36pm

I'm taking a course this semester called "History of Religion in America." We're currently reading a book called A Christian America. It's an incredibly dry, boring book, but it does show something important: America has never been a Christian theocracy to any degree, though it has been culturally Christian for most of its history. I've got another book I'm going to read on my own that discusses the history of non-Christian religions in America, and I'll let you all know what I think of it, along with the other books for this class, once I've read them. In the meantime, I was wondering what you all think of the "Christian America" the Fundamentalists keep claiming, when we don't seem to have historical evidence for it, unless they mean something different than it sounds like they do.