While it varies regionally within both the US and Western Europe, it seems that, overall, the United States has a more religious society than Western Europe does.
Curiously, the US has the separation of church and state enshrined constitutionally, yet religious matters seem to come to the fore in its political life far more than they do in much of Europe. Am I correct in this perception? And if I am, what is the reason for it? Is this a hangover from the early religious settlers in what is now the US, or is there more to it?
Stuff like creationism isn't a mainstream issue in most of Western Europe, notwithstanding the odd exception. Evangelical churches, whilst they do exist here, don't seem to have nearly as strong or as vibrant a following as many of those churches do in the US.