As a member of American evangelical culture, I heard the same question multiple times over Independence Day weekend: is the U.S. a Christian nation?
Yes, it's a loaded question, and yes, the debate has raged for years.
It, along with the issues of abortion and whether or not Al Gore invented the Internet, will fester until egos dissolve and we can effectively discuss the intellectual, thoughtful reasoning behind our opinions. So basically never.
Originally I wrote several long paragraphs detailing the path to the conclusion I've come to. Instead I have a simple answer with a relatively short explanation.
(Drum roll, please.)
Is the U.S. a Christian nation? No.
Is that a bad thing? No.
The majority of a population checking a certain box on a census form (or one group's assuming the beliefs of an entire population) does not warrant the notion that a certain country "is" one kind of religion or another.
The majority of Americans subscribe to a faith falling under the umbrella term "Christian," just as most Israelis read the Torah and the mainstream beliefs in Iraq follow the teachings of Muhammad. All three religions are diverse and have several, in some cases many, branches.
An example closer to home would be an examination of Campbell's student body. Statistics will tell you most students checked "Baptist" on their application, but that certainly doesn't mean the statement "Campbell has a Baptist student body" is anywhere near true.
What about the Methodists, Catholics and atheists at Campbell?
And what about the Taoists, Hindus, and Buddhists in the U.S.? Their religions should be part of the country's image just as much as Christianity is.
And why is it okay for the U.S. to not be a "Christian nation"?
For starters, there's no reason any country need be defined by a certain religion. A country's religious make-up is one of many facets of its culture, just as race, gross domestic product, and average family size are.
The mainstream evangelical movement makes a big deal about the religion of this country because it thinks the answer determines its success in being evangelical. It's mistaken.
Secondly, bundling countries into teams based on religion is only going to exacerbate global political tension. Defining the U.S. as "Christian" fuels the fire in the Middle East, just as labeling Israel "Jewish" intensifies the crisis and heartache on the West Bank and Gaza Strip.
Think of other situations when a certain philosophy defined a geographical area. Northern Ireland and Ireland. North and South Korea. The North and South in the U.S.
I'm not too sure why some evangelicals so wholeheartedly believe the U.S. is a Christian country. After all, evangelicals are the ones asking the question. No other group is wondering.
Is it just convenient for evangelicals to convince themselves that we can proudly go about life believing God will back us up in every battle and resuscitate the economy, persuading the world that the Great Experiment and capitalism rule the day once again?
Or perhaps it's to hold on to the status quo. To grasp with knuckles whitened a time when every family went to church on Sunday and Blue Laws weren't necessary because the Sabbath was a day of rest for everyone.
Except for Jews whose Sabbath is Saturday, and for Muslims who formally recognize God five times a day with prayer.
And except for families who couldn't afford to take a day off work. Whose inclusion in the church succumbed to a stigma against the poor.
And except for citizens whose heritage didn't include Christianity.
And except for fellow Americans not believing in God at all.
So, except for those minorities, someone can generalize the entire country, all 300 million people, by saying the U.S. is a Christian nation.
I guess some people like to round up.