Is the United States a "Christian nation," as many politicians describe it? Should it be? Justify your answer.