To be clear: the United States is not and has never been a “Christian Nation.” As David French, a conservative Christian and New York Times columnist, recently observed, “America has always been a ...