Hmm.
From Pew Research: "45% of Americans Say U.S. Should Be a 'Christian Nation'."
But those respondents hold differing opinions about what that phrase means, and two-thirds of U.S. adults say churches should keep out of politics.
The headline's implication is that Americans want "Christian Nationalism," which is a left-wing boogeyman.