Are We Really A “Fundamentally Racist” Nation?

If you heard a liberal talk about the United States, you might find yourself wondering what country they are talking about – sometimes what planet. Rather than seeing this country as a beacon of hope and freedom for the world – a “shining city on a hill,” so to speak – they see only a horribly racist nation built on beating down anyone who is not white. But, as is so often the case with the liberal worldview, the reality is something else entirely.
