The United States Of America Is NOT A Racist Nation. And That Is The Truth
Over the years, I have heard numerous blacks and white liberals exclaim, "The US is a racist nation." In fact, I heard that statement so many times that I lost count way back in the late 1960s. Well, I deeply disagree with that view, notion, or whatever you might want to call it. In my opinion,...