Tuesday, December 3, 2013

LIBERALS FEEL COMPELLED TO LIE ABOUT AMERICA...


Liberals have been telling lies about this country for decades. It's all part of their concerted effort to turn 'We the People' against our own country. Because if we do that, it creates an opportunity for these self-described defenders of the little people to take what the Founders handed to us for safekeeping and to twist it into something completely unrecognizable. They continue in their efforts to create a utopia here on Earth, but all they succeed in bringing about is abject misery for all.
