Speaking just for myself, but this is the end of society as we know it, because:
We are now a nation without even the slightest generally agreed-upon and accepted values, morals, or beliefs. There is no right or wrong anymore; just do whatever you feel like.
The value of hard work is no longer appreciated; sloth is encouraged and sanctioned by the government.
The value of family is no longer appreciated and is openly ridiculed in society, yet it is the glue that holds our society together.
Free-market capitalism and the Christian work ethic built this country. Elected officials in our highest offices are now openly hostile toward these principles.
We are being systematically scrubbed of our national identity as Americans at every turn and being nudged toward a more global identity. One world government so the wealth can be shared.
The government takes 45 to 50% of the fruits of your labor to give to those unwilling to work, and over half the citizens of this country are okay with that.
I could go on for days about this. I'm sorry for the rant, but I am bored today.
