We're losing the Christian foundations of this country, and people of faith are letting it happen. It's time to take a page from the Tea Party's playbook, but instead of taking our government back, we need to take our culture back.
Don't misunderstand me: America is a diverse country, and that diversity is one of our strengths. I'm not denying that at all. But America was founded on Christian principles; it's part of our national identity.

If you visit Egypt or Saudi Arabia, you understand that the culture is built on Islamic principles, and as a visitor you're expected to respect that. Likewise, when you go to Mexico, you know that the dominant language is Spanish, and you need to speak it if you want to communicate. Those nations aren't asked to undermine their own foundations to accommodate outsiders, so why are non-Christians demanding exactly that of America? Why are we always the ones accommodating?