Who is it that has placed such importance on America? This is a question I genuinely have but never seem to find an answer to.
Is America significant to biblical prophecy? I thought the important nation was Israel, and that God sees things through the eyes of Jews and Gentiles (which has NOTHING to do with location of birth).
For decades now, we have heard how powerful America is, and how God loves her so and is on her side, yet she is falling apart right before our very eyes due to arrogance, disobedience, and greed.
Christianity in America, it seems, has done nothing to save her. All we have to offer is the belief that we are far more important than we actually are.
Because America and North America have prostituted Christianity, we are about to pay the price. How can we truly believe that God will protect institutions filled with evil and corruption?
The time has come to humble ourselves and get right with God. I find it odd that the complaining and screaming about "our rights" continues to take center stage.
I guess it all boils down to perspective.