Not true. It was never a Christian theocracy, but Christianity did influence the founding principles of the nation. The first democratic institutions were established by the colonists in their churches. The Bible was taught in our public schools. Some notable people of the day, I think even a SCOTUS justice, called the new country a "Christian nation".
Now, however, that influence is drying up, probably because of the weakening of the American church itself. But unlike you, with your attitude of surrendering and waiting for the last days to conclude, I am going to keep praying for a new great awakening within the American church. That will polish and sharpen us, make us an effective spiritual weapon in God's hands, and (hopefully) bring about cultural and political change and restoration.