Ahhh. THE question. I apologize for not reading the entire thread.
The answer to this is that God has no authority on this planet without an exercise of faith on the part of those who do have authority here, which are men (or women). This is because:
16 The heaven, even the heavens, are the LORD's: but
the earth hath he given to the children of men.
-Psalm 115:16
God cannot lie; in fact, the Bible says that it is impossible for God to lie. And He gave dominion (complete sovereign authority) over this earth to Adam.
26 And God said, Let us make man in our image, after our likeness: and
let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and
over all the earth, and over every creeping thing that creepeth upon the earth.
27 So God created man in his own image, in the image of God created he him; male and female created he them.
-Genesis 1:26-27
God will not violate that word. Man has dominion on this planet, and that has not changed.
If you are really interested in more on this subject, please see my video teaching on my YouTube channel:
God Gives Man Dominion
During this life, on this planet, we are free to choose to live however we want. Certain behaviors will bring bad results, and certain behaviors will bring good results. This is the biblical law of sowing and reaping, and it applies to everyone everywhere, regardless of what one believes or does not believe. It reaches beyond generational lines, and our choices can affect many generations after us, for good or ill. That is why most of the world's population is born into great poverty, and some are not; why most live barely-getting-by lives, and some do not. It is a combination of one's own choices and the choices of one's ancestors that has brought each person to whatever state they are in today. However, because sin has been let loose in the earth, none of us can escape the fact that death will come to us, no matter what the state of our life may be. After death comes judgment, and then either the life of God for eternity, or the second death, separation from God for eternity; what we call hell.
God, however, in His graciousness, has provided hope for the life that now is, and for the life that is to come, through the avenue of faith in the promises that were purchased by the blood of Christ on the cross of Calvary. Christians, by definition, no matter what stripe they are (i.e., Catholic, Protestant, Pentecostal, Charismatic, Word Of Faith, etc.), have all expressed faith in the promise of salvation by faith in the finished work of Jesus, who died in our place to pay the price for our sin and release us from the grip of eternal death, the second death. We will be with God forever in the light of life.
But the promises also give us access to the intervention of God on this earth. In fact, it is this life on this earth that is supposed to teach us how to live by faith. Unfortunately, most Christians, aside from trying to be 'good', never really progress beyond believing God for heaven. Therefore they continue to live much as they did before, and God is not allowed to work on their behalf. A few put the Word of God first in all things; they begin to believe the promises of God and begin to defend themselves from the effects of sin that are endemic in the world around us. But this is a very small minority of the Christian community.
Hope this helps answer your question.
Peace...