This randomly popped into my head from my childhood, I searched for it online, and apparently it’s not even that uncommon of a Christian take today.
The Bible says God gave man dominion over the Earth, and many Christians take "dominion" in the colonial sense.
I've had some leaders who would concede that we are to be good stewards of the Earth, so maybe don't set fire to forests… while defending the people destroying forests and other ecological structures.