Christians don’t believe that we can treat our bodies any way we please and God will magically make it all better.
So why would they possibly think we can do that to the Earth?