Since when are humans supposed to dominate the world? Humans control nature... is that how it's supposed to be? That's a real question, not one you can answer by typing me six or seven lines on this website. I've pondered it for years. Why has the population hit 7 billion? Why, since 1350, the age of the Black Death, has the human population spiked so heavily? It's because such drastic measures were taken - discoveries, breakthroughs, medical revolutions - and those drastic measures laid the foundation of modern science... bacteria? What the hell was that in 1350? No, it was God getting pissed off. In nature, sickness means you are weak. You get left behind. The cheetah will get you at some point; who cares, you've reproduced, you're now essentially a useless mouth to feed.
I don't believe in that, but I think there's a lot of value in it for what we should believe. We aren't greater than the rest of the world... we are natural predators, but we are not the only species on earth. Draug, I agree, and I am saying drastic things for a reason. We do settle on six of the seven continents in huge numbers, but why are the trees coming down and why is the ozone going away? It's not the rats and the coral reefs and the Native Americans that did that. They all have something in common... they know their place in nature and they share it with everything else. They take only what they need. We don't need seven continents to sustain us. I think it's really sad that the only natural wonderlands left are the Arctic and the tropical paradises in Africa, Asia, and South America. We consider all the people who live there to be low, third-world, whatever you want to say... but they live in the world instead of living above it.