Why do I bother watching nature documentaries? I know they’re rubbish, and yet each time I’m like “maybe this one won’t suck.”
Right off the bat it’s projecting vicious intent onto nature. Nature isn’t just shit that happens, oh no, it’s A BRUTAL WAR OF DYNASTIES!!!1!!!
“Look at this centipede from the Devonian, but invertebrates wouldn’t be the ones to win the game of survival.” WHAT DO YOU MEAN? WHAT GAME? INVERTS ARE STILL HERE. THEY’RE THE MOST COMMON AND THRIVING LIFEFORMS ON THE PLANET.
And of course the whole genre chooses to fixate on competition and ignore how much of nature revolves around cooperation and symbiosis.
I am begging the media (especially media that sells itself as educational) to stop speaking about nature the same way a 1930s German pseudoscientist would.
And don’t get me started on all those documentaries with celebrity narrators making “funny” commentary over the most mundane things.