One of the occupational hazards systems thinkers face is equating the understanding that something is a system with understanding the system itself. Knowing that the outcomes you see result from complex interactions between myriad components doesn’t endow you with the ability to tweak them skillfully.
In complex systems — the weather, the economy — the interactions between the parts often produce counter-intuitive behavior in the whole. You must observe the functioning system for a long time to develop a useful mental model. (Useful in that it helps you make reasonable predictions about what’s coming next.)
Developing good models of complex systems is very difficult. Even after observing the system for a long time — as people have done with the weather and the economy — what makes it tick may elude us. The larger and more complex the system is, the more there is to take in. Such systems are continually changing; often the best you can do is capture a snapshot of their state at any given time. Also, such systems are often the only ones of their kind. (The sample size for atmospheres precisely like the one that envelops our planet is one.)
The ultimate hazard is hubris. Having understood that something is a system — and perhaps even having developed a good snapshot model of the system — we start to believe we could do as well or better if allowed to start over. You’ll recognize the trap as a violation of Gall’s Law: a complex system that works is invariably found to have evolved from a simple system that worked, whereas a complex system designed from scratch never works. When dealing with complex systems, no individual — no matter how smart or clairvoyant — can design a better system from scratch than one that has evolved to suit a particular purpose over time.