Thinking in Systems
Thinking in Systems is an introductory book on system dynamics, written mainly by Donella Meadows, a member of the system dynamics group at MIT. System dynamics is a way of doing science by modelling a system in terms of feedback loops, flows, and stocks. The book has three parts: the first introduces the fundamental concepts of systems thinking (feedback loops, flows, and stocks); the second shows how these tools apply to real-world scenarios such as economic, social, and environmental systems; the third discusses leverage points and decision making in systems.
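To make the stock-and-flow vocabulary concrete, here is a minimal sketch (my own illustration, not from the book) of the classic bathtub picture: a stock changes only through its inflows and outflows.

```python
# Minimal stock-and-flow sketch (illustrative only):
# a stock accumulates the difference between its inflow and outflow.

def simulate_stock(initial_stock, inflow, outflow, steps, dt=1.0):
    """Integrate a single stock with constant inflow/outflow rates."""
    stock = initial_stock
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow) * dt  # net flow accumulates in the stock
        history.append(stock)
    return history

# A bathtub filling faster than it drains: the stock rises steadily.
print(simulate_stock(initial_stock=50, inflow=5, outflow=3, steps=10))
```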
Properties of Systems:
Resilience: The ability of a system, by virtue of its balancing (negative) feedback loops, to restore itself to its original steady state after a disturbance (see the sketch after this list).
Self-organization: The capability of a system to evolve, grow, or move into an organized state with very little input from the outside world. There may also be a resilience factor associated with self-organization.
Hierarchy: A “natural” way to form associations between components of similar functionality (“sibling”-ish) and those with higher-level functionality (“parental”-ish).
These properties are not mutually exclusive; in some systems one of them may depend on the presence of another.
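To illustrate the resilience idea, here is a small sketch (my own, assuming a simple thermostat-style balancing loop) in which a stock is pushed away from its goal and the negative feedback pulls it back:

```python
# Balancing (negative) feedback sketch: a thermostat-like loop that
# pulls a stock back toward a goal after a disturbance (illustrative only).

def balancing_loop(goal, initial, gain, steps):
    """Each step closes a fraction `gain` of the gap between stock and goal."""
    stock = initial
    trajectory = [stock]
    for _ in range(steps):
        gap = goal - stock          # how far the stock is from the desired state
        stock += gain * gap         # corrective flow proportional to the gap
        trajectory.append(round(stock, 2))
    return trajectory

# Disturbed from 20 down to 12; the loop restores the stock toward 20.
print(balancing_loop(goal=20, initial=12, gain=0.3, steps=10))
```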
Thinking Caveats
Linear Intuition in a Non-linear World: An apparently unexamined, subconscious belief that the output of a system increases roughly linearly as its input increases.
Artificial Boundaries: Our mental models of real-world systems are usually too platonic and simple, with very sharp boundaries between the different “zones” that are meant to represent states of the system.
Delays: An untrained individual usually only has intuition for the first-order consequences of a delay, whether in a feedforward or a feedback system. Even so, the consequences of a delay are easier to intuit in a feedforward system than in a feedback system (see the sketch after this list).
Bounded Rationality: Making rational decisions based on the incomplete information one has. This caveat is especially interesting because, most of the time, the incompleteness of the information is itself unknown, and it is usually coupled with an irrational tendency to fill in artificial details so that the series of events seems reasonably coherent.
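To illustrate why delays are harder to reason about inside feedback loops, here is a sketch (my own, not from the book) of the same kind of balancing loop as above, except the corrective action responds to the stock's value from several steps ago. The stale feedback makes the stock overshoot and oscillate instead of settling smoothly.

```python
# Delayed balancing feedback sketch (illustrative): correcting toward a goal
# based on stale information causes overshoot and oscillation.

def delayed_balancing_loop(goal, initial, gain, delay, steps):
    """Like a thermostat, but the controller sees the stock `delay` steps late."""
    trajectory = [initial] * (delay + 1)       # pad so delayed readings exist
    for _ in range(steps):
        observed = trajectory[-1 - delay]      # stale measurement of the stock
        gap = goal - observed
        trajectory.append(round(trajectory[-1] + gain * gap, 2))
    return trajectory[delay:]

# With no delay the stock converges smoothly toward 20;
# with a 3-step delay it overshoots past the goal and oscillates.
print(delayed_balancing_loop(goal=20, initial=12, gain=0.5, delay=0, steps=12))
print(delayed_balancing_loop(goal=20, initial=12, gain=0.5, delay=3, steps=12))
```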