Try…Sketch causal loop diagrams at whiteboards with others

The practical aspect of this tip is more important than it may first appear. It is vague and low-impact to suggest “be a systems thinker.” But if you and four colleagues get into the habit of standing together at a large whiteboard and sketching causal loop diagrams, that is a concrete and potentially high-impact practice that connects “be a systems thinker” with “do systems thinking.”

The following examples seem sterile when presented in a book. But imagine you were at a whiteboard with other people and the diagrams were being sketched during a lively conversation. That’s the way we suggest ‘doing’ systems thinking.

Concrete modeling tip: We start by writing on sticky notes to define variables. A note might read “feature velocity” or “# defects.” We place these on a whiteboard. Then we sketch causal link lines between the sticky notes. There will be (or should be) lots of rewriting, erasing, and redrawing during the modeling session. The most meaningful outcome is understanding; in addition, some participants will want to take a digital photo of the whiteboard sketch.
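
If someone wants to keep more than a photo, the variables and causal link lines can also be recorded as simple data after the session. The following is only an illustrative sketch in Python, using the variable names from this example; it is not part of the causal loop notation itself.

    # Illustrative only: capture the whiteboard variables and causal links as data.
    variables = ["feature velocity", "# defects"]
    causal_links = [
        ("feature velocity", "# defects"),  # a causal link line between two sticky notes
    ]
    for cause, effect in causal_links:
        print(cause, "->", effect)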

Notation and Examples

Causal loop diagrams contain many elements; the following common and useful subset is explored through a scenario.

• variables
• causal links
• opposite effects
• constraints
• goals
• reactions; quick-fix reactions
• interaction effects
• extreme effects
• delays
• positive feedback loops

The following simplified scenario is for a particular organization. It is not a generalization.

Variables—Causal loop diagrams include variables (or stocks) such as the velocity (rate of delivery) of software features and number of defects. Variables have a measurable quantity.

[diagram: two variables, “feature velocity” and “# defects”]

Causal links—One element can have an effect on another; for example, if feature velocity increases, then the number of defects increases; that is, more new code, more defects.

[diagram: a causal link arrow from “feature velocity” to “# defects”]

Now it is time to bump into Weinberg-Brooks’ Law and the Causation Fallacy. It is easy to sketch a diagram; it is something else to model with insight. For example, consider the relationship between the number of developers and feature velocity.

The nature of any cause-effect relationship is usually not obvious, though it is common for people to jump to conclusions such as “more developers means higher velocity.” Adding people late in development may reduce velocity (a sub-element of “Brooks’ Law” [Brooks95]). Or, adding more bad programmers could really slow things down. An argument can even be made that removing terrible developers improves velocity.

[diagram: “# of developers” connected to “feature velocity” by a line marked “?”, which in turn links to “# defects”; based upon people’s beliefs (mental models) they will ascribe some causal link between # of developers and feature velocity, and it may not be accurate]

Opposite effects — A causal link effect may be in the same or the opposite direction: in the same direction, if A goes up then B goes up; in the opposite direction, if A goes up then B goes down. An opposite effect is shown with an ‘O’ on the line. Suppose defects going up puts a drag on the system, lowering the velocity of new features because people spend more time fixing or working around bugs.

[diagram: the link from “# defects” back to “feature velocity” is marked with an ‘O’ for opposite effect: as the number of defects goes up, feature velocity goes down]
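
In the same illustrative spirit, an opposite effect can be captured by adding a polarity to each recorded link, using ‘O’ for opposite and leaving same-direction links unmarked, as on the whiteboard. This is just one possible encoding, not an official format.

    # Illustrative only: causal links with polarity ('O' = opposite effect).
    causal_links = [
        ("feature velocity", "# defects", ""),    # same direction: more new code, more defects
        ("# defects", "feature velocity", "O"),   # opposite: more defects drag velocity down
    ]
    for cause, effect, polarity in causal_links:
        mark = " (O)" if polarity == "O" else ""
        print(cause, "->", effect + mark)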

Constraints — Unless you can find people to work for free, there is a constraint on the number of developers, based upon cash supply.

Constraints are not causal links. As cash supply goes up, it is not the case that the number of developers goes up.
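
Continuing the illustrative sketch, a constraint can be kept in a separate list from the causal links so the distinction survives any later rendering; below, both are emitted as Graphviz DOT text, with the constraint drawn as a dashed line. The DOT encoding and the dashed-line convention are assumptions for illustration, not part of the notation.

    # Illustrative only: constraints recorded separately from causal links,
    # because a constraint limits a variable without implying "up causes up."
    causal_links = [
        ("feature velocity", "# defects", ""),
        ("# defects", "feature velocity", "O"),
    ]
    constraints = [
        ("cash supply", "# of developers"),  # cash limits headcount
    ]
    lines = ["digraph scenario {"]
    for cause, effect, polarity in causal_links:
        lines.append('  "%s" -> "%s" [label="%s"];' % (cause, effect, polarity))
    for limiter, limited in constraints:
        lines.append('  "%s" -> "%s" [style=dashed, label="constraint"];' % (limiter, limited))
    lines.append("}")
    print("\n".join(lines))  # paste into any Graphviz renderer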





last updated october 2019