In ‘Thinking In Systems: A Primer’ one of the most interesting ideas that Donella Meadows describes is ‘bounded rationality’, a term coined by Herbert Simon:
Later on in the chapter the following idea is suggested:
This helps explain something that I’ve noticed happen quite frequently.
Someone who was previously in a non-management role gets pulled into a management position and ‘mysteriously’ starts acting exactly like everyone else in that type of role rather than retaining a holistic view.
The strange thing is that we don’t expect this to happen. The person was on ‘our’ side very recently, so surely they should be able to see both perspectives!
Esther Derby referred to this problem in her keynote at XP2011 where she talked about two different types of information that occur in a system:
When people who have recently moved into a management position are challenged on this, they will often point out that “you can’t see the bigger picture” — which is true, but it still doesn’t account for the fact that they probably aren’t seeing it either!
We’re both just seeing different parts of the system.
Meadows goes on to point out that the design of the system tends to encourage this type of behaviour:
Taking out one individual from a position of bounded rationality and putting in another person is not likely to make much difference. Blaming the individual rarely helps create a more desirable outcome.
Meadows finishes this section of the book with the following suggestion, which I think is especially useful in a consulting environment, where both consultants and management quite obviously tend to suffer from bounded rationality.
I’ve seen various attempts at trying to help people enlarge their bounded rationality at ThoughtWorks, such as:
I think if this type of thing happened more frequently then you’d probably see an enlargement of everyone’s bounded rationality, which would be useful for all involved!