One element of systems thinking that Donella Meadows touches on poetically is the human desire to understand complexity. She says, “Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure” (Meadows, 175). In the readings synthesized for this blog post, several bright minds use a systems thinking approach to overcome this tendency to quantify.
In a pilot study on ecosystem health by Cleland and Wyborn, the standardized research template of methods, results, and analysis was varied by incorporating visual methods to draw out participants’ basic underlying assumptions: participants drew a picture and then played a board game as part of the methodology. These visual methods were used with tact, chosen for their “ability to uncover a range of new understandings about the relationships between self-identity, disease, and health” and to “gain access to representations of knowledge that would not otherwise be accessible” (Cleland and Wyborn, 416). Rather than letting the researchers’ own values be embedded within the survey questions and potentially limit the scope of the conversation, this more creative format of response gave their study a distinctive quality. Toward the end of their paper, they discuss the limitations of piloting a new form of research in this manner; still, its high ceiling for drawing out participant beliefs and values can help explain some of the complexity often discussed in systems thinking.
Cleland and Wyborn were trying to innovate the way research draws out values and beliefs. It turns out, though, that researchers and the organizations operating nuclear reactors have some commonalities. Both can develop tendencies toward tunnel vision, confirmation bias, oversimplification, and hindsight bias. Deetz says that it was the confirmation of these basic assumptions that led to the Fukushima Daiichi accident. Another parallel I drew from comparing these highly specialized experts is the disadvantage of a lack of diversity, or of homogeneous expertise, within an organization. That homogeneity leads to “alternative possibilities and information that challenge basic assumptions tending to be unconsciously avoided” (Deetz, 2). Additionally, I find it interesting that the two lessons Deetz proposes drawing from the Fukushima disaster are to promote more open dialogue in organizations and to develop additional resilience capacities. This ran counter to what I expected. Most people probably think of the disaster as an engineering failure; after a period of reflection, however, it turns out the failure stemmed from a lack of information flow between actors.
So far, we are painting a picture of systems thinking that depends heavily on being mindful of when you are being objective and when you are being subjective, in order to innovate through otherwise stagnant situations.
Question: How do you incorporate your values and expertise to navigate a topic as value-laden and multidisciplinary as the environment?