The ‘wickedness’ of socio-technical ecosystems

by Philip Boxer

Software-intensive ecosystems—systems made up of large numbers of independent software-intensive and human agents exhibiting adaptive behavior—are an increasingly important social, financial, and political force in the world. These systems differ from traditional “closed-world” systems: they are constantly evolving, they have no centralized control, they have many heterogeneous elements, their requirements are inherently conflicting and unknowable, failures are normal, and the boundary between people and systems is blurred. [1]

Such ecosystems have emergent properties – properties that their original designers could not predict – and present a kind of “wicked” problem. [2] Wicked problems have the following characteristics:

  • There is no definitive formulation.
  • They have no stopping rule.
  • Solutions are not true-or-false, but good-or-bad.
  • There is no immediate or ultimate test of a solution.
  • They do not have a well-described set of potential solutions.
  • Every implemented solution has consequences.
  • Every wicked problem is essentially unique.
  • Every wicked problem can be considered to be a symptom of another problem.
  • The causes of a wicked problem can be explained in numerous ways.
  • The planner (designer) has no right to be wrong.

Wicked problems are thus not amenable to traditional reductionist analysis. As Rittel and Webber say: “As we seek to improve the effectiveness of actions in pursuit of valued outcomes, as system boundaries get stretched, and as we become more sophisticated about the complex workings of open societal systems, it becomes ever more difficult to make the planning idea operational”. We simply cannot draw a box around the “system” and analyze it. This presents us with a challenge not just at the level of the software ecosystem, but also at the level of the socio-technical ecosystems that they support. While this challenge appears to undermine our ability to do any meaningful analysis, simply not analyzing such ecosystems is not an acceptable option given that society is increasingly dependent on them – for example, the ecosystems supported by the internet and the “smart grid” for energy production and distribution.

The US Army considered the impact of these wicked problems, which it termed “ill-structured”, in its Commander’s Appreciation and Campaign Design. It concluded that a different approach to problem solving was needed, one that was inductive in nature and concerned with producing “a well-framed problem hypothesis and an associated campaign design—a conceptual approach for the problem.” [3] Thus as much attention had to be paid to the way the problem was framed (i.e., to the way the boxes were defined) as to the subsequent analysis of what was placed within those boxes. The conclusion reached was as follows:

“The issue is whether a commander should begin by analyzing the mission, or whether complexity compels the commander to first understand the operational problem, and then—based upon that understanding—design a broad approach to problem solving. The answer to this question depends upon the problem and the mission. If the problem is structured so that professionals can agree on how to solve it, and the mission received from higher headquarters is properly framed and complete, then it makes sense to begin with the analysis of the mission (breaking it down into specified, implied, and essential tasks). However, if the problem is unstructured (professionals cannot agree on how to solve the problem), or the mission received from higher headquarters is not properly framed (it is inappropriate for this problem), or higher headquarters provided no clear guidance (permissive orders), then it is crucial to begin by starting to identify and understand the operational problem systemically. This is one of the functions of operational art.”

Another way of stating the challenge, therefore, is that we must analyze our understanding of the contexts-of-use into which our systems are being deployed before analyzing any proposed architectures, or architectural changes, for those systems, so that the architectures are as suitable as possible given that understanding. In this way, architecture analysis becomes an alignment mechanism, ensuring that the software infrastructure we build is as appropriate as possible to the needs of the contexts-of-use, which collectively form a socio-technical ecosystem.

These socio-technical ecosystems are distinguished by the presence of both task systems and the social systems of meaning that they support. [4] In order to examine the architectural characteristics of both software-intensive and socio-technical ecosystems, traditional architectural analysis must be extended to account for how alignment affects the wicked (ill-structured) nature of such ecosystems. Such an analysis can give us insight into the properties of an ecosystem and can help us reason about its alignment with the goals of its many stakeholders.

Notes
[1] This perspective on complex adaptive systems exhibiting organized complexity is to be found in Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future. Pittsburgh: Software Engineering Institute, Carnegie Mellon University, June 2006.
[2] The original use of this term is to be found in Rittel, H. and M. Webber, Dilemmas in a General Theory of Planning. Policy Sciences, 1973.
[3] TRADOC, Commander’s Appreciation and Campaign Design. 2008.
[4] The original work on this emphasized that while sentient and task groups might correspond, the natures of task systems and sentient systems were essentially incommensurable. Quoting from Miller, E. J. and A. K. Rice (1967), Systems of Organization: The Control of Task and Sentient Boundaries. London, Tavistock: “We have considered many different words – commitment, identity, affiliation, cathexis – to denote the groups with which human beings identify themselves, as distinct from task groups, with which they may or may not become identified. We have chosen sentient – ‘that feels or is capable of feeling; having the power or function of sensation or of perception by the senses, 1632’ (Shorter Oxford English Dictionary) – as expressing most clearly what we mean. We shall therefore talk of sentient system and sentient group to refer to that system or group that demands and receives loyalty from its members; and we shall talk of a sentient boundary to refer to the boundary round a sentient group or sentient system. We shall use sentience to mean ‘the condition or quality of being sentient’ (Shorter Oxford English Dictionary)”.
