Software Testing and Cynefin


Cyne-what? That's exactly what we said in Software Testing class when our Professor, David Socha, gave us a brief overview of Cynefin. In fact, Mr. Socha said he wasn't sure how to relate it to testing at all. However, in the couple of minutes he spent on it, my mind was opened to a new way of understanding things.

Cynefin (pronounced "kuh-NEV-in") is a Welsh word meaning "place". It is "a model used to describe problems, situations and systems" (Wikipedia) created by Dave Snowden. In Cynefin, a particular system is in one of four main domains: Simple, Complicated, Complex, or Chaotic.

A simple system, like a bicycle, is easy to understand and can be taken apart and put back together without changing its nature. Cause and effect relate directly. With a simple system, there are best practices -- the one best way to fix a particular problem. To solve problems, you Sense, Categorize, and then Respond. The categorization helps you identify which best practice you need to use.

A complicated system, like a train, is just an amped-up simple system with more parts. The relationship between cause and effect involves more parts and steps. The approach in this domain is to Sense, Analyze, and then Respond. The analysis helps you identify a good practice -- but there may be no best practice for any given problem.

A complex system, like a metropolis or a living thing, is not as straightforward. Parts interact in subtle and evolving ways. Cause and effect are intertwined here, and it is often difficult to understand what is happening until after it has occurred. Changing some part of the system cannot be fully reversed, since the whole system will have adapted along the way. In this system you must Probe, Sense, and then Respond. Through this process you reach an emergent practice, which continues changing as you learn and the system changes.

A chaotic system is one in which "there is no relationship between cause and effect". In such systems you can only arrive at a novel practice, which you reach by Acting, Sensing, and then Responding, and which probably won't work the same way again. I struggled to think of an example here. I thought of a war or a natural disaster, but there is definitely a kind of logic in those systems. Maybe a bar fight (sadly, I have no direct experience there)? Then I realized a good example might be the Great Depression. I am reading about the presidency of FDR in Jonathan Alter's "The Defining Moment: FDR's Hundred Days and the Triumph of Hope". FDR's approach to fixing the economy was essentially "do something even if it's wrong". In a chaotic system, you'll be better off acting and then seeing where that takes you than waiting around to understand what's going on.

There is a fifth domain, "Disorder", which is more of a state of not knowing what type of system you are in than it is a specific system state.

As I've thought about these categorizations, I've been amazed at how applicable they really are. For example, in two of our software projects at work we've been dealing with pattern recognition problems. In both cases, our software is processing human-generated, messy data and trying to interpret it in a consistent way. We began in each case by trying to deal with the data-sets assuming they were simple systems. Over time our approach to them became more complicated until finally we were trying to deal with them in their actual domain -- complexity.

However, our efforts to deal with the data-sets in all their complexity produced inconsistent and unpredictable results. You could never be sure what would "come out the other end" or why you were seeing a particular result. If we had been dealing with another complex system, like a disease or an economy, no one would expect consistent, completely understandable results. But people expect computers to make sense, so the results were not acceptable.

In one project, we solved the problem by inventing a way to carve the data set into separate, more internally consistent groupings, which we could then deal with as complicated systems. In the other project, the customer involved decided they were better off investing the time to clean up their data, and that we should simplify our logic so it was merely complicated instead of complex. I've discussed this model of these projects with the developers involved, and it definitely provides some fresh insights into what we've done and even how we can improve it.

So what does it have to do with testing?

First of all, I think we can subdivide various types of testing according to which sort of domain each deals with.

  • Simple
    • Record/Playback testing
    • Smoke testing
    • Unit testing

  • Complicated
    • Automated Testing
    • Black box testing
    • Compatibility testing
    • Conformance testing
    • Integration testing
    • Model based testing
    • Regression testing

  • Complex
    • Beta testing
    • Performance testing
    • Scalability testing
    • Security testing
    • System testing
    • Usability testing
    • White box testing

  • Chaotic
    • (thankfully, nothing really fits here, hopefully because we try not to let software get to the point of chaos, though possibly because once software is in this space it is untestable)

Obviously this list is not comprehensive, and I am sure there are any number of other ways to place specific items. However, it is probably helpful to classify the software system you are trying to test and then determine what type of testing you ought to be doing. When testing your own well-defined methods and APIs, unit testing is certainly sufficient, because your own code is a simple system. If you have an entire stack of different software layers, with pieces living on different systems and interacting in myriad ways, then unit testing is far from sufficient. In such a complex system you need different testing techniques. This might seem obvious to a veteran tester, but possibly not to someone new to the subject.
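As a rough illustration of "classify first, then pick techniques", the mapping above could be captured in a small lookup table. This is just a sketch restating the post's own (admittedly non-comprehensive) classification; the function and dictionary names are my own invention, not part of any real testing framework:

```python
# Map each Cynefin domain to the testing techniques listed above.
# (Illustrative only -- the classification itself is a judgment call.)
CYNEFIN_TESTING = {
    "simple": ["record/playback testing", "smoke testing", "unit testing"],
    "complicated": ["automated testing", "black box testing",
                    "compatibility testing", "conformance testing",
                    "integration testing", "model based testing",
                    "regression testing"],
    "complex": ["beta testing", "performance testing", "scalability testing",
                "security testing", "system testing", "usability testing",
                "white box testing"],
    "chaotic": [],  # arguably untestable by traditional means
}

def suggest_techniques(domain: str) -> list[str]:
    """Return candidate testing techniques for a judged Cynefin domain."""
    try:
        return CYNEFIN_TESTING[domain.lower()]
    except KeyError:
        # "Disorder": we don't yet know which domain we are in,
        # so no technique can sensibly be suggested.
        raise ValueError(f"Unknown or undetermined domain: {domain!r}")

print(suggest_techniques("Simple"))
# ['record/playback testing', 'smoke testing', 'unit testing']
```

The interesting part is not the lookup, of course, but the judgment that produces its input: deciding which domain your system actually lives in.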

Second, it seems like new testing techniques could be derived from other fields by understanding Cynefin. For example, in the field of medicine, various forms of diagnostic tests are used to identify diseases that occur in the complex system of the human body. Can these be used as a model for complex software testing methodologies? Economics and politics work on complex systems and have testing in the form of economic indicators, polls, and so on. Is there a way to adapt these testing approaches, where successful, to software testing?

Finally, just being able to understand and explain things in terms of Cynefin could aid a tester. How many times have unrealistic expectations been placed on a test team because management thought the software was like a bicycle (simple) when in fact it was like a metropolis (complex)? How often do we developers assume testing will be easy because we fail to understand how all our simple features are used in concert by real users? Conversely, how often are testers overwhelmed because they cannot figure out how to test something, when they could focus on carving the software into complicated or simple pieces which are more testable?

I think Cynefin is incredibly relevant for the field of software testing as well as software development in general. I hope to be able to learn more about it in the near future.

For further info, take a look at Cognitive Edge, A simple explanation of the Cynefin Framework, and The Cynefin Framework.

Update: It is exciting to see this post get some attention! Dave Snowden himself posted about this on his own blog, as did my professor David Socha on his blog. Great!


  1. Nice to see someone understand the model other than as a two by two and find an interesting application

  2. Nice mapping of the model. I'm curious as to why your professor presented the framework if he couldn't see the link.

  3. So interesting to continue to see the models used for Shared Understanding being applied to software development. This model is used as a great way to help organisations understand problems. I like the fact you say 'Chaotic
    (thankfully, nothing really fits here,...' - from a software testing aspect this really should be the case - however what fits here is 'Wicked Problems'. The problem is only really understood once a solution is tested, and in this regard it is almost impossible to apply any traditional testing approaches. Perhaps in this instance we need to be thinking more imaginatively with our testing?


  4. Steve - it was kind of a rabbit trail. But many of the most educational moments in my schooling have been off topic - put enough smart people in a room and interesting things happen!

    Dave - I am happy you stumbled onto this. Thanks so much for this contribution to our mental toolkit!

    Andrew - Perhaps "Exploratory Testing" would fit in the chaotic space?

  5. Well said! We are using the Cynefin model in medical practice.

  6. Joe - thanks for writing this down.

    Steve - I presented this framework in class because I know how important it is. And yes, I have thought about how it connects to software testing. I knew it connected to exploratory testing, and some other things, but for some reason as I introduced the topic in class I found my mouth saying that I did not know how it connected. Oh well. In any case, my role as a teacher is to provoke thought and enable more effective actions, and I seem to have done both here. Which makes me smile.

  7. Thanks Joe,

    Enjoyed your article, picked up from Dave's blog.

    I've recently also been writing about the applicability of Cynefin for Information Technology, Management, and Healthcare sciences.

    Having been introduced to complexity about 10 years ago (MDs, managers, and software engineers in my experience tend to think in terms of tackling "complicated" rather than "complex" systems most or all of the time), I have since found Cynefin very helpful in framing very many of the challenges I come across in all domains, which is needed, as complexity applied to all things turns folk off.

    I think some of Cynefin's potential lies in its ability to offer a framework across disciplines, as I hint at when exploring the divide between Process Improvement and Information Technology disciplines, which you are likely to be familiar with...

    Kind regards

    Tony Shannon

  8. Thanks for writing this up. I am going to learn more about Cynefin.

  9. Hello Bret, great also to see you here as 'Lessons Learned in Software Testing' was one of our textbooks. I also read quite a few of your articles around the internet on Software testing.

