A Grateful Graduate

On Monday night I walked out of my very last final at the University of Washington, Bothell. I paused for a few minutes just to stare down the stairwell, surprised by the magnitude of my feelings. Three and a half long years ago I had resumed my journey toward a college degree after a long hiatus. Hundreds of hours of classroom time, group work, reading, studying, coding, writing, and general scrambling, and now suddenly it was all over. Assuming I hadn't unexpectedly flunked anything, I had earned my Bachelor of Science in Computing and Software Systems.

Beyond the feelings of accomplishment, relief, and even a twinge of sadness at the end of another phase of my life, my strongest feeling was gratitude. Make no mistake -- I had worked incredibly hard to earn this degree, while still trying to do good work at my job and be a good family man. But there are any number of ways in which this mission could have been far more difficult, or even impossible. Without a supportive, flexible employer to make the afternoon school hours possible, state funding and federally backed loans to make the financial side workable, and a great school with a degree relevant to my career just a ten-minute drive from my home, earning a degree would have been an unworkable puzzle. If my wife (who also worked full time and finished her degree during this period) had been less patient, our family dysfunctional, our friends and neighbors less helpful, or our other life circumstances slightly different, this would have been a miserable endeavor.

Only a wealth of good luck made it possible to get to this point. I hope I can take what I have earned and give back to those who helped me earn it.

In closing, I want to say that after the attention last week's post received, I had been trying to think of something interesting to post related to Cynefin. In fact, all week I have been adding to a list of possible post topics, related to Cynefin and otherwise. But with the holiday madness I haven't had time to put together anything too serious, and that will probably be true for another week or two. In the meantime, gratitude is the most profound thing I could write about this week, even if it is slightly off topic. Happy Holidays!

Software Testing and Cynefin

Update: Please change your bookmarks and RSS feeds to point at joewlarson.com. Thanks!


Cyne-what? That's exactly what we said in Software Testing class when our professor, David Socha, gave us a brief overview of Cynefin. In fact, Mr. Socha said he wasn't sure how to relate it to testing at all. However, in the couple of minutes he spent on it, my mind was opened to a new way of understanding things.

Cynefin (pronounced "kuh-NEV-in") is a Welsh word meaning "place". It is "a model used to describe problems, situations and systems" (Wikipedia) created by Dave Snowden. In Cynefin, a particular system is in one of four main domains: Simple, Complicated, Complex, or Chaotic.

A simple system, like a bicycle, is easy to understand and can be taken apart and put back together without changing its nature. Cause and effect relate directly. With a simple system, there are best practices -- the one best way to fix a particular problem. To solve problems, you Sense, Categorize, and then Respond. The categorization helps you identify which best practice you need to use.

A complicated system, like a train, is an amped-up simple system with more parts. The relationship between cause and effect involves more parts and steps. The approach here is to Sense, Analyze, and then Respond. The analysis helps you identify a good practice -- but there may be no single best practice for a given problem.

A complex system, like a metropolis or a living thing, is not as straightforward. Parts interact in subtle and evolving ways. Cause and effect are intertwined here, and it is often difficult to understand what is happening until after it has occurred. A change to one part of the system cannot be fully reversed, since the whole system will have adapted along the way. In this system you must Probe, Sense, and then Respond. Through this process you reach an emergent practice, which continues changing as you learn and the system changes.

A chaotic system is one in which "there is no relationship between cause and effect". In such systems you can only arrive at a novel practice, which you reach by Acting, Sensing, and then Responding, and which probably won't work the same way again. I struggled to think of an example here. I thought of a war or a natural disaster, but there is definitely a kind of logic in those systems. Maybe a bar fight (sadly, I have no direct experience there)? Then I realized a good example might be the Great Depression. I am reading about the presidency of FDR in Jonathan Alter's "The Defining Moment: FDR's Hundred Days and the Triumph of Hope". FDR's approach to fixing the economy was essentially "do something, even if it's wrong". In a chaotic system, you are better off acting and then seeing where that takes you than waiting around to understand what is going on.

There is a fifth domain, "Disorder", which is more of a state of not knowing what type of system you are in than it is a specific system state.

As I've thought about these categorizations, I've been amazed at how applicable they really are. For example, in two of our software projects at work we've been dealing with pattern recognition problems. In both cases, our software processes human-generated, messy data and tries to interpret it in a consistent way. In each case we began by treating the data sets as simple systems. Over time our approach became more complicated, until finally we were trying to deal with them in their actual domain -- complexity.

However, our efforts to deal with the data sets in all their complexity produced inconsistent, unpredictable results. You could never be sure what would "come out the other end" or why you were seeing a particular result. If we had been dealing with another complex system, like a disease or an economy, no one would have expected consistent, completely understandable results. But people expect computers to make sense, so the results were not acceptable.

In one project, we solved the problem by inventing a way to carve the data set into separate, more internally consistent groupings, which we could then treat as complicated systems. In the other, the customer decided they were better off investing the time to clean up their data, and that we should simplify our logic so it was merely complicated instead of complex. I've discussed this model of these projects with the developers involved, and it definitely provides fresh insight into what we've done and even how we can improve it.
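As a rough illustration of the "carve up the data set" idea (the field names and classification rules below are invented for this sketch; our actual groupings were domain-specific), the approach is to route each messy record into an internally consistent bucket that simpler logic can then handle:

```python
# Hypothetical sketch: partition messy, human-generated records into
# internally consistent groupings, then apply simpler per-group rules.
# Field names and classification rules are invented for illustration.

def classify(record):
    """Route a record to a grouping we know how to handle."""
    date = record.get("date", "")
    if date.count("/") == 2:
        return "slash_dates"    # e.g. "12/25/2008"
    if date.count("-") == 2:
        return "dash_dates"     # e.g. "2008-12-25"
    return "unrecognized"       # set aside for separate (perhaps human) review

def partition(records):
    """Split one complex data set into smaller, more consistent ones."""
    groups = {}
    for record in records:
        groups.setdefault(classify(record), []).append(record)
    return groups

messy = [{"date": "12/25/2008"}, {"date": "2008-12-25"}, {"date": "Christmas"}]
groups = partition(messy)
```

Each grouping can then be treated as a merely complicated system, with its own analyzable rules.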

So what does it have to do with testing?

First of all, I think we can subdivide the various types of testing according to the sort of domain each deals with.

  • Simple
    • Record/Playback testing
    • Smoke testing
    • Unit testing

  • Complicated
    • Automated Testing
    • Black box testing
    • Compatibility testing
    • Conformance testing
    • Integration testing
    • Model based testing
    • Regression testing

  • Complex
    • Beta testing
    • Performance testing
    • Scalability testing
    • Security testing
    • System testing
    • Usability testing
    • White box testing

  • Chaotic
    • (thankfully, nothing really fits here, hopefully because we try not to let software get to the point of chaos, though possibly because once software is in this space it is untestable)

Obviously this list is not comprehensive, and I am sure there are any number of other ways to place specific items. However, it is probably helpful to classify the software system you are trying to test and then determine what type of testing you ought to be doing. When testing your own well-defined methods and APIs, unit testing is certainly sufficient, because your own code is a simple system. If you have an entire stack of software layers, with pieces living on different systems and interacting in myriad ways, then unit testing is far from sufficient. In such a complex system you need different testing techniques. This might seem obvious to a veteran tester, but possibly not to someone new to the subject.
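To make the "simple system" case concrete, here is a minimal sketch (the conversion function and test names are invented for illustration) of unit testing a well-defined method where cause and effect relate directly and a known best practice -- the formula -- tells us the one right answer:

```python
import unittest

def fahrenheit_to_celsius(f):
    """A 'simple' system: one input, one deterministic output,
    checkable against a known best practice (the formula)."""
    return (f - 32) * 5.0 / 9.0

class TestFahrenheitToCelsius(unittest.TestCase):
    # Sense (run the code), Categorize (pass or fail), Respond (fix or ship).
    def test_freezing_point(self):
        self.assertEqual(fahrenheit_to_celsius(32), 0.0)

    def test_boiling_point(self):
        self.assertEqual(fahrenheit_to_celsius(212), 100.0)

if __name__ == "__main__":
    unittest.main(argv=["example"], exit=False)
```

For a simple system like this, the unit tests really do cover the behavior; it is only when pieces start interacting across layers and machines that this style of testing stops being enough.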

Second, it seems like new testing techniques could be derived from other fields by understanding Cynefin. For example, in the field of medicine, various forms of diagnostic tests are used to identify disease in the complex system of the human body. Could these serve as a model for complex software testing methodologies? Economics and politics work on complex systems and have testing in the form of economic indicators, polls, and so on. Is there a way to carry these testing approaches, where successful, over into software testing?

Finally, just being able to understand and explain things in terms of Cynefin could aid a tester. How many times have unrealistic expectations been placed on a test team because management thought the software was like a bicycle (simple) when in fact it was like a metropolis (complex)? How often do we developers assume testing will be easy because we fail to understand how all our simple features are used in concert by real users? Conversely, how often are testers overwhelmed because they cannot figure out how to test something, when they could focus on carving the software into complicated or simple pieces which are more testable?

I think Cynefin is incredibly relevant for the field of software testing as well as software development in general. I hope to be able to learn more about it in the near future.

For further info, take a look at Cognitive Edge, A simple explanation of the Cynefin Framework and The Cynefin Framework.

Update: It is exciting to see this post get some attention! Dave Snowden himself posted about this on his own blog, as did my professor David Socha on his. Great!