Dorothy

About 10 years ago I saw a movie about storm chasers called Twister. In order to create a better warning system, they hunted violent tornadoes in Oklahoma with a complex measuring instrument called Dorothy. Dorothy contained small balls that needed to get inside the tornado whilst sending back huge amounts of data. Of course, the biggest problem for the heroes of the movie (apart from the romance that the screenwriter weaved in) was to insert the measuring device into the very heart of the tornado. Despite their skills, planning and preparation, nature wasn’t picking up Dorothy. To their frustration, it took several try-outs before they discovered that Dorothy was too light and fell over each time the eye of the storm approached.

Dorothy from the movie Twister

It is pretty much the same frustration whenever I build a measuring instrument for organizational change projects: the biggest problem is getting the instrument inside the tornado. Quite recently my team built a Dorothy that needed to track ‘who will do what in the future’, based on a list of employees and a number of future roles derived from the blueprinted business processes. We ran into quite a few practical problems getting our Dorothy inside, and a setback on our schedule. However, in return for my frustration, there are some major lessons that I can share with you now:

1. Designing Dorothy. A measuring instrument should be light enough to get picked up and used by the target audience, and stable enough that it does not fall over once the broader audience starts using it. We discovered that web-based technology is a real asset for this kind of measurement: it allows multiple users at the same time, it requires no installation of any add-ons, and the design lets you make it simple to understand and easy to use.

2. Making sure that Dorothy gets picked up. We did this by making sure that users could get more out of the application than they put in; this is the so-called WIIFM (What’s In It For Me). In our case the extensive reporting allowed users to compare their input with that of other sites and users; the ‘how am I doing compared to the rest of the organization’ question proved to be an element that motivated them to use and refine their input en masse.

3. Finally, never underestimate nature’s ability to frustrate the s**t out of you. It is quite tiring to bump into a tornado where you "just can’t get the data out" because people simply won’t pick up your instrument before their conditions are met. For example, one of the sites attached considerable importance to some additional check-boxes in the application and simply refused to use it until that condition was met. Another example: it took quite a while to convince people that a spreadsheet was only half as effective as the web-based application. We reached the tipping point by getting together in a local conversation where we spent more than 50% of the time listening instead of pushing our point of view, and then demonstrating the tool with their own datasets. We countered the so-called ‘resistance’ with ‘respect’ instead of with ‘being right’.

More often than not it takes several try-outs and push-backs from the organization before people even allow you to go ahead with the measurement. But once the tornado picks up Dorothy, a wealth of information becomes available – that is, if you have the perseverance to make it through the try-outs. It is my contention that frustration and failure are part of the job. Preparation is one thing, but improving your prototype based on try-outs is a must as well.