OZWST 2012 in Review
- Scott Barber (Facilitator)
- Anne-Marie Charrett (Content Owner)
- David Greenlees (Facilitator in training)
- Alessandra Moreira
- Andrew Dempster
- Aman Suri
- Kim Engel
- Mark Tolfts (day 2)
- Oliver Erlewein
- Richard Robinson
As this was the first LAWST-style Peer CONFERence in Australia, the majority of the attendees had not experienced the format. Scott provided a brief overview, but preferred a ‘just in time’ approach to guiding people through it. This helped, as key items were explained directly preceding their use. The more I see the ‘just in time’ format used, the more I like it!
Check-in was next. Each attendee introduced themselves, gave their official title, explained what they ‘actually’ do (often quite different to what their title suggests – surprise!), and mentioned any distractions they were currently dealing with. It’s great knowing what’s on people’s minds at that very point in time, and it can help the Content Owner shape their approach. For example, they will have picked one particular Experience Report (ER) to go first, but if the presenter of that ER is distracted by some sort of family issue, then this is known and can be accounted for; potentially changing the order.
Speaking of the Content Owner, Anne-Marie was up next to provide an overview of the theme and how it came about. Our theme? Influencing a Context-Driven Approach. I felt a little bad at this point in time as I had pretty much determined the theme and ‘volunteered’ Anne-Marie for the Content Owner role. Anne-Marie, being awesome, was well prepared despite this. It is also a theme close to Anne-Marie’s heart, which helped. The main driver for this theme: to learn how to share the context-driven approach to testing, and to gain the confidence to debate/converse with others of different mindsets. I want to be clear here, this was not about learning how to make everyone follow our approach, because as we know well… there is never just one approach. The last thing we want to do is tell the world, our way or the highway. However, we do want to be comfortable in telling the world, there are different ways… let us show you how they work.
After our theme overview, it was time for ERs. In keeping with the spirit of the Peer CONFERence IP Agreement, I won’t be going into great levels of detail while outlining each ER below.
ER 1 – Richard Robinson – Persuasive Progress Reporting Without Counting Test Cases
Richard has a knack for making me (and others) laugh. His ER was no exception. Richard told the story of a conversation between himself and his Project Manager; a conversation where Richard had a strong feeling that the subject of counting test cases would come up. Sure enough, the subject was raised and Richard proceeded to walk us through the fascinating conversation that followed.
The clincher here was that Richard had prepared his response; as much as one can leading into the unknown.
Richard took the attendees through his approach to testing the web services in the particular environment his ER related to. It was impressive… and considering my current project is also using web services I took a lot away from it.
During Open Season (aka question time) the attendees discussed various aspects of Richard’s ER, the most notable being how he went about reporting on the approach that he had explained during his ER.
We then had a good amount of whiteboard action along with some ‘kung fu’ style movements from Richard while he was explaining the web service architecture in order to describe how he documented his test cases. I think many of the attendees were a little unsure when Richard first mentioned documenting test cases during his ER, only to later discover that his process for doing so was ultimately lean, and very usable.
Through his ER he displayed not only an approach to testing, but also an approach to gathering and reporting metrics in a context-driven way.
ER 2 – David Greenlees – Death by Test Documentation
Then it was my turn. I was nervous… not because of my ER, but because I wanted so much for OZWST to be successful. I had put many hours of effort into making it happen, and I was desperate for all attendees to learn something and enjoy the experience. It was too early at this stage for me to be comfortable that OZWST had achieved my goals.
Despite these nerves I thoroughly enjoyed giving my ER. I spoke of my years of test documentation ‘battles’; the fight against templates with so much pre-filled information that only a find and replace of the project name was required.
I didn’t spend a lot of time talking about what I feel is required/not required in a template, but more about how I approached the ‘battles’. I had tried a few different ways of explaining different approaches to test documentation over the years; the primary goal being to eliminate waste, and prompt the use of people’s brains (i.e. no pre-fill of information). Some of these ways had limited success at best… needless to say it was great listening to the attendees’ suggestions and their questions, which prompted me to rethink various aspects of my approach.
I left my ER with new goals in this area of my work, and plenty of valuable ammunition to get me started!
After lunch Andrew and Schneider Electric (our wonderful and very up to date venue for the event) were kind enough to offer a tour of the hardware testing section. It was very interesting to see how they test various components, and Richard managed to ask many questions in relation to standards and how Schneider Electric feels they could be improved. He had us all laughing once again! I think many of us left the tour wanting to test hardware. Then back to the ERs…
ER 3 – Andrew Dempster – Context-Driven Testing in a Large Multinational Organisation
Andrew’s ER was amazing. To me; a tale of success. To Andrew; a never ending tale of some success and a LOT of hard work. I have known Andrew for a good period of time now, and have discussed his approach over a few beers now and then… therefore I know how much hard work he has put in to get his test team where they are today. After his ER, I think all the other attendees shared my appreciation of that hard work.
Andrew works for a large multinational company with many dispersed development and test teams. His is the only team approaching testing in a context-driven manner and utilising Session Based Testing, with some fantastic results.
He spoke of his initial introduction to the company along with another of the attendees of OZWST (Mark), and of how his previous testing experience (factory school approach) did not work in the new environment for many reasons. This prompted him to begin research into context-driven testing, and at one stage he even reached out to James Bach to come and teach his Rapid Software Testing class.
This was a truly fascinating story (even if I had heard parts of it before), and I believe the clincher here was that it was a success! No matter how you look at the journey Andrew has taken over the years… it’s ended with a successful test team who are getting great results and testing according to their context. There is no doubt that Andrew will be up against it for years to come, but he can take stock of some great achievements to date.
There was time for some great Open Season discussion prior to check-outs and announcements from the organisers.
This brought us to the end of day 1. Through the check-out it was clear that all attendees were still energised, which was a great result. For those that have attended a Peer CONFERence, you would know that mental exhaustion is not uncommon.
An added bonus for the energy in the room was the fact that an OZWST dinner had been arranged at the beautiful Andre’s Cucina & Polenta Bar in the east end of the Adelaide CBD.
There was great food, great conversation, napkins being drawn on, and some candles! That’s right; day 1 was also Anne-Marie and Kim’s birthday! Special desserts with candles and OZWST attendees singing Happy Birthday… who knew testers could sing?
Attendees arrived just on time for day 2 after a slight taxi/attendee ‘mix up’. I won’t go into detail in order to avoid incriminating a select few!
The energy was still good and check-in proved that day 1 was a success. Attendees were ready for another day of full-on discussion and learning, and were keen to get straight to more ERs.
ER 4 – Aman Suri – The Only Context-Driven Tester in a Multinational Project
Aman spoke of his very first project working for his current employer. As the title suggests, he was the only context-driven tester with all other testers following a factory approach. He was responsible for testing one component of a system made up of seven in total. From the outset he was advised to only focus on his component, but very quickly realised that in order to test it well he would need it to interact with the other six. He wanted to ensure the ‘real world’ scenario for use would be successful. The attendees agreed that this big picture view from Aman was exactly what was required in this situation.
This approach was not the norm for these projects. Aman explained that this would be termed System Testing, and that was scheduled for later in the project, yet he felt that it was required earlier in order to find important bugs fast (music to our ears!).
Session Based Testing was utilised by Aman, and the usual requests for test cases/scripts were made by the other Test Managers and Project Managers. Aman was steadfast in his approach and instead provided full access to his session sheets and dashboard reporting.
As a result of Aman’s approach, and the resulting communication with the development team, his component of the system to this day remains of good quality, while the other components are still holding up the implementation of the overall system for various, and all too obvious, reasons.
Another success story for context-driven testing; this has provided both Aman and his team with valuable experiences that they can demonstrate in the future when influencing the context-driven approach.
ER 5 – Kim Engel – Context-Driven Testing in a Bureaucracy
Kim’s ER was one of those that needs to stay within the four walls of the Peer CONFERence. She displayed great courage in talking about her ‘battles’ for a better way of testing. Kim’s ER highlighted to many of the attendees that the small wins can actually be just as important as the big wins. She took the attendees through a journey of various bureaucratic conversations, how she handled them, and what results she felt that she gained throughout. There was talk of test case counts, developer co-location, session based testing, exploratory testing, and even some stealth testing as well.
Of particular note was her approach to test cases for use by UAT testers. Instead of lengthy step-by-step scripts she developed a checklist approach which guided the testers via idea-generating dot points. This helped the testers avoid the age-old problem of inattentional blindness. This will be a lasting legacy, and an example of a very valuable small win.
Kim also highlighted to attendees two points that she was steadfast in opposing. These were capturing screen shots for every single test, and traceability back to requirements. Kim felt that capturing screen shots for all tests was not required as she trusted her testers, and felt that others should too. Of course if any problems were encountered then these would be captured. The way the scenario was described, I agreed that this would be an unnecessary overhead. As for linking test cases back to requirements… why would you when the requirements were not complete!
This session was originally suggested to me by Mark during the initial planning for the event. I had intended for this to be a ‘fun’ session along with a valuable one in terms of learning how to handle the many questions we context-driven testers get asked.
Scott and Anne-Marie (our rock star testers) took to the hot seats at the front of the room while attendees threw curly questions their way in relation to many factory approach tasks.
I got the ball rolling by welcoming both Scott and Anne-Marie to my wonderful consultancy called ‘ISO Nine Thousand & Something’. I had employed them to head into clients and implement our new software testing standard. What followed was a brilliant conversation in relation to standards and why we do/don’t need them. Scott provided wonderful analogies which provoked a lot of thought… needless to say that I ended up changing the name of my consultancy!
Richard then requested their help in writing a test plan, because the process required it. There was back and forth while Scott and Anne-Marie both established context via questioning Richard. This resulted in a suggestion from Anne-Marie to bring all the key players into a room so that she could whiteboard the plan as an initial step while Richard did his best to play ‘dumb’.
What followed was fascinating. Good interaction from attendees in seeking answers to the many questions and problems they face when trying to test in the best possible way; context-driven.
It was fun, of that there is no doubt. It was also valuable. So thank you to Mark for the suggestion.
ER 6 – Oliver Erlewein – Learning From History… Or Not
I loved Oliver’s ER. He took a different approach and actually read out a blog post that he has had sitting in draft for a while now. It will end up being published, so I won’t go into detail. Needless to say, I suggest you read it when it’s released.
His post took us through the many problems he sees in our industry, at which point he stopped to allow for Open Season. These problems produced some passion in the room and discussion threads were flying everywhere. Once the threads had dried up I facilitated a vote to see if the attendees wanted Oliver to continue reading, which would cover his thoughts on how to combat the problems he had read about previously… a unanimous thumbs up!
So Oliver continued to read his post and once finished a second Open Season was facilitated which equalled more passion from the room!
As stated previously I urge you to read the post once it’s published.
ER 7 – Alessandra Moreira – Winning Small Battles Before We Win the War
We were now nearing the close of day 2. There was just enough time for one last ER. Alessandra took the ‘stage’ as she was particularly keen to present her story, and what a fine story it was.
Alessandra spoke of being the only context-driven tester in her small team of three, an all too common scenario it would seem, in a financial services environment (highly risk averse) working with trading systems where every second counts! She quickly established that she was going to need to win the small battles in order to demonstrate how effective exploratory testing and a context-driven approach can be.
She discovered that the existing test cases were highly detailed, with the number of steps anywhere between 50 and 80. Some jaws in the room did drop when she stated that fact! The other two testers in her team were described as ‘switched on’ and knew the systems well as they had been there for about a year. To explain context-driven testing to her team Alessandra took a ‘coffee shop’ approach which was quite effective. Taking her testers out of the working environment helped them to concentrate and value the lessons more than they otherwise would have. There were several agreeable nods around the room, so I would suggest many of the other attendees have been successful with this approach on previous occasions.
Focusing on those small wins, Alessandra described some of her achievements over the two years that she worked for the team. Test cases decreased in size considerably, and more importantly the testers actually got excited about testing! Also, she managed to move the test team so they were co-located with the development team which assisted in reducing the existing ‘us versus them’ mentality.
Another notable point from Alessandra’s ER was ET Fridays. As a compromise with her Manager it was agreed that three hours each Friday would be dedicated to exploratory testing. All three would work in a room so that communication was easier. They would begin with a quick mind-mapping session in order to plan their session and break up the areas of the system. The results of each session were reported back to her Manager, and the testers enjoyed the experience as they were no longer bored!
Alessandra’s overall goal for this assignment was to achieve a full exploratory testing approach, and as this didn’t happen she felt that she had failed. Upon reflection, the small wins she had did actually amount to a success.
During Open Season it was discovered that Scott has a particular ‘interest’ in how UAT is approached within our industry. There were some fireworks, however facilitation was solid and all attendees maintained the required level of control. This is one of the many reasons that these events work so well. It’s a ‘safe’ environment where rules are in place to ensure things don’t get out of control.
Next on the list was wrap up and check-out for day 2. As mentioned earlier I had been nervous for most of the workshop as I really wanted it to go well. Listening to the attendees check-out, hearing their praise, and what they had learned; the nerves vanished. OZWST was a success!
I would like to thank all attendees for their willingness to share such personal stories. The courage in the room was amazing.
I also need to give special thanks to the following:
- Scott Barber – Our very own testing rock star direct from the US of A! Scott was more than happy to change his travel plans to accommodate OZWST and did a stellar job of facilitating and training me in that fine art. I look forward to collaborating with him more in the future.
- Anne-Marie Charrett – Content Owner extraordinaire! As mentioned, I threw this role at Anne-Marie and she caught it with both hands. Having not done it before, she nailed it.
- Andrew Dempster – My co-organiser! Andrew did a considerable amount of work on OZWST. He took care of all logistics, and executed the sign making with precision. Andrew will be a great help in organising OZWST 2.
- Association for Software Testing – Via their Grant Program the AST supported OZWST and without them I would have been considerably out of pocket (if I was able to manage it at all). Matt Heusser and Michael Larsen were extremely easy to work with and the turnaround time for the grant was outstanding.
Planning is already underway for OZWST 2, and we’re currently talking with other testing rock stars that may be able to take part. Thank you once again to all involved. A truly wonderful and engaging event which strives to change the face of testing down under!
On behalf of the OZWST Committee
I smirk when I read:
because it is completely accurate, yet totally understated.
See, the misuse (IMHO) of UAT (not just the term, but the misuse of the activity frequently conducted under that name in software development land) is a “bozo-bit-flipping” thing for me, but that is a combination of old news, not overly interesting in itself, and not at all pertinent to the peer workshop.
What is pertinent is that at the moment I realized I’d launched into a (small, calm and controlled by my standards) soap box rant that wasn’t particularly on theme (on thread, but not on theme), I’d normally have “shut it down” in a peer workshop. In this case, I deliberately cranked it up to about an 8 (of 10… by my standards… which by most folks’ standards is also about an 8… on a scale of 1-5).
Why would I do that, you ask? Because it was very close to the end of the last day. I was “lead facilitator”, but was there with the help of the AST Grant program to train this group in facilitating and managing LAWST-Style peer workshops. At that point in time, David Greenlees was facilitating & I was participating. David was doing an excellent job, but it was a relatively small and entirely simple to manage group, and in that moment I realized that before I’d be comfortable reporting back to AST that David was ready to facilitate on his own and that the OZWST community really understood the dynamics of the workshop format, I’d need to observe the response to (as David put it) a testing rock star causing some fireworks. So I took that opportunity to conduct a little, shall we say, Exploratory Testing.
Folks were far more amused/intrigued than turned off, but I think some of that was a shock reaction. Either way, I was all kinds of impressed with the fact that they calmly and professionally, but sternly, followed the agreed upon protocol to bring the discussion back to where it belonged. Test = Pass
I did explain during check-out what I did & why (as well as very calmly completing the thought that I was sharing when the plug got pulled — and I only shared that last sentence and a half because I know that it drives me *insane* when I know someone didn’t get to finish their thought and as a result I don’t know which way they were going!). I think a couple of folks didn’t really believe it was a spontaneous but deliberate test, but if anyone there was more familiar with me in these types of environments, they’d have known something interesting was coming when I took my eyes off of who I was “ranting in the general direction of” to check to ensure the video camera was *not* recording. See, for me, that’s a dead giveaway that something interesting is about to happen (that I don’t want snipped and spliced out of context onto YouTube).
Whether folks believe the “fireworks” were a deliberate test or a loss of control, Test = Pass, which was the last thing I needed to know to send the following report back to the AST Grant Committee:
System Performance Strategist and CTO; PerfTestPlus, Inc.
Author; Web Load Testing for Dummies
Co-Author; Performance Testing Guidance for Web Applications
Contributing Author; Beautiful Testing
Contributing Author; How to Reduce the Cost of Software Testing
“If you can see it in your mind…
you will find it in your life.”
Congratulations on a successful workshop! This was a highly informative summary. Kudos to the Scribe!
I especially enjoyed reading about Alessandra’s ER, and I’ll be looking for more small wins in my own team now! 🙂
That was great reading. One persistent theme that I picked up was the courage that many of these testers had to push forward in the face of monolithic and well entrenched practices (I think Anne-Marie did a recent blog post on this?). Hopefully their companies will recognise not only their contributions, but also the difference a little context can make when it comes to testing.
Well done to David and everyone else who played a part in getting OZWST off the ground. Here’s hoping this is the start of a context driven revolution down under!
Where will the videos and/or articles for each Experience Report be?
When will they be posted?
Excellent topics. This review is nice reading, but I want to get more specifics for a few of the ERs. Thanks in advance!
We won’t be posting anything more here; check out the OZWST website.
Thank you for your interest.
For more specific details on each of the ERs you will need to get in contact with the presenters. In keeping with the IP Agreement of these events it is completely up to them how much detail they provide. Each of the attendees reviewed and agreed to this report prior to publishing.
The videos will be given to each presenter and, similar to above, it will be up to them what they are used for or who they give them to.
Sounded like a fantastic couple of days – nice one guys on getting it sorted!
Great write up as usual David – well balanced & informative.
Great response Scott – as usual, it’s great to get another perspective from someone else who attended the same event.
I wonder if I could arrange a GWST…?
Once again, thanks for another great contribution to the testing community!