Agile Testing Days 2013 – Day 3 Talks Notes

On the last day, after being in Germany for just over 4 days, I decided to still attend the morning talks and take a look at the sketching workshop (which I left after the break: I was tired, and I’m really poor at drawing, which meant that I couldn’t even concentrate on what we were asked to do, so I thought I’d give it another go some other time). Here are the key points from the morning talks I attended.

“Natural born testers. Are you one? If not, then become one!” by Graham Thomas

  • A natural born tester is someone who tests by default, whatever it is – not destructively or maliciously, just out of habit or compulsion, a “what if?”;
  • who is a natural born tester in this picture?
    • (slide photo not reproduced here)
    • Hopefully you guessed (3)
  • why lemmings?
    • skills – multi-tasking, parallel processing, problem solving, time management, goal oriented, fun
  • why play railroad tycoon?
    • all about planning, management, different views by context
    • monitoring, measuring and predicting
    • controlling
    • adapting to change
    • reacting to change
    • fun
  • angry birds:
    • teaches you to explore your content
    • simple solution is not always optimal
    • different techniques
    • combine techniques
    • plan
    • think in the abstract
  • playing angry (test) birds – hit different parts of the code
  • learn through play – raspberry pi, penguin puzzle

“Don’t you trust me?” by Seb Rose

  • Go through the behaviours with the business, everyone involved, stakeholders, look for the knowns and the unknowns;
  • our systems can be described in terms of their behaviours;
  • Cucumber is good because it will bring everyone together to specify software – developers, testers, BAs, product owners;
  • it also helps you produce living documentation, which is why it has some advantage over other tools;
  • what is the problem with this collaboration in BDD?
    • some places aren’t quite as agile as they think they are
    • talking to each other – BDD actually helps with this because you need to speak to each other
  • look at different components, don’t just test drive them;
  • regaining trust
  • too many organisations are “agile” in a way that is not what we would like to think of as agile – they are still too structured;
  • Ron Jeffries – No Estimates;
  • testing pyramid (unit/integration/end-to-end/exploratory&manual);
  • ice-cream collapse pattern
  • don’t treat acceptance tests as system tests – both are different and have a different audience;
  • be careful what you test with BDD – it’s expensive; certain things you can go directly to the method and test there (a rough sketch follows below)
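
To make the Cucumber points above a bit more concrete, here is a minimal sketch of the kind of executable specification Seb was describing. The talk itself didn’t include code, so this uses behave (a Python, Cucumber-style tool) and a made-up loyalty-discount rule purely as an illustration; the plain unit test at the end shows the “go directly to the method” idea for rules that are cheap to test without the BDD tooling.

```python
# A Gherkin scenario like this would live in features/discount.feature and acts
# as living documentation that business people, testers and developers can read:
#
#   Scenario: Loyal customers get a discount
#     Given a customer with 12 previous orders
#     When they place an order worth 100 euros
#     Then the order total is 90 euros

from behave import given, when, then

# Hypothetical production code, included here only to make the sketch runnable.
def discounted_total(amount, previous_orders):
    """Apply a 10% loyalty discount to customers with 10 or more previous orders."""
    return amount * 0.9 if previous_orders >= 10 else amount

@given("a customer with {orders:d} previous orders")
def step_given_customer(context, orders):
    context.previous_orders = orders

@when("they place an order worth {amount:d} euros")
def step_when_order(context, amount):
    context.total = discounted_total(amount, context.previous_orders)

@then("the order total is {expected:d} euros")
def step_then_total(context, expected):
    assert context.total == expected

# "Go directly to the method": the same rule checked as a plain unit test,
# without paying the cost of the BDD tooling for every edge case.
def test_no_discount_below_threshold():
    assert discounted_total(100, previous_orders=3) == 100
```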

Agile Testing Days 2013 – Day 2 Talks Notes

On the second day of the conference, already feeling quite exhausted, I attended a couple of talks in the morning and just sat down in the main room during the afternoon to see what the consensus talks were about. Here are my notes for the day.

“BDD – Live coding sessions” by Carlos Ble

Carlos’ idea for this session was that he would be coding a web app in real time so that the audience could participate by giving him feedback and coming up with requirements. He would then use this to write up his scenarios before translating them into Cucumber. Unfortunately there were too many problems at the start of the session to do with the network and his server, so we only had limited time. Carlos still managed to explain what he intended to do, and as easy as it would be to say that he could have prepared it better, those things can happen (live demo gods) and it was brave to do something like that in front of a packed room. I caught up with Carlos in the evening as we were both having dinner in the same restaurant and we had a little chat about BDD and his experiences with it, and I got plenty of resources to go and look at, so it was definitely worth attending. Also, my brain got a bit of a break 🙂

“Myths of exploratory testing” by Luis Fraile and Jose Aracil

This was one of the most discussed talks of the conference (at least on Twitter), partly because of some of the very controversial claims the presenters made during the talk, but also because they seemed to have a slightly different idea of what exploratory testing is than most people in the audience. Despite this, here are their key points.

  • when you explore it you want to come back;
  • keys to success: inspect and adapt; be creative! take advantage of your team and skill set; it is additional to other testing; quickly find defects; add value to your customer; and test early/test often;
  • myth 1: same as ad-hoc testing
    • must be planned and documented
    • you must know: what has been tested; when it was tested; what defects were logged
    • some ideas: testing tours by James Whittaker (!); session-based testing from James Bach; your own method (see the sketch after this list)
  • myth 2: can’t be measured
    • multiple measurement techniques: SBTM; amount of logged defects; defects vs. user story complexity
    • you must be creative
    • warning.. warning! don’t be fooled by metrics
  • myth 3: endless
    • difficult to focus on long tasks (> 25 mins)
    • endless = useless
    • must focus on specifics; requirements, problems and complex parts
    • stay focused for burst periods (25 mins)
    • pomodoro testing
  • myth 4: can’t reproduce defects
    • how do you reproduce a defect: be an explorer, like David Livingstone
    • toolset: record video/audio; screen capture; analog recording
  • myth 5: only for agile teams
    • inspect and adapt
    • insanity is doing the same thing over and over again and expecting different results
    • look for new ways of testing
    • empower your team by allowing creativity
    • do you trust your team?
  • myth 6: not documented
    • tester “lonely planet”: user manual; online help (F1 tour); help from third parties
    • alternative tester: goes outside the tour (cancelling processes halfway, using undo, doing things twice); use uncommonly used functionality or processes; always with an objective in mind
    • second visit: you need pictures/notes
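
On myths 1, 2 and 6 (planning, measuring and documenting exploratory testing), here is a minimal sketch of how a session-based test management (SBTM) session could be recorded and summarised. The field names and the example session are my own illustration, not something shown in the talk.

```python
from dataclasses import dataclass, field

@dataclass
class TestSession:
    """One time-boxed exploratory session, in the spirit of SBTM."""
    charter: str                      # what we set out to explore
    tester: str
    minutes: int                      # keep it to short bursts, e.g. ~25 minutes
    areas_covered: list = field(default_factory=list)
    defects_logged: list = field(default_factory=list)
    notes: str = ""

def summarise(sessions):
    """Answer the basic questions: what was tested, for how long, what was found."""
    return {
        "sessions": len(sessions),
        "total_minutes": sum(s.minutes for s in sessions),
        "defects_logged": sum(len(s.defects_logged) for s in sessions),
        "charters": [s.charter for s in sessions],
    }

# Example usage with a single made-up session:
sessions = [
    TestSession(
        charter="Explore the checkout flow with an expired card",
        tester="Ana",
        minutes=25,
        areas_covered=["checkout", "payment errors"],
        defects_logged=["BUG-101"],
        notes="Cancelling halfway through leaves the basket locked.",
    ),
]
print(summarise(sessions))
```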

Consensus Talk

“Organization, Roles and Responsibilities of Testers and Test Managers on Agile Projects” by Dr. Jennifer Blechar

  • 2 projects which were virtually the same, yet they evolved differently – here are the learning points:
    • there’s no I in team:
      • testers must be part of the team
      • test managers outside of teams useful to co-ordinate across teams
      • consider dedicated technical testers
      • consider dedicated test automation experts
      • communication essential
    • use the right tool for the job:
      • everyone in the team must use the same tools
      • user stories in the tool – linked to tests created by testers, users, devs, etc.
      • careful evaluation of tools for test automation prior to implementation
    • never underestimate the value of socialising
      • people are much more likely to “go along” with your ideas if they know you
      • make time to get to know key stakeholders in the project – this includes the customer as well as developers
      • create opportunities for socialising
    • get everyone on board – and keep them on board!
      • the test plan is useless if only the testers agree with it – everyone needs to buy in to the test plan and be committed to their role in the process
      • relationships need to be constantly maintained
      • don’t be afraid to change course if needed – use lessons learned
    • reporting
      • customers and other key stakeholders need to be aware of the progress at all times – consider “light” but frequent reporting
      • reporting is also motivational and useful to get and keep everyone on-board
  • five factors identified as influencing the success of agile testing effort
  • additional factors likely, most important factor is to be agile!

Agile Testing Days 2013 – Day 1 Talks Notes

In the first day of the conference (well, technically the second if you count the tutorial day) I attended the 2 morning talks and in the afternoon I floated between consensus talks and the various workshops and vendor booths. Here are the key points from those talks.

“How to avoid the testing swiss cheese syndrome” by Marc Rambert

  • We’re not born testers – we become testers;
  • a 2% change in a full release (development effort) – how much testing effort is required?:
    • 10 test cases related to new features and bugs fixed in this release
    • 90 regression test cases
    • solution: there is no relation – it’s too difficult to align testing and coding
  • speed and continuous delivery make it impossible to test everything after each change;
    • change request implemented and tested (build 0)
    • functional regression set #1 (build 1)
    • bug because of last minute effect (build 2)
    • go live!
  • strategies to focus testing where it adds value (requirements, risks, experience, collaboration)
  • a new opportunity to improve testing in a black box:
    • learning system: it learns your tests as you run them as usual, capturing footprints (linking code and tests – see the sketch after this list)
    • detection: application changes
    • smart engine
    • you can also add tests that have been run before
  • test scoring to prioritise test execution
  • avoid the testing swiss cheese syndrome: find a way to make your application speak
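
The footprint idea above (link each test to the code it exercised, then re-score the tests whenever the application changes) was presented around a vendor tool, so the following is only a rough, hypothetical Python sketch of the underlying technique, not the product itself.

```python
# Hypothetical footprint map captured during a previous run:
# test name -> set of source files that the test actually executed.
footprints = {
    "test_login": {"auth.py", "session.py"},
    "test_checkout": {"cart.py", "payment.py"},
    "test_reporting": {"reports.py"},
}

def score_tests(changed_files, footprints, recent_failures=()):
    """Rank tests by how much of the change they touch, boosting recent failures."""
    scores = {}
    for test, files in footprints.items():
        overlap = len(files & set(changed_files))
        if overlap or test in recent_failures:
            scores[test] = overlap + (2 if test in recent_failures else 0)
    return sorted(scores, key=scores.get, reverse=True)

# A small change touching only payment code pulls in only the tests that matter:
print(score_tests({"payment.py"}, footprints, recent_failures={"test_checkout"}))
# -> ['test_checkout']
```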

“Be a real team member” by Tony Bruce

  • What makes a good team member:
    • engage, and use that to build a relationship, interest and motivate people (4 keyword framework)
    • motivate: provide someone with a reason for doing something
  • Models (engage)
    • Belbin team model – action oriented roles, people oriented roles, thought oriented roles
    • plant is someone that comes up with the idea
    • resource investigator, co-ordinator, shaper, monitor evaluator, team worker, implementer, completer finisher, specialist (the key is balance)
  • as you learn more your role will change too
  • Margerison-McCann model
  • day to day:
    • positive action over positive thinking – do it rather than mention it / think about it
    • ask the questions!
    • feedback: express what you do want, rather than what you don’t want!
    • reciprocation: essentially states that if someone gives something to us, we feel obligated to repay that debt; give help; ask for help (give and take)
    • always acknowledge, never dismiss or ignore
    • don’t assume that only people with higher jobs than you have valid opinions
    • beware of the curse of knowledge
      – cognitive bias
      – can be off-putting
      – can leave people feeling dejected
      – why should they care?
    • act as a sounding board
    • appreciate any input
    • beliefs followed by behaviours
    • find people who work because they believe people over money
    • always able to offer different perspectives
    • invest time with people whose work crosses organisational boundaries
    • breaking bread – sharing your lunch – best ideas are shared over food
    • remain reliable
    • listen – don’t just sit around with your headphones on – listen and eavesdrop
    • dot the i’s and cross the t’s
    • problems don’t lie in the philosophy of procedures but in practice, and practice is governed by attitude
    • before you speak think – is it true, helpful, inspiring, necessary, kind?

Consensus Talks

“Group Testing” by Christian Baumann

  • Regression testing, how to overcome its error proneness and boredom?
    • no testers in the team
    • group testing! everyone involved, tests distributed randomly, everything until finish and debrief
  • benefits: concurrency and performance issues detected, no one is testing alone, safety net before release;
  • how often do you do it and when? it got forgotten;
  • lists get too long, big tests vs. small tests;
  • regression testing was done in areas where automated tests are lacking;
  • executed regularly (frequency depends on findings);
  • not too boring or repetitive;
  • unsolved issues:
    • decreasing motivation
    • retrospective not done regularly any more
    • number of tests growing
  • officially it was meant to happen every 2 weeks but it just didn’t;

“Are we still testing the wrong stuff?” by Stephan Kamper

  • There’s more to test than what’s desired today
  • two values of software:
    • the ability to tolerate and facilitate ongoing change is the primary value of software – it has to be soft
    • build the software without too many bugs (we’re ok at this [-ish])
  • however, for the primary value people keep saying “you ain’t gonna need it” (YAGNI);
  • everyone (test, ux, ba, managers, etc.) should care about the primary value too;
  • but few teams do this kind of testing – is future readiness not that important after all? is it related to software lifetime?
  • we need zebricon: there’s no answer, but maybe a concrete answer isn’t the point;
  • how about testing future readiness?

“So I am an Agile test manager now… but what does that mean?” by Mitch (surname unknown)

  • Manager could manage test cases… but you would be a tester in that case;
  • could also manage tests… but you would be a test co-ordinator;
  • could also manage testers… but you would be a people manager;
  • strategies and guidelines: manage environment, boundaries around team, strategy and guidelines for self organised teams, impact mapping;
  • 3 amigos idea in a test manager.

Agile Testing Days 2013 – Day 3 Keynotes Notes

The last day of the conference started off with an interesting keynote by David Evans. Here are my notes for his keynote and also the other 2, including the closing keynote from Lisa Crispin and Janet Gregory.

“Visualising Quality” by David Evans

  • The product of testing is confidence, not quality!
  • Decision support – balance risk and reward;
  • the expert witness in a trial;
  • accidents happen – unpredictable, one-in-a-million freak accident:
    • the question was asked whether it would suffer catastrophic failure due to the cold temperature (NASA Challenger accident, 1986)
      • reasonable doubt may mean no go when it comes to a decision
      • engineers did their best to highlight problems
      • failures in communication that resulted in a decision to launch based on incomplete and sometimes misleading information, a conflict between engineering data and management judgement.
  • the value of the information we provide is equal to the value of the decision it informs;
  • testing only isn’t enough, it’s the communication about the collected data that counts;
  • McGurk effect – beware of conflict between what we show and what we say;
  • hard to process information without context – put data together, or compare different data points;
  • add other contexts too – e.g. the military budget of the US expressed as a percentage of US GDP;
  • get feedback quickly – put it somewhere visible;
  • look for opportunities where we are taking shortcuts and bending things beyond their original purpose;
  • system diagrams: scale it by:
    • size
    • usage
    • value
    • risk
  • smell: arbitrary representations – work with the brain, not against it;
  • smell: the warm glow of the dashboard – visualisation overload;
  • smell: colour bias and averaging;
  • false perspective: choose perspective that relates to the context;
  • represent people on a kanban board;
  • keep it simple, get it green (builds);
  • visualise subjective assessments (James Bach low tech dashboard);
  • show named milestones (Jeff Patton’s story map);

“The next decade of Agile Software Development” by J. B. Rainsberger

  • Kent Beck: “Why aren’t we rich yet?”
  • when people say bring me data they really mean shut up and go away;
  • Etudes for Excellence by James Shore
    • agile practices are etudes not rules!
  • we don’t want change because eventually we will do the wrong thing;
  • Tim Lister’s “Adrenaline junkies and Template Zombies”;
  • the opportunity to address risk is what’s missing from daily standups;
  • Tim Lister’s “Waltzing with bears – Managing risks on software projects”;
  • inbox for later process in our whiteboards (if you can’t resolve in 2 minutes);
  • backlog: what are we missing about all those features?
    • involving the customer
    • cartoon for agile project spec – what programmer thought, designer, product owner, etc.
    • talking in examples
      • having conversations is more important than capturing conversations, which is more important than automating conversations – BDD
      • abstraction then talk about details (example driven development)
      • lost luggage example – point to a bag that’s most similar to yours and note how it’s different
  • negotiating the scope: ask how much of each story not which stories;
  • when you spot something wrong in a pairing session wait 15 seconds before you point it out – no one wants to know there’s a semicolon missing;
  • pairing also develops your confidence, and/or humility – helps with trust;
  • “you cook, you clean up”;
  • continuous integration is an attitude not a tool;
  • test retreats

“Build bridges, change viewpoints, delight customers” by Lisa Crispin and Janet Gregory

  • Customer, focus on the customer;
  • it’s all about perspectives:
    • have multiple perspectives
    • be open, listen
    • consider your customer needs
    • are business needs different?
  • changing our culture
    • focus on quality, not speed
    • learning culture
    • short feedback loops
    • respect for all team members
  • ways to collaborate:
    • 3 amigos / power of 3
    • pairing
    • talk about tests
    • continuous feedback
    • impact mapping
    • story mapping
  • collaboration happens easily between children because there is trust;
  • interruption isn’t necessarily rude;
  • test framework:
    • tests/examples drive a test method/fixture, which in turn calls the developer code (see the sketch after this list)
  • delight customers: collaboration helps simplify, deliver what customer wants most;
  • when programmers and testers automate acceptance tests without the customer, they risk merely throwing features over the wall;
  • learn non-threatening ways to ask “why do you want that?”;
  • provide an environment that nurtures change;
  • teach people (testers) how to write a good résumé – instead of leaving, they won’t want to once they realise how much they are learning;
  • trust, visibility, transparency, team focus on quality and … the 3 Cs (contractual, communication and competence);
  • it’s still more important to have the right people than to have them together;
  • it’s only 5 steps to the top – you can rest, do it in small chunks;
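
The “test framework” bullet above refers to the usual acceptance-test layering: the tests/examples drive a fixture (glue) layer, and only the fixture calls the developer code. Here is a minimal, hypothetical sketch of that layering; the class and method names are mine, not from the keynote.

```python
# Developer code (production layer) - a made-up example.
class PriceCalculator:
    def total(self, quantity, unit_price):
        return quantity * unit_price

# Fixture / glue layer: translates the language of the examples into calls
# on the developer code, and nothing else.
class PricingFixture:
    def __init__(self):
        self._calculator = PriceCalculator()

    def total_for(self, quantity, unit_price):
        return self._calculator.total(quantity, unit_price)

# Test / example layer: expresses the business example and delegates all
# the mechanics to the fixture.
def test_three_items_at_two_euros():
    fixture = PricingFixture()
    assert fixture.total_for(quantity=3, unit_price=2) == 6
```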

Agile Testing Days 2013 – Day 2 Keynotes Notes

There were again 3 keynotes during the second day of the conference. Here are some of the notes I took:

“Live it – or leave it! Returning your investment into Agile” by Christian Hassa

  • fixed time, budget and scope: what do we do?
  • SAP Business by design disaster
    • target: 10000 customers
    • 1bn / year
    • started in 2003, 2-3bn investment announced in 2007
    • merged in 2013, first release only in 2010, <1000 customers by 2013
  • fixed time and budget are not the problem, lack of frequent validation is;
  • be prepared that the best laid plans don’t work out;
  • your job as a tester is not to verify software, your job is to verify the world is actually changing (fast enough);
  • scaling TDD to the enterprise – it’s not about how to do more work with more people;
  • impact maps: goal -> actors -> impacts -> deliverables (see the sketch after this list)
    • encourage collaboration with stakeholders
    • break down the goal and turn it into impact map
    • example: business says they want to increase yearly revenue by 3%
      • goal: keep market share in blockbuster concerts; reduce call-center load from blockbuster concerts
      • actors: mobile phone shop users; customers calling to order by phone
      • impacts: reduce bounce rate; order blockbuster tickets
      • deliverables: introduce mobile platform for concert tickets web shop; static information on blockbuster concerts; order one particular blockbuster concert
  • influence vs. control
    • influence – goal, actors and impacts
      • define roadmap of goals
      • test goals and impacts as early and as often as possible using:
        • scale – what to measure
        • meter – how to measure
        • range – benchmark, constraint and target
    • control – deliverables
      • smaller deliverable slices into production
      • easier to parallelise
      • across systems and departments
      • prioritised with business sponsors
  • impact maps and story maps for different levels of collaboration
  • story maps should allow you to describe the product to anyone
  • specification vs. assumptions (slide photo not reproduced here);
  • agile fluency model;
  • conclusion: scaling agile doesn’t mean doing more stuff with more people; it means what to do with higher agile methods;
  • don’t just focus on delivering larger backlogs with larger teams;
  • apply principles to next level: focus on impacts and business goals;
  • elevate your practices: build, measure and learn.
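
To make the impact-map structure easier to see, here is the concert-ticket example from these notes written out as a plain Python structure, with a scale/meter/range block attached to the goal. The pairing of impacts to deliverables and the meter value are my own guesses, so treat this as an illustrative sketch rather than the speaker’s actual map.

```python
# Impact map: goal -> actors -> impacts -> deliverables.
impact_map = {
    "goal": {
        "description": "Increase yearly revenue by 3%",
        "scale": "yearly revenue from concert ticket sales",  # what to measure
        "meter": "monthly web shop sales report",             # how to measure (assumed)
        "range": {"benchmark": "current revenue", "target": "+3%"},
    },
    "actors": {
        "mobile phone shop users": {
            "impacts": {
                "reduce bounce rate": [
                    "introduce mobile platform for concert tickets web shop",
                ],
            },
        },
        "customers calling to order by phone": {
            "impacts": {
                "order blockbuster tickets": [
                    "static information on blockbuster concerts",
                    "order one particular blockbuster concert",
                ],
            },
        },
    },
}

# Deliverables only appear as leaves: everything above them (goal, actors,
# impacts) is about influence; only the deliverables are under our control.
all_deliverables = [
    deliverable
    for actor in impact_map["actors"].values()
    for deliverables in actor["impacts"].values()
    for deliverable in deliverables
]
print(all_deliverables)
```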

“Accelerating Agile Testing” by Dan North

In terms of delivery, Dan North was my favourite presenter. Dan is considered by many to be the father of BDD, and my admiration for his work grew even higher after listening to his keynote.

  • testing isn’t a thing that testers do – it’s a mindset, a capability, it should be on everybody (but it isn’t);
  • scrum gets product owners wrong – they should be IN the team;
  • “bla bla bla test done!” – blahterfall software process;
  • backlogs shouldn’t be groomed;
  • exploratory testing is for the cool kids;
  • what we do reveals what we value
    • values and believes <-> capabilities <-> behaviour
  • which capabilities are missing?
  • don’t automate tests until they are boring to perform;
  • the goal is confidence, not automation;
  • what UX is, is dealing with people’s emotions: people don’t buy Apple products because they are actually $400 better than the device before; the UX is how people actually feel about them, and they queue the night before;
  • emotional response: anger, frustration, delight, curiosity;
  • load and soak testing;
  • where should we test:
    • where the likelihood of failure has a big impact! low likelihood/impact…
    • low risk stuff: need to know enough about it not to care!
    • know about impact/likelihood before worrying about coverage (see the sketch after this list)
  • what should we test:
    • likelihood and impact are meaningless without context: business rightness, security
    • consider operations as a first class stakeholder
    • security type planning poker
    • ask who cares; if the answer is no one, either find someone (stakeholders) or stop doing the work
  • when should we test:
    • now we have a strategy we can feed forward
    • how we test determines when we test
  • explore other testing methods;
  • consider opportunity costs – is the payoff of this worth it? don’t automate all the things!
  • test deliberately – how where what when;
  • waste is anything that does not add economic value to the final product;
  • testing software increases its economic value;
  • become good at something so you know the tradeoffs and the suitability of a method
    • level of experience needed to understand tradeoffs
    • good choices come from experience, and experience comes from bad choices!
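
For the “where should we test” point above, here is a tiny sketch of what a likelihood × impact prioritisation could look like; the feature names and scores are invented purely for illustration.

```python
# Rough likelihood x impact scoring to decide where testing adds most value.
# Scores: 1 = low, 5 = high (all values made up for the example).
features = {
    "payment processing": {"likelihood": 3, "impact": 5},
    "order history page": {"likelihood": 2, "impact": 2},
    "admin colour theme": {"likelihood": 4, "impact": 1},
}

def risk(name):
    scores = features[name]
    return scores["likelihood"] * scores["impact"]

# Test hardest where likelihood of failure meets big impact; for genuinely
# low-risk areas the aim is just to know enough about them not to care.
for name in sorted(features, key=risk, reverse=True):
    print(f"{name}: risk score {risk(name)}")
```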

“Who says we can’t be faster?” by Matt Heusser

  • testers: safety net, someone looks after it and after your back…
  • system 1 vs. system 2 (Daniel Kahneman’s “Thinking, Fast and Slow” book);
  • “Black Swan” by Nassim Nicholas Taleb;
  • every test ends with “And I’m not going to check anything else” or “And I hope nothing else important happens”;
  • if the risk is outside the risk model you can’t see it – great value to add testing into the model;
  • ten kinds of tests:
    • quality factors, creative ideas, states, taxonomies, previous failures
  • coverage decays over time;
  • create and put test ideas on the board to make it visible! (test kanban!);
  • test smarter – the perception is that any change needs a full retest, so automate more;
  • pair or team exploratory testing brings in different people who highlight ideas, bugs, surprises and questions;
  • coach, continue to help other people and keep looking for problems – it’s where software is going and it’s great to be that safety net.
  • the economic benefits of retesting something when things have changed.

Agile Testing Days 2013 – Day 1 Keynotes Notes

The conference days, apart from the first day which was a tutorial one, consisted of a morning keynote, followed by 2 talks (for which you could pick from 5 different tracks), an after-lunch keynote, followed by 2 more talks or workshops, and a closing keynote for the day.

“Agile Testing is nonsense, because Agile is testing” by Andrea Tomasini

It was the first talk/keynote of the conference and the feeling afterwards was that Andrea tried to cover a little bit too much ground, especially considering we were just getting started. I guess you could argue that if he had tried to deliver the same keynote towards the end of the conference then people would have been exhausted already, so it wouldn’t have worked either. There was plenty of information and here are some of my notes (also as a side note: just because I write down my conference notes, it doesn’t necessarily mean I agree with them; it means I would be happy to discuss them with you, and that hopefully they have sparked some interest from you too):

  • testing is an attitude – being competitive is an attitude of mind;
  • questions matter a lot more than the mind;
  • every step performed while creating a new product is unique, therefore all testing is unique;
  • clients don’t always have domain knowledge to know what is right or not;
  • inspect the outcome and learn to validate assumptions and hypotheses;
  • if we test because we don’t trust what the team did then we are far away from understanding the true value of testing;
  • testing as an approach: if we write a test that doesn’t bring anything new, then it brings no value;
  • we like to learn using short feedback loops;
  • being successful once doesn’t mean you got it right – you could have got lucky (“works on my machine!”);
  • reduce:
    • social risk
    • schedule and cost risks
    • business risks (show it to the customer as often as possible; contracts are not as important as collaborations)
    • technical risk (no one does a project twice – each step is an opportunity to learn)
  • testing as a practice:
    • creating a vision with stakeholders
    • test as a vision – you test hypothesis first, then you go back and test again after thinking about scaling the business
    • the next step is consulting – get consultants to help fill the gaps
    • co-creating – you start creating something that at the beginning no one knew what it was
  • don’t overload people when trying to go lean – it’s not only about reducing waste;
  • Unnecessary variations – keep the flow even;
  • wasteful activities – remove non value adding ones;
  • when retrospective actions keep being the same then no continuous improvement process is in place
    • under pressure your muscle memory wins so you will do the same mistake
  • Testing is
    • an attitude, because we embrace the Agile manifesto and its principles; we have to accelerate learning, and it requires individual commitment to validate assumptions
    • everybody makes mistakes, every mistake is an opportunity to learn
    • testing is an approach, because it requires us to systematically initiate everything we do by understanding the constraints and the expected outcome
    • testing is a practice, because once we have developed the attitude and learned the approach, practices will emerge.

“Agile Testing: Strength through interdependence” by Mary Gorman

  • Types of interdependence:
    • pooled interdependence: marketing, develop, training, operations
    • sequential interdependence: analyse, design, develop, test
    • reciprocal interdependence: analyse to and from design, and so on for develop and test
    • comprehensive interdependence: 4 quadrants (analyse, design, develop, test) interacting with each other
  • strongly interdependent team relies on each other;
  • trust: team interdependence is built on a foundation of mutual trust;
  • capacity for trust: trust ourselves and others
    • 3 C’s of trust: contractual, communication and competency
      • contractual: clear expectation, meet commitments
      • communication: who knows what, when, open to feedback from each other, direct, constructive feedback
      • competence: trust of capability, respect each other, engage others and empower them, help others learn new skills
  • agile activities: discover via structured conversation (explore -> value/evaluate -> confirm (loop))
  • typical testing used to happen when code was happening; we need to bring it to the discovery stage (early testing)
  • we can’t afford the V&V model anymore – we need to collapse the hands and bring it together
  • product requirements interdependencies:
    • external: where to begin, where to end?
    • internal: 7 dimensions (user, interface, action, data, control, environment, quality attribute), techniques, user and action / action and data / data and interface have to be tested together, cross interdependent and internal dependent
    • techniques: scenarios, example, data tables, given when then, planguage (Tom Gilb)
    • shared techniques: yield a strong, higher quality product
    • interdependence across all views
    • interdependent wear, fail, dependent controlling, interdependent strong, flexible.

“The science behind building and sustaining high-performance teams through understanding behavioural science, neuroscience and social psychology” by Peter Saddington

  • True self organisation? Theory or reality?
    • ability to change and influence everything around you in autonomous teams
  • high performance – high productivity, fun, sustainability;
  • people are the problem… and the solution, not heuristics, methods or patterns;
  • emotions aren’t accurate over time;
  • as behaviour patterns are understood, accuracy of predictable engagement pattern increases;
  • as team dynamics are understood productivity of teams increase
    • and management effort decreases, shifting towards inspiring, enabling and fulfilling people – managers become managers of inspiration instead of issues;
  • re-interview process:
    • what do other people say you are?
    • what do you love to do (outside work)?
    • who do you look up to (mentor/role model)? why?
    • what type of problems do you enjoy solving (outside work)? why?
    • how do you know you’ve done a good job at something (outside work)?
    • what is your best way of supporting others (outside work)?
  • the power of play (fun):
    • year after year (for the past 30 years) fun has only decreased
    • happier people are more productive people
    • increase fun – increase innovation
  • how do we get more fun at work?
    • purpose
    • autonomy
  • switching context causes disruption to teams: silly and unproductive context switching;
  • multiple projects cost:
    • money to companies
    • 40% loss in productivity
    • visual input drops 29% and brain activation drops 53%
    • context switching linked to memory loss
    • multitasking linked to madness
  • focus is the important word;
  • effective leaders should see themselves not as managers or even problem solvers but as lovers of people and inspiration starters;
  • understand people, increase fun, focus.

Agile Testing Days 2013 – Day 0 (Tutorial) Notes

Agile Testing Days in 2013 was the first conference I went to since I started my professional career in the software industry.

For those of you that don’t know, Agile Testing Days is an annual conference held in the beautiful city of Potsdam, which is situated around 15 miles south west of Berlin, Germany. It’s considered by many to be the best software testing conference in Europe, perhaps on a similar level to EuroSTAR (whose location changes every year as far as I know).

The conference is held Monday to Thursday, with the first day being full of tutorial tracks. In this post I will cover the full-day tutorial I attended, “Exploratory Testing in Practice” by Matt Heusser and Pete Walen. Being quite new to the whole testing community, I did not know a lot about these 2 guys, but since then I’ve learned an awful lot from them and I am really pleased I picked the tutorial I did.

The day started off with people arriving in a tutorial room which was a mess. Well, a mess on purpose. The game we played, alongside the dice game (which at first did my head in), was to rearrange the room following different rules and instructions, and also to work out who else was in the tutorial (from a professional point of view and also a personal one). It certainly set the tone for the day, as the rest of it was again full of games and activities rather than listening to Matt and Pete speak about testing, which in my opinion is how a tutorial should be run!

One of the best activities we did during the day was playing battleships. If you are not familiar with the game, you play against someone else (or pair up so you play 2v2 like we did), and each side has an attacking and a defensive grid of 10 by 10. In the defensive grid you set up your ships and other “vehicles” (jet fighters, etc.), and in the attacking grid you plot the hits or misses you get against the team you are playing against. Each team has a go at a time and the first team to successfully defeat the other team’s vehicles wins.

The point of the exercise was that one team had to follow a previously planned script. So, for example, you could roll the dice to add some randomness into your script, then follow a path to get picks on the opposing team’s chart. The opposition in our case was allowed to take an exploratory approach, so they would base their attacks on previously learnt information (if they had hit something on position B8, then they would most likely try to hit something else on B9 and B7, and possibly on A8 and C8 – you weren’t allowed to position vehicles diagonally). To our surprise, we actually managed to defeat the opposition team even though we had to follow a script. In fairness, our script crashed after a few goes and we had to make some changes – so exploratory was the default winner I guess.

Since then, I have actually used the battleships exercise during an interview, where the interviewee was someone with a lot of experience in software testing, and one of the things we wanted to find out was what this person thought exploratory testing was all about. I truly recommend this as a possible testing kata or to use during an interview (although bear in mind that not all cultures are familiar with this game and it can take some time, even though it covers a lot of ground, from the “is this candidate asking questions before they jump into the puzzle and solve it?” point of view to the more concrete exploration vs. script one).

In the second part of the tutorial we were given a fair bit of information, such as:

  • quick attacks that can be used during exploratory testing sessions;
  • Michael Bolton’s stopping heuristics;
  • mind map tools: xmind, mindmup;
    • mind maps allow for visibility and to create conversations;
  • SF D(I)POT mnemonic:
    • Structure: view source, ss/https;
    • Functionality: change colours, drag and drop, search, duplicate;
    • (I)nterface: csv export, credit card, slow connections;
    • Platform: browser;
    • Operations: resize browser, back and forth between pages;
    • Timings: logout timings, real time updates;
    • (User types): user, guest, owner;
  • Cem Kaner’s website;

I also recorded 3 quotes from Matt:

  • “When sequence makes a difference we have to consider it”
  • “You are guaranteed to miss things that are not in the context.”
  • “In a competitive market place you need software that is beautifully crafted, rather than a set of screws together tap tap tap done! kind of thing”

I thoroughly enjoyed the tutorial day, and hopefully did it justice with this set of notes. But most importantly I got to meet 2 guys (amongst many others during the day) who have become people I admire in the software testing community.