Tag Archives: agile testing days

Agile Testing Days 2015

November in my work calendar means only one thing: Agile Testing Days 2015.

This was the third time I participated in ATD, although this time as a speaker, which was a really interesting experience.

As usual, what is always great at ATD are the conversations. I was lucky enough to speak to Anne-Marie Charrett and find out how she has been coaching testers at her organisation and the challenges she has faced hiring new testers. The MIATP award party on Tuesday night was also brilliant, and it was good to meet some new people from around the world.

One of the highlights of the conference was playing the pen game with my colleague Andrew Fraser (who has also published his conference notes on his blog – a far better representation than mine, check them out here) and playing some testing games on the Wednesday night!

I will write a separate blog post about the talk I presented but here are my (short) notes taken during some of the keynotes and talks I attended.

This year I also missed the tutorials day due to schedule restrictions so I’ll go straight to day 1. I hope you find these notes useful, but if you have any questions or want some clarification feel free to leave a comment!

EDIT: You can now find the slides from all presentations here.

Day 1

“Where words fail music speaks” Keynote by Huib Schoots and Alexandra Schladebeck

  • what’s common between music and testing
    • instant feedback
    • reduce complexity to practice
    • learn fundamentals (own them)
    • muscle memory
    • you don’t practice from front to back
  • teams have the same soundtracks; they are on the same wavelength (i.e. shared understanding)
  • hero complex, music teams and their contexts
    • solo
    • trios/quartets
    • bands
    • orchestras
    • sessions
  • STAR heuristic
    • structure
    • tune
    • accompaniment
    • rhythm
  • know what your cadences are… (authentic and deceptive, like musical notes)
  • exploratory testing and music:
    • scripting versus improvisation
    • communication over documentation
    • standards and music
  • experiments: outcome is unsure
  • non-deterministic and non-reproducible
  • models: based on experience, culture, etc.
  • lessons:
    • learn your team’s tune
    • finish your sprints on a good cadence
    • recognise your role and context
    • practice, patterns, adapting = you’ll be a star


“Experimenting in context for Exploratory Testing” Talk by Maaret Pyhäjärvi

  • replacing a test case driven style with a learning tester driven style in two organisations
  • what testing gives us:
    • unit testing: spec, feedback, regression, granularity, testing as artifact creation
    • exploratory testing: guidance, understanding, models, serendipity, testing as performance
  • data intensive application
    • things that couldn’t be changed at certain organisations (the “givens”)
      • waterfall process
      • contractual distance between acceptance testers and subcontractor
      • test case metric based reporting
      • manage, don’t test
      • business end users as testers
    • things that have changed (the “experiments”)
      • acceptance tester degree of freedom
      • test cases from step by step scripts to test data with process outline for notes
      • making change requests acceptable
      • reporting ~20% of testing to 3rd party
      • unofficial tips sharing sessions with the subcontractor
  • Function intensive application:
    • “Givens”
      • roadmapping creating disconnect to current priorities
      • tendency for remote work
      • developers doing majority of testing
      • requirements / specifications format as UI spec
    • “Experiments”
      • no test cases or wasteful documentation
      • tester with developer tools
      • removing “acceptance testing” by moving testing to the team
      • continuous delivery (without test automation)
      • holding space for testing to happen
      • true teamwork with mob programming

  • session debrief with PROOF:
    • past
    • results
    • obstacles
    • outlook
    • feelings
  • vision, current charter, other charters, details (bug reports)
  • charters to give ideas of exploration


“A happy marriage between context-driven and agile” Talk by Ilari Henrik Aegerter

  • Automate everything?
  • checking vs. testing
  • checking the algorithmically catchable stuff
  • TDD is not testing
  • automated check:
    • passed – ok or missing a bug
    • failed – there is a problem or false alarm
  • “Embedded testers”
  • Context-driven and agile manifesto
  • context driven testers will “click” really well in agile teams
  • ilari.com/agile PTE Agile Testing Manifesto
  • how we do it:
    • get involved early
    • bridge between developers and testers
    • pair on tasks
    • educate the team about testing
    • technical awareness “I comment my code – but I still don’t know what I did or why”
    • domain knowledge
    • willingness to learn
    • when craftspeople meet other craftspeople, that’s when the magic happens
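The four automated-check outcomes listed above (pass because all is well, pass while missing a bug, fail on a real problem, fail as a false alarm) can be sketched in a few lines of Python. The `discount` function and its bug are invented purely for illustration:

```python
def discount(price: float) -> float:
    """Intended behaviour: 10% off. The bug: subtracts a flat 10 instead."""
    return price - 10  # bug: should be price * 0.9

# Weak check: only asserts the result is cheaper than the input.
# It passes - but misses the bug ("passed - missing a bug").
weak_check_passed = discount(100) < 100

# Exact check at an unlucky input: 100 - 10 == 100 * 0.9, so this also
# passes here, still missing the bug.
exact_check_at_100 = discount(100) == 90.0

# The same exact check at another input fails - there is a problem
# ("failed - there is a problem").
exact_check_at_50 = discount(50) == 45.0
```

A check only ever verifies what it encodes, which is why a green run can mean either "ok" or "missing a bug".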


“Testers are dying” Keynote by Karen Greaves and Sam Laing

  • There’s more demand than there’s capacity – testers aren’t dying
  • 6 steps:
    • awareness
    • punishment (different tariffs, rates going up)
    • personal responsibility
    • remove barrier
    • visible impact
    • start a movement
  • trends:
    • pressure at the end of the sprint
    • often asked to release without testing
    • testing is always behind development
    • blame around bugs
    • automation is at least a sprint behind
  • personal responsibility by different boards – different columns like “show me”, “review (in pairs)”, different colour coding


“The human refactor experiment” Keynote by Bryan Beecham

  • horizon of predictability: we can only see so far into the future
  • moment of maximum ignorance
  • get past mental limits – other people will start buying in
  • 23 1/2 hour challenge
  • refactoring:
    • exercise
    • diet
    • continuous improvement
  • make the world a kanban board


Day 2

“If I can do it so can you” Keynote by Dr. Sue Black

Dr. Black told the audience the story of her life right from growing up in the suburbs of London to forming a campaign that helped save Bletchley Park, the home of the World War II code breakers. Really inspirational.


“Shift left and shift right – the testing swing” Talk by Laurent Py

  • journey from waterfall to agile
  • breaking silos
    • speed of feedback loop
  • put value before correctness
    • why? is it worth doing?
    • how to ensure quality of deliverables?
  • testing matrix
    • build / production
    • what should we build and why? is it really worth it?
    • how to automate? is it really reliable and does it scale?
  • good for challenging business assumptions


“There is no secret sauce” Keynote by Ben Williams and Tom Roden

  • investing in impact model
  • are we investing in the right things?
  • what aren’t we going to do now?
  • investing in impact – a leaner approach to investing in software
    1. decide to invest – explicitly acknowledge hypothesis (between writing software and making money), consider this investment in the context of others
      • diversification: people’s skills, technical component
    2. establish performance boundaries
      • ranged planning
      • proactively manage risk
    3. measure income
    4. measure impacts
    5. decentralised decision making
      • steer by exception
    6. learn


“Test beyond quality, beyond software” Keynote by Mike Sutton

  • Steve Blank’s book “The Startup Owner’s Manual: The Step-by-Step Guide for Building a Great Company”
  • agile teams are a place to cultivate behaviours and cultures for a new reality
  • it’s time for agile teams and testing discipline to go beyond software, beyond testing and into organisations
  • ask questions, go where they lead
    • “Why does it take so long?”
    • “What if?”


Day 3

Unfortunately, after giving my talk I didn’t take any notes during the two keynotes I attended. The first was Tom Bradford’s “Nowhere and back again: a software engineer’s tale”, in which he described his career in software development and how he tried to quit, only to go back hoping to make things better. The second, which was also the final keynote of the conference, was Olaf Lewitz’s “Integral Quality”, in which he talked about quality across all parts of the system and the organisation and what we could do to improve it.


Agile Testing Days 2014

Last November I attended Agile Testing Days 2014. If you have read my previous posts you will recognise that this was the second time I attended the conference.

It was great being back in Potsdam, Germany, seeing many familiar faces, including some from across the pond, whilst also meeting new ones from all over the place.

This time I had decided to pace myself in terms of the number of talks I attended, the tweets I posted and the notes I took, so I could enjoy more of the conversations. I hoped that by taking in less information I would retain it for longer and use it more consistently afterwards.

Below are the notes I took during the 4 days, which started off with a tutorial day on a Monday.

“Technical testing in an agile environment” by Alan “The evil tester” Richardson

  • Technical testing:
    • reminder to keep going deeper
    • tool augmentation
    • technical details will:
      • inspire more testing
      • identify more risks
      • identify different problems
    • not limit our testing to acceptance criteria
  • MORIM:
    • Model: understand different viewpoints
    • Observe: corroborate, invalidate model
    • Reflect: find gaps, lack of depth, derive intent
    • Interrogate: focussed, deep dive observation with intent
    • Manipulate: hypothesis exploration and “how we do stuff”
  • tool augmentation:
    • is not automation, it uses automation
    • passively observe, maintain history of observations
    • alert specific conditions
    • observe the unobserved, and interrogate the inaccessible
    • help model, reflect and manipulate
    • never tools to control, tools to augment
  • go beyond the surface structure
    • transformational grammar
    • surface and deep structure
    • chomsky
    • multiple surface structures
    • single deep structure
      • filtered, biased, distorted -> surface structure
    • questions operate as tools to investigate surface to deep mapping people
  • how to do technical testing:
    • identify tools
    • questioning systems at different surface levels
    • learning system structure technology
    • model system surface structures
    • observe system surface structures
  • automation? sure if you have time;
  • “EditThisCookie” plugin;
  • fiddler and use of its breakpoints feature;
  • burpsuite;
  • “The Web Application Hacker’s Handbook”;
  • “The tangled web: a guide to securing modern web applications”;
  • retrospectives:
    • don’t just pat on the back
    • raise process issues that impact
    • agree what to do about them
    • treat broken windows
    • you might have to be mr. nag and mrs. nasty
  • standups:
    • pay attention to changes
    • describe in value terms
    • seek help, pairing
  • acceptance tests:
    • abstraction layers
    • re-use abstraction layers for adhoc automation as well as acceptance tests
    • seek to understand technicalities as well as domain
    • pair

I really enjoyed this tutorial day – it was probably the first tutorial where we spent the majority of our time doing what we do on a day-to-day basis: testing software. Yes, we got stopped and had some time limits, but those breaks were used very effectively, not just by Alan but by everyone in the room, as we shared our experiences and the different tools we use in our technical testing roles. The only improvement I would suggest would be more pairing, with different pairs for the testing exercises, as it could spark more conversations and set the tone for people to carry on during the breaks and the rest of the conference. But I must say a big thank you to Alan, as this was certainly the most useful tutorial I’ve been to so far in my career 🙂

Day 1

“Welcome to the future! Preparing for our agile testing journeys” by Lisa Crispin and Janet Gregory

  • Preparing for the future, or the agile future…;
  • what are the challenges, what can testers do? How can we change conversations?;
  • ability to broaden t-shape skills (breadth and depth);
  • ability to learn, become a t-shaped tester, cognitive learning skills, take charge of your career;
  • customers assume that we know what they want and they want us to deliver exactly what they are thinking;
  • sometimes it’s better to train people than automate certain processes, ask good questions, walk people through;
  • borrow from other disciplines, like business analysis (the 7 product dimensions);
  • change it:
    • from counting bugs
    • from counting metrics that do not matter
  • we have to start thinking about all kinds of risks – call for more risk assessment?;
  • models can help us choose how to attack a problem;
  • serious play is a great way to learn about new things (play, observe, innovate);
  • inspect and adapt: try again, scrap it and try again – what’s next for you?

“A poet’s guide to acceptance testing” by George Dinwiddie

  • We want tests to last over time;
  • not just automated tests, the way you express yourself counts;
  • you need to start thinking about the theme, then start analysing each part;
  • without context words can be confusing;
  • “Given Mary had a little lamb, When Mary went to school, Then the lamb went to school”;
  • purpose is to be picky so words can stand the test of time;
  • some words like everywhere can be quite hard to test against – use examples that cover a good range of options;
  • when a scenario has too much setup that’s a clue – you may not need all that;
  • why do we write tests that ask questions of what we should do? We should ask what the system should do;
  • the purpose of this talk is that words matter – pay attention to them.
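George’s Mary-and-lamb scenario can be made concrete in a minimal Python sketch. The `School` and `Lamb` classes are hypothetical, invented only to show how the Given/When/Then phases keep wording and behaviour aligned:

```python
class School:
    """A hypothetical domain object: a place things can attend."""
    def __init__(self):
        self.attendees = set()

    def attend(self, who):
        self.attendees.add(who)


class Lamb:
    """A hypothetical domain object: a lamb that follows its owner."""
    def __init__(self, owner):
        self.owner = owner

    def follow(self, school):
        # The lamb goes wherever its owner goes.
        school.attend(self)


# Given Mary had a little lamb
school = School()
lamb = Lamb(owner="Mary")

# When Mary went to school
school.attend("Mary")
lamb.follow(school)

# Then the lamb went to school
assert lamb in school.attendees
```

Because the context ("Given Mary had a little lamb") is explicit, the words in the "Then" step are unambiguous – exactly the point about words needing context to stand the test of time.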

“From good to great: Combining session and thread-based test management into xBTM” by Michael Albrecht

  • SBTM/TBTM/xBTM: models, mindmaps and automatic reporting;
  • xBTM in a nutshell is both session and thread based test management;
  • James Bach’s exploratory testing spectrum;
  • session: test charter – what and what problems;
    • produce a session report (time, bugs, setup, defects, issues and notes);
  • PROOF – past, results, obstacles, outlook and feelings;
    • past and results are very similar in the report
  • ratio for session based test management is 15 minutes test design, 90 minutes test execution or 10/60 minute ratio;
    • fixed length
    • planned (charters, key areas)
    • reviewable (reports)
  • TBTM – threads are a test activity or test idea;
  • use mindmaps to generate threads;
  • to do list, no timeboxing, working in parallel, mind maps;
  • xBTM – models for test design (like showing a map to point out a country);
  • “yEd” tool;
  • using a mindmap, navigate through the flow of the system to guide your session/test design.
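A possible shape for the session report described above, using the fields from the notes (charter, time, setup, bugs, issues, notes) plus the PROOF debrief. The field names here are my own choice, not a standard SBTM schema:

```python
from dataclasses import dataclass, field

@dataclass
class SessionReport:
    """One fixed-length exploratory testing session, SBTM-style."""
    charter: str                       # what to test and what problems to look for
    duration_minutes: int              # fixed length, e.g. a 90-minute session
    setup: str = ""
    bugs: list = field(default_factory=list)
    issues: list = field(default_factory=list)
    notes: list = field(default_factory=list)
    # PROOF debrief: past, results, obstacles, outlook, feelings
    proof: dict = field(default_factory=dict)


report = SessionReport(
    charter="Explore the import wizard with malformed CSV files",
    duration_minutes=90,
    bugs=["Crash on empty header row"],
    proof={"past": "covered the happy path", "outlook": "revisit encodings"},
)
```

Keeping the charter and the PROOF debrief in the same record makes the sessions planned (charters) and reviewable (reports), as the notes above describe.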

“Strategy testing // building a product users want” by Roman Pichler

  • What a boss wants vs. what programmers want – programmers just want to give it a go, build a prototype and see if it flies;
  • vision – your goal;
  • strategy – path to the goal;
  • details – steps;
  • who are the users and who are the customers?
  • “find an itch that’s worth scratching”;
  • test your strategy – areas of uncertainty, risk assessment;
  • go and talk to your users about the experiences and how they use (your) product today;
  • create a failure tolerant environment.

“The antimatter principle” by Bob Marshall

  • Golden rule – treat as you want to be treated;
  • platinum rule – treat people as they want to be treated;
  • the antimatter principle – antimatter is the most expensive/rare substance known to man;
  • attend to people’s needs;
  • why do we do software development? Why do we do testing?
    • to attend to people’s needs
  • autonomy, mastery and purpose – Dan Pink;
  • Nonviolent Communication by Marshall Rosenberg;
  • our default mode when we are “not thinking about anything” is to think about ourselves in relation to others – neuroscience research;
  • theory x or theory y organisation.

“Testing the untestable” by Peter Thomas

  • You don’t know what you don’t know;
  • Dan North’s “deliberate discovery”;
  • blue green deployments (continuous delivery);
  • the third way;
  • sometimes “The best testers… are your users”;
  • monitoring over testing…;
  • look for abnormal patterns in your data, usage, etc.

“The pillars of testing” by David Evans

  • Model: a (sub) set of things to create or improve within a development or testing process;
  • “all models are wrong, some of them are useful”;
  • Confidence, above safety and courage;
  • stronger evidence, better test design, greater coverage, faster feedback;
  • collaboration and communication, business engagement, team structures, test techniques, skills and experience, effective automation, application testability, configuration management;
  • there’s also the bottom layer which represents the strong base;
  • then there’s the foundation layer;
  • this model can be used to discover or apply root cause analysis;
  • it can also be used to assess, survey teams and organisations, rate the perceived success and importance of each element and look for hotspots and for variances;
  • confidence is the balance of safety and courage.

“Don’t put me in a box” by Anthony Marcano

  • Most people tell you what they are instead of what they do;
  • categorisation still defines what we do for example mum and dad duties;
  • quality comes from people and not from processes;
  • sharing the responsibility within the team doesn’t stop people having expertise or being the expert/reference point, but we may not need him/her all the time, we can all do it.

“Pull requests and testing can be friends” by Alan Parkinson

  • Use files changed in pull requests to guide your charters and your exploratory testing;
  • ask questions in the pull requests comments feature to learn about risks;
  • learn from history;
  • see who contributed what.
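The first point can be sketched as a small helper that turns a pull request’s changed-file list into exploratory-testing charter suggestions, one per component touched. The paths and charter wording are invented; in practice the list would come from your code host’s pull-request view or API:

```python
from collections import defaultdict

def charters_from_changed_files(paths):
    """Group changed files by top-level directory and suggest one
    charter per component touched by the pull request."""
    by_component = defaultdict(list)
    for path in paths:
        component = path.split("/", 1)[0]
        by_component[component].append(path)
    return [
        f"Explore {component} (touched files: {len(files)}) "
        f"to discover risks introduced by this change"
        for component, files in sorted(by_component.items())
    ]


changed = [
    "billing/invoice.py",
    "billing/tax.py",
    "ui/checkout_form.js",
]
charters = charters_from_changed_files(changed)
```

Even this crude grouping makes it obvious which areas of the system a review should go deeper on, which is the point of letting the diff guide the charters.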

“Lateral and critical thinking for testers” by Dan Ashby

  • Left and right side of the brain;
  • left – critical thinking;
  • when we get the information upfront it’s a lot easier to ask questions
    • it’s hard to use critical thinking alone!
  • lateral thinking is when thinking leads the information;
  • “Lateral thinking” by Ed de Bono;
  • Lateral = side thinking.

“Communication: What are you thinking about?” by Shachar Shiff

  • Bad communication causes failure;
  • “Visual aids improve communication”;

“Helping testers add value to Agile projects” by Alan “The evil tester” Richardson

  • Testers adapt different filters to different systems;
  • waterfall projects:
    • removed waste
    • responding to need, not want
    • exploring more
    • taking responsibility for my testing rather than conforming to the process
  • view it as systems rather than agile;
  • testing needs to remain viable and needs to add value;
  • steal from other disciplines (systems thinking, cybernetics, etc.);
  • ownership for testing;
  • exploration beyond acceptance criteria;
  • we need strategies that cope with things coming fast and things getting delayed.

In summary, Agile Testing Days was yet again a great conference and a great week spending quality learning time with people I previously admired and new people I met and learnt to admire. On Tuesday night I saw Matt Heusser receive his Most Influential Agile Testing Person award for 2014, which I was happy about as I voted for him, and the Brazilian National Team was awarded 1st place in the Software Testing World Cup. Most of the other nights were spent in the bar chatting with people from all over the place, whilst also helping out at the Games Night on Wednesday, where I was (as you would expect) one of the people co-ordinating the SET game table. I also had the chance to act as the coach in the infamous pen game for two of my Redgate colleagues. What may seem off topic to some people, and hopefully on topic for others: we also went to one of the “lock rooms” in Berlin, where you have to solve a variety of puzzles in order to get out of a locked room. That also opened my eyes to the variety of skills testers have, and it was great to meet some more people and chat in a slightly more relaxed environment (except for the fact we were locked in a room!).

Hopefully see you at Agile Testing Days 2015!

Agile Testing Days 2013 – Day 3 Talks Notes

On the last day, after being in Germany for just over four days, I decided to still attend the morning talks and take a look at the sketching workshop (which I left after the break: I was tired, and I’m really poor at drawing, which meant I couldn’t concentrate on what we were asked to do, so I thought I’d give it another go some other time). Here are the key points from the morning talks I attended.

“Natural born testers. Are you one? If not, then become one!” by Graham Thomas

  • A natural born tester is someone who tests by default – whatever it is. Not destructively, or maliciously, just out of habit, or compulsion, a “what if?”;
  • who is a natural born tester in this picture?
    • Hopefully you guessed (3)
  • why lemmings?
    • skills – multi-tasking, parallel processing, problem solving, time management, goal oriented, fun
  • why play railroad tycoon?
    • all about planning, management, different views by context
    • monitoring, measuring and predicting
    • controlling
    • adapting to change
    • reacting to change
    • fun
  • angry birds:
    • teaches you to explore your content
    • simple solution is not always optimal
    • different techniques
    • combine techniques
    • plan
    • think in the abstract
  • playing angry (test) birds – hit different parts of the code
  • learn through play – raspberry pi, penguin puzzle

“Don’t you trust me?” by Seb Rose

  • Go through the behaviours with the business, everyone involved, stakeholders, look for the knowns and the unknowns;
  • our systems can be described by their behaviours;
  • Cucumber is good because it will bring everyone together to specify software – developers, testers, BAs, product owners;
  • it also helps you give live documentation which is why it has some advantage over other tools;
  • what is the problem with this collaboration in BDD?
    • some places aren’t quite as agile as they think they are
    • talking to each other – BDD actually helps with this because you need to speak to each other
  • look at different components, don’t just test drive them;
  • regaining trust
  • too many organisations are “agile” in a way that is not what we would like to think of as agile – they are still too structured;
  • Ron Jeffries – No Estimates;
  • testing pyramid (unit/integration/end-to-end/exploratory & manual);
  • ice-cream cone anti-pattern
  • don’t treat acceptance tests as system tests – both are different and have a different audience;
  • be careful what you test with BDD – it’s expensive, certain things you can go directly to the method and test it there

Agile Testing Days 2013 – Day 2 Talks Notes

On the second day of the conference, when I was already feeling quite exhausted, I attended a couple of talks in the morning and sat in the main room during the afternoon to see what the consensus talks were about. Here are my notes for the day.

“BDD – Live coding sessions” by Carlos Ble

Carlos’ idea for this session was that he would code a web app in real time so that the audience could participate by giving him feedback and coming up with requirements. He would then use this to write up his scenarios before translating them to Cucumber. Unfortunately there were too many problems at the start of the session due to the network and his server, so we only had limited time. Carlos still managed to explain what he intended to do, and as easy as it would be to say that he could have prepared better, those things can happen (live demo gods) and it was brave to do something like that in front of a packed room. I caught up with Carlos in the evening as we were both having dinner in the same restaurant and we had a little chat about BDD and his experiences with it, and I got plenty of resources to go and look at, so it was definitely worth attending. Also, my brain got a bit of a break 🙂

“Myths of exploratory testing” by Luis Fraile and Jose Aracil

This was one of the most discussed talks (at least on Twitter) of the conference, partly because of some of the very controversial claims the presenters made during the talk, but also because they seemed to have a slightly different idea of what exploratory testing is than most people in the audience. Despite this, here are their key points.

  • when you explore it you want to come back;
  • keys to success: inspect and adapt; be creative! take advantage of your team and skill set; additional to other testing; quickly find defects; add value to your customer; and test early/test often;
  • myth 1: same as ad-hoc testing
    • must be planned and documented
    • you must know: what has been tested; when it was tested; what defects were logged
    • some ideas: testing tours by James Whittaker (!); session based from James Bach; your own method
  • myth 2: can’t be measured
    • multiple measurement techniques: SBTM; amount of logged defects; defects vs. user story complexity
    • you must be creative
    • warning.. warning! don’t be fooled by metrics
  • myth 3: endless
    • difficult to focus on long tasks (> 25 mins)
    • endless = useless
    • must focus on specifics; requirements, problems and complex parts
    • stay focused for burst periods (25 mins)
    • pomodoro testing
  • myth 4: can’t reproduce defects
    • how do you reproduce a defect: be an explorer, like David Livingstone
    • toolset: record video/audio; screen capture; analog recording
  • myth 5: only for agile teams
    • inspect and adapt
    • insanity is doing the same thing over and over again and expecting different results
    • look for new ways of testing
    • empower your team by allowing creativity
    • do you trust your team?
  • myth 6: not documented
    • tester “lonely planet”: user manual; online help (F1 tour); help from third parties
    • alternative tester: goes outside the tour (cancelling processes halfway, using undo, doing things twice); use uncommonly used functionality or processes; always with an objective in mind
    • second visit: you need pictures/notes

Consensus Talk

“Organization, Roles and Responsibilities of Testers and Test Managers on Agile Projects” by Dr. Jennifer Blechar

  • 2 projects which were virtually the same, yet they evolved differently – here are the learning points:
    • there’s no I in team:
      • testers must be part of the team
      • test managers outside of teams useful to co-ordinate across teams
      • consider dedicated technical testers
      • consider dedicated test automation experts
      • communication essential
    • use the right tool for the job:
      • everyone in the team must use the same tools
      • user stories in the tool – linked to tests created by testers, users, devs, etc.
      • careful evaluation of test automation tools prior to implementation
    • never underestimate the value of socialising
      • people are much more likely to “go along” with your ideas if they know you
      • make time to get to know key stakeholders in the project – this includes the customer as well as developers
      • create opportunities for socialising
    • get everyone on board – and keep them on board!
      • the test plan is useless if only the testers agree with it – everyone needs to buy in to the test plan and be committed to their role in the process
      • relationships need to be constantly maintained
      • don’t be afraid to change course if needed – use lessons learned
    • reporting
      • customers and other key stakeholders need to be aware of the progress at all times – consider “light” but often reporting
      • reporting is also motivational and useful to get and keep everyone on-board
  • five factors identified as influencing the success of agile testing effort
  • additional factors likely, most important factor is to be agile!

Agile Testing Days 2013 – Day 1 Talks Notes

On the first day of the conference proper (technically the second if you count the tutorial day) I attended the two morning talks, and in the afternoon I floated between consensus talks and the various workshops and vendor booths. Here are the key points from those talks.

“How to avoid the testing swiss cheese syndrome” by Marc Rambert

  • We’re not born testers – we become testers;
  • 2% change in a full release (development effort), how much testing effort is required?:
    • 10 test cases related to new features and bugs fixed in this release
    • 90 regression test cases
    • solution: there is no relation – it’s too difficult to align testing and coding
  • speed and continuous delivery make it impossible to test everything after each change;
    • change request implemented and tested (build 0)
    • functional regression set #1 (build 1)
    • bug because of last minute effect (build 2)
    • go live!
  • strategies to focus testing where it adds value (requirements, risks, experience, collaboration)
  • a new opportunity to improve testing in a black box:
    • learning system: learn your tests as usual but capture footprints (link code and test)
    • detection: application changes
    • smart engine
    • you can also add tests that have been run before
  • test scoring to prioritise test execution
  • avoid the testing swiss cheese syndrome: find a way to make your application speak
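The footprint idea above (link code and tests, then react to application changes) can be sketched as a simple scoring function that prioritises test execution. The names and the overlap-based weighting are my own assumptions, not the actual engine described in the talk:

```python
def score_tests(footprints, changed_files):
    """footprints: {test_name: set of source files the test exercised}.
    Score each test by the fraction of its footprint that changed,
    and return test names ordered most-relevant-first."""
    changed = set(changed_files)
    scores = {
        test: len(files & changed) / len(files)
        for test, files in footprints.items() if files
    }
    return sorted(scores, key=scores.get, reverse=True)


# Hypothetical footprints captured during earlier runs.
footprints = {
    "test_login": {"auth.py", "session.py"},
    "test_report_export": {"report.py", "pdf.py"},
    "test_password_reset": {"auth.py", "mail.py"},
}

# auth.py changed in this release, so the auth-touching tests rank first.
order = score_tests(footprints, changed_files=["auth.py"])
```

This is the same reasoning as the "2% change" example: rather than rerunning all 90 regression cases, the footprint links tell you which few are actually affected by the change.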

“Be a real team member” by Tony Bruce

  • What makes a good team member:
    • engage, and use that to build a relationship, interest and motivate people (4 keyword framework)
    • motivate: provide someone with a reason for doing something
  • Models (engage)
    • Belbin team model – action oriented roles, people oriented roles, thought oriented roles
    • plant is someone that comes up with the idea
    • resource investigator, co-ordinator, shaper, monitor evaluator, team worker, implementer, completer finisher, specialist (the key is balance)
  • as you learn more your role will change too
  • Margerison-McCann model
  • day to day:
    • positive action over positive thinking – do it rather than mention it / think about it
    • ask the questions!
    • feedback: express what you do want, rather than what you don’t want!
    • reciprocation: essentially states that if someone gives something to us, we feel obligated to repay that debt; give help; ask for help (give and take)
    • always acknowledge, never dismiss or ignore
    • don’t assume that only people with higher jobs than you have valid opinions
    • beware of the curse of knowledge
      – cognitive bias
      – can be off-putting
      – can leave people feeling dejected
      – why they should care?
    • act as a sounding board
    • appreciate any input
    • beliefs followed by behaviours
    • find people who work because they believe people over money
    • always able to offer different perspectives
    • invest time with people whose work crosses organisational boundaries
    • breaking bread – sharing your lunch – best ideas are shared over food
    • remain reliable
    • listen – don’t just sit around with your headphones on – listen and eavesdrop
    • dot the i’s and cross the t’s
    • problems don’t lie in the philosophy of procedures but in practice, and practice is governed by attitude
    • before you speak think – is it true, helpful, inspiring, necessary, kind?

Consensus Talks

“Group Testing” by Christian Baumann

  • Regression testing, how to overcome its error proneness and boredom?
    • no testers in the team
    • group testing! everyone involved, tests distributed randomly, test everything until finished, then debrief
  • benefits: concurrency and performance issues detected, no one is testing alone, safety net before release;
  • how often do you do it and when? it got forgotten;
  • lists get too long, big tests vs. small tests;
  • regression testing was done in areas where automated tests are lacking;
  • executed regularly (frequency depends on findings);
  • not too boring or repetitive;
  • unsolved issues:
    • decreasing motivation
    • retrospective not done regularly any more
    • number of tests growing
  • officially it was meant to happen every 2 weeks but it just didn’t;

“Are we still testing the wrong stuff?” by Stephan Kamper

  • There’s more to test than what’s desired today
  • two values of software:
    • the ability to tolerate and facilitate ongoing change is the primary value of software: it has to be soft
    • build the software without too many bugs (we’re ok at this [-ish])
  • however, regarding the primary value people keep saying “you ain’t gonna need it” (YAGNI);
  • everyone (test, ux, ba, managers, etc.) should care about the primary value too;
  • but, few teams do this kind of testing, future readiness not that important after all? relation to software life time?
  • we need zebricon: there’s no answer, but maybe a concrete answer isn’t the point;
  • how about testing future readiness?

“So I am an Agile test manager now… but what does that mean?” by Mitch (surname unknown)

  • Manager could manage test cases… but you would be a tester in that case;
  • could also manage tests… but you would be a test co-ordinator;
  • could also manage testers… but you would be a people manager;
  • strategies and guidelines: manage environment, boundaries around team, strategy and guidelines for self organised teams, impact mapping;
  • 3 amigos idea in a test manager.

Agile Testing Days 2013 – Day 2 Keynotes Notes

There were again 3 keynotes during the second day of the conference. Here are some of the notes I took:

“Live it – or leave it! Returning your investment into Agile” by Christian Hassa

  • fixed time, budget and scope: what do we do?
  • SAP Business by design disaster
    • target: 10000 customers
    • 1bn / year
    • started in 2003, 2-3bn investment announced in 2007
    • merged in 2013, first release only in 2010, <1000 customers by 2013
  • fixed time and budget are not the problem, lack of frequent validation is;
  • be prepared that the best laid plans don’t work out;
  • your job as a tester is not to verify software, your job is to verify the world is actually changing (fast enough);
  • scaling TDD to the enterprise – it’s not about how to do more work with more people;
  • 005
  • impact maps: goal -> actors -> impacts -> deliverables
    • encourage collaboration with stakeholders
    • break down the goal and turn it into impact map
    • example: business says they want to increase yearly revenue by 3%
      • goal: keep market share in blockbuster concerts; reduce call-center load from blockbuster concerts
      • actors: mobile phone shop users; customers calling to order by phone
      • impacts: reduce bounce rate; order blockbuster tickets
      • deliverables: introduce mobile platform for concert tickets web shop; static information on blockbuster concerts; order one particular blockbuster concert
  • influence vs. control
    • influence – goal, actors and impacts
      • define roadmap of goals
      • test goals and impacts as early and as often as possible using:
        • scale – what to measure
        • meter – how to measure
        • range – benchmark, constraint and target
    • control – deliverables
      • smaller deliverable slices into production
      • easier to parallelise
      • across systems and departments
      • prioritised with business sponsors
  • impact maps and story maps for different levels of collaboration
  • story maps should allow to describe the product to anyone
  • specification vs. assumptions;
  • agile fluency model;
  • conclusion: scaling agile doesn’t mean doing more stuff with more people; it means applying agile methods at a higher level;
  • don’t just focus on delivering larger backlogs with larger teams;
  • apply principles to next level: focus on impacts and business goals;
  • elevate your practices: build, measure and learn.
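The goal -> actors -> impacts -> deliverables chain from the impact map example above can be sketched as a small data structure. This is purely illustrative (the nesting and the `deliverables` helper are my own, not from the talk); only the concert-ticket content comes from the notes.

```python
# The concert-ticket impact map from the notes, modelled as nested
# dictionaries: goal -> actors -> impacts -> deliverables.
impact_map = {
    "goal": "increase yearly revenue by 3%",
    "actors": {
        "mobile phone shop users": {
            "reduce bounce rate": [
                "introduce mobile platform for concert tickets web shop",
            ],
        },
        "customers calling to order by phone": {
            "order blockbuster tickets": [
                "static information on blockbuster concerts",
                "order one particular blockbuster concert",
            ],
        },
    },
}

def deliverables(imap):
    """Flatten every deliverable so the list can be prioritised with sponsors."""
    return [d
            for impacts in imap["actors"].values()
            for ds in impacts.values()
            for d in ds]

print(deliverables(impact_map))
```

Walking the map this way keeps each deliverable traceable back to the impact and goal it is supposed to serve, which is the whole point of the technique.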

“Accelerating Agile Testing” by Dan North

In terms of delivering the presentation Dan North was my favourite. Dan is considered to be by many the father of BDD and my admiration for his work grew even higher after listening to his keynote.

  • testing isn’t a thing that testers do – it’s a mindset, a capability, it should be on everybody (but it isn’t);
  • scrum gets product owners wrong – they should be IN the team;
  • “bla bla bla test done!” – blahterfall software process;
  • backlogs shouldn’t be groomed;
  • exploratory testing is for the cool kids;
  • what we do reveals what we value
    • values and believes <-> capabilities <-> behaviour
  • which capabilities are missing?
  • don’t automate tests until they are boring to perform;
  • the goal is confidence, not automation;
  • what UX really is, is dealing with people’s emotions: people don’t buy Apple products because they are actually $400 better than the device before; the UX is how people feel about them, enough to queue the night before;
  • emotional response: anger, frustration, delight, curiosity;
  • load and soak testing;
  • where should we test:
    • where failure is both likely and has a big impact! (as opposed to low likelihood/impact areas…)
    • low risk stuff: need to know enough about it not to care!
    • know about impact/likelihood before worrying about coverage
  • what should we test:
    • likelihood and impact are meaningless without context: business rightness, security
    • consider operations as a first class stakeholder
    • security type planning poker
    • ask who cares; if the answer is no one, either find someone (stakeholders) or stop doing the work
  • when should we test:
    • now we have a strategy we can feed forward
    • how we test determines when we test
  • explore other testing methods;
  • consider opportunity costs – is the payoff of this worth it? don’t automate all the things!
  • test deliberately – how where what when;
  • waste is anything that does not add economic value to the final product;
  • testing software increases its economic value;
  • become good at something so you know the tradeoffs and the suitability of a method
    • level of experience needed to understand tradeoffs
    • good choices come from experience, and experience comes from bad choices!

“Who says we can’t be faster?” by Matt Heusser

  • testers: safety net, someone looks after it and after your back…
  • system 1 vs. system 2 (Daniel Kahneman’s “Thinking, Fast and Slow”);
  • “Black Swan” by Nassim Nicholas Taleb;
  • every test ends with “And I’m not going to check anything else” or “And I hope nothing else important happens”;
  • if the risk is outside the risk model you can’t see it – great value to add testing into the model;
  • ten kinds of tests:
    • quality factors, creative ideas, states, taxonomies, previous failures
  • coverage decays over time;
  • create and put test ideas on the board to make it visible! (test kanban!);
  • test smarter – perception any change needs full retest so automate more;
  • pair or team exploratory testing offers different people that highlight ideas, bugs, surprises and questions;
  • coach, continue to help other people and keep looking for problems – it’s where software is going and its great to be that safety net.
  • economic benefits of retesting once things have changed.

Agile Testing Days 2013 – Day 0 (Tutorial) Notes

Agile Testing Days in 2013 was the first conference I went to since I started my professional career in the software industry.

For those of you that don’t know Agile Testing Days is an annual conference held in the beautiful city of Potsdam, which is situated around 15 miles south west of Berlin, Germany. It’s considered by many as the best software testing conference in Europe, perhaps on a similar level to EuroSTAR (its location changes every year as far as I know).

So the conference is held Monday to Thursday with the first day being full of tutorial tracks. In this post I will cover the full day tutorial which I attended which was called “Exploratory Testing in Practice” by Matt Heusser and Pete Walen. Being quite new to the whole testing community, I did not know a lot about these 2 guys, but since then I’ve learned an awful lot from them and I am really pleased I picked the tutorial I did back then.

The day started off with people arriving in the tutorial room which was a mess. Well, a mess on purpose. The game we played, alongside the dice game which at first did my head in, was to rearrange the room following different rules and instructions, and also working out who else was in the tutorial (from a professional point of view and also personal). It certainly set the tone up for the day as the rest of it was again full of games and activities rather than listening to Matt and Pete speak about testing, which in my opinion is how a tutorial should be run!

One of the best activities we did during the day was play battleships. If you are not familiar with the game, its purpose is that you play against someone else (or pair with someone else so you play 2v2 like we did) and you have both an attacking and a defensive grid of 10 by 10. In the defensive grid you set up your ships and other “vehicles” (jet fighters, etc.) and in the attacking grid you plot out the hits or misses you’ve got against the team you’re playing against. Each team has one go at a time and the first team to successfully defeat all the other team’s vehicles wins.

The point of the exercise was that one team had to follow a previously planned script. So for example, you could roll the dice to add some randomness into your script, then follow a path to take shots at the opposing team’s chart. The opposition in our case was allowed to take an exploratory approach, so they would be basing their attacks on previously learnt information: if they had hit something on position B8, then they would most likely try to hit something else again on B9 and B7, and possibly on A8 and C8 (you weren’t allowed to position vehicles diagonally). To our surprise, we actually managed to defeat the opposition team even though we had to follow a script. In fairness, our script crashed after a few goes and we had to make some changes – so exploratory was the default winner I guess.
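The exploratory team’s “hit, then probe the neighbours” heuristic can be sketched in a few lines. This is just my own illustration of the idea (the function name and coordinate format are assumptions, not part of the tutorial): after a hit, try the orthogonal neighbours first, since vehicles can’t be placed diagonally.

```python
# Given a hit like "B8", suggest the orthogonal neighbours to try next.
# Rows are letters A-J, columns are numbers 1-10 on a standard 10x10 grid.
def next_shots(hit, size=10):
    rows = "ABCDEFGHIJ"[:size]
    row, col = hit[0], int(hit[1:])
    r = rows.index(row)
    candidates = []
    if col + 1 <= size:
        candidates.append(f"{row}{col + 1}")      # right, e.g. B9
    if col - 1 >= 1:
        candidates.append(f"{row}{col - 1}")      # left, e.g. B7
    if r - 1 >= 0:
        candidates.append(f"{rows[r - 1]}{col}")  # up, e.g. A8
    if r + 1 < size:
        candidates.append(f"{rows[r + 1]}{col}")  # down, e.g. C8
    return candidates

print(next_shots("B8"))  # ['B9', 'B7', 'A8', 'C8']
```

The scripted team, by contrast, would ignore this feedback entirely and fire wherever the script said next, which is exactly the contrast the exercise was built to expose.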

Since then, I have actually used the battleships exercise during an interview, where the interviewee was someone with a lot of experience in software testing but one of the things we wanted to find out was around exploratory testing practices and what this person thought exploratory testing was all about. I truly recommend this as a possible testing kata or to use during an interview (although bear in mind that not all cultures are familiar with this game and it can take some time, even though it covers a lot of ground, from the “is this candidate asking questions before they jump into the puzzle and solve it?” point of view to the more concrete exploration vs. script one).

In the second part of the tutorial we were given a fair bit of information, like:

  • quick attacks that can be used during exploratory testing sessions;
  • Michael Bolton’s stopping heuristics;
  • mind map tools: xmind, mindmup;
    • mind maps allow for visibility and to create conversations;
  • SF D(I)POT mnemonic:
    • Structure: view source, ss/https;
    • Functionality: change colours, drag and drop, search, duplicate;
    • (I)nterface: csv export, credit card, slow connections;
    • Platform: browser;
    • Operations: resize browser, back and forth between pages;
    • Timings: logout timings, real time updates;
    • (User types): user, guest, owner;
  • Cem Kaner’s website;
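The SFDIPOT mnemonic above works well as a charter generator for exploratory sessions. Here is a small sketch of that idea; only the dimension/idea pairs come from my notes, while the `charters` helper and its wording are my own illustration.

```python
# SFDIPOT dimensions and example test ideas, taken from the tutorial notes.
SFDIPOT = {
    "Structure": ["view source", "ss/https"],
    "Functionality": ["change colours", "drag and drop", "search", "duplicate"],
    "Interface": ["csv export", "credit card", "slow connections"],
    "Platform": ["browser"],
    "Operations": ["resize browser", "back and forth between pages"],
    "Timings": ["logout timings", "real time updates"],
    "User types": ["user", "guest", "owner"],
}

def charters(feature):
    """Produce one exploratory charter line per dimension/idea pair."""
    return [f"Explore {feature} with focus on {dim}: {idea}"
            for dim, ideas in SFDIPOT.items()
            for idea in ideas]

for line in charters("the web shop")[:3]:
    print(line)
```

Even a flat list like this makes it obvious which dimensions a session has not touched yet, which pairs nicely with the mind-map visibility point above.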

I also recorded 3 quotes from Matt:

  • “When sequence makes a difference we have to consider it”
  • “You are guaranteed to miss things that are not in the context.”
  • “In a competitive market place you need software that is beautifully crafted, rather than a ‘screw it together, tap tap tap, done!’ kind of thing”

I thoroughly enjoyed the tutorial day, and hopefully did it justice with this set of notes. But most importantly I got to meet 2 guys (amongst many others during the day) who have become people I admire in the software testing community.