
Agile Testing Days 2014

Last November I attended Agile Testing Days 2014. If you have read my previous posts you will know this was my second time at the conference.

It was great being back in Potsdam, Germany, and seeing many familiar faces, including some from across the pond, whilst also meeting new ones from all over the place.

This time I decided to pace myself in terms of the number of talks I attended, the tweets I posted and the notes I took. I wanted to enjoy more of the conversations and take in less information, hoping I would retain it for longer and use it more consistently afterwards.

Below are the notes I took during the 4 days, which started off with a tutorial day on a Monday.

“Technical testing in an agile environment” by Alan “The evil tester” Richardson

  • Technical testing:
    • reminder to keep going deeper
    • tool augmentation
    • technical details will:
      • inspire more testing
      • identify more risks
      • identify different problems
    • not limit our testing to acceptance criteria
  • MORIM:
    • Model: understand different viewpoints
    • Observe: corroborate or invalidate the model
    • Reflect: find gaps, lack of depth, derive intent
    • Interrogate: focussed, deep dive observation with intent
    • Manipulate: hypothesis exploration and “how we do stuff”
  • tool augmentation:
    • is not automation, it uses automation
    • passively observe, maintain history of observations
    • alert specific conditions
    • observe the unobserved, and interrogate the inaccessible
    • help model, reflect and manipulate
    • never tools to control, tools to augment
  • go beyond the surface structure
    • transformational grammar
    • surface and deep structure
    • chomsky
    • multiple surface structures
    • single deep structure
      • filtered, biased, distorted -> surface structure
    • questions operate as tools to investigate people’s surface-to-deep structure mappings
  • how to do technical testing:
    • identify tools
    • questioning systems at different surface levels
    • learning system structure technology
    • model system surface structures
    • observe system surface structures
  • automation? sure if you have time;
  • “EditThisCookie” plugin;
  • fiddler and use of its breakpoints feature;
  • burpsuite;
  • “The Web Application Hacker’s Handbook”;
  • “The tangled web: a guide to securing modern web applications”;
  • retrospectives:
    • don’t just pat each other on the back
    • raise process issues that impact
    • agree what to do about them
    • treat broken windows
    • you might have to be mr. nag and mrs. nasty
  • standups:
    • pay attention to changes
    • describe in value terms
    • seek help, pairing
  • acceptance tests:
    • abstraction layers
    • re-use abstraction layers for ad-hoc automation as well as acceptance tests (see the sketch after this list)
    • seek to understand technicalities as well as domain
    • pair
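
To make that last point concrete, here is a rough sketch in Python of what such a shared abstraction layer could look like. Everything in it (the AccountsApi class, its methods, the URL) is hypothetical and invented for illustration; the point is simply that an acceptance test and a throwaway exploration script can drive the same layer.

# A hypothetical domain-level abstraction layer. In a real suite the
# methods would drive the UI or issue HTTP calls; here they are stubbed
# with an in-memory set so the sketch runs as-is.
class AccountsApi:
    def __init__(self, base_url):
        self.base_url = base_url
        self._accounts = set()  # stand-in for the real system

    def create_account(self, name):
        self._accounts.add(name)

    def account_exists(self, name):
        return name in self._accounts

# Re-used by an acceptance test...
def test_new_account_is_visible():
    api = AccountsApi("https://app.example.test")
    api.create_account("alice")
    assert api.account_exists("alice")

# ...and by ad-hoc, tool-assisted exploration, e.g. from a REPL:
#   api = AccountsApi("https://app.example.test")
#   for i in range(100):
#       api.create_account(f"user{i}")  # quick bulk setup for a session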

I really enjoyed this tutorial day – it was probably the first tutorial where we spent the majority of our time doing what we do on a day-to-day basis, which is testing software. Yes, we were stopped occasionally and had some time limits, but those breaks were used very effectively, not just by Alan but by everyone in the room, as we shared our experiences and the different tools we use in our technical testing roles. The only improvement I would suggest is more pairing, with different pairs for each testing exercise, as it could spark more conversations and set the tone for people to carry on pairing during the breaks and the rest of the conference. A big thank you to Alan – this was certainly the most useful tutorial I have been to so far in my career 🙂

Day 1

“Welcome to the future! Preparing for our agile testing journeys” by Lisa Crispin and Janet Gregory

  • Preparing for the future, or the agile future…;
  • what are the challenges, what can testers do? How can we change conversations?;
  • ability to broaden t-shape skills (breadth and depth);
  • ability to learn, become a t-shaped tester, cognitive learning skills, take charge of your career;
  • customers assume that we know what they want and they want us to deliver exactly what they are thinking;
  • sometimes it’s better to train people than automate certain processes, ask good questions, walk people through;
  • borrow from other disciplines, like business analysis (the 7 product dimensions);
  • change it:
    • from counting bugs
    • from counting metrics that do not matter
  • we have to start thinking about all kinds of risks – call for more risk assessment?;
  • models can help us choose how to attack a problem;
  • serious play is a great way to learn about new things (play, observe, innovate);
  • inspect and adapt: try again, scrap it and try again – what’s next for you?

“A poet’s guide to acceptance testing” by George Dinwiddie

  • We want tests to last over time;
  • not just automated tests, the way you express yourself counts;
  • you need to start thinking about the theme, then start analysing each part;
  • without context words can be confusing;
  • “Given Mary had a little lamb, When Mary went to school, Then the lamb went to school”;
  • purpose is to be picky so words can stand the test of time;
  • some words, like “everywhere”, can be quite hard to test against – use examples that cover a good range of options;
  • when a scenario has too much setup that’s a clue – you may not need all that;
  • why do we write tests that ask what we should do? We should ask what the system should do;
  • the point of this talk is that words matter – pay attention to them.

“From good to great: Combining session and thread-based test management into xBTM” by Michael Albrecht

  • SBTM/TBTM/xBTM: models, mindmaps and automatic reporting;
  • xBTM in a nutshell is both session and thread based test management;
  • James Bach’s exploratory testing spectrum;
  • session: test charter – what and what problems;
    • produce a session report (time, bugs, setup, defects, issues and notes) – see the sketch after this list
  • PROOF – past, results, obstacles, outlook and feelings;
    • past and results are very similar in the report
  • a typical ratio for session-based test management is 15 minutes of test design to 90 minutes of test execution (or 10 to 60 minutes);
    • fixed length
    • planned (charters, key areas)
    • reviewable (reports)
  • TBTM – threads are a test activity or test idea;
  • use mindmaps to generate threads;
  • to do list, no timeboxing, working in parallel, mind maps;
  • xBTM – models for test design (like pointing to a country on a map, for example);
  • the “yEd” diagramming tool;
  • using a mindmap, navigate through the flow of the system to guide your session/test design.
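
As an aside, the session report fields mentioned above map naturally onto a small data structure. This is only my own sketch in Python – the field names follow my notes, not any official SBTM format:

from dataclasses import dataclass, field

@dataclass
class SessionReport:
    charter: str                # what to test and what problems to look for
    duration_minutes: int       # sessions are fixed-length in SBTM
    setup: str = ""
    bugs: list = field(default_factory=list)
    issues: list = field(default_factory=list)
    notes: list = field(default_factory=list)

    def proof_debrief(self):
        # PROOF: Past, Results, Obstacles, Outlook, Feelings.
        # Outlook and feelings come out of the debrief conversation itself.
        return {
            "past": self.notes,
            "results": self.bugs,
            "obstacles": self.issues,
            "outlook": [],
            "feelings": [],
        }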

“Strategy testing // building a product users want” by Roman Pichler

  • What a boss wants vs. what programmers want – programmers just want to give it a go, build a prototype and see if it flies;
  • vision – your goal;
  • strategy – path to the goal;
  • details – steps;
  • who are the users and who are the customers?
  • “find an itch that’s worth scratching”;
  • test your strategy – areas of uncertainty, risk assessment;
  • go and talk to your users about the experiences and how they use (your) product today;
  • create a failure tolerant environment.

“The antimatter principle” by Bob Marshall

  • Golden rule – treat as you want to be treated;
  • platinum rule – treat people as they want to be treated;
  • the antimatter principle – antimatter is the most expensive/rare substance known to man;
  • attend to people’s needs;
  • why do we do software development? Why do we do testing?
    • to attend to people’s needs
  • autonomy, mastery and purpose – Dan Pink;
  • “Nonviolent Communication” by Marshall Rosenberg;
  • our default mode when we are “not thinking about anything” is to think about ourselves in relation to others – neuroscience research;
  • theory x or theory y organisation.

“Testing the untestable” by Peter Thomas

  • You don’t know what you don’t know;
  • Dan North’s “deliberate discovery”;
  • blue green deployments (continuous delivery);
  • the third way;
  • sometimes “The best testers… are your users”;
  • monitoring over testing…;
  • look for abnormal patterns in your data, usage, etc.

“The pillars of testing” by David Evans

  • Model: a (sub) set of things to create or improve within a development or testing process;
  • “all models are wrong, some of them are useful”;
  • Confidence, above safety and courage;
  • stronger evidence, better test design, greater coverage, faster feedback;
  • collaboration and communication, business engagement, team structures, test techniques, skills and experience, effective automation, application testability, configuration management;
  • there’s also the bottom layer which represents the strong base;
  • then there’s the foundation layer;
  • this model can be used to discover or apply root cause analysis;
  • it can also be used to assess, survey teams and organisations, rate the perceived success and importance of each element and look for hotspots and for variances;
  • confidence is the balance of safety and courage.

“Don’t put me in a box” by Anthony Marcano

  • Most people tell you what they are instead of what they do;
  • categorisation still defines what we do – for example, “mum” and “dad” duties;
  • quality comes from people and not from processes;
  • sharing the responsibility within the team doesn’t stop people having expertise or being the expert/reference point, but we may not need him/her all the time, we can all do it.

“Pull requests and testing can be friends” by Alan Parkinson

  • Use the files changed in pull requests to guide your charters and your exploratory testing (see the sketch after this list);
  • ask questions in the pull requests comments feature to learn about risks;
  • learn from history;
  • see who contributed what.
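
A rough sketch of that first idea in Python, using git’s own diff output – the charter wording is obviously just a placeholder of mine:

import subprocess

def changed_files(base, head):
    """List files touched between a PR's base and head refs."""
    result = subprocess.run(
        ["git", "diff", "--name-only", f"{base}...{head}"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.splitlines()

# Turn each changed file into a seed for an exploratory charter.
for path in changed_files("main", "HEAD"):
    print(f"Charter idea: explore behaviour around {path}")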

“Lateral and critical thinking for testers” by Dan Ashby

  • Left and right side of the brain;
  • left – critical thinking;
  • when we get the information upfront it’s a lot easier to ask questions
    • it’s hard to use critical thinking alone!
  • lateral thinking is when thinking leads the information;
  • “Lateral thinking” by Ed de Bono;
  • Lateral = side thinking.

“Communication: What are you thinking about?” by Shachar Shiff

  • Bad communication causes failure;
  • “Visual aids improve communication”;

“Helping testers add value to Agile projects” by Alan “The evil tester” Richardson

  • Testers adapt different filters to different systems;
  • waterfall projects:
    • removed waste
    • responding to need, not want
    • exploring more
    • taking responsibility for my testing rather than conforming to the process
  • view it as systems rather than agile;
  • testing needs to remain viable and needs to add value;
  • steal from other disciplines (systems thinking, cybernetics, etc.);
  • ownership for testing;
  • exploration beyond acceptance criteria;
  • we need strategies that cope with things coming fast and things getting delayed.

In summary, Agile Testing Days was yet again a great conference and a great week of quality learning time with people I already admired and new people I met and came to admire. On Tuesday night I saw Matt Heusser receive his Most Influential Agile Testing Person award for 2014 – which made me happy, as I had voted for him – and the Brazilian national team was awarded 1st place in the Software Testing World Cup. Most of the other nights were spent in the bar chatting with people from all over, and I also helped out at the Games Night on Wednesday, where I was (as you would expect) one of the people co-ordinating the SET game table. I also had the chance to act as the coach in the infamous Pen game for two of my Redgate colleagues. What may seem off topic to some people, and hopefully on topic for others: we also went to one of the “lock rooms” in Berlin, where you have to solve a variety of puzzles in order to get out of a locked room. That opened my eyes to the variety of skills testers have, and it was great to meet some people further and chat some more in a slightly more relaxed environment (except for the fact that we were locked in a room!).

Hopefully see you at Agile Testing Days 2015!

CAST 2014

Back in August 2014 I was fortunate enough to attend CAST 2014.

CAST, which is organised by the AST (Association for Software Testing), is an annual software testing conference bringing attendees and speakers from all over the world. This year’s edition was held in New York, in one of New York University’s buildings overlooking 5th Avenue and Washington Square Park. The location was as convenient as it would get for me, considering I would be flying to another continent – it only takes 6 hours (give or take) to fly across the pond and land in New York.


I arrived in New York on the Sunday, a day before the conference started, but a little too late in the afternoon to take part in the “Scavenger’s Hunt” that took place around the area.

Monday was a tutorial day and I took part in “The art and science of test heuristics”, led by Fiona Charles; I cover its main points below.

On Monday evening I went down to the bar where a few attendees and speakers were gathering and played a few testing games; the one I particularly took a liking to was SET. SET is a pattern recognition game – it’s brilliant for getting your brain working again, and one of my colleagues has even suggested playing it before the start of every meeting to make sure everyone is in gear and ready to work.

Tuesday and Wednesday were your standard conference days, with keynotes and talks given by speakers, and I surely made the most of them. Apart from the usual conversations I was also lucky enough to take part in a testing challenge on Tuesday evening, where we were asked to test a test execution tool developed by one of the challenge sponsors. I was “recruited” via Twitter, where several people were forming teams, and ended up in an 8-person team. It was great to pair up with someone I had never met before, and I certainly got a few tips from him (Michael Corum, @TNRidgeback). The results were announced the following morning: our test report was considered the best alongside another team’s, but the others got the prize in the end, as they had fewer members and produced a slightly better formatted document.

On Wednesday evening a few of us went out for a meal after the conference had finished, which was a really good chance to meet even more people from the community and share experiences – it’s great to find out that other people have the same professional struggles as you, even though they are separated by thousands of miles. It’s also good to know that you’re at least on the right path, and in my case actually quite fortunate to work somewhere as good as Red Gate. The evening was rounded off with a few drinks and yet more testing games, in particular the Pen game, which Michael Corum and I paired on to try and solve the puzzle. We eventually got it, after quite a few attempts and some excellent coaching from Stephen Blower.

Overall this was an amazing experience, and the people I met were once again one of the highlights. Of course, listening to the likes of Michael Bolton, James Bach and Fiona Charles speak was really enlightening, and the amount of learning that took place over the 3 days was great. It was also great to see familiar faces like Pete Walen, Matt Heusser, Jean-Paul Varwijk and others I had met at Agile Testing Days.

So below you can find my notes about the keynotes, talks and tutorial I attended. Hopefully, they are of some use to you.

“The art and science of test heuristics” Tutorial by Fiona Charles

  • Started off with a knot problem where each group (of approximately 8 people) would form a knot with their hands
    • it led groups to understand the problem, and in turn question the requirements
    • refactor our initial model for solving the problem
    • control the scope – we had too many people in the group for example
  • sometimes you have to abandon certain models;
  • children games are great to understand heuristics and models;
  • prepare to re-model
    • re-model -> embrace chaos
    • challenge the assumptions regularly
    • focus and de-focus
  • how would you go about this?
    • google it!
    • watch your initial assumptions
    • over think it!
    • previous experience
    • instructions can be useful
    • look at patterns
    • trial and error
    • multiple oracles
  • reverse strategy – start with easy first and vice-versa;
  • throw away work and re-do;
  • limit on permutations – find a way to cut through;
  • step away from problems and take a break;
  • switch technique;
  • diversity on a team can be good but also cause a problem;
  • being lucky – surely you must find your own luck;
  • problem could be bigger than you thought;
  • rule: use machines for what they are good at;
  • beer fridge problem (https://www.youtube.com/watch?v=t41wNkGvJ9k)
    • previous experience
    • tech limitations
    • personas (diversity/multiple people)
    • (not) follow instructions
    • act on feedback
    • varying input
    • curiosity
    • consider audience
    • visually appealing (objects)
    • continuous learning
    • context
  • you don’t stick quality in;
  • if you don’t pass one test criterion, don’t go on to the next one;
  • tactful communication
  • follow the money
  • danger in following heuristics – change them?
  • art – creativity coming up with heuristics;
  • science – analysing/measuring those heuristics.

I really enjoyed the tutorial day. It was a simple start and it was great to hear what everyone thought heuristics meant in their own context. To me, heuristics are rules of thumb: something that will help steer me in a certain direction to try and achieve a certain goal or prove something. In a way, this gives me the expected behaviour, but it also allows me to think about what may or may not happen instead and lets me follow a variety of paths. The variety of games and exercises done throughout the day was great, and Fiona had great input, with all her knowledge, on things we may have been missing.

“Testing is not test cases (towards a performance culture)” by James Bach (keynote)

You can watch the keynote here and the slides are here. Below are some of the notes I scribbled down during the talk.

  • Analytical lag time – the time between running experiments and the results coming up/being shown;
  • “Testing is not test cases”
  • use charters for testing – they will help you;
  • there’s no test for “it works”;
  • PerlClip – James Bach’s tool to generate test data;
  • analytical testing is different to analysis of output;
  • every act of testing involves many layers (slide #3);
  • fresh-eyes-find-failure heuristic (slide #5);
  • importance of precise words as testers communicate;
  • highlight text if you’re not sure it’s a bug and you need clarification from a developer or someone else;
  • “I don’t yet see a problem here” rather than say “It passes”;
  • last thing a tester should do is just follow instructions, it’s not intellectual enough;
  • a test case is a set of ideas, instructions, or data for testing some part of a product in some way;
  • the phrase “test automation” is toxic – testing can’t be automated, just as test cases don’t contain testing;
  • you don’t ask developers whether they are manual or automated programmers, so why say it of testers;
  • managing tacit knowledge – real testing ability cannot be spoken, it’s tacit knowledge;
  • testing cannot be encoded (slide #12);
  • struggle through practice and examples, not certification;
  • checking is part of testing, just as tyres are part of a car – part, but not everything; you may not need much testing on your project, and test exploration or design work can lead to valuable work – you may not need deep testing to happen again;
  • testing as a performance – like an actor in a play;

“My boss would never go for that” by Alessandra Moreira (talk)

  • Testing and the art of persuasion;
  • ideal world != real world: managers don’t always know about testing or even the skills a tester has;
  • persuasion: process by which a person’s attitudes or behaviour are influenced by communications from other people;
  • persuasion != coercion; manipulation; one way street;
  • persuasion = influence; guidance; negotiation;
  • present your evidence in the style your audience prefers to receive it – e.g. visually;
  • “grow your credibility as a craftsman” James Bach;
  • know your craft:
    • build credibility
    • part of being a good tester
    • how do you know your way is better
    • keep learning
  • build a solid case:
    • arguing skills can be learned
    • consider your boss’ point of view and business priorities
    • gather supporting evidence
    • other possible questions?
  • communicate clearly:
    • use your manager’s communication style
    • present compelling evidence
    • link your priority to your boss’
  • compromise:
    • listen and include different perspectives
    • prepare small compromise in advance
    • start small
    • be prepared to incorporate new ideas
  • things to remember:
    • resisting change is natural
    • most managers want you to succeed
    • be patient
    • find a support system
  • common mistakes:
    • upfront hard-sell
    • too many buzzwords
    • assuming it is a one shot
  • so:
    • figure out why, build a sound case, know your craft, prepare, compromise and be patient.

“Scaling up with embedded testing” by Trish Khoo (keynote)

  • The more we could do at the start, the more challenging testing gets because not many obvious bugs are cropping up;
  • developer find and fix, tester verifies expectations;
  • [at Red Gate, at least in my current team, I consider myself quite lucky regarding this as developers are often asking “How are we going to test this?”];
  • take the tester out of the feedback loop between dev/tester/product owner and the feedback loop becomes a lot quicker;
    • Pivotal
      • “we don’t actually have a QA department” (Elizabeth Hendrickson)
      • developers doing exploratory testing
      • eliminating the time spent explaining what has been done
    • Microsoft
      • “it should be hard to find bugs” (Alan Page)
      • ultimate goal in a software engineering team is to engineer software
    • Google
      • “quality is team owned not test or QA owned” (Michael Bachman)
  • raising the bar:
    • how can we as testers improve, and start using our now-free time to provide value?
    • you can’t expect developers to learn testing practices overnight;
  • learn about human computer interaction, statistics and modelling people.

“Psychology and Engineering of Testing” by Ilari Aegerter and Jan Eumann (slides) (talk)

  • Testers and programmers mutually respect each other – both parties bring a good variety of skills;
  • PTE Agile Testing Manifesto
  • pair on tasks – pair on exploratory testing with developers;
  • educate the team about testing – workshops, dojos, katas, etc;
  • skills needed to make this happen:
    • technical awareness (coding, reading code, database expertise, test environments, service configuration, etc.)
    • domain knowledge

“Paint like an engineer – skills in testing” by Alexandra Casapu (talk)

  • Best way to work on skills is to do something, get feedback, improve it and keep doing this;
  • discuss the method, and apply engineering methods;
  • taxonomy of skills:
    • heuristic does not guarantee a solution
    • it may contradict other heuristics
    • it reduces the search time for solving a problem
    • its acceptance depends on the immediate context instead of an absolute standard
  • do a “my personal testing skills”
    • categories: human-human interaction, attitude determining, rule of thumb, information visualisation, risk-controlling
  • use quiet times (Christmas, other people’s vacations, etc.) to learn or do new stuff;
  • skills you’re not using, you will lose;
  • skills are a procrustean bed (procrustean myth);
  • not just learning a skill but interaction between these skills;
  • nurturing skills – choose some areas; learn the cues that lead to mistakes; recognise when you need help.

“Patterns of automation” by Jeff Morgan, founder of leandog.com (talk)

  • Specification by example:
    • specification – user story with acceptance criteria
    • implementation – code with unit tests
    • verification – automated tests
    • duplication in the above points
  • PageObject pattern:
    • web service, web apps, mobile application, data warehousing app
    • stops the need to go and fix tests that have broken in different places – makes it easy to fix in just one place
  • Default Data pattern:
    • use a Ruby gem to fill in the data we don’t care about
    • just testing for one thing – focus on the things that really matter and remove the noise (see the sketch after this list)
  • Test Data management
  • Route Navigation
    • we do this all the time to achieve something
    • the way we navigate may change things
    • pattern helps us navigate through different paths to reach the same point (i.e. page in a website(s))
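
Jeff’s examples were in the Ruby world; below is my own rough Python/Selenium rendering of the PageObject and Default Data ideas. The page route, element IDs and default values are all invented for the sketch.

from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """PageObject: locators and interactions live in one place, so a UI
    change is fixed here once instead of in every broken test."""

    def __init__(self, driver):
        self.driver = driver

    def open(self, base_url):
        self.driver.get(f"{base_url}/login")  # invented route
        return self

    def log_in(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()

# Default Data: the test spells out only the field it cares about;
# the rest of the record comes from defaults, removing the noise.
DEFAULTS = {"username": "default_user", "password": "default_pass"}

def login_data(**overrides):
    return {**DEFAULTS, **overrides}

def test_login_with_very_long_username():
    driver = webdriver.Firefox()
    try:
        data = login_data(username="a" * 255)  # the one thing under test
        LoginPage(driver).open("https://app.example.test").log_in(**data)
    finally:
        driver.quit()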

Day 2

“STEM to STEAM – Advocacy to Curricula” by Carol Strohecker (keynote)

  • Science, technology, engineering, (arts), math;
  • network for sciences, engineering, art and design – spin off international network;
  • “Need to explore the ontology of software testing” James Bach;
  • “BDD is a design methodology, not a testing methodology”;
  • observing – requires additional patience, concentration and curiosity;
  • imaging;
  • abstracting – focusing, simplifying and grasping essence;
  • innovative thinking;
  • pattern recognition – perceiving information;
  • pattern forming – combining elements/operations in a consistent way that produces a (repetitive) form;
  • analogising – recognising functional likeness between two or more otherwise unlike things;
  • body or kinaesthetic thinking – sensation of muscle, sinew and skin; sensations of body movement, body tensions, body balance (proprioception);
  • empathising – putting yourself in another’s place, getting under their skin, standing in their shoes;
  • new tools:
    • modelling: represent something
    • playing: practice
    • transforming: “segueing” from one of the “tools” to the other
    • synthesising
  • intuition, inter-disciplinary work;
  • “Inventing Kindergarten” by Norman Brosterman, on Friedrich Froebel’s ideas

“Testing in an agile context” by Henrik Andersson (talk)

  • Definition of checking (Michael Bolton);
  • confirming existing beliefs, check that the code hasn’t been broken;
  • testing != checking, explain the differences between them;
  • removing checking from programmers can extend the feedback loop and decrease value;
  • “Testing is focusing on exploration, discovery, investigation and learning”;
  • “See new information, see things differently, driven by questions that haven’t been answered before”;
  • pair on:
    • product owner on design of acceptance test
    • on doing exploratory testing session
    • understanding the customer
    • programmer on checking
    • programmer to understand the program
  • be a coach on testing for the whole team to provide other perspectives;
  • pick up, try and learn new testing stuff otherwise you will run out of wisdom to share;
  • do what you can to be valuable; if you can’t, do something else or move on;
  • “I’m here to make you look good” – change this to “make us”;
  • the A-Team works because it has one “crazy” guy – be that guy and test boundaries;
  • session-based test management
    • way to manage exploratory testing from Jon and James Bach
    • charters
  • tester velocity:
    • number of sessions available over the sprint
    • planned out of office
    • planned other things
    • actual available number of sessions (a worked example follows this list)
  • make your stand-up contribution useful, use a scrumboard for example;
  • metrics on test time used – bug investigation, set up, learning performance, integration, “testing”, etc.
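
A worked example of the tester velocity arithmetic, with numbers I have made up purely for illustration:

# Illustrative numbers only.
sprint_days = 10
sessions_per_day = 3                     # e.g. three 90-minute sessions

gross = sprint_days * sessions_per_day   # 30 sessions in theory
out_of_office = 2 * sessions_per_day     # 2 days off -> 6 sessions
other_commitments = 4                    # meetings, releases, support

available = gross - out_of_office - other_commitments  # 30 - 6 - 4 = 20
print(f"Actual available sessions this sprint: {available}")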

“There was not a breach, there was a blog” by Ben Simo (keynote)

You should really watch the full keynote here. My notes on this keynote are very short, mainly because Ben, as a great speaker, constantly held everyone’s attention, and his content clearly shows the passion for the craft of testing that we should all aim for.

  • Security wall can only be there if there is security at every point;
  • the system under scrutiny was already exposing information through usernames – potentially not a problem, but for healthcare.gov it definitely was – same with email addresses;
  • explore in browser developer tools – loads of information could be leaking through web requests (see the sketch after this list);
  • watch out for security reset questions;
  • a REST service was outputting the GUID (for resetting a password via email) in the browser’s web request;
  • do not include user and GUID reset code in the same email;
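
In the same spirit, this is the kind of tool-assisted check Ben’s findings suggest: request a password reset and scan the response body for things that should never be there. The URL, payload and field names below are hypothetical.

import requests

SUSPICIOUS = ("guid", "reset_code", "token", "username", "email")

def check_reset_response(url, email):
    resp = requests.post(url, json={"email": email}, timeout=10)
    leaks = [word for word in SUSPICIOUS if word in resp.text.lower()]
    if leaks:
        print(f"Possible leak in response body: {leaks}")
    return resp

# check_reset_response("https://app.example.test/api/reset", "a@example.test")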

“Thinking critically about numbers: defence against the dark arts” by Michael Bolton and Laurent Bossavit (workshop)

A workshop from Michael Bolton doesn’t come around all that often, and to nobody’s surprise the room reached capacity fairly quickly, so it was just a matter of time before it all kicked off.

Here are the key points of the workshop:

  • We were asked to investigate a number of different claims;
  • we had to note what it would take to change our mind, whether we thought the claim was true or false;
  • heuristics and tactics used:
    • locate the primary source
    • finding cited work quickly
    • image search for variants of the same chart
    • search by date to find when a claim was first made
    • original data allows you to plot your own chart
    • locate other claims about the same phenomenon
    • search for quotations
  • two tracks about the world
    • claim
      • construct
      • observation
      • metric
      • generalisation
      • representation
      • interface
    • claimant
      • existing bias
      • sources
      • trustworthiness
      • motivation

“Software Testing state of the practice (and art! and science!)” by Matt Heusser (keynote)

  • Have we swung the pendulum too far to programming with agile?;
  • automation and technology is increasing across society;
  • intractable transience of test;
  • we have a high turnover in test, perhaps that’s ok? perhaps we need to plan accordingly;
  • fragmentation within testing – different keynotes talking about testing as different things;
  • we use words we can’t agree on;
  • result is an ignorance of test;
  • healthcare.gov was probably tested very well but they were “told off”;
  • social structure makes it hard to make corrections;
  • in software development, agile (scrum, careful with assumptions) is winning;
  • scrum was created because of constant change of requirements, vision, etc;
  • xprogramming – need for shorter releases;
  • scrum says that testing should be done by embedded members of the (development) team;
  • how to win big? save the scrum;
  • system thinking skills applied to getting rid of queues and wait states;
  • honesty (smart empowered testers can do amazing things);
  • experts on context

CAST was an amazing conference, one of the highlights of my year. I hope to attend again at some point. Hopefully you have got something out of my notes. The call for participation is now open and you can check it here.

Agile Testing Days 2013 – Day 2 Talks Notes

On the second day of the conference, already feeling quite exhausted, I attended a couple of talks in the morning and then sat in the main room during the afternoon to see what the consensus talks were about. Here are my notes for the day.

“BDD – Live coding sessions” by Carlos Ble

Carlos’ idea for this session was to code a web app in real time so that the audience could participate by giving him feedback and coming up with requirements. He would then use this to write up his scenarios before translating them into Cucumber. Unfortunately there were too many problems at the start of the session to do with the network and his server, so we only had limited time. Carlos still managed to explain what he intended to do, and as easy as it would be to say he could have prepared better, those things can happen (live demo gods) and it was brave to attempt something like that in front of a packed room. I caught up with Carlos in the evening as we were both having dinner in the same restaurant; we had a little chat about BDD and his experiences with it, and I got plenty of resources to go and look at, so it was definitely worth attending. Also, my brain got a bit of a break 🙂

“Myths of exploratory testing” by Luis Fraile and Jose Aracil

This was one of the most discussed talks of the conference (at least on Twitter), partly because of some of the very controversial claims the presenters made during the talk, but also because they seemed to have a slightly different idea of what exploratory testing is than most people in the audience. Despite this, here are their key points.

  • when you explore it you want to come back;
  • keys to success: inspect and adapt, be creative! take advantage of your team and skill set, additional to other testing, quickly find defects, add value to your customer and test early/test often;
  • myth 1: same as ad-hoc testing
    • must be planned and documented
    • you must know: what has been tested; when it was tested; what defects were logged
    • some ideas: testing tours by James Whittaker (!); session-based from James Bach; your own method
  • myth 2: can’t be measured
    • multiple measurement techniques: SBTM; amount of logged defects; defects vs. user story complexity
    • you must be creative
    • warning.. warning! don’t be fooled by metrics
  • myth 3: endless
    • difficult to focus on long tasks (> 25 mins)
    • endless = useless
    • must focus on specifics; requirements, problems and complex parts
    • stay focused for burst periods (25 mins)
    • pomodoro testing
  • myth 4: can’t reproduce defects
    • how do you reproduce a defect: be an explorer, like David Livingstone
    • toolset: record video/audio; screen capture; analog recording
  • myth 5: only for agile teams
    • inspect and adapt
    • insanity is doing the same thing over and over again and expecting different results
    • look for new ways of testing
    • empower your team by allowing creativity
    • do you trust your team?
  • myth 6: not documented
    • tester “lonely planet”: user manual; online help (F1 tour); help from third parties
    • alternative tester: goes outside the tour (cancelling processes halfway, using undo, doing things twice); use uncommonly used functionality or processes; always with an objective in mind
    • second visit: you need pictures/notes

Consensus Talk

“Organization, Roles and Responsibilities of Testers and Test Managers on Agile Projects” by Dr. Jennifer Blechar

  • 2 projects which were virtually the same, yet they evolved differently – here are the learning points:
    • there’s no I in team:
      • testers must be part of the team
      • test managers outside of teams useful to co-ordinate across teams
      • consider dedicated technical testers
      • consider dedicated test automation experts
      • communication essential
    • use the right tool for the job:
      • everyone in the team must use the same tools
      • user stories in the tool – linked to tests created by testers, users, devs, etc.
      • careful evaluation of tools for test automation prior to implementation
    • never underestimate the value of socialising
      • people are much more likely to “go along” with your ideas if they know you
      • make time to get to know key stakeholders in the project – this includes the customer as well as developers
      • create opportunities for socialising
    • get everyone on board – and keep them on board!
      • the test plan is useless if only the testers agree with it – everyone needs to buy in to the test plan and be committed to their role in the process
      • relationships need to be constantly maintained
      • don’t be afraid to change course if needed – use lessons learned
    • reporting
      • customers and other key stakeholders need to be aware of the progress at all times – consider “light” but often reporting
      • reporting is also motivational and useful to get and keep everyone on-board
  • five factors identified as influencing the success of agile testing effort
  • additional factors likely, most important factor is to be agile!

Agile Testing & BDD eXchange 2013 – Notes

Last year, in November (2013), I went to the Agile Testing & BDD eXchange that was held in London and hosted by SkillsMatter.

Here are the notes and main takeaways I took/found during the day, organised by each of the 6 talks.

 

“How I learned to stop worrying and love flexible scope” by Gojko Adzic and Christian Hassa

  • Get organisations to benefit more from iterative delivery, allowing flexibility. Flexibility is everywhere apart from software – e.g. plane tickets, where you can choose to pay more to be allowed to change the details of the flight at a later date.
  • “Software people” tend to complain about not having enough time; money is often perceived as the problem when in fact available time is.
  • It’s important to change the mental model around flexibility and starting to engage with the business – this will avoid peaks in the backlog.
  • We need to change the ways to write user stories in order to avoid them becoming issues.
  • Book recommendation: “Adapt: Why Success Always Starts with Failure” by Tim Harford.
  • Palchinsky principles:
    • first thing is to plan for variation: seek out new ideas and try new things;
    • survivability: when trying something new, do it on a scale where failure is survivable;
    • selection: seek out feedback and learn from your mistakes as you go along.
  • The role of user stories is survivability; we have to get the most out of them and make them survive.

Gojko then talked a bit about road maps and related them back to the survivability of user stories. He explained that, when we talk to “software people”, road maps are not what we normally think road maps are. For those people, road maps are just roads, not maps. And actually they are not even roads – you go in one way and you come out the other, so they are more tunnels than roads. Features are delivered linearly, and for road maps to stay truthful to their meaning we need different routes, where stories are survivable experiments. User stories would then become our turn-by-turn navigation, instead of just a destination. The sat nav analogy is quite apt here: the more sources of feedback we have (GPS signal, traffic information, peer-to-peer information, etc.), the better the navigation is.

User story example:

As a sales manager
In order to monitor inventory
I want […] report

“In order to monitor inventory” is the part we have to change – there is no space for feedback here, and there needs to be, otherwise we don’t know if we have made the right “turn”. Once we have a change in our story, we have a success criterion.

Impact mapping:


 

  • Slice your backlog for impacts (see the sketch after this list)
  • How much would you pay for information on whether this thing you are working on is giving you what you want? We can probably change how budgeting works.
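
For what it’s worth, an impact map has a simple why/who/how/what shape, so it can even be sketched as a plain data structure. The content below is invented for illustration:

# Why -> Who -> How -> What, per Gojko Adzic's impact mapping.
impact_map = {
    "why": "grow monthly active users by 10%",   # the measurable goal
    "who": {
        "new visitors": {                        # an actor
            "sign up more easily": [             # an impact (behaviour change)
                "social login",                  # deliverables to try
                "shorter signup form",
            ],
        },
        "existing users": {
            "invite their friends": ["referral links"],
        },
    },
}

# Slicing the backlog by impact = picking one why->who->how branch
# and treating its deliverables as survivable experiments.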

 

“Whose truth is it anyway?” by Matt Wynne and Seb Rose

I was really looking forward to this talk: I had met and listened to Seb Rose only a few weeks before, at Agile Testing Days 2013 in Potsdam, Germany, and a few months earlier I had started reading Matt’s “The Cucumber Book”.

Here are some of the highlights:

  • Differences between technical and business people demonstrate the value of establishing trust between everyone.
  • Iterate on a feature and show the value of it to the business – new features or changes are feedback from the business on their own.
  • Writing BDD scenarios will help you keep in the problem domain.
  • Most people that are excited by BDD and start practising it are coming from the point of view of test “automation”, which makes it a problem.
  • Declarative vs. imperative examples (see the sketch after this list).
  • web_steps are dead.
  • Ubiquitous language is one of the biggest BDD selling points.
  • If we can understand the problem, the solution should take care of itself – apply problem solving skills.
  • “Whose domain is it anyway” Dan North’s blog post.
  • In order to write BDD scenarios you have to find a compromise between everyone in the team.
  • The need for detail might indicate a lack of trust.
  • The level of detail is hard to define – it depends where you are in the pipeline (knowing all the details of the system, or knowing nothing) – so it’s important to establish the compromise.
  • Business people do care about the behaviour of the system.
  • Easy is different than simple:
    • Understanding this, and getting there, helps you understand the project and your tests;
    • Time saved at the beginning of the project is the most important.
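
To illustrate the declarative vs. imperative point from the list above with my own (invented) example, here is the same login check written at two levels of abstraction; `driver` and `app` stand for test fixtures you would have to provide.

from selenium.webdriver.common.by import By

# Imperative: the test walks through UI mechanics step by step.
def test_login_imperative(driver):
    driver.get("https://app.example.test/login")
    driver.find_element(By.ID, "username").send_keys("alice")
    driver.find_element(By.ID, "password").send_keys("s3cret")
    driver.find_element(By.ID, "submit").click()
    assert "Welcome" in driver.page_source

# Declarative: the test states intent in the problem domain and pushes
# the mechanics down a layer (step definitions, in Cucumber terms).
def test_login_declarative(app):
    app.log_in_as("alice")
    assert app.sees_dashboard()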

The testing iceberg:


 

“What do testers do?” by Tony Bruce

Tony is a good friend of mine, someone that I also met at my first ever conference, Agile Testing Days 2013. He blogs here, and you can find the “slides” for this presentation here.

Main points:

  • Testers don’t:
    • randomly click things;
    • check off requirements;
    • represent a safety net or a gateway;
    • if automation works “well” people might not believe you need to do anything;
    • police developers or nag them with defects.
  • Testers do:
    • we ask questions – in order to investigate so we can make a decision;
    • we are information hunters;
    • “A tester is somebody who knows that things can be different” – Jerry Weinberg.
    • think in different perspectives:
      • user;
      • troublemaker;
      • stakeholder;
      • developer.
    • apply critical thinking (e.g. 6 thinking hats);
    • sometimes we use silliness;
    • we build relationships;
    • we use experience and irrationality;
    • we communicate, we tell stories and ask questions at the same time;
    • we coach, we mentor, we teach and we learn;
    • we make use of our emotions (sad, angry, etc.);
    • we use our technical skills and our humour;
  • Testers are not:
    • a safety net (again!);
    • a gateway (again!);
    • a judge.

One thing to take into consideration is that even though testers are not judges, they could consider themselves as an expert witness at the trial. And we can’t judge quality because quality is relative from person to person.

The final slide from Tony’s presentation illustrates quite well what testers do, I recommend you give it a look!

 

“Two sides of the story” by David Evans and Brindusa Axon

David and Brindusa talked about user stories, the pros and cons and about using them and techniques for improving them.

  • The current path to stories is flawed, as we always end up biting off more than we can chew.
  • Good story characteristics:
    • INVEST
      • independent;
      • negotiable;
      • valuable;
      • “estimatable”;
      • small;
      • testable.
  • Typically however we lack the V for valuable.
  • The template for stories stinks:
    • “Who?” there is almost never just one stakeholder;
    • “What?” what is the benefit of forcing an awkward style?;
    • “Why?” why does the last phrase always sound so weak?;
    • Use templates if they help get the team going, but once you’re going you don’t need them – leave them behind.
  • Possible alternative for user story writing:
    • <verb> a <noun>
    • in order to <improve quality>, we will do <some thing>
    • in order to help <someone> achieve <some value>, we will do <some thing>
    • plain language!
  • Liz Keogh’s blog post “They’re not user stories”
  • A story is a promise to hold a conversation – if your story card is too small to write everything on, get yourself a smaller card. You don’t need all the details; have conversations instead.
  • Don’t let agile be an excuse to ignore documentation.
  • Don’t let documentation be an excuse to avoid a conversation.
  • Documentation: “like a lady’s skirt – long enough to cover the subjects, short enough to create interest”
  • Short lived (burn this):
    • story;
    • change;
    • acceptance criteria;
    • cards/tasks.
  • Long lived (keep this):
    • specification;
    • effect;
    • acceptance tests;
    • living documentation.
  • There’s no way you can say “I’ve written the perfect story”.
  • Signs of success with stories:
    • finding elegantly simple, not simplistic solutions;
    • Product Owner satisfied with doing less than intended;
    • transparent decisions on trade-offs;
    • fewer boomerangs and re-works;
    • highly collaborative creation process.
  • Collective product ownership idea in the team.

 

“BDD & The business analysts” with Kent McDonald, Jeffrey Davidson, Olaf Lewitz, Jake Calabrese, Leslie Morse and Chris Matts.

This talk was actually a conversation rather than your standard conference talk. Chris Matts was acting as the “host”, and here’s a picture of what it looked like from the seats.


Here are some of the key points talked about:

  • Using BDD for scope definition (this was actually my first takeaway from BDD when I first used it, and I will at some point blog about it!).
  • BDD is a conversation and it should be easy – it’s how we interact with people!
  • There are 2 parts: work out what the questions are, and then find the answers.
  • You also need to structure it in a way so that the “conversation” doesn’t lose fidelity.
  • The “given when then” is necessary but not sufficient – conversations need to happen.
    • The necessity comes from offering consistency.
  • Real analysis still has to happen, just capturing stories isn’t enough as users may not be telling you what reality actually is.
  • If you’re only writing things down then you are not a business analyst, you’re a transcriber.
  • Mention of Liz Keogh again, and her blog post on “Estimating complexity”.
  • BDD focuses on behaviour and needs of stakeholders.
  • BDD is only of part of the “Analysis” toolkit, which is not a role.
  • We’re all doing BDD every day (or should be); we just may not be writing it down, and instead are hiding it from everyone.

 

“Successful collaboration and MOFF – Applied” by Chris Priest, Jenny Martin and Pete Buckney

They spoke about using the feedback onion (which apparently wants to become the feedback aubergine – fewer barriers between layers, and it won’t make you cry as much) and the MOFF collaboration framework.

MOFF – Maximise Opportunities For Feedback: increasing the opportunities for, value of, and quality of interactions, using different techniques to achieve this, such as impact maps.

You can see the models here.