Category Archives: Uncategorized

Blog has moved!

Hi all,

Just a quick announcement that I’ve migrated my blog over to GitHub Pages. If you are reading this it’s likely that you’ve got here from Google rather than through joselima.co.uk 🙂

I’ll be keeping the content here (on wordpress.com) but won’t be updating it!

Thanks

London Tester Gathering Workshops 2016

Last week I attended the London Tester Gathering Workshops.

I was lucky enough to be given a ticket by Jean-Paul Varwijk (@arborosa) who was giving a workshop on “Agile Exploratory Testing”.

I couldn’t attend the first day of the workshops, but I went down to London on the Friday to take part in Jean-Paul’s workshop and, in the afternoon, attend “Get into Coaching and Mentoring”, which was a nice fit for my new job role at Redgate – Quality Coach.

You can find my notes below. Overall I thought it was a great day – I remember attending a few years ago and it was good to see a few old friends. It was also great to meet up with some of the guys from Atlassian, including Andrew, with whom I have been exchanging emails for a while, trying to better understand their QA model and how we can incorporate some of its aspects at Redgate. I am really grateful to Andrew for his time chatting and for all the information he has provided me over the last few weeks.

Agile Exploratory Testing – Jean-Paul Varwijk

  • He started off asking people to write 10 things they noticed about the room (didn’t matter what they were, just write them down as soon as it comes up in your brain).
  • He proceeded to define what exploration is: the act of searching, for the purpose of discovery of information or resources.
    • Searching – activity
    • Purpose of discovery – goal/direction
    • Information – Something worth knowing
    • Resources – something that helps to advance
  • “In exploration you aim to create a sense-making model instead of a categorisation model”.
    • Resource: Myers Briggs Type Indicator: Categorisation Model, Jon Bach (Agile 2010) “Telling your exploratory story”
  • We then explored two different applications.
  • Oracles: Heuristic principle or mechanism by which you recognise a potential problem. They help you evaluate test results.
    • Tacit – Testers: experience, your feelings and mental models
    • Explicit – Testers: inference, observable, consistencies
    • Tacit – Other people: conference, stakeholders, feelings and mental models
    • Explicit – Other people: Reference, shared artefacts (specs, tools, etc.)
  • We then played the ColorGame (http://arborosa.com/colorGame/colorGame.html) :
    • Asked to use oracles
  • We also touched on the various definitions of exploratory testing:
    • The wikipedia definition (which is very similar to James Bach’s)
    • James Bach
    • Michael Bolton
    • Jean-Paul’s own one
  • Heuristics: Oracles are heuristics, but not all heuristics are oracles.
    • SFDIPOT
  • Session-Based Test Management was also covered very quickly as we were running out of time.
  • Planning:
    • Strategy: test plan, coverage risks
    • General: heuristics, test ideas, risk catalog
    • Ongoing planning: weekly planning session, charters
    • Issues, bugs and session sheets
    • Daily Test sessions: low tech dashboard
  • Test story: art of storytelling is pretty important in exploratory testing.
  • Agile testing → Concurrent testing: analysis, design and dev throughout a sprint. Breaking down in small items, improving awareness and responsibility of all team members for the software quality.

QA at Atlassian – Andrew Prentice

  • Does your business really want testers? (Or do they want something else and call it testing?)
  • If quality was there, there would be no need for testers
  • Testers can’t be gatekeepers
  • Quality assistance (model) introduction: co-pilot role: goal is to go really fast. Great analogy if you think about the role of a co-pilot
  • Difference between a “10x developer” and a “1x developer” is a quality assistance engineer
  • When developers fail, QA fails too
  • Centralised QA Function:
    • Different team for sharing knowledge, people and resources across products
    • Efficient recruitment, onboarding process, development and management
  • 4 factors:
    • Eradication: cross site request forgery protection across Atlassian
    • Prevention: QA kickoffs, testing notes and demos
    • Detection: automated tests, Developers in Test, blitz testing
    • Mitigation: progressive deployments, feature flags, controlled releases (limited by number of people that get it)
  • 3 different QA Phases:
    • 1: QA only did critical testing (changes), pair with devs for the rest and provide resources
    • 2: devs only doing critical testing, QA focus on kickoffs, root cause and defect analysis
    • 3: no manual testing, QA focus on eradication

Get into Coaching and Mentoring – Tony Bruce and Dan Ashby

I didn’t take any notes during this session as it was very hands on.

  • We started with an ice-breaker (game of “Swoosh”) and then moved to definitions of coaching, mentoring, leadership and teaching.
  • There were a couple of games played:
    • Communications game (that felt mostly about leadership): everyone in the room got a piece of paper with a picture, and the point was that we (around 22 people) had to put ourselves in order from start to finish to tell a story with our pictures – figure out the pattern, figure out the relationship between the pictures, and then manage to self-organise so the story made sense.
    • Writing about a shape/picture: there were a lot of circles in my example and I had to describe in 5 minutes how to achieve the same shapes/picture without talking to my colleague (we actually got taken out of the room so my colleague only had access to my poorly handwritten piece of A4 paper)

 

As usual, these are very rough notes, so some information may not be entirely accurate, or representative of what was said/shown during the day. Feel free to tweet me any questions @joseglima

Running your Selenium checks on Microsoft Edge

Over the last week or so, after spending some time upgrading some of our existing Selenium client machines, I thought it would be a good idea to find out more about running Selenium checks (tests...) on Microsoft's new browser, Edge.

It's the first time that Microsoft has released a web driver itself, officially supporting it.

Below are some of the issues I've found whilst using it, and since I had to spend some time digging around GitHub/Google I thought I'd collate them here.

The first problem I found was trying out the example given by Microsoft themselves on their blog (https://blogs.windows.com/msedgedev/2015/07/23/bringing-automated-testing-to-microsoft-edge-through-webdriver/)

The first thing you do is install the web driver (link is in the url above).

Then all of a sudden, without modifying the code posted by Microsoft, the tests would just hang. Upon inspecting the output I could see that:

"Existing Microsoft Edge (pid: xxx) terminated forcibly"

In this case, the problem was with Edge itself. By default, Edge opens with a startup page rather than a URL, and the web driver appears to struggle with that, quitting straight away. Simply go to Edge's options and change the startup page to a URL of your choice (Google or Bing should do just fine).

The next problem I faced when I tried to run those checks was that Edge couldn't be launched if the user was builtin\Administrator. This is a security measure introduced in Windows 10; to work around it, have a look here: https://4sysops.com/archives/why-the-built-in-administrator-account-cant-open-edge-and-a-lesson-in-uac/
(hint: the solution is only described halfway down the page).
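As far as I remember, the workaround described there amounts to enabling Admin Approval Mode (UAC) for the built-in Administrator account. Something along these lines should do it, but treat this as a sketch and double-check the article before running it:

```shell
:: Enable UAC Admin Approval Mode for the built-in Administrator account
:: (a reboot is required for the change to take effect)
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v FilterAdministratorToken /t REG_DWORD /d 1 /f
```

Alternatively you can flip the equivalent setting ("Admin Approval Mode for the Built-in Administrator account") in the Local Security Policy editor rather than touching the registry directly.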

To add Edge as a browser that can be run on a Selenium grid, add the following capability entry to the node's configuration file (changing the values as you wish):


 {
   "browserName": "MicrosoftEdge",
   "maxInstances": 4,
   "platform": "WINDOWS"
 }
The next issue was that Selenium couldn't detect the Edge web driver. One workaround is to move the web driver from the folder the installer places it in to the folder you run your Selenium node from (i.e. the same folder). After that, add this to your node startup command (where you also tell it where Selenium lives, for example):


-Dwebdriver.edge.driver=.\MicrosoftWebDriver.exe
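Putting it together, a node startup command might look something like the sketch below. The jar file name, hub address and config file name are just examples – adjust them for your own setup:

```shell
java -Dwebdriver.edge.driver=.\MicrosoftWebDriver.exe ^
     -jar selenium-server-standalone-2.48.2.jar ^
     -role node ^
     -hub http://localhost:4444/grid/register ^
     -nodeConfig nodeConfig.json
```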

This should allow you to run your checks using remote capabilities, and here's how I've done it:

// Request an Edge session and use the "eager" page load strategy,
// which hands back control once the DOM is ready
var capabilities = new DesiredCapabilities("MicrosoftEdge", "", new Platform(PlatformType.Windows));
var edgeProfile = new EdgeOptions();
edgeProfile.PageLoadStrategy = EdgePageLoadStrategy.Eager;
capabilities.SetCapability("edge_profile", edgeProfile);

One thing to bear in mind is that Microsoft's web driver doesn't yet support all the commands you may be used to, such as finding elements by XPath (you probably shouldn't use XPath anyway, but sometimes there is no easy alternative). There is a newer version of the web driver ("for Windows Insiders") that includes a preview of finding elements by XPath, but it's obviously not in the official release yet, so use it with caution. Check this URL to find out what you can and can't do: https://dev.windows.com/en-us/microsoft-edge/platform/status/webdriver/details/
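Where you might otherwise reach for XPath, a CSS selector will often do the job. Here is a rough sketch of what that looks like against a grid – the hub address, page and selector are purely illustrative, not taken from my actual suite:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Remote;

class EdgeCssSelectorExample
{
    static void Main()
    {
        // Ask the grid for an Edge session (assumes a hub at this address)
        var capabilities = new DesiredCapabilities("MicrosoftEdge", "", new Platform(PlatformType.Windows));
        var driver = new RemoteWebDriver(new Uri("http://localhost:4444/wd/hub"), capabilities);
        try
        {
            driver.Navigate().GoToUrl("https://www.bing.com");
            // Instead of driver.FindElement(By.XPath("//input[@name='q']")),
            // which the current Edge driver doesn't support:
            var searchBox = driver.FindElement(By.CssSelector("input[name='q']"));
            searchBox.SendKeys("selenium edge");
        }
        finally
        {
            driver.Quit();
        }
    }
}
```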

For all the above I have been using Selenium 2.48.2 but I believe bindings were introduced in 2.47.0 to support the new Microsoft Web Driver. I tried to use the latest (2.50.1) but found it too unstable so reverted back to 2.48.2.

Hope you find some of this information helpful.

My first public talk!

I began my professional career as a software tester just over 3 years ago. Back then, if anyone had asked me if in 3 years’ time I would be presenting a topic to a bunch of fellow professionals I would have thought they were crazy.

The first conference I attended was back in 2013, just a year after I started at Redgate. I was encouraged to go by my line manager who was once a tester and also Chris George who headed the testing function around here. Credit where credit is due, without Chris and Ben, I would not be where I am today in my professional career.

During that conference one thing that I briefly thought about was how cool it would be to tell my experiences to others, obviously with the caveat at the time that my experience was very short. But time didn’t stop and as I gathered more experience I decided after attending Agile Testing Days 2014 that I would work on a talk to submit for next year’s event.

When that time came I knew that I wanted to talk about my experience within my team and the design architecture we had adopted and decided to follow, so I could tell people how my approach to testing had changed. One of the hardest things for me during the whole process was getting across that I wasn’t there to tell anyone how to test micro services, but to tell them how I’ve done it and continue to do it, and how my testing was different from what I had done in the past. The reason for this is that I have always worked in the same company, within the same team, and mostly around the same people.

So my talk got accepted and I was invited to be in the “Young Agile Talents” track, which took some pressure off, but at the same time the “talent” word added some on top. It’s great that conferences are giving young (in terms of age or experience) people the chance to give talks and to share their experiences. If you are looking for help, the Speak Easy program offers you the chance to be mentored by an experienced speaker who will help you all the way from coming up with an idea for a talk or workshop to the delivery. They also have agreements with a lot of different conferences, where there’s a guaranteed slot for one of their mentees.

After a few months of just thinking about my talk it was time to turn ideas into an actual presentation, and I leaned on a lot of my peers for this, as well as external people. Within Redgate I must thank Gareth Bragg, Chris Auckland, Andrew Fraser, Robin Hellen, Danielle Ainsworth and Toby Smyth, who saw a very rough draft of my talk and literally grilled me afterwards with questions and feedback. Outside, I must thank Emma Armstrong for also looking at a very rough presentation and providing me with encouragement to make it better. And finally thanks to Chris George for giving me an alternative order for my slides, which no doubt made a lot more sense.

All in all it was a very fulfilling experience, one that I would most likely try to repeat.

You can find my slides here if you are curious to know what my talk was about: http://www.slideshare.net/joseglima/exploratory-testing-micro-services

I was also asked by InfoQ to do a short Q&A which you can read here: http://www.infoq.com/news/2015/12/developing-testing-microservices

What I learned (for someone that has never presented at a conference before)

  • There are a lot of conferences out there in most fields, pick one that has a good track record at giving people opportunities like lightning or consensus talks;
  • Work in small chunks on your presentation and iterate on them – yes you will most likely still make changes the day before but hopefully that won’t have any impact on the presentation itself;
  • Get feedback as early as possible, even if it’s just on what you are thinking about presenting;
  • Act on that feedback very soon after you get it, otherwise you will lose some of the context;
  • Enjoy the occasion, and try to still enjoy the conference you are at!

Agile Testing Days 2013 – Day 3 Keynotes Notes

The last day of the conference started off with an interesting keynote by David Evans. Here are my notes for his keynote and also the other 2, including the closing keynote from Lisa Crispin and Janet Gregory.

“Visualising Quality” by David Evans

  • The product of testing is confidence, not quality!
  • Decision support – balance risk and reward;
  • the expert witness in a trial;
  • accidents happen – unpredictable, one-in-a-million freak accident:
    • the Challenger example was raised – it suffered catastrophic failure due to the cold temperature (NASA accident, 1986):
      • reasonable doubt may mean no go when it comes to a decision
      • engineers did their best to highlight problems
      • failures in communication that resulted in a decision to launch based on incomplete and sometimes misleading information, a conflict between engineering data and management judgement.
  • the value of the information we provide is equal to the value of the decision it informs;
  • testing only isn’t enough, it’s the communication about the collected data that counts;
  • McGurk effect – beware of conflict between what we show and what we say;
  • hard to process information without context – put data together, or compare different data points;
  • put other contexts too – military budget of the US – put it as a percentage of the US GDP;
  • get feedback quickly – put it somewhere visible;
  • look for opportunities where we are taking shortcuts and bending things beyond their original purpose;
  • system diagrams: scale it by:
    • size
    • usage
    • value
    • risk
  • smell: arbitrary representations – work with the brain, not against it;
  • smell: the warm glow of the dashboard – visualisation overload;
  • smell: colour bias and averaging;
  • false perspective: choose perspective that relates to the context;
  • represent people on a kanban board;
  • keep it simple, get it green (builds);
  • visualise subjective assessments (James Bach low tech dashboard);
  • show named milestones (Jeff Patton’s story map);

“The next decade of Agile Software Development” by J. B. Rainsberger

  • Kent Beck: “Why aren’t we rich yet?”
  • when people say bring me data they really mean shut up and go away;
  • Etudes for Excellence by James Shore
    • agile practices are etudes not rules!
  • we don’t want change because eventually we will do the wrong thing;
  • Tim Lister’s “Adrenaline junkies and Template Zombies”;
  • the opportunity to address risk is what’s missing from daily standups;
  • Tim Lister’s “Waltzing with bears – Managing risks on software projects”;
  • an “inbox for later” process on our whiteboards (for anything you can’t resolve in 2 minutes);
  • backlog: what are we missing about all those features?
    • involving the customer
    • cartoon for agile project spec – what programmer thought, designer, product owner, etc.
    • talking in examples
      • having conversations, is more important than capturing conversation, is more important than automating conversations – BDD
      • abstraction then talk about details (example driven development)
      • lost luggage example – point to a bag that’s most similar to yours and note how it’s different
  • negotiating the scope: ask how much of each story not which stories;
  • when you spot something wrong in a pairing session wait 15 seconds before you point it out – no one wants to be told there’s a semicolon missing;
  • pairing also develops your confidence, and/or humility – helps with trust;
  • “you cook, you clean up”;
  • continuous integration is an attitude not a tool;
  • test retreats

“Build bridges, change viewpoints, delight customers” by Lisa Crispin and Janet Gregory

  • Customer, focus on the customer;
  • it’s all about perspectives:
    • have multiple perspectives
    • be open, listen
    • consider your customer needs
    • are business needs different?
  • changing our culture
    • focus on quality, not speed
    • learning culture
    • short feedback loops
    • respect for all team members
  • ways to collaborate:
    • 3 amigos / power of 3
    • pairing
    • talk about tests
    • continuous feedback
    • impact mapping
    • story mapping
  • collaboration happens easily between children because there is trust;
  • interruption isn’t necessarily rude;
  • test framework:
    • tests/examples pass into a test method/fixture, which calls the developer’s code
  • delight customers: collaboration helps simplify, deliver what customer wants most;
  • when programmers and testers automate acceptance tests without the customer, they risk merely throwing features over the wall;
  • learn non-threatening ways to ask “why do you want that?”;
  • provide an environment that nurtures change;
  • teach people (testers) how to write a good resume – instead of leaving, they won’t want to go once they realise how much they are learning;
  • trust, visibility, transparency, team focus on quality and … the 3 Cs (contractual, communication and competence);
  • it’s still more important to have the right people than to have them together;
  • it’s only 5 steps to the top – you can rest, do it in small chunks;

Agile Testing Days 2013 – Day 1 Keynotes Notes

The conference days, apart from the first day, which was a tutorial day, consisted of a morning keynote, followed by 2 talks (picked from 5 different tracks), an after-lunch keynote, followed by 2 more talks or workshops, and a closing keynote.

“Agile Testing is nonsense, because Agile is testing” by Andrea Tomasini

It was the first talk/keynote of the conference and the feeling afterwards was that Andrea tried to cover a little too much ground, especially considering we were just getting started. I guess you could argue that if he had tried to deliver the same keynote towards the end of the conference people would have been exhausted already, so it wouldn’t have worked either. There was plenty of information and here are some of my notes (also, as a side note: just because I write down my conference notes, it doesn’t necessarily mean I agree with them; it means I would be happy to discuss them with you, and that hopefully they have sparked some interest in you too):

  • testing is an attitude – being competitive is an attitude of mind;
  • questions matter a lot more than the mind;
  • every step performed while creating a new product is unique, therefore all testing is unique;
  • clients don’t always have domain knowledge to know what is right or not;
  • inspect the outcome and learn to validate assumptions and hypothesis;
  • if we test because we don’t trust what the team did then we are far away from understanding the true value of testing;
  • testing as an approach: if we write a test that doesn’t bring anything new, then it brings no value;
  • we like to learn using short feedback loops;
  • being successful once doesn’t mean you got it right – you could have got lucky (“works on my machine!”);
  • reduce:
    • social risk
    • schedule and cost risks
    • business risks (show it to the customer as often as possible; contracts are not as important as collaborations)
    • technical risk (no one does a project twice – each step is an opportunity to learn)
  • testing as a practice:
    • creating a vision with stakeholders
    • test as a vision – you test hypothesis first, then you go back and test again after thinking about scaling the business
    • the next step is consulting – get consultants to help fill the gaps
    • co-creating – you start creating something that at the beginning no one knew what it was
  • don’t overload people when trying to go lean – it’s not only about reducing waste;
  • Unnecessary variations – keep the flow even;
  • wasteful activities – remove non value adding ones;
  • when retrospective actions keep being the same then no continuous improvement process is in place
    • under pressure your muscle memory wins, so you will make the same mistake
  • Testing is
    • an attitude, because we embrace the Agile manifesto and its principles, we have to accelerate learning, and it requires individual commitment to validate assumptions
    • everybody makes mistakes, and every mistake is an opportunity to learn
    • an approach, because it requires us to systematically start everything we do by understanding the constraints and the expected outcome
    • a practice, because once we develop the attitude and learn the approach, practices will emerge.

“Agile Testing: Strength through interdependence” by Mary Gorman

  • Types of interdependence:
    • pooled interdependence: marketing, develop, training, operations
    • sequential interdependence: analyse, design, develop, test
    • reciprocal interdependence: analyse to and from design, and so on for develop and test
    • comprehensive interdependence: 4 quadrants (analyse, design, develop, test) interacting with each other
  • strongly interdependent team relies on each other;
  • trust: team interdependence is built on a foundation of mutual trust;
  • capacity for trust: trust ourselves and others
    • 3 C’s of trust: contractual, communication and competency
      • contractual: clear expectation, meet commitments
      • communication: who knows what, when, open to feedback from each other, direct, constructive feedback
      • competence: trust of capability, respect each other, engage others and empower them, help others learn new skills
  • agile activities: discover via structured conversation (explore -> value/evaluate -> confirm (loop))
  • typical testing used to happen when code was happening; we need to bring it to the discovery stage (early testing)
  • we can’t afford the V&V model anymore – we need to collapse the V and bring the two sides together
  • product requirements interdependencies:
    • external: where to begin, where to end?
    • internal: 7 dimensions (user, interface, action, data, control, environment, quality attribute), techniques, user and action / action and data / data and interface have to be tested together, cross interdependent and internal dependent
    • techniques: scenarios, example, data tables, given when then, planguage (Tom Gilb)
    • shared techniques: yield a strong, higher quality product
    • interdependence across all views
    • independent: wear and fail; dependent: controlling; interdependent: strong, flexible.

“The science behind building and sustaining high-performance teams through understanding behavioural science, neuroscience and social psychology” by Peter Saddington

  • True self organisation? Theory or reality?
    • ability to change and influence everything around you in autonomous teams
  • high performance – high productivity, fun, sustainability;
  • people are the problem… and the solution, not heuristics, methods or patterns;
  • emotions aren’t accurate over time;
  • as behaviour patterns are understood, accuracy of predictable engagement pattern increases;
  • as team dynamics are understood productivity of teams increase
    • and management effort decreases, shifting towards inspiring, enabling and fulfilling people – managers become managers of inspiration instead of issues;
  • re-interview process:
    • what do other people say you are?
    • what do you love to do (outside work)?
    • who do you look up to (mentor/role model)? why?
    • what type of problems do you enjoy solving (outside work)? why?
    • how do you know you’ve done a good job at something (outside work)?
    • what is your best way of supporting others (outside work)?
  • the power of play (fun):
    • year after year (for the past 30 years) fun has only decreased
    • happier people are more productive people
    • increase fun – increase innovation
  • how do we get more fun at work?
    • purpose
    • autonomy
  • switching context causes disruption to teams: silly and unproductive context switching;
  • multiple projects cost:
    • money to companies
    • 40% loss in productivity
    • visual input drops 29% and brain activation drops 53%
    • context switching linked to memory loss
    • multitasking linked to madness
  • focus is the important word;
  • effective leaders should see themselves not as managers or even problem solvers but as lovers of people and inspiration starters;
  • understand people, increase fun, focus.