Session Based Exploratory Testing
Or, in Swedish: sessionsbaserad utforskande testning
I am trying to improve the manual testing on my current assignment. I gather inspiration from rapid software testing, exploratory testing and session-based testing. Six months ago I introduced more structured reporting and started keeping statistics on how much the different functional areas, and which physical devices, had been tested. But it is getting to be time to update the methodology.
My idea is to not just collect one total time, but to try to break it down into three parts:
- Preparation time: reading up, learning, fetching equipment and configuring it.
- Test time: the time spent actively testing. Ideally I do not want sessions that are too long, but we will see how that goes. I would also like the test time to be split into two parts:
- Guided testing: you follow a test or requirements specification, or some other form of documentation such as a bug report or an email from a customer.
- Investigative testing: you deviate from the guided path and explore. Here it probably fits to use de-focus/focus: "what if I combine X with Y?" or "what if I repeat X?"
- Reporting: reporting new bugs, filling in the test report, or raising urgent problems.
Beyond capturing the time better, I would like to get more effect out of the testing itself. I believe that comes from more structured preparation. I would like the testers (and the developers who put on the tester hat from time to time) to go through three phases in each test session (a sketch of a report template follows after the list):
- Preparation (S: Setup)
- Where are the biggest risks?
- Where will I find problems?
- How can I make it break?
- What can I combine this with to make it break?
- What is most important to investigate?
- What am I at risk of forgetting?
- Which specifications can I use as a path? How much do I want to stay on the path?
- How do I want to get coverage? Remember SFDIPOT: Structure, Function, Data, Interfaces, Platform, Operations, Time
- How do I cover this?
- Which equipment am I using and how is it configured?
- How much time did I spend on preparation?
- Execution (T: Test)
- What do I cover with my coverage?
- Focus or de-focus?
- What new test ideas did I get?
- How much time did I spend on execution?
- Reporting. Here the brothers Jonathan and James Bach think PROOF is a good mnemonic. (B: Bug)
- PROOF:
- Past: What happened during the session?
- Results: What was achieved during the session? Here I think coverage is relevant.
- Obstacles: What got in the way of good testing?
- Outlook: What more needs to be tested?
- Feelings: How did it feel?
- Which bugs have been reported?
- Debriefed to... at...
- How much time did I spend on reporting?
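To make the three phases concrete, a report template along these lines could look roughly as follows. This is only a sketch; the fields are my own suggestions, not an established format:

    SESSION: <functional area / charter>   TESTER: <name>   DATE: <date>
    S: Setup (<minutes>)
       Biggest risks, coverage model (SFDIPOT), specifications used as path,
       equipment and configuration.
    T: Test (<minutes>)
       Guided: <percent> (which specification/bug report/mail was followed)
       Investigative: <percent> (focus/de-focus notes, new test ideas)
    B: Bug / reporting (<minutes>)
       PROOF: Past, Results, Obstacles, Outlook, Feelings.
       Bugs reported: <ids>   Debriefed to: <name> at: <time>   Mood: <face>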
But the product is fairly complex and requires that some base for the testing is in place in one way or another. I believe in borrowing ideas from Scenario based Exploratory Testing to bring in the test documentation we already have, while encouraging the testers not to just follow the path that the specification points out.
I would also like to collect a sense of how things are going. I hope to be able to see trends, such as more confusion initially than later in the release phase, and more happy faces over time. I simply borrowed a bunch of faces for the report from Unicode's Emoticons block (see [1]), to be chosen based on how the testing felt.
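As an illustration, a small mood scale could be picked from that block. The faces and labels below are my own choices, not part of the report format (Python):

    # A possible mood scale from Unicode's Emoticons block (U+1F600-U+1F64F).
    MOOD_SCALE = [
        ("\U0001F600", "confident"),   # GRINNING FACE
        ("\U0001F642", "content"),     # SLIGHTLY SMILING FACE
        ("\U0001F610", "neutral"),     # NEUTRAL FACE
        ("\U0001F615", "confused"),    # CONFUSED FACE
        ("\U0001F620", "frustrated"),  # ANGRY FACE
    ]
    for face, label in MOOD_SCALE:
        print(face, label)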
References
Rapid Software Testing Class Materials
Here are some more rather unstructured notes, from the Rapid Software Testing class materials written by James Bach, Michael Bolton and Paul Holland. Available at Satisfice: [2]
At one point they say that...
- Testing is...
- Acquiring the competence, motivation and credibility to...
- Create the conditions necessary to...
- Evaluate a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference, including
- Operating a product to check specific facts about it...
- ...so that you help your clients to make informed decisions about risk.
- And perhaps help make the product better, too.
The Rapid Software Testing materials define coverage as "...how thoroughly you have examined the product with respect to some model...", where you cover the famous "San Francisco Depot", or SFDIPOT: Structure, Function, Data, Interfaces, Platform, Operations and Time. In addition, it is important to investigate the development aspects of the system: Supportability, Testability, Maintainability, Portability and Localizability.
Good, testable products have: Controllability, Observability, Availability, Simplicity, Stability and Information.
Quality criteria are diverse (CRUCSS CPID): Capability, Reliability, Usability, Charisma, Security, Scalability, Compatibility, Performance, Installability, Development.
Exploratory Testing is an approach to testing that emphasizes the personal freedom and responsibility of each tester to continually optimize the value of his work by treating learning, test design and test execution as mutually supportive activities that run in parallel throughout the project.
"How do you invent the right tests, at the right time? Evolve them with an exploratory strategy".
The Rapid Software Testing approach seems to prefer testing where the tester follows his or her own ideas ("The skilled tester remains in control of the process"), whereas scripted test specifications are less well liked ("Scripted procedures give the illusion of control over unskilled testers").
Two concepts I like are:
- De-focus (if you are frustrated): look for patterns in your previous testing and search for ways to violate them. Vary multiple factors at a time, and broaden and vary your testing.
- Focus (if you are confused): simplify your test and frequently return to a known state. Try one factor at a time and test precisely.
If you like rapid software testing and exploratory testing you have to like weird acronyms. The one for General Test Techniques is FDSFSCURA:
- Function Testing
- Domain Testing
- Stress Testing
- Flow Testing
- Scenario Testing
- Claims Testing
- User Testing
- Risk Testing
- Automatic Checking
"Instead of thinking pass vs fail, consider thinking problem vs no problem"
Testing Patterns that are mentioned as examples of quick or careful tests:
- Happy Path: Use the product in the... way... optimistic programmer might imagine users to behave...
- Documentation Tour: Look in the online help or user manual... Improvise from them... Even if you don’t expose a problem, you’ll still be learning about the product.
- Sample Data Tour: Employ any sample data you can... Use...inappropriate data.
- Variables Tour: Tour a product looking for anything that is variable and vary it... as far as possible...
- Complexity Tour: Tour a product looking for the most complex features and using challenging data sets...
- File Tour: Have a look at the folder where the program's .EXE file is found. Check out the directory structure, including subs. Look for READMEs, help files, log files, installation scripts, .cfg, .ini, .rc files. Look at the names of .DLLs, and extrapolate on the functions that they might contain or the ways in which their absence might undermine the application.
- Menus and Windows Tour: Tour a product looking for all the menus... and other controls.
- Keyboard and Mouse Tour: Tour a product looking for all the things you can do with a keyboard and mouse... Combine each key with Shift, Ctrl, and Alt. Also, click on everything.
- Interruptions: Start activities and stop them in the middle. Stop them at awkward times... arrange for other programs to interrupt (such as screensavers or virus checkers). Also try suspending an activity and returning later.
- Undermining: Start using a function when the system is in an appropriate state, then change the state part way through...
- Adjustments: Set some parameter to a certain value, then, at any later time, reset that value to something else...
- Dog Piling: Get more processes going at once...
- Continuous Use: While testing, do not reset the system... Let disk and memory usage mount...
- Feature Interactions: Discover where individual functions interact or share data... Tour them. Stress them...
- Click for Help: ...bring up the context-sensitive help feature...
- Input Constraint Attack: Discover sources of input and attempt to violate constraints on that input...
- Click Frenzy: Ever notice how a cat or a kid can crash a system with ease?...
- Shoe Test: This is any test consistent with placing a shoe on the keyboard...
- Blink Test: Find some aspect of the product that produces huge amounts of data... Let the data go by too quickly to see in detail, but notice trends...
- Error Message Hangover: Make error messages happen and test hard after they are dismissed. Often developers handle errors poorly.
- Resource Starvation: Progressively lower memory, disk space, display resolution, and other resources until the product collapses, or gracefully (we hope) degrades.
- Multiple Instances: Run a lot of instances of the app at the same time. Open the same files...
- Crazy Configs: Modify the operating system’s configuration in non-standard or non-default ways either before or after installing the product...
- Cheap Tools: Learn how to use InCtrl5, Filemon, Regmon, AppVerifier, Perfmon, Process Explorer, and Task Manager (all of which are free)... Pay special attention to tools that hackers use...
Slide 108 lists "Common Problems with Test Documentation". A hallelujah moment: "What does Rapid Testing Look Like? Concise Documentation Minimizes Waste"
Consider Automatic Logging: How can you automate logging in your project?
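As a minimal sketch of what automatic logging could mean in practice, assuming a Python test harness (the helper name and actions are made up for illustration):

    import logging

    # Every action the harness performs is timestamped and written to a
    # session log, so the session leaves a trace with no manual effort.
    logging.basicConfig(
        filename="session.log",
        level=logging.INFO,
        format="%(asctime)s %(message)s",
    )

    def perform(action, **details):
        # Hypothetical helper: drive the product here, logging as a side effect.
        logging.info("action=%s details=%s", action, details)

    perform("open_menu", menu="View")
    perform("resize_window", width=1, height=1)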
To increase accountability:
- Use a test charter
- Time-box sessions
- Review results
- Use debriefings
"How do you effectively report your work? Learn to tell a compelling story that provokes the right questions."
The Bachs' Session-Based Test Management
One of the first references on the Wikipedia page on session-based testing (see [3]) is a link to Satisfice (see [4]) and to Session-Based Test Management (see [5]), written by Jonathan Bach.
Bach first writes a little about the background for the session-based approach:
- Better communication
- More orderly reports
- Organize the work
- Without obstructing the work
- A perfect marriage of exploratory testing and metrics
Testing in Sessions
- The basic unit of work is the session, 45-120 minutes long (typically 90 minutes).
- There are a lot of other things that testers do besides sessions
- Opportunity testing (testing that was not in the charter).
- Ideal: 3 test sessions per day
Debriefing and Planning
- Debriefing after each session
- Goals:
- test manager understand and accept the session report
- coach the tester
- feedback.
- Bach claims that, thanks to these debriefings, it is possible to estimate the amount of work involved and to make predictions, even without planning the work in detail.
Test Session Sheet
The test session reports are written in a semi-formal text format that makes me think of PubMed and of gene and protein sequence formats. Bach's reason for this is not that you must do it this way: it just facilitates automatic reporting, since some scripts can then extract statistics more easily.
The Test Session
One of my favourite quotes in his text is: "From a distance, exploratory testing can look like one big amorphous task...". So we would like some reports from it, but reporting takes energy away from the actual testing, so it has to be lightweight. They break the work down into three areas, TBS:
- Test design and execution: prepare for testing, learn and do testing.
- Bug investigation and reporting: investigation of something that might be a problem.
- Session setup: configuration, getting the right gear, reading manuals, writing the session report.
Some of the interesting sections are:
- Session Charter with mission statement and areas to be tested.
- Task breakdown:
- Duration
- Percentage spent on the Test design and execution; Bug Investigation and Reporting; and Session Setup (the TBS metrics).
- Percentage spent on charter vs opportunity
- Test Notes
- Issues: Obstacles in testing.
- Bugs: Problems in the product.
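Pieced together from the examples in [5], a session sheet looks roughly like this. This is abbreviated and reconstructed from memory, so treat the exact tags and values as approximate rather than a verbatim copy:

    CHARTER
    -----------------------------------------------
    Analyze the View menu functionality and report on areas of potential risk.

    #AREAS
    OS | Windows 2000
    Menu | View
    Strategy | Function Testing

    START
    -----------------------------------------------
    <date and time>

    TESTER
    -----------------------------------------------
    <name>

    TASK BREAKDOWN
    -----------------------------------------------
    #DURATION
    short

    #TEST DESIGN AND EXECUTION
    70

    #BUG INVESTIGATION AND REPORTING
    20

    #SESSION SETUP
    10

    #CHARTER VS. OPPORTUNITY
    100/0

    TEST NOTES
    -----------------------------------------------
    ...

    BUGS
    -----------------------------------------------
    #BUG
    ...

    ISSUES
    -----------------------------------------------
    #ISSUE
    ...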
They wrote a script to analyse the reports and create something that can be exported into Excel with lots of traceability. This also measures coverage and incomplete testing.
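Their script is not shown, but a minimal sketch in Python of the kind of parsing involved, assuming sheets stored as *.ses files in the format sketched above, might be:

    import csv
    import glob
    import re

    # Pull the TBS percentages out of each session sheet and write one
    # CSV row per sheet, ready to open in Excel.
    FIELDS = ["TEST DESIGN AND EXECUTION",
              "BUG INVESTIGATION AND REPORTING",
              "SESSION SETUP"]

    with open("sessions.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["sheet"] + FIELDS)
        for path in glob.glob("*.ses"):
            with open(path) as f:
                text = f.read()
            row = [path]
            for field in FIELDS:
                m = re.search(r"#" + re.escape(field) + r"\s+(\d+)", text)
                row.append(m.group(1) if m else "")
            writer.writerow(row)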
Metrics
Some of the metrics mentioned focus on predicting work effort. An example (redone as a small script after this list): assume we have 4 testers who do 3 test sessions per day, 2 more release candidates to test in 3 weeks, and a product with 20 major areas:
- We can make 4 testers * 3 sessions/(day and tester) * 5 days/week * 3 weeks = 180 test sessions
- But we had 2 builds and 20 major areas, giving 180 / (2 * 20) = 4.5 test sessions per area and build.
- Depending on the risk and size of the product that might be much or little.
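The same arithmetic as a few lines of Python:

    testers, sessions_per_day = 4, 3
    days = 5 * 3                                  # 5 days/week for 3 weeks
    sessions = testers * sessions_per_day * days  # 180 sessions
    builds, areas = 2, 20
    print(sessions / (builds * areas))            # 4.5 sessions per area and build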
BugFinders Training Video: Exploratory Testing Types
BugFinders Training Video: Exploratory Testing Types, details of various types of exploratory testing. See YouTube at [6], by Martin Mudge
A nice short video with the objectives to...
- Explain Exploratory Testing (ET)
- Look at different techniques for ET
- Look at an example
They explain that exploratory testing is a skill and a mindset of the tester, and "concurrent test execution & design". Exploratory testing is not "following your nose".
Their list of varieties of exploratory testing contains 5 versions:
- Freestyle Exploratory Testing: "ad hoc" exploratory testing. Best for first contact and to learn the application.
- Scenario based Exploratory Testing: They explain this as a version of traditional testing where a test case is expanded or modified. I like this approach very much.
- Feedback based Exploratory Testing: Use experience from previous sessions to guide the next session.
- Strategy based Exploratory Testing: Use test design techniques like boundary-value analysis and equivalence partitioning together with exploratory testing. Martin Mudge believes this to be the most common way of testing for testers.
- Session based Exploratory Testing: I explain my approach above. Also read more below.
They give good examples of the above in the video, including a nice one where they compare testing to taking a trip to Australia: should you follow the guide, or randomly explore the first beach you see?
BugFinders Training Video: Session Based Exploratory Testing
BugFinders Training Video: Session Based Exploratory Testing, an overview of exploratory testing techniques. See YouTube at [7], by Martin Mudge
Another nice short video with the objectives to:
- Explain Session Based Exploratory Testing
- Give an example of a charter
- Show how to mind map
- Tips on effective Session Based Exploratory Testing
They define Session based exploratory testing as "define what to test, give it a time scale and test based on that."
This type of testing requires some session preparation, and the testing should ideally follow the plan (test objective, duration, what to test, type of testing, how much of the time should be spent exploring and how much on the charter, evidence recording, expected risk areas, and so on).
They have a nice example that includes testing of a pogo stick. Context is important. They use mind mapping as a tool for mapping the area to test, and they use the mind map to record the testing performed. Their tips are:
- Do
- Do record bugs on the map
- Do use tools to record the test session (screen recorder?)
- Do create a description of what you did
- Do be prepared to talk about the tests and bugs
- Don't
- Don't just play with the application (unless the charter says so)
- Don't just record bugs instead of continuing with the charter
- Don't rush bug reports; carefully formulate the bugs
- Don't forget to plan: understand what you need to know/have before testing
Belongs to category Test
See also Beräkna Testinsats (estimating test effort) and Riskbaserad Testning (risk-based testing)
See also James Bach Open Lecture On Software Testing