Thursday, July 26, 2012

Rapid Testing Intensive 2012: Day 3

9:02 AM - Jon kicks off the Intensive with his project meeting. He's talking about the communication between us and his eBay team.

9:07 AM - James talking about the upcoming assignments, which will be split between onsite and online. Each table will get a 30-minute test session. Later today we will be working / testing with tools, since both James and Rob have some experience with this.

9:10 AM - James is talking about sympathetic testing.

9:15 AM - James is answering a question about knowledge transfer for regression tests when someone leaves a company. James uses the analogy of driving a car: if someone comes in and wants to drive his car, he doesn't write down his driving procedures. He assumes that driver has driving skills. A tester should be good at rapid learning and skilled in testing, and since most testers are untrained, much of the documentation is of poor quality anyway. In Rapid Testing you create concise documentation and take test notes; you can take video, but skilled testers should be able to pick things up fast.

9:22 AM - Pay attention to the test coverage outline - maintain it. Maintain the risk coverage outline.


9:24 AM - Jon is talking about a dice game we played last night and how important reputation is. Everyone here is building a reputation. James says things get easier with a better reputation - fewer annoying things are asked of you, and there's less questioning of your work.

9:28 AM - Feature capability - is it capable of doing what it's supposed to do? Correct output? Feature consistency - do similar features do things similarly? Example: if you save a Word document as .rtf and then you save a WordPad document as .rtf, will they both save in the same format?

9:30 AM - Are we going to do any bug reviews here? James wishes he had thought of that idea; he just has to figure out where we can fit it in.

9:36 AM - Rapid Software Testing and Agile fit together perfectly because RST all happens in your head. Agile has a strong emphasis on automated testing, and testers need to be wary of automated testing because it isn't actually testing - it's fact checking. Automated tests / checks are happy-path tests; you aren't finding bugs that could be found through more aggressive testing. Rapid testers love using tools if they make you a better tester.

9:45 AM - James talks about Cucumber being a waste of time. James uses his programming skills to improve his testing skills and build tools, but he sees a lot of Agile people who are tool-happy. Andrew points out that those using Cucumber may not be obsessed with tools but may be using it as a way to improve the quality of the communication they receive.

9:50 AM - James is sharing his experience on using video to communicate findings instead of documentation.

10:00 AM - James is recapping what we've done the last 2 days: did some testing, worked on risk lists, did some specification reviews (some people ignored the specifications), ran demonstration test sessions. Going forward we will look at tools and maybe do some bug triage, and we've blown eBay away with the number of bugs we produced.

10:02 AM - Online people will work with Scott Barber. The online people are offline until 1 PM. James is describing how much they've learned and how they are adjusting to this format of online and offline. Today we are doing a local activity: for the next 30 minutes we are doing a test session on the add photos part of My Vehicles.

10:40 AM - Test session is over, filing our bugs now.

10:45 AM - Break time.

11:03 AM - Each table is going to prepare a professional test report on what we did during the most recent test session, in written and oral form. Written will consist of 2 flip chart pages. 20 minutes. It must cover the entire work of the entire table, and then we will give the report to the entire room. James is basically interested in what we did, the story, specifically the conditions that we tested.

11:08 AM - Working with my team - TRON - on building our professional test report.

11:26 AM - We are still working on reports. James says he wants decent reports, and you can't do that without practice in only 20 minutes. He is going to add 10 more minutes so we can get a good report done.

11:36 AM - Time to review our reports, so we are putting them on the wall. James is videotaping this, so it might be available online at some point. He is introducing the video and we are up first.

11:56 AM - Took 20 minutes to give our report - the first stand-up test report I've ever given. (Apparently I came across as a little nervous - mostly nervous about the material.) Some of the terminology I used was rather vague, which caused James to ask numerous questions to help identify what I actually meant. Anything we say whose meaning isn't explicit is immediately questioned by James, because the customer may not understand what it means.

12:00 PM - Another group (S-Table) is up. Simon is presenting for the S-Table group and he's doing a pretty good job by speaking very clearly. (Everyone who goes after the first team should be better! hahaha) James has some questions and feedback for Thomas, but his board / post-it sheets are much clearer than ours; again, you'll be able to see him and his group on video. James cuts them off because now it's time for lunch.

1:00 PM - James is giving a presentation on How to Give a Professional Test Report. He's going to talk about tools in a bit. Test reporting is the heart of testing; it's helpful for managing yourself. Reports aren't just facts; they are a choice of which facts matter, and you are shading reality. No one who gives a useful report is simply revealing facts; they are always pruning, picking, and choosing. Suppress silly metrics - it's like counting how many unicorns will fit into your cubicle. We are back with the online people.

1:08 PM - In the absence of context, test case counts have no meaning - so don't include them. Pass rate is a stupid metric without context. James is showing slides from his How to Give a Professional Test Report talk, but I'm not sure where the slides are coming from.

1:17 PM - You can't give a form like test case counts, pass/fail rates, etc. to managers because they don't know what any of that stuff means. You need to produce a report that tells them something. You can list your bugs, because the titles can give them context and you can talk about them. Pie charts and graphs don't tell anyone anything.

1:20 PM - Talking to people is an alternative to counting test cases! You can summarize certain ideas that people need to know and an example of that is a Low Tech Testing Dashboard.

1:27 PM - A professional test report is one that fits the context for the customer or the person who matters. It can be useful to list the things that are important to do but that you haven't gotten to yet. James is explaining what his dashboard means. He recommends not using happy faces when writing on a whiteboard.

1:35 PM - Your dashboard should have structured subjectivity, a human judgement. No raw data on the dashboard, because management can't process raw data outside the context of the team. When testers put raw data in front of management, we are abdicating our responsibility. Management needs to glance at the test report and make decisions very quickly about what needs to be done.

1:36 PM - A test report should have 3 levels: a story about the status of the Product, a story about how you Tested it, and a story about the Value of the testing. Apparently our test reports contained all 3 of these levels. Yay!

1:40 PM - Use safety language - phrasing that qualifies or otherwise drafts statements of fact so as to avoid false confidence, like: I think, so far, apparently, I assumed, it appears, etc. When you are trying to communicate something dramatically (and not factually), be careful with your use of safety language. Sometimes, when you are pressed for time, need to give your message a rhetorical punch, or are speaking in ritual practice (like at the altar), you probably want to avoid safety language.

1:48 PM - James is showing examples of reports. He shows the most expensive report he has ever constructed, at roughly $250,000 of labor charges, for a patent infringement lawsuit. It shows each of the claims in the patent and the proof, as demos and exhibits, that something violates the patent. This was the lawyer's test case outline. The second report is an exploratory report that was filmed and contains details about what was done, what was asked, and what was thought of by the people doing the test. The third report was one from back in 1994 that was based on an idea James had about "rapid testing".

2:00 PM - Test reporting is fundamental. Practice this even though management is not going to force you to do this. Jon talks about how he does reports at eBay and he does several variations.

2:05 PM - We are going to talk about tools with Robert Sabourin. The typical automation formula: purchase an expensive tool and hand it to a tester (forced on them); testers are forced into test-case-ism, which changes how the tester thinks, or it forces the company to hire automation testers. This can work if your product is easy and doesn't change much. Otherwise it takes a tremendous amount of automation effort, and it keeps breaking, and then you have to spend an increasingly larger amount of time fixing things.

2:15 PM - Snap out of that routine and use free tools. As James calls it, guerrilla test automation - quick and dirty tools that can help. If they don't help, then abandon them. Not all testers should be programmers; that's a bad idea. You need one tester who is a good programmer so they can build tools, but you want a wide variety of testers who are interested in a number of areas.
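To give a flavor of the "quick and dirty" idea, here's a minimal, purely hypothetical sketch of a guerrilla-style tool - a throwaway generator of awkward input strings for manual form testing. Nothing here is from the session itself; the function name and the particular probe strings are my own invention, and the point is that it's disposable:

```python
import random
import string

def quick_input_probes(n=5, seed=None):
    """Throwaway generator of awkward input strings for manual testing.

    In the guerrilla-automation spirit: a few minutes to write, and if it
    stops helping, delete it rather than maintain it.
    """
    rng = random.Random(seed)
    # A few classic troublemakers: empty, whitespace, boundary-ish values,
    # and strings that probe for injection/escaping issues.
    fixed = ["", " ", "0", "-1", "' OR 1=1 --", "<script>alert(1)</script>"]
    # Plus some random printable noise of varying length.
    noise = [
        "".join(rng.choices(string.printable.strip(), k=rng.randint(1, 40)))
        for _ in range(n)
    ]
    return fixed + noise

if __name__ == "__main__":
    for probe in quick_input_probes(seed=42):
        print(repr(probe))
```

Paste the output into a form field one probe at a time, watch what the product does, and throw the script away when you're done with it.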

2:30 PM - James is answering questions. Ajay asked one about safety language. We are breaking for root-beer floats. Cool.

2:51 PM - We are back to doing test reports with Thomas and his team "7 Up" I think they are called.

3:03 PM - Susan is up for team Coho talking about their test report.

3:09 PM - James is reviewing, talking about watching our team (TRON). We called out facts, we would get quiet, then we started discussing problems and struggles, and would then focus on specific deep issues. We have to have a foot in both worlds:

  • the mission, risks, or TCO, the overview.
  • focusing on the deep specific issues or the path you are on.  
Testers need to be able to bounce back and forth at certain intervals. These are two types of thinking that can be incompatible at certain points. We are done with test reports.

3:15 PM - Rob is up and he's wandering around telling a story about coming to RTI and talking to James about bringing a tool. Rob has a picture with 4 elements (a quick and dirty framework) showing his system under test. Rob is showing and explaining how his Ruby tool works. Rob has an example Ruby framework on his website Amibugshare.com.

3:30 PM - James said if enough people are interested in doing combination testing he will do a webinar on it. James and Rob are talking about the results the program returned which measured the time in milliseconds. Now Paul Holland is up helping to figure out what the data means and re-plotting it. One of Rob's undergraduates is a programmer and now his toolsmith who created this application to help us test eBay motors. It performs massive searches with lots of combinations.
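The core idea behind a tool like that - enumerate parameter combinations, run each search, and record the elapsed time in milliseconds - can be sketched in a few lines. This is only an illustration: the facet names and the `fake_search` stand-in are assumptions of mine, not details of Rob's actual tool, which drove real eBay Motors searches:

```python
import itertools
import time

# Hypothetical search facets; the real tool used far more combinations.
makes = ["Ford", "Toyota", "Honda"]
years = ["2010", "2011", "2012"]
conditions = ["new", "used"]

def fake_search(make, year, condition):
    """Stand-in for a real HTTP search call; returns a dummy hit count."""
    return len(make) + len(year) + len(condition)

# Run every combination and time each call in milliseconds.
results = []
for combo in itertools.product(makes, years, conditions):
    start = time.perf_counter()
    hits = fake_search(*combo)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    results.append((combo, hits, elapsed_ms))

print(f"ran {len(results)} combinations")  # 3 makes * 3 years * 2 conditions = 18
```

Plotting the per-combination timings afterward (as Paul did with the real data) is what turns the raw numbers into something a human can interpret.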

3:42 PM - We are going to install the tool and try it. We are using tools to supercharge the human mind, give us more abilities.

3:51 PM - James wants us to brainstorm what we can do with this tool, how we might or how we think it should be used. Then he's going to get another tool up and running.

4:10 PM - We are now reviewing the ideas and of course James comes to our group first. Dwayne and James are discussing the ideas we came up with on how to use Rob's tool. Andrew figured out there were quite a few bugs in the tool and a discussion over it began.

4:35 PM - Other groups are talking about running the tool at different times and on international sites. One of the other participants found a rather interesting bug with a tool designed to check all listings in categories or filters and compared them to non-categorized listings.

4:43 PM - James is showing an example of a blink test of eBay using a tool called IECapt. There are many oracles for this: a juxtaposition blink oracle, a zoom blink oracle, a speed blink oracle, and a noise zoom oracle.
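A juxtaposition blink oracle boils down to putting two captures side by side (or rapidly flipping between them) so that differences pop out to the eye. As a minimal sketch of the comparison step, assuming two same-sized captures represented here as tiny fake pixel grids - a real harness would load screenshot files produced by a capture tool such as IECapt:

```python
def diff_regions(a, b):
    """Return (row, col) coordinates where two equal-sized grids differ."""
    if len(a) != len(b) or any(len(ra) != len(rb) for ra, rb in zip(a, b)):
        raise ValueError("captures must be the same size to juxtapose")
    return [
        (r, c)
        for r, row in enumerate(a)
        for c, px in enumerate(row)
        if px != b[r][c]
    ]

# Two fake 3x3 "screenshots" that differ in exactly one spot.
before = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
after  = [[0, 0, 0], [0, 1, 1], [0, 0, 0]]
print(diff_regions(before, after))  # [(1, 2)]
```

The machine flags *where* the captures differ; the human blink test is what decides whether the difference matters.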

4:50 PM - James shows a mind map of eBay's hosts which he made through a web crawl. James says it's big like a city, and Jon says he prefers the term "death star".

4:55 PM - Done. We are going kayaking.


Photos from the event have been posted on Flickr: http://www.flickr.com/photos/83337442@N02/
Check out the other days:
