Saturday, July 28, 2012

Rapid Testing Intensive 2012: Day 5

The final day of the Rapid Testing Intensive #1:

RTI Attendees
The Group photo - taken on the 4th day (I'm in the 2nd row behind the #1)

9:00 AM - We all picked up our Certificates of Satisficity - basically saying we completed Rapid Testing Intensive #1

9:05 AM - Jon, as PM, starts us off with the RTI project status and some background about getting started at eBay: he was forced to do metrics he didn't like and got bogged down in a ton of meetings, but he also got the opportunity to train new hires, so he created a slide deck, which he is showing. He goes over highlights of the week with some screen images - the first bug filed was eBay Motors "experiencing technical difficulties".

9:12 AM - Mark (part of team TRON) kept getting the "experiencing technical difficulties" problem up until yesterday - it turned out to be tied to his account. The Wheel Center had about 39% of the bugs, the Lighting Center 35%, and the Tire Center 43%. Jon claims that the My Vehicles section had only one issue, but that's unlikely - which is exactly why metrics need a context and a story before they make sense.

Friday, July 27, 2012

Rapid Testing Intensive 2012: Day 4

9:02 AM - James starts us off on Day 4. We are going to look at the status of the test project in terms of what we need to accomplish and look for the holes. This is a typical rapid testing management maneuver. James is showing a graph and reiterates he doesn't believe in fake metrics. The pink bands represent off hours and the clear bands represent on hours. At the beginning there is a very big jump in the number of bugs and then it flattens out.

9:08 AM - Turns out Paul Holland is going slowly with checking the bugs - he claims it's because there are only 3 people checking the bugs while 100 are reporting them. James wants to go through the bugs and check the risk areas to get general impressions, so one of the activities together today or tomorrow might be to place risk measures on each bug. The graph may make it onto the front of the bug report. Dwayne says he isn't sure of the value of the graph, and James says he also isn't sure, but he doesn't need to know the value because he thinks it will provoke the interest of the reader - in this case eBay.

9:12 AM - In Rapid Testing we don't put up graphs just to create the impression we want to give, which is why James will filter out all duplicates and clean out the rest of the noise that could mislead readers. The graph could give a general impression of the industriousness of the group over the four days we were here. Keep that skepticism in mind before considering showing metrics like this. You should always have someone doing bug triage, otherwise you get a lot of noise in your reports and nothing gets corrected - no pressure. If you don't have a big team and can't dedicate someone to it, you can triage one bug at a time when you do your reporting.
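To make the scrubbing concrete, here's a minimal sketch of the idea - not the tooling James actually used - that drops duplicate reports (keyed on a normalized summary) and plots a cumulative count of unique bugs over time; the bug records below are hypothetical.

```python
# Minimal sketch: de-duplicate bug reports by normalized summary, then plot a
# cumulative count of unique bugs over time (hypothetical data, not RTI's).
from collections import OrderedDict
from datetime import datetime
import matplotlib.pyplot as plt

bugs = [  # (timestamp, summary) as exported from the bug tracker
    ("2012-07-24 09:45", "My Vehicles: technical difficulties banner"),
    ("2012-07-24 10:10", "my vehicles: technical difficulties banner"),  # duplicate
    ("2012-07-25 14:30", "Tire Center: wrong tire size suggested"),
    ("2012-07-26 11:05", "Wheel Center: image fails to load"),
]

# Keep only the first report for each normalized summary.
unique = OrderedDict()
for ts, summary in bugs:
    unique.setdefault(summary.strip().lower(), datetime.strptime(ts, "%Y-%m-%d %H:%M"))

times = sorted(unique.values())
plt.step(times, range(1, len(times) + 1), where="post")
plt.xlabel("Time")
plt.ylabel("Cumulative unique bugs")
plt.title("Unique bug reports over the week")
plt.show()
```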

9:22 AM - If you don't do bug triage you get a lot of complaints from developers and managers, even when they've never actually looked at the bugs. It takes time, but it's worth it. James says at Borland they could do 20 bugs an hour, and they determined that out of roughly 800 bugs about 400 were legit (at that rate, that's around 40 hours of triage). You've got to maintain the quality of the list. After the first triage you get a much better feedback loop from that information. After the scrubbing we will want to see what eBay's final decisions are about the bugs. How do they rate the bugs we've created, what do they think of the bugs we've reported, how many do they end up fixing? That's the big thing.

Thursday, July 26, 2012

Rapid Testing Intensive 2012: Day 3

9:02 AM - Jon kicks off the Intensive with his project meeting. Talking about the communication between us and his eBay team.

9:07 AM - James is talking about the upcoming assignments, which will be split between onsite and online. Each table will get a 30-minute test session. Later today we will be working / testing with tools, since both James and Rob have some experience with this.

9:10 AM - James is talking about sympathetic testing.

9:15 AM - James is answering a question about knowledge transfer for regression tests when someone leaves a company. He uses the analogy of driving a car: if someone comes in and wants to drive his car, he doesn't write down his driving procedures - he assumes that driver already has driving skills. A tester should be good at rapid learning and skilled in testing; besides, since most testers are untrained, much of the documentation is of poor quality anyway. In Rapid Testing you create concise documentation and take test notes - you can even take video - but skilled testers should be able to pick things up fast.

9:22 AM - Pay attention to the test coverage outline - maintain it. Maintain the risk coverage outline.

Wednesday, July 25, 2012

Rapid Testing Intensive 2012: Day 2

9:00 AM - Start of the day. James is talking about what we did yesterday; he's built a mind map. James and Jon are going over our schedule - we're going to try to stick to it better than we did yesterday.

9:23 AM - Jon is doing a debrief from yesterday / project check-in. Talking about how good the bugs are that were filed. Nice job.

9:30 AM - Reviewing the TCOs (test coverage outlines) from yesterday. Don't update them if it's going to cost too much. James is critical of one TCO that could be affected by visual bias - only testing the things you can see. This is why we use heuristic strategies.

9:40 AM - The Lighting and Wheel Centers for eBay Motors have been added to the scope alongside My Vehicles and the Tire Center. Next session: survey the functionality until 10:45 AM and modify your TCOs.

10:45 AM - Break time!

Rapid Testing Intensive 2012: Day 1 Recap

This is a recap of what we did on Day 1 of Rapid Testing Intensive #1 on Orcas Island, WA in 2012. I hope I remember everything; some of the information I took from Karen Johnson's internal micro-blog. We've got eBay's support: developers are online, ready to help with any bugs we find.

9:00 AM - Jon and James describe what we will be doing: how Jira works for reporting bugs, where to find project documentation, etc.

9:31 AM - The first assignment is a usability test of eBay Motors' My Vehicles section. Session testing in pairs. There is a script to follow; find and report bugs as well as fill out the script. Post it to Jira when done.
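Since Jira is the tracker for the week, here's a minimal sketch of filing a bug through Jira's REST API (v2) - the instance URL, project key, and credentials are hypothetical placeholders, not the ones we actually used:

```python
# Minimal sketch: create a bug in Jira via its REST API (v2).
# The URL, credentials, and project key below are hypothetical.
import requests

JIRA_URL = "https://example-jira.atlassian.net"
AUTH = ("tester@example.com", "api-token")

issue = {
    "fields": {
        "project": {"key": "RTI"},
        "summary": "My Vehicles: 'experiencing technical difficulties' banner",
        "description": "Steps taken, expected vs. actual results, session notes...",
        "issuetype": {"name": "Bug"},
    }
}

resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=issue, auth=AUTH)
resp.raise_for_status()
print("Created", resp.json()["key"])  # e.g. RTI-123 (hypothetical key)
```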

11:35 AM - Wrap up of the first assignment. Get all bugs in.

11:47 AM - Usability test debrief from Jon and James.

12:00 PM - Lunch time.

Wednesday, July 11, 2012

Enrolled in BBST Foundations

It's official. I'm enrolled in the BBST Foundations course for November through AST.

I joined AST (Association for Software Testing) with the end goal of enrolling in the BBST (Black Box Software Testing) Foundations course. I've read about the classes, seen a number of experts whom I trust recommend them, and heard good things in response to my post on SQA StackExchange. BBST.info, home to the BBST consulting practice of Cem Kaner and Rebecca Fiedler (the creators of the material), says this:
Too many testing courses emphasize a superficial knowledge of basic ideas. This makes things easy for novices and reassures some practitioners that they understand the field. However, it’s not deep enough to help students apply what they learn to their day-to-day work.
The BBST series will attempt to foster a deeper level of learning by giving students more opportunities to practice, discuss, and evaluate what they are learning. The specific learning objectives will vary from course to course (each course will describe its own learning objectives).

Saturday, July 7, 2012

15" MacBook Pro with Retina

I decided against waiting for Christmas and a week ago ordered a new 15" MacBook Pro with Retina display with 16GB of RAM and a 256GB Solid State Drive. What can I say other than I was impressed with the 2012 refresh.



When it does arrive (in about a month or so) I intend to write about my current 13" MacBook Pro's (last year's model) performance scores (Xbench, SSD benchmarks, etc.) alongside the new 15" MBP with Retina's scores.
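In the meantime, here's a rough sketch of the kind of quick sequential-write check I might run alongside Xbench - the file path and sizes are arbitrary choices for illustration:

```python
# Rough sketch: time a 256 MB sequential write and report MB/s.
# Path and sizes are arbitrary; this is not Xbench.
import os
import time

path = "/tmp/ssd_write_test.bin"
chunk = b"\0" * (8 * 1024 * 1024)  # 8 MB per write
total_mb = 256

start = time.time()
with open(path, "wb") as f:
    for _ in range(total_mb // 8):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())  # force the data to the drive before stopping the clock
elapsed = time.time() - start

print(f"Sequential write: {total_mb / elapsed:.1f} MB/s")
os.remove(path)
```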

In the past I've written a few times about Solid State Drive performance in Apple computers:

I do quite a bit of software testing on a number of different platforms - Mac, Windows - and I think this new computer will do nicely. 

Friday, July 6, 2012

Do Software Testers Need a College Education

I came across an old blog post on the uTest blog, "Do Software Testers Need a College Education?". The author says:
Depending on who you ask this question to, you’re likely to receive various degrees (pardon the pun) of yes and no. Or you may find many others who answer in a noncommittal way: “it depends.”
Having worked closely with thousands of software testers in the uTest community, I can attest to the fact that many testers do in fact have impressive resumes with regard to higher education (master’s degrees, PhD.s, etc.). However, there is also convincing evidence that demonstrates quite the opposite. So if you let the data speak for itself, what is one to believe? 
The article goes on to list a few explanations from the For, Against, and It Depends camps. I was trying to comment on the article but it wasn't working. If they turned off comments for this article, then why is the comment box still available? I must have found a bug!


Here is my input:

Wednesday, July 4, 2012

Anyone Can Test, right?

In this video from STAREAST, Rob Sabourin talks about his experience with "just anyone" testing.



Anyone can test, right?