Thursday, June 4, 2009
"I wanted to update you on our first sprint, which is now over. I certainly learned a lot! We did not complete the user story we selected (simple login and logout of the GUI and XML systems, in both English and French) for several reasons, but largely because we underestimated how much time team members were spending on non-sprint activities. We also did not budget time for continued infrastructure additions, though those were hopefully a one-time cost that we won't incur again.
We had previously spent two sprints on engineering work (i.e., infrastructure), but we missed some items that surfaced as soon as the product development sprint started, such as teaching QA to run a build, figuring out how to help QA change the product configuration so that the application under test ran independently of the SDK, and waiting for a third party to translate all our error codes and GUI screens from English into French. Some assumptions made on the design side also left QA struggling to get the product started, such as which database we should be connecting to and what protocol we support, so we're trying not to make so many assumptions going forward.
The tasks that did not get completed in the last sprint were moved into the second sprint, which is currently underway. I think things are already going better this sprint, based on what we agreed to do to improve communication and timeliness. One of the biggest problems was the lack of team notification when code was checked in, so an email now goes out to everyone whenever a commit is made. Unfortunately, we still don't have automated builds or a central build server, but QA can manage with on-demand builds for now.
In the current sprint, I was determined to provide the test case purposes to design within one day so that they could start coding immediately. Right after our sprint planning meeting, QA got together to draft test purposes for our user stories and list questions about the RFC specifications. We then met with our product owner to get answers to our questions and ensure we were on the right track for test cases. We recorded the answers, then met with the developers to share the test cases and come to a consensus on internal error codes. Now design is starting to code, based on the test cases we drafted, while QA starts scripting the automated tests. Tomorrow we'll have a defined list of English error codes, which can then be sent off for translation for the next sprint, which will contain the French portion (we've learned that French has to run one sprint behind because of the translation delay).
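One simple way to cope with that one-sprint translation lag is a message catalog that falls back to English for any code whose French string hasn't come back yet. This is only a sketch of the idea; the codes, strings, and function name here are illustrative, not the team's actual ones:

```python
# Minimal message catalog with English fallback, so untranslated
# error codes still display something sensible while French runs
# a sprint behind. All codes and strings are made up for illustration.

MESSAGES = {
    "en": {
        "ERR_LOGIN_FAILED": "Login failed: invalid user name or password.",
        "ERR_SESSION_EXPIRED": "Your session has expired. Please log in again.",
    },
    "fr": {
        # Only strings already back from the translator appear here.
        "ERR_LOGIN_FAILED": "Echec de la connexion : nom d'utilisateur ou mot de passe invalide.",
    },
}

def message_for(code, lang="en"):
    """Return the message for an error code in the requested language,
    falling back to English (or the raw code) when no translation exists."""
    return MESSAGES.get(lang, {}).get(code, MESSAGES["en"].get(code, code))
```

With this shape, the GUI can be wired to French immediately; each sprint's batch of translations just fills in more of the "fr" table.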
One of the other major things that came out of the last retrospective was the lack of GUI usability design being done up front. It resulted in a non-production-worthy login/logout page and a lot of time at the end spent figuring out what it should look like. For the current sprint, we've agreed to get all the programmers, QA, and the product owner together, along with our usability specialist, to design the main headers, footers, menus, and main content page. We met today for two hours, and it went really well; we came out of the meeting with a hand-drawn mock-up reflecting our GUI consensus.
With respect to QA, the three of us are working as a single unit for now, so that we can all learn the automation tool together, design the data-driven test formatting in Excel, and understand the types of questions to raise (when, and to whom). By next sprint, I expect QA can work more independently, but we still plan to have frequent peer reviews to ensure we stay consistent and brainstorm a good set of cases. I've had feedback from design that our methods of communication are helping them bridge the requirements gap between waterfall and agile, so I think QA is starting to find its stride on the team.
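The data-driven format described above, rows in a spreadsheet mapping test inputs to expected outcomes, can be sketched as a tiny harness. Here the sheet is shown as CSV text (as it might be exported from Excel), and `check_login` is a hypothetical stub standing in for driving the real GUI/XML login:

```python
# Sketch of a data-driven test harness: each spreadsheet row is one
# login case with an expected outcome. Column names and the
# check_login stub are hypothetical, not the team's real setup.
import csv
import io

SHEET = """username,password,expected
alice,secret,success
alice,wrong,failure
,secret,failure
"""

def check_login(username, password):
    # Stand-in for the real login under test; it accepts one known
    # credential pair so the harness itself can be demonstrated.
    return "success" if (username, password) == ("alice", "secret") else "failure"

def run_sheet(sheet_text):
    """Run every row and pair it with a pass/fail verdict."""
    results = []
    for row in csv.DictReader(io.StringIO(sheet_text)):
        actual = check_login(row["username"], row["password"])
        results.append((row, actual == row["expected"]))
    return results
```

The appeal of this shape is that testers can add cases by adding rows, without touching the harness code at all.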
We still don't have a tool that generates load, but that's something I'll take on in-house with a designer once we've got the functional automation working smoothly. Already, I think the quality is better than what it would have been without Agile. It's really fascinating to see Agile working in such a green environment where they had never written anything down previously. I know we have a long way to go, and a lot of learning to do, but I think we're heading in the right direction.
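For anyone curious what a home-grown load generator might start from, here is a minimal thread-based sketch; `target_request` is a placeholder for a real request (e.g., an HTTP login) against the system under test, and the worker counts are arbitrary:

```python
# Toy load generator: N worker threads each issue M calls against a
# target function, and per-call timings are collected for reporting.
# target_request is a stub; in practice it would hit the real system.
import threading
import time

def target_request():
    time.sleep(0.001)  # stand-in for a round trip to the server

def run_load(workers=5, requests_per_worker=10):
    """Fire requests from several threads and return all call durations."""
    durations = []
    lock = threading.Lock()

    def worker():
        for _ in range(requests_per_worker):
            start = time.perf_counter()
            target_request()
            elapsed = time.perf_counter() - start
            with lock:
                durations.append(elapsed)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return durations
```

Even something this small gives average and worst-case latency under concurrency, which is often enough to start a conversation with design about performance goals.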
... end quote
Lisa Crispin and I have created a course on Agile Testing with the focus on how testers transition from a traditional testing team to testing on an agile team.
Over three days, we put theory into action through a variety of exercises. This course teaches testers how to fit into agile projects, contribute to the whole team and overcome common cultural and logistical obstacles in transitioning to an agile development process. It explains the values and principles that help testers adopt an agile testing mindset and how to accomplish traditional testing processes, such as defect tracking, metrics, audits, and conforming to quality models. Students will learn how to complete testing activities in short iterations, and how testers contribute on a daily basis during each iteration and release cycle. Through interactive exercises and group discussions, participants will discover good strategies for driving development with both executable and manual tests. The course is filled with real-life examples of the many ways agile testers add value.
I also offer this course privately to individual clients, so please contact me if you are interested. See my website for more details: www.janetgregory.ca/CoursesOffered.html