How I Write a Conference Report

A while ago I was able to attend the KWSQA Targeting Quality conference in Waterloo, Ontario.  After a great time learning, connecting with old friends, and meeting new ones, I eventually had to go back to the office.  When I got there I was expected to produce an experience report to justify the trip, as I am sure many of you have had to do in the past.

For the purposes of this post, I would like to share a couple tricks I employed to produce what I would consider a decent experience report.

Focus on Value

In my case, the company covered the bill for the trip and the conference.  Though the trip wasn’t a huge expense, I wanted to make sure it was a worthwhile investment.  I learned lots of things at the conference, and it is important to make sure those tidbits of knowledge are included in the report to show what was learned that could be leveraged for the company.

Focus on Solutions

I have seen quite a few reports from others (I have even been guilty of it in the past) that just rewrite the class descriptions in report form and call it good (e.g. I learned x in class a and y in class b).  This covers my first point a bit, but just listing random facts and topics that you learned about doesn’t show the application of that knowledge.  Based on all of the knowledge you gain, look for ways to apply it to problems currently facing your company.

Implement Solutions/Value

Once you have this knowledge and some way to apply it, the next step I would consider in writing a great experience report is to actually implement the ideas in the report.  If the experience report is just some document that gets filed into the nether regions of the company storage banks, where is the value in that?

Allow the lessons learned to extend beyond the conference and off the page of the experience report, and actually work to implement what you learned.  I was able to do so with what I learned at KWSQA, and doing so made the experience (and the experience report) much more valuable.

Below is the text of my experience report from KWSQA (sanitized a bit for safety reasons) as an example of these suggestions in practice:

Targeting Quality 2012 Conference Attendance Report

-Wade Wachs-

After spending a couple days at the Targeting Quality 2012 conference sponsored by KWSQA, I came back to the office with a few items that I feel would benefit the culture and outcomes of the development and QA teams in our company.  Those items are listed and explained below.

Reduce/Remove any Us vs. Them culture

This is one of the biggest actionable items I came away with from the conference.  It applies along several dimensions, and our company is already taking action on some of them.

Dev vs. QA

I think we have managed a pretty decent relationship between the development team and the testers in our company, but we have consistently thought of them as two separate teams.  One of the big things that I heard at the conference was the idea of considering the testers as part of the development team.

Paul Carvalho talked about this in terms of Scrum recognizing only three roles: Product Owner, Scrum Master, and Product Developer.  That is not to say that only those who write code count as developers, but that all members of the team who are not managing or defining the requirements should be working to build a quality product.  I had several conversations with Paul and others suggesting that a cultural shift to include the testing role in the team of developers could have a significant impact by tightening the feedback loop between code creation and testing.

We have already taken significant steps in the last couple of weeks toward the goal of better integrating the code writers and testers.  Conversations are in the works to continue this integration further.

Office 1 vs. Office 2

Selena Delsie made a comment that I really liked: having a small team that practices agile inside a larger, more waterfall organization is typical, but greater benefits can be realized if the whole organization works together in a more agile manner.  This really hit home for me, as I have felt that Office 1 has been going more and more agile while Office 2 is still struggling to understand how we do things.  I wrote in my notebook during Selena’s session, “The WHOLE company needs to BE agile, not just development DO agile.”

After conversations with an internal employee last week, I think we are taking some good steps in this direction with the inception of monthly blackout dates and taking the time to meet together as a company and discuss what we are all doing.  I am cautiously optimistic that these meetings could have a significant positive impact on the quality of the software we are producing as we shorten the feedback loops between those of us producing the software and those teaching how to use it.

The Software Testing Ice Cream Cone

In his tutorial about pitfalls in agile organizations, Paul Carvalho talked about the balance between manual testing and automated testing.  Based on concepts from Brian Marick (one of the Agile Manifesto signatories) and a couple of others, there needs to be a push to have manual testers doing business-facing testing that critiques the product, spending as little time as possible on base functionality and regression checking.  The testing effort can be drawn as a pyramid, with unit tests at the bottom, integration and then functional tests above that, and manual exploratory testing depicted as a cloud on top of the pyramid, supported by the three layers below.

However, in many organizations (ours included) the actual testing effort is an inverted pyramid: very little automated unit and integration testing, a little automated functional testing, and lots of cloud-shaped manual testing, which ends up looking like an ice cream cone.  I have already talked with Steve about turning that ice cream cone right side up by adding some additional effort in unit testing and better-supported automation.  This goal is in the process of being implemented via the talent reviews with QA and developers.
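To make the pyramid idea a little more concrete, below is a minimal, hypothetical sketch of the kind of fast automated unit checks that belong at the base of the pyramid; the apply_discount function and its tests are invented for illustration and are not taken from our codebase.  Integration and functional checks would sit above these, with exploratory testing remaining a human activity at the top.

import unittest


def apply_discount(price, percent):
    # Hypothetical example function: return the price after a percentage discount.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)


class ApplyDiscountUnitTests(unittest.TestCase):
    # Base of the pyramid: many small, fast checks like these run on every build.

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 15), 85.00)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)


if __name__ == "__main__":
    unittest.main()

Checks like these are cheap to run and catch regressions early, which is what frees the manual testers to spend their time critiquing the product instead of re-verifying base functionality.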

Effective Metrics

There was a great keynote from Paul Holland where he gave a few techniques for effectively providing metrics to management while maintaining the integrity of the narrative.  The concepts that I would like to investigate further and implement are:

– Provide metrics along with a narrative that tells the full story behind them.  The narrative can call out potential pitfalls or dangerous conclusions from the metrics, as well as qualitative information not captured in the numbers (a small sketch of this pairing follows the list below).

– Use a dashboard to provide a better picture of testing activities.

– Make more effective use of sticky note boards for managing testing effort and displaying the work that is being done.
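As a rough illustration of the first point above, here is a small, hypothetical sketch of what pairing a metric with its narrative might look like; the numbers, field names, and wording are invented for the example and are not actual figures from our testing effort.

# Hypothetical weekly testing summary: a number plus the story behind it.
weekly_summary = {
    "week": "2012-06-04",
    "open_defects": 42,
    "narrative": (
        "Open defects rose from 35 to 42, but five of the new items are "
        "suspected duplicates awaiting triage, so the underlying trend is flat."
    ),
}


def print_dashboard(summary):
    # Present the metric and its context together so the number is never read alone.
    print("Week:", summary["week"])
    print("Open defects:", summary["open_defects"])
    print("Context:", summary["narrative"])


print_dashboard(weekly_summary)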

I was also party to a couple of side discussions on this topic at the conference.  I hope these conversations will be helpful as we move forward with our goal to identify useful performance measures and provide that information up the management chain.

All in all, it was a very enjoyable conference.  The intangibles of the conference were many, but they include an increased passion for continuing to push forward, a feeling that the company values me as an employee enough to invest the funds to send me to training, and an increased connection to the testing community, furthering relationships that will be sustaining in the future.  I truly appreciate the investment and would like to attend further conferences in the future as we get a better handle on this current list of improvements.
