Sunday, November 28, 2010

Social Networking for Test Professionals follow-up

In the last entry I formulated my ideas on what social networking means to us test professionals. Soon after I read a column by Thomas Friedman on the professional use of social networks. Here is the link, and a relevant excerpt:

"’s knowledge industries are all being built on social networks that enable open collaboration, the free sharing of ideas and the formation of productive relationships — both within companies and around the globe. The logic is that all of us are smarter than one of us, and the unique feature of today’s flat world is that you can actually tap the brains and skills of all of us, or at least more people in more places. Companies and countries that enable that will thrive more than those that don’t."

Clearly Tom reads my blog. :)  

Then I stumbled on a blog post by Rick Nelson in Test & Measurement World magazine (here) quoting that same Thomas Friedman column. Rick also references an interview that EDN managing editor for news Suzanne Deffree did with Deirdre Walsh, social-media and community manager at National Instruments.

Walsh listed five reasons why engineers should use social networks (paraphrasing): 
1. getting technical support from peers
2. staying abreast of the industry for career reasons
3. being heard, by your peers and your vendors
4. networking professionally with like-minded peers
5. becoming "famous" in a free-form venue not restricted by your job description

So look around, change your perspective, find your voice, get involved. Here are a couple of forums I frequent, and a couple where I am a member and occasionally drop by:
groups within LinkedIn:
Software Testing & Quality Assurance
Software Testing and Quality Assurance (yes, they are different)
Bug Free: Discussions in Software Testing
DFT Experts  (Design For Test - a hardware thing)
First Fault Problem Solving

Wednesday, November 17, 2010

Social Networking and the Test Industry

Recently my friend and colleague Paul, who happens to be the VP of Technology at a major aerospace and telecommunications company, sent me the following query.

 "I would be interested in getting your views on the use & impact of social networking on software & hardware testing."

That is an interesting question. Three areas come to mind. 

1. Social networks devoted to the subject of testing.

Test is generally not taught in school, and doesn't receive the same respect in a company as design or development. Participation in a social network focused on the subject gives testers the opportunity to ask questions or discuss ideas, keep up with the state of the industry, or just receive encouragement in the importance of the discipline. It can be a very useful tool for professional development. I have recently started participating in the LinkedIn group Software Testing & Quality Assurance, and a really excellent independent organization called The Software Testing Club.

2. A good bug tracking tool has built-in social-networking-style features (IMHO).

Any serious test effort, in hardware or software, should involve a tool to document and track bugs or defects. The record would include the details of the problem and the test that uncovered it, and would track the state of the bug as it progresses through its lifecycle toward fix and verification - or whatever its final disposition might be. Tracking packages can also have some very useful extra features that fall squarely into the domain of a social network: the ability to host a threaded discussion attached to a defect, to notify interested subscribers of updates in the discussion or state, or to let users subscribe to new filings in a particular area. The tool is assumed to be running internal to a company, perhaps on its intranet.
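Those tracker features map onto a small data model. Here is a minimal sketch in Python - the class and method names are invented for illustration, not any real tracker's API - showing a threaded discussion attached to a defect, plus subscribers who are notified on comments and state changes:

```python
# Toy sketch of the social features a bug tracker can offer:
# a discussion attached to a defect, and watchers who are
# notified of comments and state changes. All names invented.

class Bug:
    def __init__(self, bug_id, summary):
        self.bug_id = bug_id
        self.summary = summary
        self.state = "open"          # open -> fixed -> verified, etc.
        self.comments = []           # discussion thread, in order
        self.watchers = set()        # subscribers to this defect
        self.notifications = []      # stand-in for outgoing e-mail

    def subscribe(self, user):
        self.watchers.add(user)

    def _notify(self, message):
        for user in sorted(self.watchers):
            self.notifications.append((user, message))

    def comment(self, author, text):
        self.comments.append((author, text))
        self._notify(f"{author} commented on bug {self.bug_id}: {text}")

    def set_state(self, new_state):
        old, self.state = self.state, new_state
        self._notify(f"bug {self.bug_id}: {old} -> {new_state}")

bug = Bug(101, "crash when importing empty file")
bug.subscribe("alice")   # the tester who filed it
bug.subscribe("bob")     # the developer assigned
bug.comment("bob", "reproduced; fix in progress")
bug.set_state("fixed")
```

A real tracker would send e-mail or feed updates rather than appending to a list, but the subscribe/notify shape is exactly the social-network part.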

3. Crowdsourcing.

There is a relatively new type of testing going on today that involves a large group of testers, often geographically dispersed, working independently, but organized and managed over a social network. This is somewhat reminiscent of the way open source software development works. A prime example is a company called uTest, which is worth checking out. They have more than 10,000 people signed up - non-employees, with varied levels of education and experience - who have an interest in participating in software testing. A company will contract uTest to have its software exercised, and uTest will assign a group (50? 100? 200?) to the task. Testers are compensated based on unique bugs filed or test plans supplied. This has been called crowdsourcing.
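The pay-per-unique-bug model is easy to sketch. Assuming - my simplification, not uTest's documented rules - that a duplicate report is credited to whoever filed it first:

```python
# Toy payout model for crowdsourced testing: testers are credited
# only for bugs they were first to file. Data is invented.

def credit_unique_bugs(reports):
    """reports: list of (tester, bug_signature) in filing order.
    Returns {tester: count of bugs they filed first}."""
    seen = set()
    credit = {}
    for tester, signature in reports:
        if signature not in seen:
            seen.add(signature)
            credit[tester] = credit.get(tester, 0) + 1
    return credit

reports = [
    ("ana", "crash-on-login"),
    ("raj", "crash-on-login"),    # duplicate: no credit
    ("raj", "typo-in-menu"),
    ("mei", "data-loss-on-save"),
]
print(credit_unique_bugs(reports))  # {'ana': 1, 'raj': 1, 'mei': 1}
```

The interesting (and hard) part in practice is deciding when two reports are really the same bug - the signature here hand-waves that away.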

The Software Testing Club (mentioned in point 1) is building a crowdsourcing-type organization called (strangely enough) The Crowd. Expect that to be more focused, perhaps closer to a contractor pool.

Due to their large and actively interested user bases, Microsoft and Google achieve a fair semblance of crowdsourcing with a beta test of a new product or version.

It's a little harder to see the applicability of crowdsourcing to hardware test. I imagine you could run a test effort against a released product, or send out samples to a large group.

So those are my initial thoughts on the subject. I would love to hear yours. Comments welcome.

Thursday, October 7, 2010

Should you hire from your customers?

In the last two posts I discussed how you should look at your customers to help determine the necessary technical qualifications of your testers, as well as what level of industry experience would be useful. This raised the question of whether you should ever hire from your customers. My answer would be yes - but only with the consent of their management.

Full disclosure: Probably half my group, including myself and my boss, at one time worked for electronics or semiconductor companies, using or supporting internally the kinds of tools that we now develop and test. Certain tools in this industry are very technical, with very specific usage, and industry experience is very useful in developing, testing, and supporting them. For this reason cross-migration between vendor and customer is a common occurrence. We have had particular success in recent years hiring field application engineers from our customers' ranks.

But be warned! You do not want to get into the situation where your customer thinks that you are recruiting from their fold. You risk losing that customer, potential legal action, or worse - a negative reputation. So if you are considering a candidate who is actively employed by one of your customers, your best approach is to clear it with their management first (with the candidate's consent, of course). Companies can be surprisingly open to this for a few reasons:
  • They expect turnover.
  • They may recognize that this employee has career aspirations they cannot satisfy.
  • And they may expect significantly improved support if they have someone on the inside (and they'd be right!).

So, you have successfully vetted the technical qualifications of your candidate. Great! But don't forget that you are hiring for a QE role. Do they have the right temperament for test? Do they have any background in it? If your candidate otherwise shows real potential, go ahead and hire them, but provide formal training in software test once they start.

Monday, September 13, 2010


In the previous post I discussed how you should consider the technical skill level and domain knowledge of your customers when considering prospective testers for your software tools. I would like to add a couple of clarifications.
  1. This doesn't mean that all your testers have to fit the same mold. On the contrary, I am a fan of diversity on multiple levels. Having a team from diverse backgrounds tends to increase the size of your group's technical arsenal.
  2. That discussion totally ignored skill and experience in the discipline of software test. An experienced user of specific tools is not necessarily a good tester, and could likely benefit from education in the subject. Similarly, a tester with no industry-specific experience - but who has been working in the field of software test for a decade or two - can bring a lot to the table regardless, and tends to raise the professionalism of the group.
  3. Can someone learn either the domain knowledge or the software test skills? Abso-freakin-lutely. My pet peeve is when a company will only consider someone for a function if they are currently performing that exact function in their current job. I believe one of the most important considerations should be the demonstrated ability for a person to learn and produce.
  4. What if your tool has multiple levels of users? Perhaps you provide enterprise software, where administrators set it up and keep it running, analysts customize the tools to the company's needs, and the typical user of the front end belongs to an industry-specific demographic, whether engineers, accountants, or clerical staff. In a case like that you need to determine which skills should be in every member's toolbox, and which merely need to be represented in the group, or at least accessible to it. For example, you may only need one Oracle DBA, while everyone in the group knows how to drop and add a user in order to reset the test data. Or everyone might be familiar with the technical lingo and basic use flows, while a couple have enough experience to develop datasets and map out more complex usage scenarios.
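That split between skills every member needs and skills that only need to be represented somewhere in the group can be written down as a simple coverage check - the team members and skill names below are made up for illustration:

```python
# Sketch: verify that every tester has the baseline skills, and that
# specialist skills are at least represented somewhere in the group.

def check_coverage(team, per_person, in_group):
    """team: {name: set of skills}.
    Returns (people missing baseline skills, specialist skills nobody has)."""
    missing_person = {name: per_person - skills
                      for name, skills in team.items()
                      if per_person - skills}
    covered = set().union(*team.values())
    missing_group = in_group - covered
    return missing_person, missing_group

team = {
    "dana": {"lingo", "basic-flows", "oracle-dba"},
    "eli":  {"lingo", "basic-flows", "dataset-design"},
    "fay":  {"lingo", "basic-flows"},
}
per_person = {"lingo", "basic-flows"}          # everyone needs these
in_group = {"oracle-dba", "dataset-design"}    # one person is enough
print(check_coverage(team, per_person, in_group))  # ({}, set())
```

Both return values empty means the team is covered: everyone has the common toolbox, and each specialist skill lives with at least one person.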

The general subject of what makes a good tester has been covered in many places, so I won't get into that here. You can find a useful discourse on the subject - and on most things test related - in Testing Computer Software, by Kaner, Falk, and Nguyen.

Saturday, August 28, 2010

How do you qualify a tester? Look at your customers.

In an earlier post I discussed a job listing for a QA Manager that was described as "perfect for a new grad." I questioned whether a new grad was qualified to manage anything, particularly QA. Let's step back from the management aspect, and consider what qualifies someone for a QA role. What should you look for in a prospective software test engineer for your company? My advice would be to look at your customers.

Sure, there are certain characteristics you want to look for in a candidate, as an employee of your company, a member of your team, and in particular a software tester. But what I want to focus on is the utility of comparing your prospective hire to your current customers based on two criteria: Technical skill level and domain knowledge.

Technical skill level. You want someone who is capable of not only becoming knowledgeable and facile with your particular products, but also of using them in all of the myriad ways that your customers will. Does the product run on both Windows and Unix/Linux? A candidate who is "most comfortable with Windows" may not be prepared for the command line, scripts, and system utilities that are common in a Unix house.

Domain knowledge. Face it, if your product under test is an immersive RPG (role playing game), you want a hard core gamer putting it through its paces. Is it a tool for musicians, or an accounting package? It's hard to see how you can test either without specialized knowledge, beyond what might be available in a spec or user documentation. But if we are talking about a lightweight user interface, casual game, or general consumer application, the main concern is the skill of the tester, rather than their background.

In my experience with engineering software it has been important to know the size and content of typical data sets; to understand, for example, that due to Moore's Law the "large" circuit testcase you built six or seven years ago is now a "medium" sized circuit. You want to be familiar with the other tools they will be using in conjunction with yours, and the file formats that implies. Engineers, particularly EEs, are very technical - and notorious hackers. Chances are they are loading the data into the tool via script, kicking off runs with cron jobs, integrating the tool into their environment using API programming, customizing interfaces, and parsing results with their own reporting utilities. If you support it, you need to test it. So you need testers who can think and work like your users.

The point is, you want to find the bugs before your customer does. And you never want to have to say, "Gee, how did they do that?"

Sunday, August 1, 2010

Google Calendar as a Scheduling Tool

In my never-ending quest for a better way to view schedules, I have found that Google Calendar has some really useful features. I was looking for something better than scribbling in a weekly planner, but less cumbersome and more available than MS Project. Outlook is great for daily scheduling but becomes too cluttered in a month overview.

In Google Calendar, tasks can be entered as multi-day events, and show up as clearly labeled colored bars in day, week, or (most useful) month view. (You can enter appointments, but a granularity of a day is most useful for scheduling across a release.) These tasks can be imported from a simple .csv file, output from MS Project or Excel.

One of the most powerful features of Google Calendar is that you can have events/tasks organized into multiple, color-coded calendars, which can be viewed individually or layered together. For example, you can create a calendar for each person you supervise or work closely with, for easier coordination. I would also recommend you create a calendar for your schedule tasks separate from your default calendar (where you might have appointments, holidays, etc.). You can create a calendar for the given release timeline. You might even create a personal calendar for items you wish to recall but don’t want showing up in a public calendar. Individual calendars can be modified, shared with other users, and deleted when no longer needed.

These individual calendars can now be viewed in any combination. Below is an example showing my schedule tasks for a given month overlaid with the tasks of two people I was supervising.

Here are the steps to import a calendar from an Excel task list, as you might get from Project. I assume there are columns at least for the task or milestone name, owner, start date and finish date.

1. Sign up for a Google/Gmail account if you don’t already have one. This will give you access to Google Apps.
2. In Excel, sort the tasks by owner.
3. Highlight all the rows for a given owner. Copy and paste them into a new worksheet.
4. In this worksheet, eliminate any columns besides task name, start date and finish date.
5. Add a header row with the following column names – Subject, Start Date, End Date
6. Make sure the dates are in the format  “mm/dd/yyyy” and save the file as a CSV; for convenience, use a name signifying the person and period covered.
7. Go to Google Calendar, and on the left bar, My Calendars, select Create. Use a name indicating the person and period.
8. Below that, Other Calendars, select Add, Import Calendar. Browse for the CSV file created above, and for the Calendar field choose the one you just created.
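Steps 2–6 can also be scripted rather than done by hand in Excel. A minimal sketch, using made-up tasks and the Subject / Start Date / End Date header the procedure above calls for:

```python
import csv

# Sketch: write one person's tasks as a Google Calendar import CSV
# (Subject, Start Date, End Date), dates in mm/dd/yyyy format.
# Task data and filename are invented for illustration.

tasks = [
    ("Write test plan", "11/01/2010", "11/05/2010"),
    ("Regression run",  "11/08/2010", "11/12/2010"),
]

with open("dana_nov2010.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Subject", "Start Date", "End Date"])
    for subject, start, end in tasks:
        writer.writerow([subject, start, end])
```

The resulting file is what you browse for in step 8.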

Note that a quirk (bug) in the system is that an event created from an import is not inclusive of the last day, whereas it is if you create it manually. This creates a one-day gap between tasks – no big deal.

Happy Scheduling!

Update: Here is how I export the information from Microsoft Project, which I then bring into Excel and reshuffle into the task list used in the import procedure above.

1. Viewing the project in MS Project, select Save As..., choose CSV (Comma delimited)(*.csv)
2. This brings up the Project Export Wizard. Next.
3. Create new or existing map? Select New map, Next.
4. Select the types of data you want to export. Accept defaults Tasks, Export includes headers, Text delimiter ','. Next.
5. Map Tasks Data. First Field, Select Name in the From column, Subject in the To.
Second Field, From field is Start, To field is Start Date.
Third Field, From field is Finish, To field is End Date.
Fourth field, From field is Resource_Names, To Resource_Names (anything).
6. Finish
Now you have a .csv file with the Excel task list to use for the procedure above.
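The Excel reshuffle (sorting by owner and splitting into one file per person) can likewise be scripted. A sketch that assumes the export produced the Subject, Start Date, End Date, and Resource_Names columns mapped above - the sample data here stands in for Project's actual output:

```python
import csv
from collections import defaultdict

# Sketch: split a Project-exported CSV into one Google Calendar
# import file per resource. Sample data and filenames are invented;
# column names match the export mapping in the steps above.

sample = """Subject,Start Date,End Date,Resource_Names
Write test plan,11/01/2010,11/05/2010,Dana
Regression run,11/08/2010,11/12/2010,Eli
Review results,11/15/2010,11/16/2010,Dana
"""
with open("project_export.csv", "w") as f:
    f.write(sample)          # stand-in for Project's Save As CSV

rows_by_owner = defaultdict(list)
with open("project_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        rows_by_owner[row["Resource_Names"]].append(row)

for owner, rows in rows_by_owner.items():
    with open(f"{owner.lower()}_tasks.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Subject", "Start Date", "End Date"])
        for r in rows:
            writer.writerow([r["Subject"], r["Start Date"], r["End Date"]])
```

Each per-person file then imports into its own calendar, exactly as in the earlier procedure.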

Friday, July 9, 2010

Wanted: QA Manager, No experience necessary ???

Sorry, I am not posting a job opening.

But I did see a posting like this on my High School alumni mailing list last year. "QA Manager; perfect for a new grad." As much as I appreciated the efforts of the fellow alumnus who paraphrased the information from his company's internal listing, I must say my initial reaction was dismay.

First, I may sound like an old fart, but is a new college grad really qualified to manage anything?
Second, you just have to question this company's commitment to quality.

I would like to speak to both of these points.
First, the new college grad as manager. This reminds me of a story my father-in-law told me. He worked for a number of years as a shipping clerk for a large pharmaceutical company. Every few years they would bring in a young engineer, fresh out of school, to manage the department. More often than not this young buck, anxious to prove himself, would reorganize the department and its processes - without consulting the current staff. They would then spend the next few months relearning all the reasons they did things the old way. You can imagine this did not foster an atmosphere of cooperation amongst the staff.

Many in the department were very experienced. My father-in-law has exceptional mechanical and organizational skills. But they did not have the automatic respect conferred with a college degree. You know this young engineer brought a useful set of skills and training to the table. But his inexperience led him to disregard a significant resource - the experience of his staff.

I try to think what I might do if I were that young engineer tapped to manage the department. What if I sat the staff down and explained that, since I was new to all this, I had decided to consult the best experts I could find: you, the people who have been doing it for years. Chances are good that approach would have yielded some useful efficiency improvements. Chances are better it would have created a significant morale boost, and hence a productivity improvement.

The thing is, would I have had the wherewithal to take that approach if I did not have the experience I have now?

Next post - what to look for in a Test Engineer.