Recently my friend and colleague Paul, who happens to be the VP of Technology at a major aerospace and telecommunications company, sent me the following query.
"I would be interested in getting your views on the use & impact of social networking on software & hardware testing."
That is an interesting question. Three areas come to mind.
1. Social networks devoted to the subject of testing.
Testing is generally not taught in school, and it doesn't receive the same respect within a company as design or development. Participation in a social network focused on the subject gives testers the opportunity to ask questions, discuss ideas, keep up with the state of the industry, or simply be encouraged about the importance of the discipline. It can be a very useful tool for professional development. I have recently started participating in the LinkedIn group Software Testing & Quality Assurance, and in an excellent independent organization called The Software Testing Club (www.softwaretestingclub.com).
2. A good bug tracking tool has built-in social-networking features (IMHO).
Any serious test effort, in hardware or software, should involve a tool to document and track bugs or defects, including the details of each problem and the test that uncovered it. The tool tracks the state of a bug as it progresses through its lifecycle toward fix and verification, or whatever its final disposition may be. Tracking packages can also include some very useful extra features that fall squarely in the domain of a social network: the ability to host threaded discussions attached to a defect, to notify interested subscribers of updates to the discussion or state, or to let users subscribe to new filings in a particular area. Such a tool is assumed to run internally at a company, perhaps on its intranet.
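The social-style features above can be sketched in a few lines. This is a minimal, hypothetical model (the class and state names are my own, not any real tracker's API): a defect carries a lifecycle state and threaded comments, and every subscriber is notified when either changes.

```python
from dataclasses import dataclass, field

# A simple forward-only lifecycle, standing in for whatever states a real tracker uses.
LIFECYCLE = ["New", "Open", "Fixed", "Verified", "Closed"]

@dataclass
class Bug:
    title: str
    state: str = "New"
    comments: list = field(default_factory=list)
    subscribers: list = field(default_factory=list)
    notifications: list = field(default_factory=list)  # stand-in for e-mail alerts

    def subscribe(self, who):
        self.subscribers.append(who)

    def _notify(self, event):
        # Fan the event out to everyone watching this defect.
        for who in self.subscribers:
            self.notifications.append((who, event))

    def comment(self, who, text):
        self.comments.append((who, text))
        self._notify(f"{who} commented: {text}")

    def advance(self, new_state):
        # The bug only moves forward through its lifecycle toward final disposition.
        if LIFECYCLE.index(new_state) <= LIFECYCLE.index(self.state):
            raise ValueError("state can only move forward")
        self.state = new_state
        self._notify(f"state changed to {new_state}")

bug = Bug("Crash on startup")
bug.subscribe("paul")
bug.comment("tester1", "Reproduced on build 42")
bug.advance("Open")
print(bug.state)               # Open
print(len(bug.notifications))  # 2: one for the comment, one for the state change
```

The point of the sketch is the coupling: filing, discussing, and resolving all flow through the same notification channel, which is exactly what makes a tracker feel like a social network.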
3. Crowdsourced testing, organized over a social network.
There is a relatively new type of testing going on today that involves a large group of testers, often geographically dispersed, working independently but organized and managed over a social network. This is somewhat reminiscent of the way open source software development works. A prime example is a company called uTest (www.utest.com) that is worth checking out. They have more than 10,000 non-employees signed up, with varied levels of education and experience, who have an interest in participating in software testing. A company will contract uTest to have its software exercised, and uTest will assign a group (50? 100? 200?) to the task. Testers are compensated based on unique bugs filed or test plans supplied. This model has been called crowdsourcing.
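The pay-per-unique-bug model can be made concrete with a small sketch. This is purely illustrative (the function, reward amount, and bug signatures are assumptions, not how any real vendor computes payouts): only the first tester to file a given defect is credited.

```python
def payouts(reports, reward_per_bug=10):
    """reports: list of (tester, bug_signature) tuples in filing order."""
    seen = set()
    earned = {}
    for tester, signature in reports:
        if signature in seen:
            continue  # duplicate of an earlier filing: earns nothing
        seen.add(signature)
        earned[tester] = earned.get(tester, 0) + reward_per_bug
    return earned

reports = [
    ("alice", "crash-on-login"),
    ("bob", "crash-on-login"),  # duplicate, bob is not paid for this one
    ("bob", "typo-in-menu"),
]
print(payouts(reports))  # {'alice': 10, 'bob': 10}
```

Rewarding only unique filings is what keeps a large, uncoordinated crowd from flooding the tracker with the same obvious bug.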
The Software Testing Club (mentioned in point 1) is building a crowdsourcing-type organization called (strangely enough) The Crowd. I expect that to be more focused, perhaps closer to a contractor pool.
Because of their large and actively interested user bases, Microsoft and Google achieve a fair semblance of crowdsourcing with a beta test of a new product or version.
It's a little harder to imagine applying crowdsourcing to hardware testing. I imagine you could run a test effort against a released product, or send samples out to a large group.
So those are my initial thoughts on the subject. I would love to hear yours. Comments welcome.