Testing in Australia has a problem. The Australian IT industry still views testing as a job that anyone can perform with minimal technical skills. A quick search on a leading recruitment site turns up a number of advertisements for Testers with requirements similar to the following:
– At least 3 years experience in a Tester or Test Analyst role with demonstrated experience in planning and executing functional, systems and regression tests.
– Excellent written and verbal communications skills will be a must for this role – the right candidates must be able to develop clear and concise test cases and test scripts
– Some previous experience using Test Director or Quality Center – any exposure to automated test tools (particularly QTP) would be a benefit
– Experience of various software testing approaches and SQL in oder to extract data for tests.
– A methodical nature and high levels of attention to detail.
(Extra points to those of you who noticed the spelling error in the advertisement: “oder” instead of “order”.)
This ad is typical of those here in Australia for testing positions. No programming skills required, no industry-specific knowledge, just X number of years in testing, good communication skills and some experience with a particular test tool vendor’s product.
I’m a big fan of analogies, so let’s imagine for a second that a car company uses non-technical testers to evaluate new cars under development. I can imagine a dialogue between a test driver and an engineer going something like this:
“The new car feels a bit funny at the front”
“What do you mean by funny?”
“Well, I turn the round thingy there …”
“oh you mean the steering wheel”
“yeah, the steering feel. I turn the steering feel to this way, but the car doesn’t turn as much as I thought it would.”
“It’s a wheel, not a feel.”
“oh sorry, wheel”
“Anyway, let me see if I understand you. When you turn the wheel, the front of the car doesn’t turn as much as you expect it to.”
By comparison, a test driver who understands how a car works can have a much more meaningful conversation with the engineer:
“It has mid-corner understeer after turn-in.”
“OK, we can solve that by increasing camber slightly or adjusting the rear springs.”
“Let’s try the spring adjustment, as changing camber will also affect tire wear.”
The technical tester is much more effective than the non-technical equivalent: they have comparable or superior knowledge to the engineer, simply with a different focus and specialist skills. Whilst my example is fictional, the exact same difference in conversation should be expected between a “Tester” and an SDE/T. What is an SDE/T? An SDE/T is one of the common, technical testing roles at Microsoft. Their testing careers page has the following description:
Software Design Engineer in Test
Tests and critiques software components and interfaces in more technical depth, writes test programs to assure quality, and develops test tools to increase effectiveness.
At Devtest, this is what we do as well, using the same tools, knowledge and experience as all the other “developers” on the project. The key difference, however, is that we have a different focus, live in different .NET namespaces (lately System.Windows.Automation) and have different goals. We aren’t any less capable at writing code than the “developers” on the project.
What we constantly have to battle is a prejudice that if you have “test” somewhere in your job title you are somehow a lesser being, not valued or required on most projects. People starting out in the industry see us as un-cool, and very few would consider testing as a career. This is just crazy: we use C#, Visual Studio, TFS and other cool tools, just like everyone else; we just have a different focus. The “tester” is dead; long live the SDE/T.
Bj Rollison has blogged about the perceived value of code coverage and the information, perceived or otherwise, that it provides. This post is definitely worth a read, and it has me thinking about an example I commonly use, drawn from Code Complete, when talking about the subject — but that is a topic for a future blog post.
The project that I am currently working on uses InfoPath forms. This was causing us a few hassles on the testing front, especially now that we are using WatiN extensively. So after a few failed attempts at using the new .NET 3.0 automation APIs, and not really wanting to learn MSAA, I decided to create a specialised port of WatiN to test InfoPath, and ItiN was born.
ItiN uses a hybrid approach, allowing manipulation of an InfoPath document directly using XPath and the XML DOM, or of the view using traditional WatiN automation.
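To make the XPath half of that concrete, here is a minimal sketch using only System.Xml. The class name, element names and the “iss” namespace URI are illustrative placeholders (not the real IssueTracking schema), and this is a guess at roughly what an XPath-based form edit looks like, not ItiN’s actual implementation:

```csharp
using System;
using System.Xml;

public class XPathFormEdit
{
    // Set a single form field in the underlying XML via XPath and return the result.
    public static string SetField(string xml, string xpath, string value)
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(xml);

        // XPath queries against a namespaced InfoPath document need a namespace
        // manager; the URI below is a made-up placeholder.
        XmlNamespaceManager ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("iss", "http://example.com/issue");

        XmlNode field = doc.SelectSingleNode(xpath, ns);
        if (field == null)
            throw new ArgumentException("No node matches " + xpath);
        field.InnerText = value;

        return doc.OuterXml;
    }

    public static void Main()
    {
        // A minimal InfoPath-style document with an empty title element.
        string form = "<iss:issue xmlns:iss=\"http://example.com/issue\"><iss:title/></iss:issue>";
        Console.WriteLine(SetField(form, "//iss:title", "Issue Title"));
    }
}
```

The view-based half goes through ordinary WatiN-style element classes instead, driving what the user actually sees rather than the underlying XML.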
The full set of HTML controls supported by WatiN is not required for ItiN, so only: buttons, check boxes, radio buttons, textboxes and select lists are supported.
The ItiN framework is open source on CodePlex here: http://www.codeplex.com/itin.
A sample script using the framework, which works against the 2003 sample IssueTrackerSimple form saved in c:\, is as follows:
public void IssueTrackingSample()
{
    // Open the test form
    string infopathFileName = @"c:\IssueTrackingSample.xml";
    InfopathTester formTester = new InfopathTester(infopathFileName);

    // Set the issue title field via XPath
    formTester.SetFormValue(@"//iss:title", "Issue Title");

    // Click the Send as E-mail button, which does nothing as I don't have email configured for InfoPath
    formTester.Button(ItiN.Find.ByValue("Send as E-mail")).ClickNoWait();

    // TODO: Implement code to verify target
    Assert.Inconclusive("TODO: Implement code to verify target");
}