Validation of calibration software, as required by ISO 17025 for instance, is a topic that people are reluctant to talk about. Almost always there is uncertainty about the following: Which software actually needs to be validated? Who should take care of it? Which requirements must the validation satisfy? How can it be carried out efficiently, and how should it be documented? The following post explains the background and gives a recommendation for implementation in five steps.
In a calibration laboratory, software can be used for anything from supporting the evaluation of results up to fully automated calibration. Regardless of the degree of automation, validation always refers to the complete process into which the software is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration process fulfils its purpose and achieves all its intended goals; in other words, does it provide the required functionality with sufficient accuracy?
In order to carry out validation tests, you should be aware of two basic principles of software testing:
Full testing is not possible.
Testing is always influenced by the environment.
The first principle states that testing all possible inputs and configurations of an application is impossible, simply because of the large number of combinations. Depending on the application, the user must therefore decide which functions, configurations and quality characteristics are to be prioritised and which are not relevant to them.
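A rough, purely illustrative calculation makes this clear (the numbers below are assumptions, not taken from any real product): even a modest configuration space quickly exceeds what can be tested exhaustively.

```python
# Hypothetical example: 10 configurable settings with 5 options each
parameters = 10
options_per_parameter = 5
combinations = options_per_parameter ** parameters
print(f"{combinations:,} possible configurations")  # 9,765,625
```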
Which decision is made often depends on the second point: the operating environment of the software. Depending on the application, there are in practice always different requirements and priorities for how the software is used. There are also customer-specific adjustments to the software, for example regarding the contents of the certificate. In addition, the individual conditions in the laboratory environment, with a wide range of instruments, generate variance. The wide variety of requirement perspectives and the sheer complexity of software configurations in customer-specific application areas therefore make it impossible for a manufacturer to test for all the needs of a particular customer.
Considering the above points, validation therefore falls to the user themselves. In order to make this process as efficient as possible, a procedure along the following five steps is recommended:
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once a year, but at the latest after every software update, these test sets should be entered into the software.
The resulting certificates should be compared with those from the previous version (see the sketch after this list).
In the case of a first validation, a cross-check, e.g. via MS Excel, can be carried out instead.
The validation evidence should be documented and archived.
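The comparison in steps 3 and 4 can be partly automated. Below is a minimal sketch in Python, assuming the calibration software can export its computed results (for example, the measurement error per calibration point) to a CSV file; the file names, the column names ("point", "error") and the tolerance value are assumptions for illustration, not features of any specific product.

```python
import csv

def load_errors(path):
    """Read a hypothetical results export: one row per calibration
    point with columns "point" and "error"."""
    with open(path, newline="") as f:
        return {row["point"]: float(row["error"]) for row in csv.DictReader(f)}

def compare(reference_path, current_path, tolerance=1e-6):
    """Compare the results of the current software version against a
    reference (the previous validated version, or values cross-checked
    independently, e.g. in MS Excel)."""
    reference = load_errors(reference_path)
    current = load_errors(current_path)
    deviations = []
    for point, ref_value in reference.items():
        new_value = current.get(point)
        if new_value is None or abs(new_value - ref_value) > tolerance:
            deviations.append((point, ref_value, new_value))
    return deviations

if __name__ == "__main__":
    # Hypothetical file names for one test set
    issues = compare("testset_pressure_reference.csv", "testset_pressure_current.csv")
    if issues:
        print("Deviations found:", issues)
    else:
        print("Test set reproduced within tolerance; archive the result as validation evidence.")
```

The tolerance should reflect the numerical resolution of the certificate, and the archived output of such a comparison can serve as part of the documented evidence mentioned in step 5.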
WIKA provides PDF documentation of the calculations carried out in the software.
Note
For further information on our calibration software and calibration laboratories, go to the WIKA website.