Validation of calibration software, as required by ISO 17025 for example, is a subject that many prefer not to talk about. Often there is uncertainty about the following: Which software actually has to be validated? And if so, who should take care of it? Which requirements must the validation satisfy? How can it be carried out efficiently, and how is it documented? The following post explains the background and gives a recommendation for implementation in five steps.
In a calibration laboratory, software is used for tasks ranging from supporting the evaluation of results to fully automated calibration. Regardless of the degree of automation, validation always refers to the entire process into which the software is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration process fulfils its purpose and achieves all its intended goals; in other words, does it provide the required functionality with sufficient accuracy?
Before carrying out validation tests, you should be aware of two basic principles of software testing:
Full testing isn’t possible.
Testing is always influenced by the environment.
The first principle states that testing all possible inputs and configurations of an application is impossible because of the sheer number of possible combinations. Depending on the application, the user must therefore always decide which functions, configurations and quality characteristics have priority and which are not relevant to them.
Which decision is made often depends on the second point: the operating environment of the software. In practice, every application brings different requirements and priorities for the use of the software. In addition, there are customer-specific adjustments to the software, for example concerning the contents of the certificate. The individual conditions in the laboratory environment, with its wide range of instruments, also create variance. The wide variety of requirement perspectives and the almost endless complexity of software configurations in customer-specific application areas therefore make it impossible for a manufacturer to test for all the needs of a specific customer.
For the reasons outlined above, validation is therefore the responsibility of the user. To make this process as efficient as possible, a procedure along the lines of the following five points is recommended (a sketch of how such a comparison could be automated follows the list):
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once per year, but at the latest after any software update, these test sets should be entered into the software.
The resulting certificates can then be compared with those from the previous version.
For an initial validation, a cross-check, e.g. using MS Excel, can be performed.
The validation evidence should be documented and archived.
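To illustrate how such a test-set comparison might be automated, here is a minimal sketch in Python. All names and values in it (TestPoint, expected_error, the tolerance, the example readings) are assumptions made for illustration; it is not part of any WIKA software, just one way of re-entering a defined test set and checking the newly calculated results against those archived from the previous software version.

```python
"""Minimal sketch: re-run a calibration test set and compare against archived results.

All names and numbers below are illustrative assumptions, not a real
calibration software interface.
"""
from dataclasses import dataclass
from datetime import date
import json


@dataclass
class TestPoint:
    nominal: float          # nominal pressure, e.g. in bar (assumed unit)
    reference: float        # reading of the reference standard
    dut_reading: float      # reading of the device under test
    expected_error: float   # error reported by the previous software version


def measurement_error(point: TestPoint) -> float:
    """Error of indication: DUT reading minus reference value."""
    return point.dut_reading - point.reference


def validate(test_set: list[TestPoint], tolerance: float = 1e-6) -> dict:
    """Recalculate each test point and compare with the archived result."""
    results = []
    for p in test_set:
        error = measurement_error(p)
        deviation = abs(error - p.expected_error)
        results.append({
            "nominal": p.nominal,
            "error": error,
            "expected_error": p.expected_error,
            "pass": deviation <= tolerance,
        })
    return {
        "date": date.today().isoformat(),
        "all_passed": all(r["pass"] for r in results),
        "points": results,
    }


if __name__ == "__main__":
    # Hypothetical test set for a 0 ... 10 bar pressure gauge
    test_set = [
        TestPoint(nominal=0.0, reference=0.000, dut_reading=0.002, expected_error=0.002),
        TestPoint(nominal=5.0, reference=5.001, dut_reading=5.004, expected_error=0.003),
        TestPoint(nominal=10.0, reference=9.999, dut_reading=10.006, expected_error=0.007),
    ]
    record = validate(test_set)
    # Archive the record as validation evidence alongside the certificates
    with open("validation_record.json", "w") as fh:
        json.dump(record, fh, indent=2)
    print("Validation passed" if record["all_passed"] else "Validation FAILED")
```

The written record produced this way can be archived together with the compared certificates, which covers the documentation step of the procedure above.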
WIKA offers PDF documentation of the calculations performed in the software.
Note
For further information on our calibration software and calibration laboratories, go to the WIKA website.
