Within statistical programming, the quality control (QC) process is essential to ensure the accuracy of the datasets, tables, listings, and figures we create. Often, the approach involves independent programming of outputs by a production programmer and a QC programmer working from the same specifications. Once both have programmed independently, the results are compared to ensure that there are no discrepancies.
One of the tools that forms an integral part of this process is a tracker document to record the activities undertaken by both the production and the QC programmers. Tables, figures, listings and datasets usually go through multiple levels of quality control – and each of these QC stages may involve updates to the production programs. Different organisations take different approaches to storing this information, but often the tracking is detailed within an Excel spreadsheet. The contents of the tracker are then usually reviewed manually to ensure consistency.
In this blog, I will share how my colleagues and I created a simple yet very effective solution that automates the process of checking the QC tracker itself and improves efficiency and accuracy.
A QC tracker typically records the QC comment history, run dates and QC pass dates, as well as essential information about the dataset or output itself. However, the challenge we face in programming teams is the potential for human error when entering this tracking information. Further, the tracker may reflect errors and inconsistencies inadvertently introduced within the QC process itself.
Here are some examples of the kind of issues that we can sometimes observe within a QC tracker:
Incorrectly spelt names or dates
Inaccurate run dates
These kinds of discrepancies can occur if the QC programmer has entered the run date incorrectly or has rerun the program without noting the update.
QC process errors
Some common examples of QC process mistakes that can show up within a QC tracker include:
When the production program has been updated, but the QC programmer hasn’t subsequently rerun their program.
When an output has been run before the dataset used to create it, meaning any information in the output may be out of date.
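As an illustration, the second of these checks, confirming that a dataset was run before each output that depends on it, can be sketched in a few lines of BASH by comparing log file modification times. The log locations and names below are hypothetical, not the tool's actual conventions:

```shell
#!/bin/bash
# Sketch: flag outputs that were run before the dataset they depend on,
# by comparing the modification times of their SAS log files.
# File locations and names used here are illustrative only.

ran_in_order() {
    # True (exit 0) if the dataset log is older than the output log.
    local dataset_log="$1" output_log="$2"
    [ "$dataset_log" -ot "$output_log" ]
}

check_outputs() {
    # Compare one dataset log against each output log that uses it.
    local dataset_log="$1"; shift
    for output_log in "$@"; do
        if ran_in_order "$dataset_log" "$output_log"; then
            echo "OK:    $output_log"
        else
            echo "STALE: $output_log was run before $dataset_log"
        fi
    done
}
```

Any output flagged as STALE would need to be rerun after its source dataset, which is exactly the situation a tracker check should surface.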
All of the above issues are tricky to identify, as well as time-consuming when there are a lot of outputs to work through. At Veramed, we are committed to continual improvement of our processes and delivering the highest possible quality of work to our customers in the most efficient manner. We realised that we could tackle this routine but troublesome problem by devising a simple but effective QC tracker checker tool that would allow us to identify and address the above errors more comprehensively and quickly.
The tool we created is a SAS program wrapped in a BASH shell script. BASH is a Unix shell and command language that typically runs in a text window where we can type commands to trigger a specific action. This simple, intuitive interface makes the tool quick and easy for programmers to use and removes the need to manually update code for each QC run. Importantly, it remembers the essential study information from the last run, eliminating the need to duplicate data entry and risk errors. It also incorporates QC log files and production output files to cross-reference the important activity dates and ensure their accuracy.
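To give a flavour of the date cross-referencing, one simple approach is to compare the run date recorded in the tracker against the last-modified date of the corresponding log file. The sketch below is an illustration under assumptions (a YYYY-MM-DD date format and GNU `date` with its `-r` option), not the tool's actual implementation:

```shell
#!/bin/bash
# Sketch: check a tracker-recorded run date against a log file's
# last-modified date. The YYYY-MM-DD format and GNU date's -r option
# are assumptions made for this illustration.

log_run_date() {
    # Last-modified date of a file, formatted as YYYY-MM-DD.
    date -r "$1" +%F
}

tracker_date_matches() {
    # True (exit 0) if the tracker's recorded date agrees with the
    # date the log file was last written.
    local tracker_date="$1" log_file="$2"
    [ "$tracker_date" = "$(log_run_date "$log_file")" ]
}
```

A mismatch here would correspond to the "inaccurate run dates" category above: either the date was mistyped in the tracker, or the program was rerun without the tracker being updated.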
With the click of a button, the tool then produces a Microsoft Excel spreadsheet, which details the identified errors for the team to follow up and correct. This spreadsheet also contains summary tables that allow the programming team to track study progress. The latter is an especially helpful feature for studies that require a large number of outputs and datasets.
At Veramed, applying this tool has further enhanced the integrity of our QC process, and it has been used within several trial projects. It is an excellent example of the type of practical innovation that we encourage within the team. As programmers, by creating even incremental improvements in the way that we work, we can help to streamline and accelerate the overall development process. In November 2019, I was fortunate to have the opportunity to present the tool during the PHUSE conference and received positive feedback from delegates who attended the talk.
To download a copy of the detailed slides presented at the PHUSE conference, please enter your details below.