This week we are sharing some of the post-webinar questions sent to us after our Data Integrity webinar. If you missed it, watch the recorded version. Remember: if you have any questions, it's not too late to ask! Simply email Paul Daniel.
Contemporaneous: This is a big and fancy word meaning that the data was recorded while the events of the record were actually occurring. In other words, data must be recorded at the time of the events themselves, not reconstructed afterward.
If a process has multiple steps over time, we should record each step as it happens.
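The principle above can be sketched in a few lines of code. This is a hypothetical illustration, not part of any real system: each step appends its own record at the moment it completes, with the timestamp captured then rather than backfilled later.

```python
from datetime import datetime, timezone

# Illustrative in-memory "record" of a multi-step process.
log = []

def record_step(step_name, result):
    """Append a contemporaneous record for one process step."""
    log.append({
        "step": step_name,
        "result": result,
        # The timestamp is taken at the time of the event itself.
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })

# Each step is recorded as it happens, in order.
record_step("weigh_sample", "10.02 g")
record_step("add_buffer", "50 mL")
```

The point is simply that the timestamp is generated by the system at event time; a process where operators enter times from memory at the end of a shift would fail the "contemporaneous" test.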
I chose PIC/S because it has a wider international scope and because it addresses computerized systems specifically; the other guidances do not. You should still give all of them a read, and you will notice that they each say essentially the same thing. Maybe you will find one of them written in a way that makes more sense to you than the others.
Part 11 is primarily focused on making sure that a system that uses electronic records does not have more risk than the equivalent system using paper records. Part 11 is not focused specifically on Data Integrity, though Data Integrity is part of it. As a result, you could easily have a system that is 100% compliant to Part 11 requirements but is storing electronic records that have major integrity problems.
Just as you can store a worthless item inside a very nice vault. It would be hard to list everything that must be done beyond Part 11 compliance to achieve Data Integrity.
In fact, I think it is of little use to think of Data Integrity as a separate set of regulations for compliance. If you are truly complying with the intent and letter of the predicate rules (Parts 210, 211, and 820 for Pharma and Medical Devices), then you are highly likely to have no Data Integrity issues.
I know that statement may be of little use, as a common corporate response to Data Integrity guidances is to institute a program to verify Data Integrity compliance for each computer system in the facility.
I think a better approach is more holistic: look at all the ways, processes, and teams (including training) with which we qualify or validate an electronic records system. This includes design, so that systems are commissioned, implemented, and validated in a way that complies with the spirit and letter of the predicate rules, and by extension with Part 11 and the Data Integrity guidances.
Sorry for the jargon! Some terms and acronyms are so often used in some parts of the world that it can be easy to forget that there might be someone new to the industry who is not familiar with the term.
SOP stands for "Standard Operating Procedure." I think the term has an origin in military usage, but I am not sure. A SOP is simply a document that describes the correct way to perform a procedure. In clinical research, the International Council for Harmonization (ICH) defines SOPs as "detailed, written instructions to achieve uniformity of the performance of a specific function."
Because it is written down, and version controlled, and approved, the assumption is then that every time the procedure is performed it will be performed the same way.
Approval for an SOP usually comes from multiple people, including a subject matter expert (usually the author), a responsible person for the department in question (often a supervisor), a high-level manager or executive, and finally the Quality Assurance (QA) group. The signed and approved document would then enter a process called document control, which protects the document from changes and distributes only correct copies for training and execution.
There are, and you can find the technical definition within the guidances referenced in the webinar slides.
I feel these definitions will tend to guide you back to the same answer. You need to understand your system to decide what a complete data record should look like, beyond the other definitions provided in ALCOA+. As an example, let's look at a monitoring system like viewLinc.
We want the timestamp (as in a sample alarm report), the temperature, and the identity of the source instrument. However, this particular record won't be complete, or even useful, without the location of the source instrument. Similarly, you will need to decide what data constitutes a complete record for your candidate system.
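A complete record for a monitoring system like the one described might be sketched as a simple data structure. The field names here are illustrative assumptions, not viewLinc's actual schema; the point is that the record is incomplete if any one of these fields is missing.

```python
from dataclasses import dataclass

# Hypothetical sketch of a "complete" monitoring record.
@dataclass(frozen=True)
class TemperatureRecord:
    timestamp: str        # when the reading was taken
    value_celsius: float  # the measured temperature
    instrument_id: str    # identity of the source instrument
    location: str         # where the instrument is installed

# Without the location, the reading cannot be tied to the
# controlled environment it is supposed to document.
rec = TemperatureRecord(
    timestamp="2020-01-15T08:30:00Z",
    value_celsius=4.2,
    instrument_id="DL-0042",
    location="Cold Room 3, Shelf B",
)
```

Making the record immutable (`frozen=True`) also echoes the broader Data Integrity expectation that original records are protected from alteration.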
The Internet of Things has allowed the development of wireless technology that ensures reliable, long-range signal strength in sensor-equipped data loggers. This webinar will show you how recent advances are changing the way controlled environments are monitored, especially in GxP-regulated applications.
Discover the new VaiNet Wireless Technology
Watch a webinar on the VaiNet Wireless Monitoring System and get the answers to these questions:
What kind of signal range is possible?
What wireless features ensure a secure signal?
What kind of wireless structure is effective?
What is the infrastructure investment?
How many data loggers can a single network access point support?
Watch the video