Contemporaneous: This is a big and fancy word that simply means the data was recorded while the events of the record were actually occurring. Records must be created at the time of the events themselves, not reconstructed afterward.
If a process has multiple steps over time, we should record each step as it happens.
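To make the idea concrete, here is a minimal sketch (in Python, with hypothetical step names invented for illustration) of contemporaneous recording: each step's timestamp is captured at the moment the step executes, rather than back-filled at the end of the process.

```python
from datetime import datetime, timezone

def run_process(steps):
    """Execute each step and record it the moment it happens."""
    log = []
    for name, action in steps:
        action()  # perform the step
        # Timestamp captured at execution time -- contemporaneous,
        # not reconstructed after the whole process is finished.
        log.append({
            "step": name,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        })
    return log

# Hypothetical two-step process, purely for illustration.
records = run_process([
    ("tare_balance", lambda: None),
    ("weigh_sample", lambda: None),
])
```

The point of the sketch is only the ordering: the record for each step exists before the next step begins.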
Why PIC/S and not the others, e.g., MHRA or FDA? What are the differences between them? Is anything missing in one that another covers?
I chose PIC/S because it has a wider international scope and because it addresses computerized systems specifically, which the other guidances do not. However, you should give all of them a read; you will notice that they each say essentially the same thing. You may find one of them written in a way that makes more sense to you than the others.
When a system is validated against 21 CFR Part 11, what else needs to be validated or verified to be in compliance with Data Integrity?
Part 11 is primarily focused on making sure that a system that uses electronic records does not have more risk than the equivalent system using paper records. Part 11 is not focused specifically on Data Integrity, though Data Integrity is part of it. As a result, you could easily have a system that is 100% compliant to Part 11 requirements but is storing electronic records that have major integrity problems.
It's just like storing a worthless item inside a very nice vault. It would be hard to list everything that must be done, in addition to Part 11 compliance, to achieve Data Integrity.
In fact, I think it is of little use to think of Data Integrity as a separate set of regulations for compliance. If you are truly complying with the intent and letter of the predicate rules (Parts 210, 211, and 820 for pharmaceuticals and medical devices), then you are highly likely to have no Data Integrity issues.
I know that statement may be of little use, as a common corporate response to Data Integrity guidances is to institute a program to verify Data Integrity compliance for each computer system in the facility.
I think a better approach is more holistic: look at all the ways, processes, and teams (including training) with which we qualify or validate an electronic records system. This includes design, so that systems are built, commissioned, implemented, and validated to comply with the spirit and letter of the predicate rules and, by extension, with Part 11 and the Data Integrity guidances.
Sorry for the jargon! Some terms and acronyms are so often used in some parts of the world that it can be easy to forget that there might be someone new to the industry who is not familiar with the term.
SOP stands for "Standard Operating Procedure." I think the term has an origin in military usage, but I am not sure. A SOP is simply a document that describes the correct way to perform a procedure. In clinical research, the International Council for Harmonization (ICH) defines SOPs as "detailed, written instructions to achieve uniformity of the performance of a specific function."
Because it is written down, version controlled, and approved, the assumption is that every time the procedure is performed, it will be performed the same way.
Approval for a SOP usually comes from multiple people, including a subject matter expert (usually the author), a responsible person for the department in question (often a supervisor), a high-level manager or executive, and finally the Quality Assurance (QA) group. The signed and approved document would then enter a process called document control, which protects the document from changes and distributes only correct copies for training and execution.
There are, and you can find the technical definition within the guidances referenced in the webinar slides.
I feel these definitions will tend to guide you back to the same answer. You need to understand your system to decide what a complete data record should look like, beyond the other definitions provided in ALCOA+. As an example, let's look at a monitoring system like viewLinc.
We want the timestamp (sample alarm report), and we want the temperature and the identity of the source instrument. However, this particular record won't be complete, or even useful, without knowing the location of the source instrument. Similarly, you will need to decide for your candidate system what data constitutes a complete record.
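As an illustration of that decision, here is a sketch in Python of what a complete monitoring record and a completeness check might look like. The field names are hypothetical, chosen to match the example above; this is not the actual viewLinc record format.

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class MonitoringRecord:
    """Hypothetical record for a temperature monitoring point."""
    timestamp: Optional[str]        # when the sample was taken
    temperature_c: Optional[float]  # the measured value
    instrument_id: Optional[str]    # identity of the source instrument
    location: Optional[str]         # where that instrument is installed

def is_complete(rec: MonitoringRecord) -> bool:
    """A record missing any of these fields is not a complete record."""
    return all(getattr(rec, f.name) is not None for f in fields(rec))

# A reading with no location: the temperature alone is not actionable.
partial = MonitoringRecord("2017-04-19T11:00:00Z", 4.2, "DL-01", None)

# The same reading with the location included is complete.
full = MonitoringRecord("2017-04-19T11:00:00Z", 4.2, "DL-01", "Cold Room 2")
```

The design choice worth noting is that "complete" is defined per system: for a different system, the required fields would differ, but the same pattern of an explicit completeness check applies.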
The Internet of Things has allowed the development of wireless technology that ensures reliable, long-range signal strength in sensor-equipped data loggers. This webinar will show you how recent advances are changing the way controlled environments are monitored, especially in GxP-regulated applications.
- A brief history of wireless technology
- Key differences in current technologies and functional comparisons of each (e.g., Wi-Fi, Bluetooth, and ZigBee)
- The fundamentals of wireless monitoring networks, including frequency, modulation, LoRa®, link budgets, and network topology
Join us for a webinar on Wireless Temperature & Humidity Monitoring:
Wednesday, April 19, 2017
11:00 AM EDT / 8:00 AM PDT
Who Should Attend:
- Facilities managers
- Calibration managers
- Validation managers
- Validation specialists
- Validation technicians
- Regulatory officers
- Distribution managers
- Warehouse managers
- Logistics managers
- Stability scientists
Paul Daniel is the Senior Regulatory Compliance Expert at Vaisala. He has worked in the GMP-regulated industries for over 20 years helping manufacturers apply good manufacturing practices in a wide range of qualification projects. His specialties include mapping, monitoring, and computerized systems.
At Vaisala, Paul oversees and guides the validation program for the Vaisala viewLinc environmental monitoring system. He serves as a customer advocate to ensure the viewLinc environmental monitoring system matches the demanding requirements of life science and regulated applications.
Paul also shares his GMP experience through regular blog contributions, webinars, and seminars around the world. Paul's expertise in the demanding GxP world is applicable to any industry where measurement is critical to product quality. Paul is a graduate of the University of California, Berkeley, with a bachelor's degree in biology.
In addition to editing the Vaisala Life Science blog, Janice Bennett-Livingston is the Global Life Science Marketing Manager for Vaisala's Industrial Measurements business area.
Pre-Vaisala writing credits include a monthly column called "Research Watch" for Canada's award-winning magazine alive, as well as articles in Canadian Living and other periodicals. Other past work: copywriting for DDB Canada, technical writing at Business Objects, and communications specialist for the British Columbia Child & Family Research Institute.