In this week's blog, Piritta Maunu answers a question she received after her recent webinar: Stability Studies & Monitoring. But first...
Remember Heta's video on something new from Vaisala in last week’s blog? This week R&D engineer John Bekema further explains how new wireless technology is about to revolutionize the way that controlled environments are monitored. Shipping Spring 2017!
John lists the three goals he and his fellow developers set for creating something very new...
My colleagues and I greatly enjoyed your recent talk on stability monitoring. These webinars are a great way to ensure that our current practices are aligned with regulatory expectations and that we are interpreting them correctly. I appreciate your insight and quality expertise and we found your presentation very educational.
I work for a biomedical device testing facility in the United States and we utilize a building automation system to monitor all of our rooms and equipment located in several sites. We have been utilizing this system since 2003 and we began setting up monitoring points to capture data every time the temperature changes by 0.2°C. As you can imagine, this has generated a lot of data and as we have grown, we now find this method is creating a lot of “nuisance” alarms.
For example, if an incubator goes out of tolerance for 3 seconds, we have to write a nonconformance report based on our current definition of continuous monitoring. Obviously, for our 72-hour studies, a 3-second OOT has no impact. So we are planning to redo our SOPs and redefine “continuous monitoring” to mean capturing data once every 10 minutes.
My original webinar question was:
“What is the recommended frequency of sampling with automated monitoring? We are in the biomedical device industry...”
This is my other question: We are also looking to simplify our use of our environmental chambers. We conduct a lot of specialized testing for certain clients and usually need to perform these tests quickly. We have several chambers, and there are usually three to four temperature and humidity settings selected by clients. So, in our current process we perform an “as found” qualification, change the set points, perform another requalification, and then release the chamber for use prior to testing.
We continuously monitor these environmental chambers, which range from stand-alone units of approximately 72 cubic feet up to a 15 ft x 15 ft room. We were thinking of increasing the monitoring points from one to several and then not conducting requalifications, as we would be continually monitoring in multiple locations. We have discussed possibly placing a probe in the top and bottom locations, but we are not sure what others in our industry are routinely doing. Could you share what you have seen and what you would recommend, so that we can better streamline our process?
Again, thank you so much for taking the time to share your knowledge and I look forward to your response!
Thank you for your kind words and great questions!
To ensure I’ve addressed your original question: “What is the recommended frequency of sampling with automated monitoring?”
This is mainly defined by your company’s requirements, risk tolerance, risk management, as well as the product characteristics and relevant regulatory requirements (GMP, GDP or GCP).
In my experience, the memory size of the recording hardware is a major factor in the final decision. As you heard during the webinar, I have seen monitoring frequencies from 15 seconds to 15 minutes. The typical frequency of sampling is from 1 to 5 minutes. The obvious problem with intervals of less than 1 minute is that the data loggers’ memories fill quickly, and in the case of a power outage you might lose data once the logger memory is full. But of course, this all depends on the data logger type and the number of channels connected per logger.
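The trade-off Piritta describes can be made concrete with a little arithmetic. The sketch below estimates how long a logger's local buffer lasts at different sampling intervals; the 30,000-reading capacity is a made-up example, so check your own logger's specifications rather than these numbers.

```python
# Rough estimate of how long a data logger's local buffer lasts at a
# given sampling interval. The capacity figure is a hypothetical
# example, not a specification for any real logger.

def hours_until_full(capacity_samples: int, interval_s: float, channels: int = 1) -> float:
    """Hours of local storage at one reading per channel per interval."""
    samples_per_hour = channels * (3600.0 / interval_s)
    return capacity_samples / samples_per_hour

# Example: a single-channel logger that stores 30,000 readings
for interval in (15, 60, 300):  # seconds
    print(f"{interval:>4} s interval -> {hours_until_full(30_000, interval):.0f} h of buffer")
```

At a 15-second interval the buffer covers only about 5 days, while at 5 minutes it covers months, which illustrates why sub-minute sampling raises the risk of data loss during an outage.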
To ensure I give you a complete answer, I sought a second opinion from our Service Manager of Monitoring Systems. He said:
“Since there is a functional trade-off between sample frequency and internal data storage, the main factors affecting sample rate selection are the customer requirements for data granularity versus the application requirements for data protection. Data granularity will be based mostly on how susceptible the product or process is to environmental variations; data protection comes down to the cost of missing data and the worst-case scenario for recovering from a failure.”
Here my colleague Paul Daniel answers your second question:
“Your use of the chambers, at least in my experience, is fairly unique in that you change set points regularly. The typical approach I have seen is that a particular piece of equipment is dedicated to a given setpoint, qualified at that range, and the setpoint not changed unless the equipment is being dedicated to a different purpose, at which time a complete remapping would be performed. A complete remapping of a 72 cubic foot unit would usually call for 10 sensors (9+1), and the 15 ft x 15 ft room would call for 16 sensors (15+1), per the guidance in the AFNOR French standard for climatic chambers (NF X15-140), presented in English in the ISPE Good Practice Guide for Cold Chain Management.
These quoted sensor densities are based on the assumption that we are dealing with a chamber used for the storage of temperature-sensitive pharmaceuticals, or for a culturing incubator. It seems that your application is different. From what I can tell, you are doing short-term (72-hour) studies in which you challenge some sort of medical device with a relatively high-temperature exposure. No doubt this process has different requirements, but certain generalities will remain the same and apply regardless of this difference in purpose. So, being unfamiliar with your specific industry (which appears to be short-term temperature challenges of medical devices), I can’t give an opinion on exactly what others in your specific industry are routinely doing. However, I do believe we can compare approaches between industries.
The logic you presented, moving from a requalification practice to a continuous monitoring process with a high sensor density, is supported by other GMP industries. The question, as you have already presented it, boils down to how many sensors you should use. In this case, based on the parallel to chamber mapping, we can infer an upper limit of 10 sensors (corners, center, and reference probe). It is likely that 10 will sound like too many to you. Conversely, your proposed solution of 2 sensors (top and bottom) sounds like too few to me! But what we have done here is define a range as a starting point. From here, you can analyze each unit on a case-by-case basis to see what makes sense.
In a 72 cubic foot unit, you may find it appropriate to just place a single sensor (or two) on each shelf, then the number of shelves defines your number of sensors. For the larger rooms, you probably want more sensors, and might default to the 10 sensor solution.
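Paul's rules of thumb can be sketched as a small decision helper. The volume threshold, the shelf-based rule, and the default counts here are assumptions drawn from the discussion above (9+1 points for a bench-top unit, 15+1 for a room); they are illustrative only and are no substitute for your own risk assessment or the NF X15-140 guidance.

```python
# Illustrative sensor-count heuristic based on the discussion above.
# All thresholds and defaults are assumptions for this sketch, not
# requirements from any standard.

def suggested_sensor_count(volume_ft3: float, shelves: int = 0, per_shelf: int = 1) -> int:
    """Suggest a starting sensor count for a chamber or room."""
    if shelves > 0:
        # Small unit, shelf-based placement: one (or two) sensors per shelf,
        # never fewer than a top/bottom pair.
        return max(2, shelves * per_shelf)
    if volume_ft3 <= 100:
        return 10   # bench-top unit: 9 mapping points + 1 reference probe
    return 16       # walk-in room: 15 mapping points + 1 reference probe
```

For example, a 72 cubic foot incubator with four shelves and one sensor per shelf would start at 4 sensors, while a walk-in room with no shelf-based plan would default to 16.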
In the end, a few factors will likely guide your decision:
1) How sensitive is my process? A more sensitive process (or product) demands a higher sensor density.
2) How consistent is my equipment? Equipment that can provide a more uniform environment will need fewer sensors.
3) How much labor savings will I gain by choosing intense monitoring over repetitive requalification? This liberated time and money becomes available budget to spend on the additional sensors and their ongoing calibration.
In short, your approach is sound, as long as you choose a sufficient sensor density. You will likely still be doing pre- and post-qualification studies – they will just be a LOT easier because the sensors are already in place! The real question, then, is how many sensors to use. You have a great resource for answering this question: your customers. If you are using too few, they will let you know. However, I suggest you ask them first, rather than wait for them to tell you AFTER you have done a study using your new monitoring/qualification strategy.”
We hope this helps!
Thank you Piritta and Paul! Your answer that you have seen intervals of up to 15 minutes gives us the confidence we needed to move forward with this sampling interval. Yes, I believe we are one of those customers for whom, although it seems like a drastic change, it is more appropriate. We have been using the system for 13 years, and the industry has grown with this type of monitoring. We feel confident that we have enough historical data and out-of-tolerance investigations to show that, when we have had out-of-tolerance events, the great majority were of such a minimal time frame (less than 15 minutes) that they had no impact on the overall testing when failure investigations were conducted. Rather than continuing to hash out the same investigation over and over again with the same results, we have decided to redefine our definition of continuous.
We are happy to be using some of Vaisala’s products. It’s good to know that, at a time when companies are cutting back and limiting their client support, Vaisala is putting its clients first by providing webinars such as these and creating a forum where we are all learning from each other! Keep up the FANTASTIC work, and we look forward to future webinars!
Piritta Maunu has 15 years of experience in biotechnology, having worked in several quality management positions. Maunu holds a degree of M.Sc. (Cell Biology) and is certified to teach with a specialty in General Biology, both degrees from the University of Jyväskylä, Finland. In her role at Vaisala, she supports the sales department, assists the quality department with audits, creates educational content for life science customers, and provides application support to R&D teams creating solutions for monitoring critical environments.