If a company stores or transports pharmaceutical products and wants to comply with GMP and GDP guidelines, it must produce, handle, store and transport the products in qualified facilities. Calibrated sensors need to be installed in those facilities, which report their temperature values to a compliant monitoring system. But what does GxP compliance mean in combination with a temperature monitoring solution? The following short summary lists all elements and features of a GxP-compliant temperature monitoring solution.
“Title 21 CFR Part 11” is the part of Title 21 of the Code of Federal Regulations, issued by the United States Food and Drug Administration (FDA), that contains regulations on electronic records and electronic signatures. Part 11 defines the criteria by which electronic records and electronic signatures are considered trustworthy, reliable, and equivalent to paper records to ensure GxP compliance.
A monitoring solution that stores electronic records critical to patient safety must comply with Title 21 CFR Part 11. In order to do so, it is important to understand the main risks.
Electronic data could be deleted, accidentally modified or intentionally modified. Title 21 CFR Part 11 defines the criteria by which electronic data is trustworthy, reliable and equivalent to paper records and handwritten signatures executed on paper. If you follow those rules, your electronic records will be complete, intact and maintained in their original context.
Complete Data – When monitoring temperature with sensors, a communication bridge and a software solution, one of the main challenges is the completeness of data. Mechanisms need to be in place to ensure that no data is lost on the way from the wireless sensors through the communication bridge to the monitoring software. Therefore, in case of a disconnection between the sensors and the radio bridge or the cloud storage, data must be buffered in the sensors until the connection has been re-established and the cloud confirms that the data has arrived.
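A minimal sketch of this store-and-forward behaviour, assuming a hypothetical `SensorBuffer` class and a `send_to_cloud` callback that returns `True` only when the cloud confirms receipt (names and interfaces are illustrative, not any vendor's API):

```python
from collections import deque


class SensorBuffer:
    """Buffers readings locally until the cloud acknowledges receipt."""

    def __init__(self):
        self._pending = deque()  # readings not yet confirmed by the cloud

    def record(self, reading):
        # Every reading is buffered first, so a dropped link loses nothing.
        self._pending.append(reading)

    def flush(self, send_to_cloud):
        """Try to deliver buffered readings; keep any the cloud did not confirm."""
        while self._pending:
            reading = self._pending[0]
            if not send_to_cloud(reading):
                break  # connection lost: keep the reading and retry later
            self._pending.popleft()  # remove only after confirmed receipt

    def pending_count(self):
        return len(self._pending)
```

A reading is removed from the buffer only after the cloud's confirmation, so an outage between two `flush` attempts never loses data.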
Intact Data – Although the risk of accidental or intentional modification is minimal, the integrity of data in a measurement chain can only be achieved by encrypting the data all the way from the measuring wireless sensor through the communication bridge to the cloud. Once the data has arrived in the software, it is important that no raw data can be deleted or modified. No user should be able to change the raw data; however, it must be possible to add certain types of additional information. For example, in order to add an interpretation of the data, comments or acknowledgements about the raw data can be added to the system. Furthermore, in order to create selective views on the raw data, reports can be created and exported.
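The append-only rule can be illustrated with a small sketch; `MeasurementRecord` and its fields are hypothetical names chosen for this example:

```python
class MeasurementRecord:
    """The raw reading is frozen at creation; only annotations can be added."""

    def __init__(self, sensor_id, timestamp, value):
        self._raw = (sensor_id, timestamp, value)  # never modified after this point
        self._annotations = []  # comments and acknowledgements, append-only

    @property
    def raw(self):
        # A read-only property over an immutable tuple: callers can see the
        # reading but have no way to alter or delete it.
        return self._raw

    def annotate(self, user, text):
        # Interpretation is layered on top of the raw data, not written into it.
        self._annotations.append((user, text))

    @property
    def annotations(self):
        return tuple(self._annotations)
```

Attempting to assign to `raw` raises an `AttributeError`, mirroring the requirement that raw data stays untouched while comments accumulate alongside it.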
Maintaining electronic data in its original context – Keeping the data in a single source on a central cloud infrastructure ensures that the data is kept in its original recorded context, eliminating the risk of misinterpretation. Warnings, alarms and reports should always refer to the unique sensor name, event number and time stamp.
There are many rules to follow when it comes to compliant user management. Every user with access to the solution must be identified by a unique username and password and must have a clearly defined role and set of rights. Additionally, every action taken by a user in the system must be identified and tracked. When conducting critical operations such as acknowledging an alarm, the user must even confirm the action by entering their password a second time. To avoid unauthorized access, it is furthermore important to implement a time-out mechanism that ends the session when the user takes no action for an extended period.
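The time-out and re-authentication rules can be sketched as follows. The `Session` class, the 15-minute limit and the plain-text password comparison are illustrative simplifications; a real system would compare salted password hashes and take the limit from a configurable SOP:

```python
import time

SESSION_TIMEOUT_SECONDS = 15 * 60  # illustrative value; set per company SOP


class Session:
    """Tracks a logged-in user and enforces an inactivity time-out."""

    def __init__(self, username, password, now=time.time):
        self._username = username
        self._password = password  # simplification: real systems store salted hashes
        self._now = now            # injectable clock, useful for testing
        self._last_activity = now()

    def is_active(self):
        return self._now() - self._last_activity < SESSION_TIMEOUT_SECONDS

    def touch(self):
        self._last_activity = self._now()

    def confirm_critical_action(self, password_again):
        """Critical operations (e.g. alarm acknowledgement) require the password again."""
        if not self.is_active():
            raise PermissionError("session timed out; please log in again")
        if password_again != self._password:
            raise PermissionError("re-authentication failed")
        self.touch()
        return True
```

Passing a fake clock via `now` makes the time-out behaviour easy to verify without waiting 15 minutes.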
The result of the above-mentioned tracking functionalities is a complete, compliant audit trail answering the questions: who has done what, and why? Technically, the audit trail keeps track of every single automated event the system generates and every single manual task a user performs, so the full history can be reconstructed from any perspective.
A temperature monitoring system typically executes the following different automated mechanisms and workflows:
In addition to automated events, the system must keep track of every single manual task a user performs, including the time stamp of each task. The following manual events could be tracked:
GMP and GDP standards require that pharmaceutical products be stored and transported according to the temperature conditions stated on the drug label. Every excursion from these temperature conditions must be documented. The monitoring system should support the user by creating automated excursion reports to which the user can still add information. The following procedure gives an example of the questions a Quality Manager should ask once a temperature excursion has occurred.
ALARM GOES OFF
A temperature excursion triggers an alarm. The alarm can be seen on the sensor itself or on the dashboard display, and can be sent out via email or SMS containing an excursion report with the following information:
Where did the alarm go off? Which facility, container or sensor had an excursion?
When did the temperature excursion occur and when were the drug label conditions re-established?
How long was the product exposed to temperatures outside the drug label conditions?
What was the highest/lowest temperature measured?
Risks? Is it likely that the core temperature of the product has been affected, thus damaging the product?
Severity? Is there sufficient stability budget left to justify a release of the product or is a product recall necessary?
Corrective actions needed? What is the cause of the temperature excursion and does it have to be corrected? Do people need to be informed about the findings?
Preventive actions needed? In case of high-risk and/or repetitive errors: which preventive actions can be performed in order to avoid a repetition of the event? Have the changes been implemented?
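The factual questions above (where, when, how long, highest/lowest value) can be answered directly from the raw time series. A minimal sketch, assuming readings arrive as time-ordered `(timestamp, value)` pairs and hypothetical drug-label limits `low` and `high`:

```python
def excursion_summary(readings, low, high):
    """Summarise excursions from a time-ordered list of (timestamp, value) pairs.

    Returns the first and last out-of-range timestamps, the number of
    out-of-range readings, and the overall minimum/maximum temperature.
    """
    out = [(ts, v) for ts, v in readings if v < low or v > high]
    values = [v for _, v in readings]
    return {
        "first_excursion": out[0][0] if out else None,
        "last_excursion": out[-1][0] if out else None,
        "readings_out_of_range": len(out),
        "min_temp": min(values),
        "max_temp": max(values),
    }
```

The risk, severity and CAPA questions, by contrast, need human judgment and stability data; the system can only supply the factual basis for them.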
A dashboard gives a brief overview of the current status of each sensor. The sensors can be grouped in a meaningful way or placed on top of a floor plan to illustrate their physical location. The dashboard should show the currently measured value, the alarm status and further meaningful information on the technical status of the sensor. The benefits of a dashboard are:
Besides a clear alarming mechanism, it is vital to have periodic reporting on all sensors of a system. Every report can have a different purpose, and therefore every report will contain different content. If the report serves as an archive of data, it should be a document compliant with the ISO standards for long-term archiving. If the report is sent to customers, it might be vital to combine various sensors to give an ideal overview of the customer's project. To sum up, a few examples of regular reports could be:
Download our checklist which covers everything you need to know about storing temperature data in the Cloud.
Archiving is not clearly defined in GxP regulations and is left open to interpretation. Many people have the rather unrealistic idea that once data is archived, it should be available forever in the same way as it was generated. Data archiving is the process of “moving data that is no longer actively used to a separate storage device for long-term retention. Archive data consists of older data that remains important to the organization or must be retained for future reference or regulatory compliance reasons.” As a result, archive data has a different form than process data.
Process data is “fresh” data used to execute business decisions (e.g. data of a product, mean kinetic temperature calculation of a stability study). The service provider must ensure that process data is available electronically for two years for visualizations (e.g. zoom, overlay), statistics (e.g. calculating the mean kinetic temperature, MKT), reports (e.g. release decision) and exports of the data (e.g. to a higher-level batch management system). Furthermore, it must be possible to add comments related to the data in the system.
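The mean kinetic temperature mentioned above follows the standard Arrhenius-based formula. The sketch below uses ΔH/R = 10000 K, which corresponds to the commonly assumed activation energy of 83.144 kJ/mol; function and parameter names are illustrative:

```python
import math


def mean_kinetic_temperature(temps_celsius, delta_h_over_r=10000.0):
    """Mean kinetic temperature (MKT) from a series of temperature readings.

    MKT = (ΔH/R) / (-ln((1/n) * Σ exp(-ΔH / (R * T_k)))), with T_k in Kelvin.
    delta_h_over_r is ΔH/R in Kelvin; 10000 K corresponds to the commonly
    assumed activation energy of 83.144 kJ/mol.
    """
    kelvins = [t + 273.15 for t in temps_celsius]
    mean_exp = sum(math.exp(-delta_h_over_r / t) for t in kelvins) / len(kelvins)
    mkt_kelvin = delta_h_over_r / (-math.log(mean_exp))
    return mkt_kelvin - 273.15  # convert back to °C
```

Because the exponential weights warm readings more heavily, the MKT of a fluctuating series is always higher than its arithmetic mean, which is exactly why it is used for release decisions in stability studies.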
After the first two years, the data is typically no longer needed in business processes, and its location and form are changed to archive data. The service provider must ensure that archive data is available for at least 10 years and fulfils the following requirements:
If you work with pharmaceutical products and want to comply with GMP and GDP guidelines, you must know more about Qualification.
Get more insights with our experts about temperature monitoring of pharmaceutical products.
Find out more about the process of a calibration and why it is even necessary to calibrate a sensor.