Five Reasons Why a Standard EDMS is Inadequate for eTMF

January 29, 2013


Life Sciences was one of the first industries to embrace Electronic Document Management Systems (EDMS), beginning in the 1980s. The industry grew comfortable entrusting its regulated content to EDMS, first using custom systems built on standard platforms such as Documentum, and later moving towards life-sciences-specific products, often for managing regulatory documents or quality system documents such as SOPs.

The industry was slower to move to electronic Trial Master Files (eTMF). Years ago, it was more difficult to justify a business case for eTMF, but now most people see a variety of efficiency gains in areas such as collecting documents, starting up sites, maintaining quality, and supporting records management and legal hold, as confirmed in the 2012 TMF Reference Model Survey.

However, user and system owner satisfaction with eTMF systems is, anecdotally at least, significantly lower than with “first wave” systems such as Regulatory and SOP systems. Most vendors built their eTMFs by extending their original, successful systems, so why the shortfall?

The answer lies in some fundamental differences in the nature of the documents and business processes, differences that often can’t be addressed just by adding new features and functions to an existing framework. So while we certainly need core EDMS features such as versioning, security and workflow, we also need other characteristics that are missing from many legacy systems. In this posting, we’ll examine some of the most important of these.

1. To Know What is Missing, You Must Know What is Expected

Regulators expect sponsors to maintain a complete set of essential documents to enable both the conduct of a clinical trial and the quality of the data produced to be evaluated. But the list of essential documents is trial-specific, shaped by many factors such as the sites and countries involved, the trial design, the safety issues encountered, and the specific labs and IRBs used. Many of these documents originate outside the sponsor, at trial sites, regulators, labs, and other locations.

With a standard EDMS, either you begin authoring documents from a template or you upload them upon receipt.  It’s rare to have a systematic mechanism to specify exactly what is expected in advance, and as a result you are not able to determine what’s missing without using manual, external mechanisms such as spreadsheets.

eTMF should support the establishment and maintenance of a list of expected documents at the study, country and site level, along with reporting that tells TMF managers what is missing.
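To make the idea concrete, here is a minimal sketch (in Python, with entirely hypothetical artifact names and data, not a description of any particular product) of the comparison an eTMF needs to perform: a list of expected documents, keyed by artifact and level, checked against what has actually been received.

```python
from dataclasses import dataclass

# Hypothetical record of a document we expect to receive for a given trial.
@dataclass(frozen=True)
class ExpectedDocument:
    artifact: str      # e.g. a TMF Reference Model artifact name
    level: str         # "trial", "country" or "site"
    context: str       # e.g. country code or site ID; "" at trial level

def missing_documents(expected, received):
    """Return the expected documents for which nothing has been received."""
    return [doc for doc in expected if doc not in received]

# Example: two expected documents, only one received so far.
expected = [
    ExpectedDocument("Signed Protocol", "trial", ""),
    ExpectedDocument("IRB Approval", "site", "Site 101"),
]
received = {ExpectedDocument("Signed Protocol", "trial", "")}

for doc in missing_documents(expected, received):
    print(f"Missing: {doc.artifact} ({doc.level}: {doc.context or 'n/a'})")
```

The key point is not the code itself but that the expected list is data the system holds and maintains, so that “what is missing” becomes a report rather than a spreadsheet exercise.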

2. To Know What is Late, You Must Know When Items are Due

In a standard EDMS, documents rarely have due dates. Tasks often have due dates (it’s common to assign an expected completion date to an approval task, for example), but in the average regulatory or SOP system it would be unusual to be able to assign a due date to a Stability Report, agency form or SOP at the document level.

Of course, sponsors have goals for when these documents should be completed, but it’s not usually a compliance issue if they are late. That’s not true for eTMF, as the MHRA GCP Guide makes clear: “The TMF must be kept up to date, with documents placed in the TMF in a timely manner, as Regulation 31A (3) of SI 2004/1031 states that ‘The master file shall at all times contain the essential documents relating to that clinical trial’.”

Most TMF documents are associated with study milestones, which in turn have due dates.  eTMF should use this and other planning information to automatically assign due dates to TMF documents, and then track receipt against those due dates and inform responsible users when those dates are in jeopardy.
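A minimal sketch of how this might work, assuming hypothetical milestone names and per-artifact offsets; in practice the planning information would come from the study setup rather than being hard-coded.

```python
from datetime import date, timedelta

# Hypothetical planning data: milestone dates and the offset (in days)
# relative to the milestone by which each document type is due.
milestones = {"site_initiation": date(2013, 3, 1)}
due_offsets = {
    "IRB Approval": ("site_initiation", -14),        # due 14 days before the milestone
    "Site Signature Sheet": ("site_initiation", 7),   # due 7 days after the milestone
}

def due_date(doc_type):
    milestone, offset_days = due_offsets[doc_type]
    return milestones[milestone] + timedelta(days=offset_days)

def status(doc_type, received_on, today):
    """Classify a document as on time, received late, pending or late."""
    due = due_date(doc_type)
    if received_on is not None:
        return "on time" if received_on <= due else "received late"
    return "late" if today > due else "pending"

print(due_date("IRB Approval"))                        # 2013-02-15
print(status("IRB Approval", None, date(2013, 3, 5)))  # late
```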

3. To Understand Quality Failures, You Must Quantify Quality Failures

In a standard EDMS, workflows are generally geared towards review and approval of authored content. For eTMF, much of the content is final before being brought into the system and requires only QC checks. This is a less knowledge-intensive process than content review, and most sponsors would like to adopt an assembly-line approach in which specific checks are made and documents are accepted or rejected as a result.

While this process can be supported in a standard EDMS, the result of a review is generally acceptance or rejection with free-form comments or annotations, and free-form text cannot be aggregated for trending and continuous process improvement. If, instead, each rejection carried a structured reason, you could see, for example, that 10% of your QC failures are due to missing wet-ink signatures and take steps to address the issue. eTMF should gather the data and provide the tools to support this kind of analysis.
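The sketch below, using hypothetical QC reason codes and data, shows why structured results matter: once each rejection carries a code, trending becomes a simple aggregation rather than a manual reading of comments.

```python
from collections import Counter

# Hypothetical QC results: each rejection records a reason code
# rather than (or in addition to) a free-form comment.
qc_results = [
    {"doc": "1572-Site101", "passed": False, "reason": "MISSING_WET_INK_SIGNATURE"},
    {"doc": "CV-Dr-Smith", "passed": True, "reason": None},
    {"doc": "IRB-Approval-Site102", "passed": False, "reason": "WRONG_VERSION_FILED"},
    {"doc": "1572-Site103", "passed": False, "reason": "MISSING_WET_INK_SIGNATURE"},
]

# Count failures by reason code to see where the process is breaking down.
failures = [r["reason"] for r in qc_results if not r["passed"]]
trend = Counter(failures)

for reason, count in trend.most_common():
    share = 100 * count / len(failures)
    print(f"{reason}: {count} ({share:.0f}% of QC failures)")
```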

4. Dealing with the Version Dilemma

In a standard EDMS, versions of a document are generated in a consistent and logical order. When updating an SOP, you begin by accessing the latest version of the SOP, whatever it may be. After completing your edits, passing through review, and gaining approval, you have a new major version, incremented by one.

In eTMF, building the version tree can be problematic. For documents generated externally, there is no guarantee that different versions of the same document will be received and processed in order. Using standard functionality can lead to the unacceptable situation in which what appears to be Version 2 of a document is actually an older, superseded version, while Version 1 is the current one.

eTMF should be able to manage version trees for externally generated documents based on the dates associated with the approval or finalization of the content itself, not the somewhat arbitrary date on which it was imported or released to a final status in the system.
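A small sketch of the ordering rule, with hypothetical fields and file names: versions are sequenced by the date the content itself was approved, not the date it happened to arrive in the system.

```python
from datetime import date

# Hypothetical versions of an externally generated document,
# received out of order: the older version arrived last.
versions = [
    {"file": "protocol_amend1.pdf", "approved_on": date(2012, 11, 2), "imported_on": date(2013, 1, 10)},
    {"file": "protocol_original.pdf", "approved_on": date(2012, 6, 15), "imported_on": date(2013, 1, 20)},
]

# Build the version sequence from the content's own approval date,
# not the arbitrary import date.
for number, v in enumerate(sorted(versions, key=lambda v: v["approved_on"]), start=1):
    print(f"Version {number}: {v['file']} (approved {v['approved_on']})")
```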

5. Harnessing the Power of Metrics

Although every TMF is different, eTMF processes should be repeatable, with continuous process improvement as a significant goal.   However, managing TMF performance and designing targeted interventions for problem areas is difficult if not impossible without effective metrics.

In many cases, EDMS metrics are just search results masquerading as reports. These are not sufficient to provide the insight needed in the key TMF areas of completeness, timeliness and quality. To understand performance levels and trends, for example, you need to see KPIs analysed month over month or year over year. The system must be designed to capture the key information needed to provide that insight, rather than simply packaging up and presenting whatever happens to be available in a standard database.
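As an illustration of the kind of KPI trending that search-style reports cannot provide, the sketch below assumes the system has captured a due date and a filed date for every document (hypothetical data) and computes an on-time filing rate month over month.

```python
from collections import defaultdict
from datetime import date

# Hypothetical captured data: due date and actual filing date per document.
documents = [
    {"due": date(2013, 1, 15), "filed": date(2013, 1, 10)},
    {"due": date(2013, 1, 20), "filed": date(2013, 2, 3)},
    {"due": date(2013, 2, 10), "filed": date(2013, 2, 8)},
    {"due": date(2013, 2, 25), "filed": date(2013, 2, 20)},
]

# On-time filing rate, grouped by the month in which the document was due.
by_month = defaultdict(lambda: [0, 0])  # month -> [on_time, total]
for doc in documents:
    month = doc["due"].strftime("%Y-%m")
    by_month[month][1] += 1
    if doc["filed"] <= doc["due"]:
        by_month[month][0] += 1

for month in sorted(by_month):
    on_time, total = by_month[month]
    print(f"{month}: {on_time}/{total} filed on time ({100 * on_time / total:.0f}%)")
```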

This insight allows sponsors to compare their vendors and CROs against their service level agreements and against each other, and gives CROs a way to report back to their clients on their service levels and achievements.