Discussion Club: FDA/PhUSE CSS Working Group 1 Update Budapest 2012



FDA/PhUSE CSS Working Group 1 Update at the EU Annual PhUSE Conference 2012

Introduction

The FDA/PhUSE CSS Working Group 1 collaborates on the topic "Data Validation and Quality Assessment". The working group focuses on developing a robust process to rapidly validate and assess data quality as data moves through the product life cycle, across both industry and regulatory review. The group discusses current pain points and potential solutions around topics such as the current data validation rules, the CDISC Data Validation Project, development and implementation of tools, terminology, and improving the quality of the data to support analytical needs.

Progress made since the CSS meeting in March in Washington

Develop Guidelines for Validation Rule Developers

  • Draft completed on May 21, 2012
  • Final version is planned for Oct 30th

Establish Change Control Board (CCB) for Validation Rules

  • Charter finalized on July 10th, 2012
  • Currently in process of selecting CCB members
  • Goal to launch CCB by end of September
  • A challenge we continue to face is establishing practical processes for the operation of the Board: who does what, and when? How should disagreements and differing opinions be handled?

Review top 20 issues from CBER

  • Team has been meeting biweekly since July 20th
  • 8 of the 20 checks have been reviewed so far
  • A number of action items have been created for CDISC, FDA, and CCB
  • Goal is to finish by October 31st

Reconcile CAB and OpenCDISC validation rules

  • Team has been meeting weekly since June 15th
  • Team identified 45 rules to be reconciled
  • 45 rules have been completed
  • Planning to provide change requests to CCB & CDISC & other groups by October 31st

Discussion topics for EU PhUSE Conference

"Adherance to Submission Standards"

What do you do in your company to validate FDA submissions for adherence to standards?

  • Are you developing guidelines?
  • If so, do you follow any process or guidelines?
  • OpenCDISC

Minutes of the discussion

About 40 interested attendees joined the WG1 discussion, split into three groups. These enthusiasts gathered for about 30 minutes to discuss aspects of "Data Validation and Quality Assessment" and feed the results directly into the PhUSE Wiki for further content development.

Adherence to Standards

The question posed in the section above was asked of each group. What each company does was captured on stickers and is bulleted below. This led to further discussion.

First Group

  • Client SOPs, Validation and QC to client Standards, SDTM
  • Use OpenCDISC checker, maintain our own CDISC interpreted standards, Building up a standards governance organisation
  • Run DMCC tool to check SDTM datasets, Expert review of mapping to SDTM, Check consistency across studies programmatically.
  • Customer's SDTM and ADaM data models, SDTM guidance, triple-check tools (SAS based) which compare aCRF, XPT files and define.xml, OpenCDISC validator
  • Check against common terminology, compare data and metadata, compare against SDTM specs (see the sketch after this list)
  • Internal standard processes that HAVE to be followed, OpenCDISC; the latest CDISC SDTM guide vs the company-specific spec has some differences, so changes have to be justified.
  • Develop SOPs and follow them. The latest rules are not necessarily followed.
  • Check adherence to common terminology
  • Data standards repository (SDTM based), conformance checking done within a tool
  • OpenCDISC validator 1.2, SDD 3.5 CC suite, a member of CDM is heading our governance body looking at all proposed changes and running them by internal SMEs.
  • Rely on OpenCDISC, waiting for the market to provide a solution
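
Several of the points above come down to comparing a dataset's variables against metadata from an SDTM spec. The following is a minimal, hypothetical Python sketch of such a structural check; the spec excerpt and records are invented for illustration, and a real tool would read the spec from a standards repository and the data from XPT files.

    # Hypothetical excerpt of a sponsor-defined SDTM spec for the DM domain.
    dm_spec = {
        "STUDYID": "Req",
        "DOMAIN":  "Req",
        "USUBJID": "Req",
        "AGE":     "Exp",
    }

    # Toy records, as might be read from an XPT file.
    dm_data = [
        {"STUDYID": "ABC-001", "DOMAIN": "DM", "USUBJID": "ABC-001-0001", "AGE": 34},
        {"STUDYID": "ABC-001", "DOMAIN": "DM", "USUBJID": "ABC-001-0002"},
    ]

    def check_structure(records, spec):
        """Flag required variables that never occur and variables not in the spec."""
        observed = set()
        for record in records:
            observed.update(record)
        findings = [f"Required variable {v} is missing"
                    for v, core in spec.items()
                    if core == "Req" and v not in observed]
        findings += [f"Variable {v} is not defined in the spec"
                     for v in sorted(observed - spec.keys())]
        return findings

    for finding in check_structure(dm_data, dm_spec):
        print(finding)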

Discussion:

Changes require repeated rounds of review and change.

The people running the tools are the people who know the study

Company versions of standards – how to keep them in synch with CDISC releases is an issue.

Company has internal standard against which data is validated. Then converted to submission standards and revalidated – multiple checks at various stages.

Internal standards use extra variables which are then stripped for submission.

Data standards repository for SDTM and ADaM which can contain multiple standards and versions.

Conformance checking with FDA checks; in future there may be no need to check, as the spec will match the conformance checks.

In future the IG will contain business rules that present the true business rules of the standards. It will contain the 'why' and data governance rules.

Governance – committee of heads of programming to maintain updates of standards

CRO – has nice mapping system, client wants their own (and they’re paying).

CRO maintains multiple versions, often per client. They don't want to have to reinvent things. If there are no client standards then use CRO tools. Tools use metadata to implement standards.

How often to update:

Often, until stable; CDM manages the standards, and data is provided in SDTM.

Adherence – scope around the contents of the trial. Data standards don't remove the necessity for good use of data or governance. There is no getting out of the base requirements for the structure of CT data processing.

Data quality checks for content vs. standard structure checks.
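
To make the content-vs-structure distinction concrete, here is a hedged Python sketch of a content check: values are compared against a controlled-terminology codelist rather than against the dataset's structure. The codelist excerpt and records are illustrative, not official CDISC terminology.

    # Illustrative excerpt of a SEX codelist; a real check would load the
    # full CDISC controlled terminology.
    sex_codelist = {"M", "F", "U"}

    records = [
        {"USUBJID": "ABC-001-0001", "SEX": "M"},
        {"USUBJID": "ABC-001-0002", "SEX": "MALE"},  # content error
    ]

    for record in records:
        if record["SEX"] not in sex_codelist:
            print(f'{record["USUBJID"]}: SEX value "{record["SEX"]}"'
                  " is not in the codelist")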

Second Group

  • Data Quality check- standard derivations, SDTM OpenCDISC, global data standards
  • Data Quality checks – data issues – looked after by data management
  • For SDTM it has depended in the past on the client; that has meant OpenCDISC and/or WebSDM. Recently, per process, we run OpenCDISC checks per client
  • Define actions depending on purpose of run (regular transfer/submission). Version of standards at the element level had impact on conformance/quality checks
  • Compliance check process independent of the development process. Run OpenCDISC, WebSDM, and custom checks developed in SAS. Report feeds into the data guide. No metrics
  • SDTM creation is metadata driven; validation is manual (SAS), OpenCDISC and WebSDM. For ADaM, creation is metadata driven, validation only with SAS.
  • Until recently only SDTM and define were checked through OpenCDISC version 1.1. More recently moved to OpenCDISC v1.3 so we can do the ADaM checks on datasets and define. As a CRO there are sometimes a variety of checking options which the sponsor can take in-house. There is also a fair amount of manual ADaM review. In-house SAS-created macro for SDTM define.xml.
  • CRO: ADaM – work to Client (or own) standard – check with OpenCDISC
  • Standardise data conversion where possible SDTM > ADaM; implement standard checks (study A vs study B, project vs global standards; see the sketch after this list)
  • Refer to guidelines > form meta data that can be customised < source data > standard data (validation)
  • ADaM – OpenCDISC, in-house checks that match the CDISC list, run against shells/specs before programming starts, ADaM specs template pre-populated as a starting point.
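
As a sketch of the "study A vs study B" consistency check mentioned in the list above, the snippet below compares variable-level metadata between two studies and flags attributes that have drifted. The metadata excerpts are hypothetical.

    # Hypothetical variable-level metadata for two studies.
    study_a = {"AGE": {"label": "Age", "length": 8},
               "SEX": {"label": "Sex", "length": 1}}
    study_b = {"AGE": {"label": "Age in Years", "length": 8},  # label drifted
               "SEX": {"label": "Sex", "length": 1}}

    for var in sorted(study_a.keys() & study_b.keys()):
        for attr, a_value in study_a[var].items():
            b_value = study_b[var].get(attr)
            if a_value != b_value:
                print(f"{var}.{attr}: study A has {a_value!r},"
                      f" study B has {b_value!r}")
    for var in sorted(study_a.keys() ^ study_b.keys()):
        print(f"{var} is present in only one study")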

Discussion

CRO: monitor each version of elements to update internal standard and rules. Participate within external groups.

Control of standards is the easy bit… what happens after that is the hard bit. Advisory boards assess the impact – how to roll out to groups… much talk, not much actual doing… impact analysis. No big changes yet – bug fixes only (and the 3.1.2 amendment). 3.1.3 is being discussed; current thinking is to do multiple changes in a batch.

3.1.4 is mainly therapeutic

Resourcing is an issue.

Sponsor has a governance board for SDTM and ADaM for 3.1.2 and internal standards. Has an internal doc management system and a deviation request process.

Full process for change control. If there is no standard then there can be no deviation! Have therapeutic standards. Working ok.

Impact on tools is the main issue for deviations

There is a designated group for SDTM+ and another group for ADaM standards. Some folks on both. Emails ask for impact analysis

Maintain metadata in the back end. Use this for generating data.
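
A minimal sketch, assuming a very simple back end, of how stored metadata can drive data generation: the variable order and types of an output record come from the metadata rather than being hard-coded. All names are illustrative.

    # Hypothetical back-end metadata: variable name and type, in output order.
    dm_metadata = [("STUDYID", str), ("USUBJID", str), ("AGE", int)]

    def new_record(metadata, **values):
        """Build a record whose variable order and types follow the metadata."""
        record = {}
        for name, vtype in metadata:
            value = values.get(name)
            record[name] = vtype(value) if value is not None else None
        return record

    print(new_record(dm_metadata, STUDYID="ABC-001",
                     USUBJID="ABC-001-0001", AGE="34"))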

How will tools manage the release of standards and changes? Companies are still working through this, made more difficult with SDTM releases every six months.

Third Group

  • Implement SDTM and ADaM, metadata compliance checks with SAS macros
  • OpenCDISC to check ADaM
  • Define.xml created with a standard SAS macro from the ADS data (not metadata)
  • Run OpenCDISC over the SDTM datasets; a validation run in the system checks that required fields are present, populated, etc. in define.xml (see the sketch after this list)
  • OpenCDISC, SAS/CDI, in-house validation checks
  • Based on the IG we need additional information in the in-house standard; WG1 should maintain change control
  • CDISC working party responsible for updates, OpenCDISC validator, Independent validation, (review/final inspection)
  • Standard checklists, independent validation by an experienced person, SOPs, training, different steps of validation (source code, output)
  • Data standards > analysis metadata > checks within reporting tools. But not necessarily policed.
  • Short wish list! Fix the things you can fix, explain the things you can't fix.
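
The "required fields present and populated in define.xml" style of check in the list above can be sketched as follows. The XML here is a deliberately simplified stand-in; a real define.xml is a namespaced ODM document, so a production check would need namespace-aware parsing.

    import xml.etree.ElementTree as ET

    # Simplified, illustrative stand-in for a define.xml fragment.
    define_xml = """
    <MetaDataVersion>
      <ItemDef OID="IT.DM.USUBJID" Name="USUBJID" DataType="text"/>
      <ItemDef OID="IT.DM.AGE" Name="" DataType="integer"/>
    </MetaDataVersion>
    """

    root = ET.fromstring(define_xml)
    for item in root.iter("ItemDef"):
        for attr in ("OID", "Name", "DataType"):
            if not item.get(attr):
                print(f'ItemDef {item.get("OID") or "?"}:'
                      f" attribute {attr} is missing or empty")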

Discussion

The define is created from the actual data, not from the metadata, having previously checked the data against the metadata spec.

Working group to maintain change control. Additional checks are provided in an in-house tool.

Training!

Stepwise checks on each item.

Have an internal working party that implements changes, communicates them to teams, and creates and gives training. Each team decides which version to use.

Client specifies. Do not implement changes mid-project unless requested by the client.

Global standards released. Promotion is communicated by email to project teams. Timing decides whether to stay or switch. Metadata contains a column which details the version.
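
One hedged reading of the version-column idea is sketched below: each metadata row carries the standards version it belongs to, so a study can keep the version chosen at kick-off or switch later. All values are invented.

    # Hypothetical metadata rows, each tagged with the standards version.
    metadata_rows = [
        {"variable": "AGEU", "codelist": "AGEU_2011", "version": "SDTM IG 3.1.2"},
        {"variable": "AGEU", "codelist": "AGEU_2012", "version": "SDTM IG 3.1.3"},
    ]

    def spec_for(version):
        """Return only the metadata rows belonging to one standards version."""
        return [row for row in metadata_rows if row["version"] == version]

    study_version = "SDTM IG 3.1.2"  # chosen at study kick-off
    print(spec_for(study_version))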

Global standards group; they make the standard, which is accepted by FDA. Review once a year, apart from bug fixes. Each team uses the latest standards available at the kick-off of the study. Integrated data should use the latest standards. Pushed out to all regions and CROs.

Data standards group assesses changes and considers impacts on tools; they may not implement straight away, instead implementing a later version. They drive the change and the strategy on implementation. Can we change all our systems every six months?

Moving towards a metadata repository; concept based. Reconcile versions on integration of data.

We have an in-house tool for checking vs. standards. SAS CDI does the checking and has a repository. Metadata checking.

Checking by reporting tools based on metadata. Source vs. metadata standard; less checking is done due to validated tools.

Checking of metadata vs. standard metadata. For cross-variable dependencies use the OpenCDISC validator – content checking. Need one tool.

FDA doesn't have experience of internal validation; they only see the data after it has been done. "Fix the things you can fix, document the things you can't fix".

Reviewers don't understand what validation and what quality checks have been performed. The data guide should detail these, be predictable, and explain findings which may look like issues but are legitimate and real.

More visible documentation of the methods used will make the reviewer's life easier.

Conclusions

This is the current stage of the discussion. You are welcome to visit the Wiki and add your comments, suggestions and improvements to keep the content up to date and the discussion alive. Many thanks.

