CNODES Quality Assurance Focus Group Results

From March 13, 2017

Overview

The themes and responses listed below were compiled from three separate sources:
a) Pre-Focus Group discussion notes provided by some sites
b) Notes taken by three individuals during the Focus Group discussions
c) Responses written on the recording sheets at each table
Responses were grouped into main themes and are presented below either as
written or as interpreted by the note-takers. Duplicate one-word answers were
removed; more nuanced responses touching on the same idea were either blended
or retained separately.

Summary of Key Results

[Intentionally left blank until feedback received]

Question 1: What quality assurance procedures or elements do you
currently have in place at your site or do you use personally in your work?

Theme 1: Common sense feasibility checks when coding a study

- Making sure that cohort numbers and results make sense within your own data; checking ranges and frequency distributions
- Comparing the results to results of similar studies done by the centre (not necessarily CNODES) or similar published studies from comparable populations
- Looking at individual patient data and drawing out timelines
- Randomly selecting a small sample of patients (e.g. 10) and following them through to see if the code is doing what is expected (see the SAS sketch after this list)
- Internal data quality checks on data holdings (e.g. the CPRD's extensive checks)
- Producing descriptive tables and checking feasibility
- Checking data for missing or incomplete fields
- Data quality checks prior to releasing data to analysts
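To make these checks concrete, the following is a minimal SAS sketch of range/frequency checks and a random patient sample. The dataset and variable names (cohort, age, sex, index_year, patient_sample) are illustrative assumptions, not CNODES standards.

/* Check ranges of continuous variables for implausible values */
proc means data=cohort n nmiss min max mean;
  var age;
run;

/* Check frequency distributions of categorical variables,
   including missing values */
proc freq data=cohort;
  tables sex index_year / missing;
run;

/* Randomly select a small sample of patients (e.g. 10) to trace
   through the raw data by hand */
proc surveyselect data=cohort out=patient_sample
                  method=srs n=10 seed=20170313;
run;

proc print data=patient_sample;
run;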

Theme 2: Collaboration and discussion with other team members

- Discussions between the analyst and liaison regarding cohort creation, the analytic protocol, and data; regular and frequent check-ins between analyst and liaison
- Discussions with pharmacists and physicians to confirm drug lists and claims are appropriate
- Flexible roles of data examiner and data analyst
- On non-CNODES studies, analysts are co-authors or are consulted during creation of protocols

Theme 3: Double checking by reproduction or review

- During protocol creation, consultation with the internal review committee (feedback on methods, design, and data)
- Recoding the same study in a different way at different points in time (by the same person)
- Getting someone else from the same site to re-code and compare results
- Cross-referencing just the definitions: two people define the conditions (ICD codes, etc.); is there agreement? If not, explore what differs and why
- Having two analysts perform the extraction only and compare numbers (see the sketch after this list)
- Shadowing, oversight, and review for less experienced coders
- Double checking of cohort entry/exclusion code by a senior analyst
- Sample codes [assumed to mean that segments of code are run on known datasets to test the accuracy of results]
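Where two analysts independently code the same extraction, PROC COMPARE is one standard way to reconcile the resulting datasets. The sketch below assumes both analysts produce a cohort keyed on patient_id; all dataset and variable names are illustrative assumptions.

proc sort data=cohort_analyst1; by patient_id; run;
proc sort data=cohort_analyst2; by patient_id; run;

/* Reports patients present in only one dataset and any
   differences in the variables the datasets share */
proc compare base=cohort_analyst1 compare=cohort_analyst2 listall;
  id patient_id;
run;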

Theme 4: Use of existing tools and resources

- Consulting the Analyst Tool Box
- Use of standardized codes and macros for common analysis tasks (e.g. extraction, matching)
- Using previously tested SAS code from prior studies and adapting it to the current study
- Automated output (e.g. PROC REPORT and PROC TABULATE; see the sketch after this list)
- Copying and pasting to reduce human error
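As a small illustration of automated output, PROC TABULATE can produce a descriptive table directly from the analysis dataset so that numbers are never re-typed by hand. The variable names (exposure, sex, age) are illustrative assumptions.

proc tabulate data=cohort;
  class exposure sex;   /* grouping variables */
  var age;              /* analysis variable */
  /* Rows: exposure groups; columns: counts by sex plus
     mean and standard deviation of age */
  table exposure, sex*n age*(mean std);
run;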

Theme 5: Documentation and tracking of work

- Documenting steps of the analysis in a separate Word document (coding decisions, algorithms, exclusions, definitions, patient counts, observations)
- Documenting steps of the analysis using the Saskatchewan Excel model
- Researcher documentation, including exclusions, rationale, and grouping
- Analyst documentation, including how each step was done, assumptions, and decisions
- ICES uses a Dataset Creation Analysis Plan, which includes name, number, PI, definition, and inclusion/exclusion criteria
- Documenting each section of code within the SAS code itself (using comments); some sites have documentation standards for code (a possible header template follows this list)
- Researcher documentation (analytical steps)
- Same structure of folders, data sets, and naming conventions for every study so other analysts can make sense of a project
- Version control on all documents
- Registering the protocol
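One possible form of in-code documentation is a standard header comment at the top of each SAS program. The template below is an illustrative assumption, not an adopted CNODES standard; sites could adapt the fields to their own documentation rules.

/************************************************************
*  Study:    [protocol name/number]
*  Program:  01_cohort_creation.sas
*  Author:   [analyst]
*  Date:     [date]    Version: [version]
*  Purpose:  Build the base cohort per the analytic protocol
*  Inputs:   [raw claims/demographics datasets]
*  Outputs:  [cohort dataset]
*  Notes:    record coding decisions, assumptions, exclusions
************************************************************/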

Theme 6: Training

- Analysts take pharmacoepi courses
- New analysts replicate a previous study until they arrive at the correct results
- Learning about the process of how the data are created/provided

Question 2: What are some quality assurance procedures that could be
implemented across the CNODES network at various stages of project
development?

Theme 1: Ensure clarity and consistency of protocol interpretation across sites

- Create a list of standardized definitions (e.g. for diseases) and use them in protocols wherever possible (a minimal macro sketch follows this list)
- Compile a list of definitions, study by study, with codes (site by site)
- Have two people (separately, to ensure accuracy) create macros for standard definitions, and then use the same definitions in each protocol
- Create a definitions library/glossary
- Include more examples/illustrations in protocols
- Include equations in protocols (numerator/denominator)
- Standardize variable names, data structure, and output
- Stratify Table 1 by year/key variables
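One way a standardized definition could be packaged is as a shared SAS macro that every site calls unchanged. The macro name, the single ICD-9 prefix used, and the dataset/variable names below are illustrative assumptions, not agreed CNODES definitions.

%macro flag_diabetes(in=, out=);
  data &out;
    set &in;
    /* The same code list would be used by every site and
       every protocol */
    diabetes = (substr(diag_code, 1, 3) = '250');
  run;
%mend flag_diabetes;

/* Each site applies the identical definition to its own claims */
%flag_diabetes(in=claims, out=claims_flagged);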

Theme 2: Improve communication

- Need for analysts to talk more amongst themselves, and for a place where they can exchange questions and answers without discussing results or sharing code
- Analyst-only teleconferences for projects; these would answer questions from all analysts at once and would eliminate multiple interpretations of the protocol
- An analyst forum or confidential email process for asking questions and clarifications
- Make the Lead Analyst the point person for other analysts (as opposed to Liaisons), and track questions and answers by:
  - Email notification
  - Intranet or listserv
  - The lead analyst might Reply All to analysts or call a teleconference
- ICES currently has an internal bulletin board on its intranet where analysts post questions and answers; a similar solution may be possible for CNODES
- Establish scheduled check-in points between analyst and site liaison and between liaison and site lead, at multiple and earlier points in the process
- Where possible, have the lead analyst and project lead at the same site, and encourage greater communication between site leads and liaisons; there will be increased buy-in through these discussions
- Clarify expectations for liaisons
- Involve analysts in writing the analytic protocol/statistical analysis plan

Theme 3: Build upon existing resources for analysts, liaisons, and project leads

- Make sure everyone knows about the protocol development guide
- Create an analyst training guide and disseminate it
- Online training or training modules, as live and archived resources
- Compile a library of Strategic Analytic Plans (SAPs)/Analytic Protocols from previous projects
- Use the simulated dataset to test key methodologies or sections of code
- Test code, test logic: each site tests against the simulated data; are the code, logic, and results similar? This could be incorporated into phase 1 of every study (a minimal sketch of such a check follows this list)
- Automation to eliminate human error: standard (tested) macros could be sent to each site and used to build the base cohort
- Automate cohort creation
- Share SAS code for commonly/frequently used methods (check and recheck?)
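A simulated-data check of shared code might look like the sketch below, which reuses the illustrative %flag_diabetes macro from Theme 1 above. The simulated dataset name and the expected count are placeholders, not real CNODES values.

/* Run the shared macro on the simulated dataset */
%flag_diabetes(in=simulated_claims, out=sim_flagged);

/* Count the flagged records */
proc sql noprint;
  select sum(diabetes) into :n_flagged trimmed from sim_flagged;
quit;

/* Compare against the expected value published with the protocol */
%macro check_expected(expected=);
  %if &n_flagged ne &expected %then
    %put WARNING: simulated-data check FAILED (got &n_flagged, expected &expected);
  %else
    %put NOTE: simulated-data check passed (&n_flagged records flagged);
%mend check_expected;
%check_expected(expected=150);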

Theme 4: Implement checks early in the process

- Establish an internal review committee for each protocol, with at least three people
- Create a line diagram/flowchart to define cohort creation (this was done in a previous protocol; implement it for all)
- A flowchart is very good for ensuring quality; also some tables and a histogram for PS (standard graphs)
- Include data checkpoints in analytic protocols, e.g. indicating that the follow-up time range should be 1-730 days, and include more expected values in the protocol (a minimal checkpoint sketch follows this list)
- Have analysts create a flowchart that documents each step of the coding; project leads then check and compare flowcharts from each site
- Create a checklist for analysts
- Implement any peer review of code earlier in the coding process
- Replicate studies within each centre, or randomly select one centre per study to replicate the analysis and cohort creation
- Use simulation data to test your code, to get common results across all sites
- Pilot testing
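A data checkpoint such as the 1-730 day follow-up range mentioned above could be coded as a defensive check that writes warnings to the SAS log. The dataset and variable names (cohort, patient_id, followup_days) are illustrative assumptions.

data _null_;
  set cohort end=last;
  /* Flag follow-up times outside the range stated in the protocol */
  if followup_days < 1 or followup_days > 730 then do;
    n_bad + 1;  /* sum statement: retained counter, initialized to 0 */
    if n_bad <= 10 then
      put "WARNING: follow-up out of range " patient_id= followup_days=;
  end;
  if last then do;
    if n_bad > 0 then
      put "WARNING: " n_bad "record(s) fail the 1-730 day follow-up checkpoint";
    else
      put "NOTE: follow-up checkpoint passed";
  end;
run;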

Other un-categorized suggestions

- Provide more time for coding/analysis
- Review the strengths and limitations of research in the field and state how the current study will improve upon it
- Anticipate analyses that may be requested at the point of publication; try to avoid re-doing analyses at publication or going back to add more code
- Documentation makes it easier to go back and re-run or add code for publications
- Try data visualization
- Pilot sites?

Question 3: Barriers/Facilitators

Theme 1: Barriers related to time

- This was a heavily emphasized point: analysts feel rushed in their work and do not have time to do adequate checks of their work
- The ratio of time spent building the protocol versus time to do the analysis is a barrier to quality assurance; more time is spent on protocol development, and can some of that be shifted to analysis time?

Theme 2: Barriers related to resources

- High turnover of staff and poor continuity of procedures
- Staff turnover and loss of corporate memory/historical knowledge; some organizations are at higher risk than others. It is difficult to hire and retain analysts, and training new analysts is time consuming
- Lack of incentives to retain CNODES analysts
- Many sites have limited staff capacity (e.g. only 1 or 2 analysts per site)
- Overload on the lead analyst

Theme 3: Barriers related to money/finances

- Money
- Institutional barriers (e.g. salary bands)

Theme 4: Barriers related to communication

- Lack of shared information/updates
- Challenge of communicating efficiently and clearly (e.g. by email, intranet, etc.)
- Language barriers
- Conflicting viewpoints of liaison and analyst

Theme 5: Barriers related to data or current project development practices

- Sample size is a barrier
- Challenge of capturing detail in the protocol
- Lack of centralized, easily accessible resources
- Lack of knowledge about available resources
- Comparison of site results happens late in the process
- Institutional policy around suppression of small cell sizes (SK site)
- Certain variables are unavailable at some sites (requiring an adjustment in the methods)
- Hard to standardize
- Need to share only well-tested code
Theme 6: Facilitators

- More money
- Training opportunities
- Authorship recognition (acknowledgements) for analysts
- Junior + senior analyst pairing on each project (takes time, but provides learning opportunities)
- Valuing a culture of learning, sharing, and teamwork
- A culture of reporting mistakes
- Written expectations about roles in projects
- Reviewing information available on site
- Scheduled project-specific analyst sessions
- Central tasks (e.g. DINs, CADTH)
- Analyst orientation to CNODES projects (e.g. frequently asked questions)
- Investigative teams
- SAS workshops
- Documentation of errors
- Connection between the methods lead and the lead analyst
- Duplication of meta-analysis
- Adapting QA procedures to the specific requirements of each site may help with implementation
