Documentum For Life Sciences
Version 16.4
Administration Guide
Preface ................................................................................................................................ 15
Chapter 1 Life Sciences Solution Fundamentals ......................................................... 17
Overview ......................................................................................................... 17
Document Domains .......................................................................................... 18
Roles ............................................................................................................... 20
Administrators ............................................................................................. 22
Consumer Import ......................................................................................... 23
Controlled Printers and Issued Printers (LSQM) ............................................. 23
Document Approvers ................................................................................... 24
Document Auditors ...................................................................................... 26
Document Authors ....................................................................................... 27
Document Contributors (LSTMF) .................................................................. 29
Document Coordinators ................................................................................ 29
Document Quality Organization Approvers (LSQM) ...................................... 30
Document Readers ....................................................................................... 31
Document Reviewers .................................................................................... 32
Managers (Product, Project, Trial, Study, Regulatory) ...................................... 32
Global External Participants (LSTMF) ............................................................ 34
Inspectors (LSTMF) ...................................................................................... 34
Investigators (LSTMF) .................................................................................. 34
Quality Check (LSTMF) ................................................................................ 34
Security Groups ........................................................................................... 35
Solution-Specific User Roles .......................................................................... 35
User Roles in Documentum for eTMF ........................................................ 35
Cross-functional User Groups ............................................................... 37
Reporting Groups ................................................................................. 37
External Trial Participant Roles .............................................................. 38
External Trial Participant Groups ........................................................... 39
User Roles in Documentum for Quality and Manufacturing ........................ 40
Cross-functional User Groups ............................................................... 43
Reporting Groups ................................................................................. 44
User Roles in Documentum for Research and Development ........................ 44
Cross-functional User Groups ............................................................... 52
Reporting Groups ................................................................................. 52
User Roles in Documentum Submission Store and View ............................. 52
Correspondence User Groups................................................................ 54
Cross-functional User Groups ............................................................... 55
Reporting Groups ................................................................................. 55
Control Categories ............................................................................................ 56
Preface
This guide contains information about administering, configuring, and extending the solutions that
are part of OpenText Documentum for Life Sciences. The Life Sciences solution includes:
• OpenText Documentum for eTMF (LSTMF)
• OpenText Documentum for Quality and Manufacturing (LSQM)
• OpenText Documentum for Research and Development (LSRD)
• OpenText Documentum Submission Store and View (LSSSV)
Note: Documentum Content Server is now OpenText Documentum Server. OpenText Documentum
Server will be called Documentum Server throughout this guide.
Intended Audience
This guide is intended for anyone responsible for configuring, extending, or administering any
products in the Documentum for Life Sciences solution. OpenText recommends completion of the
following training prior to using this guide:
• Technical Fundamentals of Documentum
• Composer Fundamentals
• D2 Configuration
• Life Sciences Fundamentals
• Life Sciences Trial Master File (LSTMF)
Information about these training courses is available on the OpenText website.
Revision History
Revision Date Description
April 2019 Updated the content in Configuring the D2 Configuration Migration Tool, page 379.
January 2019 Updated the steps in Connecting the iHub Analytical Designer with the Life Sciences Repository, page 235.
Chapter 1
Life Sciences Solution Fundamentals
This section provides an overview of the OpenText Documentum for Life Sciences solution.
Overview
The Documentum for Life Sciences solution helps organizations meet compliance requirements,
increase productivity, and securely collaborate across the extended enterprise. The Life Sciences
solution includes:
• Documentum for eTMF (LSTMF)
• Documentum for Quality and Manufacturing (LSQM)
• Documentum for Research and Development (LSRD)
• Documentum Submission Store and View (LSSSV)
The Life Sciences solution is built on OpenText Documentum D2 and the Documentum platform.
Individual solutions rely on reusable components provided by two base layers:
• Unified Solution Layer (also referred to as Controlled Document Foundation)
• Life Sciences Foundation
In this architecture, the lower layers have no dependencies on the higher layers. For example, the
Life Sciences Foundation components do not depend on any types, configurations, Java classes, and
so forth defined in the LSQM, LSTMF, LSRD, or LSSSV solution layers. The Unified Solution Layer
components do not depend on any components in the Life Sciences Foundation or specific solution
layers. Conversely, dependencies on lower layer components are encouraged.
D2 configurations are assigned to one of the following applications corresponding to the layered
architecture:
• Documentum for Quality and Manufacturing solution
• Documentum for eTMF solution
• Documentum for Research and Development solution
• Documentum Submission Store and View solution
Document Domains
The Life Sciences solutions provide an extensive inventory to manage documents relevant for each
functional area within Life Sciences organizations. Documents stored and managed by the Life
Sciences solutions are categorized according to the functional area in which they are used. This
top-level categorization is called the document domain. Every document is automatically assigned
a domain when it is created. The domain assignment at document creation or import time is
accomplished using a D2 Default Values template configuration associated with the creation profile
of the document.
The standard domains provided with each solution mirror those defined by the DIA Electronic
Document Model (EDM) reference model. The DIA EDM reference model is patterned after the
Electronic Common Technical Document (eCTD) standard. The DIA website provides more
information about the DIA EDM reference model.
In Documentum for Research and Development, domains and inventory are based on the DIA EDM
Reference Model, with extensions where appropriate. The DIA EDM reference model defines the
following domains:
• Clinical
• Labeling
• Non-Clinical
• Quality
• Regulatory/Administrative
These same domains are defined in the solution with additional inventory to manage Safety-PVG,
Promotional Materials, and Medical Device documents.
Documentum for Quality and Manufacturing provides an inventory mapped to the DIA EDM QM
Reference Model with extensions where appropriate. In addition to standard documents, the solution
includes domains to manage Design History File and Device Master Record components.
The following document domains are provided in the respective Life Sciences solutions:
Roles
A role is a type of group that contains the users or other groups assigned to a specific function. Roles
provide a means of defining groups that have a particular function within a system. For example,
pharmaceutical companies manage their large volumes of documentation by assigning roles such
as authors, reviewers, approvers, and managers. Each role can have one or more people
designated to perform the activity.
Predefined roles (user and workspace roles) are installed with the Life Sciences solutions.
Administrators must add users or groups to the applicable predefined roles and ensure each user has
an assigned D2 workspace and appropriate access to documents and functions. For more information
about workspace roles, see Workspace Groups, page 223.
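For example, an administrator can populate a predefined role by adding a user or group to the corresponding repository group. The following DQL is a minimal sketch; the user name jsmith is hypothetical, and the same change can be made through Documentum Administrator or the D2 Client instead:

ALTER GROUP cd_clinical_doc_authors ADD jsmith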
A set of roles are provided for each of the following domains:
• Clinical
• Non-Clinical
• Quality
• Regulatory/Administration
• Safety
• Labeling
• Promotional Materials (Ad Promo)
• Correspondence
• GMP
• Medical Devices
In addition, the Documentum for Quality and Manufacturing (GMP) solution roles are assigned by
Applicable Sites. Applicable Sites are defined in a dictionary and can be determined by physical
manufacturing location and/or business groups. The Configuring Roles (LSQM), page 65 section
provides the steps to configure site-based roles.
All roles are installed for all solutions. However, some roles are typically used only in a particular
solution. The naming convention used for the group name for each role in the Documentum for
Research and Development solution is cd_<domain>_doc_<role>. For example, the group name for
the Author role in the Clinical domain is cd_clinical_doc_author.
The naming convention used for the group name for each role in the Documentum for Quality and
Manufacturing solution is cd_<applicable_site>_<role>. For example, the group name for the Document
Coordinator role for the Boston site is cd_boston_coordinators.
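To confirm which role groups exist in a repository, you can query dm_group using these naming conventions. This is a sketch only; adjust the pattern to the domain or site you are interested in:

SELECT group_name FROM dm_group WHERE group_name LIKE 'cd_clinical_doc_%' ORDER BY group_name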
Some roles have default members. For example, Document Coordinators are members of the
Document Authors role. A top-level role for each domain is also installed. All domain roles are made
members of the top-level domain role as shown in the following table.
Administrators
Administrators have the ability to modify most D2 dictionaries and taxonomies as well as other
configuration objects, including Auto Inheritance Config and Delete Config objects.
Consumer Import
The Consumer Import group includes users who are typically consumers of documents but also have
the ability to import documents into the system.
cd_ad_promo_consumers_imp
cd_clinical_consumers_imp
cd_corres_consumers_imp
cd_general_consumers_imp
cd_gmp_readers_imp
cd_labeling_consumers_imp
cd_md_clinical_consumers_imp
cd_md_consumers_imp
cd_md_non_clinical_consumers_imp
cd_md_regulatory_consumers_imp
cd_non_clinical_consumers_imp
cd_quality_consumers_imp
cd_regulatory_consumers_imp
cd_safety_consumers_imp
Is a member of <none>
Controlled Printers and Issued Printers (LSQM)
cd_controlled_print_printers
cd_controlled_print_reprinters
cd_controlled_print_recall
cd_issued_print_admin
cd_issued_print_approvers
cd_issued_print
cd_issued_print_printers
cd_issued_print_reconcilers
cd_issued_print_reprinters
Contains members cd_admingroup
cd_gmp_coordinators
Is a member of <none>
Document Approvers
Document Approvers are responsible for approving controlled documents. They perform the For
Approval task in a controlled document workflow.
cd_ad_promo_template_approvers
cd_clinical_doc_approvers
cd_clinical_template_approvers
cd_corres_doc_approvers
cd_corres_template_approvers
cd_general_doc_approvers
cd_general_template_approvers
cd_global_approvers
cd_gmp_approvers
cd_gmp_template_approvers
cd_labeling_doc_approvers
cd_labeling_template_approvers
cd_md_clinical_doc_approvers
cd_md_clinical_template_approvers
cd_md_doc_approvers
cd_md_non_clinical_doc_approvers
cd_md_non_clinical_template_approvers
cd_md_regulatory_doc_approvers
cd_md_regulatory_template_approvers
cd_med_device_template_approvers
cd_non_clinical_doc_approvers
cd_non_clinical_template_approvers
cd_quality_doc_approvers
cd_quality_template_approvers
cd_regulatory_doc_approvers
cd_regulatory_template_approvers
cd_safety_doc_approvers
cd_safety_template_approvers
Contains members cd_admingroup
cd_gmp_all_users (LSQM)
Document Auditors
Document Auditors inspect the state of documents in their respective domain for audit-readiness.
Document Auditors have read-only access to Effective/Final/Released, Superseded, and Expired
documents and no access to documents in other states.
cd_clinical_doc_auditors
cd_corres_doc_auditors
cd_general_doc_auditors
cd_global_auditors
cd_gmp_auditors
cd_labeling_doc_auditors
cd_md_clinical_doc_auditors
cd_md_doc_auditors
cd_md_non_clinical_doc_auditors
cd_md_regulatory_doc_auditors
cd_non_clinical_doc_auditors
cd_quality_doc_auditors
cd_regulatory_doc_auditors
cd_safety_doc_auditors
Contains members cd_admingroup
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)
cd_gmp_all_users (LSQM)
Document Authors
Document Authors create documents and submit them for collaborative editing, review, and
approval. They can self-approve documents that do not require formal review and approval.
cd_ad_promo_template_authors
cd_clinical_doc_authors
cd_clinical_doc_authors_tmf
cd_clinical_template_authors
cd_corres_doc_authors
cd_corres_template_authors
cd_general_doc_authors
cd_general_template_authors
cd_global_authors
cd_gmp_authors
cd_gmp_template_authors
cd_labeling_doc_authors
cd_labeling_template_authors
cd_md_clinical_doc_authors
cd_md_clinical_template_authors
cd_md_doc_authors
cd_md_non_clinical_doc_authors
cd_md_non_clinical_template_authors
cd_md_regulatory_doc_authors
cd_md_regulatory_template_authors
cd_med_device_template_authors
cd_non_clinical_doc_authors
cd_non_clinical_template_authors
cd_quality_doc_authors
cd_quality_template_authors
cd_regulatory_doc_authors
cd_regulatory_template_authors
cd_safety_doc_authors
cd_safety_template_authors
Contains members cd_<domain>_doc_coordinators
cd_admingroup
Note: The cd_general_doc_authors role is not a member of any Document Coordinators role because
there is no corresponding Document Coordinators role for the General domain.
All domain Authors roles, except the cd_general_doc_authors role itself, are members of the
cd_general_doc_authors role. This enables authors in all domains to create General documents.
Documentum for Quality and Manufacturing does not use the cd_general_doc_authors role.
Document Contributors (LSTMF)
tmf_contributors
tmf_external_contributors
Contains members tmf_external_contributors
tmf_external_reviewers
tmf_investigators
tmf_inspectors
Is a member of cd_clinical
Document Coordinators
Document Coordinators manage the release of controlled documents. They can also create documents
and submit them for collaborative editing, review, and approval. Document Coordinators monitor
the progress of document workflow tasks. They can change workflow task performers.
cd_clinical_doc_coordinators
cd_corres_doc_coordinators
cd_general_doc_coordinators
cd_global_coordinators
cd_gmp_coordinators
cd_labeling_doc_coordinators
cd_md_clinical_doc_coordinators
cd_md_doc_coordinators
cd_md_non_clinical_doc_coordinators
cd_md_regulatory_doc_coordinators
cd_non_clinical_doc_coordinators
cd_quality_doc_coordinators
cd_regulatory_doc_coordinators
cd_safety_doc_coordinators
Contains members cd_admingroup
cd_non_clinical_managers
cd_product_managers
cd_regulatory_managers
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)
cd_gmp_all_users (LSQM)
Document Quality Organization Approvers (LSQM)
cd_gmp_qo_approvers
Contains members cd_admingroup
Is a member of cd_gmp_all_users
Document Readers
Document Readers have read-only access to Effective versions of documents. They browse for,
search and read documents.
cd_clinical_doc_readers
cd_corres_doc_readers
cd_general_doc_readers
cd_global_readers
cd_gmp_readers
cd_labeling_doc_readers
cd_md_clinical_doc_readers
cd_md_doc_readers
cd_md_non_clinical_doc_readers
cd_md_regulatory_doc_readers
cd_non_clinical_doc_readers
cd_quality_doc_readers
cd_regulatory_doc_readers
cd_safety_doc_readers
Contains members cd_admingroup
cd_regulatory_managers
cd_regulatory_publisher
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)
cd_gmp_all_users (LSQM)
Document Reviewers
Document Reviewers review documents and edit documents using annotations. They are responsible
for technical reviews during the authoring and review cycle. Reviewers complete workflow tasks and
can browse and search for documents.
cd_clinical_doc_reviewers
cd_corres_doc_reviewers
cd_format_reviewers
cd_general_doc_reviewers
cd_global_reviewers
cd_gmp_reviewers
cd_labeling_doc_reviewers
cd_md_clinical_doc_reviewers
cd_md_doc_reviewers
cd_md_non_clinical_doc_reviewers
cd_md_regulatory_doc_reviewers
cd_non_clinical_doc_reviewers
cd_quality_doc_reviewers
cd_regulatory_doc_reviewers
cd_safety_doc_reviewers
tmf_external_reviewers
Contains members cd_admingroup
cd_regulatory_managers
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)
cd_gmp_all_users (LSQM)
Managers (Product, Project, Trial, Study, Regulatory)
Product Managers are responsible for all documents across the regulatory applications,
non-clinical studies, clinical trials, and quality projects associated with particular products. They
create and manage Product Registration Forms.
cd_clinical_managers
cd_clinical_trial_managers_tmf
cd_corres_managers
cd_gmp_item_registration_group
cd_labeling_managers
cd_md_managers
cd_non_clinical_managers
cd_product_managers
cd_quality_managers
cd_<domain>_ref_copy_mgrs
cd_regulatory_activity_managers
cd_regulatory_managers
cd_safety_managers
Contains members cd_admingroup
cd_product_registration_group
cd_project_registration_group
Is a member of cd_<domain>
cd_<domain>_doc_coordinators
Note: Because the cd_product_managers role spans multiple domains, this role is a member of the
following roles:
• cd_clinical_doc_coordinators
• cd_non_clinical_doc_coordinators
• cd_quality_doc_coordinators
• cd_regulatory_doc_coordinators
Inspectors (LSTMF)
Document Inspectors are the users who can inspect the TMF documents uploaded to a placeholder
to which they have access.
Investigators (LSTMF)
TMF Investigators are the users who can import the TMF document for a placeholder of a particular
site, country, or trial but cannot make it Effective/Approved/Final.
Quality Check (LSTMF)
cd_clinical_qc_business
Is a member of cd_<domain>
Security Groups
These groups are the document admins who can modify the configured roles on the document and
the model admins who can change the security model on the document.
<domain>_model_admin
Contains members <none>
Is a member of cd_<domain>
User Roles in Documentum for eTMF
Documentum for eTMF provides defined user roles that enable or restrict user access to documents
and information in the system. The following table describes the user roles:
*These roles are external participants. They can receive access to documents associated with a country
or site. The use of the term external does not require the user to be a contractor or otherwise external
to the system. It means that they do not have global access to all documents in the system and only
have access to what managers specifically grant to them. Managers can grant the access for a limited
time. External Trial Participant Roles, page 38 provides more information.
Cross-functional User Groups
Groups Description
cd_admingroup Administrator:
• Has access to administrative functions
Reporting Groups
Documentum for eTMF provides an optional reporting feature for monitoring active clinical trials
based on AMPLEXOR myInsight for Documentum (myInsight). The reports can be customized
using myInsight.
In Documentum D2, users view dashboards containing report information. The following table
describes the reporting groups used by the Report Generator:
Groups Description
report_user Report Users can generate reports and view
historical data in the form of saved reports.
report_builder Report Builders can manage report definitions
and presentations.
report_administrator Report Administrators can define categories and
scheduling of reports.
External Trial Participant Roles
Documentum for eTMF enables you to add a named user to participate as a particular role within
the LSTMF system. These users, known as external participants, can receive access to countries or
sites. External participants only have access to the documents associated with the entity granted.
Administrators specify document access levels when configuring the user roles. External trial
participants require a Documentum user account (dm_user) for system access.
Note: The use of the term external does not require the user to be a contractor or otherwise external to
the system. It means that they do not have global access to all documents in the system and only have
access to what managers specifically grant to them.
Managers can register external participants for countries and sites to grant access for the specified
entities. They register the external participants by adding them to the relevant site or country
registration forms. By default, users who have Write access to the registration form can add external
trial participants. The Access Control tab of a registration form defines the users who can update the
form. For example, a clinical trial manager or a local site administrator delegated to act in this role
can add external participants to registration forms. These participants receive the access specified in
the registration form.
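To verify which users currently hold access for a given entity, you can inspect the membership of the corresponding entity group (the group naming patterns are described later in this section). The following DQL is a sketch using a hypothetical trial group name:

SELECT users_names FROM dm_group WHERE group_name = 'tg_trial001'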
The following table describes the default external participant roles:
Role Description
Investigator A clinical investigator responsible for
administering the drug or therapy to subjects
(patients or volunteers) and recording clinical
data on each subject.
Inspector A representative of the health authority or
regulatory agency responsible for ensuring that
good clinical practice is followed during the
conduct of the trial.
External Contributor A producer of specific documents or a person
who can typically import LSTMF documents.
For example, a member of a Contract Research
Organization.
External Reviewer A peer reviewer or participant of specific
documents. For example, an expert in the
relevant field of medicine.
External Trial Participant Groups
The system creates user groups for providing document access to external trial participants. The
following table describes the external trial participant user groups:
Groups Description
tmf_global_external_participants Provides access rights common to all external
participants. This group is a TMF group.
tmf_contributors Provides read-only and browse access to the
top-level Clinical cabinet and product folders.
All clinical trial participants belong to this group
indirectly.
tmf_external_contributors Assigns a workspace to the External
Contributors.
tmf_external_reviewers Assigns a workspace to the External Reviewers.
tmf_inspectors Assigns a workspace to the Inspectors.
tmf_investigators Assigns a workspace to the Investigators.
pg_<product-code> (Product group) Provides Read access to the product-level folders.
tg_<trial-ID> (Trial group) Provides Read access to the top-level TMF folder for the trial.
cg_<trial-ID>_<country-code> (Country group) Provides Read access to the top-level country folder for the trial.
sg_<site-ID> (Site group) Provides Read access to the top-level site folder for the trial.
Each external trial participant user group is further divided into subgroups for roles with the
following suffixes:
• Investigator: _inv
• Inspector: _insp
User Roles in Documentum for Quality and Manufacturing
Documentum for Quality and Manufacturing provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:
Cross-functional User Groups
Groups Description
cd_admingroup Administrator:
• Has access to administrative functions
cd_issued_print_admin The only user that should be part of this
group is the user used in the Controlled Print /
Issued Print web application. That is, the user
configured in the properties file to log into the
repository. This user is granted access to specific
D2 contexts for virtual document publishing.
cd_issued_print_approvers Members of this group are allowed to approve
Issued Print Requests. By default, this group
will contain the cd_issued_print_printers group.
cd_issued_print Members of this group are allowed to see the
Issued Print Reporting option. By default, it
includes cd_issued_print_printers, cd_issued
_print_reprinters, cd_issued_print_approvers
and cd_issued_print_reconcilers.
cd_issued_print_printers Members of this group are allowed to create an
Issued Print.
cd_issued_print_reconcilers Members of this group have access to reconcile
issued prints. By default, this group will contain
the cd_issued_print_printers group.
cd_issued_print_reprinters Members of this group are allowed to reprint
an existing issued print. By default, the
cd_issued_print_reprinters group will contain
the cd_issued_print_printers group.
Note: In addition to the Issued Print groups specified in the cross-functional user groups table,
the system provides site-based approvers (cd_issued_print_<site>_approvers) and reconcilers
(cd_issued_print_<site>_reconcilers) groups. For more information about creating these groups,
see Issued Print Site-based Groups, page 261.
Reporting Groups
Documentum for Quality and Manufacturing provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.
User Roles in Documentum for Research and Development
Documentum for Research and Development provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:
cd_md_clinical_doc_approvers
cd_md_clinical_template_approvers
cd_md_doc_approvers
cd_md_non_clinical_doc_approvers
cd_md_non_clinical_template_approvers
cd_md_regulatory_doc_approvers
cd_md_regulatory_template_approvers
cd_non_clinical_doc_approvers
cd_quality_doc_approvers
cd_regulatory_doc_approvers
cd_safety_doc_approvers
cd_<domain>_template_approvers
Auditors Have read-only access to audit logs, Approved, and Superseded documents. They can view document content, history, and properties. They must have a minimum extended privilege of View Audit.
cd_ad_promo_doc_auditors
cd_clinical_doc_auditors
cd_labeling_doc_auditors
cd_md_clinical_doc_auditors
cd_md_doc_auditors
cd_md_non_clinical_doc_auditors
cd_md_regulatory_doc_auditors
cd_non_clinical_doc_auditors
cd_quality_doc_auditors
cd_regulatory_doc_auditors
cd_safety_doc_auditors
cd_md_non_clinical_doc_authors
cd_md_non_clinical_template_authors
cd_md_regulatory_doc_authors
cd_md_regulatory_template_authors
cd_non_clinical_doc_authors
cd_quality_doc_authors
cd_regulatory_doc_authors
cd_safety_doc_authors
cd_<domain>_template_authors
cd_labeling_consumers_imp
cd_md_clinical_consumers_imp
cd_md_consumers_imp
cd_md_non_clinical_consumers_imp
cd_md_regulatory_consumers_imp
cd_non_clinical_consumers_imp
cd_quality_consumers_imp
cd_regulatory_consumers_imp
cd_safety_consumers_imp
Document Coordinators Manage the release of controlled documents.
• Control Category 2: Approvers can act as Document Coordinators.
• Control Category 3: Authors can act as Document Coordinators.
cd_ad_promo_doc_coordinators
cd_clinical_doc_coordinator
cd_labeling_doc_coordinators
cd_md_clinical_doc_coordinators
cd_md_doc_coordinators
cd_md_non_clinical_doc_coordinators
cd_md_regulatory_doc_coordinators
cd_non_clinical_doc_coordinator
cd_quality_doc_coordinator
cd_regulatory_doc_coordinator
cd_safety_doc_coordinator
cd_quality_managers
cd_<domain>_ref_copy_mgrs
cd_regulatory_activity_managers
cd_regulatory_managers
cd_safety_managers
cd_labeling_doc_readers
cd_md_clinical_doc_readers
cd_md_doc_readers
cd_md_non_clinical_doc_readers
cd_md_regulatory_doc_readers
cd_non_clinical_doc_readers
cd_quality_doc_readers
cd_regulatory_doc_readers
cd_safety_doc_readers
Reviewers Review documents using annotations and edit documents. They are responsible for technical review during the authoring and review cycle.
cd_ad_promo_doc_reviewers
cd_clinical_doc_reviewers
cd_format_reviewers
cd_labeling_doc_reviewers
cd_md_clinical_doc_reviewers
cd_md_doc_reviewers
cd_md_non_clinical_doc_reviewers
cd_md_regulatory_doc_reviewers
cd_non_clinical_doc_reviewers
cd_quality_doc_reviewers
cd_regulatory_doc_reviewers
cd_safety_doc_reviewers
cd_md_nonclinical_sec_doc_admin
cd_md_regulatory_sec_doc_admin
cd_nonclinical_sec_doc_admin
cd_quality_sec_doc_admin
cd_regulatory_sec_doc_admin
cd_safety_sec_doc_admin
cd_ad_promo_sec_model_admin
cd_clinical_model_admin
cd_labeling_sec_model_admin
cd_md_clinical_sec_model_admin
cd_md_nonclinical_sec_model_admin
cd_nonclinical_sec_model_admin
cd_quality_sec_model_admin
cd_regulatory_sec_model_admin
cd_safety_sec_model_admin
cd_md_regulatory_sec_model_admin
Administrators cd_admingroup Access administrative functions but do not
have access to controlled documents.
Cross-functional User Groups
Groups Description
cd_admingroup Administrator:
• Has access to administrative functions
Reporting Groups
Documentum for Research and Development provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.
User Roles in Documentum Submission Store and View
Documentum Submission Store and View provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:
Correspondence User Groups
Use the correspondence user groups for correspondence documents. The following table describes
the correspondence user groups:
Groups Description
cd_corres Users of Correspondence documents. Users in
this role are able to access the Correspondence
domain documents and have at least READ
permission on the documents.
cd_corres_consumers_imp Consumers who can import documents. They
have read-only access to Approved versions.
cd_corres_doc_approvers Correspondence document approvers approve
controlled documents. Some documents require
electronic signatures.
cd_corres_doc_auditors Correspondence document auditors have
read-only access to audit logs, Approved,
and Superseded documents. They can view
document content, history, and properties.
cd_corres_doc_authors Correspondence document authors create and
edit Correspondence documents in the Draft
state. For Control Category 2 documents,
Authors cannot also be Approvers.
cd_corres_doc_coordinators Correspondence document coordinators
manage the release of controlled documents.
For Control Category 2 documents, Approvers
can act as Document Coordinators.
cd_corres_doc_readers Correspondence document readers have
read-only access to Approved versions.
cd_corres_doc_reviewers, cd_format_reviewers Correspondence document reviewers review
documents using annotations and edit documents. They are responsible for technical
review during the authoring and review cycle.
cd_corres_managers Correspondence project managers create
and manage registration forms for the
Correspondence domain.
cd_corres_template_authors Correspondence template document authors
create and manage template Correspondence
documents.
cd_corres_template_approvers Correspondence template document approvers
approve template documents.
Cross-functional User Groups
Groups Description
cd_admingroup Administrator:
• Has access to administrative functions
Reporting Groups
Documentum Submission Store and View provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.
Control Categories
The term control category refers to the level of regulation that should be applied to a document. The
control category for each artifact is defined in the creation profile through the assigned Default
Values template and lifecycle.
The control category for a document is stored in the Category attribute. It is an integer value of either
1, 2, 3, or 4. The value of the Category attribute determines the lifecycles and workflows that are
applied to a document. The Life Sciences solutions provide four control categories.
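Assuming the control category is exposed as the category attribute on the cd_controlled_document base type, as the context conditions described later in this guide suggest, a quick DQL sketch shows how the current documents in a repository are distributed across the categories:

SELECT category, COUNT(*) FROM cd_controlled_document GROUP BY category ORDER BY category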
The common security features for all control categories:
• Allow joint and collaborative authoring in Documentum D2.
• Restrict access to content based on user role and lifecycle state.
• Allow complete withdrawal of the document with optional retention of historic copies.
The following table describes the security of the document control categories:
Category 2 Controlled documents that require formal review and approval. They do not require signoff by the Quality Organization.
Chapter 2
Customizing D2-Based Solutions
The Life Sciences solutions are configured in D2 Config. The configurations are based on life science
industry best practices and stored in a set of Documentum D2 applications. Topics in this chapter
describe configurable features and tasks for experienced Documentum and Documentum D2
administrators to perform to meet their business requirements. To generate a document listing the
default configurations, select Specifications > Generate specifications from D2 Config.
Some administrative functions are available in the D2 Client.
The Documentum D2 Administration Guide provides more information. The Appendix D, D2
Configurations section provides the list of D2 configurations that must not be renamed or removed.
In the example below, the new application is named KB Pharma. All of the configurations that are
either modified or added by the customer will be assigned this application. Existing OpenText
applications defined by the base solution should be left in the Applications list for base solution
configurations as well.
Assignment of a custom application enables you to easily identify your custom configurations and
package them for deployment and backup.
Reconciling the changes between the new version of the base configuration (in this example,
Procedure Properties) and the extension (in this example, KB Procedure Properties) is a manual
process that involves either:
• Using a diff and merge tool on the two versions of the XML file and then importing the updated
XML file, or
• Using D2-Config to extend the upgraded version of the configuration and reapplying your
changes.
The following image shows the comparison of two versions of a creation profile using a diff and
merge tool:
Notice that the changes are highlighted. In this case, the version of the file on the left allows for
creation of two additional document types. There are also some changes in the properties of the
creation profile highlighted. After you have verified the changes, you can merge the changes using
the tool.
Note: Between the D2 4.5 and 4.6 releases, the D2 config XML has undergone some changes. For instance,
in D2 4.5, the display config is named d2_attribute_mappings.xml, but in 4.6 it is just .xml.
Note: In some sections, there is a separation between the configuration definition and its settings.
This can be seen as a “contents” folder under the main folder. While there may be no changes in the
definition file, there may be changes in the content file. It is recommended that you check both files.
Extending Roles
If you require custom roles to be created in the Life Sciences solution, you can do so by extending the
predefined roles that are installed with the application. Extending the roles involves making property
page changes and workflow configuration changes through D2-Config.
Configuring Workflows
When a workflow is started, the system specifies the performers for each state of the workflow. By
default, users who are members of the default roles are assigned as performers to the workflow.
However, if the default roles are extended with custom roles, the query that assigns the performers
must be modified.
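At its core, such a query resolves a role group to its members. The following DQL is only a sketch of that idea, not the exact query shipped with the solution; if you extend a default role such as cd_clinical_doc_reviewers with a custom group, the configured query must reference the custom group name instead:

SELECT users_names FROM dm_group WHERE group_name = 'cd_clinical_doc_reviewers'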
1. Log in to D2-Config.
2. Select Go to > Workflow.
Configuring Roles (LSQM)
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select GMP Applicable Sites.
5. On the Languages tab, you can add or remove the required sites.
6. On the Alias tab, under each role, define the respective user group for the respective site. For
example, for the Boston site, under authors, type cd_boston_authors.
7. Click Save.
All Documentum for Quality and Manufacturing groups are cd_<site>_<role>. Custom roles should
follow the same paradigm as the queries in the system are built to look for that structure. Note that
this also applies for custom LSRD, LSSSV, and LSTMF roles where it is cd_<domain>_doc_<role>. The
queries are built expecting that structure as well, although they typically take the group_name
(for example, cd_clinical, cd_quality, and so on).
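For example, after adding the Boston site to the GMP Applicable Sites dictionary, the matching site-based groups can be created and populated. The following DQL is a sketch under that assumption (jsmith is a hypothetical user, and the groups can equally be created in Documentum Administrator):

CREATE GROUP cd_boston_authors
ALTER GROUP cd_boston_authors ADD jsmith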
Chapter 3
Life Sciences Data Model
This section provides an overview of the object data model used in the Life Sciences solutions.
Two base types, cd_controlled_document and cd_common_ref_model, define metadata that pertains
to all subtypes. Users do not directly create instances of these two types.
Types with the _info suffix represent registration forms. Registration forms are created and managed
by the Business Administrators in each respective domain, that is, Product Managers, Clinical Trial
Managers, Non-Clinical Study Managers, Project Managers, and Regulatory Managers.
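To see which document and registration-form types are installed in a repository, you can list the direct subtypes of the two base types with DQL. This is a sketch; it returns only immediate subtypes, not the full type hierarchy:

SELECT name FROM dm_type WHERE super_name = 'cd_controlled_document' ORDER BY name
SELECT name FROM dm_type WHERE super_name = 'cd_common_ref_model' ORDER BY name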
Users in each respective domain in the appropriate roles can create instances of the types listed
in the following table.
Caution: Extending the object model with additional levels can affect the performance of the
system.
Chapter 4
Reference Models
Quality, Non-Clinical, and Clinical. Each tab lists the document artifacts for that domain and the
metadata for the documents. The documents are categorized according to group and subgroup. The
following image is a small snippet of the DIA EDM reference model spreadsheet:
The Artifact Name column lists the individual document artifacts for the given domain. In the
snippet, artifacts for the Quality domain are shown. The Group and Sub-Group columns show how
each artifact should be categorized. The columns to the right of Artifact Name list the metadata
for documents in the given domain. An “M” in a cell indicates the attribute is mandatory for
the associated artifact.
This model is implemented in the Life Sciences solution using D2 dictionaries and taxonomies. D2
property pages enforce mandatory attributes as defined in the model.
For the purpose of explanation, the Quality Domain dictionaries are used as an example. The
complete set of D2 dictionaries that list the groups, subgroups, and artifacts defined in the DIA
EDM and TMF model are described in Appendix C. The Quality Domain dictionaries used in the
implementation of the DIA EDM reference model are listed in the table below. Notice that these
dictionaries map to the columns of the DIA EDM reference model spreadsheet.
These dictionaries are used in D2 creation profile configurations to enable users to create specific
document artifacts. In the following creation profile image, the Quality Drug Product Artifact Names
dictionary is used in a creation profile to enable users to create Quality – Drug Product documents:
Each value mapped to the Dictionary alias for a key in the TMF Models dictionary refers to a
dictionary of its own. For example, the DIA TMF 2.0 key is associated with the TMF Unique Artifact Names 2.0
dictionary. In the TMF Unique Artifact Names 2.0 dictionary, information about an artifact is stored
as aliases (columns), as shown in the following image:
When you create placeholders and TMF documents, the creation logic uses the respective dictionary,
which is determined by the reference model of the trial, and populates the relevant artifact
information in the property pages. A default value template, TMF Placeholder Defaults, is defined
to apply the default values for all the placeholders and documents that are created. Anything that
changes from artifact to artifact is defined as an alias in the dictionary and a lookup expression is
used in the default values template to determine the values from the dictionary.
The following table lists the taxonomy properties for the D2 dictionaries in D2-Config.
Table 6. Taxonomy
Correspondence Domain
The Correspondence domain had the following dictionaries and taxonomies in the previous design:
types based on its group and artifact mappings, the workflow mappings for each artifact, and also
the property page acronyms for it.
• The preceding creation profiles are now replaced with the following creation profiles (with
the exception of Regulatory Correspondence Template Management Artifacts, Regulatory
Correspondence Attachments):
— Regulatory Correspondence Documents - Agency
— Regulatory Correspondence Documents - General Non Product
— Regulatory Correspondence Documents - Internal
— Regulatory Correspondence Documents - Submission Receipt Confirmation
— Regulatory Correspondence Email - Agency
— Regulatory Correspondence Email - General Non Product
— Regulatory Correspondence Email - Internal
— Regulatory Correspondence Telephone - Agency Contact Record
— Regulatory Correspondence Telephone - General Non Product
— Regulatory Correspondence Template Management Artifacts
— Regulatory Correspondence Attachments
These creation profiles are configured with a single document type, New Document. Users must
select document properties including artifact_name in the Properties page. The data in the
Properties page is sourced from the single dictionary, Regulatory Correspondence Classification
by Groups and Artifacts. All the artifacts within the Correspondence domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
• The dictionary contains the property page acronym for the respective property pages to be used
for that artifact. This acronym is used in the DQL query of the related property page to list the
artifacts.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Regulatory Correspondence Classification by Groups and
Artifacts dictionary instead of managing multiple configurations.
had to be made to more than one configuration resulting in a cumbersome process of making any
changes to the data model.
To centralize the definitions of groups, subgroups, and artifacts within the system and simplify
making changes to the data model, a new implementation of the reference model for each domain is
used with the objective to reduce the number of configuration elements and capture the data in a
single dictionary rather than scattering it into multiple configurations.
Regulatory Domain
The Regulatory domain had the following dictionaries and taxonomies in the previous design:
Clinical Domain
The Clinical domain had the following dictionaries and taxonomies in the previous design:
Non-clinical Domain
The Non-clinical domain had the following dictionaries and taxonomies in the previous design:
There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact Name, and so on.
• The preceding creation profiles have been modified with a single document type, New Document.
Users must select the document properties including artifact_name in the Properties page when
creating documents. The data in the Properties page is sourced from the single dictionary, Non
Clinical Artifacts. All the artifacts within the Non Clinical Domain are Category 2.
Note: Non-Clinical Literature Reference documents were Category 3 in Life Sciences 4.3 release.
As part of the new implementation, these documents have been converted to Category 2.
However, you can self-approve Non-Clinical Literature Reference documents.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary where it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Non Clinical Artifacts dictionary instead of managing
multiple configurations.
Quality Domain
The Quality domain had the following dictionaries and taxonomies in the previous design:
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Promotional Materials Artifact dictionary instead of
managing multiple configurations.
Labeling Domain
The Regulatory Labeling domain had the following dictionaries and taxonomies in the previous
design:
Properties page is sourced from the single dictionary, Regulatory-Labeling Artifacts. All the
artifacts within the Labeling domain are Category 2.
Note: A few artifacts, such as Artwork, SPL - Non-PLR, and SPL - PLR, were Category 3 in the Life
Sciences 4.3 release. As part of the new implementation, these documents have been converted to
Category 2. However, you can still self-approve these documents.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Regulatory-Labeling Artifacts dictionary instead of managing
multiple configurations.
Safety Domain
The Safety domain had the following dictionaries and taxonomies in the previous design:
in the Properties page. The data in the Properties page is sourced from the single dictionary,
Safety Artifact. All the artifacts within the Safety domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary where it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Safety Artifact dictionary instead of managing multiple
configurations.
Extending Dictionaries
You can extend a dictionary for any of the LSRD domains by adding a new artifact using the
following procedure:
Each of these components must be configured with the new artifacts that are to appear in the system.
Extending Dictionaries
In Documentum for eTMF, you must extend the following dictionaries:
• TMF Models if you want to add a new reference model.
• TMF Unique Artifact Names <version> if you want to add a new artifact.
To extend a dictionary by adding a new artifact, follow the steps in Extending Dictionaries, page 90.
Chapter 5
Document Creation Profile
The Life Sciences solutions provide several D2 creation profiles to enable users to create and import
documents and registration forms. There are one or more creation profiles for each LSRD, LSTMF,
and LSSSV domain. There is no manager role in LSQM as registration forms are not currently used
in this solution.
Note: For the Regulatory/Admin domain, only one artifact is available in the creation profile, that is,
New Document. Group, Subgroup, and Artifact information is captured on the Edit properties screen.
For Documentum for Quality and Manufacturing, within each creation profile, the individual
document types map to artifacts. In the case of Documentum for Research and Development,
Documentum Submission Store and View, and Documentum for eTMF, the creation profiles are
mapped to a single document type, New Document. Users must select the document properties
including artifact_name in the Properties page when creating documents. The data in the Properties
page is sourced from the single dictionary for that domain.
Each creation profile has an associated D2 dictionary that defines the document types (that is,
artifacts) for the relevant domain and group. For Quality and Manufacturing domains, the individual
document types within each creation profile map to groups rather than artifacts.
The Default Values template and lifecycle must always refer to the same control category number.
In the example, the Directive artifact in the LSQM Governance and Procedures domain will be
created as a Control Category 1 document. The assigned Default Values template ensures that the
category attribute is assigned the value 1. The document then matches contexts with the condition,
category=’1’. This ensures the correct Control Category 1 configurations are applied to the
document at runtime.
The Life Sciences solutions depend on correct default values to function correctly. The following table
lists attributes that must have default values when a document is created.
3. Under Properties, enter the profile details, such as name, description, list of applications, and
dictionary properties.
4. Click Save.
6. In the table, create a new row for each new artifact and select the following values in their
respective columns for each new row:
a. First column: Select a dictionary item from those you added in step 3.
b. Type column: Select the object type for the new document that will be created when a user
selects this item in the creation profile. If you created your own custom type, you can
specify the custom type.
c. O2 config column: Leave blank.
d. Property pages column: Select the property page configuration that should be displayed
during creation or import for this item.
e. Version column: Typically, you should select 0.1. This value ensures that the first effective
version of the document is version 1.0, while all draft or pre-effective versions are 0.x.
f. Inheritance column: Select an appropriate D2 inheritance configuration. If you created your
own inheritance definition, select the custom inheritance definition.
g. Default Values Template column: Select a Default Values template that corresponds to the
specific artifact, its domain, and/or its group and the desired control category.
h. Lifecycle column: Select the lifecycle corresponding to the desired control category for the
document. This must correspond to the Default Values template. If you created your own
lifecycle definition, select the custom lifecycle definition.
i. Workflow column: Leave blank.
7. Click Save.
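After saving the profile, it is worth creating a test document from one of the new artifacts and confirming that the Default Values template and lifecycle were applied. The following DQL is a sketch only; the artifact name is a placeholder, and artifact_name is assumed to be the attribute populated by the property page:

SELECT object_name, artifact_name, category, r_version_label FROM cd_controlled_document WHERE artifact_name = 'My New Artifact'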
Caution: The system generates the document_id from a 9-digit counter, which allows 100
million unique document IDs. In D2-Config, the size of the counter can be increased by editing
the _Document_ID auto naming configuration. However, the document_id attribute in the
cd_controlled_doc object type is a ten character string. When the number of digits is increased beyond
10, the cd_controlled_doc object type must also be changed.
If the counter value on the _Document_ID auto naming configuration is reset after creating
documents, subsequent document IDs might not be unique. Consequently, you should not reset
this counter value in a production repository.
If two autonaming configurations have the same prefix, the system does not validate the
configurations for existing documents. This can result in duplicate document names in the system.
For example:
• QUA - Quality
• QUA - Qualitative Use Assessment
You can avoid this by defining a different prefix for the document in the respective Prefixes dictionary
in D2-Config.
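To check for duplicates after changing prefixes or counters, DQL queries along the following lines can help. This is a sketch that assumes the document_id attribute described in the caution above; by default only current versions are returned:

SELECT document_id, COUNT(*) FROM cd_controlled_document GROUP BY document_id HAVING COUNT(*) > 1
SELECT object_name, COUNT(*) FROM cd_controlled_document GROUP BY object_name HAVING COUNT(*) > 1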
Document versions are minor (v 0.1) until they are made effective. Documents in an
Effective/Approved/Final state have major version numbers (v 1.0).
updating the Product Code that then cascades down to the documents. The Documentum D2
Installation Guide provides information about O2 configurations.
Option Description
Form Details
Default Configuration Select to make this configuration the default
view configuration. If more than one default
configuration is applicable for the end user in
a context within the configuration matrix, D2
uses the top-most configuration in the C2 View
Configuration page.
Compatibility Select the oldest compatible version of the PDF
viewing software.
Protection Details
Encryption Level Determines the type of encryption (40 or 128 bit).
User password The password required for opening the PDF.
Password to modify preference The password required for modifying the PDF.
Permissions Check the checkboxes to allow the user to print,
modify content, copy, or modify annotations.
Initial View Details
Page Layout Defines the layout of the document, for example,
multiple pages, single page, and so on.
Display Determines which panels to display on initial
view, for example, bookmarks or just the PDF
page, and so on.
Displayed page on running Select the page to display on initial view.
The following table lists the C2 View configurations used by the Life Sciences solutions:
Note: When exporting Audit Report C2 View and TBR Audit Report C2 View, the Audit table will
be appended in front of the document, followed by the rest of the document content. There is no way
to show only the Audit Report through C2 View configurations.
1. Create a folder (for example, /realtimeclient_config) in the client file system with the
following structure:
a. Copy aek.key from the %CTS%\config folder on the CTS host machine to the folder you
created in Step 1. This is used in the preferences.xml file as part of <AekFilePath>.
b. Copy mspassword.txt from the %CTS%\docbases\<repository>\config\pfile on
the CTS host to the pfile folder. This is used in the preferences.xml file as ServerProperty
passwordFile.
c. Create an empty cache folder. This is used in preferences.xml as ServerProperty Cache.
d. Create the preferences.xml file. You can use the following sample content but replace the
highlighted values with the values corresponding to your environment:
<?xml version="1.0" encoding="UTF-8"?>
<ServiceConfiguration ID="CTS Web Services">
<PropertyList>
<ServerProperty Key="Cache" Description="The Temporary Cache Directory"
Value="C:/realtimeclient_config/cache" />
<ServerProperty Key="AllowDirectTransfer" Description="Allow Direct File
Transfer From CTS Server to Client. Set it to false if there is a firewall restriction"
Value="true" />
<ServerProperty Key="CTSWSPingInterval" Description="Interval (in seconds)
used to specify how frequent the LB should ping its CTS instances for heart rate."
Value="30" />
<ServerProperty Key="FailoverRetries" Description="Allow a number of retries
if a request fails while waiting on the HTTP response from CTS" Value="1" />
<ServerProperty Key="FailoverWait" Description="Wait between failover retries"
Value="1"/>
<ServerProperty Key="InstanceSelector" Description="Specify an implementation
class for instance selection" Value="com.emc.documentum.cts.lb.workers.OccupancyBasedSelector"/>
<ServerProperty Key="CTSOccupancyPollingInterval" Description="Specify occupancy
polling interval in seconds" Value="7"/>
<ServerProperty Key="ConnectionRetries" Description="Specify connection retries
(in case Repositories section is not configured )" Value="10"/>
<ServerProperty Key="AvailabilityRetries" Description="Number of retries when
CTS instances are not available" Value="2"/>
<ServerProperty Key="AvailabilityWait" Description="Number of seconds to wait
for rechecking availability" Value="4"/>
<!-- if local load balancer is used, no need of CTS-WebServices -->
<LoadBalancer type="local" URL="" sendMode=""/>
<!-- Otherwise, a remote CTS-WebServices can be used as Load Balancer, for ex: -->
<!-- <LoadBalancer type="remote" URL="http://10.31.158.35:8080/services/transformation
/LoadBalancer" sendMode="remote"/> -->
<Repositories>
<AekFilePath>C:/realtimeclient_config/aek.key</AekFilePath>
<LoginContext DocbaseName="lrx64">
<ServerProperty Key="domain" Value=""/>
<ServerProperty Key="userName" Value="Administrator"/>
<ServerProperty Key="passwordFile"
Value="C:/realtimeclient_config/pfile/mspassword.txt"/>
<ServerProperty Key="maxConnectionRetries" Value="10"/>
</LoginContext>
</Repositories>
</PropertyList>
</ServiceConfiguration>
Note: If you have multiple repositories configured with CTS, you may need to add multiple
login context nodes in the <Repositories> tag in the preferences.xml file. In addition, you
must create a separate pfile folder with the mspassword.txt file for each repository. For
example, if you have two repositories, repo1 and repo2 configured with CTS, you must create
two folders—repo1 and repo2, each with a pfile folder and mspassword.txt in them.
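For example, a sketch of the <Repositories> section for two repositories, repo1 and repo2, might look like the following (the user names and paths are placeholders for your environment):
<Repositories>
  <AekFilePath>C:/realtimeclient_config/aek.key</AekFilePath>
  <LoginContext DocbaseName="repo1">
    <ServerProperty Key="domain" Value=""/>
    <ServerProperty Key="userName" Value="Administrator"/>
    <ServerProperty Key="passwordFile" Value="C:/realtimeclient_config/repo1/pfile/mspassword.txt"/>
    <ServerProperty Key="maxConnectionRetries" Value="10"/>
  </LoginContext>
  <LoginContext DocbaseName="repo2">
    <ServerProperty Key="domain" Value=""/>
    <ServerProperty Key="userName" Value="Administrator"/>
    <ServerProperty Key="passwordFile" Value="C:/realtimeclient_config/repo2/pfile/mspassword.txt"/>
    <ServerProperty Key="maxConnectionRetries" Value="10"/>
  </LoginContext>
</Repositories>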
2. Create an environment variable named CTS_CONFIG_LOC with the value as the path of root
folder that contains the preferences.xml file, for example, C:\realtimeclient_config.
10. Under Rendition configurations, select a configuration and under Properties, select the Fast
Web Compatibility checkbox and click Save.
Electronic Signatures
Electronic signatures are captured during the workflow process when approving documents by
forcing the user to enter a username and password and storing that information in the audit trail.
The C2 configuration then reads the audit trail to create the signature page that is dynamically or
statically applied to the PDF rendition.
Signature Page
Signature pages are currently handled through C2 configurations that attach the signature page to
the PDF rendition. This process uses XSL FO to produce one or more PDF pages containing a table
of signature information. C2 does not provide a mechanism for determining the page size of the
generated signature page. To make it easier for clients to configure the page size for the signature
page, parameters for the page height and width can be configured in D2-Config on the C2 Rendition
Configuration Merge section:
It is recommended to set the page size to the largest page size used if multiple page sizes are required
for a client. When choosing to physically print the document, the “fit to page” option should be
selected.
Overlays
Overlays are applied through C2 Export, C2 Print, and C2 View configurations through the layer
setting in the Stamping configuration. To ensure that the correct layer is used with regard to page
size, the overlay must contain a page for each page size and orientation. For example, to support both
portrait and landscape orientations in both A4 and US Letter page sizes, four pages are required in
the overlay:
• US Letter Portrait
• US Letter Landscape
• A4 Portrait
• A4 Landscape
Other page sizes can be supported by simply adding another page with the appropriate settings. If
the header or footer is changed in the document, such as property changes, header/footer location
changes, or changes to the size of the header or footer, the C2 overlay may also need to be altered.
Note: This only applies to overlays generated through C2. It does not apply to overlays generated by
the Controlled Print feature.
Note: Only portrait-based templates are included out of the box. If landscape pages are required,
these pages must be added to the authoring template as well as to any existing overlays so that the
properly sized or positioned overlays can be applied.
Chapter 6
Registration Forms
• Enable the overall status of a product, trial, or project to be indicated by the appropriate Managers.
• Enable an entire product, trial, or project to be locked down if necessary, in order to prevent
additional documents from being made “Effective” (perhaps temporarily).
Registration forms are set up by the high-level Management team, that is, Product Managers,
Regulatory Managers, Clinical/Non-Clinical Trial Managers, Project Managers in the Documentum
for Research and Development Quality area, and Corporate Administrators. Only these users can
create and modify the forms. In general, these forms provide an overarching governance mechanism
that can be used at the Management level, over and above the role-based access controls applied at the
individual document level. The following table shows the available registration forms, the registration
form object type, and the type of object that inherits properties from this kind of registration form.
Country Registration Form (LSTMF solution only): cd_clinical_country_info; target types: cd_clinical_tmf_doc
Site Registration Form (LSTMF solution only): cd_clinical_site_info; target types: cd_clinical_tmf_doc
Clinical Trial Registration Form: cd_clinical_trial_info; target types: cd_clinical, cd_clinical_tmf_doc
Non-Clinical Study Registration Form: cd_non_clinical_study_info; target types: cd_non_clinical
Medical Device Clinical Trial Registration Form: cd_clinical_trial_info; target types: cd_clinical
Medical Device Registration Form: cd_product_info; target types: cd_clinical, cd_non_clinical, cd_reg_admin
Medical Device Regulatory-Admin Registration Form: cd_reg_admin_info; target types: cd_reg_admin
Medical Device Submission Registration Form: cd_reg_submission_info; target types: cd_reg_admin
Product Registration Forms: cd_product_info; target types: cd_clinical_tmf_doc, cd_clinical, cd_non_clinical, cd_quality, cd_reg_admin
Project Registration Forms: cd_quality_project_info; target types: cd_quality
Regulatory Application Registration Forms: cd_reg_admin_info; target types: cd_reg_admin
Regulatory Activity Package Registration Form: cd_reg_admin_activity_info; target types: cd_reg_admin
Submission Registration Form: cd_reg_submission_info; target types: cd_reg_admin
In most cases, each registration form object type extends the relevant target object type (that is, the
document object type to which the form pertains) for the following reasons:
• Optimizes query performance since there are usually fewer instances of registration forms than
documents, so the queries used to populate the product, trial, or project codes work efficiently.
• Facilitates D2 configuration, enabling D2 contexts to be used to restrict certain configuration
elements to registration forms (for example, attachment of property screens, lifecycles, and
security models).
• Provides the ability to add attributes specific to the registration form but not the target documents.
It is possible to extend the configuration to support other types of registration forms if necessary.
6. Create a Default Values template to be used during creation. Ensure that all of the mandatory
default properties described in the Default Values Template, page 95 are defined.
7. Create a lifecycle configuration for the new registration form type if desired. You can also use
an existing registration form lifecycle if it applies.
8. Create a creation profile that enables creation of the new registration form type. Alternatively,
add the new registration form to an existing creation profile. Customizing Creation Profiles,
page 97 provides more information.
Chapter 7
Attribute Inheritance and Cascading Attribute Rules
This section explains how attribute inheritance and cascading rules work in the Life Sciences solution.
Table 26. Comparisons between D2 inheritance and Auto Inherited Attribute Rules
Enabled: Check if this auto-inheritance configuration is enabled. If the selection is cleared, it is ignored at runtime.

Automatically applies to new objects: Check if this auto-inheritance configuration should be applied automatically whenever objects of the specified types are created. If the selection is cleared, the configuration is only applied if it is explicitly called from a lifecycle action.

Order of precedence: Specifies the order of precedence for this rule in relation to other rules that may be applicable. Rules are applied in increasing order of precedence; that is, a precedence value of 0 denotes the highest precedence, 1 is the next highest, and so on. When two or more applicable rules have the same level of precedence, they are applied in alphabetical order of object_name.

Inherit from: Specifies the source object(s) that are used to provide the values to be inherited. The object(s) are specified in terms of a DQL qualifier. This can refer to attribute values of the target object using the $value(...) notation within the qualifier, as in standard D2 configurations. For example:
cd_product_info where product_code = '$value(product_code)'

Inherit to: Specifies the target object(s) on which the inheritance rules are to be applied. The default setting is SELECTED. This enables attribute inheritance to be applied to one or more objects related to the selected object (the object specified by the -id argument), as opposed to the selected object itself, such as its parent folders. For example, for Documentum for eTMF, to propagate attributes from the currently selected object to its parent folders along the folder path towards the cabinet level, set the source to SELECTED and the target to ALL_FOLDERS for the tmf_folder object type.

For checked-out objects: Specify how currently checked-out items should be handled. If you use the Bypass lock temporarily option, the existing check-out locks are reinstated after the update, enabling users to continue to work on checked-out documents.

Attributes: Specifies the list of attributes to be inherited from the source object(s) to the target object. Each entry is of the form [<update-mode> ]<targetAttribute> [= <sourceAttribute> | :<literal-value>], where <update-mode> is an optional update mode.

Update Folders: Specifies a set of folders that should be moved or renamed after updating the attributes of the target object, in the form <source-folder-path>|<DQL-qualifier> => <new-folder-path>|<new-folder-name>. Attribute expression functions prefixed by the "$" symbol can be used to refer to attributes of the context object (the first source object, by default). For example,
/Clinical/$arg(old_value) => $arg(new_value)
renames a product-level folder in the Clinical cabinet.

Delete Empty Folders: Select this option if you want to delete empty folders after applying D2 auto-linking or deleting objects, that is, where a document is moved from its current folder to some other location and there are no other documents in the original folder.

Delete Target Objects Where: Use this setting to selectively remove redundant objects identified in terms of a DQL qualifier or a Boolean attribute expression (that is, an attribute expression returning a "true" or "false" value). For example, either the DQL qualifier r_object_type = 'cd_product_info' or the attribute expression @eq(@value(r_object_type),cd_product_info) can be used to delete Product Registration Forms.

Reapply D2 Auto-Naming Where: Specifies a subset of the modified target objects that should be auto-named after updating them, in terms of either a DQL qualifier or a rule expression (that is, an expression that evaluates to either true or false, as appropriate). As for Update Folders, "$" or "@" function prefixes can be used within the rule expression to refer to attributes of the context object or modified target object, respectively. For example, @endswith(@value(r_object_type),_info) causes D2 auto-naming to apply to all modified target objects with an object type ending in "_info"; that is, all registration form object types, but not others.

Reapply D2 Security Where: Specifies a subset of the modified target objects that should have D2 security configurations applied after updating them.

Apply Custom Processing Where: Specifies a subset of the modified target objects that should have custom processing applied after updating them.

Class Path of Custom Plug-in: The post-processing plug-in class to invoke for each qualifying object, if any. Specify the fully qualified class path, for example, com.documentum.cdf.plugins.MyPluginClass.

Additional plug-in arguments: Specifies additional arguments for the custom plug-in.

Apply Auditing To: Enables auditing to be applied to a subset of the impacted objects based on a DQL qualifier; for example, specify r_object_type = 'target_type' in the Apply Auditing To field to audit changes on objects of a specific type. Specify "*" to audit changes on all impacted objects. Note that this can incur a lot of extra processing if many objects are affected.

Audited Event Name: Specifies the event name to be recorded in the audit trail for each impacted item. If null or undefined, auditing is disabled.

Audited Event Arguments: Up to five arguments can be recorded in the audit trail for each modified object. The argument values can contain attribute expressions, prefixed either by "$" symbols to refer to attributes of the source object, or by "@" symbols to refer to attributes of the target (impacted) object. For example, @value(object_name) records the name of the source object (that which caused the object to be updated) in the audit trail in each case.
For example, there may be cases where you want the related documents to take on the new attribute
value from the registration form, such as the addition of a generic name to a product registration.
There may be other cases where you do not want the existing documents to take on the new value of
the attribute, but only new documents, such as the changing of a Principal Investigator at a trial site.
And there may still be other cases where you not only want the attribute to be saved on the existing
documents, but you would like to have business logic invoked to properly represent the change to
the registration form.
See Appendix C, Visual Representation of Attribute Cascading in Life Sciences for a visual
representation of how the attributes are cascaded from the registration forms to the documents.
This section defines how to configure the Cascading Attribute rules. By default, the Life Sciences
solutions come with preconfigured cascading attribute rules. Preconfigured Cascading Attributes
Rules, page 140 provides more information.
To configure the Cascading Attributes rules:
If you modify the cascading attribute rule settings (the cd_auto_inherit_config objects in the
/System/CDF/Auto Attribute Inheritance Config folder), restart the Documentum Java Method
Server service on each Documentum Server node in order to pick up the changes.
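To list the rules that are currently defined, you can run a DQL query such as the following sketch:
select object_name, r_object_id from cd_auto_inherit_config
where folder('/System/CDF/Auto Attribute Inheritance Config') order by object_name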
Automatically applies to new objects: Select whether the system applies the rule automatically when users create the specified object type. Clear this option if the rule applies explicitly as part of a lifecycle transition action, for example, in the Change Product Code function.

Order of precedence: Select the order of precedence for this rule when there are other applicable rules. The system applies rules with the highest order of precedence first. If two or more applicable rules have the same order of precedence, the system applies the rules in alphabetical order by object_name.

Applies to object types: Select the object type or types that apply to this rule. For example, Change Product Code applies to the cd_product_info object type.

Condition qualifier (optional): Type an optional DQL qualifier to satisfy before applying the rule to the object type. For example, the condition product_code is not null ensures that the rule only applies to objects with defined product_code values.
6. On the Inherit from tab, in the Inherit from field, type the source objects that provide the
inherited values:
• SELECTED: Inherit values from the currently selected object.
• PRIMARY_PARENT_FOLDER: Inherit values from the immediate primary parent folder of
the selected object.
• PARENT_FOLDERS: Inherit values from each immediate parent folder of the selected object,
starting with the primary folder, which is the first folder in the i_folder_id property.
• ALL_FOLDERS [<DQL-qualifier>]: Inherits values from all direct and indirect parent
folders, from the first parent folder to the cabinet level, that match the specified conditions,
if any. For example, ALL_FOLDERS tmf_folder where product_code != ' '
inherits values from all direct and indirect parent folders of type tmf_folder with non-blank
product_code values.
• <DQL-qualifier>: Inherits values from the specified objects in terms of a DQL qualifier. You
can use $-expressions within the DQL qualifier to refer to attribute values of the selected
object, if necessary. For example, the Inherit Clinical Trial Registration Form Info rule
uses the DQL qualifier: cd_clinical_trial_info where clinical_trial_id =
$quotedvalue(clinical_trial_id)
If you leave this field blank, the system uses SELECTED as the default.
7. On the Inherit to tab, configure target object inheritance as described in the following table:
Inherit to: Type the target objects that inherit the relevant attribute values.
8. On the Attributes tab, configure attribute inheritance as described in the following table:
Attributes: Specify the list of attributes to inherit from the source to the target objects. In each case, specify an entry of the form described for the Attributes option earlier in this chapter. The Attributes field of the Inherit Clinical Trial Registration Form Info automatic inheritance configuration object provides an example of such entries.
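As a purely hypothetical illustration of the entry syntax (the attribute names below are assumptions, not necessarily the shipped rule), such entries might look like:
replace clinical_trial_id = clinical_trial_id
replace clinical_trial_name = clinical_trial_name
replace product_code = product_code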
9. On the Folder Updates tab, specify the list of folders to move or rename.
To move a folder and its contents to another location, specify <current-folder-path> =>
<new-folder-path>
To rename a folder, specify <current-folder-path> => <new-folder-name>.
In both cases, you can use $-expressions to refer to attributes of the source object, and you can
use @-expressions to refer to attributes of the target folder. You can also use a DQL qualifier to
specify the source folder if necessary.
The following example shows the Folder updates field for the Change Product Code automatic
inheritance configuration object:
In the example, $arg(old_value) and $arg(new_value) refer to the old and new product
code values, respectively.
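A minimal sketch of such an entry, consistent with the form described for the Update Folders option (and assuming product-level folders directly under the Clinical cabinet), is:
/Clinical/$arg(old_value) => $arg(new_value)
This renames the folder named after the old product code so that it reflects the new product code value.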
10. On the Deletion tab, configure how to delete objects as described in the following table:
Delete empty folders: Select whether to delete the empty folders that remain after moving documents and folders to new locations.

Delete target objects where: Type a DQL qualifier or a Boolean attribute expression to remove redundant objects.
11. On the Post-processing tab, you can selectively apply Documentum D2 autonaming, autolinking,
security, and custom post-processing to the modified objects. Type a DQL qualifier, Boolean
attribute expression, or the wildcard symbol * in the applicable fields.
For example, in the Change Product Code rule, the system applies Documentum D2 autonaming
to the relevant Product Registration Form and its associated Clinical Trial Registration
Forms using the following DQL qualifier: r_object_type in ('cd_product_info',
'cd_clinical_trial_info')
You can apply Documentum D2 autonaming, autolinking, and security selectively to the affected
objects using this technique. The system applies the DQL qualifier to identify the target objects
before applying the inherited attributes, but applies the Documentum D2 configurations after the objects are updated. (If there are multiple target objects to update, they are processed in parallel, regardless of the Update Mode setting.)
It is also possible to apply custom processing to selected target objects using the Apply custom
post-processing where field. For example, to apply custom autonaming to the affected
objects, you can write a Java plug-in and install it in the Java Method Server lib directory. The
Documentum Controlled Document Foundation Developer API provides details.
12. On the Auditing tab, configure the following fields.
Apply Auditing To: Enables auditing to be applied to a subset of the impacted objects based on a DQL qualifier; for example, specify r_object_type = 'target_type' in the Apply Auditing To field to audit changes on objects of a specific type. Specify "*" to audit changes on all impacted objects. Note that this can incur a lot of extra processing if many objects are affected.

Audited Event Name: Specify the event name to be recorded in the audit trail for each impacted item. If null or undefined, auditing is disabled.

Audited Event Arguments: Up to five arguments can be recorded in the audit trail for each modified object. The argument values can contain attribute expressions, prefixed either by "$" symbols to refer to attributes of the source object, or by "@" symbols to refer to attributes of the target (impacted) object. For example, @value(object_name) records the name of the source object (that which caused the object to be updated) in the audit trail in each case.
potentially contain commas. Unrecognized functions are not processed, and are passed into the
resulting string unchanged.
Method arguments containing attribute expressions can be specified in D2 lifecycle configurations.
However, D2 evaluates them where functions are recognized (for example, $value(...) and
$dqlvalue(...) expressions) prior to passing the argument values to the method. To prevent
D2 from doing this, the alternative function prefix symbol “@” is supported – for example,
"@dqlvalue(select ... where ...)”. These expressions are not recognized by D2 and are
passed to the method unresolved, so they can be evaluated by the method itself.
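For example, the following two argument forms behave differently (the -subject argument name is purely illustrative, not a documented parameter):
-subject "$value(object_name)"
-subject "@value(object_name)"
D2 resolves the first form before the method is called, whereas the second form is passed to the method unchanged and is evaluated by the method itself.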
The Documentum Controlled Document Foundation Developer API Javadocs provides more details.
The following list describes each method, whether the context user parameter is applicable, how the method is set up in the default lifecycle configurations, and related comments.

CDFAppendRepeatingAttributesMethod (context user parameter: Yes)
D2 Server method that appends values from one set of attributes to another, ensuring that the data is maintained in a table format (no missing cells). The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFApplyAttributeInheritanceAsynchMethod and CDFApplyAttributeInheritanceMethod (context user parameter: Yes)
D2 Server method that copies or merges attributes from arbitrary source objects to arbitrary target objects associated with the currently selected object (or the selected object itself), according to preconfigured attribute inheritance rules defined in the cd_auto_inherit_config configuration objects in the Documentum repository. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFApplyD2ConfigurationsMethod (context user parameter: No)
Not applicable.

CDFApplyDynamicSecurityMethod (context user parameter: No)
D2 Server method that applies dynamic security to documents based on the security model settings set on the document.

CDFAuditMethod (context user parameter: Yes)
D2 Server method that logs a specified event in the Documentum audit trail. The context user is mandatory. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFCloseChangeRequestMethod (context user parameter: Yes)
The method is invoked by a user action or a lifecycle batch to close the CIP Change Request based on the status of its associated documents. Currently, there is no support for the context user in the method; therefore, when the Change Request is closed by a Coordinator, the audit shows Administrator instead of the user name.

CDFCopyRelatedObjectAttrsMethod (context user parameter: Yes)
D2 Server method that copies specific attributes to or from other repository objects related to the selected document through dm_relations, for example, to synchronize attributes from a document to a Change Request or from a Change Request to a document. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

CDFCopyRelationMethod (context user parameter: Yes)
D2 Server method that copies relationships between the selected document and a parent document, for example, to repurpose a D2_COPY_OF relation, which is created by D2 when a document is created with another document selected. Not used in any lifecycle action in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session; the context user is not used in the logic.

CDFCreateObjectAsynchMethod and CDFCreateObjectMethod (context user parameter: Yes)
D2 Server method that creates a new object of a specified type, optionally inheriting content from a specified template document and attributes from the currently selected object, establishing a relation to that object, encapsulating that object as a child of a new parent virtual document, and adding the new object to existing virtual documents. The asynchronous method is not used in any lifecycle action in the default configurations; the synchronous method does not pass the context user in any lifecycle method call in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

CDFCreateRelationMethod (context user parameter: Yes)
D2 Server method that establishes relationships between the selected object and a set of target objects identified by a DQL query, or an individual target object identified by object type and object name. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFD2ContextOrderExportMethod (context user parameter: No)
Not applicable.

CDFD2ContextReorderMethod (context user parameter: No)
Not applicable.

CDFDeleteObjectAsynchMethod and CDFDeleteObjectMethod (context user parameter: Yes)
D2 Server method that deletes a specified object, or a set of objects associated with it (for example, related objects). The asynchronous method does not pass the context user in any lifecycle method call in the default configurations; the synchronous method passes the context user in all lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFDeleteRelationMethod (context user parameter: Yes)
D2 Server method that removes relationships for the selected document, for example, to detach change requests from an "Effective" version of a document. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFGroupGeneratorMethod (context user parameter: No)
D2 Server method that generates the security groups for a selected group generator object as per the configurations set on the group generator object.

CDFGroupMaintenanceMethod (context user parameter: No)
D2 Server method that generates the group maintenance spreadsheet for a selected group generator object according to the configurations set on the group generator object.

CDFHardDeleteMethod (context user parameter: Yes)
CDF method to perform a hard delete on documents, based on the cd_delete_config specified in the repository. Audit trail entries are captured prior to the hard delete, along with the reason code. Not used in any lifecycle action in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFInitializeArtifactMethod (context user parameter: Yes)
D2 Server method used to initialize individual documents or batches of documents correctly in the Life Sciences solutions, in an efficient and scalable manner, through a single server method call. Passes the context user in all lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to create the audit entry.

CDFLoadAttributesFromContentFileMethod (context user parameter: No)
Not applicable.

CDFNotifyUsersAsynchMethod and CDFNotifyUsersMethod (context user parameter: Yes)
Sends notification messages to users in the appropriate roles on a set of objects through the D2 InBox, optionally routing messages through email. The asynchronous method does not pass the context user in any lifecycle method call in the default configurations; the synchronous method is not used in any lifecycle action in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used in resolving attribute expressions.

CDFReAssignWorkflowPerformers (context user parameter: No)
Not applicable.

CDFSaveAttributesInContentFileMethod (context user parameter: No)
Not applicable.

CDFSetAttributeMethod (context user parameter: Yes)
D2 Server method that sets a specified attribute on an object, or on a set of objects identified by a query, optionally using D2 dictionary lookups, attribute expressions, and re-application of D2 configurations to the specified objects. This method is used to update objects with attribute values that can be computed up front, in terms of attribute expressions. To set attribute values based on DQL query results, you can use the UpdateAttrsViaQuery method as an alternative. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

CDFSetVersionNumberMethod (context user parameter: No)
Not applicable.

CDFShareRelatedContentMethod (context user parameter: Yes)
D2 Server method used to share the content of documents between domains. It is used to copy the content of the master document from TMF to the RD (Clinical) child document when the cross-domain functionality is enabled. Supports the -context_user argument; the method is executed with an Administrator session.

CDFSetWFLists (context user parameter: Yes)
The method is responsible for updating the workflow attributes of the document. It is invoked when the document is created, in the (init) state of its lifecycle. Therefore, it must run in the context user session to capture the audit correctly.

SSVIndexSubmissionFoldersMethod (context user parameter: Yes)
SSV method to index or reindex a series of submission folders in the repository, to enable them to be viewed in the Submission Viewer tool. Used in lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session.

SSVRefreshSubmissionViewsMethod (context user parameter: Yes)
After a Regulatory Application Registration Form (RARF) has been updated for Application Description, the new description is reflected in the Imported Submissions View XML files for the title of the RARF. Used in lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session. This method is also supported with an asynchronous version for parallel processing capabilities.

SSVUpdateCorrespondenceDocumentsMethod (context user parameter: Yes)
SSV Server method that updates the Internal Correspondence Documents for post-processing steps with regard to Regulatory Activity Packages, Submissions, Applications, and so on. Used in lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session.

CDFUpdateAttrsViaQueryMethod (context user parameter: Yes)
D2 Server method that runs a DQL "select" query and uses the results to populate the attributes of the currently selected object, the objects returned by the query itself, or objects returned by a separate query. This method is designed to update objects with attribute values computed through DQL queries; if the values can be computed up front, you can use the SetAttribute method as an alternative. Does not pass the context user in a few lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

CDFUpdateChangeRequestAttributesMethod (context user parameter: Yes)
The method is responsible for updating the change request attributes when the change request is checked in by adding SOP documents to it. It is invoked when the document is checked in by the user. Therefore, it must run in the context user session to capture the audit correctly.

CDFUpdateRepeatingAttribute (context user parameter: Yes)
D2 Server method that updates a repeating attribute in place, adding, updating, or deleting a specific value as indicated. The default lifecycle configurations do not pass the context user in any lifecycle method call. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

CDFVirtualDocumentMethod (context user parameter: Yes)
D2 Server method that enables virtual document structures to be processed in various ways, for example, component documents can be added or removed. Passes the context user in all lifecycle method calls in the default configurations. Supports the -context_user argument; the method runs in a user session if the context user is passed, and the context user is used to create the audit entry.

TMFBulkUploadMethod
Not used anymore.

TMFCreateLinks (context user parameter: No)
Not applicable.

TMFImportExportPackageAsynchMethod and TMFImportExportPackageMethod (context user parameter: Yes)
Provide methods for initializing, zipping, and unzipping TMF document packages containing TMF document content and metadata, suitable for exporting, offline editing, and re-importing. Pass the context user in all lifecycle method calls in the default configurations. Support the -context_user argument; the method runs in a user session if the context user is passed, and the context user is used to create the audit entry.

TMFNotifyDocumentsNeedIndexing (context user parameter: No)
Not applicable.

TMFReconcileArtifactsAsynchMethod and TMFReconcileArtifactsMethod (context user parameter: Yes)
D2 Server method that reconciles a set of TMF documents against a file plan, as defined in a Clinical Trial Registration Form (CTRF) in Documentum. As a result of this process, any missing TMF placeholders for required, recommended, or optional artifacts specified in the file plan are generated as required, and the existing placeholders with matching documents are updated or removed, depending on whether they refer to repeatable or non-repeatable artifacts, respectively. Redundant placeholders referring to unplanned artifacts (for example, those that have been removed from the file plan) are also deleted. The current progress statistics for the CTRF are then updated for each planned stage, plus the trial overall; the validation status and current progress of each planned artifact is also recorded in the file plan, if file plan validation or progress tracking is enabled. The asynchronous method does not pass the context user in a few lifecycle method calls in the default configurations; the synchronous method passes the context user in all lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.

TMFUpdateContributorGroupsMethod (context user parameter: No)
Not applicable.

TMFUpdateFolderAccessForDocumentMethod
Not used in TMF methods.

TMFUpdatePlaceholderForDocumentMethod (context user parameter: Yes)
D2 Server method that copies, updates, or deletes the TMF artifact placeholder for a document that has been uploaded by the user, either as a new version of the placeholder itself or as a stand-alone document. Passes the context user in a few lifecycle method calls in the default configurations. Supports the -context_user argument; the method is executed with an Administrator session, and the context user is used to update the last modifier.
The system sets a new attribute, attribute_list, that contains a pipe-delimited list of attributes that are appropriate for the selected document. This attribute is also set by the CDFSetAttributeMethod, using the Attributes alias of the TMF Unique Artifact Names dictionary to set the attribute_list during certain lifecycle transitions, with the artifact_name attribute as a key. During indexing, the system obtains the attribute_list value in real time from the dictionary, using the artifact_name of the placeholder selected during indexing.
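For instance, a purely hypothetical attribute_list value (the actual attributes depend on the dictionary entry for the artifact_name) might look like:
product_code|clinical_trial_id|site_name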
Each attribute that is part of this attribute_list is placed on the property page(s) with a visibility
condition resulting in the attribute being displayed only if the attribute appears in the attribute
list defined for that artifact.
Note: Attributes should not be editable when the document is in the Effective (Approved or Final)
state as this can cause the object name to change. If there are temporal attributes that are always
editable, care should be taken that these are not part of the naming convention because after a
document is approved, its object name should not change.
In the case of Bulk Import/Export (BIE), the functionality has been updated to incorporate the
RequiredAttributes aliases for an artifact and use them accordingly in the autonaming process. The
BIE spreadsheet has been updated with 41 additional columns and the schema has been updated
accordingly to store the values for required attributes for the artifacts during BIE. When creating the
BIE spreadsheet, the user has to manually add the required attributes to each of the artifacts listed in
the spreadsheet to ensure that the autonaming works as expected after BIE. For the list of required
attributes for a particular artifact, refer to the TMF reference model.
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select TMF Unique Artifact Names 3.0 and click Export.
5. Select Excel as the format and click OK.
6. Under Dictionaries, select the existing dictionary such as TMF Unique Artifact Names 2.0
and click Export.
7. Select Excel as the format and click OK.
8. Open both dictionaries and copy the Document Naming Convention, Attributes,
RequiredAttributes, and Instructions columns from TMF Unique Artifact Names 3.0 to TMF
Unique Artifact Names 2.0.
9. For each artifact listed in the TMF Unique Artifact Names 2.0 dictionary, update the values in
these columns as required.
10. Under Dictionaries, select TMF Unique Artifact Names 2.0 and click Import.
11. Import the updated TMF Unique Artifact Names 2.0 dictionary Excel sheet and click Save.
$arg(new_value) = new Application Number
Inherit Regulatory Application Registration Form Info To Activity Package (LSRD, LSSSV): Copies product-related attributes from the relevant Regulatory Application Registration Forms of the activity package to the Activity Package. Attribute updates: replace product_codes; replace the corresponding dosage_form, dosage_strength, and indication from cd_reg_admin_info; replace the corresponding product_generic_name, product_compound_id, product_chemical_names, and inn_names from cd_reg_admin_info; replace the corresponding product_trade_name and product_trade_country from cd_reg_admin_info.

Inherit Regulatory Application Registration Form Info To Correspondence (LSSSV): Copies Regulatory Application related attributes from the relevant Regulatory Application Registration Forms to a new Correspondence document, based on the application_description setting. Attribute updates: none.
$arg(old_country_code) = previous country code value
$arg(old_area_code) = previous area code (U.S. state code, where applicable)
$arg(new_country_code) = new country code value
$arg(new_region) = new region code value
$arg(new_area_code) = new area code (U.S. state code, where applicable)
$arg(new_site_name) = new site name / label

Rename Site-level TMF Documents (LSTMF): Same as Rename Site; it is part of the same operation. Attribute updates: same as Rename Site.

Update Clinical Trial Info (LSTMF, LSRD): Invoked through the Update Trial Info lifecycle action for Clinical Trial Registration Forms. It reapplies updated inherited clinical trial information to the relevant documents. Attribute updates: none.

Update Product Info (LSTMF, LSRD, LSSSV): Invoked through the Update Product Info lifecycle action for Product Registration Forms. It reapplies updated inherited product information to the relevant documents and associated registration forms. Attribute updates: none.
replace corresponding product_generic_name, product_compound_id, product_chemical_names, inn_names from cd_reg_admin_info
replace corresponding product_trade_name, product_trade_country from cd_reg_admin_info
Update Regulatory Activity Product Info To Correspondence (LSSSV): Copies product-related attributes from the relevant RARFs of the activity package to Correspondence documents. Attribute updates: replace product_codes; replace the corresponding application_description as trade_name_applications, product_trade_name, and product_trade_country from cd_reg_admin_info; replace the corresponding indication and application_description as indication_applications from cd_reg_admin_info.
1. To create a custom attribute inheritance rule, log in to D2 Client as a member of the Controlled
Document Administrators group (cd_admingroup) or the installation owner (for example,
dmadmin).
2. Select New > Content from the menu bar.
3. In the Creation profile field, select System Administration.
4. In the Document Type field, select Auto Inherited Attributes Rule and click Next.
5. On the Edit properties page, in the Configuration name field, type a name for the attribute
inheritance rule. For example, Reapply D2 Configurations. Type an optional description to
explain its purpose. The system automatically enables the rule by default.
6. On the Rule applicability tab, configure the rule as described in the following table:
Automatically applies to new objects: Do not select this option. You should only enable it for rules that the system automatically applies when users create documents. For this rule, you invoke it only when necessary using a DQL query.

Order of precedence: Leave this option set to 1 - High. You can ignore this option because it only applies when multiple rules apply to the same object, to ensure that the rules apply in the appropriate order.

Applies to object types: Select the wildcard symbol * (any object type). This field defines the scope of the rule by identifying the documents in the repository that apply to the rule. When you execute the rule, you specify the cabinet and folder path of the top-level folder containing the relevant documents explicitly in a -folder_path method argument. It is not necessary to specify an -id argument (a context object) in this field.
7. Skip the Inherit from tab. This tab specifies the source objects used for copying inherited
attributes and this rule does not apply inherited attributes.
8. On the Inherit to tab, specify the target objects to process as described in the following table:
Inherit to: Type a DQL qualifier (of the form <object-type>[(all)] where <criteria>) that identifies the target objects to update.
9. Skip the Attributes and Folder updates tabs. They do not apply to this rule.
10. On the Deletion tab, select Delete empty folders. When documents auto-file to new locations,
the system deletes any remaining empty folders automatically, including any empty parent
folders along the chain towards the cabinet level.
11. On the Post-processing tab, type the wildcard symbol * in the following Reapply D2
configurations fields:
• Reapply D2 autonaming where
• Reapply D2 autolinking where
• Reapply D2 security where
Leave the Custom processing fields blank.
12. On the Auditing tab, type or select values in the fields.
13. Click Next. The system creates the rule in the /System/CDF/Auto Attribute Inheritance Config
folder with the other rules.
14. To execute the rule, log in to Documentum Administrator or IDQL as the repository installation
owner (for example, dmadmin) and issue a DQL statement of the following form:
execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name <repository-name> -user_name dmadmin
-password "" -auto_inherit_config "Reapply D2 Configurations"
-folder_path "<repository-folder-path>"'
You do not need a password if you execute the query from the Documentum Server as the
installation owner.
For example:
execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name documentum -user_name dmadmin -password
"" -auto_inherit_config "Reapply D2 Configurations" -folder_path
"/Clinical/Cardiology-Vascular/Vasonin"'
The method should return a success code when it is completed. The Java Method Server
Applications log file provides additional details. For example:
%DOCUMENTUM%\jboss7.1.1\server\DctmServer_MethodServer\logs\ServerApps
.log
You can tune the performance of the method by adjusting the max_threads and content_servers
settings in the Documentum D2 System Parameters dictionary.
The automatic_events setting has been added to the main Rule Applicability tab, after the
Automatically applies to new objects setting. This setting becomes visible only if the Automatically
applies to new objects option is selected. A D2 dictionary, Auto Inheritance Event Codes, has been
defined for this, with the values “Create” and “Update”.
In the CDFApplyAttributeInheritance server method:
• An optional -event argument enables the running of automatic rules based on specific events. The
default value for this is “Create”. This is to simplify the lifecycle configuration. It enables lifecycle
transitions to invoke the method for either “Create” or “Update” events, as appropriate, without
having to specify the rule names explicitly. If the rules to apply are not specified explicitly in the
–auto_inherit_config argument, it identifies all rules applicable to the selected object type that
are enabled, designated as automatic, and have a matching event code in the automatic_events
list. For backwards-compatibility, automatic rules without automatic_events defined for them
are considered as “Create” events.
• The auditing precondition rules are applied to each modified target object, and a post-processing task is generated for each qualifying object. These can then be submitted for processing in a parallel or distributed manner.
• The existing task processing method for each object has been extended to generate an audit
trail entry for each modified object.
Chapter 8
Workflows
Workflows comprise tasks that provide business logic to the lifecycle phases and pass content from
one state to another.
Workflow Roles
The Life Sciences solution provides a number of predefined, generic workflow templates and
associated D2 workflow configurations to process controlled documents. The number of states and
workflow task performers depends on each workflow.
Note: Workflow availability is determined by the type of document.
The typical workflow initiator for a workflow is a Document Author or Coordinator. A workflow
initiator is also the workflow supervisor. When an initiator starts a workflow, a dialog box appears
where the workflow task performers must be specified. It is not possible to start a workflow without
specifying the performers.
The workflow task performers must be of the following roles:
• Document Author
• Reviewer
• Format Reviewer
• Approver
• QO Approver (LSQM only)
• Coordinator
• TBR Reader (LSQM only)
Workflow Diagrams
The Life Sciences solutions provide generic workflows to process Control Category 1-3 documents.
Workflows are composed of tasks that provide business logic to the lifecycle phases and pass content
from one state to another.
The system automatically performs the tasks indicated by the icon. Users with specific roles
perform the tasks with the icon. Users perform the tasks described in the following table:
Reviewers have the option to either accept the task using the Reviewed and
No Revision Required option or reject the task using the Reviewed and
Revision Requested option. If any of the Reviewers reject the document,
the system creates a new minor version of the document and a task for the
Author to acknowledge.
Acknowledge: The Author accepts the document rejected by the Reviewers and acknowledges that revisions need to be made to the document, which is in the Draft state.
Recall Document
There are two recall workflows that trigger when a recall operation is initiated through the
Controlled Print widget, based on whether the recipient of the controlled copy of the selected
document is internal or external. These workflows are configured for Documentum for Quality and
Manufacturing only.
• Internal Recipient Recall Workflow
If the controlled copy being recalled was sent to an internal recipient, then this workflow is
initiated. In this workflow, the internal recipient is required to sign-off on the recall operation
by providing an electronic signature. On completion, the Requestor Acknowledgment task is
sent to all members of the cd_controlled_print group. After any member of cd_controlled_print
acknowledges the recall task, the workflow ends and the controlled copy is considered recalled.
Configuring Workflows
Each workflow has a workflow template and a corresponding configuration in D2-Config. To
configure a workflow, follow these steps:
1. Log in to D2-Config.
2. Select Go to > Workflow.
3. Select the workflow you want to configure.
4. In the Task configuration section, you can update the following fields:
Subject: Subject of the task that appears in the inbox of the performer.
Message: Detailed message that explains the task.
Manual acquisition: Acquire the task with or without a click action from the user.
Electronic signature: Include an electronic signature with the message.
Audit: Enable or disable auditing.
Task follow up: Task duration can be defined on this tab. Reminder notifications can be configured by specifying the number of days before which the reminder has to be sent and to whom.
In the figure, the highlighted header row indicates the various workflow dictionaries for each Control
Category. For an artifact, a dictionary value for the workflow can either be N (No) or Y (Yes).
For example, for the Debarment Certification artifact, the Collaboration workflow for Category 1
documents is set to N, indicating that the workflow is not enabled. If the value is set to Y, when you
right-click the document in the D2 Client, the Collaboration workflow option appears in the menu.
Before assigning workflows, start with the Object Type to Taxonomy mapping dictionary. For each document object type, this dictionary holds the taxonomy or dictionary that is used to configure the workflow mappings. The is dictionary alias specifies whether the value given in the value alias is the name of a dictionary or of a taxonomy. Check the value specified for the object type and its dictionary alias setting, and then check that dictionary or taxonomy for the workflow mappings.
To assign workflows:
1. Log in to D2 Config as an administrator.
2. On the main toolbar, select All elements in the filter list.
3. Click Data > Taxonomy.
4. Under Taxonomies, select a Classification by Group taxonomy.
5. In the dialog box, select Modify and save a copy of the sheet on your machine.
6. In the taxonomy Excel sheet, enable or disable the workflow dictionaries for the required artifacts
by setting the values as N or Y.
7. Import the updated sheet.
8. Click Save.
In addition to the task performers stored in the document attributes (Author, Reviewer, and so on), another set of attributes is provided that is unique to workflows. These attributes have a "wf" prefix, for example, wf_authors, wf_reviewers, and wf_approvers, and are referred to as the actual workflow performers. The non-workflow performer attributes do not include the "wf" prefix.
When a workflow is started, the non-workflow attribute values are copied to the corresponding actual
workflow attributes. Thereafter, these new attributes take part in the workflow. Any change to the
actual workflow attributes does not affect the corresponding non-workflow attributes. The actual
workflow performers have the necessary permissions on the document only when the document is in
the state where the performer can perform the task.
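To check which users were copied into the actual workflow performer attributes for a particular document, you can run a DQL sketch such as the following (assuming the document is a cd_controlled_doc object; substitute the object ID):
select wf_authors, wf_reviewers, wf_approvers
from cd_controlled_doc where r_object_id = '<r_object_id-of-document>'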
You can use the Update performers option to update the actual workflow attributes. The modified
values are not stored in the non-workflow attributes and cannot be viewed in the document
properties. Instead, they are stored only as part of the workflow attributes for the workflow.
To configure the workflow completion notification:
1. Log in to D2-Config.
2. Select Go to > Mailing list.
3. Under Mailing lists, select Workflow Completion.
4. In the Recipients field, you can add or remove the user groups that should receive the notification.
5. In the Email subject en and Email message en fields, you can modify the notification message.
6. Click Save.
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Workflow.
To hide the Self-Approve menu option in the CDF Contributor menu:
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Menu D2.
4. Under Menus, select CDF Contributor Menu.
5. Under Contextual menus, click <Right click> and select Self-Approve.
6. Under Menu properties, select Hide this entry.
7. Click Save.
Chapter 9
Lifecycles
This section describes the lifecycle state of a document in the Life Sciences solution.
Document Lifecycle
A document created within the Life Sciences solution can be in one of a set of predefined states at any given point in time. These states, as a whole, define the lifecycle of the document. Lifecycles are an important part of this solution because security and workflows are directly linked to them. Standard D2 lifecycle configurations are provided as part of the CDF layer to support each of the four security categories (Category 1-4). Additionally, lifecycles are defined for Change Requests, Product and Project registration forms, and so on.
The lifecycle models for Category 1-3 controlled documents are designed to support the authoring, review, approval, and controlled release of documents, as well as post-release management operations such as suspension, expiry, superseding, and withdrawal of previously released versions. They are closely related to the corresponding review/approval workflows. The following table shows the document lifecycle states associated with the document control categories:
The following figure illustrates the lifecycle state transitions in the Control Category 0 Change
Request Lifecycle Model:
The following figure illustrates the lifecycle state transitions in the Control Category 1 Documents
Lifecycle Model:
The following figure illustrates the lifecycle state transitions in the Control Category 2 Documents
Lifecycle Model:
The following figure illustrates the lifecycle state transitions in the Control Category 3 Documents
Lifecycle Model:
The transition to any state can be restricted with the help of different checks. Uniqueness checks are conditions that can be applied to a document, such as "validate document is not checked out". Any number of uniqueness checks can be created, and restrictions can be applied based on any of them.
There are two types of checks available:
• Entry Condition: A generic check that must be met to enter a state. To enter the target state, the checks specified in the "Entry conditions" section must be met; the source state can be any other state.
• Transition Condition: A specific check that applies from a fixed source state to a fixed target state. This option is at the bottom of the lifecycle configuration page. First select the target state in the "Next State" section, and then select the transition condition from the drop-down list.
Chapter 10
Security
This section describes the security model used in the Life Sciences solution.
Permissions
The Life Sciences solutions configure the permissions of documents based on the user role and the
document category. Permissions are defined for each item in the repository. Permissions identify the
security level needed for a group or user to access the item and their allowed actions.
The permissions for Control Categories 1-3 documents are listed in the following table for the
recipients, readers, and auditors roles:
The permissions for Control Categories 2-3 documents are listed in the following table for the
Readers, Auditors, and Regulatory Affairs roles:
The permissions for Control Category 4 documents are listed in the following table:
The permissions for Registration Forms are listed in the following table:
These permissions apply to the access control groups on all registration forms. Form managers define
the access control groups on the Access Control tab of the Registration Form properties.
To manage the registration form, the manager must be listed on the Managers list. For example, in the
Product Registration Form, the Product Managers product_mgr1, product_mgr2, and product_mgr3
have the DELETE permission.
To access the registration form, the user group must be on the Primary User Groups list. The
cd_clinical, cd_non_clinical, and cd_quality groups have the RELATE permission.
Users or groups not listed on the Access Control tab of the Registration Form have the NONE
permission.
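To verify which users are currently members of a group referenced on the Access Control tab, an administrator can query the group object directly. The following DQL is a sketch; cd_product_managers is used here only as an example group name:

SELECT group_name, users_names
FROM dm_group
WHERE group_name = 'cd_product_managers'
ENABLE (ROW_BASED)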
You can reconfigure the security categories for the various artifacts by changing the Default values
template settings in the creation matrix. However, you must ensure that the corresponding lifecycle
model is assigned in the Lifecycle column in each case.
After the correct category value is assigned, the corresponding lifecycle and security model can be
applied through the configuration matrix as shown in the following figures:
Note that a separate lifecycle configuration is defined for each of the four control categories, but only two control configurations are defined: one for Category 1-3 controlled documents and a separate one for Category 4 documents. This simplifies the reconfiguration process
and maintains a consistent security model, so that access to documents in specific lifecycle states is
defined uniformly. For example, Authors always have DELETE permits on “Draft” documents,
irrespective of the security category. However, these documents may have different lifecycles and
workflows associated with them, depending on the security category in each case.
without incurring performance penalties. This also makes the system easier to maintain as changes to
role group members immediately affect all documents that use them. If a very large number of role
groups are defined, this can also lead to performance issues, particularly for users belonging to many
groups because of the way in which the Documentum Server filters out objects that the user should
not see while navigating and searching the repository. (As a general rule, if a user belongs to no more
than 20-30 groups this does not cause any significant performance problems, but if they belong to 200
groups or more the performance degradation can be significant.)
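As a rough check of how many groups a particular user belongs to (directly or through nested groups), an administrator might run a query similar to the following sketch, where jsmith is a placeholder user name:

SELECT count(*)
FROM dm_group
WHERE ANY i_all_users_names = 'jsmith'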
Folder Security
Cabinets and folders may be created automatically by D2 as a result of auto-filing/auto-linking
rules. Access to these folders is governed by separate security configurations that are referenced in
the auto-filing/auto-linking path configuration at each level. This enables access to be restricted to
users in the appropriate functional area master groups, and for cabinets/folders to be hidden to users
in other groups. For example, the auto-linking path for Clinical documents uses the Clinical Folder Security Model configuration, which grants DELETE access to members of the cd_clinical group and sets the default (dm_world) permit to NONE for all other users, so that only users in the cd_clinical group can see these folders (apart from members of the admingroup, who, as privileged users, can always access all folders in the repository). Note that this does not mean that cd_clinical users can delete any folder in the "Clinical" cabinet; they can only delete empty folders.
Similarly, the auto-filing path for Change Requests (CRQs) uses the Change Request Folder Security
Model configuration at each level, which grants public DELETE access to all users. This enables any
user to create a change request and access it through the Change Requests cabinet/folder structure. It
does not necessarily mean they can delete any change request folder; they can only delete empty
folders. For example, after deleting CRQs that they themselves have raised but not yet submitted, they can delete the now-empty folders that were created for them.
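To confirm which security configuration the auto-created folders actually received, the assigned ACL can be inspected with a query such as the following sketch (the /Clinical cabinet path is an example and may differ in your installation):

SELECT r_object_id, object_name, acl_name, acl_domain
FROM dm_folder
WHERE FOLDER('/Clinical', DESCEND)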
be supported are centrally-configurable, and the access levels to be granted to each role on specific
artifacts (document types) in the TMF are also centrally-configurable, according to the security
requirements of the business. In this way, the registration of external users is simplified and devolved
to local Administrators, and the predefined security model is enforced automatically by the system.
If an external user is registered for access at a higher level than for an individual site, that is at the
country or trial level, the registration is assumed to apply to all of the relevant sites for that country or
trial. In other words, their registration fans down to the site level. Likewise, if a TMF document is
created at the country, trial or product level, then it is assumed to apply to all of the relevant sites. So,
if a user has been granted access to a particular artifact at the site level, they also have the same level of access to the same artifact at the country, trial, and product levels. In other words, their access also fans up to the relevant artifacts at the higher levels.
Note that for logistical reasons it is not possible to register external users for compound or
product-level access to all sites across all studies or trials associated with a particular product using
this mechanism. Doing so would impact the system extensively whenever a product-level registration
is updated. However, it may be possible to accomplish this through other means, for example, by
managing access to individual documents manually on an ad hoc basis.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Application Managers (form_managers) | <current-user> | cd_regulatory_managers | No | Access Control
Application Users (form_users) | cd_regulatory_users | cd_ad_promo, cd_clinical, cd_corres, cd_labeling, cd_non_clinical, cd_quality, cd_regulatory, cd_safety | No | Access Control
Submission Managers (submission_managers) | <current-user> | cd_regulatory_managers | Yes | Access Control
Submission Users (submission_users) | cd_submission_users | cd_ad_promo, cd_clinical, cd_corres, cd_labeling, cd_non_clinical, cd_quality, cd_regulatory, cd_safety | Yes | Access Control
Document Coordinators (doc_coordinators) | cd_regulatory_doc_coordinators | cd_regulatory_doc_coordinators | No | Default users/Groups Control
Reviewers (reviewers) | | cd_regulatory_doc_reviewers | No | Default users/Groups Control
Approvers (approvers) | | cd_regulatory_doc_approvers, cd_regulatory_managers | No | Default users/Groups Control
Readers (readers) | cd_ad_promo_doc_readers, cd_clinical_doc_readers, cd_corres_doc_readers, cd_labeling_doc_readers, cd_non_clinical_doc_readers, cd_quality_doc_readers, cd_regulatory_doc_readers, cd_safety_doc_readers | cd_ad_promo_doc_readers, cd_clinical_doc_readers, cd_corres_doc_readers, cd_labeling_doc_readers, cd_non_clinical_doc_readers, cd_quality_doc_readers, cd_regulatory_doc_readers, cd_safety_doc_readers | No | Default users/Groups Control
The security changes in the Regulatory Application Registration Form cascade to new Submission Registration Forms only.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Submission Managers (form_managers) | <Derived from Regulatory Application Registration Form (submission_managers)> | cd_regulatory_managers | Yes | Access Control
Submission Users (form_users) | <Derived from Regulatory Application Registration Form (submission_users)> | cd_regulatory, cd_clinical, cd_non_clinical, cd_quality, cd_safety, cd_labeling, cd_corres, cd_ad_promo | Yes | Access Control
The security changes in the Submission Registration Form cascade to the submission folder, subfolders, and documents.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Regulatory Activity Managers (form_managers) | <current-user> | cd_regulatory_managers | Yes | Access Control
Regulatory Activity Users (form_users) | cd_regulatory_users | cd_regulatory_users | Yes | Access Control
Submission Managers (submission_managers) | <current-user>, cd_regulatory_managers | cd_regulatory_managers | Yes | Access Control
Submission Users (submission_users) | cd_submission_users | cd_submission_users | Yes | Access Control
The security changes in the Regulatory Activity Package cascade to the Regulatory Application Registration Form.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Folder Managers (folder_managers) | <Derived from Submission Registration Form (submission_managers)> | cd_regulatory_managers | No | Access Control (non-editable)*
Folder Users (folder_readers) | <Derived from Submission Registration Form (submission_users)> | cd_ad_promo, cd_clinical, cd_corres, cd_labeling, cd_non_clinical, cd_quality, cd_regulatory, cd_safety | No | Access Control (non-editable)*
* For a submission folder and its children, security can only be adjusted through the Submission Registration Form; therefore, it is non-editable on this Properties page.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Folder Managers (folder_managers) | <Derived from Submission Registration Form (submission_managers)> | cd_regulatory_managers | Yes | Access Control
Folder Users (folder_readers) | <Derived from Submission Registration Form (submission_users)> | cd_ad_promo, cd_clinical, cd_corres, cd_labeling, cd_non_clinical, cd_quality, cd_regulatory, cd_safety | Yes | Access Control
The security changes on the submission subfolders cascade to their subfolders and documents.
Attribute Label/Name | Default Users/Groups | Can be Set with the following Users/Groups | Used in LSSSV Security Cascade | Tab Name
Coordinators (doc_coordinators) | <Derived from Submission Registration Form (submission_managers)> | cd_regulatory_managers | Yes | Process Info
Readers (readers) | <Derived from Submission Registration Form (submission_users)> | cd_ad_promo, cd_clinical, cd_corres, cd_labeling, cd_non_clinical, cd_quality, cd_regulatory, cd_safety | Yes | Process Info
Other role-based attributes such as Authors, Reviewers, Approvers, and Auditors are not available in
the Properties page for submission documents.
To streamline security and keep it consistent across solutions, a dynamic security framework has been implemented. This framework allows multiple security models to be in place at the same time. It
also enables users to define additional models with a limited amount of customization and regression.
The framework combines all modifications to the security of a document into a single entry point,
while allowing for document-specific flexibility to be managed by an authorized set of users.
Note: The dynamic security framework is only applicable to LSTMF, LSRD, and the Correspondence
domain in LSSSV. In LSTMF, external roles such as Investigator, Inspector, External Contributor, and
External Reviewer do not use this security framework.
The following diagram illustrates the implementation of dynamic security on documents:
Tags
Tags are the lowest-level objects in the framework hierarchy. They define how group names are
constructed and how role attributes are resolved. They can be specific to a set of roles that are used in
the creation of user groups. For example, secTag_roles defines the role values such as doc_author,
doc_reviewer, doc_approver, and so on. The secTag_group_names tag defines the domain group name values such as cd_clinical, cd_regulatory, and so on.
Tags can refer to a custom list of values, attributes in the repository, or a DQL query that can generate values. This allows the system to know not only how to construct the required strings for group names, but also the appropriate value combinations.
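As a purely illustrative sketch of a value-generating query, a tag might derive the list of document domains from existing documents; the object type and attribute shown here are assumptions, and the shipped tag definitions may use different sources:

SELECT DISTINCT domain
FROM cd_common_ref_model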
Group Generators
Group generator objects enable the system to define how to create a set of user groups to be utilized
by the security models. Group generators are used to:
• Generate groups
• Set up group parents
• Identify patterns for group names used in security models
Using the generator to generate the groups within the definition of the security models ensures that the groups needed to support the model exist.
The group generator is generally called with a set of tag definitions to generate the required groups.
These tag definitions can be refined so as to limit the range of values used, for example, for the initial
creation of the groups in the system.
The default group generators are run by the system during the installation of the solution to generate
the set of user groups. You can also manually run the generator in a more limited scope to generate
additional groups as needed when the range of available values changes (for example, additional
products are created when using a product-based group model). To generate the groups, right-click
the generator and select the Generate Groups menu option.
The product groups are not generated when a new product is created; they are generated when the product is protected through the Protect lifecycle action on the product registration form. This helps avoid creating a large number of groups in the system that may not be required by the application. You can also create product groups by selecting the secGroup_ProductGroups group generator and selecting the Generate Groups lifecycle action. This creates the product groups for all the products in the system at that point in time. However, this is not a recommended approach, as it may create a large number of groups in the system that may not be required by the application.
Security Model
The security model specifies an attribute set along with a set of group generators that define how
each attribute is populated with groups based on the values of attributes identified by the tags
used in the group generator. These attributes can exist on a single common object shared by many
documents (security model template) as well as on the document. This is used in cases where the
values need to be shared across all objects that have the same set of parameters. Examples include
groups for user selection, groups for ACL security.
The attributes can also exist on the document. In this case, the security model is used where
values need to be initialized similarly across objects that have the same set of parameters but are
manipulated on a document by document basis, or use cases where offset values (that is, not stored on
the document itself) will not function within the D2 framework. For example, the default user/groups
to be set on participants within a workflow.
Security models are used to:
• Define the rules for setting role attributes on documents and registration forms.
Note: In the current system, the security model for registration forms is not supported.
• Define the model as standard or restriction-based (document specific).
• Define the groups that can change the model of the document.
• Define the groups that can change the role attributes on the document.
• Define which role attributes can be modified.
• Define how to calculate what documents should share the shared attribute object. Shared attribute
objects are contentless objects that contain the group information in their attributes and can be
referred to by many documents. This way the information need only be stored once. The link
between the document and the object is through a foreign key in the document pointing to the
shared object. This connection is calculated through a CDF function. The function is evaluated as
a string. Therefore, if you use repeating attributes and the $list function, the sequence will matter.
The model defines whether the document-specific values are allowed and who can set those
values (which are combined with the values on the shared object). If no group is assigned, then no
one has rights to change it.
• Define role attributes that are used by the model:
— Set on the document or on the shared attribute object
— The group generator is used to resolve the group name(s)
The folder security model specifies, using a tag value that is set on the folders when they are created,
the groups that are to be set in the Readers attribute of the folder. The folder security information is
stored in the keywords attribute, which in turn drives the security of the folder. The folder security
model is specified in each security model definition that is assigned to a document. Security at the
folder level is applied based on the first document that the user creates.
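To see which groups were resolved into the keywords attribute of the auto-created folders, a query along these lines can be used (a sketch; the /Clinical cabinet path is an example):

SELECT object_name, keywords
FROM dm_folder
WHERE FOLDER('/Clinical', DESCEND)
ENABLE (ROW_BASED)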
In D2-Config, the folder security model is defined in a single security template, Generic Folder
Security Model across domains. This eliminates the need to define multiple security templates
for each domain, which can be cumbersome.
To enable documents that are linked to multiple folders to identify which of the repeating values
to use, the definition also includes the attribute that drives the folder name, the level at which the
folder is created, and the transformation used to generate the folder name (if not a simple application
of the attribute value).
Shared Objects
Shared objects are used to optimize the data on the document and make it easy to maintain, when the
same data is shared across documents.
The shared objects are created automatically by the system when the user creates the first document using the shared security model, that is, when the attribute location defined in the model is set to Shared instead of Document. By default, all document data is stored in shared objects and each domain
will have one shared object. The Administrator can view all the shared objects in the system using
the Shared Objects widget.
Field | Description
Tag Definition Name | The name of the tag. This is used to reference the tag in the other configuration settings such as group generators and security models.
6. Click Next. The tag object appears in the Security/Tag Definition folder.
You can view the properties of the tag object in the Properties widget. Click Edit if you want to edit
the values in the tag object. To delete a tag object, right-click it and select Delete.
Field | Description
Name of the Generator | Name of the group generator. This is the name that will be used in security models to identify which groups should be assigned by combining the pattern in the group generator and the values of the document attribute set in the tag.
Field | Description
Pattern for group admin group | Pattern for the group that can invoke group administration from this group generator. For example, [group_name]_groupadmin. This pattern should be fully resolved if the child is fully resolved.
Hierarchy of Tag Processing | The list of tags used in the generator in the order they should be processed.
Can be used in Security Models | A flag indicating if this generator should appear in the security model pages. Clear the selection for grouping groups that are never used directly in models.
6. Click Next. The group generator object appears in the Security/Group Generator folder.
Field | Description
Name | Name of the folder security model object.
Folder Security Model | Type of security model to be applied to the folders. By default, it is either Open or Product.
Default Level | The folder-level tag to use when the tag defined on the folder cannot be found in the model, or if the tag defined on the folder is null. If a folder has a subject value (level tag) that does not match any of the existing tags, this is the tag it should be evaluated as.
Use Default for Empty | Select this option if folders with blank folder tags should use the default folder level.
Attribute for Roles | Attribute to store the resolved groups. By default, the values are stored in the keywords attribute.
Folder Level | This is the level tag that is used to match on a folder's subject attribute to determine what groups to store in the folder's role attribute.
Field | Description
Group Generator | This is the group generator to use to figure out which groups to place in the role attribute. The document that triggered the creation of the folder will be used to evaluate all $ functions.
Tag | These are the repeating attributes of the document that are used in the group generator to calculate the value that caused the creation of the folder (the value must be used to create the folder name) and figure out which groups to assign. The values can be separated by a pipe delimiter.
Tag Generator | If the autolinking configuration used a function or dictionary to create the folder name, a corresponding function must be defined here to make a match. Use ~tag to refer to the value of the attribute identified in the tags list. For example, if there was a country name using the TMF Countries lookup, the function would be $lookup(~tag.TMF Countries,en). The values can be separated by a pipe delimiter.
Tag Level | For each tag, specify a set of numbers that indicate the folder level at which the corresponding attribute is used. The values can be separated by a pipe delimiter.
6. Click Next. The folder security model object appears in the Security/Folder Model folder.
Field | Description
Security Model | This is the name that is used to refer to the model.
Type of Model | Type of object on which the model is to be applied, document or registration form.
Field | Description
Group for Document Admin | Pattern to determine the group allowed to change the role attributes on the document. For the Manual security model, this is restricted_attribute_admin.
List of Modifiable Document Attributes | List of attributes on the documents that can be edited by the document admin. By default, only the Authors attribute is editable.
Group for Model Admin | Pattern to determine the group allowed to change the model for the document through the change model menu action. Generally, the target models would be restricted models.
Field | Description
Role Attribute | Name of the attribute that will store the group values. Preferably, this should be a repeating attribute. If the same attribute appears more than once, it will be updated with a union of all the rows.
6. Click Next. The security model object appears in the Security/Model folder.
5. To edit the parent generator pattern for a group generator, right-click the generator and select
Copy Parent Generator. This clears the Use parent Generators for patterns option and copies the
parent generator patterns into the Parent Patterns field so that they can be edited. You can use
this to break the pattern link to the parent, while getting an initial setting to modify.
6. Right-click a generator and select Link to Parent Pattern to link to a parent generator. This
action ensures that the Use parent Generator for patterns option is selected and the values in
the Parent Patterns field are cleared.
Chapter 11
Workspaces and Welcome Pages
This section describes the workspaces and welcome pages used in the Life Sciences solutions.
Workspaces
A workspace is a container of widgets that allows you to personalize functionality for availability and
convenience. D2 includes preconfigured workspace templates. These templates contain the layout
and positioning of widget areas. The templates also come with a predetermined set of widgets.
Administrators can configure workspaces to contain workspace views. Views function the same
way as workspaces but provide a method for organizing widgets without losing widget-to-widget
interaction.
In the Life Sciences solution, the organization of the D2 workspace is based on user roles. This means
that all role-based view contexts, widgets, menus, and query forms are assigned to a single workspace
that includes the embedded views necessary for users in that role to perform their tasks and activities.
• Manage dictionaries
• Manage taxonomies
• Edit a document
• View a document
• Insert annotations
• Locate a document
• Manage Favorites
• Print a document
• Request a rendition
• Delete a document
• Create a relation
• Task
• Dashboard
Reviewer/Approver workspace (Welcome, Browse, and Task views): The tasks that can be performed in this workspace are similar to the Author tasks, except that Reviewers/Approvers cannot create or import documents.
Consumer (read-only) workspace (Welcome and Browse views): The following tasks can be performed in this workspace:
• Navigate the cabinets and folders
• View a document
• Import a document
• Locate a document
• Manage Favorites
• Export a document
• Print a document
• Locate a document
• Manage Favorites
• Task
• Compare
Workspace Groups
Workspace groups provide the flexibility to reassign the same workspace to multiple user roles without being bound to the primary role groups that exist in the default Life Sciences solutions. Workspace groups provide a mechanism to separate UI-based groups from security groups so that workspaces can be assigned to the UI groups rather than the security groups. The following table lists the workspace groups and the user roles that are part of each group:
cd_ad_promo
_template_authors
cd_clinical_doc
_authors
cd_clinical_template
_authors
cd_labeling_doc
_authors
cd_labeling_template
_authors
cd_md_clinical_doc
_authors
cd_md_doc_authors
cd_md_non_clinical
_doc_authors
cd_md_regulatory
_doc_authors
cd_non_clinical_doc
_authors
cd_non_clinical
_template_authors
cd_quality_doc
_authors
cd_quality_template
_authors
cd_regulatory_doc
_authors
cd_regulatory
_template_authors
cd_safety_doc_authors
cd_safety_template
_authors
cd_ad_promo_doc
cd_corres_template
_authors
cd_submission
_archivists
ws_lsssv_consumers LSSSV consumers Users of the following
workspace group groups:
cd_corres_consumers
_imp
cd_corres_doc
_auditors
cd_corres_doc_readers
cd_regulatory_activity
_monitors
cd_regulatory
_consumers_imp
ws_lsssv_coordinators LSSSV coordinators Users of the following
workspace group groups:
cd_corres_doc
_coordinators
ws_lsssv_managers LSSSV managers Users of the following
workspace group groups:
cd_corres_managers
cd_product_managers
cd_regulatory_activity
_managers
cd_regulatory
_managers
ws_lsssv_reviewer LSSSV reviewer and Users of the following
_approver approver workspace groups:
group
cd_corres_doc
_approvers
cd_corres_doc
_reviewers
cd_corres_template
_approvers
cd_gmp_auditors
cd_gmp_consumers
_imp
cd_gmp_readers
ws_lsqm_coordinators LSQM coordinators Users of the following
workspace group group:
cd_gmp_coordinators
ws_lsqm_reviewer LSQM reviewers and Users of the following
_approver approvers workspace groups:
group
cd_gmp_approvers
cd_gmp_qo_approvers
cd_gmp_reviewers
ws_lsqm_managers LSQM managers Users of the following
workspace group groups:
cd_md_managers
cd_md_regulatory
_managers
cd_md_submission
_managers
cd_clinical_doc
_authors_tmf
cd_clinical_tem
_authors_tmf
ws_lstmf_consumers LSTMF consumers Users of the following
workspace group groups:
cd_clinical_consumers
_imp
cd_clinical_doc
_auditors
cd_clinical_doc
_readers
ws_lstmf LSTMF coordinators Users of the following
_coordinators workspace group group:
cd_clinical_doc
_coordinators
ws_lstmf_contributors LSTMF contributors Users of the following
workspace group groups:
tmf_contributors
tmf_external
_contributors
ws_lstmf_managers LSTMF managers Users of the following
workspace group group:
cd_clinical_trial
_managers_tmf
cd_<domain>_ref
_copy_mgrs
ws_lstmf_product LSTMF product Users of the following
_managers managers workspace group:
group
cd_product_managers
ws_lstmf_inspectors LSTMF inspectors Users of the following
workspace group group:
tmf_inspectors
ws_lstmf LSTMF investigators Users of the following
_investigators workspace group group:
tmf_investigators
ws_lstmf_reviewer LSTMF reviewers and Users of the following
_approver approvers workspace groups:
group
Workspace Name | Welcome Screen | Browse | Task | Quality Check | My Sites | Dashboard | Administration | Concurrent View | eTMF
WS eTMF Author Workspace | Y | Y | Y | Y | N | Y | N | N | N
WS eTMF Consumer Workspace | Y | Y | Y | N | N | N | N | N | N
WS eTMF Contributor Workspace | Y | Y | Y | Y | N | N | N | N | N
WS eTMF Coordinator Workspace | N | Y | Y | Y | N | Y | Y | N | N
WS eTMF Inspection Workspace | Y | N | N | N | N | N | N | Y | Y
WS eTMF Multi-view (Coordinators) | N | Y | Y | N | N | Y | N | N | N
WS eTMF Product Managers Workspace | N | Y | Y | N | N | Y | Y | N | N
WS eTMF Reviewer-Approver Workspace | Y | Y | Y | N | N | N | N | N | N
WS Life Sciences TMF Investigator Workspace | Y | N | Y | N | Y | N | N | N | N
Workspace Name | Welcome Screen | Browse | Task | View Submission | Dashboard | Compare | Administration
WS SSV Multi-view (Authors) | Y | Y | Y | Y | Y | Y | N
WS SSV Multi-view (Consumers) | Y | Y | N | Y | N | Y | N
WS SSV Multi-view (Managers) | N | Y | Y | Y | Y | Y | Y
WS SSV Multi-view (Reviewer-Approver) | Y | Y | Y | Y | N | Y | N
Display Labels
Labels can be set to show different values for attributes than those that are stored in Documentum
Server. The Life Sciences solution utilizes this functionality to change the way the status label for
Effective documents displays for non-Good Manufacturing Practices (GMP) documents. The display
label is set to show Approved or Final instead of Effective for these documents despite the attribute
being saved in Documentum Server with an a_status of Effective.
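For example, documents whose label is displayed as Approved or Final in the client can still be found by querying the stored value. The following DQL is a sketch; the cd_common_ref_model supertype is an assumption:

SELECT r_object_id, object_name, a_status
FROM cd_common_ref_model
WHERE a_status = 'Effective'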
Welcome Pages
Welcome pages in the Life Sciences solution are .jsp files that are called through an external widget within D2. When the user first logs in to the D2 Client, the Welcome view is displayed with a single external widget that points to these .jsp files. In addition, for each Welcome page there is a utility .jsp file that contains the DQL queries and a CSS file that contains the style information. There is also an imgs folder that contains the images displayed on the pages.
Documentum for Quality and Manufacturing, Documentum for Research and Development, and Documentum Submission Store and View share three essentially identical Welcome pages, one for each of the following roles:
• Authors
• Reviewers/Approvers
• Readers
Documentum for eTMF includes Welcome pages for the following roles:
• Author
• Contributor
• Inspector
• Investigator
• Reviewers/Approvers
• Readers
Each Welcome page includes a menu bar and a quick action bar. The menu bar changes based
on each workspace. The menu bar provides an actionable button for one view available in the
workspace, which is displayed on the top left of the Welcome page. Clicking this button switches the
user to the specified view.
The menu bar also includes the following buttons with numbers next to them indicating the number
of documents the user needs to address:
• Tasks: Number of workflow tasks assigned to the user. This button appears for all solutions.
• Index: Number of documents the user has uploaded but not yet indexed. This button appears
only in Documentum for eTMF.
• My Sites: Number of key documents uploaded to a site. This button appears only in Documentum
for eTMF.
These values do not update dynamically. You must refresh the D2 Client page to update the values.
The quick action bar includes the following two buttons:
• Create: Invokes the D2 create_object method and displays the Creation Profile interface
to the user.
• Import: Invokes the D2 import_object method and displays the Import Creation Profile
interface to the user.
Chapter 12
Reports
This section provides an overview of reporting implemented in the Life Sciences solutions.
Information Hub
The OpenText Information Hub (iHub), a part of the OpenText Analytics Suite, is a scalable analytics
and data visualization platform that enables IT leaders and their teams to design, deploy, and manage
secure, interactive web applications, reports, and dashboards fed by multiple data sources. iHub
supports high volumes of users and its integration APIs enable embedded analytic content in any
app, displayed on any device.
OpenText Analytics Designer is a tool used to develop and design the reports for the Life Sciences solutions with the help of the Documentum for Life Sciences JDBC connector. Using the Analytics Designer reporting tool, you can also publish reports directly to iHub. The JDBC connector provides a connection interface between the Documentum platform and the Analytics Designer and iHub. The JDBC connector is bundled with the Documentum for Life Sciences iHub reports package.
The following optional steps enable the design and deployment of custom iHub reports through the iHub Analytics Designer.
1. One component provided by Documentum for Life Sciences iHub Reports is a Documentum
JDBC connector. This connector enables extraction of repository information through DQL
queries. The Documentum JDBC driver JAR needs dfc.properties to be bundled within the
JAR due to a known limitation. The dfc.properties is bundled inside the Documentum JDBC
• jsr173_api.jar
• krbutil.jar
• log4j.jar
• messageArchive.jar
• messageService.jar
• questFixForJDK7.jar
• subscription.jar
• vsj-license.jar
• vsj-standard-3.3.jar
• workflow.jar
• xtrim-api.jar
• xtrim-server.jar
These JARs must be copied from the Documentum Server. Some of the specific JAR file versions
may vary with the Documentum Server version.
6. In the iHub Analytics Designer, create a new project and Data Object.
7. Navigate to the .datadesign file and in the Data Sources section, right-click and select New
Data Sources.
8. On the New Data Source page, select JDBC Data Source and click Next.
9. On the New JDBC Data Source Profile page, select Manage Drivers.
10. On the Manage JDBC Drivers page, on the JAR Files tab, click Add.
11. Select the lsdmjdbc.jar file provided with this package and click Add.
12. Select all the required JARs listed in Step 1 and add them as part of the JAR Files. Click OK.
13. On the Create a new data source page, in the Driver Class list, select com.documentum.ls.oca.jdbc.jdbc20.ext.DjdbcDriverExt (v7.2).
14. In the Database URL field, type jdbc:documentum:oca:docbaseext@<REPOSITORY NAME>, where <REPOSITORY NAME> is the Documentum Server repository name.
15. In the User Name and Password fields, you can either provide the Administrator user name
and password (this causes all the reports to be executed using the Admin Session) or configure
as report parameters:
a. If you need to replace the User Name and Password at runtime, remove the default values
from the report parameters.
b. In the Data Design, create two report parameters, username and password with Data type
as String.
c. On the Edit Data Source page, under Property Binding, in the User Name and Password
fields, map the user name and password as newly created report parameters.
d. In the .datadesign file, to replace the user name and password at runtime, place the following
script in the beforeOpen script:
extensionProperties.odaUser = params["username"].value;
extensionProperties.odaPassword = params["password"].value;
16. Create new Data Sets that provide the DQL queries required to fetch the report data from the
repository. The following sample shows a dynamic mandatory parameter (with Is Required
check box selected) that needs to be passed when executing the report. The parameter should be
the i_chronicle_id of the document selected in D2.
a. In the Edit Parameters dialog box, create another report parameter named chronid, which
represents the Selected Document Chronicle ID and passes it to the document query
retrieving the required document data for the report.
b. Click the data set to open the Edit Data Set dialog box and then click Query.
c. Under Query Text, add the following query for retrieving the required document data
needed by the report:
SELECT
r.object_name AS document_name,
r.title AS title,
r.r_creation_date AS creation_date,
r.r_modify_date AS last_modified_date,
v.alias_value AS status,
r.r_version_label as version_label,
r.primary_group AS "group",
r.subgroup AS subgroup,
r.artifact_name AS artifact,
r.r_object_id as object_id,
r.i_chronicle_id as chron_id
FROM
cd_common_ref_model(ALL) r, d2_dictionary di, d2_dictionary_value v
WHERE
r.i_chronicle_id = '?' and
di.object_name = v.dictionary_name AND
di.object_name = 'Domain Document Status Display' AND
di.alias_name=r.r_object_type AND
v.object_name = r.a_status AND
v.i_position=di.i_position and
r_version_label != ' '
ORDER BY
r.r_creation_date
ENABLE
(ROW_BASED)
d. On the Outline tab in the iHub Analytics Designer, select the newly created data source.
e. For the selected data source, on the Scripts tab, add the following script:
this.queryText=this.queryText.replace("?",params["chronid"].value);
This script ensures that the dynamic content from the Query Text (in this case ‘?’) is replaced
with the added param required to execute the query.
17. To create a new report design, you need some draft data preloaded in the report designer for designing. In the report parameters username, password, and chronid, provide default values to create the data object, and then save the report design.
Note: These default values should be removed after the .data object has been created; otherwise, the packaged reports will bundle these values as default values. The Default value field must be empty.
18. On the Project Navigator tab, navigate to the newly created .datadesign object, right-click it and
select Generate Data Objects to create the new .data object used in the report design.
1. On the Project tab, right-click the new project and select New > Report. To create a new report
design, see the OpenText Information Hub Designer Guide. For the above example query, the sample
report layout is shown in the following figure:
In the report design, the data object points to the created data object. This needs to be changed to
the Data Design object after the designing is done.
2. After completing the report design, point the data source for the Report Design to the .datadesign
object so that the values are replaced based on the document selected at run time.
3. The Data Object parameters, both new and existing, must be defined as shown in the following
figure:
4. In the report design object, in the beforeOpen field for the .datadesign object, replace the existing
script with the following:
extensionProperties.odaUser = params["username"].value;
extensionProperties.odaPassword = params["password"].value;
myInsight
AMPLEXOR myInsight for Documentum (myInsight) is a third-party reporting tool developed by
AMPLEXOR and is integrated with the Life Sciences solutions to provide reporting functionality.
myInsight can be integrated into the Documentum Administrator, Webtop, or D2 user interfaces,
enabling reports to be run directly from those clients. For the D2 integration, myInsight includes
a web application that can be deployed along with the D2-based Life Sciences solution to provide
reporting widgets.
The myInsight web application dynamically generates reports based on Report Definitions. A Report
Definition is a Documentum object that defines one or more variables and DQL queries that will
generate data for the report. The look and feel and the format of a report are defined through Report
Presentation .xsl files, which are referenced in the Report Definition. Report Presentation .xsl files
can be stored as Documentum objects or can be present on the file system where the JMS (Java
Method Server) is running.
Report Widgets
The Life Sciences solution includes reporting widgets to display reports in the D2 user interface.
When a user selects a document that matches the type of document required for a reporting variable (such as a Clinical Trial, Site, or Product Registration Form), any reporting widgets displayed in the user's workspace update to display the report defined by the widget.
The following figure shows the configuration of a reporting widget.
The Widget type is External Widget. The Widget url references the myInsight web application and the URL parameters include login information, the object ID of the selected objects, and the object ID
Each resulting r_object_id represents a specific report. The object ID of a Report Definition must be associated with the External Widgets corresponding to that report. For consistency, the object name and location of the report are used for specific reports.
2. Specify the object_id in the External Widget URL. You can also specify the report location instead
of r_object_id.
This configuration is only required if the report is needed in a standalone widget. Other options
include:
• Run the report from the myInsight Reporting widget.
• Run the report from the "Reports by Object Type" widget.
Chapter 13
Change Request
The following sections provide the steps to disable the Change Request functionality in Documentum for Quality and Manufacturing, the steps to configure Release Pending as the final state for Category 1 documents, and the steps to prevent a Change Request document from being sent to a workflow when a non-current version of a document is attached to it.
12. Under Transition parameters, remove the values in the Menu label en field and click Save.
13. In the Lifecycle state table, select the Effective row.
14. In the Entry condition table, delete the Condition checked by Method entry condition row that
has the parameter CDFValidateAffectedDocumentHasApprovedCR and click Save.
15. In the Action Type table, delete the Make version action type row and click Save.
16. Select Creation > Default values template.
17. Under Values templates, select GMP Change Request Default Values.
18. In the Properties table, for the cr_document_final_state property, in the Default values field, add |Release Pending at the beginning of the existing default value. The resulting value should be |Release Pending|Effective|Withdrawn|.
19. Click Save.
Note:
— Next Major Version: If an X.Y version of a document is bound to a Change Request, the final status is Effective or Suspended with the next major version, that is, the document version will be (X+1).0.
— Bound Major Version: If an X.Y version is bound to a Change Request, the final status is Withdrawn with the bound major version, that is, the document version will be X.0.
A CIP Change Request can be closed when the following conditions are met:
• The user requesting the state change must be the Document Coordinator of the Change Request.
• The associated documents must not be involved in any active workflows.
• All of the associated documents must reach their final status. For the New or Revise change types, the final status is reached on the Next Major Version. For Reinstate and Withdraw, it is reached on the Bound Major Version, with status Effective for the Reinstate change type and Withdrawn for the Withdraw change type.
Note:
— Next Major Version: If an X.Y version of a document is bound to a Change Request, the final status is Effective or Suspended with the next major version, that is, the document version will be (X+1).0.
— Bound Major Version: If an X.Y version is bound to a Change Request, the final status is Withdrawn with the bound major version, that is, the document version will be X.0.
Documents can be sent to the Review/Approval, Approval, or Withdrawal workflows only when
there is a valid Change Request created and approved (indicated by the CIP status) for the document.
For example, one CIP Change Request can be used to make the SOP document final only once. Once
the document reaches its final status, a new CR can be created and approved to make any revisions to
the document or to withdraw the document irrespective of the old CR status.
Exceptions: An X.0 version of a document attached to a CIP Change Request is sent to the Withdrawal
workflow. If the Approvers reject the workflow, the version of the document remains in the X.0
Effective state. The Change Request cannot be closed until the document is Withdrawn.
6. Click Save.
Chapter 14
Controlled Print and Issued Print
The Controlled Print and Issued Print overlay and cover page templates can be created using PDF
acro-forms. PDF acro-forms can be created by creating an Open Office document and then exporting
it to PDF or by creating the PDF using Adobe Acrobat. The following sections describe the permitted
form fields in the templates.
Note:
• By default, Controlled Print and Issued Print are configured for use with Documentum for
Quality and Manufacturing only.
• The Controlled Print functionality is printer driver-agnostic. If you experience any issues during
printing, make sure first that the printer driver is the latest version. If problems still occur,
contact OpenText Global Technical Services and provide the PDF that is failing with the exact
Printer Model and driver information.
To configure the date/time format for #datetime or any date/time data type property, the
D2ControlledPrint.properties file provides a DATE_FORMAT value that can be
used to specify a Java SimpleDateFormat date/time format. It is also possible to use the
$datevalue(<attribute>,"<Java Date Format>") syntax (for example, $datevalue(r_modify_date,
"dd-MMM-yyyy hh:mm:ss a z")) rather than just the name of the attribute in order to specify different
date formats for various properties. However, this only works for properties on the document object
itself and does not support the #datetime computed value.
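For example, assuming the property accepts the format string directly, an entry in D2ControlledPrint.properties might look like the following, using the same example pattern shown above:

DATE_FORMAT=dd-MMM-yyyy hh:mm:ss a z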
Overlay Configuration
Overlay configuration is a two-step process:
1. Configure the profile attributes—Attribute is a descriptive label. The profile attributes section in
the Print widget displays the labels as specified in the attribute. Value represents the look-up
parameter. For example, $object_name retrieves the document name. These parameters are used
to retrieve the values from the repository and present them to the user on the profile attributes
section in the Print widget. These values will be printed on the overlay template.
In the current configuration, this step is no longer required, with the exception of #user_input, which represents the user-input field and needs to be preconfigured for the overlay template in the profile. Instead, the system can retrieve any property directly from the document and make it available in the overlay.
Therefore, you can put the document properties directly in the PDF template and avoid configuring them in the Print Profile. See Creating the Overlay PDF Template, page 259 for the steps to create the PDF. This follows the same principle as C2 overlays, which is to put the
Documentum property name as the field name, <property_name>, for example, object_name.
For backward compatibility, it also supports property names from the profile. In addition, it
supports $<property_name>, for example, $object_name, and all of the # coverpage variables as
well. In addition, CDF functions that display document status can be added as values in the PDF
template field. For more information about the CDF functions, see the AttributeExpression
JavaDocs. For an example of how to add CDF function in the PDF field, see Creating the Overlay
PDF Template, page 259.
If you want to specify a value on the Print widget at run-time, you need to configure that value
in the Print Profile.
2. Configure the PDF template—The overlay template acro-fields need to have the same strings
as the attribute labels. In the rendered PDF, the attribute specified in the acro-field is replaced
with either the value retrieved from the repository or with the value specified by the user. This
process is described in the following diagram:
Note that the Profile Attributes are now optional but supported for backwards compatibility in the
overlay as mentioned in Step 1. For example, instead of the form having Modify Date as the field
name, you can use r_modify_date as the field name and not have the Modify date in the profile.
Note: The "controlled_print_stamp" System Parameter must be manually added to force the system
to apply overlays as an overlay and not as an underlay.
CDF functions can also be added as a value in the PDF Field Name box. If you want to display
the document status by retrieving the information from the Dictionary alias, you can use the
following syntax:
To assign the overlay PDF template to a print profile:
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Controlled Printing > Profiles.
4. In the Profiles page, select a profile.
5. In the Overlay File field, click the ellipsis button and browse to the PDF template. Select the
file and click OK.
6. Click Save.
Print Reasons
Refer to the following table and update appropriate dictionaries to add or remove the reasons for
Print, Reprint, and Recall.
Action | Dictionary
Print reasons for internal users | Controlled Print Reason Internal
Print reasons for external users | Controlled Print Reason External
Reprint reasons | Controlled Print Reprint Reason
Recall reasons | Controlled Print Recall Reason
spreadsheet, and then selecting the Process Changes menu option. This is the generic Security
Group Generator processing.
Configuring Auto-Recall
Printed documents can be recalled automatically based on the lifecycle state. The following steps
outline the configuration needed for auto-recall:
1. Log in to D2-Config.
2. Select Go to > Lifecycle.
3. Under Lifecycles, select the lifecycle that must be configured.
4. Under Lifecycle state, select the lifecycle state on which the auto-recall needs to be configured.
5. Under Action Type, select Apply method.
6. In the Method field, select ControlledPrintRecallLCMethod.
7. In the Extra arguments field, add the following code:
-objectId "$value(r_object_id)"
-service_url "ControlledPrint/rest/controlledprint/doprint?_username=<<iouser>>&_docbase=<<docbase>>&objId=<<objid>>"
-dql "SELECT document_id as objectId, requestor, recipient, controlled_copy_num as ccNum FROM dm_dbo.controlled_print WHERE document_id = '<<objid>>' AND event_name IN ('print') AND print_status = 'Printed'"
-reason "Recall reason"
Note: "Recall reason" can be substituted with any valid reason.
8. Click Save.
Chapter 15
Virtual Document Templates
A virtual document (VDoc) is a document composed of other documents. VDocs are used to organize
component documents that are located in various folders across functional areas, such as Clinical
Study Report components, CTD Module 2 or 3 components, and so on. Documentum for Research
and Development provides virtual document (VDoc) support where users can create documents,
convert them to a virtual document, and add child documents to it to make a virtual document
structure.
In the out-of-the-box solution, a Clinical Study Report (CSR) Assembly virtual document is shipped
with a predefined structure to accommodate the most common use case in the industry. This section provides instructions on how to create custom VDoc templates.
Currently, the Life Sciences solution supports creating virtual documents only in Clinical template management. Template authors in other domains can still create a template by dragging and dropping documents into the Virtual Document widget and can approve that template; however, the template cannot be used when creating documents of that artifact, because VDoc template configurations are currently available only for the Clinical domain.
Each folder represents a level in the template VDoc structure and has a corresponding no-content
document with the same name alongside it. The order in which the subordinate documents are
inserted at each level is governed by the outline heading numbers stored in the a_special_app
attribute (not necessarily by object_name). The resulting template VDoc structure can be previewed
in D2 through the Virtual Doc widget.
Note: For performance reasons, the D2 Virtual Doc widget does not refresh automatically whenever
you select a VDoc in the browser. Right-click the top-level VDoc and select Display Virtual
Document to preview it.
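The insertion order can also be checked with a DQL query. The following is only a sketch; it assumes the template components are stored under a Templates/D2 folder and are dm_document subtypes, so substitute the folder path and object type used in your repository:
SELECT object_name, a_special_app
FROM dm_document
WHERE FOLDER('/Templates/D2', DESCEND)
ORDER BY a_special_app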
Field Description
Asynchronous This option builds the VDoc asynchronously and displays a message asking the user
to wait until the process is complete.
Copy/Link This behavior is governed by the D2 VDoc Template configuration, which defines whether
the child documents are copied from the template or linked to it. For documents created
from a predefined template, the child documents must be copied.
New object type The VDoc components are created as objects of type cd_clinical.
Inherit content The template structure can include placeholder documents providing
initial content for each node.
Default values The default values are derived from the category attribute on the content
template. If the category is set to 2, the Cat 2 default values template is used. The same
rule applies to the lifecycle: if the category is set to 2 on the template component, the
Cat 2 lifecycle is applied to the document. The logic for this is defined using
Qualification DQL.
Version label The initial version is 0.1.
If the components of the VDoc template are a combination of Category 2 and 3 documents,
two separate building rules must be defined for them. This causes D2 to apply the appropriate
default values template, inherited attribute rules, and lifecycle to each component. D2 also
applies the initial lifecycle actions to each component, so that they are initialized correctly. Since
the creation of the VDoc is configured to be asynchronous, users can perform other actions until
the VDoc is created.
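The exact qualification is part of the VDoc Template configuration in D2-Config. Purely as an illustrative sketch (the attribute name and the literal values are assumptions, not the shipped configuration), the two building rules might be qualified on the component category as follows:
category = '2'
category = '3'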
6. Select Go to > Inheritance.
7. Under Inheritances, select Clinical Virtual Document Template Component Inheritance.
The Clinical Virtual Document Root Inheritance configuration is also applied to each node in
the VDoc copy, which specifies the attributes to be inherited from the root VDoc node to the
subordinate nodes in the copy. This enables the VDoc child documents to inherit the same
properties and default roles as the root VDoc when the VDoc structure is created.
8. In the Selection field on the right pane, add the following values from the Source pane so that the
VDoc inherits these attributes from the VDoc template node:
• artifact_name
• category
• subject ( = sub-folder path)
• title
Note: Do not include the template-inherited attributes in this list.
9. Click Matrix.
10. Under Contexts, expand Content Templates.
11. Under Configuration elements, expand VD Template.
12. Ensure that the Clinical Virtual Document Template configuration element is mapped to the
Clinical Virtual Document Templates context.
The D2 context for this is defined as:
• Object type: cd_content_template
• Qualifier: domain = ‘Clinical’ and r_is_virtual_doc = 1
workflow lifecycle transition tasks, additional D2 actions are used to transition the VDoc child
documents as well as the root VDoc. To configure the template approval in D2-Config:
9. Click Save.
On template approval (promotion to “Effective” state), the child VDoc components are also promoted
to “Effective” in a recursive fashion. For large or complex VDoc templates, it may be more efficient to
use the CDF Virtual Document Method (server method) to accomplish this. This method is more
flexible and configurable; for example, you can transition child documents conditionally. See Review
and Approval of VDocs, page 272 for more information.
Field Description
Name Provide a name for the template
Artifact Name Select the relevant artifact for the template. Although this is not a
mandatory field, it is important to specify an artifact for a virtual
document template so that all the child documents in the VDoc
structure inherit this artifact name from the template.
Classification Applicable Artifacts: Select the artifact for which the template
is applicable. For artifact-specific templates, ensure that the
relevant Artifact Name is selected in the Applicable Artifacts list.
The top-level VDoc node is defined as a cd_content_template
object associated with the relevant artifact. In the case of the CSR
VDoc, it is Study Report Assembly.
Process Info Specify the default users in the Authors and Approvers fields for this template.
Template-Inherited Properties Category: Select the control category that will be inherited by the
documents so that the correct configurations, such as default values, lifecycles, and so on, are
used as configured in the VDoc Template configuration. Therefore, it is very important to select
the category correctly.
template:
The value in the Title field is inherited for the component document when the VDoc is created.
4. Click Next.
5. On the Choose template page, select either the Word document template to create a document
or no content to create a no-content placeholder. Click Next.
6. Repeat steps 2 through 5 to create all the required artifacts for the VDoc template components.
7. After creating all the artifacts, navigate to the Templates/D2 folder and in the Doc List,
right-click the parent virtual document, and click Display Virtual Document. The template
appears in the Virtual Doc widget.
8. Drag and drop the artifacts from the Doc List to the Virtual Doc widget as child documents.
9. In the Virtual Doc widget, right-click the parent document and click Checkin.
The usual D2 auto-naming rules apply to the component docs (Clinical Document Auto Naming in
this case). Each component document gets its own unique document number (just as for other
Clinical documents).
D2 auto-linking also applies to the component document – Clinical Study Documents (study-Specific)
in this case as shown in the following image:
To accomplish this in a configurable way, the CDF Virtual Document server method is invoked in the
RD-SSV Cat 2 lifecycle (which applies to the root VDoc). For example, in the “For Review” lifecycle
transition, the argument for the CDFVirtualDocumentMethod field is set to:
-id <root-Vdoc-object-id> -child_objects ALL
-if_child "a_status not in ('Effective', 'Withdrawn')"
-copy_attrs "a_status,authors,reviewers,approvers,readers,auditors,wf_authors,wf_format_reviewers,wf_reviewers,wf_approvers,wf_doc_coordinators"
-apply_d2_security true -context_user "$USER"
Based on the argument, all child documents except those with the status of Effective or Withdrawn
will have the listed attributes and D2 security applied based on the context of the user. Similarly
for VDoc templates, the CDF Virtual Document server method is invoked in the Content Template
Lifecycle Model. For example, in the “For Approval” lifecycle transition, the argument for the
CDFVirtualDocumentMethod field is set to:
-child_objects ALL
-if_child "a_status not in ('Effective', 'Withdrawn', 'Suspended', 'Superseded', '$value(a_status)') and r_lock_owner = ' '"
-apply_d2_security true -transition_child "For Approval" -context_user "$USER"
-run_as_server true -bind_children true
The CDFVirtualDocumentMethod has a number of significant enhancements over the standard D2
“Action on VD children” feature that can potentially be useful in your application. See the CDF
JavaDocs for details.
Currently, the VDoc binding rules are configured in the Documentum for Research and Development
solution. However, the CDF VirtualDocumentMethod is available in the CDF layer and is exposed for
all the solutions. The CDF VirtualDocumentMethod allows binding to be set when adding children to
the virtual document. To handle setting the binding outside of adding children, the method has been
updated to support a new command line parameter indicating that the binding of the children should
be set (-bind_children). When -bind_children is set to true, it checks the -binding_type passed to
the method. If the -binding_type is early, the system checks the -version_label. The following
table provides a description of these parameters:
Parameter Description
-binding_type Either early or late to denote the VDoc binding mode to use for new child documents
(not case-sensitive). Early binding means that the VDoc structure refers to a predefined version
of each component, with a specific version label, at any given time. Late binding means that the
VDoc is not bound to a defined version of each component and instead the relevant version is
selected as and when the VDoc structure is traversed, as stipulated in the “with ...” clause of the
relevant DQL query. OPTIONAL - early binding is assumed by default.
-version_label Specifies the child component version label to use for early binding. For static
bindings, specify an explicit version number such as 1.0 or use an attribute expression such as
@value(r_version_label[0]) to fix the virtual document to the current version of the child document
in each case. $value(r_version_label[0]) refers to the version number of the parent virtual
document node, which may not be appropriate. For dynamic bindings, specify a symbolic version
label such as "CURRENT". OPTIONAL - if undefined or blank, the "CURRENT" version label is assumed
by default.
Based on the values passed for -binding_type and -version_label, the binding rules are enforced on
the VDoc.
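For example, a sketch of method arguments that enables early binding of the children to their current versions could combine the parameters described above as follows; this is illustrative only and not a shipped configuration:
-child_objects ALL -bind_children true -binding_type early
-version_label "@value(r_version_label[0])" -apply_d2_security true -context_user "$USER"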
Chapter 16
Configuration Tasks
This section explains how to perform certain common configuration tasks for the Life Sciences solution.
The following changes have been made in the Life Sciences solutions to address the preceding process:
• The default value for consecutive failed login attempts is set to 3. If the user fails to provide the
correct password 3 times consecutively, the system automatically makes the user inactive. This is
applicable while providing the credentials for electronic signatures. The default value can be
changed by modifying the DM_DOCBASE_CONFIG object:
update dm_docbase_config objects set max_auth_attempt = <NEW_VALUE>
where object_name = '<DOCBASE_NAME>'
where <NEW_VALUE> is the value for the failed login attempts and <DOCBASE_NAME> is
the name of the repository used.
Note: Setting the value to 0 disables the locking out of the accounts.
Note: If required, you can configure the other options listed in the preceding table.
• An LSReportDeactivatedUser job is created to send a report of the deactivated users to the users in
the cd_report_locked_users role. The job is disabled by default, runs on a 24-hour cycle, and sends
the list of deactivated users in HTML format. To enable this job:
1. Log in to Documentum Administrator.
2. Navigate to Job Management > Jobs.
3. Right-click the LSReportDeactivatedUser job and select Info.
4. Change the state of the job from Inactive to Active and then click OK.
The frequency of the job can be changed on the Schedule tab according to your requirements.
The LSReportDeactivatedUser job calls CDFCreateObjectMethod, which generates the report of
deactivated users and saves it as an object in the repository. The following parameters are used:
• You can use the Find all deactivated users search query to search the list of inactive users. This is
enabled for users in the Controlled Document Admins Role, that is, cd_admingroup.
• Inline passwords are not used and authentication is typically handled by the platform (NT/LDAP).
User authentication and lockout can be enforced at the platform level. The Life Sciences solution
relies on the underlying operating system to enforce lockout and notification. Refer to the
corresponding documentation for more information.
D2 Mailing Configurations
Email configuration in D2 can be used for the following purposes:
• Create mailing lists that end users can use to send preconfigured batch email messages.
• Send email messages directly from the D2 Client.
• Create email distributions based on events.
• Create subscription-based notifications for events such as workflows, lifecycle transitions, and so on.
d2_mail_config: You can configure the mailing server, event-based mailing, and subject and message
for triggering event mails.
Location: D2-Config > Tools > Email. Ensure that the email server is configured and the relevant
email events are configured based on the business requirement.
Use case: End users receive notifications when a workflow is initiated for a document. You can
enable hyperlinks for enabling the user to directly navigate to the task and approve or reject it
based on the business use case.

d2_sendmail_config: You can configure the email subject, message, and attachment behavior to enable
end users to send email messages concerning a selected object.
Location: D2-Config > Go to > Send Email. Email templates can be configured and mapped with different
role-based contexts, enabling those users to use the template when they use the Send Email option.
Use case: End users want the ability to send messages to both internal and external users of the D2
system for a selected object. Using the Send Mail configuration, you can predefine the message and
subject of the email to streamline this task.
d2_mail_attachment_config: You can configure content import such that when an end user imports an
email message with an attachment, the attachment is processed and saved in the repository as a
rendition. The supported content types are .eml or Outlook formats.
Location: D2-Config > Creation > Mail Attachments. This is a global setting that allows users to
import attachments in email messages and allows the system to create a rendition of the imported
email message and its attachments.
Use case: End users need to import email messages with attachments, retain the same format of the
email, and create renditions of the email messages and their attachments.

d2_mailing_config: You can configure the recipients, subject, message, and attachments that should
be included in an email message.
Location: D2-Config > Go to > Mailing list. Ensure mailing lists are configured for the specific set
of users in the system; they can be used when configuring lifecycle transitions and context mapping
based on conditions, and to enable email triggers to the users based on the conditions.
Use case: End users need to send messages to users notifying or reminding them of a document that
needs to be reviewed. The end user selects the document and chooses the mailing list that has been
configured, which is then automatically sent to the predefined recipients with the applicable
message and attachment.
d2_subscription_config: You can define specific events and associated email messages that end users
can subscribe to, so that they can be alerted when the event occurs.
Location: D2-Config > Go to > Subscription. Subscription templates can be created for events and
mapped with different role-based contexts. This enables users to subscribe to events to get email
notifications using the Subscribe option and select templates based on the business requirement.
Note: You cannot preset subscriptions for users. It is based on user preference.
Use case: End users want to be notified any time someone checks in a new version of a particular
document. By creating a subscription, they can subscribe to this event for the document and be
notified by email when a new version has been checked in.

d2_distribution_config: You can configure a simple workflow-like process where you can define
attachments, property pages, recipients, email messages, and any electronic signature requirements
when accepting or rejecting the requested distribution actions.
Location: D2-Config > Go to > Distribution. Enables users to send bulk email. The distribution
templates or user email messages can be preconfigured and mapped to context, based on which users
can use the same in the system through the Distribution option.
Use case: End users want a quick way of sending a document for review to reviewers but want to make
sure that they accept or reject the new changes in the document. Further, any approval must be
electronically signed and the approver must indicate the reason for approval.
Variable Syntax
docbase_name: $value(docbase_name)
due_date: $value(due_date)
event_name: $value(event_name)
message_text: $value(message_text)
object_name: $value(object_name)
package_id: $value(package_id)
planned_start_date: $value(planned_start_date)
task_priority: $value(task_priority)
router_id: $value(router_id)
router_name: $value(router_name)
sender_name: $value(sender_name)
supervisor_name: $value(supervisor_name)
task_name: $value(task_name)
task_number: $value(task_number)
recipient_login_name: $value(recipient_login_name)
recipient_os_name: $value(recipient_os_name)
recipient_name: $value(recipient_name)
platform: $value(platform)
mail_user_name: $value(mail_user_name)
stamp: $value(stamp)
date_sent: $value(date_sent)
link_cnt: $value(link_cnt)
package_type: $value(package_type)
content_type: $value(content_type)
content_size: $value(content_size)
dos_extension: $value(dos_extension)
web_server: $value(web_server)
web_server_type: $value(web_server_type)
temp_file_name: $value(temp_file_name)
mailScript: $value(mailScript)
bulk_mail_file: $value(bulk_mail_file)
smtp_server: $value(smtp_server)
Subject $value(Subject)
Message $value(Message)
task.subject $value(task.subject)
task.message $value(task.message)
user_name $value(user_name)
password $value(password)
domain_name $value(domain_name)
launch_async $value(launch_async)
document.object_name $value(document.object_name)
Message Configuration
When mapping the appropriate events in the Email Configuration, you can specify the subject and
message for events. You can embed HTML messages with hyperlinks and images in the email messages,
as shown in the following sample code:
<html>
<head>
<meta http-equiv=Content-Type content="text/html; charset=windows-1252">
<meta name=Generator content="Microsoft Word 14 (filtered)">
<style></style>
</head>
<body lang=EN-US link=blue vlink=purple>
<div class=WordSection1>
<p class=MsoNormal>
<span style='font-family: "Arial", "sans-serif"'>The following workflow task has been assigned
to you ($value(recipient_os_name)) by $value(sender_name):</span></p>
<p class=MsoNormal>
<span style='position: absolute; z-index: 251659264; margin-left: -7px; margin-top: 18px; width:
778px; height: 3px'><img width=778 height=3 src="notification%20message%204_files/image001.png">
</span>
<span style='font-family: "Arial", "sans-serif"'><br> <br></span></p>
<table class=MsoTableGrid border=0 cellspacing=0 cellpadding=0 style='border-collapse: collapse;
border: none'>
<tr style='height: 31.9pt'>
<td width=101 valign=top style='width: 76.1pt; padding: .05in 5.75pt .05in 5.75pt; height:
31.9pt'>
<p class=MsoNormal align=right style='text-align: right'>
<b><span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>Task:</span></b></p></td>
<td width=537 valign=top style='width: 402.7pt; padding: .05in 5.75pt .05in 5.75pt; height:
31.9pt'>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>$value(task_name) for
$value(document.object_name)</span></p></td></tr>
<tr style='height: 30.1pt'>
<td width=101 valign=top style='width: 76.1pt; padding: .05in 5.75pt .05in 5.75pt;
height: 30.1pt'>
<p class=MsoNormal align=right style='text-align: right'><b>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>Subject:</span></b></p></td>
<td width=537 valign=top style='width: 402.7pt; padding: .05in 5.75pt .05in 5.75pt; height:
30.1pt'>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>$value(task.subject)
</span></p></td></tr>
All aliases are resolved when used in an actual D2 environment. Images can be included in the
HTML. This can be done by deploying the images with the D2 WAR file or any custom WAR files and
then using an alias set to point to the correct URL and file in each environment.
Auditing Events
D2 auditing is used to capture and record key events as documents progress through their lifecycle.
It is possible to reconfigure the audited events for each category of documents independently, if
necessary. In the default configuration, the following events are audited.
For Cat 1-3 controlled documents and Change Requests, the following events are audited:
• Document creation
• Document deletion (including D2 recycle bin delete/restore events)
• Document versioning (check-in events)
• Document property updates, including changes to role members and/or Effective, Review, and
Expiry dates, as and where applicable (audited explicitly)
• Document lifecycle state changes
• Creation of relations
• Removal of relations
• Workflow initiation
• Workflow termination (abort events)
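These events are recorded in the standard Documentum audit trail. Purely as an illustration, a user with sufficient privileges could list the captured events for a given document with a DQL query along the following lines, substituting the document's r_object_id:
select event_name, user_name, time_stamp
from dm_audittrail
where audited_obj_id = '<r_object_id>'
order by time_stamp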
8. To view the audit events in D2 Client, administrators grant users the View Audit extended
privilege in Documentum Administrator.
Date-Time UI Implementation
The date-time UI implementation in the Life Sciences solutions is based on the D2 time zone
awareness feature. The timestamp displayed in the solutions is based on the time zone set on the
client machine.
By default, the timestamps displayed in widgets such as Doc List, Versions, Renditions, and so on,
have their values converted to the client machine's time zone settings. However, when the timestamp
values are stored in the database, they are stored in UTC format.
When applying dates to PDF documents during C2 Rendition processing, that is, fusing a PDF page
to a PDF rendition, the time zone of the server where the C2 processing occurs is used. For example,
if the PDF rendition processing is running on the Web Server which is in the Pacific time zone, the
date will be converted to the Pacific time zone when reading from the repository and applying
to the PDF. This also applies to C2 View, Export, and Print processing. Depending on where the
processing takes place, that server’s time zone is used. For example, for C2 View, Print, and Export
from the D2 Client, the web server’s time zone is used. For C2 Rendition processing which could
take place on the Documentum Server, the Documentum Server’s time zone is used. For the date
used in the autolinking configuration, the Documentum Server’s time zone is considered when
processing autolinking.
It is recommended that all servers be set to the same time zone and if possible set to UTC. This
removes any confusion in that all dates within PDF renditions are UTC and any dates on the UI are in
the client’s local time zone.
Note: In the D2 Client, if you select Today in the Calendar for the date input field, the system takes
the same date at 12:00 AM irrespective of the time you performed the operation on that day. For
example, if you selected Today in the Calendar at 2:00 PM IST on 24-MAY-2018, the system displays
the timestamp as 24-MAY-2018 12:00 AM.
For more information about the time zone feature, see the following whitepapers:
• The Timezone Feature in Documentum Server
• Content Transfer Timeout Configuration for WDK-based Applications
Media Files
Media files, such as video, can be previewed in Documentum through the Life Sciences Document
Preview widget. Any media file format supported by Windows Media Player can be previewed: the
formats that are supported can be specified in the optional "mediaFormats" URL parameter to the
Life Sciences Document Preview widget as a comma-delimited list of dm_format names. Where
unspecified, the default formats are: wmv, avi, mpg/mpeg/mpg-4v, quicktime, and wav files.
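For example, to restrict previews to Windows Media video, QuickTime, and WAV audio only, the parameter value might be set as follows (the remainder of the widget URL is environment-specific and not shown here):
mediaFormats=wmv,quicktime,wav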
Note: For this release, only cross-domain sharing of documents from TMF to LSRD is supported.
Ensure that the browser is Java or CTF-enabled so that the Reference Copy does not appear
contentless.
Specific individual document artifacts in any domain in any Life Sciences solution can be shared
automatically with another domain when they are created, according to predefined rules, without
any user intervention. The same document can potentially be shared across multiple domains. The
shared document in each domain may have its own domain-specific object type, properties, lifecycle
model, role-based security, and workflows. To share an artifact with another domain, the target
Life Sciences solution must contain the same domain and the same artifact within that domain. For
example, if you want to share a Clinical document artifact in the LSTMF Clinical domain with LSRD,
the same domain, that is Clinical, and the same artifact must exist in LSRD.
Limitation:
During the creation of the reference document, if the document is not created because of invalid
inputs, then the parent document is retained without a reference document. Post creation, it is not
possible to generate a reference document for that parent document.
where,
• relationtype specifies the Documentum relation type to be established, which in this case
is Reference Copy.
• objecttype is the Documentum object type of the artifact in the target domain.
• artifactname specifies the related artifact name, which must be a valid entry (dictionary key
value) in the relevant artifact dictionary (reference model). If omitted, it is assumed to be the
same artifact name as that of the master document.
• setattributeslist is an optional series of <attribute-name>|<attribute-value> items; the items
are separated by the colon operator, and within each item the attribute name and value are
separated by the pipe operator. It enables specific attributes of the related artifact to be set
explicitly or indirectly through the D2 default values template configurations. These can use
$-attribute expressions referring to attributes of the original document.
• copyattrs specifies the attributes of the master document that need to be copied to the
reference document. If the attributes are not present in the reference document, the user has to
select them manually.
• defaultvalues specifies the default value template to be set on the reference document.
For example, for the cd_clinical domain, the configuration value will be:
relationtype=Reference Copy;objecttype=cd_clinical;
artifactname=Informed Consent Form;
setattributeslist=[doc_coordinators|cd_clinical_doc_coordinators:notification_list|cd_clinical_doc_coordinators:authors|cd_clinical_ref_copy_mgrs:domain|Clinical];
copyattrs=product_code,primary_group,subgroup,indication;
defaultvalues=Clinical Cat 2 Default Values
9. Click Save.
Caution: Only one instance of this object can exist at any point in time. If you create
multiple instances of this object, Hard Delete may not work correctly.
4. Configure the TMF Doc Delete Config properties as defined in the following table:
Field Description
Name Shows TMF Doc Delete Config.
Title Shows TMF Doc Delete Config.
Document Type Select the document type that selected
user roles can hard delete. For example,
cd_clinical_tmf_doc.
Document Statuses Select the statuses that the document must be
in before users can hard delete it.
User Roles Select the user roles that can perform the hard
delete.
Attributes to Audit Select the document attributes to capture in
the audit trail for the cdf_hard_delete
event. The system audits a maximum of six
attributes.
5. Click OK.
You can view hard deleted documents in D2 Client using the Find Delete Audit Events search query
form.
Note: In the repository, you can delete related documents by setting the integrity_kind attribute
in the dm_relation_type table for that relationship type to 2. The system deletes both the original
document (referred to by parent_id in the dm_relation table) and the related document (child).
However, deleting the child does not delete the parent in the repository. The system records only the
dm_destroy event (not the cdf_hard_delete event) for the related document. If the integrity
attribute is set to 0 or 1, the hard delete fails with an error message.
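As an illustrative sketch only, the attribute could be set with a DQL statement such as the following, substituting the relation type name actually used for the relationship:
update dm_relation_type objects
set integrity_kind = 2
where relation_name = 'Reference Copy'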
The standard Documentum functionality is that after an object is deleted, the content file relating
to it is not deleted until the dm_DMClean job is run. Therefore, prior to execution of dm_DMClean,
it is possible to recover the content. Configuring the dm_DMClean job as part of Hard Delete is
not supported.
The document package is the parent object in a Package Item relation, and the various items are child
objects. The child objects can be individual documents, folders, or virtual document nodes. In the
case of folders and virtual documents, the subordinate objects are included automatically in the
document pack, in recursive fashion.
The document pack has a main content file that is either a Microsoft Excel spreadsheet (when new
or unpacked) or a ZIP file (when packed). The Excel spreadsheet is derived from a template and is
used to record the Documentum metadata for each item when the document pack is packed. The
Excel spreadsheet template is part of the default installation and is defined as a cd_content_template
object in the repository, with the domain set to Document Pack.
State Description
New Newly-created document pack.
Refreshing A transient state representing a document pack that is being refreshed (that
is, the spreadsheet is being regenerated with the appropriate metadata).
When the refresh process is complete, it reverts to New.
Packing A transient state representing a document pack that is in the process of being
packed (through an asynchronous server method).
Packed A document pack that has been packaged up into a ZIP file and is ready for
export or unpacking, as appropriate.
Unpacking A transient state representing a previously packed document pack (or one
that has been imported as a ZIP file) that is in the process of being unpacked
(through an asynchronous server method).
Unpacked A document pack that has been successfully unpacked into the repository,
that is, any changes such as new or modified documents have been applied.
Invalid A document pack that could not be processed for some reason; for example,
it refers to document files that do not exist.
The “Importing and Exporting Multiple Documents” section in the Documentum for eTMF User Guide
provides more information about the bulk import-export process.
Note: In the Bulk Import-Export Excel spreadsheet, the LIFECYCLE_STATE is set to Read-only
(‘Y’) by default. This setting does not allow lifecycle transition to be considered when importing
documents. All documents will be imported in the “Draft” state even if the user specifies a different
state in the Status column, such as Final, To be Reviewed, and so on. To import documents to the
repository in the specified document status and perform lifecycle transitions, you must set the
Read-only column to ‘N’.
5. On the Schema worksheet, add a row for the data field definition as described in the following
table:
Column Description
Worksheet Type File List to indicate that the data
resides in the File List worksheet.
Column Heading Type the column name for your attribute. For
example, Subject.
Data Field Type the Documentum attribute name or your
custom attribute name. For example, subject.
Default Value (Optional) Type a default value for blank cells.
Read-only Type Y if it is a read-only attribute. During
an export, the system reads the attributes
from the repository and writes them to the
spreadsheet. The system ignores this column
on import. Consider highlighting the column
in light blue so that the end users know that it
is a read-only column.
The following example shows information added to the Schema worksheet for the Documentum
attribute column:
The following example shows the Documentum attribute column highlighted in light blue to
indicate that it is a read-only column:
6. Save the Microsoft Excel Spreadsheet in Microsoft Excel 97-2003 format (.xls format) or Excel
2007-2010 (.xlsx format).
An Alias column has been added to the Schema worksheet to enable dictionary aliases to be specified.
For example, you can add a "Synonym" alias to the "TMF Unique Artifact Names" dictionary and
override the default artifact display names in this column. If a dictionary alias is not specified or
blank, the locale setting is used for dictionary lookups. The default locale is "en" (English) although
this can also be specified in the schema settings.
Column Description
(Checkbox) Select the checkbox for each role to include as
an external participant.
Group Name Suffix Type a unique suffix for your participant roles.
For example, you can type _contract for a
new document Contractor role.
File Plan Column Alias Type a column alias in upper case letters with
no spaces. This alias is for system use. For
example, you can type CONTRACTOR for a new
role.
Context Group
Note: This setting is not currently used. It is
reserved for future use.
c. Type a name for this copy of the dictionary. This name should be the same as the name
defined for the participant role in the Taxonomy Dictionary Level column of the TMF
External Contributor Roles dictionary. For example, TMF Contractor Access.
d. Do not make any other changes and save the configuration.
9. Update the TMF Classification by Artifact taxonomy with your participant roles:
a. Select Data > Taxonomy from the menu bar.
b. In the Taxonomies list, select TMF Classification by Artifact.
In the dialog box, select Excel as the file format and click Modify. Save the file to your local
machine.
c. Adjust the default TMF External Contributors, TMF External Reviewers, TMF Inspector
Access, and TMF Investigator Access columns of the spreadsheet based on your role
definitions. These columns must match the name of the dictionaries created for each role. For
example, TMF Contractor Access. Add a column for each additional role. If you remove a
role, remove the column for that role.
d. Save the file, import it, and save the configuration.
Related topic:
• Adding an External Participant Role Example, page 301
In practice, these role settings are most likely to be Author to provide read and write access and
either Reader or Auditor for read-only access. The available roles are defined in the following
table:
Role Description
Document Coordinator Has full access and can reassign
document-level roles and define the
effective period of a Category 1 document
Author Has read and write access to work-in-progress
versions and can self-approve Category 3
documents
Reviewer Can participate in a review and approval
workflow
Approver Electronically-signs a Category 1-2 document
through a review and approval workflow
Reader Has read-only access to Effective versions
Auditor Has read-only access to release-ready versions,
including historic release-ready versions.
For example, Release Pending, Effective,
Superseded, but not work-in-progress and
Withdrawn versions.
None Has no access to this artifact (the artifact is
hidden)
d. In the cells of the TMF Contractors column, type the document-level roles for each artifact
for your defined roles. For this example, type Reviewer in the cells to provide read-only
access for each artifact to the role.
If you do not want a document to go through the QC process or change the QC type, perform the
following steps:
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
1. Verify that multiple Documentum Servers are available. The repository should have more than
one dm_server_config object registered.
2. Log in to D2-Config as an administrator.
3. Select All elements from the configuration filter.
4. Select Data > Dictionary from the menu bar.
5. In the Dictionaries list, select System Parameters.
6. On the Alias tab, configure the system parameters as defined in the following table:
Parameter Description
content_servers Type the Documentum Servers available
for distributed processing. Use a
comma-separated list of dm_server_config
names, or * to use all available servers.
max_threads Type the maximum number of server method
threads per Documentum Server for local
processing.
This creates the required dm_method object with timeout set to 7200s (2 hours) by default (adjust
the timeout parameters if necessary).
Due to security enhancements in Documentum Server 7.1 and 7.2, the bindfile API no longer works.
Although this does not affect the Life Sciences system, it may have an impact if the user changes the
run_as_server property to false on the following dm_method objects:
• CDFCreateObjectAsyncMethod
• CDFCreateObjectMethod
• TMFCreateLinks
Ensure that the run_as_server attribute is always set to True (1) for these dm_method objects.
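For example, a superuser could verify and, if necessary, reset the attribute with DQL along the following lines; this is shown only as one way to check it:
select object_name, run_as_server from dm_method
where object_name in ('CDFCreateObjectAsyncMethod', 'CDFCreateObjectMethod', 'TMFCreateLinks')

update dm_method objects set run_as_server = true
where object_name in ('CDFCreateObjectAsyncMethod', 'CDFCreateObjectMethod', 'TMFCreateLinks')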
Note: This D2 core method tracing is only for core method calls invoked through Life Sciences server
methods. For setting the DEBUG level for the default D2 Core Method calls, see the Documentum D2
Administration Guide.
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Menu D2.
4. Under Menus, select CDF Contributor Menu.
5. Under Contextual menus, click <Right click> and click Convert to virtual document.
6. Under Conditions, ensure that Selection is not object type "....." is selected.
7. Under Condition parameters, for the Type field, click the ellipsis (...) button.
8. Under Selection, remove the following values:
• cd_quality_gmp_approved
• cd_quality_gmp_effective
9. Click OK and then click Save.
published submission documents when they are imported into the repository through the Import
Submission function.
Follow these steps to update the PDF document information in the System Parameter D2 dictionary.
1. Log in to D2-Config.
2. Select Data > Dictionary.
3. On the Dictionary page, select All elements in the filter on the toolbar.
4. In the Dictionaries list, select System Parameters.
5. On the Alias tab, add a standard or custom DocInfo field, such as SourceObjectId or Subject, as
the value of source_object_pdf_docinfo_field.
6. In the source_object_pdf_multipleIDs_delimiter field, add the delimiter value that is used to
separate the list of source document r_object_ids in source_object_pdf_docinfo_field such as
comma, semicolon, backslash, colon, and so on.
7. Click Save.
On the Relations tab in the D2 Client, the user can click an imported submission document and
view the original source document that was used to generate it. The published version usually
contains additional markups such as headers, footers, watermarks, bookmarks, and hyperlinks
generated by the publishing tool, but it is sometimes useful to refer back to the original document
for publishing fidelity checking or to enable the original document to be identified and repurposed
for other applications.
If the publishing tool does not support embedding of Documentum attributes into PDF DocInfo
fields, or where the original documents are stored in a different repository, leave this setting blank;
source document relations will not be created at submission import time in that case.
Chapter 17
Regulatory Submissions
This section provides an overview of regulatory submissions and how it is configured in the Life
Sciences solution.
Submission Overview
A regulatory submission is a collection of documents (submission elements) sent to a regulatory
Authority with respect to an application. For example, an Investigational New Drug (IND)
application to the US Food and Drug Administration (FDA) for approval to commence clinical trials
in humans or a New Drug Application (NDA) to the FDA for approval to market a drug in the US. The
application type denotes the purpose of the application (IND or NDA, in the preceding examples).
An application may require several submissions to be made before it is approved. Various
amendments, queries, and requests for supplementary information may be requested by the
Authority and post-approval, additional submissions may be necessary from time to time, such
as Periodic Safety Update Reports (PSURs). The submission type indicates the purpose of each
submission, for example, an Initial Filing, Amendment, or PSUR. Both application types and
submission types are regional – different application and submission types are used in different
geographic regions. For example, the IND and NDA application types pertain to the US, whereas
the European equivalents are CTA (Clinical Trial Application) and MAA (Marketing Authorization
Application), respectively.
The following figure illustrates the relationship between the various submission-related objects.
An application is typically made to one health authority in one country, in accordance with a
National Procedure (NP). The exception is for applications to European member states, which can
follow either a National Procedure for specific member states (a separate application being made
to each member state in that case), or a Centralized Procedure (CP), in which case the application
is made directly to the European Medicines Agency (EMA), the central European regulatory
authority. If the application is approved by the EMA, it is approved for use across all EU member
states. For certain types of applications, such as Biological License Applications (BLAs), approval
by the EMA through the CP is mandatory; for others, it is discretionary. There is also an option to
use the Mutual Recognition Procedure (MRP) in the EU, which enables the same application to be
made to two or more member states simultaneously. Once one member state decides to evaluate
the product, it becomes the Reference Member State. The others become Concerned Member States,
acting in a reviewing or monitoring capacity. In this way, the MRP is designed to share the workload
in evaluating medicinal products across national regulatory authorities within the EU, without
compromising safety or regulatory scrutiny. To support MRP applications, it must be possible for an
application to be associated with multiple health authorities within the EU region.
XSL stylesheet files are not required for previewing. They may be useful as examples if additional
stylesheets need to be provided to support new regions or new DTD formats for existing regions.
hyperlinks. The submission log file is attached to the Regulatory Application Registration Form as a
text or crtext rendition that can be opened in Notepad from the Renditions tab in D2.
The selected submissions or eCTD sequences for the application are downloaded, imported, and
committed in the repository one by one, so that the submission view XML files (including the eCTD
cumulative view) are kept up-to-date and are valid for the last successfully-imported submission
or eCTD sequence. In the event of an irrecoverable error, such as the submission files could not be
downloaded and imported due to a network outage, the changes for the current submission are not
applied to the submission view files. In some cases, the SRF and submission files for the failed
submission or sequence may exist in the repository, even though that submission or sequence is not
visible in the Submission Viewer. However, the SRF itself is not marked as imported until the import
process is complete. This enables the failed submissions or sequences to be re-imported after the
error has been rectified and the system restored. On re-importing the remaining submissions or
sequences, any files that currently exist in the repository are automatically deleted prior to importing
the new files.
To enable previously-imported submissions or sequences to be re-imported, it is necessary to update
the relevant SRFs to mark them as not imported (or the SRFs can be deleted completely). For security
reasons, this can only be accomplished by a Documentum Superuser account, for example in
Documentum Administrator, through the following DQL statement:
update cd_reg_submission_info object
set is_imported = false
where application_description = '<application-description>'
[and submission_number = '<submission-number>']
To enable all imported submissions or sequences for the application to be re-imported, omit the last
and submission_number ... clause, so that all SRFs for the application are updated. It is not
necessary to delete the existing submission folders, subfolders, and documents from the repository,
or to reset the XML view files. This is done automatically when the submissions or sequences are
re-imported.
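For example, to allow a single sequence of a hypothetical application to be re-imported, the statement might look like the following (the application description and submission number are placeholders):
update cd_reg_submission_info object
set is_imported = false
where application_description = 'IND 123456'
and submission_number = '0003'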
These files are useful for diagnostic purposes and can be used to validate the outcome of a submission
import operation. The dump files are preserved across import operations, and either replaced or
augmented in subsequent imports. However, if the initial eCTD sequence 0000 is re-imported, the
submission views are rebuilt and any existing dump files are discarded and regenerated in that case.
Note: It is possible to disable this feature by specifying the argument -attach_dump_files as false
in the "(Import Submissions)" transition of the RARF lifecycle configuration. This option is enabled
by default.
Regional XML Files, page 335). Link this script to the XML schema configuration object by
adjusting the xml_envelope_stylesheet property accordingly.
5. If necessary, add entries to the D2 dictionary, Submission Regional Metadata Conversions, to
convert regional XML-encoded values into suitable values for storage in Documentum. See
Mapping XML Values to Documentum Attributes, page 347 for more information.
6. Import a sample submission to test the new configuration and ensure that it can be navigated
and previewed correctly.
When working with different locales, you must adjust the operating system and the database with
appropriate locale settings as suggested in the Documentum Server Installation Guide and make the
necessary changes in the Documentum Server. In addition, you must also ensure that JMS is adjusted
accordingly to the appropriate locale settings to be in sync with the Documentum Server and the
database. For example, in the JMS startup script, add the following line in JAVA_OPTS and restart
the service:
'-Dfile.encoding=UTF-8'
• eCTD Regional XML File
To support server-based prerendering of XML to HTML, the specified stylesheet must be installed in
the /System/SSV/Stylesheets folder as a standard dm_document object (with format xsl). To support
client-based rendering of XML to HTML, the stylesheet must also be installed on the application
server in the %WEB-APPS%/XMLViewer/style folder.
preview_widget_id Char(32) Specifies the widget ID of the target widget that should be used for
previewing, for example, SSVStudyTaggingFileViewer. Optional – if undefined or blank, the default
preview widget, SSVLeafDocumentViewer, is used. (Default: SSVLeafDocumentViewer)
contains_leaf_docs Boolean Indicates whether or not this XML file contains references to “leaf
elements,” that is, documents to be imported into the repository. (Default: T)
The system can support multiple versions of the same XML format. A separate XML schema
configuration object must be defined for each format or version. The xpath_qualifier setting is
used to identify and select the appropriate XML schema configuration object for each XML file
encountered during the eCTD import process, depending on whether the specified XPath expression
matches an element in that XML file. The xpath_qualifier should include a qualifier in each case
so that it only matches XML files of the appropriate format and version, for example, the XPath
expression /eu-backbone[@dtd-version='1.2.1'] only matches XML files containing a root element
named “eu-backbone” (ignoring the ”eu:” XML namespace prefix) where the “dtd-version” attribute
value of the root element is set to “1.2.1”. In other words, this schema only applies to XML files that
use version 1.2.1 of the EU regional M1 XML format, such as the following:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE eu:eu-backbone SYSTEM "../../util/dtd/eu-regional.dtd">
<?xml-stylesheet type="text/xsl" href="../../util/style/eu-regional.xsl"?>
<eu:eu-backbone xmlns:eu="http://europa.eu.int" xmlns:xlink="http://www.w3c.org/1999/xlink"
dtd-version="1.2.1">
<eu-envelope>
<envelope country="uk">
<application>
<number>N123456</number>
</application>
<applicant>Acme Pharma Inc.</applicant>
<agency-name>MHRA</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="national" />
<invented-name>My Wonder Drug</invented-name>
<inn>wonderdrug</inn>
<sequence>0000</sequence>
<submission-description>Submission of registration dossier</submission-description>
</envelope>
</eu-envelope>
<m1-eu>
<m1-0-cover>
<specific country="uk">
<leaf operation="new" xlink:href="10-cover/uk/en-cover-wonder-drug-50mg.pdf"
xlink:type="simple" checksum-type="md5" application-version="PDF 1.4"
checksum="b132fc1e9e0c5c9f5401a4288f20f60f">
<title>Cover Page (English)</title>
</leaf>
</specific>
</m1-0-cover>
…etc.
</m1-eu>
</eu:eu-backbone>
In this way, different XML formats or versions can be matched to different XML schema configuration
objects with different settings. The file system filename of the XML file itself is not significant (an
EU M1 regional XML file does not need to be named eu-regional.xml, for instance); neither is the
location of the XML file within the folder structure – the XML file recognition depends only on the
contents of the XML file. If none of the XML schema configuration objects defined in the repository
return a match for a particular XML file, the system logs a warning indicating that the XML file is
not recognized, and treats it as a standard leaf document. It is not possible to extract metadata from
unrecognized XML files. It is also not possible to preview them in the Documentum Submission
Store and View Document Viewer.
regarded as withdrawn. In the initial sequence 0000, only “new” documents are permitted.
(Required)
• xlink:href: Specifies the document content file in terms of a relative file system path from the
sequence-level folder. (Required except where operation = “delete”)
• xlink:type: Specifies the type of the xlink:href value, which is expected to be “simple”, where
defined. (Optional)
• checksum: Specifies a checksum for the content file, which can be used to verify that the
content file has not been modified since the XML file was generated. (Optional)
• checksum-type: Specifies the algorithm used to generate the checksum; usually “md5”
(where defined). This is not currently used by LSSSV. (Optional)
• application-version: Denotes the application name and version number associated with the
content file, that is, the content file format. For example, “PDF-1.4”. (Optional)
• modified-file: Denotes the previously-submitted file affected by the operation (if any) as a
relative file path from the top-level folder containing the sequence folders; the first path
element being the previous sequence number. (Required where operation code is “replace”,
“append” or “delete”; not applicable where it is “new”)
Each <leaf> element should also have a <title> sub-element, in which the document title is
specified. In practice, it is possible to use a different XML element name for leaf documents,
provided the element has attributes and a <title> sub-element defined as above – the actual leaf
document element name is configured in the XML schema configuration object. A hypothetical
example of a complete <leaf> element is shown after this list.
3. Non-leaf nodes in the document section are used to represent standard eCTD modules, sections,
and subsections, nested to an arbitrary depth. These usually represent the file system folder
structure of the eCTD sequence. Non-leaf nodes can have the following additional attribute
values defined for them, indicating sections pertaining to specific contexts:
• substance—Name of drug substance
• product-name—Name of drug product
• manufacturer—Manufacturer of drug substance or product
• indication—Specific therapeutic indication
• dosageform—Specific dosage form
• excipient—Name of excipient (inactive ingredient)
These attributes are defined in the relevant ICH DTD. In practice, Documentum Submission Store
and View enables arbitrary metadata to be specified for non-leaf nodes.
4. <node-extension> elements may also be used to represent custom extensions to the standard
eCTD folder structure. Each <node-extension> element has a <title> sub-element defining the
section title (display label), and one or more <leaf> document elements or <node-extension>
sub-elements recursively.
5. In M1 regional modules, <specific> elements can be used to denote country-specific sections,
where the country attribute value denotes the ISO country code in each case. For example,
<specific country="de"> represents a section that is applicable to Germany only. In practice, these
elements only appear in EU regional M1 XML files. The country code value eu or the aliases ema,
emea, or common can be used to denote elements pertaining to the central European Medicines
Agency (EMA) or to the EU region in general.
6. In M1 regional modules, <pi-doc> elements can be used to denote package information
documents, such as labeling documents. This may be country-specific (with a country code
defined earlier). These elements may also specify a type attribute value indicating the type of
labeling document in each case, although Documentum Submission Store and View does not use
this. These elements only appear in EU regional M1 XML files.
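The following is a hypothetical <leaf> element illustrating the attributes described in item 2 above; the file paths, checksum, and title are placeholder values and not part of any shipped sample:
<leaf operation="replace" xlink:type="simple"
   xlink:href="10-cover/uk/en-cover-wonder-drug-100mg.pdf"
   checksum="0f343b0931126a20f133d67c2b018a3b" checksum-type="md5"
   application-version="PDF 1.4"
   modified-file="0000/m1/eu/10-cover/uk/en-cover-wonder-drug-50mg.pdf">
   <title>Cover Page (English)</title>
</leaf>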
All the XML files supported in the out-of-the-box configuration conform to this standard format
except for the Japanese regional M1 XML files.
Note: Whenever a large amount of processing is involved during the import of an eCTD submission
that has multiple submission folders, it is better to allocate more resources to the Java Method Server.
For example, memory should preferably be 1024 MB or greater.
To convert Japanese regional M1 XML files into the standard format, the following XSL transformation
script is used:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<xsl:transform version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:ectd="http://www.ich.org/ectd" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:gen="universal">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
<xsl:template match="/">
<xsl:element name="jp-backbone">
<xsl:attribute name="dtd-version"><xsl:value-of select="/gen:universal/@schema-version"/>
</xsl:attribute>
<xsl:apply-templates select="*"/>
</xsl:element>
</xsl:template>
<!--Japanese M1 root XML element-->
<xsl:template match="gen:universal">
<!--Generate "admin" section containing document title, ID and Japanese MHLW envelope
information-->
<xsl:element name="admin">
<xsl:element name="title">
<xsl:value-of select="gen:document-identifier/gen:title"/>
</xsl:element>
<xsl:element name="document-id">
<xsl:value-of select="gen:document-identifier/gen:doc-id"/>
</xsl:element>
<xsl:element name="properties">
<xsl:attribute name="label"><xsl:value-of select="/gen:universal/gen:document/gen:
content-block[@param = 'admin']/gen:block-title"/></xsl:attribute>
<xsl:for-each select="/gen:universal/gen:document/gen:content-block[@param = 'admin']/*">
<xsl:choose>
<xsl:when test="name() = 'doc-content'">
<xsl:element name="property">
<xsl:attribute name="item-number"><xsl:value-of select="@param"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="./gen:property/@name"/></xsl:attribute>
<xsl:attribute name="label"><xsl:value-of select="./gen:title"/></xsl:attribute>
<xsl:attribute name="value"><xsl:value-of select="./gen:property"/></xsl:attribute>
</xsl:element>
</xsl:when>
<xsl:when test="name() = 'content-block'">
<xsl:element name="property">
<xsl:attribute name="item-number"><xsl:value-of select="@param"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="./gen:doc-content/gen:property/@name"/>
</xsl:attribute>
<xsl:attribute name="label"><xsl:value-of select="./gen:block-title"/></xsl:attribute>
<xsl:attribute name="value"><xsl:value-of select="./gen:doc-content/gen:property"/>
</xsl:attribute>
</xsl:element>
</xsl:when>
</xsl:choose>
</xsl:for-each>
</xsl:element>
</xsl:element>
</xsl:element>
</xsl:template>
<xsl:template match="*"/>
</xsl:transform>
This script is included in the standard installation, and is stored in the repository as the main content
file of the XML eCTD schema configuration object for Japanese regional M1 files. The XSL file is then
downloaded and applied automatically to Japanese M1 XML files at import time as they are loaded,
prior to processing them. A copy of this XSL transformation script can also be found on the Application
Server in the file %WEBAPPS%/XMLViewer/style/jp-regional-1-0-normalize.xslt for
reference, although it is not used by the Viewer.
The output from the XSL transformation script is another XML file, of the following form:
<jp-backbone dtd-version="Japanese schema version">
<admin>
<title>Japanese title</title>
<document-id>document identifier</document-id>
<properties label="Japanese section label">
<property item-number="nn" name="property-code" label="Japanese property label"
value="Property value"/>
...
</properties>
</admin>
<m1-jp label="Japanese section label">
<m1-01 label="Japanese sub-section label">
<m1-01-03>
<leaf operation="new" checksum-type="md5" checksum="content file checksum"
xlink:href="relative content file path">
<title>Japanese document title</title>
</leaf>
...
</m1-01-03>
</m1-01>
...
</m1-jp>
</jp-backbone>
This now conforms to the standard XML format required by Documentum Submission Store and
View. The XML schema configuration settings are configured accordingly for this XML file to
enable it to be processed as normal. For example, the xml_envelope_element XPath setting is set
to /jp-backbone/admin in the eCTD schema configuration object for Japanese M1 files; similarly,
xml_extract_leaf_docs_from is set to /jp-backbone/m1-jp, and so on.
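Pulling these together, the key settings on the Japanese M1 schema configuration object can be
summarized as in the following sketch (only the two XPath values are quoted from the text above; the
layout and the enable_xslt_processing entry are shown for illustration, on the assumption that XSL
transformation is enabled for this schema because the script above is its main content file):
xml_envelope_element = /jp-backbone/admin
xml_extract_leaf_docs_from = /jp-backbone/m1-jp
enable_xslt_processing = T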
It is possible to use a similar technique to support new eCTD formats that do not conform to the
standard XML format, as follows:
1. Create an XSL transformation script to convert the eCTD format into a standard format, just as
in the preceding example.
2. Test the XSL transformation script by linking a sample eCTD XML file to it: for example, add an
XML processing instruction of the form <?xml-stylesheet type="text/xsl" href="filename.xsl"?>
to the XML header in the sample file, and open it in the browser.
3. After verifying that the XSL transformation script is generating XML output appropriately in
the standard format, create a new XML configuration object in the repository representing
the new eCTD format, and configure the settings as described previously. Ensure that the
enable_xslt_processing option is enabled and import the XSL transformation script as the
main content file of the XML configuration object in the repository, using the Documentum
format code xsl.
4. If the transformed XML file is not correct (that is, 'leaf' nodes are missing or the XML document is
empty), use the alternative transformation provided by the xsl_ns rendition format of the eCTD
schema object, which specifies a namespace. Then verify whether the resulting XML file is correct.
5. Test the installation by importing a sample eCTD that contains the XML files that use the new
format.
You can also add XSL scripts to the existing XML schema configuration objects to transform the
metadata in XML files, for example, to convert values to lower-case, translate country names into
corresponding country code values, truncate values that are too long, and so on. However, in the
default installation, XSL transformation is only used for converting Japanese regional M1 XML
files and study tagging files. XSL scripts can also be used on the application server to generate
XML preview renditions.
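A minimal rule of this kind, shown here only as an illustrative sketch (the agency-name element is
taken from the EU envelope example later in this section), could lower-case extracted values using the
XSLT 1.0 translate() function:
<!-- Illustration only: copy agency-name values converted to lower-case -->
<xsl:template match="agency-name">
<xsl:copy>
<xsl:value-of select="translate(., 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz')"/>
</xsl:copy>
</xsl:template>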
For example, the following settings are preconfigured in the schema configuration object named
“us-regional_2-01”, which can be found in the /SSV/XML Schemas folder, and is used to generate
previews of “us-regional.xml” files:
This tells the system to extract the XML metadata from the admin element of us-regional.xml
files below the root element, fda-regional, to generate the xml_preview rendition, and extract
the leaf document elements from the m1-regional element below the root element into the main
submission view. If the user clicks the us-regional.xml element in module 1 in the Submission
Viewer, the xml_preview rendition is rendered into HTML by the relevant stylesheet (in this case, the
“us-regional.xsl” stylesheet is used) and displayed in the document preview panel.
Note: The xml_preview rendition is generated and stored in the repository automatically at import
time. It is attached to the regional XML file as a rendition. This is used only for previewing the XML
metadata in Documentum Submission Store and View. The primary content of the regional XML file
is not affected, and preserves a record of the XML file that was included in the submission. If the
imported submission folder is exported back out to the local filesystem, only the original primary
content files are exported, and not the Documentum Submission Store and View renditions, so that
an exact copy of the submitted files is exported.
If a new region or eCTD version for an existing region is to be supported, a new XSL stylesheet for it
must be provided to enable the XML preview to be generated. These stylesheets must be installed
in the %WEBAPPS%/XMLViewer/style folder on the application server, alongside the predefined
stylesheets. In some cases, the XSL stylesheets provided by the relevant Authority can be used
directly, or adapted as necessary. For example, the standard eu-regional.xsl stylesheet provided with
Documentum Submission Store and View is based on the stylesheet provided by the EMA. In other
cases, a custom stylesheet must be developed, either because the Agency does not provide one (as
with Canada), or because the Agency stylesheet is unsuitable for direct use, for example, because it
contains JavaScript or XSL functions that refer to M1 leaf elements, which are not included in the
XML preview (as with the US FDA stylesheet). In these cases, you can use one of the standard
stylesheets pre-installed with Documentum Submission Store and View as a guideline for developing
new stylesheets.
The following is a transcript of the us-regional.xsl script that is provided for this purpose:
<?xml version="1.0" encoding="UTF-8"?>
<!--U.S. Regional Stylesheet (us-regional.xsl) for previewing XML metadata in SSV-->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fda-regional="http://www.ich.org/fda">
<xsl:template match="/">
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8"/>
</head>
<body>
<h3>
<img src="icons/countryflags/us.gif" style="padding-right: 10px;"/>
US Food and Drugs Administration - Regulatory Information
</h3>
<table border="0" cellspacing="20px">
<tr><td>Applicant:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/company-name"/></td></tr>
<tr><td>Product Name:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
product-description/prod-name"/></td></tr>
<tr><td>Application Number:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
product-description/application-number"/></td></tr>
<tr><td>Application Type:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/@application-type"/></td></tr>
<tr><td>Submission Type:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/submission/@submission-type"/></td></tr>
<tr><td>Date of Submission (<xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/date-of-submission/date/@format"/>):</td><td>
<xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/date-of-submission/date"/></td></tr>
<tr><td>Sequence Number:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/submission/sequence-number"/></td></tr>
</table>
</body>
</html>
</xsl:template>
</xsl:stylesheet>
Note that this stylesheet is only used for previewing the XML metadata in Documentum Submission
Store and View: the standard agency-supplied regional stylesheets can still be included in the
eCTD submission itself (in the utils folder) and used for navigating the submission when it has
been exported to the local file system.
To extract this value into Documentum, the xml_envelope_attrs repeating attribute settings
for the EU regional schemas contain the following entry:
submission_type=submission/@type
In this case, the extracted value is initial-maa. It is possible to convert the extracted values
using attribute expressions or through an XSLT transformation script. See Mapping XML
Values to Documentum Attributes, page 347 for details.
d. For each of the above, set the corresponding flags in the is_required_envelope_attr and
override_env_attrs_on_rarf repeating attributes to T (true) or F (false), accordingly.
If an attribute is marked as “required” in the XML envelope but is missing from the XML file or has a
blank value specified for it in the XML file, this is logged as a warning. If the “override on RARF”
setting is enabled, the extracted attribute is copied to the relevant Regulatory Application Registration
Form as well as the submission folder, even if it is already defined at that level. If this flag is disabled,
the extracted attribute is only copied to the Regulatory Application Registration Form if the form
itself does not already have a defined value, that is, it is used as a default value for the Regulatory
Application Registration Form in that case.
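Taking the submission_type mapping shown earlier as an example, the three repeating attributes might
be paired up index by index, as in the following sketch (the T/F values are illustrative only, not the
shipped defaults):
xml_envelope_attrs[0] = submission_type=submission/@type
is_required_envelope_attr[0] = T
override_env_attrs_on_rarf[0] = F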
The standard XML schema configuration object for the above is called “us-regional_2-01”, and it
has the following predefined settings:
In the preceding table, XPath expressions are used to extract the relevant metadata from the
admin node of the XML file. For more information on using XPath expressions, refer to:
http://www.w3schools.com/XPath/.
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>SV-MPA</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Wonder Drug</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Submission of registration dossier</submission-description>
</envelope>
<envelope country="de">
<application>
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>DE-BfArM</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Wunderdroge</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Einreichung des Registrierungsdossiers </submission-description>
</envelope>
<envelope country="fr">
<application>
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>FR-AFSSAPS</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Médicament miracle</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Présentation d'un dossier d'inscription</submission-description>
</envelope>
</eu-envelope>
<m1-eu>
…etc.
</m1-eu>
</eu:eu-backbone>
In this case, since the XML file contains multiple envelope elements, multiple values can be
extracted for repeating attributes, such as concerned_member_states, product_trade_name, and
product_generic_name. Duplicate values are ignored. The first envelope element pertains to the
Reference Member State, and the others pertain to Concerned Member States (this only applies to EU
submissions using the MRP; otherwise there would be only one envelope element). The standard
XML schema configuration object for the above is “eu-regional_1-2-1.xml” in this case, and has the
following predefined settings:
Note the use of the [position()>1] condition in the XPath expression for concerned_member_states,
which means that all country codes except that of the first envelope element are extracted
and combined together into the concerned_member_states repeating attribute. In the case of
single-valued attributes such as country_code, application_number, and title, where there are
multiple envelope elements, the first element that has a defined, non-null value takes precedence;
usually this is the value specified in the first envelope element. Therefore, in this example, country_code
is set to sv (the code for Sweden, as specified in the country attribute of the first envelope element)
and title is set to “Submission of registration dossier” (the first title value; the German and French
title values are ignored in this case, because title is a single-valued attribute in Documentum).
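As a sketch of the [position()>1] mapping described above (the exact expression and its context node
in the shipped schema configuration may differ), the entry could take a form such as:
concerned_member_states=envelope[position()>1]/@country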
Submission-level metadata is also recorded in the Submission History XML file for each submission,
enabling this metadata to be displayed by the Documentum Submission Store and View Viewer. By
default, the following attributes are recorded for each submission or eCTD sequence:
• submission_number—Recorded in the id XML attribute.
• submission_date—Recorded in separate day, month, and year XML attributes, in order to
facilitate localization.
• submission_type—Recorded in the type XML attribute.
• submission_procedure_type—Recorded in the procedure-type XML attribute (applies to EU
submissions only).
Note that some of these attributes, such as dosage_form, drug_product_manufacturer, and so on, are
defined in the parent XML elements containing the <leaf> element, which is why the ancestor XPath
axis is used for those items.
Extracted values can also be converted using attribute expressions. For example,
health_authority=submission/@agency-code=>$upper($arg(xmlvalue))
In the example, the expression converts the agency code attribute in the regional XML file into
uppercase. The $arg(xmlvalue) function refers to the raw extracted XML value.
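Similarly, a mapping that needs a value from an enclosing element could use the ancestor axis; the
following line is a sketch only (the dosageform attribute name is taken from the ICH attributes listed at
the start of this section, and the actual shipped expression may differ):
dosage_form=ancestor::*/@dosageform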
version=
"PDF 1.4" checksum="a28a808f9b7cb01dee5e11779a70c51c">
<title>Proposed Television Advertisement</title>
</leaf>
</m1-15-2-1-1-clean-version>
<m1-15-2-1-2-annotated-version>
<leaf operation="new" xlink:href="115-promo-material/1152-materials/a/
annotated-tvc-storyboard-0814.pdf" xlink:type="simple" checksum-type="md5" ID="ddjwv125"
application-version="PDF 1.4" checksum="8241a7c5bb98dee28b1385dc261345b2">
<title>Annotated TV Storyboard</title>
</leaf>
</m1-15-2-1-2-annotated-version>
</m1-15-2-1-material>
</m1-15-2-materials>
</m1-15-promotional-material>
</m1-regional>
</fda-regional:fda-regional>
2. Formulate an XPath expression that can be used to identify the new XML files and distinguish
them from other XML files, including other versions of the regional XML file for the same region.
In this example, a suitable XPath expression is:
/fda-regional[@dtd-version='3.3']
This XPath expression only matches XML files with a root element named fda-regional (ignoring
the fda-regional: namespace prefix) and a dtd-version attribute on that root element with the value
3.3. This is used to set the xpath_qualifier setting in the new eCTD XML schema configuration
object, so that it only applies to US Regional v 2.3 XML files.
Note: The DTD version number is specified as 3.3 within the XML file, and not 2.3. In addition,
the XML namespace prefix, fda-regional:, should not be included in the XPath expression.
Otherwise, it will not be evaluated correctly.
3. Determine whether or not an XSL preprocessing transformation script needs to be applied to the
XML file to convert it into a form that can be processed by Documentum Submission Store and
View. Develop one if necessary. See Processing Standard XML Files, page 329 and Transforming
Non-Standard XML Files, page 332 for more information.
In this case, the XML conforms to the standard eCTD format (apart from the fact that it contains
FDA-specific metadata in the <admin> section), so it would appear that an XSL transformation
script is not required and Documentum Submission Store and View can process these XML files
directly. However, special encodings are used to denote the metadata in the XML file, instead of
literal values: for example,
• applicant-contact-type="fdaact4"
• telephone-number-type="fdatnt1"
• application-type="fdaat1"
• submission-type="fdast8"
• submission-sub-type="fdasst1"
It is necessary to convert these encodings into meaningful values in the XML preview for display
purposes. For example, according to the US FDA guidance documents, the applicant contact
type code “fdaact4” should be displayed as “Promotional Labeling and Advertising Regulatory
Contact”. One way to convert XML values is to add entries to the D2 value mapping dictionary,
Submission Regional Metadata Conversions. However, this only applies to XML metadata
extracted from the XML file into Documentum. Moreover, some of the XML metadata, such as
applicant contact type codes, are not extracted into Documentum attributes at all, as they are not
part of the standard object model.
An XSL transformation script can be used to convert the encoded values into literal values in the
XML preview used in the Submission Viewer. A partial transcript of this script is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:transform version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ectd=
"http://www.ich.org/ectd" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xlink-ectd=
"http://www.w3c.org/1999/xlink">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
…
<xsl:variable name="APPLICATIONCONTACTTYPES" select="concat(
'fdaact1=Regulatory', ';',
'fdaact2=Technical', ';',
'fdaact3=US Agent', ';',
'fdaact4=Promotional / Advertising', ';'
)"/>
…
<!-- Default rule to copy all XML elements unchanged -->
<xsl:template match="@* | node()">
<xsl:copy>
<xsl:apply-templates select="@* | node()"/>
</xsl:copy>
</xsl:template>
<!-- Translate contact type codes -->
<xsl:template match="//@applicant-contact-type">
<xsl:attribute name="applicant-contact-type">
<xsl:value-of select="substring-before(substring-after($APPLICATIONCONTACTTYPES,
concat(., '=')), ';')"/>
</xsl:attribute>
</xsl:template>
…
</xsl:transform>
A copy of the complete XSL transformation script can be found on the application server, in the
%WEBAPPS%/style/us-regional_2-3-normalize.xslt file. This file is provided for
reference purposes only and is not used by Documentum Submission Store and View itself.
In the preceding sample script, the application contact type encoding mappings to corresponding
display labels are defined in the XSL variable “APPLICATIONCONTACTTYPES” as a series of
name=value pairs delimited by semicolon characters. A default XSL processing rule is then
defined, which applies to all XML nodes except those that match other, more specific rules. This
copies the XML elements, attributes, and textual data values to the output XML file unchanged.
Then, a rule is defined for converting the “applicant-contact-type” attribute values (in any
XML element) from the encoded forms into the corresponding display values. For example,
the element:
<applicant-contact-name applicant-contact-type="fdaact4">Regulatory
Manager X</applicant-contact-name>
in the input XML file is converted to:
<applicant-contact-name applicant-contact-type="Promotional /
Advertising">Regulatory Manager X</applicant-contact-name>
in the output XML file. This enables the Submission Viewer to display applicant contact type
codes as meaningful values in the XML preview. Similar variables and XSL processing rules for
the other FDA value encodings can be defined in the same way, for converting application type
codes, submission type codes, submission sub-type codes, and so on.
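For instance, a corresponding rule for the submission type codes might look like the following sketch
(the variable name and the display labels are placeholders, not the values from the FDA guidance):
<xsl:variable name="SUBMISSIONTYPES" select="concat(
'fdast1=Display label for fdast1', ';',
'fdast8=Display label for fdast8', ';'
)"/>
<xsl:template match="//@submission-type">
<xsl:attribute name="submission-type">
<xsl:value-of select="substring-before(substring-after($SUBMISSIONTYPES, concat(., '=')), ';')"/>
</xsl:attribute>
</xsl:template>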
Another issue with the US regional XML files is that, according to the FDA regional M1 eCTD
specification, certain documents such as administrative forms can appear in the <admin> section,
as well as in the <m1-regional> section. However, Documentum Submission Store and View
requires all of the leaf documents to be encapsulated within a single XML element, separate from the
XML metadata section (the <admin> element, in this case) so that they can be extracted into the
consolidated module m1-m5 submission tree view. To address this, an additional rule can be
added to the XSL transformation script to copy the <form> elements to a dummy section, named
<m1-1-admin-forms>, below the <m1-regional> node in the output XML file, as follows:
<xsl:template match="//m1-regional">
<xsl:element name="m1-regional">
<xsl:element name="m1-1-admin-forms">
<xsl:for-each select="//form">
<xsl:element name="form">
<xsl:attribute name="form-type">
<xsl:value-of select="substring-before(substring-after($FORMTYPES, concat(@form-type,
'=')), ';')"/>
</xsl:attribute>
<xsl:copy-of select="node()"/>
</xsl:element>
</xsl:for-each>
</xsl:element>
<xsl:apply-templates/>
</xsl:element>
</xsl:template>
These forms are then extracted along with the other <m1-regional> leaf documents, appearing in the
“m1-1-admin-forms” section in the submission tree view. (The form type codes are also converted
into meaningful display values as part of this process, using the technique described previously.)
It is possible to test a new XSL transformation script before installing it by linking a sample XML
file to it. To do this, copy the sample XML file and XSL script to a temporary folder in the local
file system, and open the sample XML file in Notepad or any other XML Editor. Insert an XML
processing instruction into the XML sample file, immediately after the XML header, to link it
to the XSL stylesheet script file, as follows:
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="us-regional-2-3-normalize.xslt"?>
<fda-regional:fda-regional xmlns:fda-regional="http://www.ich.org/fda" xmlns:xlink=
"http://www.w3c.org/1999/xlink" dtd-version="2.01" xml:lang="en">
…
</fda-regional:fda-regional>
In the Developer Tools HTML window, it is possible to drill down into the transformed XML
output (which is itself an XML document, and not an HTML document in this case) to verify that
the XML structure has been passed through correctly and the encoded attribute values have been
translated into the corresponding display values. If the XML output appears not to have been
transformed, check the Console tab for errors and make sure the stylesheet script file name in
the sample XML file is specified correctly in the XML header.
When the XSL stylesheet is working correctly, it can be installed as the primary content file
of the new schema configuration object and the enable_xslt_preprocessing option turned on
for that schema. Documentum Submission Store and View then downloads and applies the
transformation script automatically to the relevant XML files at import time, prior to any further
processing, such as XML metadata extraction or extraction of module 1 leaf documents into
the submission tree view. Note that this process does not alter the original regional XML file
included in the m1 section of the eCTD submission: instead, the original XML file (the input
file) is saved as the primary content of the regional XML document in the repository, and the
transformed XML output file is saved as a separate xml_preview rendition of this document. This
rendition is only used internally by Documentum Submission Store and View to generate the
XML preview when the regional XML file is selected in the Submission Viewer web application.
If the imported submission files are subsequently exported from Documentum back out to the
local file system, only the original primary content files are exported. Therefore, in this case, the
original regional XML file that was submitted to the FDA is exported, containing the original
encoded values, and not the transformed version.
4. Identify the top-level XML elements containing:
a. the regional XML metadata, and
b. the leaf document references.
In this case, the regional XML metadata is contained in the <admin> element and the leaf
document references reside in the <m1-regional> element (including the admin forms after
into the main submission tree view to combine them with the main eCTD backbone file
references for modules 2-5.
b. The HTML page should include a top-level <div class="envelope"> element in the HTML
body, in order to provide scroll bars in the Submission Viewer document preview panel.
See the following example.
c. Predefined icons for various country flags are provided as part of the Documentum
Submission Store and View XMLViewer web application and can be added to the title bar if
required. See the following example.
<?xml version="1.0" encoding="UTF-8"?>
<!-- U.S. Regional Stylesheet (us-regional.xsl) for previewing XML metadata in SSV -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fda-regional="http://www.ich.org/fda">
<xsl:template match="/">
<html> <head> <meta http-equiv="content-type" content="text/html; charset=UTF-8"/> </head>
<body>
<div class="envelope">
<h3> <img src="icons/countryflags/us.gif" style="padding-left: 10px; padding-right:
10px;"/>US Food and Drugs Administration - Regulatory Information</h3>
<!-- Application summary info table -->
<table border="0" cellspacing="20px">
<tr><th align="left">Applicant ID:</th><td colspan="2"><xsl:value-of select=
"/fda-regional:fda-regional/admin/applicant-info/id"/></td></tr>
<tr><th align="left">Company Name:</th><td colspan="2"><xsl:value-of select=
"/fda-regional:fda-regional/admin/applicant-info/company-name"/></td></tr>
<tr><th align="left">Submission Description:</th><td colspan="2"><xsl:value-
of select="/fda-regional:fda-regional/admin/applicant-info/submission-description"/></td>
</tr>
</table>
<p/>
<!-- Contact info table -->
<table width="100%" border="1" bordercolor="#564742">
<th width="150px" bgcolor="#FEF4F0" style="color=#333333">Contact Information
</th>
<td>
<xsl:for-each select="/fda-regional:fda-regional/admin/applicant-info/
applicant-contacts/applicant-contact">
<table width="100%" border="0" cellpadding="2">
<tr> <td style="border:0;"><strong><xsl:value-of select="applicant
-contact-name"/>
</strong> - <xsl:value-of select="applicant-contact-name/@applicant-contact-type"/>
</td> </tr>
<xsl:for-each select="telephones">
<tr> <td style="border:0;">Tel.: <font color="red"><xsl:value
-of select="telephone"/>
</font> - <xsl:value-of select="telephone/@telephone-number-type"/></td> </tr>
</xsl:for-each>
<tr> <td style="border:0;">E-mail: <a href="mailto:{emails/email
[1]}?subject={concat(/fda-regional:fda-regional/admin/application-set/application[1]
/application-information/application-number/@application-type, ' application ',
/fda-regional:fda-regional/admin/application-set/application[1]/application-
information/application-number,' - ', /fda-regional:fda-regional/admin/applicant-info/
submission-description)}"><xsl:value-of select="emails/email[1]"/></a></td> </tr>
<xsl:if test="position()!=last()"> <tr> <td style="border:0;">
<hr/> </td>
</tr> </xsl:if>
<xsl:if test="position()=last()"> <tr> <td style="border:0;">
<p/> </td>
</tr> </xsl:if>
</table>
</xsl:for-each>
</td>
</table>
</p>
<!-- Application info table -->
<table width="100%" border="1" bordercolor="#564742">
<th width="150px" valign="middle" bgcolor="#FEF4F0" style="color=#333333">
Application Information</th>
<td>
<table width="100%" border="0">
<xsl:for-each select="/fda-regional:fda-regional/admin/application-set/application">
<tr><td><table width="100%" border="0" bordercolor="#564742" cellpadding="2">
<xsl:choose>
<xsl:when test="@application-containing-files='true' or @application-containing
-files='false'">
<tr> <td style="border:0;">Application Containing Files: <xsl:value-of select=
"@application-containing-files"/></td> </tr>
</xsl:when>
<xsl:otherwise>
<tr> <td style="border:0;">Application Containing Files: This
value must be set to either true or false.</td> </tr> </xsl:otherwise>
</xsl:choose>
<tr> <td style="border:0;">Application Type: <xsl:value-of select="descendant::
application-number/@application-type"/></td></tr>
<tr><td style="border:0;">Application Number: <xsl:value-of select="descendant::
application-number"/></td></tr>
<tr><td style="border:0;">Submission Type: <xsl:value-of select="descendant::
submission-id/@submission-type"/></td></tr>
<xsl:if test="(string-length(/.//submission-id/@supplement-effective-date-type)
> 0) and (.//submission-id/@supplement-effective-date-type != ' ')">
<tr><td style="border:0;">Supplement Effective Date: <xsl:value-of select=".//
submission-id/@supplement-effective-date-type"/></td></tr>
</xsl:if>
<tr><td style="border:0;">Submission Id: <xsl:value-of select="descendant::
submission-id"/></td> </tr>
<tr><td style="border:0;">Submission Sub-Type: <xsl:value-of select=".//
sequence-number/@submission-sub-type"/></td></tr>
<tr><td style="border:0;">Sequence #: <xsl:value-of select='substring
(format-number(descendant::sequence-number, "0000"),1, 4)'/></td></tr>
<xsl:if test="(string-length(/descendant::cross-reference-application
-number) > 0) and (descendant::cross-reference-application-number != ' ')">
<xsl:for-each select=".//cross-reference-application-number">
<tr><td style="border:0;">Cross Reference Number:
<xsl:value-of select="."/></td></tr>
<tr><td style="border:0;"><xsl:value-of select="@application-type"/>
</td></tr>
</xsl:for-each>
</xsl:if>
<xsl:if test="position()!=last()"> <tr><td style=
"border:0;"> <hr/></td></tr> </xsl:if>
<xsl:if test="position()=last()"> <tr> <td style="border:0;"> <p/>
</td> </tr> </xsl:if>
</table>
</td>
</tr>
</xsl:for-each>
</table>
</td>
</table>
</div>
</body>
</html>
</xsl:template>
</xsl:stylesheet>
To test the preview stylesheet prior to installing it, link a sample XML file to it using the technique
described previously for preprocessing transformation stylesheets. For example:
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="us-regional-2-3.xsl"?>
<fda-regional:fda-regional xmlns:fda-regional="http://www.ich.org/fda" xmlns:xlink=
"http://www.w3c.org/1999/xlink" dtd-version="2.01" xml:lang="en">…
</fda-regional:fda-regional>
In this case, the preview stylesheet is named us-regional-2-3.xsl and is expected to reside in the
same folder as the XML document itself. When the XML sample file is opened in the browser,
the transformed HTML output should be displayed correctly:
Note that in this case, the encoded XML metadata values are displayed, because the HTML output
is generated directly from the original XML file rather than from the transformed XML file; the test
nevertheless allows the preview layout to be verified. When the preview stylesheet is used in Documentum
Submission Store and View, it is applied to the transformed XML file and shows the decoded values.
7. Install the XSL preview script in the %WEBAPPS%/XMLViewer/style folder on the D2 web
application server. If the stylesheet refers to external files such as images or auxiliary scripts,
make sure these are also installed in the appropriate locations. There is no need to restart the
application server.
Note: It is not necessary to install the new DTD file or stylesheet file provided by the Agency.
These files are included automatically by the eCTD publishing tool as part of the submission,
in the util/dtd and util/style folders below the eCTD sequence folder. Documentum
Submission Store and View imports these into the repository alongside the other submission
files, so that the entire sequence folder can be exported back out to the local file system from
Documentum if necessary. The DTD and stylesheets can then be used to validate and preview
the eCTD XML files in the local file system, outside of Documentum. However, they are not
processed or required by Documentum Submission Store and View itself.
8. To represent the new regional XML format, create a new XML schema configuration object of
type cd_ectd_xml_config in the /System/SSV/XMLSchemas folder in the repository. This is
not currently possible in D2 itself because a D2 creation profile is not defined for these objects.
However, a new schema configuration can be created in Documentum Administrator as follows:
a. Log into Documentum Administrator as the Documentum installation owner, for example,
dmadmin, with Superuser privileges in Documentum.
b. Navigate to the /System/SSV/XMLSchemas folder in the repository.
c. Select an existing XML schema configuration object that is most similar to the new schema,
for example, the us-regional_2-01 schema, or select an arbitrary schema.
d. Create a copy of this schema.
e. Modify the new schema properties as described in the following table:
application_num=application-set/application/application-information/application-number
application_type=application-set/application/application-information/application-number/@application-type
submission_number=application-set/application/submission-information/sequence-number
submission_type=application-set/application/submission-information/submission-id/@submission-type
Required: For each preceding XML envelope attribute, specify T (enabled) if the attribute is
mandatory, or F (disabled) if it is optional (default: T). If an attribute is marked as mandatory and
there is no defined value for it in the regional XML file, or the specified value is blank, and a value
is not specified explicitly in the SRF (or inherited from the RARF), a warning message is logged in
the import log file at import time.
After the XML schema configuration properties have been set up correctly, click OK to save
the changes in Documentum.
f. If an XSL preprocessing script is to be applied, check out the new XML schema configuration
object, and use the check-in from file option to upload the preprocessing script to it from
the local file system as the primary content file. Select the XSL Stylesheet format for this file
(Documentum format code xsl) and use the Check-in as same version option.
Note: It is possible to check-in the changes as a new version of the schema configuration
object. Documentum Submission Store and View always uses the latest (“CURRENT”)
version in each case.
This completes the Documentum Submission Store and View configuration process and it is
now possible to import eCTD sequences using the new format. It is not necessary to restart the
Documentum Server, Java Method Server, or Application Server. Documentum Submission Store and
View should pick up the new XML schema configuration automatically on the next import cycle.
Submission Filestore
Submissions are generated by some kind of publishing tool and stored in a network filestore that
is accessible to both the users and the Documentum Server. This enables submission folders to be
selected in the D2 browser, and the contents of those folders to be retrieved asynchronously through
the submission import process, which runs as a back-end server method on the Documentum Server
(Java Method Server).
The folder path from the Documentum Server to the submission filestore can be different from the
folder path from the user's desktop to the same submission filestore. For example, if the Documentum
Server is running on Unix, the submission filestore path could be /mnt/ext/eSubmissions,
depending on where it is mounted, whereas on the user's desktop, it could be a UNC path to
the relevant server, for example, \\FSVRM1\eSubmissions. Windows-mapped drives can also
be used for the client path, but the same drive letter must be mapped on each user desktop for the
path to be valid in all cases, for example, S:/eSubmissions. However, Windows-mapped drives
cannot be used for the Documentum Server path, because Windows does not allow them to be used
by processes running as a service, such as Documentum. A UNC path from the server is required.
If the Documentum Server is behind a firewall, it may be necessary to replicate the filestore at the
storage level, or through a sync-and-share utility, such as Syncplicity, so that the same content is
accessible from both sides. WRITE access is required to select the submission filestore because of a D2
limitation. Documentum Submission Store and View does not write to the submission filestore (write
access is only required for submission publishing). If necessary, enable sharing on the submission
filestore itself, so that the appropriate users (including the Documentum installation owner account
used by the Documentum Server, such as dmadmin) have at least Read access to the appropriate
folder. You may need to set up a local user account for dmadmin on the machine hosting the filestore,
so that the Documentum Server can connect to it as that user.
The Documentum for Life Sciences Solution Installation Guide provides the steps for registering
submission filestores.
For eCTDs, the user must select either an individual sequence folder, or a parent folder containing a
series of sequence folders (that is, an application-level folder), in which case those sequences that have
not already been imported are imported into the repository in increments. For NeeS, the selected
folder must be a submission-level folder to be imported in its entirety. Existing submission folders in
the repository are not updated. To replace existing submission folders in the repository, you must
delete the existing submission folder from the repository and reimport it from the external filestore.
1. Log in to D2-Config.
2. Select Data > Dictionary.
3. On the Dictionary page, select All elements in the filter on the toolbar.
4. In the Dictionaries list, select System Parameters.
5. On the Alias tab, replace the value of the d2_url key from localhost to the correct URL address
of D2 on the application server.
6. On the Alias tab, replace the value of the xmlviewer_url key with the XML Viewer URL on the
application server (for example, http://<app server host>:<default port>/XMLViewer).
Note: You need to set the XML Viewer URL in case it has not been set during the installation.
7. Click Save.
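For illustration, assuming a hypothetical application server host named appserver01 listening on port
8080 (and the default D2 and XMLViewer context paths), the two aliases might be set as follows:
d2_url = http://appserver01:8080/D2
xmlviewer_url = http://appserver01:8080/XMLViewer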
Submission Navigator Widget URL Parameters ("WG EXT Submission History View");
Default URL Path: /XMLViewer/XMLViewer.html
• type: List of object type(s) to be rendered by this widget. Default setting: cd_reg_admin_info,
cd_reg_submission_info, cd_submission_element
• style: XSL stylesheet to use to render the XML file for the relevant application into HTML in the
widget panel. (The specified stylesheet must be installed in the %WEBAPPS%/XMLViewer/style
folder on the Application Server.) Default setting: submission-navigator.xsl
• docViewerWidget: Widget ID of the Submission Document Preview widget. Default setting:
SSVLeafDocViewer
• docViewerEvent: D2 event code to send to the Submission Document Preview widget when a
document is selected in the navigation panel. Default setting: D2_CUSTOM_EVENT
• initView: Initial display mode to use for application-level views, as follows: current (eCTD
“current” view or first NeeS submission, as appropriate), or cumulative (eCTD “cumulative” view
or first NeeS submission, as appropriate). Default setting: toc
• refreshLoginTicketInterval: Periodic interval, in seconds, to refresh the D2 login ticket, to prevent
tickets from expiring. The specified value should be less than the Documentum login ticket timeout
value, as configured in the dm_server_config object for the repository (default = 5 minutes). It
should allow a margin of at least 10 seconds for D2 to respond to new ticket requests. Default
setting: 240
• workspaceView: D2 view labels in the workspace in which this widget is installed. A
comma-delimited list of view labels can be specified if necessary; use “%20” to denote a space
character in the URL. To avoid unnecessary processing, the currently-selected Regulatory
Application Registration Form or Submission Registration Form XML file is not downloaded and
rendered until the relevant view is selected. Default setting: Submission%20View
• active: Indicates whether or not the widget is active in the default workspace view. In single-view
workspaces, the value should be set to true; for multi-view workspaces, specify active=true if the
widget is in the initial (default) view, and active=false otherwise. Default setting: false
• docbase: Repository name. The D2 token $DOCBASE can be used here. Default setting: $DOCBASE
• username: Username to use for the current session. The D2 token $LOGIN can be used here.
Default setting: $LOGIN
• password: Password to use for the current session. The D2 token $TICKET can be used here.
Default setting: $TICKET
The following URL is an example widget URL for the Submission Navigator widget:
/XMLViewer/XMLViewer.html?workspaceView=View%20Submission&active
=false&format=xml&type=cd_reg_admin_info,cd_reg_submission_info,cd
_submission_element&style=submission-navigator&docViewerWidget
=SSVLeafDocViewer&docViewerEvent=D2_EVENT_CUSTOM&initMessage=Select_a
_Regulatory_Application_Registration_Form,_Submission_Registration_Form
_or_archived_submission_document&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET
6. Click Save.
7. Under Widgets, select WG EXT SSV Leaf Element View and configure the widget URL using
the following URL parameters:
Submission Document Preview Widget URL Parameters ("WG EXT SSV Leaf Element
View"); Default URL Path: /XMLViewer/DocViewer.html
• widget_id: The ID of this widget, as specified in the docViewerWidget parameter of the Submission
Navigator URL. Default setting: SSVLeafDocViewer
• workspaceView: D2 view labels in the workspace in which this widget is installed. A
comma-delimited list of view labels can be specified if necessary; use “%20” to denote a space
character in the URL. To avoid unnecessary processing, the currently-selected Regulatory
Application Registration Form or Submission Registration Form XML file is not downloaded and
rendered until the relevant view is selected. Default setting: Submission%20View
• active: Indicates whether or not the widget is active in the default workspace view. In single-view
workspaces, the value should be set to true; for multi-view workspaces, specify active=true if the
widget is in the initial (default) view, and active=false otherwise. Default setting: false
• autoSelect: Indicates whether or not the widget should track and preview the currently-selected
document in the D2 Doc List widget. If false, the widget responds to custom menu events or events
fired by some other widget (for example, the Submission Viewer widget) targeted at this widget ID.
Default setting: false
• type: Comma-delimited list of expected object types. Where specified, only selected documents of
the specified type are rendered. Optional; if unspecified, all object types are rendered, if they have
suitable content. Default setting: null
• useC2Overlays: Set to true to display PDF renditions with watermarks or signature pages,
according to the C2 view configuration; false to display PDFs without C2 watermarks or signature
pages. Default setting: false
• imageFormats: Comma-delimited list of in-line image formats supported by the browser. Default
setting: jpeg,gif,png
• docCompareLeftWidgetId: Widget ID of the left document preview panel used in the “Compare”
view. Optional. If undefined or left blank, the comparison view is disabled. Default setting:
lsDocViewer1
• docCompareRightWidgetId: Widget ID of the right document preview panel used in the “Compare”
view. Optional. If undefined or left blank, the comparison view is disabled. Default setting:
lsDocViewer2
• docCompareEvent: D2 event code to send to the comparison widgets to initiate a side-by-side
comparison operation. Default setting: D2_EVENT_CUSTOM
• initMessage: Message to display when the widget is initially loaded and nothing is selected for
previewing. Underscore characters can be used to represent spaces here, to make it easier to specify
the message in the URL. Default setting: Select_a_document_in_the_submission_viewer.
• docbase: Repository name. The D2 token $DOCBASE can be used here. Default setting: $DOCBASE
• username: User name to use for the current session. The D2 token $LOGIN can be used here.
Default setting: $LOGIN
• password: Password to use for the current session. The D2 token $TICKET can be used here.
Default setting: $TICKET
8. Click Save.
The use of the standard D2 PDF Preview widget is not recommended in multi-view workspaces
because it responds to all document selection events in the Doc List widget irrespective of whether
or not the widget is visible. This can lead to performance issues when navigating the repository.
Instead, it is recommended that the custom Life Sciences document preview widget is used for
previewing. In the standard workspaces, the Life Sciences document preview widget is used by
default in several places, with different URL parameters passed to it in each case, so that the widget
only downloads and renders content when it is in the currently-active view.
The following table summarizes the default widget configurations used for document previews:
=lsSSVLeafDocViewer&initMessage=Select_a_study_tagging_file
_in_the_Submission_Viewer&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2
8. Save the configuration.
Note: If you use Documentum D2 in HTTPS mode, then you must also specify the Documentum
Submission Store and View URLs in HTTPS mode. This is also applicable in cases where the HTTP
login gets automatically redirected to the HTTPS mode for D2.
When a submission is imported, the inter-document PDF hyperlinks and bookmarks are resolved
during import based on the d2_url and xmlviewer_url parameter values specified in the System
Parameters dictionary. However, if you change the Application Server URL after importing the
submission, it impacts the PDF hyperlink navigation on the imported submission documents because
they still reflect the old Application Server URLs. In such a scenario, the user can run the Index
Submission Folders Migration utility. For more information, see Using the Index Submission Folders
Migration Utility, page 386.
Another option is to use an LBR (load balancing router) or proxy server as it enables the target
app server hosts to be reconfigured easily. You then configure the base URL for D2 to point to the
LBR or proxy server, and it forwards the HTTP requests to the relevant app server. It should be
configured to use “sticky sessions” so that the same app server is used for a particular client session,
rather than spreading it across multiple servers. For example, you can set up an Apache Tomcat
proxy server. For setting up the proxy server, refer to the “Proxy Support HOW-TO” article on
the Apache Tomcat website.
This makes it easy for users to switch back and forth between the source and target documents.
However, the JavaScript in embedded PDF documents is only supported on Internet Explorer and
Firefox; it does not work in Chrome. In the latest version of Chrome (62.0.3202.6), it is no longer
possible to replace the built-in PDF viewer with the Acrobat plugin. Therefore, for compatibility
with the Chrome browser, this option is disabled by default. If Chrome does not need to be
supported, enabling this option is recommended for improved usability. However, existing imported
submissions may need to be reindexed or reimported in order to take advantage of this functionality.
Note: These changes only apply to the PDF preview renditions used internally by LSSSV for
inter-document hyperlink navigation within D2. They do not affect the original submitted PDF
content files. If the submission folder is exported, the original submitted PDF files (only) are included
in the export.
</study-identifier>
<study-document>
<doc-content xlink:href="../../../../index.xml#id106245">
<title>SN-3.001.01 – Overview of study objectives</title>
<file-tag name="synopsis" info-type="ich"/>
</doc-content>
<doc-content xlink:href="../../../../index.xml#id106246">
<title>SN-3.001.02 – Long-term dose response results in human trials for Wonderdrug 50mg
capsules</title>
<file-tag name="study-report-body" info-type="ich"/>
</doc-content>
<doc-content xlink:href="../../../../index.xml#id106247">
<title>SN-3.001.03 – Amendments to Clinical Study Protocol</title>
<file-tag name="protocol-or-amendment" info-type="ich"/>
</doc-content>
</study-document>
</ectd:study>
The STF XML file resides in the same folder as the study report documents; in this case, the
5312-compar-ba-be-stud-rep folder (used for comparative bioavailability/bioequivalence studies).
Documentum Submission Store and View uses a built-in XSL script, stf-2-2-normalize.xslt, to convert
the STF XML files into a standard format that is easy to process (just as for other non-standard XML
files, such as Japanese regional M1 XML files). This script is installed as the primary content file of the
XML schema configuration object named stf_2-2, which applies to v 2.2 Study Tagging Files; XSL
transformation is enabled for this schema, so that all STF XML files are transformed automatically by
this script.
This information can then be displayed in the main Submission Viewer panel, with links to the three
study reports arranged underneath the study tagging file node itself. If the version 2.2 study tagging
file format is revised in the future, a new XSL transformation script may need to be developed to
transform it into the same format as above, so that it can be processed by LSSSV. To accomplish this:
1. Copy the /XMLViewer/style/stf-2-2-normalize.xslt script provided, rename it accordingly, and
edit it to convert the new format into the same “normalized” XML format described above.
2. In Documentum Administrator, copy the stf_2-2 XML schema configuration object and rename it
accordingly. Adjust the xpath_qualifier setting to refer to the new version of the STF (see the
example following this procedure).
3. Use the check-out/check-in from file functionality in Documentum Administrator to replace the
main content file of the new XML schema configuration object with the new XSL transformation
script developed in step 1.
4. Import a sample eCTD submission using the new STF format and verify that it can be navigated
correctly.
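For step 2, assuming the revised STF declares its version in a dtd-version attribute on the root study
element (as the regional XML files do), the adjusted xpath_qualifier might look like the following
sketch, where the version number is purely hypothetical:
/study[@dtd-version='2.3']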
Chapter 18
Migration and Integrity Check Utilities
This section describes how to install and use migration and integrity check tools included in the
Life Science solutions.
The root cause lies in the way in which the underlying D2 core method is implemented. This
is the server method (part of the D2 core product) that is responsible for applying the relevant D2
configurations to an individual repository object. It processes objects one by one in serial fashion,
and in each case it evaluates the D2 context rules, as defined in the D2 configuration matrix for the
application (which are essentially DQL qualifiers), to determine which auto-naming, auto-linking,
and security configurations apply to the object. This does not scale up well when applied to a large
set of repository objects. It does not support multi-threaded processing, and causes redundant
processing for objects with similar properties. The processing rate when using the default D2 core
method in this way is very poor – typically around 3000 objects per hour.
The ApplyD2Configurations utility provides a wrapper around the D2 core method that supports
multi-threaded processing, which can boost the processing rate by a factor of 10 or more. With a
multi-node Documentum Server architecture (comprising two or more active Documentum Servers),
it is possible to scale the process out horizontally, using the distributed processing options. However,
to match the processing rate of EMA, some 10 Documentum Servers or more may be required.
Another limitation of the ApplyD2Configurations utility is in a Documentum Server failover setup
where the method does not have the capability to check if the Documentum Server instance is
available to process the batch. It assumes all the Documentum Server instances are up and running
and available to process the batch. Therefore, you must enable the distributed parallel processing
only when all the Documentum Server instances are available but not in the failover setup.
The Documentum Controlled Document Foundation Developer API Javadocs provides more information
about the ApplyD2Configurations tool.
In this example, C:\Users\dmadmin> is the Windows command prompt, indicating the folder where
the scripts are installed. The parameters are the same for UNIX:
$ sh ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true
D2 auto-naming, auto-linking and security are applied by default. To apply them selectively, specify
the appropriate flags on the command line explicitly; for example,
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true -auto_naming false -auto_linking false -security true
To process new documents, include the –create true argument, so that D2 applies auto-naming
and auto-numbering correctly. Otherwise, it retains the existing auto-numbering values. You can also
specify a D2 lifecycle state transition to apply to each document, through the –state_transition
argument. For example, –state_transition "(init)" applies the initial lifecycle state actions
to each document, as specified in the D2 lifecycle configuration for the “(init)” state.
The –delete_empty_folders option is also useful for cleaning up any empty folders below the
specified top-level folder. For example,
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true -auto_naming true -auto_linking true -security true -delete_empty_folders
"/Clinical/Cardiology-Vascular/Vasonin"
When using the -auto_linking option, empty folders are sometimes generated as a result of
conflicts between parallel processing threads. These are prefixed by dm_fix_me labels in the folder
names. Such folders can be cleaned out by enabling the -delete_empty_folders option.
Note that the use of the -auto_linking and -state_transition options, in particular, can
increase the processing time required by the tool substantially. If the documents can be migrated
into the relevant folders directly and initialized with the appropriate metadata in advance, it is
better to do so, in order to avoid lengthy processing. If this cannot be avoided, and there are many
documents to be processed, consider increasing the number of processing threads available through
the -max_threads parameter. If multiple Documentum Servers are available, you can enable
distributed processing as follows:
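The exact arguments depend on the environment; a representative invocation with distributed
processing enabled (the thread count and threshold shown here are illustrative) might be:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-max_threads 10 -content_servers "^*" -distributed_processing_threshold 1000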
In this example, assuming there are more than 1000 documents to be processed (the
-distributed_processing_threshold value), the tool splits the processing tasks across all available
servers. Note the use of the caret symbol before the “*” symbol, which prevents the Windows
command-line shell from expanding this symbol into a list of files in the current directory. On UNIX,
use -content_servers "\*" to achieve the same results.
To improve the processing rate, enable the -use_private_sessions parameter (a default value for this
parameter can also be configured in the D2 System Parameters dictionary). If the value of this
parameter is null or undefined, 'false' is taken as the default value.
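A representative Windows invocation of the TMFAdmin tool, mirroring the UNIX example that
follows, is:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-show tmf_contributors | more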
In the preceding example, C:\Users\dmadmin> is the Windows command prompt, indicating the
folder where the scripts are installed. The parameters are the same for UNIX:
$ sh TMFAdmin.sh -docbase_name documentum -user_name dmadmin -password "" -show tmf_contributors
| more
In both cases, the output is piped to the more command to enable it to be scrolled through.
The tool can generate a lot of output if you dump the entire group structure in this way. It is more
useful to focus on a particular subgroup. For example, the groups for clinical trial AMX001 only
can be obtained using the command:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-show tg_amx001 | more
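For example, to validate the documents and folders for a particular product, the tool can be run in
show mode against the corresponding product folder (the folder path shown here is illustrative):
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-show "/Clinical/Cardiology-Vascular/Vasculin" -verbose true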
The output shows that the access control groups on all of the TMF documents and folders for the
product named “Vasculin” are configured correctly. However, the groups themselves may still
need to be refreshed.
In the next example, another product named “AIR” is selected, and this time, the tool reports a
number of issues. To focus on the errors, the -verbose option is turned off:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-show "/Clinical/Cardiology-Vascular/AIR" -verbose false
The output shows that there are 95 documents and one folder that must be repaired, and one
redundant group “tg_A001” that should be deleted. These issues were caused by bulk DQL updates
to the repository, which deleted the trial “A001”, and by changes to the TMF configuration in D2 that
affected the way the external roles and folder levels are interpreted. To fix this, run the tool again in
repair mode, as follows:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-repair "/Clinical/Cardiology-Vascular/AIR" -verbose false
The parameters are the same except that -show has been changed to -repair. The tool then
fixes all documents and folders that need repairing, reapplying D2 security in each case to ensure
the access controls are corrected. You can combine both the -show and -repair arguments in
one operation. For example,
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-repair "/Clinical/Cardiology-Vascular/AIR" -show "/Clinical/Cardiology-Vascular/AIR"
-verbose false
In this case, the repair phase is always carried out first followed by the revalidation
phase. The repair phase uses the same distributed/parallel processing mechanism as the
D2-Configuration Migration tool and supports the same -max_threads, -content_servers,
and -distributed_processing_threshold parameters, enabling potentially very large-scale
repairs to be carried out in extreme scenarios.
To refresh the external user groups for all sites in a specific country, it is only necessary to identify
the relevant Country Registration Form. For example, the following refreshes the groups for trial
“AMX001” and country code “CH” (Switzerland), both at the country level and at the site level:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_country_info where clinical_trial_id = 'AMX001' and country_code = 'CH'"
Likewise, to refresh all site/country groups across all countries for a particular trial, it is only
necessary to identify the relevant Clinical Trial Registration Form. For example,
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where clinical_trial_id = 'AMX001'"
To refresh all groups for all currently-active trials for a particular product, for example, “Vasonin”,
use:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where product_code = 'Vasonin' and a_status = 'Active'
and use_dynamic_access_ctrls = true"
Note: This can potentially involve a lot of processing if there are many active trials in the system,
and it should be used with caution. The tool utilizes the distributed/parallel processing mechanism
when a large number of groups must be updated; this can be tuned through the same -max_threads,
-content_servers, and -distributed_processing_threshold parameters described earlier.
The same effect can be brought about by refreshing the relevant Clinical Trial Registration Form with
the -reset option enabled as described previously, which is the preferred technique, as this ensures
the correct groups are purged. However, it may occasionally be necessary to revoke external user
access to all trials for a particular product. The most efficient way to do this is to purge the relevant
product group. For example, for the product “Vasonin”:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-purge "pg_vasonin"
The relevant trial, country, and site registration forms may also need to be deactivated to prevent
them from being refreshed and reinstating external user access. The following DQL query can be
used to achieve this:
IDQL> update cd_common_ref_model objects set a_status = 'Inactive'
where r_object_type in ('cd_clinical_trial_info', 'cd_clinical_country_info',
'cd_clinical_site_info') and product_code = 'Vasonin'
The -delete option can be used where the group hierarchy needs to be rebuilt, for example, if
the group structure has been reorganized. This is usually only necessary for migration/upgrade
purposes. It is possible to rebuild the entire TMF group hierarchy using the following technique:
• Delete the top-level tmf_contributors group.
• Run the tool in repair mode against the entire /Clinical cabinet. This rebuilds all of the required
groups, although they will not be populated at this stage.
• Refresh all active trials to repopulate the groups.
• Verify that the groups have been created successfully.
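For example, running the tool in repair mode against the entire /Clinical cabinet (the second step of
the rebuild technique described above) might look like this; the exact arguments may vary by
environment:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-repair "/Clinical" -verbose false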
This command should only be used as a last resort to rebuild the access control groups, as it is likely
to involve very substantial processing, especially in mature repositories.
• Automatic creation of the associated Regulatory Application Registration Forms (RARFs) is not
supported. These must be created as part of the migration process prior to indexing, with the
relevant metadata, as required.
• Automatic metadata inheritance from submission folders to the associated subfolders and
documents and to the SRF itself, including metadata extracted from eCTD XML backbone or
regional XML files, for example, application numbers, submission dates, and so on. This includes
an option to merge the extracted metadata back onto the RARF, for example, for new dosage
forms, dosage strengths, and so on. Alternatively, the submission-level metadata can just be
compared against the RARF and any discrepancies logged as warnings.
• Report generation, including a summary of the indexing process for each application (similar to
the log file generated by the existing submission import method), and optionally, a zip file that
contains further details which may be useful for validation purposes.
The parameters supported by the tool are described subsequently in this document, and are also
documented in the Developer JavaDocs for the SSV-Methods.jar package, which is the definitive
reference guide.
This creates the required dm_method object with timeout set to 86400 seconds (24 hours) by default.
Adjust the timeout parameters if necessary.
Note: Distributed processing is only supported on LSSSV version 16.4 onwards. This enables a
master process to be invoked on one server, which will automatically distribute processing tasks to
other available servers. Distributed processing takes place at the regulatory application level, so it
is only useful when indexing submission folders for multiple applications, and when more than
one Documentum Server is available.
To emulate distributed processing on previous versions of LSSSV, the tool can be installed and
scripted to run concurrently on multiple Documentum Servers, provided that each instance indexes
submissions for a distinct set of regulatory applications, so that they do not overlap and conflict
with one another. However, a master script that can launch remote server batch processes is not
provided by default.
Note: The -apply_d2_configs option is not supported on previous versions of LSSSV (versions 4.1,
4.2, or 4.3) and must be explicitly disabled on those versions (with -apply_d2_configs false). This
means that if the -create_missing_srfs option is enabled, any new SRFs generated by the indexing tool
will be created in the specified -temp_repository_folder location (or the “/Temp” cabinet by default)
and will not have D2 autonaming, autolinking, and security applied to them. Likewise, any existing
SRFs that are updated by the indexing tool (for example, to synchronize the properties with regional
XML metadata) will not have D2 auto-naming reapplied. However, the ApplyD2Configurations
command-line utility can be used to post-process these SRFs.
On UNIX, running the IndexSubmissionFolders.sh script without any parameters, or with the
help parameter, has a similar effect:
$ sh IndexSubmissionFolders.sh help
The following table lists the parameters that can be passed when running the IndexSubmissionFolders
script.
Parameter Description
-docbase_name Specifies the name of the repository to connect
to. Required.
-user_name The Documentum user name to use to connect
to the repository. The user account must
have Documentum superuser privileges. It
is recommended that the installation owner
account be used, since a password does not
need to be specified in that case (trusted
authentication is used if the script is running
on the Documentum Server as the installation
owner). Required.
-password Optional password to use to login to
Documentum, if the installation owner account
is not used. The password is not masked.
-id Object ID of a context object, that is, an SRF,
RARF, or PRF associated with the submission or
eCTD sequence folders to be indexed.
-submission_folders DQL "select" statement identifying the top-level
submission folders in the repository containing
the documents to be indexed. Where specified,
the query must return the r_object_id values of
the top-level submission folders to be indexed
only (that is, the eCTD sequence folders or
top-level non-eCTD submission folders) in the
appropriate submission number or sequence
number order.
-required_attrs Comma-delimited list of mandatory attributes
that must be defined as non-null or non-blank
values on each submission folder, in addition to
the standard required attributes, which are:
• product_code
• application_description
• application_num
• application_type
• region
• country_code
• health_authority
• is_ectd_submission
• submission_number
• submission_type
-apply_d2_configs If true, D2 autonaming, autolinking, and security
are applied to newly-created SRFs and re-applied
to modified SRFs. Otherwise, any new SRFs are
created in the temporary repository folder, with
the default object name set to <application-no>
- <submission-no> - <submission-type> or
<application-no> - Current - Cumulative (the
"ApplyD2Configurations" utility can be used to
process these subsequently).
-custom_rarf_inherited_attributes Comma-delimited list of custom attributes to
be inherited from the RARF to the imported
submission folders, subfolders, and documents
as default values, in addition to the standard
RARF-inherited attributes, as and where
applicable. Optional.
• application_status
• submission_procedure_type
• concerned_member_states
• submission_date
• submission_status
• submission_pub_path
• product_chemical_names
• drug_substance_name
• drug_substance_manufacturer
• drug_product_manufacturer
• dosage_form
• dosage_strength
• indication
• product_generic_name
• product_trade_name
• product_trade_country
• inn_names
• product_compound_id
-override_rarf_attributes Whether or not the attributes are to be updated
automatically on the RARF if they differ from
the submission-specific values extracted from
the eCTD backbone or regional XML files.
In either case, the discrepancies between the
currently-defined RARF attributes and those of
the imported submission are logged.
-out_of_sequence If true, the eCTD sequence folders are not
necessarily specified in sequence number order,
and may have other following sequences that
have already been indexed which are not
included in the sequences to be indexed in this
batch. In that case, the submission views for the
following sequences may need to be updated, if
they contain “modified-file” references to files
in the sequences to be re-indexed. Note that
this option only applies to eCTD sequences that
have already been at least partially indexed. If
all sequences are included in the appropriate
order, and none have yet been indexed (or the
-reindex option is enabled), this option should
be disabled, to eliminate redundant processing.
-process_pdf_links Determines how submission documents with
PDF files as their primary content format should
be processed, as follows:
-index_documents_and_folders Whether or not the individual submission
documents and folders are also to be indexed.
If true, metadata inherited from the RARF or
SRF, or that which is extracted from eCTD XML
backbone or regional M1 XML files, is applied
to the submission documents and folders in
the repository. Not only does this facilitate
searching, but also it enables eCTD lifecycle
operations, such as modified-file references,
to be resolved correctly. XML cross-reference
renditions are also attached to each document
to enable selected documents to be opened
in LSSSV in the relevant submission view.
If the -reindex option is also enabled, the
existing Xref renditions (if any) are replaced.
Otherwise, they are added only where they
are currently missing. Indexing of submission
documents and folders may be required when
migrating legacy submissions to ensure that the
document or folder properties in the repository
are consistent with the top-level submission
folders or SRFs, and to enable eCTD lifecycle
operations to be displayed and navigated
correctly in the Submission Viewer. This is
usually a one-off requirement at migration time.
After the submission documents and folders
have been indexed, this metadata is generally
static, and subsequent re-indexing is not usually
required unless the submission itself has been
re-imported with revised metadata.
-delete_empty_folders Whether or not empty subfolders should be
deleted from the repository prior to indexing, so
they do not appear in the Submission Viewer.
-rerender_views_only If true, the existing Submission Viewer XML
models for previously imported or indexed
submissions are assumed to be up-to-date
and are not regenerated. However, the
precached HTML files that provide the views
are regenerated using the latest XSL stylesheets.
Use this option to update the submission views
when new versions of the LSSSV XSL stylesheets
are installed but no other changes are required.
Note: This option overrides all other settings
and cannot be used in conjunction with
the -create_missing_srfs, -override_rarf_attributes,
-out_of_sequence, -process_pdf_links,
-index_documents_and_folders, -delete_empty_folders,
or -attach_dump_files options. The -reindex option is
implied when this setting is used. However, submission
folders that have not yet been indexed are not
processed.
-attach_dump_files If true, additional diagnostic files are generated,
zipped, and attached to the RARF for each
application as additional renditions. In each
case, the zip file contains the following dump
files, as and where applicable:
-max_recorded_errors_per_file Limits the number of errors recorded in the
repository for each document to the specified
number, to prevent swamping where files have
many errors (for example, numerous broken
hyperlinks). If the limit is exceeded, a summary
message is recorded, indicating the number
of additional errors omitted; however, all of
the errors are recorded in the import log file
attached to the RARF.
The following additional parameters can be used to control distributed or parallel processing:
-max_threads The number of threads to use for local parallel
processing of submission folders for distinct
regulatory applications on each server. The
default is 5.
Note: The submission folders for each
application are processed serially by one master
thread, in the appropriate order. As each
submission or eCTD sequence is indexed, the
changes are committed to the repository within a
database transaction. This protects the integrity
of the submission view models and enables the
process to be restarted in an incremental fashion
if necessary.
-max_file_processing_threads Specifies the number of parallel processing
sub-threads to use for local file processing of
each submission folder including XML file
processing and PDF hyperlink processing,
where enabled.
-distributed_processing_threshold Denotes the minimum batch size for distributed
processing in multi-node environments. Batches
that are smaller than the specified size are always
processed locally, avoiding the overheads in
invoking distributed processes when there are
relatively few repository objects to be processed.
If set to 0, distributed processing is disabled (the
default setting).
-shared_folder Specifies an optional network folder path or
dm_location object name of a shared folder to
use for transmitting data efficiently between
remote processes when distributed processing
is used. The shared folder must be accessible
from each Documentum Server in read/write
mode. If undefined, process data is transmitted
via binary objects stored temporarily in the
Documentum repository. This is not as efficient
but obviates the requirement for a shared folder
to be established in the Documentum server
environment.
It is possible to configure the default settings for these parameters in the System Parameters dictionary
in D2-Config to suit the environment, rather than specifying them explicitly on the command line.
The exception is -max_file_processing_threads, which is not currently supported by the System
Parameters dictionary as it is specific to this method.
The following is an example of a Windows command that indexes all submissions below the
/Submissions/Legacy folder:
C:\Users\dmadmin> IndexSubmissionFolders.bat -docbase_name documentum
-user_name dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy',
descend) order by application_description, submission_number"
-reindex true -create_missing_srfs true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true
-override_rarf_attributes true -attach_dump_files true
In the above, C:\Users\dmadmin> is the Windows command prompt, indicating the folder where
the scripts are installed. The parameters are the same for UNIX/Linux:
$ sh IndexSubmissionFolders.sh -docbase_name documentum -user_name
dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy',
descend) order by application_description, submission_number"
-reindex true -create_missing_srfs true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true
-override_rarf_attributes true -attach_dump_files true
To process submissions incrementally (skipping those that have already been indexed), disable or
remove the -reindex argument.
1. Legacy submissions can reside anywhere in the repository. However, it is useful to organize
them into a specific cabinet or folder so that they can be identified easily; for example,
/Submissions/Legacy. They can co-exist with submissions that have been imported by
LSSSV, which in the standard installation are auto-filed as follows:
a. European submissions using the National submission procedure:
/Submissions/<product_code>/<region>/<country>/<application_description>
b. European submissions using the Mutual Recognition Procedure (MRP):
/Submissions/<product_code>/<region>/Reference Member State/<country>*/<application_description>
/Submissions/<product_code>/<region>/Concerned Member State/<country>*/<application_description>
*Cross-linked to each applicable country folder.
c. All other submissions:
/Submissions/<product_code>/<region>/<application_description>
2. Each submission or eCTD sequence should reside in a self-contained folder, directly below
the regulatory application level folder. It is recommended that the submission number or
eCTD sequence number is used for the top-level folder name for consistency with LSSSV. The
subfolders and documents within each submission or sequence folder should reflect the content
of the original submission. However, it is not necessary to include the Windows file extensions in
document object names (such as “.pdf” suffix) as this is implied by the content format code in
Documentum, which should be set appropriately in each case, for example, “pdf” for Acrobat
PDF files; “xml” for XML files, and so on. The original submitted files must be provided as the
primary content file of each document in Documentum. Additional preview renditions may be
generated at indexing time for internal use by LSSSV, but these are not considered to be part of
the original submission. They are not exported to the user’s desktop when the D2 “export folder”
function is used (only the primary content files are exported in that case).
3. The object types used for legacy submission folders and documents should be converted to the
appropriate LSSSV object types if necessary, as follows:
a. The top-level submission folder for an eCTD sequence should be converted to
cd_ectd_submission_folder, and the top-level submission folder for a non-eCTD electronic
submission (NeeS) should be converted to cd_nees_submission_folder.
b. In both cases, all subfolders below the top-level submission or sequence folder should be
converted to cd_submission_subfolder.
c. All submission documents should be converted to cd_submission_element.
This can be accomplished through a series of “change object” DQL statements.
Application-level folders (and above) can be standard dm_folder object types, as they do not
require any specific metadata in LSSSV. It is recommended that the application_description
value for the corresponding RARF be used for application-level folder object names for
consistency with LSSSV. It is also recommended that the submission number or eCTD sequence
number is used for top-level submission or sequence folder object names.
4. The minimum required metadata should be populated for each top-level submission or sequence
folder, as follows:
• product_code
• application_description
• application_num
• application_type
• region
• country_code
• health_authority
• is_ectd_submission
• submission_number
• submission_type
When indexing eCTD sequences, some of the required metadata may be specified in regional
XML files, such as application_number, application_type, submission_type and submission_date,
which may be difficult to extract at migration time as it depends on the XML DTD format used in
each case. However, the indexing tool supports automatic extraction of metadata from eCTD
XML files, just as for submission import operations in LSSSV. If the XML-extracted metadata or
sequence folder metadata differs from the SRF, the discrepancies are logged as warnings, and
the SRF is updated accordingly. It is also possible to apply the changes to the RARF as well,
through the -override_rarf_attributes setting. Thus, at migration time, the relevant fields can
be populated on the RARFs and top-level sequence folders with dummy values initially, for
example, application_num = “TBD”, to be replaced at indexing time with the appropriate values
from the regional XML files. However, in the case of non-eCTD electronic submissions (NeeS), all
of the required metadata must be provided on the RARF and top-level submission folders, since
there are no standardized XML files for these.
It is not strictly necessary to populate this metadata on the submission subfolders and
individual submission documents, as this can be done at indexing time through the
-index_documents_and_folders option.
5. The relevant PRFs and RARFs must be created in the repository for each legacy submission
or eCTD sequence, with the appropriate metadata, prior to indexing. However, it is
not necessary to create SRFs as these can be generated automatically at indexing time
by enabling the -create_missing_srfs option. The SRF metadata is populated from the
submission folder metadata in that case. It is also not necessary to link the RARF to the
corresponding application-level folder containing the legacy submission / sequence folders,
as this is done automatically at indexing time if necessary. Submission folders with invalid
application_description values cannot be indexed. These are logged as errors and skipped at
indexing time. However, it is acceptable for submission or sequence folders to refer to RARFs that
are in the “Inactive” state. In that case, LSSSV does not allow additional submissions / sequences
to be imported until the Regulatory Managers re-activate the application by reverting the RARF
to the “Active” state. This enables legacy submissions to be locked down as archived copies.
regional XML files into the appropriate values expected by LSSSV. For more information on these
procedures, see the Documentum for Life Sciences Administration Guide.
Example
The following example demonstrates how submissions can be imported from the local file system
into Documentum as standard dm_document and dm_folder object types initially, which are then
converted and prepared for indexing. In practice, these steps would normally be carried out as part
of the submission migration process.
In this example, the DQL Tester tool is used to import the sample submission folders, although
migration tools such as the Enterprise Migration Appliance (EMA) may be more suitable in practice.
Note: It is recommended that the repository is set up to use parent folder ACLs for new repository
objects by default, instead of the user’s default ACL. Ensure the top-level cabinets have the correct
ACLs assigned to them, so that any new sub-folders created during the migration process inherit
the same ACLs by default. For more information, see the Documentum Server Administration and
Configuration Guide.
1. Create the necessary PRFs, QPRFs, and RARFs for each distinct application for the submissions
to be processed. One way to accomplish this is to use DQL statements to create the registration
form objects in the “Temp” cabinet initially, and then use the “Apply D2 Configurations” utility to
apply D2 auto-linking and security to them, in order to build out the folder structure and move
the registration forms into the relevant folders in accordance with the D2 configurations.
For example, to create a new RARF, you could use a DQL statement such as:
create cd_reg_admin_info object
set object_name = 'ExternalRARF01',
set product_code = 'WonderDrug',
set application_description = 'ExternalRARF01',
set is_ectd_submission = true,
set application_num = 'TBD',
set application_type = 'CTA',
set region = 'EU',
set country_code = 'FR',
set health_authority = 'AFSSPS',
set a_status = 'Active',
append form_managers = 'cd_admingroup',
append form_users = 'docu',
link '/Temp'
Then, run the Apply D2 Configurations utility with the appropriate arguments:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "cd_reg_admin_info where
folder('/Temp')" –auto_naming false –auto_linking true –security true
It is not necessary to apply D2 auto-naming here because the RARF object name is set explicitly
to the application_description value, following the standard D2 naming conventions. When
applying D2 configurations to documents, it is recommended to apply D2 auto-naming with the
-new flag enabled, even if the document object names are set up correctly, in order to initialize
any sequence counters used in the D2 auto-naming rules such as document_id values. However,
sequence counters are not used in the standard configurations for registration forms.
It is not strictly necessary to create SRFs for each submission or eCTD sequence folder, as
these can be created automatically at indexing time through the -create_missing_srfs option.
However, doing so can facilitate application of the correct metadata and role-based security to
the submission folders and documents, as this can be inherited from the SRFs in each case,
as and where appropriate.
If the relevant target folders can be identified and created in advance as part of the migration
process, the new registration forms can be linked directly to the relevant folders at creation
time (for example, /Regulatory Application Library/WonderDrug/EU in this case),
obviating the need to apply D2 auto-linking in the above. This is recommended for large-scale
migration exercises where it is feasible to do so, as it is much more efficient than applying D2
auto-linking en masse.
2. Create the top-level folders and sub-folders in the repository into which the sample
submissions will be stored, including folders for each distinct application, for example,
/Submissions/Legacy/WonderDrug. These can be created as standard dm_folder object
types; use the application_description value as the application folder name, following
the standard LSSSV folder naming conventions. Note that if a regulatory application has
both NeeS and eCTD sequences associated with it, it is necessary to create two RARFs –
one for the NeeS folders, and a separate one for the eCTD sequence folders, with distinct
application_description key values, although the RARFs can refer to the same application
number – and to create two corresponding application-level folders, accordingly; for example,
/Submissions/Legacy/WonderDrug (eCTD) and /Submissions/Legacy/WonderDrug
(NeeS).
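For example, an application-level folder could be created with a simple DQL statement such as the
following (illustrative only; folders created by a migration tool or through D2 work equally well):
create dm_folder object
set object_name = 'WonderDrug',
link '/Submissions/Legacy'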
3. Import the sample submission folders into the appropriate application-level folders as necessary,
using the standard Documentum dm_folder and dm_document object types initially, to simulate
migration from a legacy repository that does not use the LSSSV object types.
Using the DQL Tester utility, this can be accomplished as follows:
a. Launch DQL Tester and connect to the repository as the Documentum installation owner
(for example, dmadmin).
b. Open the Docbase Browser tool and navigate to the top-level folder into which the
submissions are to be imported.
c. Right-click on the top-level folder and select Import File.
d. In the File Importer dialog box, click Folder, and select the relevant file system folder
containing the submission folders or eCTD sequence folders to be imported. Verify that the
list of files to be imported is correct.
e. In the Object Name list, select Use File Name.
f. Ensure that the Create Folder Structure option is selected.
4. Verify that the primary content format (a_content_type) settings for each document are set to
the appropriate Documentum content format names in each case. For example, DQL Tester
assumes that the most recently-created dm_format object with the appropriate DOS extension
is the one that applies to each file, which can cause incorrect content type assignments. For
example, for XML documents, a_content_type may be set to “d2_bin_dump” instead of “xml”,
and for Acrobat PDF documents, it may be set to “pdflink” instead of “pdf” (this can be verified
by clicking on imported files of various types in the Docbase Browser and checking the Content
Type settings). To fix these, use a bulk DQL update statement such as:
update dm_document objects
set a_content_type = 'xml'
where folder('/Submissions/Legacy/WonderDrug', descend)
and a_content_type = 'd2_bin_dump'
go
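A similar statement, following the same pattern, can correct PDF documents that were imported with
the wrong format (assuming they were assigned the "pdflink" format, as described above):
update dm_document objects
set a_content_type = 'pdf'
where folder('/Submissions/Legacy/WonderDrug', descend)
and a_content_type = 'pdflink'
go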
5. Convert the object types to the types expected by LSSSV using a series of DQL statements, as
follows:
a. First, convert all folders and subfolders below the application-level folder from dm_folder to
cd_controlled_folder object types (the common parent folder type used for all submission
folder types in LSSSV). For example:
change dm_folder object to cd_controlled_folder
where folder('/Submissions/Legacy/WonderDrug', descend)
b. Next, change the top-level submission folders immediately below the application-level folder
(only) from cd_controlled_folder to cd_submission_folder:
change cd_controlled_folder object to cd_submission_folder where
folder('/Submissions/Legacy/WonderDrug')
c. Next, change the same top-level folders to either cd_ectd_submission_folder or
cd_nees_submission_folder. In this case, the RARF is for an eCTD, so we use:
change cd_submission_folder object to cd_ectd_submission_folder
where folder('/Submissions/Legacy/WonderDrug')
d. Next, change the submission folders below the submission or sequence-level folders from
cd_controlled_folder to cd_submission_subfolder:
change cd_controlled_folder object to cd_submission_subfolder where
folder('/Submissions/Legacy/WonderDrug', descend)
Note: This does not include the top-level submission or sequence folders, which are no
longer cd_controlled_folder object types, but sub-types of that type.
e. Finally, change the submission documents from dm_document to cd_submission_element:
change dm_document objects to cd_submission_element where
folder('/Submissions/Legacy/WonderDrug', descend)
6. Set up the minimum required metadata at the top-level submission or sequence folder level:
update cd_ectd_submission_folder objects
set product_code = 'WonderDrug',
set application_description = 'A1234567',
set submission_number = object_name,
set submission_type = 'Initial MAA',
set is_ectd_submission = true,
set application_num = 'TBD',
set application_type = 'CTA',
set region = 'EU',
set country_code = 'FR',
set health_authority = 'AFSSPS'
where folder('/Submissions/Legacy/WonderDrug')
Note:
• This assumes that the top-level submission folder names are the same as the corresponding
submission or sequence numbers, following the LSSSV folder naming conventions.
• It is only strictly necessary to apply this metadata to the top-level submission or sequence
folders, as the indexing tool supports an -index_documents_and_folders option, which can
be used to synch-up the metadata on the relevant submission folders and documents at
indexing time.
• For eCTDs, some of this metadata can be extracted from regional XML files at indexing time
(for example, application_num, submission_type, and so on) in which case the relevant
values can be set to “TBD” or “Pending” initially; they are overridden when the submissions
are indexed. For NeeS, or where the metadata is not defined in the eCTD regional XML files,
the values must be specified explicitly in advance, or copied from the relevant RARF.
7. To set up role-based security on the submission folders, subfolders, and documents:
a. For the submission folders and subfolders, set the folder_managers and folder_readers roles to
the appropriate users and groups:
update cd_controlled_folder objects
truncate folder_managers,
append folder_managers = 'cd_admingroup',
truncate folder_readers,
append folder_readers = 'docu'
where folder('/Submissions/Legacy/WonderDrug', descend)
This assumes that the legacy submissions are available to all users across all domains as
read-only artifacts, and that the Controlled Document Administrators (cd_admingroup) are
responsible for managing access to them. Adjust these groups as necessary. It also assumes
that the same role-based security applies to all folders and documents in the submission. If
necessary, it can be refined for specific folders within a submission in top-down fashion.
b. For the submission documents, set the doc_coordinators and readers roles to the appropriate
users and groups, and also set the document control category to 3 and the lifecycle state (a_status
value) to Effective:
update cd_submission_element objects
truncate doc_coordinators,
append doc_coordinators = 'cd_admingroup',
truncate readers,
append readers = 'docu',
set category = 3,
set a_status = 'Effective'
where folder('/Submissions/Legacy/WonderDrug', descend)
c. Run the Apply D2 Configurations utility with the relevant parameters to set up D2 role-based
security on these submission folders, subfolders, and documents:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "dm_sysobject where
folder('/Submissions/Legacy/WonderDrug', descend)" -auto_naming
false -auto_linking false -security true
Note: An alternative approach is to configure the default roles on the RARF in advance
and then invoke the Update Security lifecycle state transition on the RARF. This can be
accomplished by running the Apply D2 Configurations utility:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "cd_reg_admin_info
where application_description = 'A1234567'" -auto_naming false
-auto_linking false -security false -state_transition "Update
Security"
This mimics the effect of an Application Manager changing the default submission security
at the application level, causing the changes to be cascaded down to the related SRFs,
submission folders, and documents. However, this technique only works if SRFs already exist
for the relevant submission or eCTD sequence folders, so it can only be used after indexing,
or where the SRFs are created explicitly as part of the migration process prior to indexing.
8. The submission folders and documents should now be ready for indexing. Run the indexing tool
to process them with the appropriate parameters:
C:\Users\dmadmin> IndexSubmissionFolders -docbase_name documentum
-user_name dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy', descend)
order by application_description, submission_number"
-create_missing_srfs true -index_documents_and_folders true
-override_rarf_attributes true -process_pdf_links all
-max_file_processing_threads 1 -attach_dump_files true
This indexes all submissions or sequences for all applications in the /Submissions/Legacy
folder that have not already been indexed (if necessary, add -reindex true to the command line
to force them to be reprocessed). You can increase the number of file processing threads to
improve the processing throughput, and if multiple applications are involved, the standard
CDF distributed or parallel processing options can also be used to reduce the overall indexing
time, as described in the preceding section.
The primary content format (rendition number 0) would be xml if the RARF has been indexed (this
provides the internal table of contents view model, with links to each SRF), and there is an html
rendition for this, comprising the pre-rendered HTML table of contents view displayed by the
Submission Viewer (obviating the requirement for the browser itself to render the XML into HTML
on demand, which has been found to be unacceptably slow on some browsers, especially when
viewing large submissions).
The crtext or text rendition (depending on whether the Documentum Server is running on
Windows or UNIX) contains a summary of the most recent indexing operation for each application.
Double-click it to view it in Notepad.
Any errors or warnings logged in this file should be investigated, in case they are significant. In
most cases, warnings relate to incorrect or missing metadata or PDF cross-reference links that could
not be resolved.
If the -attach_dump_files option is enabled at indexing time, an additional zip file rendition is
attached to the RARF, containing a set of dump files in Comma-Separated Values (CSV) format:
These CSV files can be opened in Microsoft Excel, providing further details about the indexing of
specific submissions or eCTD sequences. The Submission Metadata Info files, for instance, tabulate
the Documentum metadata associated with each folder and document for a particular submission or
eCTD sequence folder.
This information can be useful for verifying that the correct metadata has been identified in each case,
including metadata extracted from eCTD XML files or inherited from the RARF or SRF, as well as
metadata defined implicitly by the system (for example, eCTD element names). The first data row, for
the relative file path “.”, refers to the top-level submission or eCTD sequence folder itself, and the last
column lists out any indexing errors or warnings for each file, such as broken hyperlinks.
Note: The metadata is not actually applied to the corresponding submission folders and documents in
the Documentum repository unless the -index_documents_and_folders option is enabled. Usually,
this only has to be done once as the metadata is generally static. However, the SRF metadata is always
updated as and where necessary, and likewise the RARF metadata, if the -override_rarf_attributes
option is enabled. Any discrepancies between the current versus expected SRF or RARF metadata
values are logged as warnings.
The PDF Xref Info files contain details about any cross-reference hyperlinks discovered in each
submission or sequence, and whether or not they were successfully resolved:
In the above example, it can be seen that the drug-product.pdf file in the m2/23-qos (Quality
Overall Summary) folder has many cross-reference hyperlinks, including two unresolved links.
On closer inspection of the Target Object Path column, it can be seen that for the two unresolved
links, the target file is specified incorrectly as an absolute path in the PDF file, and not as a relative
path from the folder containing the source PDF file. This is indicative of a publishing error of some
kind. Such links are not generally navigable, even if the files are exported from Documentum to the
user’s desktop.
Note: PDF Xref Info files are only generated for submissions or sequences containing at least one
recognized inter-document hyperlink. Where they are missing, it can be assumed that no hyperlinks
were found, or PDF hyperlink processing was not enabled at indexing time. Also, since Documentum
for Life Sciences 16.4*, inter-document hyperlinks in the PDF bookmark tree (that is, the PDF
“outlines”) are now also processed, just as for hyperlinks represented as page annotations in the PDF
content. In the Xref Info file, bookmark hyperlinks are indicated by a source page number of 0,
whereas page annotation hyperlinks have source page numbers starting from 1.
*Hotfixes or patches may also be available to retro-fit this functionality on previous Life Sciences 4.x
releases. See OpenText My Support for more information.
In the case of eCTDs, for sequence numbers other than 0000 (the initial sequence), an eCTD Modified
File Info file may also be generated, if the sequence contains leaf elements with eCTD lifecycle
operations other than new (for example, replace, append, or delete operations).
In this example, it can be seen that sequence 0002 modifies 6 files submitted in sequence 0000 — 3 files
have new appendices or supplements, and 3 have replacement versions. The last column, Resolved,
indicates whether or not the modified file reference is valid, that is, whether or not the modified file in
the preceding sequence exists in the repository. For resolved references, a dm_relation is created
automatically in the repository between the modified file in the preceding sequence and the new
file in the following sequence, reflecting the eCTD lifecycle operation (just as in the native LSSSV
submission import process), and the links can be navigated in the Submission Viewer.
Note: In the case of eCTD delete operations, a file is not actually provided in the current sequence,
so a dummy placeholder (content-less document) is created in the current sequence folder in the
repository to act as an anchor-point for the dm_relation, and to enable Documentum metadata to be
associated with it. However, these dummy placeholders are only used internally by LSSSV, and are
not included in submission export operations.
If eCTD sequence folders are not indexed in the appropriate sequence number order, it can result
in unresolved modified file references. The links for these will not be navigable in the Submission
Viewer, and the dm_relation objects will be missing. To fix this, the sequences containing the
modified file references need to be re-indexed once the referenced sequences have themselves
been indexed. Enabling the -out_of_sequence option at indexing time forces the indexing tool to
automatically re-index following sequences that have already been indexed. However, this requires
additional processing, so it should only be used when needed (for example, when re-indexing
individual sequence folders). Otherwise, it is best avoided by including a suitable “order by” clause
in the -submission_folders DQL query argument.
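For example, an individual sequence folder could be re-indexed with out-of-sequence handling
enabled using a command along the following lines (the folder path and sequence number are
illustrative):
C:\Users\dmadmin> IndexSubmissionFolders -docbase_name documentum -user_name dmadmin -password ""
-submission_folders "select r_object_id from cd_submission_folder
where folder('/Submissions/Legacy/WonderDrug') and object_name = '0002'"
-reindex true -out_of_sequence true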
If intermediate sequence folders are missing entirely from the repository, it can also result in
unresolved modified file references. In this case, the missing sequences must be imported through
the native LSSSV Submission Import function post-indexing, as and when they become available.
This takes out-of-sequence imports into account and automatically fixes unresolved modified file
references in previously-imported following sequences.
Finally, if an eCTD sequence contains Study Tagging Files (STFs), an eCTD Study Tagging File Info
CSV file is also included in the ZIP, tabulating the metadata extracted from each file:
STFs provide additional metadata relating to non-clinical and clinical study reports. In LSSSV, this
information is used to reorganize the study documents in the relevant study folders to make them
easier to navigate (they are grouped by study, document type and site ID, where specified, instead of
presenting them as a flat list of files) and to manifest the additional study-related metadata in the
Submission Viewer. The CSV file records all of the extracted metadata, and the last column,
Resolved, indicates whether or not the links in the STF (the leaf document references) are valid.
Note: An STF can refer to multiple study documents; in that case, the common study-level metadata
is recorded only for the first document (with leaf number 0), in order to avoid duplication, and to
make it easier to interpret the spreadsheet. Subsequent rows are then written for each additional
study document showing the document-level metadata only in the right-hand columns.
hyperlinks). However, all the errors are recorded in the indexing or import log file attached to the
RARF as a text rendition, for reference.
Chapter 19
Life Sciences Software Development Kit
This section provides an overview of the Life Sciences solution Software Development Kit (SDK).
Overview
The Life Sciences solution Software Development Kit (SDK) enables System Integrators and
Developers to integrate an individual solution such as Documentum for eTMF, Documentum for
Research and Development, or Documentum Submission Store and View with an existing external
application or management system. Developers can develop plugins for the management system in
Java and use the SDK to synchronize the Documentum repository in response to the management
system events.
The SDK exposes basic functions of the solution, such as creating registration forms, through a Java
API and uses standard Documentum Foundation Services (DFS) Web Service calls for integration
purposes. Detailed JavaDocs on the SDK are provided in the javadocs folder of the SDK package.
The JCL productivity layer is packaged as a JAR file that can be included in Java applications.
Although it is possible for client applications to directly call the DFS Web Services through the
SDK, it is recommended that operations be carried out through the JCL to safeguard the application
against future changes to the Life Sciences solution.
• Activation and deactivation of file plans to create or remove placeholders for the expected
documents. File plans can be planned in stages if preferred, and activated in incremental fashion:
an initial trial setup stage, followed by an active trial stage, then a final close-out stage. A rollover
function is provided to enable the next stage to be activated (this can also be configured to occur
automatically if necessary). Alternatively, all defined stages can be activated at once if preferred,
so that stages can be used to break up the overall file plan into manageable units.
• Update and retrieval of progress statistics indicating the current level of completion of a trial (up
to and including the currently-active stage).
• D2 configuration lookup functions for dictionary entries, default value sets, creation matrices,
and taxonomies, to enable country codes to be translated into locale-specific names, default
values to be applied to new repository objects, and so on. Currently, this is restricted to read-only
operations. D2 configuration changes are not supported through the JCL; this must be done
through D2-Config.
• TMF placeholder retrieval/update functions, enabling placeholders in the repository to be
identified and replaced with document content programmatically; for example, to transfer a Site
Management Report (SMR) document from a Clinical Trial Management System (CTMS) into
Documentum for eTMF in response to a particular CTMS event such as SMR review/approval.
• Ability to invoke D2 lifecycle transition actions on TMF documents and placeholders, enabling
the relevant actions to be configured through D2-Config and invoked through the SDK just as
in the D2 Client.
• Registration of external users for access to the TMF documents at the trial, country and site
levels in the appropriate roles, with automatic refresh of the corresponding role groups when
the forms are saved.
• Creation, retrieval, modification, and deletion of Non-Clinical Study Registration Forms
including updating the Group, Subgroup, and other information specific to the registration form.
• Creation, retrieval, modification, and deletion of Non-clinical documents including updating
the metadata of the Non-clinical documents during creation or modification.
• Creation, retrieval, modification, and deletion of Regulatory Application Registration Forms
including updating the Group, Subgroup, and other information specific to the registration form.
• Creation, retrieval, modification, and deletion of Regulatory documents including updating the
metadata of the Regulatory documents during creation or modification.
• Creation, retrieval, modification, and deletion of Quality Project Registration Forms including
updating the metadata during the creation or modification of the registration form.
• Creation, retrieval, modification, and deletion of Regulatory Application Labeling documents
including updating the metadata during the creation or modification of the document.
• Creation, retrieval, modification, and deletion of Regulatory Core Labeling documents
including updating the metadata during the creation or modification of the document.
• Creation, retrieval, modification, and deletion of Regulatory Correspondence documents
including updating the metadata during the creation or modification of the document. This also
includes importing Correspondence email containing attachments.
• Creation, retrieval, modification and deletion of Regulatory Clinical documents including
updating the metadata during the creation or modification of the document.
Chapter 20
Configuration Settings to Improve Performance
This section provides various configuration settings that you can make to improve the performance of
the Life Sciences solutions.
4. In the Job Properties dialog box, select Inactive and click OK.
Deactivating the myInsight Agent job does not affect report generation.
threads at the same time. Enabling this option can improve the performance of process-intensive
operations, such as TMF trial activation, submission import, and so on, at the expense of additional
Documentum Server or JMS resources. A default value for the -use_private_sessions argument can
be configured in the System Parameters dictionary, or it can be specified explicitly in the method
arguments.
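For example, the argument can be passed explicitly when invoking one of these methods from the
command line (mirroring the ApplyD2Configurations examples earlier in this guide):
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-use_private_sessions true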
If allowed, remove the RSA providers from the JRE security file, as this can maximize the benefit of the
parallel handling mechanism in the Life Sciences solutions.
This argument applies to the following server methods:
• ApplyD2Configurations (all solutions)
• ApplyAttributeInheritance (all solutions)
• ImportSubmissionMethod (LSSSV only)
• ReconcileArtifacts (LSTMF only)
• TMFAdmin (LSTMF only)
• UpdateContributorGroups (LSTMF only)
Note: Avoid using max_file_processing_threads and max_import_processing_threads
in the Import Submissions Lifecycle parameters when use_private_sessions is set to true as it
can exhaust the sessions at a faster rate than they are released.
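For example, if one of these methods is invoked through a scheduled job, you can inspect and append the argument with DQL similar to the following sketch. The job name <job_name> is a placeholder for the job that invokes the method in your repository:
SELECT object_name, method_arguments FROM dm_job WHERE object_name = '<job_name>'
UPDATE dm_job OBJECT APPEND method_arguments = '-use_private_sessions true' WHERE object_name = '<job_name>'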
Database Settings
This section provides the different configuration settings that you can make to improve performance
for the following databases that you use in your environment.
SQL Server
As a general rule, the performance on SQL Server can degrade if statistics and fragmentation are not
kept in check. There are a number of tables that are rapidly modified, which make the fragmentation
and index maintenance a key issue. Client Database Administrators (DBAs) must identify their
most volatile tables in the day-to-day operation to determine the best schedule for daily, weekly, or
monthly maintenance.
Recommended settings:
• Increase the memory allocated to SQL Server to 20 GB for small to mid-sized clients
• Run the DB Statistics job as outlined in the Documentum Server Administration and Configuration
Guide.
The following table lists the parameters that you can modify for each of the components:
Component: SQL Server
• Max Degree of Parallelism = 0
• Parameterization = Forced
Component: Documentum Server (server.ini)
• return_top_results_row_based = F
• concurrent_sessions = 1000
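On SQL Server, these settings can be applied with Transact-SQL similar to the following sketch. The repository database name DCTM_REPO is a placeholder; substitute your repository database name:
-- Enable advanced options so the parallelism setting is visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- 20 GB memory recommendation for small to mid-sized clients
EXEC sp_configure 'max server memory (MB)', 20480;
RECONFIGURE;
-- Max Degree of Parallelism = 0
EXEC sp_configure 'max degree of parallelism', 0;
RECONFIGURE;
-- Parameterization = Forced
ALTER DATABASE [DCTM_REPO] SET PARAMETERIZATION FORCED;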
Oracle
Changes to the default SPFILE properties:
• processes = 1000 (2 x concurrent sessions)
• sessions = 500
• open_cursors = 2000
• cursor_sharing = FORCE
• optimizer_mode = ALL_ROWS
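On Oracle, these values can be applied with ALTER SYSTEM statements similar to the following sketch; static parameters such as processes and sessions take effect only after the instance is restarted:
-- Static parameters: written to the SPFILE, applied at the next restart
ALTER SYSTEM SET processes = 1000 SCOPE = SPFILE;
ALTER SYSTEM SET sessions = 500 SCOPE = SPFILE;
ALTER SYSTEM SET open_cursors = 2000 SCOPE = SPFILE;
-- Dynamic parameters: applied immediately and persisted
ALTER SYSTEM SET cursor_sharing = FORCE SCOPE = BOTH;
ALTER SYSTEM SET optimizer_mode = ALL_ROWS SCOPE = BOTH;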
Scheduled Jobs
The dm_LogPurge job cleans up the myInsight TEMP logs. Schedule this job to run regularly so that
the job logs are cleaned from the repository. When a report is generated, myInsight creates a
repository copy of the HTML file, which also needs to be purged. You can purge these copies by
writing a custom job that cleans up the HTML files generated by myInsight reports.
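For example, such a custom job could locate the generated report copies with a DQL query similar to the following sketch; the folder path and format are assumptions that you must adjust to match where your myInsight reports are stored:
SELECT r_object_id, object_name FROM dm_document WHERE FOLDER('/Temp/jobs/myInsight Agent', DESCEND) AND a_content_type = 'html'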
You can also disable unnecessary jobs by running the following commands:
update dm_job object set is_inactive=True where object_name='D2JobWFFollowUpTaskNotif'
update dm_job object set is_inactive=True where object_name='D2JobImportMassCreate'
update dm_job object set is_inactive=True where object_name='D2JobWFReceiveTaskMail'
update dm_job object set is_inactive=True where object_name='D2JobWFSendTaskMail'
update dm_job object set is_inactive=True where object_name='dm_Initialize_WQ'
update dm_job object set is_inactive=True where object_name='dm_QmThresholdNotification'
update dm_job object set is_inactive=True where object_name='dm_QmPriorityNotification'
update dm_job object set is_inactive=True where object_name='dm_QmPriorityAging'
update dm_job object set is_inactive=True where object_name='dm_bpm_XCPAutoTaskMgmt'
update dm_job object set is_inactive=True where object_name='myInsight Agent'
update dm_job object set run_mode=1,set run_interval=60 where object_name='D2JobSubscription'
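To confirm the changes, query the affected jobs, for example:
SELECT object_name, is_inactive, run_mode, run_interval FROM dm_job WHERE object_name IN ('D2JobWFFollowUpTaskNotif', 'D2JobImportMassCreate', 'myInsight Agent', 'D2JobSubscription')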
JMS Configuration
For a mid-size client (1000 to 3000 seat count), run the following command to configure the
Application Server JVM settings:
set USER_MEM_ARGS=-Xms4096m -Xmx4096m -XX:PermSize=64m -XX:MaxPermSize=1024m -Xss256k
-XX:+DisableExplicitGC -Xrs -Xloggc:gc.log -verbose:gc -XX:+UseParNewGC
For a small-size client (100 to 300 seat count), configure the following Application Server JVM settings:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Apache Software Foundation\Procrun 2.0\Tomcat8080\
Parameters\Java
JvmMs = 2048 (Mb)
JvmMx = 2048 (Mb)
JvmSs = 256 (k)
Options =
-XX:PermSize=256m
-XX:MaxPermSize=256m
-XX:+UseParNewGC
-XX:NewRatio=4
-XX:NewSize=256m
-XX:-UseAdaptiveSizePolicy
-XX:SurvivorRatio=8
-XX:MaxTenuringThreshold=0
-XX:+UseConcMarkSweepGC
-XX:+CMSClassUnloadingEnabled
-XX:+CMSPermGenSweepingEnabled
-verbose:gc
-Xloggc:D:/shared/Tomcat8080/tomcat_gc.log
D2 Caching
Open the <App_Server>/webapps/D2/WEB-INF/classes/d2-cache.xml file and add the
following lines before the <cache> tag named xml-cache:
<cache name="object_ids-cache"
maxElementsInMemory="10000"
eternal="false"
overflowToDisk="false"
memoryStoreEvictionPolicy="LFU">
<cacheExtensionFactory class="com.emc.common.java.cache.PropertiesExtensionFactory"
properties="type=d2_documentset, d2_documentset_switch, d2_filter_config,
x3_preferences, x3_image, d2_query_config, d2c_preferences" propertySeparator=";"/>
</cache>
D2 Client Configurations
This section provides the Life Sciences-specific UI configurations and recommendations to improve
performance.
Argument: widget_id
Value: Unique ID for this widget instance. Used for event filtering.
Argument: workspaceView
Value: View (label) in which this widget instance is used, such as Browse. Used to prevent unnecessary downloads.
Argument: active
Value: Whether or not the widget is in the initial view, that is, whether it is active by default.
Argument: autoSelect
Value: true specifies that the widget responds to selection events (for example, standard doc preview).
Appendix A
Configuration Planning Checklist
• Creation Profiles
— Registration forms
• Default values
• Lifecycles
• Workflows
• Rendition requests
Appendix B
Troubleshooting
This appendix provides information about the log files that you can refer to when troubleshooting
issues in the Life Sciences solution.
Log Files
The logs are a combination of D2-specific logs, Life Sciences-specific logs, and supporting
component logs.
D2 Log Files
D2 logs can be accessed at runtime when using the application and at design time when making
configuration changes. The D2 logs on the Application Server machine are:
• C:\logs\D2-config.log
• C:\logs\D2.log
The D2 logs on the Documentum Server machine are:
• C:\logs\D2-CS.log
• C:\logs\D2-JMS.log
You can use both the Documentum Server and Application Server D2 logs to troubleshoot many issues,
including D2 configuration-related issues and workflow and lifecycle-related problems.
Life Sciences Log Files
The installation and configuration log files for the Life Sciences solutions are listed below by component.
Life Sciences Suite
• iHub_LSSuite_UrlUpdate.log
• LSSuite_applyD2Configurations.log
• LSSuite_configImport.log
• LSSuite_Config_backup_export.log
• LSSuite_copy_bpmlibs.log
• LSSuite_copy_serverlibs.log
• LSSuite_dars.log
• LSSuite_eCTDSchema.log
• LSSuite_eSubmission.log
• LSSuite_index.log
• LSSuite_populateroles.log
• LSSuite_populateroles_myInsight.log
• LSSuite_postInstall.log
• LSSuite_preInstall.log
• LSSuite_updateversion.log
• LSSuite_UrlUpdate.log
• LSSuite_virtualDocTemplate.log
Documentum for eTMF
• iHub_LSTMF_config.log
• iHub_LSTMF_UrlUpdate.log
• LSTMF_applyD2Configurations.log
• LSTMF_Config_backup_export.log
• LSTMF_ConfigImport.log
• LSTMF_copy_serverlibs.log
• LSTMF_createregtables.log
• LSTMF_dars.log
• LSTMF_generategroups.log
• LSTMF_index.log
• LSTMF_populateroles.log
• LSTMF_populateroles_myInsight.log
• LSTMF_postInstall.log
Documentum for Quality and Manufacturing
• LSQM_applyD2Configurations.log
• LSQM_configImport.log
• LSQM_Config_backup_export.log
• LSQM_copy_bpmlibs.log
• LSQM_copy_serverlibs.log
• LSQM_dars.log
• LSQM_index.log
• LSQM_populateMDroles.log
• LSQM_populateroles.log
• LSQM_populateroles_myInsight.log
• LSQM_postInstall.log
• LSQM_preInstall.log
• LSQM_updateversion.log
• LSQM_UrlUpdate.log
Documentum for Research and Development
• LSOnlyRD_postInstall.log
• LSRD_applyD2Configurations.log
• LSRD_configImport.log
• LSRD_Config_backup_export.log
• LSRD_copy_serverlibs.log
• LSRD_createregtables.log
• LSRD_dars.log
• LSRD_generategroups.log
• LSRD_index.log
• LSRD_preInstall.log
• LSRD_populateroles.log
• LSRD_populateroles_myInsight.log
• LSRD_postInstall.log
• LSRD_updateversion.log
• LSRD_UrlUpdate.log
• LSRD_virtualDocTemplate.log
Documentum Submission Store and View
• iHub_LSSSV_config.log
• iHub_LSSSV_UrlUpdate.log
• LSSSV_applyD2Configurations.log
• LSSSV_configImport.log
• LSSSV_Config_backup_export.log
• LSSSV_copy_bpmlibs.log
• LSSSV_copy_serverlibs.log
• LSSSV_dars.log
• LSSSV_eCTDSchema.log
• LSSSV_eSubmission.log
• LSSSV_index.log
• LSSSV_populateroles.log
• LSSSV_populateroles_myInsight.log
Content Transformation Services
The default location of the log file on the Content Transformation Services host is the
%CTS_HOME%\logs folder. The Content Transformation Services log files include:
• CTS_Log.txt: Contains the errors and exceptions that are specific to the server.
• <Plug-In>_Log.txt: Contains individual plug-in-related errors and exceptions.
• IMAGE3_log.txt: Contains the errors that are specific to the Image3 plug-in. The Image3
plug-in logs errors when generating storyboards because the PDFStoryBoard plug-in cannot
produce images at a resolution higher than 96 dpi.
Note: If separate logging is enabled, log files can be found in the %CTS_HOME%\docbases
\<docbasename>\config\logs folder.
The Documentum Content Transformation Services Administration Guide provides more information
about the log files.
xPlore
You can view indexing, search, content processing service (CPS), and xDB logs in xPlore
Administrator. To view a log file:
1. In xPlore Administrator, select an instance and click Logging.
2. Click the tabs for dsearch, cps, cps_daemon, or xdb to view the last part of the log. Indexing
and search messages are logged to dsearch.
3. Click Download All Log Files to obtain download links for each log file.
The Documentum xPlore Administration and Development Guide provides more information about
the log files.
Documentum Server
Documentum Server logging and tracing provides information about Documentum Server
operations. This logging and tracing information is recorded in the following files:
• Repository log file: Contains information about root server activities. This file is sometimes
also referred to as the Documentum Server log file.
• Session log files: Contain all information, warning, error, and fatal error messages and, by
default, all SQL commands generated from DQL commands.
myInsight
When a myInsight report is generated manually or as a scheduled task, a log file is created and
placed in the Temp cabinet in the Life Sciences solution. If the myInsight Agent job is run, the log
file is created in the /Temp/jobs/myInsight Agent folder.
The AMPLEXOR myInsight User Guide provides more information about the messages that appear in
the log file.
The generated prints are saved in the <<root_directory>>/tmp/<<object_id of the document>> path if
the log level is set to DEBUG. The application cleans up these files when the log level is
set to INFO.
You can set the debug parameter for the following external widgets:
• WG EXT SSV Leaf Element Viewer
• WG EXT PDF Viewer 1
• WG EXT PDF Viewer 2
• WG EXT Submission History View
• WG EXT LSCI LSS Doc Viewer (Browse)
• WG EXT LSCI LSS Doc Viewer (Initial Browse)
• WG EXT LSCI LSS Doc Viewer (Tasks)
• WG EXT LSCI SSV Doc Compare Viewer 1
• WG EXT LSCI SSV Doc Compare Viewer 2
• WG EXT LSCI Study Tagging File Navigator
• WG EXT LSCI LSS Doc Viewer (QC Index)
Documentum Server
For Documentum Server 7.1 and later, because of JBoss changes, you must make certain configuration
changes to obtain proper D2 Java Method Server log information:
1. Delete the jboss-log4j.xml file from the <dctm_home>\jboss7.1.1\server
\DctmServer_MethodServer\deployments\ServerApps.ear\APP-INF\classes
folder.
2. Create a logback.xml file in the <dctm_home>\jboss7.1.1\server\DctmServer
_MethodServer\deployments\ServerApps.ear folder. You can use the following sample
log file to create your log file.
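A minimal logback.xml sketch follows; the appender file path and the com.emc.d2 logger name are examples that you must adapt to your environment:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<!-- Example rolling file appender; adjust the path for your environment -->
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>C:/logs/D2-JMS.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>C:/logs/D2-JMS.log.%i</fileNamePattern>
<maxIndex>10</maxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>10MB</maxFileSize>
</triggeringPolicy>
<encoder>
<pattern>%date %-5level %logger{20} [%thread] %msg%n</pattern>
</encoder>
</appender>
<!-- Example logger name; replace with the packages you want to trace -->
<logger name="com.emc.d2" level="DEBUG"/>
<root level="INFO">
<appender-ref ref="FILE"/>
</root>
</configuration>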
Content Transformation Services
Separate logging appenders are added to the log4j.properties file for logging the polling and
capability caching information. Refer to the following entries in the log4j.properties file:
• log4j.category.POLLINGAppender=INFO, POLLINGAppender
• log4j.appender.POLLINGAppender=org.apache.log4j.DailyRollingFileAppender
• log4j.appender.POLLINGAppender.File=$C(CTS, PARENT_DIR)\\logs\\Polling_log.txt
• log4j.appender.POLLINGAppender.Append=true
• log4j.appender.POLLINGAppender.layout=org.apache.log4j.PatternLayout
• log4j.appender.POLLINGAppender.layout.ConversionPattern=%d{HH\:mm\:ss,SSS} %10r %5p
[%10t] %-20c - %5x %m%n
• log4j.appender.POLLINGAppender.DatePattern='.'yyyy-ww-dd
• log4j.category.CAPABILITYAppender=INFO, CAPABILITYAppender
• log4j.appender.CAPABILITYAppender=org.apache.log4j.DailyRollingFileAppender
• log4j.appender.CAPABILITYAppender.File=$C(CTS, PARENT_DIR)\\logs\\Capability_log.txt
• log4j.appender.CAPABILITYAppender.Append=true
• log4j.appender.CAPABILITYAppender.layout=org.apache.log4j.PatternLayout
• log4j.appender.CAPABILITYAppender.layout.ConversionPattern=%d{HH\:mm\:ss,SSS} %10r
%5p [%10t] %-20c - %5x %m%n
• log4j.appender.CAPABILITYAppender.DatePattern='.'yyyy-ww-dd
The log files, Polling_log.txt and Capability_log.txt, corresponding to these appenders
contain the logs related to polling and capability caching information respectively. This information
is not logged to the main CTS_log.txt file. The log level can be set to DEBUG for more information
to be captured in the log file.
xPlore
Basic logging can be configured for each service in xPlore Administrator. Log levels can be set
for indexing, search, CPS, xDB, and xPlore Administrator. You can log individual packages
within these services, for example, the merging activity of xDB. Log levels are saved to
indexserverconfig.xml and are applied to all xPlore instances. xPlore uses slf4j (Simple Logging
Façade for Java) to perform logging.
To set logging for a service:
1. In xPlore Administrator, click System Overview.
2. Click Global Configuration.
3. On the Logging Configuration tab, configure logging for all instances. Open one of the services
such as xDB and set levels on individual packages.
To customize the instance-level log setting, edit the logback.xml file in each xPlore instance. The
logback.xml file is located in the WEB-INF/classes folder for each deployed instance war file.
Levels set in logback.xml take precedence over log levels in xPlore Administrator.
Note: Logging can slow the system and consume disk space. In a production environment, run the
system with minimal logging.
Each logger logs a package in xPlore or in your custom code. The logger has an appender that
specifies the log file name and location. DSEARCH is the default appender. Other defined appenders
in the primary instance logback configuration are XDB, CPS_DAEMON, and CPS. You can add a
logger and appender for a specific package in xPlore or your custom code. The following example
adds a logger and appender for the package com.mycompany.customindexing:
<logger name="com.mycompany.customindexing" additivity="false" level="INFO">
<appender name="CUSTOM" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>C:/xPlore/jboss7.1.1/server/DctmServer_PrimaryDsearch/logs/custom.log</file>
<encoder>
<pattern>%date %-5level %logger{20} [%thread] %msg%n</pattern>
<charset>UTF-8</charset>
</encoder>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<maxIndex>100</maxIndex>
<fileNamePattern>C:/xPlore/jboss7.1.1/server/DctmServer_PrimaryDsearch/logs/custom.log.%i</fileNamePattern>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>10MB</maxFileSize>
</triggeringPolicy>
</appender>
</logger>
You can add a custom logger and appender to logback.xml. To capture log entries in the logs
in xPlore Administrator, add the custom logger to a logger family; logger families are defined in
indexserverconfig.xml. This step is optional: if you do not add your custom logger to a
logger family, it still logs to the file that you specify in your appender. Logger families are used to
group logs in xPlore Administrator. You can set the log level for the family, or expand the family
to set levels on individual loggers.
The log levels include TRACE, DEBUG, INFO, WARN, and ERROR. The levels are in increasing
severity and decreasing amounts of information. Therefore, TRACE displays more than DEBUG,
which displays more than INFO.
Life Sciences Server Methods
To enable logging for CDF, LSTMF, and LSSSV server methods, set the logging level to DEBUG on
the Java Method Server by adding the following settings to the log4j.properties file, located
in \Documentum\jboss[version]\server\DctmServer_MethodServer\deployments
\ServerApps.ear\APP-INF\classes:
• log4j.logger.com.documentum.d2=INFO
• log4j.logger.com.documentum.cdf=DEBUG
• log4j.logger.com.documentum.tmf=DEBUG
• log4j.logger.com.documentum.ssv=DEBUG
• log4j.logger.com.documentum.utils=DEBUG
• log4j.logger.com.emc.documentum.ls.utils=DEBUG**
** Only for LSTMF and Life Sciences solution setups where the lock mechanism is used while
creating dynamic groups during trial activation and reactivation.
Life Sciences SDK
The Life Sciences SDK uses the log4j logging framework. You can control the SDK-specific logs by
setting the corresponding log levels, for example:
log4j.logger.com.documentum.ws=DEBUG
myInsight
To troubleshoot issues related to myInsight reports, collect the myInsight logs by placing a
logging.properties file in the WEB-INF/classes folder of the myInsight web application
and restarting the Application Server. The log levels can be set to FINE or FINEST. The following
configuration can be added to the logging.properties file:
handlers = org.apache.juli.FileHandler
############################################################
# Handler specific properties.
# Describes specific configuration info for Handlers.
############################################################
org.apache.juli.FileHandler.level = FINE
org.apache.juli.FileHandler.directory = ${catalina.base}/logs
org.apache.juli.FileHandler.prefix = myInsight.
java.util.logging.ConsoleHandler.level = FINE
java.util.logging.ConsoleHandler.formatter = java.util.logging.SimpleFormatter
Connection Issues
The Life Sciences solution is a web-based application. If the repository does not appear or other
connection issues occur, verify that the required services are running.
D2 Performance Issues
D2 takes a long time to load the workspace, widgets, creation profiles, and other configurations.
Resolution
Follow these steps to improve the performance of the application with respect to the workspace,
widgets, creation profiles, and other configurations.
1. Stop the application server.
2. Update the D2FS.properties file, located in the /WEB-INF/classes folder in D2.war, by
uncommenting the maxResultSetSize line (remove the #). For example:
maxResultSetSize=1000
Because these values are interconnected, it is advisable not to deviate significantly from the
recommended settings.
Appendix C
Visual Representation of Attribute
Cascading in Life Sciences
During document creation or import, a document inherits a set of attributes from the corresponding
registration form. For example, if you create a non-clinical document, the system requires you to
associate the document with a Non-clinical Study Registration Form. The individual document then
inherits a set of product and study information from the registration form.
The following diagrams represent how information is cascaded from the registration forms to the
individual documents in each of the solutions.
Legend:
• TXT — Free Form Text Entry
• DQL — DQL-based set used for the drop-down list
• DQL-prod — DQL-based set tempered by selected product to be used for the drop-down list
• DICT — Dictionary-based set used for the drop-down list
• DICT+FREE — Dictionary-based set used for the drop-down list; unlisted values accepted
• DICT-DQL-prod — Dictionary-based value set through DQL tempered by the selected product
• RO — Read-only value; set earlier
• DQL-RO — Read-only value set through a DQL query on another field
• DQL-Dep — DQL-based set tempered by the first value in the set
• DQL-Dep-RO — Read-only value set through DQL tempered by the first value in the set
• DQL-Dep-Free — DQL-based set tempered by the first value in the set; unlisted values accepted
Appendix D
D2 Configurations
The following table lists the D2 configuration elements that are used by the Life Sciences solutions.
When customizing the D2 configurations to address customer-specific business requirements,
do not rename or remove these configurations from the Life Sciences system. The Life Sciences
server methods use these configurations internally to perform specific business operations.
Geographic Regions
GMP Artifacts
Regulatory-Admin Artifact
States of America
Submission Filestores
System Parameters
TMF Countries
TMF Models
wf_rule_attributes
tmf_progress_history
D2 Taxonomy GMP Artifacts