
OpenText™ Documentum™ for Life Sciences
Version 16.4

Administration Guide
Legal Notice

This documentation has been created for software version 16.4.


It is also valid for subsequent software versions as long as no new document version is shipped with
the product or is published at https://knowledge.opentext.com.
Open Text Corporation
275 Frank Tompa Drive, Waterloo, Ontario, Canada, N2L 0A1
Tel: +1-519-888-7111
Toll Free Canada/USA: 1-800-499-6544 International: +800-4996-5440
Fax: +1-519-888-0677
Support: https://support.opentext.com
For more information, visit https://www.opentext.com
Copyright © 2019 Open Text. All Rights Reserved.
Trademarks owned by Open Text.
Adobe and Adobe PDF Library are trademarks or registered trademarks of Adobe Systems Inc. in
the U.S. and other countries.
One or more patents may cover this product. For more information, please visit
https://www.opentext.com/patents
Disclaimer
No Warranties and Limitation of Liability
Every effort has been made to ensure the accuracy of the features and techniques presented in this
publication. However, Open Text Corporation and its affiliates accept no responsibility and offer no
warranty, whether expressed or implied, for the accuracy of this publication.
Table of Contents

Preface ................................................................................................................................ 15
Chapter 1 Life Sciences Solution Fundamentals ......................................................... 17
Overview ......................................................................................................... 17
Document Domains .......................................................................................... 18
Roles ............................................................................................................... 20
Administrators ............................................................................................. 22
Consumer Import ......................................................................................... 23
Controlled Printers and Issued Printers (LSQM) ............................................. 23
Document Approvers ................................................................................... 24
Document Auditors ...................................................................................... 26
Document Authors ....................................................................................... 27
Document Contributors (LSTMF) .................................................................. 29
Document Coordinators ................................................................................ 29
Document Quality Organization Approvers (LSQM) ...................................... 30
Document Readers ....................................................................................... 31
Document Reviewers .................................................................................... 32
Managers (Product, Project, Trial, Study, Regulatory) ...................................... 32
Global External Participants (LSTMF) ............................................................ 34
Inspectors (LSTMF) ...................................................................................... 34
Investigators (LSTMF) .................................................................................. 34
Quality Check (LSTMF) ................................................................................ 34
Security Groups ........................................................................................... 35
Solution-Specific User Roles .......................................................................... 35
User Roles in Documentum for eTMF ........................................................ 35
Cross-functional User Groups ............................................................... 37
Reporting Groups ................................................................................. 37
External Trial Participant Roles .............................................................. 38
External Trial Participant Groups ........................................................... 39
User Roles in Documentum for Quality and Manufacturing ........................ 40
Cross-functional User Groups ............................................................... 43
Reporting Groups ................................................................................. 44
User Roles in Documentum for Research and Development ........................ 44
Cross-functional User Groups ............................................................... 52
Reporting Groups ................................................................................. 52
User Roles in Documentum Submission Store and View ............................. 52
Correspondence User Groups................................................................ 54
Cross-functional User Groups ............................................................... 55
Reporting Groups ................................................................................. 55
Control Categories ............................................................................................ 56

Chapter 2 Customizing D2-Based Solutions ................................................................ 59


Extending D2-Based Life Sciences Solutions ....................................................... 59
Creating a Custom Application ..................................................................... 59
Modifying or Extending a Base Configuration ................................................ 60
Extending the Base Solution Contexts ............................................................ 60


Upgrading the Customized Solution .................................................................. 61


Reconciling Extended Base Configurations with Solution Upgrades ................. 62
Merging D2 Configuration XML Files ............................................................ 62
Best Practices for Team Development with D2 .................................................... 63
Extending Roles ............................................................................................... 64
Creating Custom Roles ................................................................................. 64
Configuring the Property Pages ..................................................................... 64
Configuring Workflows ................................................................................ 64
Configuring Roles (LSQM) ............................................................................ 65

Chapter 3 Life Sciences Data Model ............................................................................ 67


Data Model Overview ....................................................................................... 67
Add or Modify Attributes in Existing Solution Types .......................................... 69
Create New Types that Extend from Existing Solution Types ............................... 70

Chapter 4 Reference Models ........................................................................................ 71


Standard Solution Implementation of the DIA EDM Reference Model .................. 71
Standard Solution Implementation of the DIA TMF Reference Model
(LSTMF) .......................................................................................................... 74
Standard Solution Implementation of Quality and Manufacturing
Document Models (LSQM) ................................................................................ 76
Standard Solution Implementation of Submission Store and View
Document Models (LSSSV) ............................................................................... 78
Correspondence Domain............................................................................... 78
Implementation of the DIA EDM Reference Model for Research and
Development (LSRD) ........................................................................................ 79
Regulatory Domain ...................................................................................... 80
Clinical Domain ........................................................................................... 81
Non-clinical Domain ..................................................................................... 82
Quality Domain............................................................................................ 84
Promotional Materials Domain ...................................................................... 85
Labeling Domain .......................................................................................... 87
Safety Domain .............................................................................................. 88
Extending Dictionaries .................................................................................. 89
Extending the Standard Reference Model Implementation (LSTMF) .................... 89
Extending Dictionaries .................................................................................. 90
File Plans and File Plan Templates ................................................................. 90
Implementing a Custom Reference Model (LSTMF) ............................................ 90

Chapter 5 Document Creation Profile .......................................................................... 93


Management Creation Profiles .......................................................................... 93
Document Creation Profiles .............................................................................. 94
Control Category Definition .............................................................................. 95
Default Values Template ................................................................................... 95
Customizing Creation Profiles ........................................................................... 97
Creating a New Creation Profile .................................................................... 97
Changing Creation Profile Properties ............................................................. 98
Changing Creation Properties for an Artifact within an Existing
Creation Profile ............................................................................................ 98
Adding Artifacts to an Existing Creation Profile ............................................. 98
Removing Artifacts from an Existing Creation Profile ..................................... 99
Removing or Disabling an Existing Creation Profile ........................................ 99


File Naming and Versioning ............................................................................ 100


O2 Configuration for PDF and Native Annotations ........................................... 100
C2 View Configurations for PDF...................................................................... 101
PDF Rendition Overview ................................................................................ 102
C2 Configurations with Fast Web View Option ................................................. 103
Configuring CTS for C2 Fast Web View ........................................................ 103
Enabling Fast Web View in C2 Configurations .............................................. 104
Electronic Signatures ...................................................................................... 105
Page Sizes and Overlays .................................................................................. 105
Signature Page ........................................................................................... 105
Overlays .................................................................................................... 106

Chapter 6 Registration Forms .................................................................................... 107


Overview and Form Types .............................................................................. 107
Creating Custom Registration Form Types ....................................................... 109

Chapter 7 Attribute Inheritance and Cascading Attribute Rules ................................ 111


Auto-Inherited Attribute Rules ........................................................................ 111
Configuring Auto-Inherited Attribute Rules ..................................................... 113
Configuring Cascading Attribute Rules ............................................................ 119
Extended Attribute Expressions....................................................................... 125
Context User Support in CDF Methods ............................................................ 131
Special Naming Conventions........................................................................... 138
Artifact-based Autonaming and Attribute Lists (LSTMF) .................................. 138
Configuring the Existing Dictionary for Artifact-based Autonaming .............. 139
Preconfigured Cascading Attributes Rules ....................................................... 140
Using a Custom Attribute Inheritance Rule to Reapply D2
Configurations to Selected Objects ................................................................... 148
Extensions to Cascading Attributes and Auto-Inheritance Rules to
Support Auditing ........................................................................................... 151
Extensions to the CDF ApplyInheritedAttributes Method ............................. 152

Chapter 8 Workflows ................................................................................................. 155


Workflow Roles .............................................................................................. 155
Workflow Diagrams........................................................................................ 155
For Collaborative Editing (Categories 1-3) .................................................... 156
Submit for Review and Approval (Category 1) ............................................. 156
Submit for Approval (Category 1) ................................................................ 157
Periodic Review (Category 1) ...................................................................... 158
Withdraw Document (Category 1) ............................................................... 159
Recall Document ........................................................................................ 159
Submit for Review and Approval (Change Request) ..................................... 160
Submit for Approval (Change Request) ........................................................ 161
Submit for Review and Approval (Category 2) ............................................. 161
Submit for Review-Format Approval (Category 2) ........................................ 162
Submit for Approval (Category 2) ................................................................ 163
Expiry Review (Category 2) ......................................................................... 164
Submit for Review (Category 3) ................................................................... 164
Submit for Delegated Approval (Category 3) ................................................ 165
Content Template Approval ........................................................................ 165
Review Ingested Document ......................................................................... 166


Configuring Workflows .................................................................................. 166


Assigning Workflows to Artifacts .................................................................... 167
Modifying the Task Outcome Labels ................................................................ 168
Configuring Workflow Messages ..................................................................... 168
Workflow and Non-Workflow Attributes ......................................................... 168
Configuring Workflow Notification ................................................................. 169
Configuring Workflow Task Follow-up Notifications .................................... 169
Enabling the Submit for Self Approval Workflow Menu Option ......................... 170
Disabling the Self-Approve Lifecycle Menu Option........................................... 170

Chapter 9 Lifecycles .................................................................................................. 173


Document Lifecycle ........................................................................................ 173
Document Lifecycle Models ............................................................................ 177
LSQM Document Lifecycle Models .............................................................. 177
Control Category 0 Documents Lifecycle .................................................. 178
Control Category 1 Documents Lifecycle .................................................. 178
Control Category 2 Documents Lifecycle .................................................. 179
Control Category 3 Documents Lifecycle .................................................. 180
LSTMF Document Lifecycle Models ............................................................. 181
Control Category 2 Documents Lifecycle .................................................. 181
Control Category 3 Documents Lifecycle .................................................. 182
LSRD/LSSSV Document Lifecycle Models .................................................... 183
Control Category 2 Documents Lifecycle .................................................. 183
Control Category 3 Documents Lifecycle .................................................. 184
Using Uniqueness Checks to Validate Transition ............................................... 185
Normal States and Pseudo States ..................................................................... 186
Creating or Modifying a New Lifecycle Configuration ...................................... 186
Custom Business Logic Using Lifecycle Actions................................................ 187

Chapter 10 Security ..................................................................................................... 189


Controlled Document Foundation Security Model ............................................ 189
Permissions .................................................................................................... 189
Permissions in Documentum for eTMF ........................................................ 190
Permissions in Documentum for Quality and Manufacturing ........................ 191
Permissions in Documentum for Research and Development and
Documentum Submission Store and View .................................................... 193
Assignment of Control Categories ................................................................... 195
Role-Based Access Control .............................................................................. 196
Ownership of Category 1-3 Documents ............................................................ 197
Folder Security ............................................................................................... 198
TMF Dynamic Role-Based Access Control (LSTMF) .......................................... 198
External User Registration ........................................................................... 199
Granular Security for Submissions (LSSSV) ...................................................... 200
Security Settings in the Regulatory Application Registration Form................. 201
Security Settings in the Submission Registration Form .................................. 202
Security Settings in the Regulatory Activity Package ..................................... 203
Security Settings on Submission Folders....................................................... 204
Security Settings on Submission Subfolders .................................................. 204
Security Settings on Submission Documents ................................................. 205
Dynamic Security Framework ......................................................................... 205
Layers of the Security Framework ................................................................ 207


Tags ....................................................................................................... 207


Group Generators ................................................................................... 207
Security Model ....................................................................................... 208
Folder Security Model............................................................................. 210
Shared Objects........................................................................................ 210
Creating a Tag Object .................................................................................. 210
Creating a Group Generator ........................................................................ 212
Creating a Folder Security Model ................................................................ 214
Creating a Security Model ........................................................................... 215
Generating the User Groups Manually ......................................................... 217
User Group Maintenance ................................................................................ 218

Chapter 11 Workspaces and Welcome Pages ............................................................. 219


Workspaces .................................................................................................... 219
Workspace Views and Tasks ........................................................................ 219
Workspace Groups ..................................................................................... 223
Workspace Views for Workspace Roles ........................................................ 228
Display Labels ................................................................................................ 233
Welcome Pages ............................................................................................... 233

Chapter 12 Reports ...................................................................................................... 235


Information Hub ............................................................................................ 235
Using the iHub Analytical Designer ............................................................. 235
Connecting the iHub Analytical Designer with the Life Sciences
Repository ............................................................................................. 235
Creating an iHub Report ............................................................................. 242
myInsight ...................................................................................................... 244
Report Widgets .......................................................................................... 245
Report Generation Process .......................................................................... 246
Configuring External Widgets for myInsight Reports .................................... 246
Accessing the Reports ................................................................................. 247

Chapter 13 Change Request ........................................................................................ 249


Disabling Change Request for Category 1 Documents ....................................... 249
Configuring the Change Request Properties ..................................................... 250
Configuring Release Pending as the Final State for Category 1
Documents ..................................................................................................... 250
Binding Rules for Change Requests ................................................................. 251
Configuring the CURRENT Binding Rule for a Change Request ........................ 252

Chapter 14 Controlled Print and Issued Print .............................................................. 255


Cover Page Configuration ............................................................................... 255
Overlay Configuration .................................................................................... 256
Creating the Print Profile................................................................................. 258
Importing Print Profiles .................................................................................. 259
Creating the Overlay PDF Template ................................................................. 259
Mapping the Contexts to the Controlled Print Profiles....................................... 260
Setting Up the Printers .................................................................................... 261
Print Reasons ................................................................................................. 261
Issued Print Site-based Groups ........................................................................ 261
Configuring Auto-Recall ................................................................................. 262


Updating the Print Variables in System Parameters ........................................... 262


Print Number Prefix Configuration .................................................................. 263
Issued Print Virtual Document Publishing Security ........................................... 263

Chapter 15 Virtual Document Templates ..................................................................... 265


Installing the Clinical Study Report VDoc Template .......................................... 265
D2 Configuration for the VDoc Template ......................................................... 266
VDoc Template Approval ................................................................................ 267
Creating a Custom VDoc Template .................................................................. 268
Creating an Instance of a Clinical Study Report Assembly ................................. 271
Review and Approval of VDocs ....................................................................... 272
Versioning of Virtual Documents ..................................................................... 273

Chapter 16 Configuration Tasks .................................................................................. 275


Configuring Credential Enforcement ............................................................... 275
Electronic Signature Signoff ........................................................................ 277
D2 Mailing Configurations .............................................................................. 278
Types of Mailing Configurations .................................................................. 278
Configuring Mailing Configurations ............................................................ 280
List of Task Variables .................................................................................. 280
Message Configuration ............................................................................... 282
Disabling Email Notifications ...................................................................... 284
Auditing Events ............................................................................................. 284
Configuring Audit Events ........................................................................... 285
Configuring Automated Delegation ................................................................. 286
Date-Time UI Implementation ......................................................................... 286
Media Files .................................................................................................... 287
Search Configuration Changes......................................................................... 287
Configuring Cross-Domain Document Sharing ................................................. 287
Configuring Cross-Domain Document Sharing in D2-Config ......................... 288
Configuring Search Criteria ............................................................................. 289
Configuring XML DocViewer to Display PDFs in Excel Format ......................... 290
Hard Delete (LSTMF)...................................................................................... 290
Bulk Import-Export (LSTMF)........................................................................... 292
Lifecycle Model for Document Packages ...................................................... 293
Configuring the Bulk Import-Export Spreadsheet ......................................... 294
Configuring Roles and Permissions for External Participants (LSTMF) ............... 296
Defining External Trial Participant Roles ...................................................... 297
Defining Permissions for External Participant Roles ...................................... 300
Adding an External Participant Role Example .............................................. 301
Configuring Quality Check (LSTMF) ............................................................... 305
Distributed Server Method Processing ............................................................. 306
Enabling Distributed and Multi-threaded Processing .................................... 307
Disabling Parallel Processing for CFD Methods ............................................ 308
Creating a dm_method Object ..................................................................... 309
Enabling the Trace Level of the D2 Core Method........................................... 309
Enabling the “Convert to virtual document” Menu Option ................................ 310
Updating the PDF DocInfo Parameter in D2 Dictionary ..................................... 310


Chapter 17 Regulatory Submissions ........................................................................... 313


Submission Overview ..................................................................................... 313
Electronic Common Technical Document Submission ....................................... 315
Regional XML Files for Other Agencies ........................................................ 315
Additional XSL Style Sheets ........................................................................ 315
Submission History XML Files ........................................................................ 316
Submission Import Progress Monitoring and Error Handling ............................ 316
Diagnostic Files for Submission Import ............................................................ 317
Supporting New eCTD XML Formats .............................................................. 318
XML Schema Configuration Object Settings ................................................. 321
Processing Standard XML Files ....................................................................... 329
Transforming Non-Standard XML Files............................................................ 332
Previewing and Processing eCTD Module 1 Regional XML Files ....................... 335
XML Metadata Extraction ............................................................................... 338
Example 1: US Submission to the FDA ......................................................... 342
Example 2: EU Submission to Multiple EU Countries ................................... 343
Mapping XML Values to Documentum Attributes ........................................ 347
Worked Example: Extending LSSSV to Support the US Regional
2.3 eCTD Format (DTD Version 3.3) ............................................................. 347
Non-eCTD Electronic Submission .................................................................... 360
Submission Filestore ....................................................................................... 360
Updating the D2 and XMLViewer URLs in D2 Dictionary ................................. 361
Configuring the LSSSV Viewer Widget URLs ................................................... 362
Updating the XML Viewer D2 External Widgets ............................................... 371
Processing of PDFs and Inter-Document Hyperlinks ......................................... 372
Resolving Broken Hyperlinks .......................................................................... 373
Study Tagging Files ........................................................................................ 374
Previewing of Media Files ............................................................................... 375
Adding Custom Format Icons ..................................................................... 376

Chapter 18 Migration and Integrity Check Utilities ..................................................... 377


D2 Configuration Migration Tool ..................................................................... 377
Installing the D2 Configuration Migration Tool............................................. 378
Configuring the D2 Configuration Migration Tool ........................................ 379
TMF Admin Integrity Checking and Repair Tool .............................................. 380
Installing the TMF Admin Tool .................................................................... 381
Examining Access Control Groups............................................................... 381
Checking and Repairing the TMF Security Settings for External Users ............ 381
Refreshing TMF Access Control Groups for Registered External
Users ......................................................................................................... 384
Purging and Deleting Specific Groups.......................................................... 385
Using the Index Submission Folders Migration Utility ...................................... 386
Installing and Configuring the Tool ............................................................. 387
Server Method Arguments .......................................................................... 389
Preparing Legacy Submissions for Indexing ................................................. 406
Example..................................................................................................... 409
Validating the Indexing Process ................................................................... 415
Identifying Applications, Submissions, and Submission Documents
with Errors ................................................................................................. 420

Chapter 19 Life Sciences Software Development Kit ................................................... 423


Overview ....................................................................................................... 423


Web Services Integration ................................................................................. 423
Java Client Library ...................................................................................... 424
Supported Functionality in the SDK ................................................................. 424
Supporting External Registration Forms ........................................................... 426

Chapter 20 Configuration Settings to Improve Performance ....................................... 427


D2 4.7 Performance Best Practices .................................................................... 427
Overriding the Cache Time-out ....................................................................... 427
Deactivating the myInsight Agent Job .............................................................. 427
Disabling RSA Security Providers for myInsight Reports .................................. 428
Internet Explorer Browser Settings (LSSSV) ...................................................... 428
Improving Performance of Server Methods ...................................................... 428
Database Settings ........................................................................................... 429
SQL Server ................................................................................................. 429
Oracle ........................................................................................................ 430
Documentum Server Settings .......................................................................... 430
Scheduled Jobs ........................................................................................... 430
Server.ini File Settings ................................................................................. 431
JBOSS Access Log ....................................................................................... 431
Java Server Method Settings ........................................................................ 431
D2 Web Server Settings ................................................................................... 432
JMS Configuration ...................................................................................... 432
D2 Caching ................................................................................................ 432
D2 Client Configurations ................................................................................ 433
Optimizing the D2 GUI Performance ........................................................... 433
Life Sciences Document Preview Widget Configuration ................................ 433

Appendix A Configuration Planning Checklist .............................................................. 435


Appendix B Troubleshooting ........................................................................................ 439
Log Files ........................................................................................................ 439
D2 Log Files ............................................................................................... 439
Life Sciences Log Files................................................................................. 439
Underlying Products Log Files .................................................................... 443
Content Transformation Services ............................................................. 443
xPlore ................................................................................................... 443
Documentum Server .............................................................................. 443
Java Method Server................................................................................. 444
Third-Party Log Files .................................................................................. 444
myInsight .............................................................................................. 444
Enabling Logging, Debugging, and Tracing ...................................................... 444
Configuring Logging for D2 ........................................................................ 445
Configuring Logging for Controlled Print .................................................... 445
Configuring Debugging for Custom External Widgets .................................. 446
Configuring Logging for Underlying Products ............................................. 446
Documentum Server ............................................................................... 446
Content Transformation Services ............................................................. 447
xPlore .................................................................................................... 448
Thumbnail Server ................................................................................... 449
Java Method Server................................................................................. 449
Life Sciences SDK ................................................................................... 450
Configuring Logging for Third-Party Products ............................................. 450


myInsight .............................................................................................. 450


Connection Issues ........................................................................................... 450
D2 Performance Issues .................................................................................... 451
Exporting Large Folders Issue ......................................................................... 451
Static Objects in XMLViewer not Cached .......................................................... 452

Appendix C Visual Representation of Attribute Cascading in Life Sciences ................ 453


Appendix D D2 Configurations ..................................................................................... 459


List of Figures

Figure 1. Product and Project Registration Form Attributes ................................................. 454


Figure 2. Cascading of Attributes in the Clinical Domain .................................................... 454
Figure 3. Cascading of Attributes in the Non-clinical Domain .............................................. 455
Figure 4. Cascading of Attributes in the Regulatory Domain ............................................... 456
Figure 5. Cascading of Attributes in the Safety and Quality Domain .................................... 457


List of Tables

Table 1. Document domains .............................................................................................. 19


Table 2. Top-level domain roles .......................................................................................... 21
Table 3. Objects Types for which Users can Create Documents ............................................. 68
Table 4. EDM reference model Quality domain dictionaries ................................................. 73
Table 5. TMF dictionaries .................................................................................................. 75
Table 6. Taxonomy ............................................................................................................ 77
Table 7. Dictionaries in the Correspondence Domain ........................................................... 78
Table 8. Taxonomies in the Correspondence Domain ........................................................... 78
Table 9. Dictionaries in the Regulatory Domain................................................................... 80
Table 10. Taxonomies in the Regulatory Domain ................................................................... 80
Table 11. Dictionaries in the Clinical Domain........................................................................ 81
Table 12. Taxonomies in the Clinical Domain ........................................................................ 81
Table 13. Dictionaries in the Non-clinical Domain ................................................................. 83
Table 14. Taxonomies in the Non-clinical Domain ................................................................. 83
Table 15. Dictionaries in the Quality Domain ........................................................................ 84
Table 16. Taxonomies in the Quality Domain ........................................................................ 84
Table 17. Dictionaries in the Ad-Promo Domain ................................................................... 86
Table 18. Taxonomies in the Ad-Promo Domain ................................................................... 86
Table 19. Dictionaries in the Labeling Domain ...................................................................... 87
Table 20. Taxonomies in the Labeling Domain ...................................................................... 87
Table 21. Dictionaries in the Safety Domain .......................................................................... 88
Table 22. Taxonomies in the Safety Domain .......................................................................... 88
Table 23. Management Creation Profiles .............................................................................. 93
Table 24. Required default values ........................................................................................ 96
Table 25. Registration form types ....................................................................................... 108
Table 26. Comparisons between D2 inheritance and Auto Inherited Attribute Rules ............. 112
Table 27. Extended attribute expressions ............................................................................ 125
Table 28. Context User Support in CDF Methods ................................................................ 132
Table 29. Workspace views and tasks ................................................................................. 219
Table 30. Workspace Views for LSTMF Workspace Roles..................................................... 229
Table 31. Workspace View for LSQM Workspace Roles ....................................................... 230
Table 32. Workspace View for LSRD Workspace Roles ........................................................ 231
Table 33. Workspace View for LSSSV Workspace Roles ....................................................... 232
Table 34. Object Types for Configuring D2 Mailing ............................................................. 278
Table 35. List of Task Variables .......................................................................................... 281
Table 36. Configuring the Hard Delete feature .................................................................... 291
Table 37. Lifecycle model for document packages ............................................................... 293


Table 38. Configuration planning checklist ......................................................................... 435

Preface

This guide contains information about administering, configuring, and extending the solutions that
are part of OpenText Documentum for Life Sciences. The Life Sciences solution includes:
• OpenText Documentum for eTMF (LSTMF)
• OpenText Documentum for Quality and Manufacturing (LSQM)
• OpenText Documentum for Research and Development (LSRD)
• OpenText Documentum Submission Store and View (LSSSV)
Note: Documentum Content Server is now OpenText Documentum Server. OpenText Documentum
Server will be called Documentum Server throughout this guide.

Intended Audience
This guide is intended for anyone responsible for configuring, extending, or administering any
products in the Documentum for Life Sciences solution. OpenText recommends completion of the
following training prior to using this guide:
• Technical Fundamentals of Documentum
• Composer Fundamentals
• D2 Configuration
• Life Sciences Fundamentals
• Life Sciences Trial Master File (LSTMF)
Information about these training courses is available on the OpenText website.

Revision History
Revision Date Description
April 2019 Updated the content in Configuring the D2
Configuration Migration Tool, page 379.
January 2019 Updated the steps in Connecting the iHub
Analytical Designer with the Life Sciences
Repository, page 235.


Revision Date Description


November 2018 Removed the DIA EDM version number in
the section Data Model Overview, page 67, as
the DIA EDM reference is not the same for all
the domains.
Updated the Group Generators, page 207 section.
Added the section Static Objects in XMLViewer
not Cached, page 452.
September 2018 Removed the section “Submission Error
Reporting” as it is covered in the Documentum
Submission Store and View 16.4 User Guide.
August 2018 Added a step in Configuring Cross-Domain
Document Sharing in D2-Config, page 288.
June 2018 Corrected some of the LSRD and LSSSV user
group names in Roles, page 20 and Workspace
Groups, page 223.
June 2018 Initial publication.

Chapter 1
Life Sciences Solution Fundamentals

This section provides an overview of the OpenText Documentum for Life Sciences solution.

Overview
The Documentum for Life Sciences solution helps organizations meet compliance requirements,
increase productivity, and securely collaborate across the extended enterprise. The Life Sciences
solution includes:
• Documentum for eTMF (LSTMF)
• Documentum for Quality and Manufacturing (LSQM)
• Documentum for Research and Development (LSRD)
• Documentum Submission Store and View (LSSSV)
The Life Sciences solution is built on OpenText Documentum D2 and the Documentum platform.
Individual solutions rely on reusable components provided by two base layers:
• Unified Solution Layer (also referred to as Controlled Document Foundation)
• Life Sciences Foundation
In this architecture, the lower layers have no dependencies on the higher layers. For example, the
Life Sciences Foundation components do not depend on any types, configurations, Java classes, and
so forth defined in the LSQM, LSTMF, LSRD, or LSSSV solution layers. The Unified Solution Layer
components do not depend on any components in the Life Sciences Foundation or specific solution
layers. Conversely, dependencies on lower layer components are encouraged.
D2 configurations are assigned to one of the following applications corresponding to the layered
architecture:
• Documentum for Quality and Manufacturing solution
• Documentum for eTMF solution
• Documentum for Research and Development solution
• Documentum Submission Store and View solution


Document Domains
The Life Sciences solutions provide an extensive inventory for managing the documents relevant to each
functional area within Life Sciences organizations. Documents stored and managed by the Life
Sciences solutions are categorized according to the functional area in which they are used. This
top-level categorization is called the document domain. Every document is automatically assigned
a domain when it is created. The domain assignment at document creation or import time is
accomplished using a D2 Default Values template configuration associated with the creation profile
of the document.
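For example, a simple DQL query can confirm the domain value applied at creation time. This is a sketch only: the type name (cd_quality_doc) is an assumption, and the name of the attribute that holds the domain value should be verified against the data model installed in your repository.

    SELECT r_object_id, object_name, domain
    FROM cd_quality_doc
    ORDER BY r_creation_date DESC
    ENABLE (RETURN_TOP 20)

The query returns the twenty most recently created documents of that type together with the domain assigned to each, which makes it easy to spot documents created through a profile whose Default Values template is missing or misconfigured.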
The standard domains provided with each solution mirror those defined by the DIA Electronic
Document Model (EDM) reference model. The DIA EDM reference model is patterned after the
Electronic Common Technical Document (eCTD) standard. The DIA website provides more
information about the DIA EDM reference model.
In Documentum for Research and Development, domains and inventory are based on the DIA EDM
Reference Model with extensions where appropriate. The DIA EDM reference model defines the
following domains:
• Clinical
• Labeling
• Non-Clinical
• Quality
• Regulatory/Administrative
These same domains are defined in the solution with additional inventory to manage Safety-PVG,
Promotional Materials, and Medical Device documents.
Documentum for Quality and Manufacturing provides an inventory mapped to the DIA EDM QM
Reference Model with extensions where appropriate. In addition to standard documents, the solution
includes domains to manage Design History File and Device Master Record components.
The following document domains are provided in the respective Life Sciences solutions:


Table 1. Document domains

Solution Domain Name Description


LSTMF Clinical TMF Includes documents that form part of a Trial Master
File; corresponds to the DIA TMF reference model.
LSRD Clinical Corresponds to the DIA EDM reference model
Clinical domain.
Non-clinical Corresponds to the DIA EDM reference model
Non-Clinical domain.
Quality Corresponds to the DIA EDM reference model
Quality domain.
Safety Includes documents that provide safety information
for a drug.
Labeling Includes documents that provide labeling information
for a drug.
Promotional Includes documents that provide marketing and sales
Materials (Ad information for a drug.
Promo)
Medical Devices Includes documents for a medical device within the
Clinical, Non-clinical, and Regulatory domains.
Regulatory Corresponds to the DIA EDM reference model
/Administrative Regulatory/Administrative domain.
General Includes uncontrolled documents such as letters,
memos, and notes.
LSSSV Regulatory Includes documents that form a part of the electronic,
Submissions NeeS, and paper submissions to a health authority
for a regulatory application. Corresponds to the DIA
EDM reference model Regulatory/Administrative
domain.
Regulatory Includes any form of correspondence exchanged
Correspondence between the pharmaceutical companies and the
regulatory agencies for a particular submission.


Solution Domain Name Description


LSQM Good Includes controlled documents relating to
Manufacturing organizational directives, policies, standard
Practices (GMP) operating procedures (SOPs), production of
marketed products in a GMP environment including
methods, specifications, and master batch records,
certificates, records, and drawings related to a
drug manufacturing or packaging facility. Includes
documents relevant to the packaging and labeling
of drug product. Includes documents used in
the validation of products, substances, materials,
systems, and equipment. Includes guides, manuals,
reference documents, and training materials.
Includes project deliverables such as requirements,
planning, design, risk assessments, and reports.
GMP Medical Includes medical device deliverables such as Bill Of
Devices Material, Labeling, DMR Specifications, Drawings,
DMR, Production Document, Customer Documents,
Service Documents, Software Release, Protocols,
Reports, Requirements, Reviews, Plans, Checklists,
DHF Specifications, Matrix, Risk Documentation and
DHF.
New document domains can be created to extend the solution to support additional kinds of
documents. Creating a new domain requires adding an additional Documentum object type and a set
of D2 configurations to support the creation of documents of that type.
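As a rough illustration of the first step, a new domain-specific object type could be registered with DQL as shown below. This is a sketch only: in practice, object types are normally created and deployed with Documentum Composer, and the supertype and attribute names used here (cd_quality_doc, custom_region) are assumptions rather than part of the shipped data model.

    CREATE TYPE "cd_custom_domain_doc" (
        custom_region string(64)
    ) WITH SUPERTYPE "cd_quality_doc" PUBLISH

The new type must then be wired into D2 with a creation profile, a Default Values template that assigns the new domain, and the security, lifecycle, and workflow configurations described in the later chapters.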
Note: All LSQM documents belong to the GMP domain. However, this domain is further subdivided
into categories, as noted in parentheses in Table 1, page 19.

Roles
A role is a type of group that contains users or other groups that are assigned a specific role. Roles
provide a means of defining groups that have a particular function within a system. For example,
pharmaceutical companies manage their extensive documentation by assigning roles such
as authors, reviewers, approvers, managers, and so on. Each role can have one or more people
designated to perform the activity.
Predefined roles (user and workspace roles) are installed with the Life Sciences solutions.
Administrators must add users or groups to the applicable predefined roles and ensure each user has
an assigned D2 workspace and appropriate access to documents and functions. For more information
about workspace roles, see Workspace Groups, page 223.
A set of roles are provided for each of the following domains:
• Clinical
• Non-Clinical
• Quality


• Regulatory/Administration
• Safety
• Labeling
• Promotional Materials (Ad Promo)
• Correspondence
• GMP
• Medical Devices
In addition, the Documentum for Quality and Manufacturing (GMP) solution roles are assigned by
Applicable Sites. Applicable Sites are defined in a dictionary and can be determined by physical
manufacturing location and/or business groups. The Configuring Roles (LSQM), page 65 section
provides the steps to configure site-based roles.
All roles are installed for all solutions. However, some roles are typically used only in a particular
solution. The naming convention used for the group name for each role in the Documentum for
Research and Development solution is cd_<domain>_doc_<role>. For example, the group name for
the Author role in the Clinical domain is cd_clinical_doc_author.
The naming convention used for the group name for each role in the Documentum for Quality and
Manufacturing solution is cd_<applicable_site>_<role>. For example, the group name for the Document
Coordinator role for the Boston site is cd_boston_coordinators.
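If you want to confirm which groups are installed for a given domain, a DQL query against dm_group can list them by this naming convention. This is only a sketch; cd_clinical is the example domain prefix used above, so substitute the prefix that applies to your installation:

  SELECT group_name FROM dm_group WHERE group_name LIKE 'cd_clinical%' ORDER BY group_name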
Some roles have default members. For example, Document Coordinators are members of the
Document Authors role. A top-level role for each domain is also installed. All domain roles are made
members of the top-level domain role as shown in the following table.

Table 2. Top-level domain roles

Role Group Name Contains


cd_ad_promo Users of Promotional Materials documents. Users in this role can
access the Promotional Materials domain documents and at least have
READ permission on the documents.
cd_clinical Users of Clinical documents. Users are able to access the Clinical
domain documents and at least have READ permission on the same.
cd_corres Users of Correspondence documents. Users in this role are able to
access the correspondence domain documents and at least have
READ permission on the documents.
cd_gmp_all_users Users of GMP documents.
cd_labeling Users of Labeling documents. Users in this role are able to access
the Labeling domain documents and at least have READ permission
on the documents.
cd_md_users Users of Medical Devices documents. Users in this role can access
the Medical Devices domain documents and at least have READ
permission on the documents.

cd_md_clinical Users of Clinical documents for medical devices. Users are able
to access the Clinical domain documents and at least have READ
permission on the same.
cd_md_regulatory Users of Regulatory documents for medical devices. Users in this role
are able to access the Regulatory domain documents and at least have
READ permission on the documents.
cd_md_regulatory_users Users of Regulatory documents for medical devices. This is a
subgroup within the cd_md_regulatory group. Users in this role are
able to access the Regulatory domain documents and at least have
READ permission on the documents.
cd_md_submission_users Users of Submission documents for medical devices. This is a
subgroup within the cd_md_regulatory group. Users in this role are
able to access the Regulatory Submission documents and at least have
READ permission on the documents.
cd_non_clinical Users of Non-clinical documents. Users are able to access the
Non-clinical domain documents and at least have READ permission
on the documents.
cd_quality Users of Quality documents. Users in this role are able to access the
Quality domain documents and at least have READ permission on
the documents.
cd_regulatory Users of Regulatory documents. Users in this role are able to access the
Regulatory domain documents and at least have READ permission
on the documents.
cd_safety Users of Safety documents. Users in this role are able to access the
Safety domain documents and at least have READ permission on
the documents.
The predefined roles are described in the following sections.

Administrators
Administrators have the ability to modify most D2 dictionaries and taxonomies as well as other
configuration objects, including Auto Inheritance Config and Delete Config objects.

Group name cd_admingroup


Contains members <none>
Is a member of <none>


Consumer Import
The Consumer Import group includes users who are typically consumers of documents but who also
have the ability to import documents into the system.

Group name cd_consumer_import


Contains members cd_admingroup

cd_ad_promo_consumers_imp

cd_clinical_consumers_imp

cd_corres_consumers_imp

cd_general_consumers_imp

cd_gmp_readers_imp

cd_labeling_consumers_imp

cd_md_clinical_consumers_imp

cd_md_consumers_imp

cd_md_non_clinical_consumers_imp

cd_md_regulatory_consumers_imp

cd_non_clinical_consumers_imp

cd_quality_consumers_imp

cd_regulatory_consumers_imp

cd_safety_consumers_imp
Is a member of <none>

Controlled Printers and Issued Printers (LSQM)


The Controlled Printers group is responsible for printing controlled documents. Users in this
group will be able to use the Print, Reprint, Recall, and Print Reporting features of the Controlled
Print widget.


Group name cd_controlled_print

cd_controlled_print_printers

cd_controlled_print_reprinters

cd_controlled_print_recall

cd_issued_print_admin

cd_issued_print_approvers

cd_issued_print

cd_issued_print_printers

cd_issued_print_reconcilers

cd_issued_print_reprinters
Contains members cd_admingroup

cd_gmp_coordinators
Is a member of <none>

Document Approvers
Document Approvers are responsible for approving controlled documents. They perform the For
Approval task in a controlled document workflow.


Group name cd_ad_promo_doc_approvers

cd_ad_promo_template_approvers

cd_clinical_doc_approvers

cd_clinical_template_approvers

cd_corres_doc_approvers

cd_corres_template_approvers

cd_general_doc_approvers

cd_general_template_approvers

cd_global_approvers

cd_gmp_approvers

cd_gmp_template_approvers

cd_labeling_doc_approvers

cd_labeling_template_approvers

cd_md_clinical_doc_approvers

cd_md_clinical_template_approvers

cd_md_doc_approvers

cd_md_non_clinical_doc_approvers

cd_md_non_clinical_template_approvers

cd_md_regulatory_doc_approvers

cd_md_regulatory_template_approvers

cd_med_device_template_approvers

cd_non_clinical_doc_approvers

cd_non_clinical_template_approvers

cd_quality_doc_approvers

cd_quality_template_approvers

cd_regulatory_doc_approvers

cd_regulatory_template_approvers

cd_safety_doc_approvers

cd_safety_template_approvers
Contains members cd_admingroup

Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)

cd_gmp_all_users (LSQM)

Document Auditors
Document Auditors inspect the state of documents in their respective domain for audit-readiness.
Document Auditors have read-only access to Effective/Final/Released, Superseded, and Expired
documents and no access to documents in other states.

Group name cd_ad_promo_doc_auditors

cd_clinical_doc_auditors

cd_corres_doc_auditors

cd_general_doc_auditors

cd_global_auditors

cd_gmp_auditors

cd_labeling_doc_auditors

cd_md_clinical_doc_auditors

cd_md_doc_auditors

cd_md_non_clinical_doc_auditors

cd_md_regulatory_doc_auditors

cd_non_clinical_doc_auditors

cd_quality_doc_auditors

cd_regulatory_doc_auditors

cd_safety_doc_auditors
Contains members cd_admingroup
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)

cd_gmp_all_users (LSQM)


Document Authors
Document Authors create documents and submit them for collaborative editing, review, and
approval. They can self-approve documents that do not require formal review and approval.


Group name cd_ad_promo_doc_authors

cd_ad_promo_template_authors

cd_clinical_doc_authors

cd_clinical_doc_authors_tmf

cd_clinical_template_authors

cd_corres_doc_authors

cd_corres_template_authors

cd_general_doc_authors

cd_general_template_authors

cd_global_authors

cd_gmp_authors

cd_gmp_template_authors

cd_labeling_doc_authors

cd_labeling_template_authors

cd_md_clinical_doc_authors

cd_md_clinical_template_authors

cd_md_doc_authors

cd_md_non_clinical_doc_authors

cd_md_non_clinical_template_authors

cd_md_regulatory_doc_authors

cd_md_regulatory_template_authors

cd_med_device_template_authors

cd_non_clinical_doc_authors

cd_non_clinical_template_authors

cd_quality_doc_authors

cd_quality_template_authors

cd_regulatory_doc_authors

cd_regulatory_template_authors

cd_safety_doc_authors

cd_safety_template_authors
Contains members cd_<domain>_doc_coordinators

cd_admingroup

Note: The cd_general_doc_authors role is not a member of any Document Coordinators role because
there is no corresponding Document Coordinators role for the General domain.
All domain Authors roles, except the cd_general_doc_authors role itself, are members of the
cd_general_doc_authors role. This gives authors in all domains the ability to create General documents.
Documentum for Quality and Manufacturing does not use the cd_general_doc_authors role.

Document Contributors (LSTMF)


Document Contributors upload files that originated outside of the Documentum repository to a Trial
Master File (TMF) within a Documentum repository.

Group name cd_clinical_doc_contributors

tmf_contributors

tmf_external_contributors
Contains members tmf_external_contributors

tmf_external_reviewers

tmf_investigators

tmf_inspectors
Is a member of cd_clinical

Document Coordinators
Document Coordinators manage the release of controlled documents. They can also create documents
and submit them for collaborative editing, review, and approval. Document Coordinators monitor
the progress of document workflow tasks. They can change workflow task performers.


Group name cd_ad_promo_doc_coordinators

cd_clinical_doc_coordinators

cd_corres_doc_coordinators

cd_general_doc_coordinators

cd_global_coordinators

cd_gmp_coordinators

cd_labeling_doc_coordinators

cd_md_clinical_doc_coordinators

cd_md_doc_coordinators

cd_md_non_clinical_doc_coordinators

cd_md_regulatory_doc_coordinators

cd_non_clinical_doc_coordinators

cd_quality_doc_coordinators

cd_regulatory_doc_coordinators

cd_safety_doc_coordinators
Contains members cd_admingroup

cd_non_clinical_managers

cd_product_managers

cd_regulatory_managers
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)

cd_<domain>_authors (LSRD, LSTMF)

cd_gmp_all_users (LSQM)

Document Quality Organization Approvers (LSQM)


Quality Organization (QO) Approvers perform the second level of approval for controlled documents
that require two levels of approval.


Group name cd_global_qo_approvers

cd_gmp_qo_approvers
Contains members cd_admingroup
Is a member of cd_gmp_all_users

Document Readers
Document Readers have read-only access to Effective versions of documents. They browse for,
search and read documents.

Group name cd_ad_promo_doc_readers

cd_clinical_doc_readers

cd_corres_doc_readers

cd_general_doc_readers

cd_global_readers

cd_gmp_readers

cd_labeling_doc_readers

cd_md_clinical_doc_readers

cd_md_doc_readers

cd_md_non_clinical_doc_readers

cd_md_regulatory_doc_readers

cd_non_clinical_doc_readers

cd_quality_doc_readers

cd_regulatory_doc_readers

cd_safety_doc_readers
Contains members cd_admingroup

cd_regulatory_managers

cd_regulatory_publisher
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)

cd_gmp_all_users (LSQM)


Document Reviewers
Document Reviewers review documents and edit documents using annotations. They are responsible
for technical reviews during the authoring and review cycle. Reviewers complete workflow tasks and
can browse and search for documents.

Group name cd_ad_promo_doc_reviewers

cd_clinical_doc_reviewers

cd_corres_doc_reviewers

cd_format_reviewers

cd_general_doc_reviewers

cd_global_reviewers

cd_gmp_reviewers

cd_labeling_doc_reviewers

cd_md_clinical_doc_reviewers

cd_md_doc_reviewers

cd_md_non_clinical_doc_reviewers

cd_md_regulatory_doc_reviewers

cd_non_clinical_doc_reviewers

cd_quality_doc_reviewers

cd_regulatory_doc_reviewers

cd_safety_doc_reviewers

tmf_external_reviewers
Contains members cd_admingroup

cd_regulatory_managers
Is a member of cd_<domain> (LSRD, LSTMF, LSSSV)

cd_gmp_all_users (LSQM)

Managers (Product, Project, Trial, Study, Regulatory)


Managers manage the documentation for their respective domains. They create and manage the
registration forms that users use to import and create documents. They also monitor document
progress. Product Managers are responsible for all documents across the regulatory applications,
non-clinical studies, clinical trials, and quality projects associated with particular products. They
create and manage Product Registration Forms.

Group name cd_ad_promo_managers

cd_clinical_managers

cd_clinical_trial_managers_tmf

cd_corres_managers

cd_gmp_item_registration_group

cd_labeling_managers

cd_md_managers

cd_non_clinical_managers

cd_product_managers

cd_quality_managers

cd_<domain>_ref_copy_mgrs

cd_regulatory_activity_managers

cd_regulatory_managers

cd_safety_managers
Contains members cd_admingroup

cd_product_registration_group

cd_project_registration_group
Is a member of cd_<domain>

cd_<domain>_doc_coordinators

Note: Because the cd_product_managers role spans multiple domains, this role is a member of the
following roles:
• cd_clinical_doc_coordinators
• cd_non_clinical_doc_coordinators
• cd_quality_doc_coordinators
• cd_regulatory_doc_coordinators


Global External Participants (LSTMF)


The Global External Participants group includes external participants in a clinical study who at least
have READ access to the document.

Group name tmf_global_external_participants


Contains members <none>
Is a member of cd_<domain>

Inspectors (LSTMF)
Document Inspectors are the users who can inspect the TMF documents uploaded to a placeholder
to which they have access.

Group name tmf_inspectors


Contains members <none>
Is a member of cd_<domain>

Investigators (LSTMF)
TMF Investigators are the users who can import the TMF document for a placeholder of a particular
site, country, or trial but cannot make it Effective/Approved/Final.

Group name tmf_investigators


Contains members <none>
Is a member of cd_<domain>

Quality Check (LSTMF)


TMF Quality Check users perform a quality check on an indexed TMF document for a
placeholder of a particular product, site, country, or trial before making it Final.

Group name cd_clinical_qc_tmf


Contains members cd_clinical_qc_technical

cd_clinical_qc_business
Is a member of cd_<domain>


Security Groups
These groups contain the document administrators, who can modify the configured roles on a
document, and the model administrators, who can change the security model on a document.

Group name <domain>_sec_doc_admin

<domain>_model_admin
Contains members <none>
Is a member of cd_<domain>

Solution-Specific User Roles


This section lists the defined user roles for each Life Sciences solution.

User Roles in Documentum for eTMF

Documentum for eTMF provides defined user roles that enable or restrict user access to documents
and information in the system. The following table describes the user roles:

User Role Groups Description


Managers (cd_product_managers, cd_clinical_managers, cd_<domain>_ref_copy_mgrs): Create and
manage registration forms for each domain. For example, the Clinical Trial Managers create clinical
trial, country, and site registration forms. Clinical Trial Managers also set up and maintain the file
plan for a trial. By default, the members of the cd_<domain>_ref_copy_mgrs group are the users to
whom reference copies of the crossover documents are assigned for indexing and approval.
Contributors cd_clinical_doc_contributors Import and index TMF documents.
TMF Contributors are able to perform
tmf_contributors inspection and review of TMF documents
for placeholders to which they have access.
Consumer Import cd_clinical_consumers_imp Import documents. They are consumers
who have read-only access to
Effective/Approved/Final versions of
documents and are considered general
consumers.

Authors (cd_clinical_doc_authors, cd_clinical_doc_authors_tmf, cd_clinical_template_authors):
Create documents and submit them for collaborative editing and review. Authors can also import
documents like the Contributors. Authors can self-approve most TMF documents. Template Authors
create and manage clinical template documents. Users in the cd_clinical_doc_authors_tmf group can
import Clinical files to the repository but do not author new documents. For example, external TMF
Investigators and external Authors belong in this role.

Document Coordinators (cd_clinical_doc_coordinators): Manage the publication of controlled
documents. Authors can act as Document Coordinators on most TMF documents.

Reviewers (cd_clinical_doc_reviewers, cd_format_reviewers): Review documents using annotations
and edit documents.

Approvers (cd_clinical_doc_approvers, cd_clinical_template_approvers): Responsible for approving
controlled documents. Template approvers approve clinical template documents.
Auditors cd_clinical_doc_auditors Have read-only access to audit logs as well
as Effective/Approved/Final, Superseded,
and Expired documents.
Readers cd_clinical_doc_readers Have read-only access to Effective
/Approved/Final versions and are
considered general consumers.
Investigator* tmf_investigators Clinical investigators who administer the
drug or therapy to subjects (patients or
volunteers) and record clinical data on
each subject. Investigators typically act as
contributors to the TMF.
Inspector* tmf_inspectors Health authority or regulatory agency
representatives who may audit a clinical
trial. Inspectors are typically given
read-access to Effective/Approved/Final
documents in the TMF.
Quality Check cd_clinical_qc_technical Inspects the indexed documents for quality
before making them Final.
cd_clinical_qc_business

External tmf_external_contributors Produces documents or imports LSTMF
Contributor* documents. For example, a member of a
Contract Research Organization.
External Reviewer* tmf_external_reviewers Peer reviews or participates in collaborating
on documents. For example, an expert in
the relevant field of medicine.
Security Admin cd_clinical_sec_doc_admin Manages the configured roles on the
document.
cd_clinical_model_admin
Manages the security model on the
document.
Administrator cd_admingroup Accesses administrative functions but does
not have access to controlled documents.

*These roles are external participants. They can receive access to documents associated with a country
or site. The use of the term external does not require the user to be a contractor or otherwise external
to the system. It means that they do not have global access to all documents in the system and only
have access to what managers specifically grant to them. Managers can grant the access for a limited
time. External Trial Participant Roles, page 38 provides more information.

Cross-functional User Groups

The following table describes the cross-functional user groups:

Groups Description
cd_admingroup Administrator:
• Has access to administrative functions

• Requires Administrative client capability

• Does not have access to controlled documents


cd_product_managers Product managers who create Product
Registration Forms.
cd_general_doc_authors Users in the role can create general documents
(letters, memos, notes). By default, all
other authors groups are members of the
cd_general_doc_authors role.

Reporting Groups

Documentum for eTMF provides an optional reporting feature for monitoring active clinical trials
based on AMPLEXOR myInsight for Documentum (myInsight). The reports can be customized
using myInsight.


In Documentum D2, users view dashboards containing report information. The following table
describes the reporting groups used by the Report Generator:

Groups Description
report_user Report Users can generate reports and view
historical data in the form of saved reports.
report_builder Report Builders can manage report definitions
and presentations.
report_administrator Report Administrators can define categories and
scheduling of reports.

External Trial Participant Roles

Documentum for eTMF enables you to add a named user to participate as a particular role within
the LSTMF system. These users, known as external participants, can receive access to countries or
sites. External participants only have access to the documents associated with the entity granted.
Administrators specify document access levels when configuring the user roles. External trial
participants require a Documentum user account (dm_user) for system access.
Note: The use of the term external does not require the user to be a contractor or otherwise external to
the system. It means that they do not have global access to all documents in the system and only have
access to what managers specifically grant to them.
Managers can register external participants for countries and sites to grant access for the specified
entities. They register the external participants by adding them to the relevant site or country
registration forms. By default, users who have Write access to the registration form can add external
trial participants. The Access Control tab of a registration form defines the users who can update the
form. For example, a clinical trial manager or a local site administrator delegated to act in this role
can add external participants to registration forms. These participants receive the access specified in
the registration form.
The following table describes the default external participant roles:

Role Description
Investigator A clinical investigator responsible for
administering the drug or therapy to subjects
(patients or volunteers) and recording clinical
data on each subject.
Inspector A representative of the health authority or
regulatory agency responsible for ensuring that
good clinical practice is followed during the
conduct of the trial.

External Contributor A producer of specific documents or a person
who can typically import LSTMF documents.
For example, a member of a Contract Research
Organization.
External Reviewer A peer reviewer or participant of specific
documents. For example, an expert in the
relevant field of medicine.

External Trial Participant Groups

The system creates user groups for providing document access to external trial participants. The
following table describes the external trial participant user groups:

Groups Description
tmf_global_external_participants Provides access rights common to all external
participants. This group is a TMF group.
tmf_contributors Provides read-only and browse access to the
top-level Clinical cabinet and product folders.
All clinical trial participants belong to this group
indirectly.
tmf_external_contributors Assigns a workspace to the External
Contributors.
tmf_external_reviewers Assigns a workspace to the External Reviewers.
tmf_inspectors Assigns a workspace to the Inspectors.
tmf_investigators Assigns a workspace to the Investigators.
pg_<product-code> Provides Read access to the product-level
folders.
(Product group)
tg_<trial-ID> Provides Read access to the top-level TMF folder
for the trial.
(Trial group)
cg_<trial-ID>_<country-code> Provides Read access to the top-level country
folder for the trial.
(Country group)
sg_<site-ID> Provides Read access to the top-level site folder
for the trial.
(Site group)
Each external trial participant user group is further divided into subgroups for roles with the
following suffixes:
• Investigator: _inv
• Inspector: _insp

• External Contributor: _contrib


• External Reviewer: _rev
For example, for Product XYZ, Trial 1234, Country IRL, and Site dublin, the system creates the
following groups and subgroups:
• pg_XYZ
— pg_XYZ_inv
— pg_XYZ_insp
— pg_XYZ_contrib
— pg_XYZ_rev
• tg_1234
— tg_1234_inv
— tg_1234_insp
— tg_1234_contrib
— tg_1234_rev
• cg_1234_IRL
— cg_1234_IRL_inv
— cg_1234_IRL_insp
— cg_1234_IRL_contrib
— cg_1234_IRL_rev
• sg_dublin
— sg_dublin_inv
— sg_dublin_insp
— sg_dublin_contrib
— sg_dublin_rev
Note: You should not directly add users to these groups. The system automatically populates these
groups when you select Manage External Participants.
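To see which participant groups and subgroups the system has created, you can query dm_group directly. This is only an illustrative sketch; the trial ID 1234 and site ID dublin come from the example above:

  SELECT group_name FROM dm_group WHERE group_name LIKE 'tg_1234%' OR group_name LIKE 'sg_dublin%' ORDER BY group_name

Use the result to verify that the expected role subgroups (_inv, _insp, _contrib, _rev) exist; membership itself should still be managed only through Manage External Participants.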

User Roles in Documentum for Quality and Manufacturing

Documentum for Quality and Manufacturing provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:


User Role Groups Description


Administrators cd_admingroup Access administrative functions but do not
have access to controlled documents.
Approvers (cd_<applicable site>_approvers, cd_gmp_template_approvers,
cd_med_device_template_approvers): Approve controlled documents (Control Category 1 and 2).
Some documents require electronic signatures. Template approvers approve template documents.

Auditors (cd_<applicable site>_auditors): Have read-only access to audit logs,
Effective/Approved/Final, Superseded, and Expired documents. They can view document content,
history, and properties. They must have a minimum extended privilege of View Audit.

Authors (cd_<applicable site>_authors, cd_gmp_template_authors, cd_med_device_template_authors):
Create documents and submit them for collaborative editing, review, and approval.
• Control Categories 1 and 2: Authors cannot be an Approver.
• Control Category 3: Authors can approve documents and act as Document Coordinators.
Template authors create and manage template documents.

Consumer Import (cd_gmp_consumers_imp): Import documents. They are consumers who have
read-only access to Effective/Approved/Final versions of documents.

Document Coordinators (cd_<applicable site>_coordinators): Manage the release of controlled
documents.
• Control Category 1: Schedules release dates. Verifies that all members of the To Be Read (TBR)
Distribution List have signed off on a document or rejected the task in the Send to TBR distribution
workflow before the document becomes Effective/Approved/Final.
• Control Categories 1 and 2: Releases documents to an Effective/Approved/Final state and controls
the long-term management of the document. Can submit a document to a workflow.
• Control Category 3: Authors can act as Document Coordinators.

Quality Organization Approvers (cd_<applicable site>_qo_approvers): Responsible for final approval
of Control Category 1 documents.

Readers (cd_<applicable site>_readers): General consumers with read-only access to
Effective/Approved/Final versions. (Optional) Recipients on the TBR Distribution List receive
notification when a Control Category 1 document is scheduled to become Effective. They confirm
that they have read the document. When a Reader requests a printable PDF, the system adds them to
the TBR Distribution List. Typically, PDFs are secure and do not allow local printing, editing, or
annotation.

Reviewers (cd_<applicable site>_reviewers): Review documents using annotations and edit
documents. They are responsible for technical review during the authoring and review cycle.

Product Managers (cd_product_registration_group): Create and manage Product Registration Forms.

Manufactured Item Manager (cd_gmp_item_registration_group): Create and manage Manufactured
Item Registration Forms.

Device Managers (cd_md_managers, cd_md_regulatory_managers, cd_md_submission_managers):
Create and manage Medical Device Registration Forms, Medical Device Regulatory Application
Registration Forms (RARFs), and Medical Device Submission Registration Forms.

Cross-functional User Groups

The following table describes the cross-functional user groups:

Groups Description
cd_admingroup Administrator:
• Has access to administrative functions

• Requires Administrative client capability

• Does not have access to controlled documents


cd_c2_printers Users in this group have access to the right-click
Print menu used for uncontrolled copies with
C2 watermarks/overlays.
cd_controlled_print Users in this group can use the Print, Reprint,
and Print Reporting features of the Controlled
Print widget.
cd_controlled_print_printers Users in this group are allowed to create a
controlled print.
Note: Previously, members of the
cd_controlled_print group were allowed
to both create a controlled print as well
as view controlled print reports. For
consistency across both types of prints, the
cd_controlled_print_printers group will now
determine if a user can create a controlled print.
cd_controlled_print_recall Users in this group can use the Recall and
Print Reporting features of the Controlled Print
widget.
cd_controlled_print_reprinters Users in this group are allowed to reprint an
existing controlled print.

cd_issued_print_admin The only user that should be part of this
group is the user used in the Controlled Print /
Issued Print web application. That is, the user
configured in the properties file to log into the
repository. This user is granted access to specific
D2 contexts for virtual document publishing.
cd_issued_print_approvers Members of this group are allowed to approve
Issued Print Requests. By default, this group
will contain the cd_issued_print_printers group.
cd_issued_print Members of this group are allowed to see the
Issued Print Reporting option. By default, it
includes cd_issued_print_printers, cd_issued
_print_reprinters, cd_issued_print_approvers
and cd_issued_print_reconcilers.
cd_issued_print_printers Members of this group are allowed to create an
Issued Print.
cd_issued_print_reconcilers Members of this group have access to reconcile
issued prints. By default, this group will contain
the cd_issued_print_printers group.
cd_issued_print_reprinters Members of this group are allowed to reprint
an existing issued print. By default, the
cd_issued_print_reprinters group will contain
the cd_issued_print_printers group.

Note: In addition to the Issued Print groups specified in the cross-functional user groups table,
the system provides site-based approvers (cd_issued_print_<site>_approvers) and reconcilers
(cd_issued_print_<site>_reconcilers) groups. For more information about creating these groups,
see Issued Print Site-based Groups, page 261.

Reporting Groups

Documentum for Quality and Manufacturing provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.

User Roles in Documentum for Research and Development

Documentum for Research and Development provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:


User Role Groups Description


Approvers (cd_ad_promo_doc_approvers, cd_clinical_doc_approvers, cd_labeling_doc_approvers,
cd_md_clinical_doc_approvers, cd_md_clinical_template_approvers, cd_md_doc_approvers,
cd_md_non_clinical_doc_approvers, cd_md_non_clinical_template_approvers,
cd_md_regulatory_doc_approvers, cd_md_regulatory_template_approvers,
cd_non_clinical_doc_approvers, cd_quality_doc_approvers, cd_regulatory_doc_approvers,
cd_safety_doc_approvers, cd_<domain>_template_approvers): Approve controlled documents. Some
documents require electronic signatures. Template approvers approve template documents.

Auditors (cd_ad_promo_doc_auditors, cd_clinical_doc_auditors, cd_labeling_doc_auditors,
cd_md_clinical_doc_auditors, cd_md_doc_auditors, cd_md_non_clinical_doc_auditors,
cd_md_regulatory_doc_auditors, cd_non_clinical_doc_auditors, cd_quality_doc_auditors,
cd_regulatory_doc_auditors, cd_safety_doc_auditors): Have read-only access to audit logs,
Approved, and Superseded documents. They can view document content, history, and properties.
They must have a minimum extended privilege of View Audit.

Authors (cd_ad_promo_doc_authors, cd_clinical_doc_authors, cd_labeling_doc_authors,
cd_md_clinical_doc_authors, cd_md_clinical_template_authors, cd_md_doc_authors,
cd_md_non_clinical_doc_authors, cd_md_non_clinical_template_authors,
cd_md_regulatory_doc_authors, cd_md_regulatory_template_authors, cd_non_clinical_doc_authors,
cd_quality_doc_authors, cd_regulatory_doc_authors, cd_safety_doc_authors,
cd_<domain>_template_authors): Create documents and submit them for collaborative editing,
review, and approval.
• Control Category 2: Authors cannot be an Approver.
• Control Category 3: Authors can approve documents and act as Document Coordinators.
Template authors create and manage template documents for that domain.

Consumer Import cd_ad_promo_consumers_imp Import documents. They are consumers
who have read-only access to Approved
cd_clinical_consumers_imp versions of documents.
cd_general_consumers_imp

cd_labeling_consumers_imp

cd_md_clinical_consumers_imp

cd_md_consumers_imp

cd_md_non_clinical_consumers
_imp

cd_md_regulatory_consumers
_imp

cd_non_clinical_consumers
_imp

cd_quality_consumers_imp

cd_regulatory_consumers_imp

cd_safety_consumers_imp
Document Coordinators (cd_ad_promo_doc_coordinators, cd_clinical_doc_coordinators,
cd_labeling_doc_coordinators, cd_md_clinical_doc_coordinators, cd_md_doc_coordinators,
cd_md_non_clinical_doc_coordinators, cd_md_regulatory_doc_coordinators,
cd_non_clinical_doc_coordinators, cd_quality_doc_coordinators, cd_regulatory_doc_coordinators,
cd_safety_doc_coordinators): Manage the release of controlled documents.
• Control Category 2: Approvers can act as Document Coordinators.
• Control Category 3: Authors can act as Document Coordinators.

Product Managers cd_product_managers Create and manage Product Registration
Forms. Responsible for all documents
across the studies, clinical trials, and
projects associated with particular
products.
Device Managers cd_md_managers Create and manage Medical Device
Registration Forms. Responsible for all
documents across the medical device
clinical trials and regulatory documents
associated with particular medical device.
Regulatory, Clinical Trial, Non-clinical Study, Quality Project, Labeling, Safety Project, Promotional
Materials Managers, and Reference Copy Managers (cd_ad_promo_managers, cd_clinical_managers,
cd_labeling_managers, cd_md_clinical_managers, cd_md_regulatory_managers,
cd_md_submission_managers, cd_non_clinical_managers, cd_quality_managers,
cd_<domain>_ref_copy_mgrs, cd_regulatory_activity_managers, cd_regulatory_managers,
cd_safety_managers): Create and manage registration forms for each domain. For example, Quality
Project Managers create Project Registration Forms. They are responsible for the related documents.
By default, the members of the cd_<domain>_ref_copy_mgrs group are the users to whom reference
copies of the crossover documents are assigned for indexing and approval.

Readers cd_ad_promo_doc_readers General consumers with read-only access
to Approved versions.
cd_clinical_doc_readers

cd_labeling_doc_readers

cd_md_clinical_doc_readers

cd_md_doc_readers

cd_md_non_clinical_doc
_readers

cd_md_regulatory_doc_readers

cd_non_clinical_doc_readers

cd_quality_doc_readers

cd_regulatory_doc_readers

cd_safety_doc_readers
Reviewers cd_ad_promo_doc_reviewers Review documents using annotations and
edit documents. They are responsible for
cd_clinical_doc_reviewers technical review during the authoring and
review cycle.
cd_format_reviewers

cd_labeling_doc_reviewers

cd_md_clinical_doc_reviewers

cd_md_doc_reviewers

cd_md_non_clinical_doc
_reviewers

cd_md_regulatory_doc
_reviewers

cd_non_clinical_doc_reviewers

cd_quality_doc_reviewers

cd_regulatory_doc_reviewers

cd_safety_doc_reviewers

Submission Archivist (cd_submission_archivists): Imports submissions.
Note: Although this role is present in LSRD, it has no significant function there because it is only
required for importing submissions against a regulatory application registration form, which is
available in LSSSV only.

Security Admin cd_ad_promo_sec_doc_admin Manages the configured roles on the
document.
cd_clinical_sec_doc_admin
Manages the security model on the
cd_labeling_sec_doc_admin document.
cd_md_clinical_sec_doc_admin

cd_md_nonclinical_sec_doc
_admin

cd_md_regulatory_sec_doc
_admin

cd_nonclinical_sec_doc_admin

cd_quality_sec_doc_admin

cd_regulatory_sec_doc_admin

cd_safety_sec_doc_admin

cd_ad_promo_sec_model
_admin

cd_clinical_model_admin

cd_labeling_sec_model_admin

cd_md_clinical_sec_model
_admin

cd_md_nonclinical_sec_model
_admin

cd_nonclinical_sec_model
_admin

cd_quality_sec_model_admin

cd_regulatory_sec_model
_admin

cd_safety_sec_model_admin

cd_md_regulatory_sec_model
_admin
Administrators cd_admingroup Access administrative functions but do not
have access to controlled documents.


Cross-functional User Groups

The following table describes the cross-functional user groups:

Groups Description
cd_admingroup Administrator:
• Has access to administrative functions

• Requires Administrative client capability

• Does not have access to controlled documents


cd_product_managers Product managers who create and manage
Product Registration Forms.
cd_product_registration_group Regulatory Managers who can register products
and projects.
cd_project_registration_group
cd_general_doc_authors Users in the role can create general documents
(letters, memos, notes). By default, all
other authors groups are members of the
cd_general_doc_authors role.
Alldomain_roles Users who have access to documents across all
domains (Clinical, Quality, Safety, and so on).
<domain>_allproduct Users who have access to documents across all
products in a domain (Clinical, Quality, Safety,
and so on).
Alldomain_<product> Users who have access to documents belonging
to a particular product across all domains
(Clinical, Quality, Safety, and so on).

Reporting Groups

Documentum for Research and Development provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.

User Roles in Documentum Submission Store and View

Documentum Submission Store and View provides defined user roles that enable or restrict user
access to documents and information in the system. The following table describes the user roles:


User Role Group Description


Consumer Import cd_regulatory_consumers_imp Import documents. They have read-only
access to Approved documents.
Regulatory cd_regulatory_managers Create and manage regulatory application
Manager registration forms.

Create product and project registration


forms.

Handles regulatory documents.

Imports correspondence documents and


manages the lifecycle transitions of these
documents.
Regulatory cd_regulatory_activity Users who can act as Managers of all
Activity Managers _managers applications/submissions by default.
Regulatory cd_regulatory_activity Users who can act as Readers of all
Activity Monitors _monitors applications/submissions by default.
Regulatory cd_regulatory_doc_authors Creates correspondence documents and
Authors manages the lifecycle transitions of these
documents.

Creates and monitors document


workflows.
Submission cd_submission_archivists Imports submissions for a regulatory
Archivist application registration form.
The Submission Archivists group contains
cd_regulatory_doc_readers.
Reader cd_regulatory_doc_readers Have read-only access to submission
and correspondence documents based on
their permissions set by the regulatory
application registration forms.
Auditor cd_regulatory_doc_auditors Have read-only access to audit logs,
Approved, and Superseded documents.
Reviewer cd_format_reviewers Review documents using annotations and
edit documents.
cd_regulatory_doc_reviewers
Approver cd_regulatory_doc_approvers Approve controlled documents.
Document cd_regulatory_doc Manage the release of controlled
Coordinator _coordinators documents.

Device Managers cd_md_managers Create and manage Medical Device
Registration Forms, Medical Device
cd_md_regulatory_managers Regulatory Application Registration
Forms, and Medical Device Submission
cd_md_submission_managers
Registration Forms.
Security Admin cd_corres_sec_doc_admin Manages the configured roles on the
document.
cd_corres_sec_model_admin
Manages the security model on the
document.

Correspondence User Groups

Use the correspondence user groups for correspondence documents. The following table describes
the correspondence user groups:

Groups Description
cd_corres Users of Correspondence documents. Users in
this role are able to access the Correspondence
domain documents and have at least READ
permission on the documents.
cd_corres_consumers_imp Consumers who can import documents. They
have read-only access to Approved versions.
cd_corres_doc_approvers Correspondence document approvers approve
controlled documents. Some documents require
electronic signatures.
cd_corres_doc_auditors Correspondence document auditors have
read-only access to audit logs, Approved,
and Superseded documents. They can view
document content, history, and properties.
cd_corres_doc_authors Correspondence document authors create and
edit Correspondence documents in the Draft
state. For Control Category 2 documents,
Authors cannot be an Approver.
cd_corres_doc_coordinators Correspondence document coordinators
manage the release of controlled documents.
For Control Category 2 documents, Approvers
can act as Document Coordinators.
cd_corres_doc_readers Correspondence document readers have
read-only access to Approved versions.

cd_corres_doc_reviewers Correspondence document reviewers review
documents using annotations and edit
cd_format_reviewers documents. They are responsible for technical
review during the authoring and review cycle.
cd_corres_managers Correspondence project managers create
and manage registration forms for the
Correspondence domain.
cd_corres_template_authors Correspondence template document authors
create and manage template Correspondence
documents.
cd_corres_template_approvers Correspondence template document approvers
approve template documents.

Cross-functional User Groups

The following table describes the cross-functional user groups:

Groups Description
cd_admingroup Administrator:
• Has access to administrative functions

• Requires Administrative client capability

• Does not have access to controlled documents


cd_regulatory Users of regulatory and administrative
documents.
cd_product_managers Product managers who create and manage
Product Registration Forms.
cd_quality_managers Quality and manufacturing project managers
who create Project Registration forms.
cd_product_registration_group Regulatory Managers who can register products
and projects.
cd_project_registration_group

Reporting Groups

Documentum Submission Store and View provides an optional reporting feature for monitoring
active clinical trials based on myInsight. See Reporting Groups, page 37 for the reporting groups
used by the Report Generator.


Control Categories
The term control category refers to the level of regulation that should be applied to a document. The
control category for each artifact is defined in the creation profile through the assigned Default
Values template and lifecycle.
The control category for a document is stored in the Category attribute. It is an integer value of either
1, 2, 3, or 4. The value of the Category attribute determines the lifecycles and workflows that are
applied to a document. The Life Sciences solutions provide four control categories.
The common security features for all control categories:
• Allow joint and collaborative authoring in Documentum D2.
• Restrict access to content based on user role and lifecycle state.
• Allow complete withdrawal of the document with optional retention of historic copies.
The following table describes the security of the document control categories:


Control Categories Security Description


Category 1: Controlled documents that require formal review and approval along with signoff by the
Quality Organization (QO). These documents become Effective/Approved/Final and may be sent on
TBR and periodic review workflows. The workflow process includes:
• Independent two person review and approval before release.
• Approval with electronic signatures for specific document types.
• Additional approval by the Quality Organization department before release.
• Controlled release by Document Coordinators.
• Controlled release that can be deferred until a specified effective date.
• Effective/Approved/Final versions that automatically rescind on expiration.
• Effective/Approved/Final versions that can be suspended and reinstated on demand.
• Effective/Approved/Final versions that can be superseded when the next version becomes effective.
• Expiration notifications.

Category 2: Controlled documents that require formal review and approval. They do not require
signoff by the Quality Organization. The workflow process includes:
• Independent two person review and approval before release.
• Approval with electronic signatures for specific document types.
• Documents that become effective immediately.
• Effective/Approved/Final versions that can be suspended and reinstated on demand.
• Effective/Approved/Final versions that can be superseded when the next version becomes effective.

Category 3: Controlled documents that can be self-approved by Authors. Independent review and
approval is not required. This category is typically assigned to documents that are created and/or
approved outside of the system. The workflow process:
• Enables self-approval by Authors.
• Makes the previously Effective/Approved/Final version superseded when the next version becomes
Effective/Approved/Final.
• Allows the Effective/Approved/Final version to be suspended and reinstated on demand.

Category 4: Non-controlled general documents that do not require formal review or approval. These
documents are not allowed in regulatory submissions. Authors manage access to these documents.

Chapter 2
Customizing D2-Based Solutions

The Life Sciences solutions are configured in D2 Config. The configurations are based on life science
industry best practices and stored in a set of Documentum D2 applications. Topics in this chapter
describe configurable features and tasks for experienced Documentum and Documentum D2
administrators to perform to meet their business requirements. To generate a document listing the
default configurations, select Specifications > Generate specifications from D2 Config.
Some administrative functions are available in the D2 Client.
The Documentum D2 Administration Guide provides more information. The Appendix D, D2
Configurations section provides the list of D2 configurations that must not be renamed or removed.

Extending D2-Based Life Sciences Solutions


You can use all of the configuration capabilities of Documentum Server and D2 when you deploy the
Life Sciences solution. It is important to note that future upgrades can be complicated by extensive
changes to the D2 configurations of the base Life Sciences solution.
This section describes procedures and best practices for extending D2-based Life Sciences solutions
in a manner that will enable you to upgrade to a new version of a solution in the future and then
reapply your specific extensions and modifications.
Extending the Life Sciences solution involves the following key activities:
1. Creating a custom application
2. Modifying the base configuration
3. Extending the base solution context

Creating a Custom Application


All configuration items that are either part of the base solution but modified by the customer or newly
added by the customer must be assigned a custom application. The first step when extending an
existing solution is to define the custom application:
1. In D2-Config, select Tools > Applications.
2. Click New.


In the example below, the new application is named KB Pharma. All of the configurations that are
either modified or added by the customer will be assigned this application. Existing OpenText
applications defined by the base solution should be left in the Applications list for base solution
configurations as well.
Assignment of a custom application enables you to easily identify your custom configurations and
package them for deployment and backup.

Modifying or Extending a Base Configuration


For each customization, you must decide whether:
• To change the base solution configuration, or
• To use the base configuration as a template for creating a new configuration.
You can use the Create from option to create the new configuration from the base template.
In general, it is recommended to create a new configuration from the existing one rather than directly
modifying the base solution configuration. However, this is not always practical because you have to
change all existing references to the base configuration to instead refer to your custom configuration.
This can significantly increase the number of configuration items that you must change.
If there are many references to a configuration item, directly modifying it is better than copying it.
Dictionaries are a good example of configurations that you should modify directly. Dictionaries and
taxonomies are referenced heavily in property pages, auto-naming, auto-linking, and so on. It is easy
to reconcile differences between two versions of a dictionary.

Extending the Base Solution Contexts


To ensure that any new custom configurations are used at runtime, extend the base solution contexts
to which your new configurations apply.
1. Use Create from to create a new context with the same parameters as the existing context.
2. Assign your custom application name to the extended context.
3. In the matrix, position the extended context to the left of the context it extends so that your
custom context takes precedence.
4. Assign your custom configurations to the relevant extended contexts in the configuration matrix
as shown in the following image:


Upgrading the Customized Solution


When upgrading to a new version of the solution and if you have changed an existing solution
configuration, you must perform a comparison of your modified configuration with the new version
of the solution configuration and merge the two configurations. Merging can either be done through
the D2-Config tool after importing the new version or within the configuration export XML files prior
to importing them. However, this is an activity that requires a developer or consultant with extensive
knowledge of D2, the base Documentum for Life Sciences solution, and the customer modifications.


Reconciling Extended Base Configurations with Solution Upgrades
D2-Config provides the Create from capability for creating a new configuration from an existing
configuration. In D2, Create from is enhanced so that a reference to the original base configuration
item is maintained in the parents_config attribute of the new custom configuration.
When a new version of the base configuration is imported during a solution upgrade, D2-Config
reports warnings for any custom configurations in the new version that differs from the previous
version. In this way, D2 identifies which of your custom configurations needs to be reconciled with
the new base solution configuration.
For example, assume you customize the Procedure Properties configuration by using Create from to
first create a copy of this property page and then modify the copy. You name the custom property
page KB Procedure Properties. D2-Config will report the following warning when importing the new
version of the solution configurations:

Reconciling the changes between the new version of the base configuration (in this example,
Procedure Properties) and the extension (in this example, KB Procedure Properties) is a manual
process that involves either:
• Using a diff and merge tool on the two versions of the XML file and then importing the updated
XML file, or
• Using D2-Config to extend the upgraded version of the configuration and reapplying your
changes.

Merging D2 Configuration XML Files


The format of D2 configuration files exported to the file system is enhanced in D2 to better support
team development of D2-based applications and comparison between two versions of a configuration
file. D2 supports the following features related to exported configuration files:
• Granular files: The structure of the exported configurations is completely granular. There is a
one-to-one correspondence between a configuration item and an XML file.
• Consistent ordering of XML attributes and elements: The XML elements and attributes within
the XML files are ordered consistently and in a more readable format to facilitate comparison
between multiple versions of the file.
• Un-encoded filenames: Filenames in D2 configuration export packages are not hex-encoded. The
filename reflects the name of the configuration item.
These improvements to the D2 configuration export packages greatly improve the team development
experience on D2 and are required for enabling and simplifying the comparison between two versions
of a configuration export package. Comparing export packages is a necessary step in the process of
upgrading a solution and migrating customer extensions to the upgraded version.


The following image shows the comparison of two versions of a creation profile using a diff and
merge tool:

Notice that the changes are highlighted. In this case, the version of the file on the left allows for
creation of two additional document types. There are also some changes in the properties of the
creation profile highlighted. After you have verified the changes, you can merge the changes using
the tool.
Note: Between the D2 4.5 and 4.6 releases, the D2 configuration XML has undergone some changes. For
instance, in D2 4.5 the display configuration file is named d2_attribute_mappings.xml, but in 4.6 it is just .xml.
Note: In some sections, there is a separation between the configuration definition and its settings.
This can be seen as a “contents” folder under the main folder. While there may be no changes in the
definition file, there may be changes in the content file. It is recommended that you check both files.

Best Practices for Team Development with D2


You should manage the source files for all of your custom components in a source code control
system that allows the versioning of files, including D2 configuration files. Multiple developers
can modify configurations in a shared Documentum repository or they can modify them in their
own local repository that is not shared. In either case, each developer should export their modified
configurations, unzip the exported configuration package, and check the individual XML files into
a source code control system. Within the source code control system maintain the same directory
hierarchy in which D2 exports files.
Building D2 configurations can be performed by retrieving the latest configuration directory from
your source code control system and rezipping them. A standard archive tool like Winzip can be
used to rezip the files.


Extending Roles
If you require custom roles to be created in the Life Sciences solution, you can do so by extending the
predefined roles that are installed with the application. Extending the roles involves making property
page changes and workflow configuration changes through D2-Config.

Creating Custom Roles


1. Log in to Documentum Administrator.
2. Create a new custom role.
3. Assign the users and groups to the new role.
4. Add the custom role to the existing role. For example, to create a new role for authors,
create a new_authors custom role. Then, add this role as a member of the existing
cd_<domain>_doc_authors role.
The Documentum Administrator User Guide provides the steps of creating custom roles.
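If you prefer to script the group creation instead of using the Documentum Administrator user interface, the equivalent DQL is roughly as follows. This is a hedged sketch; new_authors, the user names, and the target role are examples only, and your policies may still favor Documentum Administrator so that the group is created with the appropriate group class and owner:

  CREATE GROUP new_authors
  ALTER GROUP new_authors ADD jsmith, mjones
  ALTER GROUP cd_quality_doc_authors ADD new_authors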

Configuring the Property Pages


The Property pages in the Life Sciences solution query and display the default roles. You must modify
these queries so that the custom roles are used. For example, in the Quality Documents property
page, the Process Info tab displays the users in different roles. If you want to display a custom role
for authors, you must change the query by replacing cd_quality_doc_authors with the custom role.
1. Log in to D2-Config.
2. Select Go to > Property page.
3. Under Property pages, select the property page for the respective domain.
4. Under Structure, expand the Process Info folder.
5. Expand the role folder for which a custom role is created.
6. On the right pane, in the DQL Query field, replace the old role in the query with the new role.
7. Click Save.
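
The exact query text differs from property page to property page; the following is only an illustrative sketch of the kind of change involved, not the shipped configuration. Before the change, the query resolves the members of the default role:

   SELECT user_name FROM dm_user
   WHERE user_name IN (SELECT users_names FROM dm_group WHERE group_name = 'cd_quality_doc_authors')

After the change, it resolves the members of the custom role:

   SELECT user_name FROM dm_user
   WHERE user_name IN (SELECT users_names FROM dm_group WHERE group_name = 'new_authors')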

Configuring Workflows
When a workflow is started, the system specifies the performers for each state of the workflow. By
default, users who are members of the default roles are assigned as performers to the workflow.
However, if the default roles are extended with custom roles, the query that assigns the performers
must be modified.
1. Log in to D2-Config.
2. Select Go to > Workflow.


3. Under Workflow List, select a workflow.


4. Under Participant's Structure, select the default role.
5. In the DQL query for the role, change the default role to the custom role.
6. Click Save.

Configuring Roles (LSQM)


By default, Documentum for Quality and Manufacturing provides a list of site-based roles as defined
in the Applicable Sites dictionary. Follow these steps to configure your own sites and user roles
according to your environment and requirements.

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select GMP Applicable Sites.
5. On the Languages tab, you can add or remove the required sites.
6. On the Alias tab, under each role, define the user group for the respective site. For
example, for the Boston site, under authors, type cd_boston_authors.
7. Click Save.
All Documentum for Quality and Manufacturing groups follow the naming pattern cd_<site>_<role>. Custom roles
should follow the same pattern because the queries in the system are built to look for that structure.
This also applies to custom LSRD, LSSSV, and LSTMF roles, which use the pattern cd_<domain>_doc_<role>. The
queries expect that structure as well, although they typically take the group_name
(for example, cd_clinical, cd_quality, and so on).
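
As a quick sanity check when defining site aliases, you can confirm that the expected groups exist and list their members with DQL. For example, for the Boston site used above:

   SELECT group_name FROM dm_group WHERE group_name LIKE 'cd_boston_%'
   SELECT users_names FROM dm_group WHERE group_name = 'cd_boston_authors'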

Chapter 3
Life Sciences Data Model

This section provides an overview of the object data model used in the Life Sciences solutions.

Data Model Overview


The Life Sciences data model serves two functions:
• It defines attributes that support all of the features provided by the Life Sciences solutions
such as controlled document management, TMF planning and tracking, regulatory application
management, and so on.
• It implements, as a Documentum object model, the taxonomy and metadata described in the
DIA Reference Model for GMP Quality Systems Artifacts 1.0 for Documentum for Quality and
Manufacturing, DIA EDM reference model for the inventory in Documentum for Research and
Development, and the DIA TMF reference model 2.0 and 3.0 for Documentum for eTMF. Users in
the appropriate roles can create any document artifact defined in these standard reference models.
The following diagram depicts the Life Sciences object type hierarchy:


Two base types, cd_controlled_document and cd_common_ref_model, define metadata that pertains
to all subtypes. Users do not directly create instances of these two types.
Types with the _info suffix represent registration forms. Registration forms are created and managed
by the Business Administrators in each respective domain, that is, Product Managers, Clinical Trial
Managers, Non-Clinical Study Managers, Project Managers, and Regulatory Managers.
Users in each respective domain in the appropriate roles can create instances of the types listed
in the following table.

Table 3. Object Types for which Users can Create Documents

Type | Domain | Created By | Solution(s)
cd_app_reg_labeling_doc | Regulatory/Administrative | Labeling Authors | LSRD
cd_change_request | Change Request | GMP Coordinators/Authors | LSQM
cd_clinical | Clinical | Clinical Authors | LSRD
cd_clinical_tmf_doc | Clinical TMF | Clinical Authors and Contributors | LSTMF
cd_core_reg_labeling_doc | Regulatory/Administrative | Labeling Authors | LSRD
cd_general | General | Authors from all domains | LSRD
cd_non_clinical | Non-Clinical | Non-Clinical Authors | LSRD
cd_quality | Quality | Quality Authors | LSRD
cd_quality_gmp [1] | GMP | All GMP Authors | LSQM
cd_reg_admin | Regulatory/Administrative | Regulatory/Administrative Authors | LSRD
cd_reg_ad_promo | Regulatory/Administrative | Ad Promo Authors | LSRD
cd_reg_correspondence | Regulatory Correspondence | Correspondence Authors | LSSSV
cd_safety | Safety | Safety Authors | LSRD

[1] — Users can create documents for the subtypes under this object type, that is,
cd_quality_gmp_effective and cd_quality_gmp_approved.
You can extend the data model in the following ways:
• Add or modify attributes in existing solution types
• Create new types that extend from the existing solution types

Add or Modify Attributes in Existing Solution Types
If an attribute that represents metadata required for your organization’s documents is not available in
the standard data model, you can alter the existing types by following these recommendations:
• Use your own naming convention for custom attributes so that you can easily identify
them. For example, if KB Pharmaceuticals requires additional attributes on the
cd_quality_manufacturing_sop documents for equipment type, physical location, and process
step, name these attributes kb_equipment_type, kb_physical_location, and kb_process_step
respectively (see the DQL sketch after this list).
• Do not add or move a standard attribute to a higher level in the type hierarchy. If an existing
attribute is required on additional object types, add a new attribute to those object types instead.
Moving a standard attribute from a lower-level object type to a higher-level type can cause
installation errors during future upgrades of the base solution.
• You can increase the length of string attributes if needed. When upgrading, the longer length is
retained. However, you cannot change an attribute from repeating to single or from single to
repeating.
• After adding new attributes, you must extend the relevant property page configurations and any
other D2 configurations in which you want to use the new attributes. For example, you may
choose to define new auto-naming or auto-linking configurations that use the new attributes.
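
Continuing the hypothetical KB Pharmaceuticals example from the first recommendation, the new attributes could be added with DQL along the following lines. The data types and lengths shown are assumptions, not prescribed values:

   ALTER TYPE cd_quality_manufacturing_sop ADD kb_equipment_type string(64) PUBLISH
   ALTER TYPE cd_quality_manufacturing_sop ADD kb_physical_location string(128) PUBLISH
   ALTER TYPE cd_quality_manufacturing_sop ADD kb_process_step string(64) PUBLISH

Lengthening an existing string attribute follows the same pattern with MODIFY, for example, ALTER TYPE cd_quality_manufacturing_sop MODIFY kb_physical_location string(255) PUBLISH.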


Create New Types that Extend from Existing Solution Types
If there is no standard object type that represents your documents, you can create a new object
type that extends from an existing solution type (a DQL sketch follows the recommendations below).
• Use your organization’s naming convention for the type name. Do not use the cd_ prefixes.
• After adding new types, you must add D2 configurations that enable users to create and manage
instances of the new type. Typically, the following D2 configurations are needed to support a
new object type:
— Dictionary used in a creation profile
— Default Values template used in a creation profile
— Property page
— Creation profile
— Auto-naming
— Auto-linking
— Access Control List (ACL)/security
• You may also need to create content templates for the new object type.
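
The following is a minimal DQL sketch of such an extension, assuming a hypothetical kb_batch_record type derived from the cd_quality solution type; the type and attribute names are examples only:

   CREATE TYPE kb_batch_record
     (kb_batch_number string(32),
      kb_production_line string(64))
   WITH SUPERTYPE cd_quality PUBLISH

After the type exists, the D2 configurations listed above (creation profile, property page, auto-naming, and so on) still have to be created before users can work with instances of the new type.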

Caution: Extending the object model with additional levels can affect the performance of the
system.

Chapter 4
Reference Models

A document management reference model defines the following:


• The kinds of documents that are managed by the document management system
• The metadata that is used to describe each kind of document
• An organizational structure for documents
The DIA EDM and TMF reference models are industry standard models developed by a consortium
of Life Sciences industry and vendor experts. The DIA EDM and TMF reference models provide an
excellent starting point for implementing a Life Sciences document management system because they
are based on industry experience and best practices. The Documentum for Life Sciences solutions
include a complete implementation of the DIA EDM reference model. The Documentum for eTMF
solution includes a complete implementation of the TMF reference model as well. After installing
a solution, you can immediately create and manage any of the documents (referred to as artifacts)
defined by the reference models.
The reference models are implemented using D2 configurations. Dictionaries, property pages, auto
link, and auto naming configurations for all documents in the DIA EDM Regulatory/Administrative,
Quality, Non-Clinical, Clinical, and so on domains are provided in all solutions. The LSTMF solution
provides D2 configurations for creating and managing all document artifacts defined in the DIA
TMF reference model. The Documentum for Quality and Manufacturing solution provides D2
configurations for creating and managing document artifacts related to the quality and manufacture
of the marketed product. The configurations have been derived from industry best practice and are
closely aligned with the DIA reference model that is currently in development for LSQM documents.
The Documentum for Research and Development solution provides D2 configurations for creating
and managing document artifacts related to the research and development of a product.
You can use the standard solution implementation of these models out-of-the-box. You can also
extend the standard implementation to support additional kinds of documents and metadata. Or
you can use the standard solution implementation as a pattern for implementing a completely
different model.

Standard Solution Implementation of the DIA EDM Reference Model
The DIA EDM reference model is defined in an Excel spreadsheet that is available on the DIA website.
The spreadsheet contains tabs for each of the four document domains: Regulatory/Administrative,
Quality, Non-Clinical, and Clinical. Each tab lists the document artifacts for that domain and the
metadata for the documents. The documents are categorized according to group and subgroup. The
following image is a small snippet of the DIA EDM reference model spreadsheet:

The Artifact Name column lists the individual document artifacts for the given domain. In the
snippet, artifacts for the Quality domain are shown. The Group and Sub-Group columns show how
each artifact should be categorized. The columns to the right of Artifact Name list the metadata
for documents in the given domain. An “M” in the cell indicates the attribute is mandatory for
the associated artifact.
This model is implemented in the Life Sciences solution using D2 dictionaries and taxonomies. D2
property pages enforce mandatory attributes as defined in the model.
For the purpose of explanation, the Quality Domain dictionaries are used as an example. The
complete set of D2 dictionaries that list the groups, subgroups, and artifacts defined in the DIA
EDM and TMF model are described in Appendix C. The Quality Domain dictionaries used in the
implementation of the DIA EDM reference model are listed in the table below. Notice that these
dictionaries map to the columns of the DIA EDM reference model spreadsheet.


Table 4. EDM reference model Quality domain dictionaries

Dictionary Name | Description
Quality Classification by Group | List of values with aliases as Group, Subgroup, Artifact Names, and Workflow mappings for Category 2 documents.
Quality Classification by eCTD Outline Node | List of values with aliases as Group, Subgroup, Artifact Names, and eCTD outline node mappings.
Quality Adventitious Agents Artifact Names | List of artifacts in the Adventitious Agents group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Control of Excipients Artifact Names | List of Excipients artifacts.
Quality Drug Product Artifact Names | List of artifacts in the Drug Product group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Drug Substance Artifacts Names | List of artifacts in the Drug Substance group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Information Artifact Names | List of artifacts in the Quality Information group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Literature Reference Artifact Names | List of artifacts in the Literature References group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Medical Device Artifact Names | List of artifacts in the Medical Device group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Project Management Artifacts | List of artifacts in the Quality Project Management group.
Quality Overall Summary | List of artifacts in the Quality Overall Summary group on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Overall Summary Drug Product Artifact Names | List of artifacts in the Quality Overall Summary group/Drug Product subgroup on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Overall Summary Drug Substance Artifact Names | List of artifacts in the Quality Overall Summary group/Drug Substance subgroup on the Quality tab of the DIA EDM reference model spreadsheet.
Quality Regional Information Artifact Names | List of artifacts in the Quality Regional Information group on the Quality tab of the DIA EDM reference model spreadsheet.


These dictionaries are used in D2 creation profile configurations to enable users to create specific
document artifacts. In the following creation profile image, the Quality Drug Product Artifact Names
dictionary is used in a creation profile to enable users to create Quality – Drug Product documents:

D2 dictionary configurations implement the Group/Subgroup/Artifact organizational structure
defined by the DIA EDM reference model.

Standard Solution Implementation of the DIA TMF Reference Model (LSTMF)
The DIA TMF reference model is a standard that describes the documents that might be included in
a Trial Master File. Some of the artifacts defined in the Clinical domain of the DIA EDM reference
model overlap with those defined in the DIA TMF reference model. This information is defined
for the Documentum for eTMF solution in the D2 configuration in the form of a single dictionary
configuration for each DIA TMF reference model, instead of multiple dictionaries and taxonomies.
Defining the artifact names and its corresponding information including category, workflow
mappings, and so on, in a single dictionary configuration makes it easy for administrators to manage
changes to existing artifacts and add new artifacts whenever required.
Note: You must not modify the reference model in the spreadsheets (file plan template) as it will
always be overwritten by the value in the object.
The following table lists the dictionaries that map to the columns of the DIA TMF reference model.


Table 5. TMF dictionaries

Dictionary Name | Description
TMF Models | Name and version of the TMF reference model that is supported, as listed in the DIA TMF reference model spreadsheet.
TMF Unique Artifact Names 2.0 | List of values in the Unique Artifacts name, TMF Zone, TMF Section, and TMF Artifact Number columns of the DIA TMF Reference Model 2.0 spreadsheet. Additional columns that reduce the dependency on other dictionaries and creation profiles include Category, Is Crossover, Object Type, Initial Version, Workflow Mappings, and so on.
TMF Unique Artifact Names 3.0 | List of values in the Unique Artifacts name, TMF Zone, TMF Section, and TMF Artifact Number columns of the DIA TMF Reference Model 3.0 spreadsheet. Additional columns that reduce the dependency on other dictionaries and creation profiles include Category, Is Crossover, Object Type, Initial Version, Workflow Mappings, and so on.
All the information about the artifact is consolidated into a single dictionary per reference model. A
separate TMF Models dictionary identifies the available models and the associated dictionary for the
model as an alias as shown in the following image:

Each value mapped to the Dictionary alias for the associated key in the TMF Models dictionary has a
dictionary. For example, the DIA TMF 2.0 key is associated to the TMF Unique Artifact Names 2.0
dictionary. In the TMF Unique Artifact Names 2.0 dictionary, information about an artifact is stored
as an alias (as columns) as shown in the following image:


When you create placeholders and TMF documents, the creation logic uses the respective dictionary,
which is determined by the reference model of the trial, and populates the relevant artifact
information in the property pages. A default value template, TMF Placeholder Defaults, is defined
to apply the default values for all the placeholders and documents that are created. Anything that
changes from artifact to artifact is defined as an alias in the dictionary and a lookup expression is
used in the default values template to determine the values from the dictionary.

Standard Solution Implementation of Quality and Manufacturing Document Models (LSQM)
Life Sciences solutions enable the creation and management of other document types besides the
standard DIA reference model artifacts that have been established for LSRD and LSTMF. The
following D2 dictionaries and taxonomies are used for documents in the LSQM document models:
• GMP Categories
• GMP Groups
• GMP Subgroups
• GMP Artifacts
The following figure shows the GMP Artifacts taxonomy in D2-Config:


The following table lists the taxonomy properties for the D2 dictionaries in D2-Config.

Table 6. Taxonomy

Taxonomy Name | Description | Used Dictionaries
GMP Artifact | Hierarchy and classification for all LSQM documents. | GMP Categories, GMP Groups, GMP Subgroups, GMP Artifacts


Standard Solution Implementation of Submission Store and View Document Models (LSSSV)
Life Sciences solutions enable the creation and management of other document types such as the
Correspondence documents in Documentum for Submission Store and View.

Correspondence Domain
The Correspondence domain had the following dictionaries and taxonomies in the previous design:

Table 7. Dictionaries in the Correspondence Domain

Dictionary Name | Description
Regulatory Correspondence Artifact Names | List of the types of documents in the Correspondence domain.
Regulatory Correspondence Groups | List of logical groupings or CTD modules in the Correspondence domain.

Table 8. Taxonomies in the Correspondence Domain

Taxonomy Name | Description
Regulatory Correspondence Classification by Group | List of Correspondence artifacts based on their group and subgroup mappings and the workflow mappings for each artifact.
Apart from the preceding configurations, the category of each artifact is configured in the following
creation profiles:
• Regulatory Correspondence Documents
• Regulatory Correspondence Email
• Regulatory Correspondence Telephone
• Regulatory Correspondence Template Management Artifacts
• Regulatory Correspondence Attachments
For each artifact, there is a default values template configured in which the category is set to either 2
or 3 based on the artifact. Therefore, if there was any change in the artifact's category, the creation
profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• The preceding taxonomies have been replaced with a new dictionary, Regulatory Correspondence
Classification by Groups and Artifacts. This dictionary holds a key with the syntax
<primary_group>|<artifact_name>. It contains the list of Correspondence document types based on
their group and artifact mappings, the workflow mappings for each artifact, and the property page
acronyms for each artifact.
• The preceding creation profiles are now replaced with the following creation profiles (with
the exception of Regulatory Correspondence Template Management Artifacts, Regulatory
Correspondence Attachments):
— Regulatory Correspondence Documents - Agency
— Regulatory Correspondence Documents - General Non Product
— Regulatory Correspondence Documents - Internal
— Regulatory Correspondence Documents - Submission Receipt Confirmation
— Regulatory Correspondence Email - Agency
— Regulatory Correspondence Email - General Non Product
— Regulatory Correspondence Email - Internal
— Regulatory Correspondence Telephone - Agency Contact Record
— Regulatory Correspondence Telephone - General Non Product
— Regulatory Correspondence Template Management Artifacts
— Regulatory Correspondence Attachments
These creation profiles are configured with a single document type, New Document. Users must
select document properties including artifact_name in the Properties page. The data in the
Properties page is sourced from the single dictionary, Regulatory Correspondence Classification
by Groups and Artifacts. All the artifacts within the Correspondence domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
• The dictionary contains the property page acronym for the respective property pages to be used
for that artifact. This acronym is used in the DQL query of the related property page to list the
artifacts.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Regulatory Correspondence Classification by Groups and
Artifacts dictionary instead of managing multiple configurations.

Implementation of the DIA EDM Reference Model for Research and Development (LSRD)
In the previous implementation of the DIA EDM reference model for the LSRD domains
(Regulatory/Administrative, Quality, Clinical, Non-clinical, Safety, Labeling, and Ad-Promo), the
inventory-related data was scattered across multiple sources, such as the Group and Subgroup
dictionaries and taxonomies. Although this design organized the data well, any change to the
existing data model, such as an addition at the Group, Subgroup, or Artifact level or a workflow
mapping change for an existing artifact, required a good understanding of all of those configuration
objects. Updates had to be made to more than one configuration, which made changing the data
model cumbersome.
To centralize the definitions of groups, subgroups, and artifacts within the system and simplify
changes to the data model, a new implementation of the reference model is used for each domain.
The objective is to reduce the number of configuration elements and capture the data in a single
dictionary rather than scattering it across multiple configurations.

Regulatory Domain
The Regulatory domain had the following dictionaries and taxonomies in the previous design:

Table 9. Dictionaries in the Regulatory Domain

Dictionary Name | Description
Regulatory-Admin Artifact | List of the types of documents in the Regulatory/Administrative domain.
Regulatory-Admin Groups | List of logical groupings or CTD modules in the Regulatory/Administrative domain.
Regulatory-Admin Subgroups | Within the Regulatory/Administrative domain, list of categories of documents within a group, usually based on CTD module subsets.

Table 10. Taxonomies in the Regulatory Domain

Taxonomy Name | Description
Regulatory-Admin Classification by Groups | List of artifacts based on their group and subgroup mappings and the workflow mappings for each artifact.
Apart from the preceding configurations, the category of each artifact is configured in the creation
profile, Regulatory Documents by EDM Artifact Name. For each artifact, there is a default values
template configured in which the category is set to either 2 or 3 based on the artifact. Therefore, if
there was any change in the artifact's category, the creation profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary,
Regulatory-Admin Artifact. This dictionary holds a key with a syntax <artifact_name>|<region>.
There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Region, and so on.
• The creation profile, Regulatory Documents by EDM Artifact Name, is configured with a single
document type, New Document. Users must select document properties including artifact_name
in the properties screen. The data in the properties screen is sourced from the single dictionary,
Regulatory-Admin Artifact. All the artifacts within the Regulatory domain are Category 2.

Therefore, a single default values template, Regulatory-Administrative Cat 2 Default Values, is
configured in the creation profile.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Regulatory-Admin Artifact dictionary instead of managing
multiple configurations.

Clinical Domain
The Clinical domain had the following dictionaries and taxonomies in the previous design:

Table 11. Dictionaries in the Clinical Domain

Dictionary Name | Description
Clinical Report Component Artifact Names | List of Report Component documents in the Clinical domain.
Clinical CTA Document Artifacts | List of CTA documents in the Clinical domain.
Clinical Investigator Brochure | List of Investigator Brochure documents in the Clinical domain.
Clinical Literature References Artifact Names | List of Literature References documents in the Clinical domain.
Clinical Artifact Names | List of types of documents in the Clinical domain.
Clinical Groups | List of logical groupings or CTD modules in the Clinical domain.
Clinical Subgroups | Within the Clinical domain, list of categories of documents within a group, usually based on CTD module subsets.

Table 12. Taxonomies in the Clinical Domain

Taxonomy Name | Description
Clinical Classification by Group | Organization of Clinical documents according to group, subgroup, and region as defined by the DIA EDM Reference model.
Clinical Classification for Reg Form | Clinical Groups and Subgroups used when creating registration forms.
Apart from the preceding configurations, the category of each artifact is configured in the following
creation profiles:


Non-Crossover Creation profiles


• Clinical Summaries Creation Profile
• Clinical Literature References Creation Profile
• Clinical CTA Document
• Clinical Creation Profile Non-Study-Specific
• Clinical Creation Profile Study-Specific
• Clinical Investigator Brochure
Crossover Creation profiles
• Clinical - Investigator Brochure Crossover
• Clinical Crossover Creation Profile Non-Study-Specific
• Clinical Crossover Creation Profile Study-Specific
• Clinical Crossover Creation Profile Study-Specific TMF
For each artifact, there is a default values template configured in which the category is set to either 2
or 3 based on the artifact. Therefore, if there was any change in the artifact's category, the creation
profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary, Clinical
Artifact Names Inventory. This dictionary holds a key with a syntax <artifact_name>|<subgroup>.
There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact Name, and so on.
• The preceding creation profiles are configured with a single document type, New Document.
Users must select document properties including artifact_name on the Properties page. The data
on the Properties page is sourced from the single dictionary, Clinical Artifact Names Inventory.
All the artifacts within the Clinical domain are Category 2.
Note: Clinical Literature Reference, Financial Disclosure Form, and Financial Disclosure Summary
documents were Category 3 in the Life Sciences 4.3 release. As part of the new implementation,
these artifacts have been converted to Category 2. However, you can still self-approve these
documents.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Clinical Artifact Names Inventory dictionary instead of
managing multiple configurations.

Non-clinical Domain
The Non-clinical domain had the following dictionaries and taxonomies in the previous design:


Table 13. Dictionaries in the Non-clinical Domain

Dictionary Name | Description
Non-Clinical Artifact Names | List of kinds of documents in the Non-clinical domain.
Non-Clinical Groups | List of logical groupings in the Non-clinical domain.
Non-Clinical Literature References Artifact Names | Dictionary for Non-clinical Literature References.
Non-Clinical Report Component Artifact Names | Artifact names for three groups in the Non-clinical domain: Pharmacology Study Reports, Pharmacokinetics Study Reports, and Toxicology Study Reports.
Non-Clinical Subgroups | Within the Non-clinical domain, categories of documents within a group, usually based on CTD module subsets.
Non-Clinical Summaries Artifact Names | Dictionary for the Non-Clinical Summaries group.

Table 14. Taxonomies in the Non-clinical Domain

Taxonomy Name | Description
Non Clinical Taxonomy for reg form | Organization of Non-clinical documents according to group and subgroup as defined by the DIA EDM Reference model, for the Registration form.
Non-Clinical Classification by Artifact | Organization of Non-clinical documents according to group and subgroup as defined by the DIA EDM Reference model.
Non-Clinical Classification by Group | Organization of Non-Clinical documents according to group and subgroup as defined by the DIA EDM Reference model.
Apart from the preceding configurations, the category of each artifact is configured in the creation
profiles:
• Non-Clinical 3 Groups Study Reports Artifact Names
• Non-Clinical Literature References Documents by EDM
• Non-Clinical Summaries Documents by EDM Artifacts
For each artifact, there is a default value template configured in which the category is set to either 2
or 3 based on the artifact. Therefore, if there was any change in the artifact’s category, the creation
profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary,
Non-Clinical Artifacts. This dictionary holds a key with a syntax <artifact_name>|<subgroup>.


There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact Name, and so on.
• The preceding creation profiles have been modified with a single document type, New Document.
Users must select the document properties including artifact_name in the Properties page when
creating documents. The data in the Properties page is sourced from the single dictionary, Non
Clinical Artifacts. All the artifacts within the Non Clinical Domain are Category 2.
Note: Non-Clinical Literature Reference documents were Category 3 in Life Sciences 4.3 release.
As part of the new implementation, these documents have been converted to Category 2.
However, you can self-approve Non-Clinical Literature Reference documents.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary where it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Non Clinical Artifacts dictionary instead of managing
multiple configurations.

Quality Domain
The Quality domain had the following dictionaries and taxonomies in the previous design:

Table 15. Dictionaries in the Quality Domain

Dictionary Name | Description
Quality Artifact Names | List of Quality artifacts in the Quality domain.
Quality Groups | List of logical groupings or CTD modules in the Quality domain.
Quality Subgroups | Within the Quality domain, list of categories of documents within a group, usually based on CTD module subsets.

Table 16. Taxonomies in the Quality Domain

Taxonomy Name | Description
Quality Classification by Groups | Organization of Quality documents according to group/subgroup/artifact as defined by the DIA EDM Reference model.
Quality Classification by Artifact | Organization of Quality documents according to artifact/subgroup/group as defined by the DIA EDM Reference model.
Apart from the preceding configurations, the category of each artifact is configured in the following
creation profiles:
• Quality:Adventitious Agents Documents by EDM Artifact Name
• Quality:Control of Excipients


• Quality:Drug Product Documents by EDM Artifact Name


• Quality:Drug Substance Documents by EDM Artifact Name
• Quality:Literature References Documents by EDM Artifact Name
• Quality:Medical Device Documents by EDM Artifact Name
• Quality:Overall Summary
• Quality:Overall Summary-Drug Product Documents by EDM Artifact Name
• Quality:Overall Summary-Drug Substance Documents by EDM Artifact Name
• Quality:Quality Information Documents by EDM Artifact Name
• Quality:Regional Information
• Quality:Virtual Documents
For each artifact, there is a default values template configured in which the category is set to either 2
or 3 based on the artifact. Therefore, if there was any change in the artifact's category, the creation
profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary, Quality
Classification by Group. This dictionary holds a key with a syntax <artifact_name>|<group>.
There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact, and so on.
• The preceding creation profiles are configured with a single document type, New Document.
Users must select document properties including artifact_name on the Properties page. The data
on the Properties page is sourced from the single dictionary, Quality Classification by Group. All
the artifacts within the Quality domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary where it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Quality Classification by Group dictionary instead of
managing multiple configurations.

Promotional Materials Domain


The Promotional Materials (Ad-Promo) domain had the following dictionaries and taxonomies in
the previous design.


Table 17. Dictionaries in the Ad-Promo Domain

Dictionary Name | Description
Promotional Materials Artifact Names | List of the types of documents in the Promotional Materials domain.
Regulatory-Admin Groups | List of logical groupings or CTD modules in the Promotional Materials domain.
Regulatory-Admin Subgroups | Within the Promotional Materials domain, list of categories of documents within a group, usually based on CTD module subsets.
eCTD Section | The eCTD sections in the Promotional Materials domain referring to the eCTD submission node.

Table 18. Taxonomies in the Ad-Promo Domain

Taxonomy Name | Description
Promotional Materials Classification by Group | List of artifacts based on their group and subgroup mappings and the workflow mappings for each artifact.
Promotional Materials Classification by eCTD Section | List of artifacts based on the eCTD mappings for each artifact.
Apart from the preceding configurations, the category of each artifact is configured in the creation
profile, Regulatory Ad Promo Documents by EDM Artifact Name. For each artifact, there is a default
values template configured in which the category is set to either 2 or 3 based on the artifact. Therefore,
if there was any change in the artifact's category, the creation profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary,
Promotional Materials Artifacts. This dictionary holds a key with a syntax <artifact_name>.
There is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact Name, and so on.
• The creation profile, Regulatory Ad Promo Documents by EDM Artifact Name, has been
removed and two new creation profiles have been created:
— Regulatory Ad Promo:Correspondence Documents by EDM Artifact Name
— Regulatory Ad Promo:Materials Documents by EDM Artifact Name
These creation profiles have a single document type, New Document. Users must select the
artifact_name, primary_group, and subgroup in the Properties page when creating documents.
The data in the Properties page is sourced from the single dictionary, Promotional Materials
Artifact. All the artifacts within the Promotional Materials domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.


With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Promotional Materials Artifact dictionary instead of
managing multiple configurations.

Labeling Domain
The Regulatory Labeling domain had the following dictionaries and taxonomies in the previous
design:

Table 19. Dictionaries in the Labeling Domain

Dictionary Name | Description
Regulatory Labeling Artifact Names | List of the types of documents in the Labeling domain.
Regulatory Labeling Groups | List of logical groupings or CTD modules in the Labeling domain.
Regulatory Labeling Subgroups | Within the Labeling domain, list of categories of documents within a group, usually based on CTD module subsets.

Table 20. Taxonomies in the Labeling Domain

Taxonomy Name | Description
Regulatory Labeling Classification by Artifact | List of artifacts.
Regulatory Labeling Classification by Group | List of artifacts based on their group and subgroup mappings and the workflow mappings for each artifact.
Apart from the preceding configurations, the category of each artifact is configured in the creation
profile, Regulatory Labeling Documents by EDM Artifact Name. For each artifact, there is a default
values template configured in which the category is set to either 2 or 3 based on the artifact. Therefore,
if there was any change in the artifact's category, the creation profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary,
Regulatory-Labeling Artifacts. This dictionary holds a key with a syntax <artifact_name>. There
is an alias created for each value that varies for each artifact, for example, Group, Subgroup,
Artifact Name, and so on.
• The creation profile, Regulatory Labeling Documents by EDM Artifact Name, has been removed and
two new creation profiles have been created:
— Regulatory Application Labeling Documents
— Regulatory Core Labeling Documents
The creation profiles are configured with a single document type, New Document. Users must
select document properties including artifact_name in the Properties page. The data in the
Properties page is sourced from the single dictionary, Regulatory-Labeling Artifacts. All the
artifacts within the Labeling domain are Category 2.
Note: A few artifacts, such as Artwork, SPL - Non-PLR, and SPL - PLR, were Category 3 in the Life
Sciences 4.3 release. As part of the new implementation, these documents have been converted to
Category 2. However, you can still self-approve these documents.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary wherever it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Regulatory-Labeling Artifacts dictionary instead of managing
multiple configurations.

Safety Domain
The Safety domain had the following dictionaries and taxonomies in the previous design:

Table 21. Dictionaries in the Safety Domain

Dictionary Name | Description
Safety Artifact Names | List of the types of documents in the Safety domain.
Safety Groups | List of logical groupings or CTD modules in the Safety domain.

Table 22. Taxonomies in the Safety Domain

Taxonomy Name | Description
Safety Classification by Artifact | List of Safety artifacts.
Safety Classification by Group | List of Safety artifacts based on their group and subgroup mappings and the workflow mappings for each artifact.
Apart from the preceding configurations, the category of each artifact is configured in the creation
profile, Safety Documents by EDM Artifact Name. For each artifact, there is a default values
template configured in which the category is set to either 2 or 3 based on the artifact. Therefore, if
there was any change in the artifact's category, the creation profile also had to be updated.
New Implementation:
The new implementation includes the following changes:
• All the preceding dictionaries and taxonomies have been replaced with a single dictionary, Safety
Artifact. This dictionary holds a key with a syntax <artifact_name>|<primary_group>.
• The creation profile, Safety Documents by EDM Artifact Name, is configured with a single
document type, New Document. Users must select document properties including artifact_name
in the Properties page. The data in the Properties page is sourced from the single dictionary,
Safety Artifact. All the artifacts within the Safety domain are Category 2.
• The dictionary/taxonomy mappings from the property page configuration have been replaced
with DQL queries to populate data from the new dictionary where it is required.
With this single dictionary implementation, users can make any changes to the data model and
workflow mappings by modifying the Safety Artifact dictionary instead of managing multiple
configurations.

Extending Dictionaries
You can extend a dictionary for any of the LSRD domains by adding a new artifact using the
following procedure:

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select the dictionary for the domain you want to extend.
5. Under Properties, on the Alias tab, click Add alias.
6. In the Key column, type the name of the new artifact.
7. In the corresponding columns, type the values for the new artifact and then click Save.
8. Click Update repository.
You can also modify dictionaries by exporting the values as an Excel spreadsheet, making the
changes, and then importing them back. After importing the spreadsheet, click the Update repository
button to ensure the changes are available.
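
Before renaming or removing an existing key, a quick DQL check can show how widely an artifact is already used; for example, for an artifact in the Clinical domain (illustrative only):

   SELECT count(*) FROM cd_clinical WHERE artifact_name = 'Investigator Brochure'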

Extending the Standard Reference Model Implementation (LSTMF)
Documentum for eTMF maintains its reference model in several configuration components within
the system:
• Dictionaries: Contain information about sets of acceptable values in the reference model.
• File plan and file plan templates: Configured to activate artifacts within studies. Each artifact to
be activated must be contained within the file plan for the study. The file plan template enables you
to add new artifacts to new study file plans and have them imported when study plans are reloaded.


Each of these components must be configured with the new artifacts that are to appear in the system.

Extending Dictionaries
In Documentum for eTMF, you must extend the following dictionaries:
• TMF Models if you want to add a new reference model.
• TMF Unique Artifact Names <version> if you want to add a new artifact.
To extend a dictionary by adding a new artifact, follow the steps in Extending Dictionaries, page 90.

File Plans and File Plan Templates


A file plan is a Microsoft Excel spreadsheet specifying the expected artifacts in each filing area.
The file plan specifies whether artifacts are required (must-have), recommended (should-have), or
optional (could-have) documents. In Documentum for eTMF, the File Plan Sample spreadsheet is
used as the base when file plan templates are created.
Each new artifact needs to be placed in the file plan to be activated. If the artifact is to be included
in every study, you can place it in the master template. Alternatively, it can be placed in a specific
template or file plan like any artifact within the system. The Documentum Electronic Trial Master File
User Guide provides more information about file plans.

Implementing a Custom Reference Model (LSTMF)
To implement a custom reference model in Documentum for eTMF, follow these steps:
1. In the TMF Models dictionary, add a new key for the custom reference model. Follow the steps
provided in Extending Dictionaries, page 90 to add a new artifact.
2. Select the existing dictionary, for example, TMF Unique Artifact Names 2.0, and click Create from.
3. In the Name field, type a name for the custom dictionary and then click Save.
4. Click Export to export the dictionary into an Excel spreadsheet.
5. Open the Excel sheet, delete the existing content and add the new artifact values.
6. After you finish updating the dictionary Excel sheet, in D2-Config, click Import.
7. Select the updated dictionary Excel sheet and click Open.


8. Click Save and then click Update repository.


9. Restart the Java Method Server.

Chapter 5
Document Creation Profile

The Life Sciences solutions provide several D2 creation profiles to enable users to create and import
documents and registration forms. There are one or more creation profiles for each LSRD, LSTMF,
and LSSSV domain. There is no manager role in LSQM as registration forms are not currently used
in this solution.

Management Creation Profiles


Management creation profiles enable users in a manager role to create registration forms and
other management-related documents, such as content templates and Trial Master File (TMF) bulk
import/export packages, in the manager's domain.

Table 23. Management Creation Profiles

Creation Profile | Available to Role
Clinical Trial Management Artifacts | cd_clinical_managers
Clinical Template Management Artifacts | cd_clinical_template_authors
Medical Device Management Artifacts | cd_md_managers
Medical Device Clinical Content Management Artifacts | cd_md_clinical_template_authors
Medical Device Non Clinical Content Management Artifacts | cd_md_non_clinical_template_authors
Medical Device Regulatory-Admin Management Artifacts | cd_md_regulatory_managers
Medical Device Regulatory Content Management Artifacts | cd_md_regulatory_template_authors
Non-Clinical Study Management Artifacts | cd_non_clinical_managers
Non-Clinical Template Management Artifacts | cd_non_clinical_template_authors
Product Management Artifacts | cd_product_managers
Quality Project Management Artifacts | cd_quality_managers
Quality Template Management Artifacts | cd_quality_template_authors
Regulatory-Admin Management Artifacts | cd_regulatory_managers
Regulatory Correspondence Template Management Artifacts | cd_corres_template_authors
Regulatory Template Management Artifacts | cd_regulatory_template_authors
Safety Template Management Artifacts | cd_safety_template_authors
TMF Clinical Trial Management Artifacts | cd_clinical_managers
TMF Clinical Template Management Artifacts | cd_clinical_tem_authors_tmf
GMP Template Management Artifacts | cd_gmp_template_authors
Med Device Template Management Artifacts | cd_med_device_template_authors

Document Creation Profiles


Multiple document creation profiles, for each document domain, enable end users to create or import
documents for the applicable domain. The DIA EDM reference model categorizes documents
according to the taxonomy, domain/group/subgroup/artifact.
For the creation profiles relevant to the DIA EDM reference model domains: Regulatory/Admin,
Quality, Non-Clinical, and Clinical, there is a creation profile for each DIA EDM group within the
applicable domain.

Note: For the Regulatory/Admin domain, only one artifact is available in the creation profile, that is,
New Document. Group, Subgroup, and Artifact information is captured on the Edit properties screen.


For Documentum for Quality and Manufacturing, within each creation profile, the individual
document types map to artifacts. In the case of Documentum for Research and Development,
Documentum Submission Store and View, and Documentum for eTMF, the creation profiles are
mapped to a single document type, New Document. Users must select the document properties
including artifact_name in the Properties page when creating documents. The data in the Properties
page is sourced from the single dictionary for that domain.
Each creation profile has an associated D2 dictionary that defines the document types (that is,
artifacts) for the relevant domain and group. For Quality and Manufacturing domains, the individual
document types within each creation profile map to groups rather than artifacts.

Control Category Definition


The control category for each artifact is defined within the creation profile through the assigned
Default Values template and lifecycle.

The Default Values template and lifecycle must always refer to the same control category number.
In the example, the Directive artifact in the LSQM Governance and Procedures domain will be
created as a Control Category 1 document. The assigned Default Values template ensures that the
category attribute is assigned the value 1. The document then matches contexts with the condition,
category=’1’. This ensures the correct Control Category 1 configurations are applied to the
document at runtime.

Default Values Template


A pertinent Default Values template is assigned to each document type within the creation profiles.
The following figure is an example of a Default Values template, which is assigned to Control
Category 2 Quality documents:


The Life Sciences solutions depend on correct default values to function correctly. The following table
lists attributes that must have default values when a document is created.

Table 24. Required default values

Attribute | Default Value | Runtime Use
authors | User creating the document | Used in the security configuration to ensure that the user who created the document continues to have WRITE access to the Draft document.
category | Control Category 1, 2, 3, or 4 | Determines the lifecycle and workflows available for the document.
domain | Document domain (for example, Quality, Clinical, Non-Clinical, Regulatory) | Used along with artifact_name to look up applicable content templates.
group_name | Group prefix corresponding to the domain (for example, cd_quality, cd_clinical, cd_non_clinical, and so on) | Used in value assistance queries for role attributes on the document and for potential workflow performers. This is not relevant to LSQM.
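
A quick way to spot-check that these defaults were applied to newly created documents is a DQL query along the following lines, shown here for the Quality domain type; substitute the type for your domain:

   SELECT object_name, category, domain, group_name, authors
   FROM cd_quality WHERE r_creation_date > DATE(TODAY)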

Customizing Creation Profiles


You can customize creation profiles in the following ways:
• Create a new creation profile.
• Change creation profile properties.
• Change creation properties for a group within an existing creation profile.
• Remove artifacts from an existing creation profile.
• Remove or disable an existing creation profile.
Prior to adding additional artifacts to a new or existing creation profile, you must create any of the
following configurations that will be referenced in the creation profile:
• Object types
• Dictionaries or added dictionary values
• Property pages
• Inheritance configurations
• Default values templates
• Lifecycles

Creating a New Creation Profile


This procedure is applicable to Documentum for Quality and Manufacturing. For Documentum
for Research and Development, Documentum for eTMF, and Documentum Submission Store and
View, a new creation profile is created per group that is present in the single dictionary defined
for each domain.
1. In D2-Config, click Creation > Creation profile.
2. Click New.


3. Under Properties, enter the profile details, such as name, description, list of applications, and
dictionary properties.
4. Click Save.

Changing Creation Profile Properties


1. In D2-Config, click Creation > Creation profile.
2. Under Profiles, select the creation profile you want to update.
3. Under Properties, change the relevant properties.
4. Click Save.

Changing Creation Properties for an Artifact within an Existing Creation Profile
This procedure is applicable to Documentum for Quality and Manufacturing only.
1. In D2-Config, click Creation > Creation profile.
2. Under Profiles, select the creation profile you want to update.
3. Add your custom application name to the Applications list.
4. In the Dictionary table, update the rows for a particular artifact.
5. Click Save.

Adding Artifacts to an Existing Creation Profile


This procedure is applicable to Documentum for Quality and Manufacturing only.
1. In D2-Config, click Creation > Creation profile.
2. Under Profiles, select the creation profile you want to modify and make a note of the dictionary
that is used to define the documents that can be created.
3. Edit the dictionary. Add the names and labels of your new artifacts to this dictionary.
4. Reopen the creation profile.
5. Add your custom application name to the Applications list.


6. In the table, create a new row for each new artifact and select the following values in their
respective columns for each new row:
a. First column: Select a dictionary item from those you added in step 3.
b. Type column: Select the object type for the new document that will be created when a user
selects this item in the creation profile. If you created your own custom type, you can
specify the custom type.
c. O2 config column: Leave blank.
d. Property pages column: Select the property page configuration that should be displayed
during creation or import for this item.
e. Version column: Typically, you should select 0.1. This value ensures that the first effective
version of the document is version 1.0, while all draft or pre-effective versions are 0.x.
f. Inheritance column: Select an appropriate D2 inheritance configuration. If you created your
own inheritance definition, select the custom inheritance definition.
g. Default Values Template column: Select a Default Values template that corresponds to the
specific artifact, its domain, and/or its group and the desired control category.
h. Lifecycle column: Select the lifecycle corresponding to the desired control category for the
document. This must correspond to the Default Values template. If you created your own
lifecycle definition, select the custom lifecycle definition.
i. Workflow column: Leave blank.
7. Click Save.

Removing Artifacts from an Existing Creation Profile


This procedure is applicable to Documentum for Quality and Manufacturing only.
1. In D2-Config, click Creation > Creation profile.
2. Add your custom application name to the Applications list.
3. Remove the unwanted rows from the table at the bottom of the page.
4. Click Save.

Removing or Disabling an Existing Creation Profile


1. In D2-Config, click Creation > Creation profile.
2. Under Profiles, select the creation profile that you want to disable.
3. Under Properties, in the Users group field, set the value to admingroup. This ensures that no
business users will have access to the creation profile.
4. Click Save.


File Naming and Versioning


The document name is automatically assigned based on the naming schema: <artifact name> -
<document ID>. For Documentum for Quality and Manufacturing, the naming schema is <group
name prefix> - <document ID>.
In this schema, the artifact name can be up to 230 characters and the document ID is a 9-digit incremental number.
The naming rules ensure that documents are named consistently. For example, when the artifact
name is Cover Letter and the document ID is 000000001, the name displayed in the D2 Client
workspace is: Cover Letter - 000000001

Caution: The system generates the document_id from a 9-digit counter, which allows up to
999,999,999 unique document IDs. In D2-Config, the size of the counter can be increased by editing
the _Document_ID auto naming configuration. However, the document_id attribute in the
cd_controlled_doc object type is a ten-character string. If the number of digits is increased beyond
10, the cd_controlled_doc object type must also be changed.
If the counter value on the _Document_ID auto naming configuration is reset after creating
documents, subsequent document IDs might not be unique. Consequently, you should not reset
this counter value in a production repository.
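
As a precaution before adjusting or troubleshooting the counter, you can check the highest document
ID that has already been assigned. The following DQL query is a minimal sketch that assumes the
standard cd_controlled_doc object type and document_id attribute described above; because the IDs
are zero-padded, the lexical maximum also corresponds to the numeric maximum:
select max(document_id) from cd_controlled_doc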

If two autonaming configurations have the same prefix, the system does not validate the
configurations for existing documents. This can result in duplicate document names in the system.
For example:
• QUA - Quality
• QUA - Qualitative Use Assessment
You can avoid this by defining a different prefix for the document in the respective Prefixes dictionary
in D2-Config.
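
To check an existing repository for duplicate document names, for example, names that may have
resulted from overlapping prefixes, a query along the following lines can be run. This is a sketch that
assumes the standard cd_controlled_doc object type; adjust the type name to narrow the check to a
particular domain:
select object_name, count(*) from cd_controlled_doc group by object_name having count(*) > 1
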
Document versions are minor (v 0.1) until they are made effective. Documents in an
Effective/Approved/Final state have major version numbers (v 1.0).

O2 Configuration for PDF and Native Annotations


If you use PDF annotations in your documents, you must avoid using O2 trigger events other than
the Checkout/Checkin and Save Properties events. This is because O2 events update the content
and, if PDF annotations exist on the document, these updates cause the annotations to be deleted
prior to the author being able to view them. In addition, when a user updates the properties of a
document, the updated properties will not be in sync with the content of the object until the user
checks out the document and checks it back in.
By default, the Life Sciences solution supports only the Checkout/Checkin and Save Properties
O2 trigger events. Other O2 trigger events will only work with native annotations and not PDF
annotations. For example, updating the last periodic review date on an Approved document or


updating the Product Code that then cascades down to the documents. The Documentum D2
Installation Guide provides information about O2 configurations.

C2 View Configurations for PDF


The Life Sciences solutions use C2 View configurations to dynamically generate PDFs that the end
users can view. The following table describes the C2 View configurations for the Life Sciences
solution:

Option Description
Form Details
Default Configuration Select to make this configuration the default
view configuration. If more than one default
configuration is applicable for the end user in
a context within the configuration matrix, D2
uses the top-most configuration in the C2 View
Configuration page.
Compatibility Select the oldest compatible version of the PDF
viewing software.
Protection Details
Encryption Level Determines the type of encryption (40 or 128 bit).
User password The password required for opening the PDF.
Password to modify preference The password required for modifying the PDF.
Permissions Check the checkboxes to allow the user to print,
modify content, copy, or modify annotations.
Initial View Details
Page Layout Defines the layout of the document, for example,
multiple pages, single page, and so on.
Display Determines which panels to display on initial
view, for example, bookmarks or just the PDF
page, and so on.
Displayed page on running Select the page to display on initial view.
The following table lists the C2 View configurations used by the Life Sciences solutions:

C2 View Configuration Name Solution


Audit Report C2 View All
Category 1 C2 View for Effective LSQM

Category 1 C2 View for Non Effective LSQM

Category 1 C2 View for Effective GMP Forms LSQM



Category 2 C2 View for Approved Non QM LSRD, LSTMF, LSSSV

Category 2 C2 View for Non Effective LSQM

Category 2 C2 View for Effective LSQM

Category 2 C2 View for Non QM LSRD, LSTMF, LSSSV

Category 3 C2 View for Approved LSRD, LSTMF, LSQM

Category 3 C2 View for Approved Non QM LSRD, LSTMF, LSSSV

Category 3 C2 View for Non Effective All

Category 3 C2 View for Non QM LSRD, LSTMF, LSSSV

Category 3 C2 View for Submission Documents LSSSV

Category 4 C2 View LSRD

Change Request C2 View LSQM

TBR Audit Report C2 View LSQM

Templates C2 View All

Note: When exporting using the Audit Report C2 View and TBR Audit Report C2 View configurations,
the Audit table is prepended to the document, followed by the rest of the document content. There is
no way to show only the Audit Report through C2 View configurations.

PDF Rendition Overview


The Life Sciences solution uses C2 Rendition configurations, which are similar to the C2 View
configurations described in C2 View Configurations for PDF, page 101, to provide PDF renditions
of a document to publishing tools. This allows the system to generate a PDF rendition with a
signature page that can be retrieved using lower-level API calls directly against Documentum, that
is, using DFC/DFS and not through D2. The PDF rendition generated from the rendering engine
(Advanced Documentum Transformation Services (ADTS) or Adlib PDF Enterprise) can be used by
the publishing tool for all versions that are not yet Effective (Approved or Final). However, after the
document becomes Effective (Approved or Final) and a signature page is required, the PDF rendition
in the repository is updated to include the signature page. The following sections describe the flow of
documents through the system with regard to PDF renditions.


C2 Configurations with Fast Web View Option


D2 provides the ability to produce a Fast Web View (linearized) PDF through the C2 View, Export,
Rendition, and Print configurations in addition to Advanced Publishing. The C2 Export, Print, Rendition,
and View configurations include a checkbox for Fast Web View that users can select to enable
linearized (Fast Web View) PDFs. However, this option is dependent on Content Transformation
Services (CTS) being installed and configured. There can be some performance impact as CTS is
called to render the document after the C2 processing occurs.

Configuring CTS for C2 Fast Web View


To configure the CTS client environments for the C2 Fast Web View to work, follow these steps:

1. Create a folder (for example, /realtimeclient_config) in the client file system and populate it
as follows:

a. Copy aek.key from the %CTS%\config folder on the CTS host machine to the folder you
just created. This is used in the preferences.xml file as part of <AekFilePath>.
b. Copy mspassword.txt from the %CTS%\docbases\<repository>\config\pfile folder on
the CTS host to a pfile subfolder. This is used in the preferences.xml file as the ServerProperty
passwordFile.
c. Create an empty cache folder. This is used in preferences.xml as the ServerProperty Cache.
d. Create the preferences.xml file. You can use the following sample content, replacing the
environment-specific values (paths, URLs, repository names, and user names) with the values
corresponding to your environment:
<?xml version="1.0" encoding="UTF-8"?>
<ServiceConfiguration ID="CTS Web Services">
<PropertyList>
<ServerProperty Key="Cache" Description="The Temperory Cache Directory"
Value="C:/realtimeclient_config/cache" />
<ServerProperty Key="AllowDirectTransfer" Description="Allow Direct File
Transfer From CTS Server to Client. Set it to false if there is a firewall restriction"
Value="true" />
<ServerProperty Key="CTSWSPingInterval" Description="Interval (in seconds)
used to specify how frequent the LB should ping its CTS instances for heart rate."
Value="30" />
<ServerProperty Key="FailoverRetries" Description="Allow a number of retries
if a request fails while waiting on the HTTP response from CTS" Value="1" />
<ServerProperty Key="FailoverWait" Description="Wait between failover retries"
Value="1"/>
<ServerProperty Key="InstanceSelector" Description="Specify an implementation
class for instance selection " Value="com.emc.documentum.cts.lb.workers


.OccupancyBasedSelector"/>
<ServerProperty Key="CTSOccupancyPollingInterval" Description="Specify occupancy
polling interval in seconds" Value="7"/>
<ServerProperty Key="ConnectionRetries" Description="Specify connection retries
(in case Repositories section is not configured )" Value="10"/>
<ServerProperty Key="AvailabilityRetries" Description="Number of retries when
CTS instances are not available" Value="2"/>
<ServerProperty Key="AvailabilityWait" Description="Number of seconds to wait
for rechecking availability" Value="4"/>
<!-- if local load balancer is used, no need of CTS-WebServices -->
<LoadBalancer type="local" URL="" sendMode=""/>
<!-- Otherwise, a remote CTS-WebServices can be used as Load Balancer, for ex: -->
<!-- <LoadBalancer type="remote" URL="http://10.31.158.35:8080/services/transformation
/LoadBalancer" sendMode="remote"/> -->
<Repositories>
<AekFilePath>C:/realtimeclient_config/aek.key</AekFilePath>
<LoginContext DocbaseName="lrx64">
<ServerProperty Key="domain" Value=""/>
<ServerProperty Key="userName" Value="Administrator"/>
<ServerProperty Key="passwordFile"
Value="C:/realtimeclient_config/pfile/mspassword.txt"/>
<ServerProperty Key="maxConnectionRetries" Value="10"/>
</LoginContext>
</Repositories>
</PropertyList>
</ServiceConfiguration>

Note: If you have multiple repositories configured with CTS, you may need to add multiple
login context nodes in the <Repositories> tag in the preferences.xml file. In addition, you
must create a separate pfile folder with the mspassword.txt file for each repository. For
example, if you have two repositories, repo1 and repo2 configured with CTS, you must create
two folders—repo1 and repo2, each with a pfile folder and mspassword.txt in them.
2. Create an environment variable named CTS_CONFIG_LOC with its value set to the path of the root
folder that contains the preferences.xml file, for example, C:\realtimeclient_config.

Enabling Fast Web View in C2 Configurations


1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select C2 > View configuration.
4. Under View configurations, select a configuration and under Properties, select the Fast Web
Compatibility checkbox and click Save.
5. Select C2 > Export configuration
6. Under Export configurations, select a configuration and under Properties, select the Fast Web
Compatibility checkbox and click Save.
7. Select C2 > Print configuration
8. Under Print configurations, select a configuration and under Properties, select the Fast Web
Compatibility checkbox and click Save.
9. Select C2 > Rendition configuration


10. Under Rendition configurations, select a configuration and under Properties, select the Fast
Web Compatibility checkbox and click Save.

Electronic Signatures
Electronic signatures are captured during the workflow process when approving documents by
forcing the user to enter a username and password and storing that information in the audit trail.
The C2 configuration then reads the audit trail to create the signature page that is dynamically or
statically applied to the PDF rendition.

Page Sizes and Overlays


Not all parts of the world use the same page size. Most documents in the US are created using US
Letter page size (279 mm x 216 mm). Documents within the European regions typically use an A4
size (297 mm x 210 mm). Unfortunately, C2 does not automatically scale signature pages or overlays.

Signature Page
Signature pages are currently handled through C2 configurations that attach the signature page to
the PDF rendition. This process uses XSL FO to produce one or more PDF pages containing a table
of signature information. C2 does not provide a mechanism for determining the page size of the
generated signature page. To make it easier for clients to configure the page size for the signature
page, parameters for the page height and width can be configured in D2-Config in the Merge section
of the C2 Rendition configuration.

It is recommended to set the page size to the largest page size used if multiple page sizes are required
for a client. When choosing to physically print the document, the “fit to page” option should be
selected.


Overlays
Overlays are applied through C2 Export, C2 Print, and C2 View configurations through the layer
setting in the Stamping configuration. To ensure that the correct layer is used with regard to page
size, the overlay must contain a page for each page size and orientation. For example, to support both
portrait and landscape orientations in both A4 and US Letter page sizes, four pages are required in
the overlay:
• US Letter Portrait
• US Letter Landscape
• A4 Portrait
• A4 Landscape
Other page sizes can be supported by simply adding another page with the appropriate settings. If
the header or footer is changed in the document, such as property changes, header/footer location
changes, or changes to the size of the header or footer, the C2 overlay may also need to be altered.
Note: This only applies to overlays generated through C2. It does not apply to overlays generated by
the Controlled Print feature.
Note: Only portrait-based templates are included out of the box. If landscape pages are required,
these pages must be added to the authoring template as well as to any existing overlays so that the
properly sized or positioned overlays can be applied.

Chapter 6
Registration Forms

This section provides an overview of registration forms.

Overview and Form Types


When document Authors create a new document, they associate it with an existing registration form.
The registration form specifies the product codes, trial IDs, project names, default role assignments,
and other property values that can be selected in the new document’s properties dialog box. Each
Documentum for Research and Development and Documentum for eTMF document type that can
be directly created by users has a corresponding registration form subtype. This does not apply to
Documentum for Quality and Manufacturing documents or Change Requests.
Registration forms can be associated with a document in one of two ways:
• The user selects the registration form before importing or creating a document.
• The user selects the ID (for example, product_code, clinical_trial_id, project_name, and so on) of
the registration form in the properties dialog for the new document during creation or import.
Documents can inherit from multiple registration forms. Chapter 7, Attribute Inheritance and
Cascading Attribute Rules provides more information about how to configure documents to inherit
properties from registration forms.
Most registration forms are contentless documents with the exception of the TMF Clinical Trial
Registration Form. In this case, the content of the registration form is the TMF File Plan spreadsheet.
Registration forms have their own lifecycle and security configurations. Registration forms serve the
following purposes:
• Define the product code names, trial ID numbers, project names, and so on, that can be selected in
document properties dialog boxes (on the Classification tab) when a new document is created or
imported.
• Provide default metadata that can be inherited by the relevant documents. This simplifies the
document creation process for the Authors and reduces the amount of metadata to be filled-in
by the end-user.
• Define default role assignments (for example, Reviewers, Approvers, Coordinators, and so on)
to apply to the relevant documents.


• Enable the overall status of a product, trial, or project to be indicated by the appropriate Managers.
• Enable an entire product, trial, or project to be locked down if necessary, in order to prevent
additional documents from being made “Effective” (perhaps temporarily).
Registration forms are set up by the high-level Management team, that is, Product Managers,
Regulatory Managers, Clinical/Non-Clinical Trial Managers, Project Managers in the Documentum
for Research and Development Quality area, and Corporate Administrators. Only these users can
create and modify the forms. In general, these forms provide an overarching governance mechanism
that can be used at the Management level, over and above the role-based access controls applied at the
individual document level. The following table shows the available registration forms, the registration
form object type, and the type of object that inherits properties from this kind of registration form.

Table 25. Registration form types

Registration Form Name Registration Form Type Target Object Type(s) that Inherit
from the Registration Form
Country Registration Form cd_clinical_country_info cd_clinical_tmf_doc
(LSTMF solution only)
Site Registration Form (LSTMF cd_clinical_site_info cd_clinical_tmf_doc
solution only)
Clinical Trial Registration Form cd_clinical_trial_info cd_clinical

cd_clinical_tmf_doc
Non-Clinical Study cd_non_clinical_study_info cd_non_clinical
Registration Form
Medical Device Clinical Trial cd_clinical_trial_info cd_clinical
Registration Form
Medical Device Registration cd_product_info cd_clinical
Form
cd_non_clinical

cd_reg_admin
Medical Device cd_reg_admin_info cd_reg_admin
Regulatory-Admin
Registration Form
Medical Device Submission cd_reg_submission_info cd_reg_admin
Registration Form
Product Registration Forms cd_product_info cd_clinical_tmf_doc

cd_clinical

cd_non_clinical

cd_quality

cd_reg_admin
Project Registration Forms cd_quality_project_info cd_quality

Regulatory Application cd_reg_admin_info cd_reg_admin
Registration Forms
Regulatory Activity Package cd_reg_admin_activity_info cd_reg_admin
Registration Form
Submission Registration Form cd_reg_submission_info cd_reg_admin
In most cases, each registration form object type extends the relevant target object type (that is, the
document object type to which the form pertains) for the following reasons:
• Optimizes query performance since there are usually fewer instances of registration forms than
documents, so the queries used to populate the product, trial, or project codes work efficiently
(see the example query below).
• Facilitates D2 configuration, enabling D2 contexts to be used to restrict certain configuration
elements to registration forms (for example, attachment of property screens, lifecycles, and
security models).
• Provides the ability to add attributes specific to the registration form but not the target documents.
It is possible to extend the configuration to support other types of registration forms if necessary.
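
For example, the lists of selectable product codes are derived from queries of the following general
form. This is an illustrative sketch rather than the exact query that D2 issues, and it assumes the
standard cd_product_info registration form type and its product_code attribute:
select distinct product_code from cd_product_info order by product_code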

Creating Custom Registration Form Types


You can create a custom registration form type for a new type of document or to support an additional
level of inheritance to existing document types using the following steps:
1. Create a new object type for the registration form type (a DQL sketch follows this procedure). By
convention, the object type
name should have the _info suffix and as a best practice, the object type name should be
prefixed according to your organization’s naming convention. For example, if you are adding a
registration form for marketing materials for KB Pharmaceuticals, name the target object type
kb_marketing and the registration form subtype kb_marketing_info. The object type should also
have an attribute that can be used as an identifier. This will be the key attribute when associating
a document with the registration form.
2. Add the registration form name to a D2 dictionary. This dictionary value is used in a creation
profile to enable users to create instances of the registration form. For example, create a
dictionary, Marketing Management Artifacts, with a key value and language label value:
Marketing Registration Form. Alternatively, if you are adding the registration form to an existing
creation profile, modify the existing dictionary that is associated with the existing creation profile.
3. Create a property page configuration for the registration form.
4. Create an inheritance configuration to be used during creation of the new registration form
if needed. This step refers to inheriting configurations from an existing registration form to
your new registration form. For example, the Marketing Registration Form might inherit
from a selected Product Registration Form. You can also use a standard solution inheritance
configuration rather than create a custom one.
5. Create a D2 inheritance configuration that defines the attributes that will be inherited from the
new registration form to target documents.


6. Create a Default Values template to be used during creation. Ensure that all of the mandatory
default properties described in the Default Values Template, page 95 are defined.
7. Create a lifecycle configuration for the new registration form type if desired. You can also use
an existing registration form lifecycle if it applies.
8. Create a creation profile that enables creation of the new registration form type. Alternatively,
add the new registration form to an existing creation profile. Customizing Creation Profiles,
page 97 provides more information.
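
The following DQL is a minimal sketch of step 1 for the KB Pharmaceuticals example used above. It
assumes that the target object type kb_marketing already exists, and the campaign_id attribute is a
hypothetical identifier attribute; replace it with whichever attribute your organization uses as the key
when associating documents with the registration form:
create type kb_marketing_info (campaign_id string(64)) with supertype kb_marketing publish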

Chapter 7
Attribute Inheritance and Cascading
Attribute Rules

This section explains how attribute inheritance and cascading rules work in the Life Science solution.

Auto-Inherited Attribute Rules


The Life Sciences solutions use standard D2 inheritance configurations but they also introduce
a flexible mechanism called Auto-Inherited Attribute Rules. The following table summarizes the
differences between these two mechanisms.


Table 26. Comparisons between D2 inheritance and Auto Inherited Attribute Rules

D2 Inheritance Auto-Inherited Attribute Rules


Configured in D2-Config. Configured in D2 Client by creating
an Auto-Inherited Attribute Rules
(cd_auto_inherit_config) object and setting its
properties.

cd_auto_inherit_config does not delete empty folders or update versions other than the current
one. When configuring auto-inheritance and cascading attribute rules, run the following DQL to set
the immutable flag on the type. This is applicable for all the types for which the rules are being
configured. A sample DQL query for the "cd_non_clinical" object type with the attribute
swy_study_category is as follows:

alter type cd_non_clinical modify swy_study_category(set ignore_immutable=-1) publish;
Requires the user to preselect the source Inheritance is based on the product code, trial
document before initiating a new or import id, study id, project name, and so on, that
operation. the user selects in the Properties dialog box
during creation or import. The source object
for inheritance is identified through a query
defined within the Auto-Inherited Attribute
Rules object. Therefore, it is not necessary to
preselect a source document.
Documents can only inherit from the selected Documents can inherit from more than one
document. document. They inherit from all documents that
match a configured source query qualification.
Applied by the D2 runtime before displaying Applied by the Life Sciences solution server-side
the Properties dialog box during document logic after the document is created and the
creation or import. initial lifecycle state actions executed. Therefore,
the inherited attribute values will not be shown
in the Properties dialog box during creation or
import. Consider configuring the property page
such that fields that will be inherited through
this approach are not visible during creation.
Allows copying an attribute value from the Provides for configurable inheritance rules.
selected document to a new document. Documents can inherit attributes from multiple
sources. You can merge or override attributes,
can transfer attributes to different attributes,
and can apply conditional inheritance rules,
depending on the object type and/or properties
of the target object.


Configuring Auto-Inherited Attribute Rules


1. Log in to D2 Client as a user in the cd_admingroup role.
2. Click New > Content.
3. Select the System Administration creation profile.
4. In the System Administration list, select Auto Inherited Attributes Rule and click Next.
5. In the Name field, type a name for the configuration.
6. On the Rule applicability tab, configure the following fields.

Option Description
Enabled Check if this auto-inheritance configuration is enabled. If the
selection is cleared, it is ignored at runtime.
Automatically applies to new Check if this auto-inheritance configuration should be
objects applied automatically whenever objects of the specified types
are created. If the selection is cleared, the configuration is
only applied if it is explicitly called from a lifecycle action.
Order of precedence Specifies the order of precedence for this rule in relation
to other rules that may be applicable. Rules are applied in
increasing order of precedence that is, a precedence value
of 0 denotes the highest precedence, 1 is the next highest;
and so on. When two or more applicable rules have the same
level of precedence, they are applied in alphabetical order of
object_name.

The order in which the rules are applied, and whether a subsequent rule depends on a value
being already set, can be significant. For example, if two or more rules are configured to set default
values, the first rule will override any subsequent rules.
Applies to object_types Defines the optional list of target object types that this rule
can apply to. Where defined, if the target object is not one of
the specified object types (including subtypes), it is ignored.
If undefined, the rule can apply to any target object.
Condition qualifier (optional) Specifies an optional DQL qualifier that must be satisfied for
the target object in order for the rule to apply. For example,
the condition product_code is not null ensures the rule
only applies to documents that have defined product_code
values.

7. On the Inherit from tab, configure the following fields.


Option Description
Inherit from Specifies the source object(s) that are used to provide the
values to be inherited. The object(s) are in terms of a DQL
qualifier. This can refer to attribute values of the target object
using the $value(...) notation within the qualifier, as in
standard D2 configurations. For example,
cd_product_info where product_code =
‘$value(product_code)’

uses a cd_product_info source object with a product_code


value matching that of the target object.

The $quotedvalue(...) notation can also be used


to safely refer to attributes that could potentially have
embedded quotes in them. For example, the preceding
qualifier could also be written as follows:
cd_product_info where product_code =
$quotedvalue(product_code)

If you use the $quotedvalue(...) form, the value is


quoted automatically, with internal quotes doubled-up
so that the DQL clause is syntactically-correct. If the
specified attribute is a repeating attribute, you can use the
plural form $quotedvalues(...) to generate a series
of comma-delimited, safely-quoted strings, which can be
referenced in a DQL “in” clause. For example,
cd_non_clinical_trial_info where any study_num
in ($quotedvalues(study_num))

If the resulting DQL qualifier identifies multiple source


objects, the inheritance rules are applied to each source
object in turn, in the order in which they are returned by the
qualifier. Use an order by clause in the qualifier if necessary,
to ensure the source objects are taken in the appropriate order.

To implement folder-based inheritance, a qualifier of the


following form can be used:
<folder_object_type> where r_object_id in
($quotedvalues(i_folder_id))
Where <folder_object_type> is the appropriate object type.
This enables the target object to inherit attribute values
from the parent folder(s) after it has been created and
auto-filed in the repository. However, if the target object
is linked to multiple parent folders, the order in which the
parent folders are retrieved is not deterministic, and the
target object may inherit values from the wrong folders
as a result. To counter this, the source parameter can be
set to the special value PARENT_FOLDERS, which applies
parent folder inheritance in strict order, so that the primary
folder is used first, followed by any secondary folders. The
special value PRIMARY_PARENT_FOLDER can also be used to
enforce folder inheritance from the primary folder link only.
The PRIMARY_PARENT_FOLDER is the one listed as the first
folder in the i_folder_id property.

8. On the Inherit to tab, configure the following fields.

Option Description
Inherit to Specifies the target object(s) on which the inheritance rules
are to be applied. The default setting is SELECTED. This
enables attribute inheritance to be applied to one or more
objects related to the selected object (the object specified by
the –id argument) as opposed to the selected object itself such
as its parent folders. For example, for Documentum for eTMF,
to propagate attributes from the currently-selected object to
its parent folders along the folder path towards the cabinet
level, set source to SELECTED and target to ALL_FOLDERS
for the tmf_folder object type.
For checked-out objects Specify how currently checked-out items should be handled.
If you use the Bypass lock temporarily option, the existing
check-out locks are reinstated after the update, enabling users
to continue to work on checked-out documents.

9. On the Attributes tab, configure the following fields.


Option Description
Attributes Specifies the list of attributes to be inherited from the source
object(s) to the target object. Each entry is of the form:
[<update-mode> ]<targetAttribute> [=
<sourceAttribute> | :<literal-value>]
where <update-mode> is:

default – apply the inheritance only if the current target


attribute value is undefined (for example, a blank string) or
zero.

merge – merge the attributes of the source and target,


avoiding duplicates. (This only applies to repeated attributes
– if applied to single-valued attributes, the effect is the same
as default.)

replace – (default) discard the existing target attribute


value(s), if any, and replace it with the source attribute values.
This is the only valid option if batch update mode is used.
Update Mode Specifies how the identified target objects are to be updated:

Serial (default) – target objects are processed in sequence, in


the defined order.

Parallel – target objects are processed in parallel using the


concurrent method handling framework.

Batch DQL – target objects are updated as a group using a


DQL update query. This option is only valid if a single source
object is specified, the target objects are specified in terms of
a DQL qualifier, and the attributes list specifies only replace
operations.

10. On the Folder updates tab, configure the following fields.


Option Description
Update Folders Specifies a set of folders that should be moved or renamed
after updating the attributes of the target object, in the
form <source-folder-path>|<DQL-qualifier> =>
<new-folder-path>|<new-folder-name>. Attribute
expression functions prefixed by the “$” symbol can be used
to refer to attributes of the context object (the first source
object, by default). For example,
/Clinical/$arg(old_value)=$new_value
renames a product-level folder in the Clinical cabinet.

Attribute expression functions prefixed by “@” can also


be used in the right-hand side of the expression to refer to
attributes of the existing folder. For example,
dm_folder where folder('/Clinical Trial
Library/$arg(new_value)', descend) and
object_name like '$arg(old_value) -%' =>
@replace("@value(object_name)","$arg(old_value)
-","$arg(new_value) -")
renames all subfolders below /Clinical Trial Library/<new
-value> beginning with “<old-value> -”, replacing <old-value>
with <new-value> in the existing folder names.

If a folder path is specified in the right-hand side, the existing


folder is renamed if necessary and then moved into the
specified target folder, which is created automatically if
necessary. For example,
/Clinical/$arg(old_value)/$value(product_code)=>
/Clinical/$arg(new_value)/$value(product_code)
moves an existing product-level folder to a new parent folder.
The folder type can also be specified in curly braces in the
right-hand path if necessary. For example,
/Clinical/$arg(old_value)/$value(product_code)=>
/Clinical/{tmf_folder}$arg(new_value)/$value(product_code)
creates a new folder of type tmf_folder below the Clinical
cabinet if necessary, instead of using the default folder type
(dm_folder).

Renaming and moving existing folders in this way can


remove the requirement to apply D2 auto-linking rules to
the individual subordinate objects. Multiple folders to be
renamed can be specified; if the specified source folder does
not exist, the setting is logged as a warning and disregarded.

11. On the Deletion tab, configure the following fields.


Option Description
Delete Empty Folders Select this option if you want to delete empty folders after
applying D2 auto-linking or deleting objects, that is, where
a document is moved from its current folder to some other
location, and there are no other documents in the original
folder.
Delete Target Objects Where Use this setting to selectively remove redundant
objects identified in terms of a DQL qualifier or
Boolean attribute expression (that is, an attribute
expression returning a ''true'' or ''false'' value). For
example, either the DQL qualifier r_object_type
= 'cd_product_info' or the attribute expression
@eq(@value(r_object_type),cd_product_info) can
be used to delete Product Registration Forms.

If the update mode is set to Bulk DQL, a DQL qualifier must


be specified here; otherwise, it is more efficient to use an
attribute expression. Within the attribute expression, use
expressions to refer to attributes of the context object (the first
source object, by default) and @value() expressions to refer
to attributes of the target object in each case. The wildcard
symbol * refers to all target objects. Note that the selection
is made before inherited attributes are applied, so the DQL
qualifier or attribute expression should refer to the current
object values, not the new values.

12. On the Post-processing tab, configure the following fields.

Option Description
Reapply D2 Auto-Naming Specifies a sub-set of the modified target objects that should
Where be auto-named after updating them, in terms of either a
DQL Qualifier or a rule expression (i.e. an expression that
evaluates to either true or false, as appropriate). As for
update_folders, “$” or “@” function prefixes can be used
within the rule expression to refer to attributes of the context
object or modified target object, respectively. For example,
@endswith(@value(r_object_type),_info)
causes D2 auto-naming to apply to all modified target objects
with an object type ending in “_info”; that is , all registration
form object types, but not others.

The wildcard value * can be used to apply D2 auto-naming


to all modified target objects. If null or undefined, D2
auto-naming is not applied to any of the modified objects,
unless –apply_d2_configs true is specified explicitly in the
method arguments.
Reapply D2 Auto-Linking Specifies a subset of the modified target objects that should
Where be auto-linked after updating them.

Reapply D2 Security Where Specifies a subset of the modified target objects that should
have D2 security configurations applied after updating them.
Apply Custom Processing Specifies a subset of the modified target objects that should
Where have custom processing applied after updating them.
Class Path of Custom Plug-in The post-processing plug-in class to invoke for each
qualifying object, if any. Specify the fully-qualified class
path. For example, com.documentum.cdf.plugins
.MyPluginClass.
Additional plug-in arguments Specifies additional arguments for the custom plug-in.

13. On the Auditing tab, configure the following fields (a sample verification query follows this
procedure).

Option Description
Apply Auditing To Enables auditing to be applied to a subset of the impacted
objects based on a DQL qualifier; for example, specify
r_object_type = ‘target_type’ in the Apply Auditing To field
to audit changes on objects of a specific type. Specify “*” to
audit changes on all impacted objects. Note that this can
incur a lot of extra processing if many objects are affected.
Audited Event Name Specifies the event name to be recorded in the audit trail for
each impacted item. If null or undefined, auditing is disabled.
Audited Event Arguments Up to 5 arguments can be recorded in the audit trail for
each modified object. The argument values can contain
attribute expressions, prefixed either by “$” symbols to refer
to attributes of the source object, or by “@” symbols to refer
to attributes of the target (impacted) object in each case. For
example, “@value(object_name)” records the name of the
source object in the audit trail (that which caused the object
to be updated) in each case.

14. Click Next.


The Documentum Controlled Document Foundation Developer API Javadocs provides more details.
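
After a rule with auditing enabled has run, you can confirm that events were recorded by querying
the audit trail. The following DQL is a sketch that uses the standard dm_audittrail object type; the
event name shown is a placeholder for whatever value you entered in the Audited Event Name field:
select audited_obj_id, event_name, user_name, string_1, time_stamp from dm_audittrail
where event_name = 'my_custom_event' order by time_stamp desc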

Configuring Cascading Attribute Rules


During document creation or import, a document inherits a set of attributes from a corresponding
registration form. For example, if you create a non-clinical document, the system requires you to
associate the document to a Non-clinical Study Registration Form. The individual document inherits
a set of product and study information from the registration form.
But what happens if the information on the registration form changes after you create the document?
The Life Sciences solution Cascading Attributes functionality enables you to update existing
documents for searching and reporting purposes. You also have the option to leave existing
documents unchanged for compliance reasons.


For example, there may be cases where you want the related documents to take on the new attribute
value from the registration form, such as the addition of a generic name to a product registration.
There may be other cases where you do not want the existing documents to take on the new value of
the attribute, but only new documents, such as the changing of a Principal Investigator at a trial site.
And there may still be other cases where you not only want the attribute to be saved on the existing
documents, but you would like to have business logic invoked to properly represent the change to
the registration form.
See Appendix C, Visual Representation of Attribute Cascading in Life Sciences for a visual
representation of how the attributes are cascaded from the registration forms to the documents.
This section defines how to configure the Cascading Attribute rules. By default, the Life Sciences
solutions come with preconfigured cascading attribute rules. Preconfigured Cascading Attributes
Rules, page 140 provides more information.
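
Before modifying the preconfigured rules, it can be useful to list the rule objects that are currently
defined in the repository. The following DQL is a sketch that uses the cd_auto_inherit_config object
type and the configuration folder mentioned in this section:
select r_object_id, object_name from cd_auto_inherit_config
where folder('/System/CDF/Auto Attribute Inheritance Config') order by object_name
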
Note: If you modify the cascading attribute rule settings (the cd_auto_inherit_config objects in the
/System/CDF/Auto Attribute Inheritance Config folder), restart the Documentum Java Method
Server service on each Documentum Server node in order to pick up the changes.
To configure the Cascading Attribute rules:

1. Log in to D2 Client as a member of cd_admingroup.


2. From the Repository browser, navigate to System/CDF/Auto Attribute Inheritance Config.
3. Right-click an auto inheritance configuration object in the folder, such as Change Product Code,
and select Properties.
4. Select Enabled to enable this configuration rule or clear the check box to disable it.
5. On the Rule applicability tab, configure how to apply the rule to object types as described in
the following table:

Field Description
Automatically applies to new Select whether the system applies the rule automatically
objects when users create the specified object type. Clear this
option if the rule applies explicitly as part of a lifecycle
transition action. For example, in the Change Product
Code function.
Order of precedence Select the order of precedence for this rule when there
are other applicable rules. The system applies rules with
the highest order of precedence first. If two or more
applicable rules have the same order of precedence,
the system applies the rules in alphabetical order by
object_name.

The order in which the rules apply can be significant. For


example, if two or more rules set default values, the first
rule overrides any subsequent rules.

Applies to object types Select the object type or types that apply to this rule.
For example, Change Product Code applies to the
cd_product_info object type.
Condition qualifier (optional) Type an optional DQL qualifier to satisfy before applying
the rule to the object type. For example, the condition
product_code is not null ensures that the rule
only applies to objects with defined product-code values.

6. On the Inherit from tab, in the Inherit from field, type the source objects that provide the
inherited values:
• SELECTED: Inherit values from the currently selected object.
• PRIMARY_PARENT_FOLDER: Inherit values from the immediate primary parent folder of
the selected object.
• PARENT_FOLDERS: Inherit values from each immediate parent folder of the selected object,
starting with the primary folder, which is the first folder in the i_folder_id property.
• ALL_FOLDERS [<DQL-qualifier>]: Inherits values from all direct and indirect parent
folders, from the first parent folder to the cabinet level, that match the specified conditions,
if any. For example, ALL_FOLDERS tmf_folder where product_code != ' '
inherits values from all direct and indirect parent folders of type tmf_folder with non-blank
product_code values.
• <DQL-qualifier>: Inherits values from the specified objects in terms of a DQL qualifier. You
can use $-expressions within the DQL qualifier to refer to attribute values of the selected
object, if necessary. For example, the Inherit Clinical Trial Registration Form Info rule
uses the DQL qualifier: cd_clinical_trial_info where clinical_trial_id =
$quotedvalue(clinical_trial_id)
If you leave this field blank, the system uses SELECTED as the default.
7. On the Inherit to tab, configure target object inheritance as described in the following table:

Field Description
Inherit to Type the target objects that inherit the relevant attribute
values.

The selection options are the same as for the source


objects in the Inherit from field.
For checked-out objects Select how to handle checked-out items.

8. On the Attributes tab, configure attribute inheritance as described in the following table:


Field Description
Attributes Specify the list of attributes to inherit from the source to
the target objects. In each case, specify:

• default: where the attribute value should be set by


default, if it does not currently have a value.

• replace: if the attribute value should always overwrite


the current value.

• merge: With repeated attributes, it enables the current


values to combine with new inherited values, without
duplicates. For example, to combine values from
two or more registration forms. For example, merge
keywords adds keywords from the source object or
objects that are not currently listed in the keywords
repeated attribute.

If default, replace or merge is not specified, the system


uses replace.

It is also possible to inherit attributes from a different


attribute of the source object if necessary, by appending
=<source-attribute>. For example, replace
title=study_title copies the value of study_title
from the source object to the title attribute of the target
object.

A literal value can also be specified using the :<value>


notation. For example, replace log_entry:Created
by user $USER. You can use $-expressions in the value
to refer to attributes of the source object.

You can modify the predefined rules if necessary to


include custom attributes, where you have added them
to the standard object model. For example, the Inherit
Clinical Trial Registration Form Info rule defines the
attributes inherited from Clinical Trial Registration
Forms.
Update mode Select the update mode to use to process the target objects.

Serial update mode processes the target objects one at a


time. Use this option if there is only one target object, or a
small number of target objects to update.

Parallel update mode processes the target objects


concurrently. Use this option if there could potentially
be many target objects to update and the attribute list
contains default or merge entries.

Batch DQL update mode processes the target objects as a


group using a DQL update query. Use this option if there
could be many target objects to update and the attribute
list contains only replace entries. You can only use this
update mode to replace (overwrite) attribute values.

The following example shows the Attributes field for the Inherit Clinical Trial Registration
Form Info automatic inheritance configuration object:

9. On the Folder Updates tab, specify the list of folders to move or rename.
To move a folder and its contents to another location, specify <current-folder-path> =>
<new-folder-path>
To rename a folder, specify <current-folder-path> => <new-folder-name>.
In both cases, you can use $-expressions to refer to attributes of the source object, and you can
use @-expressions to refer to attributes of the target folder. You can also use a DQL qualifier to
specify the source folder if necessary.
The following example shows the Folder updates field for the Change Product Code automatic
inheritance configuration object:

In the example, $arg(old_value) and $arg(new_value) refer to the old and new product
code values, respectively.
10. On the Deletion tab, configure how to delete objects as described in the following table:

Field Description
Delete empty folders Select whether to delete the empty folders that remain
after moving documents and folders to new locations.
Delete target objects where Type a DQL qualifier or a Boolean attribute expression to
remove redundant objects.


11. On the Post-processing tab, you can selectively apply Documentum D2 autonaming, autolinking,
security, and custom post-processing to the modified objects. Type a DQL qualifier, Boolean
attribute expression, or the wildcard symbol * in the applicable fields.
For example, in the Change Product Code rule, the system applies Documentum D2 autonaming
to the relevant Product Registration Form and its associated Clinical Trial Registration
Forms using the following DQL qualifier: r_object_type in ('cd_product_info',
'cd_clinical_trial_info')
You can apply Documentum D2 autonaming, autolinking, and security selectively to the affected
objects using this technique. The system applies the DQL qualifier to identify the target objects
before applying the inherited attributes, but applies the Documentum D2 configurations after the
objects update. (If there are multiple target objects to update, they process in parallel, regardless
of the Update Mode setting.)
It is also possible to apply custom processing to selected target objects using the Apply custom
post-processing where field. For example, to apply custom autonaming to the affected
objects, you can write a Java plug-in and install it in the Java Method Server lib directory. The
Documentum Controlled Document Foundation Developer API provides details.
12. On the Auditing tab, configure the following fields.

Option Description
Apply Auditing To Enables auditing to be applied to a subset of the impacted
objects based on a DQL qualifier; for example, specify
r_object_type = ‘target_type’ in the Apply Auditing To field
to audit changes on objects of a specific type. Specify “*” to
audit changes on all impacted objects. Note that this can
incur a lot of extra processing if many objects are affected.
Audited Event Name Specify the event name to be recorded in the audit trail for
each impacted item. If null or undefined, auditing is disabled.
Audited Event Arguments Up to 5 arguments can be recorded in the audit trail for
each modified object. The argument values can contain
attribute expressions, prefixed either by “$” symbols to refer
to attributes of the source object, or by “@” symbols to refer
to attributes of the target (impacted) object in each case. For
example, “@value(object_name)” records the name of the
source object in the audit trail (that which caused the object
to be updated) in each case.

13. Click OK.


Extended Attribute Expressions


D2 supports a number of built-in functions for referencing attribute values within configuration
settings such as $value(attribute). However, in some cases, additional functions are required.
The Controlled Document Foundation (CDF) Layer, therefore, supports extended attribute
expressions. These can be used in the following cases:
• In various method arguments passed to CDF server methods in Apply Method lifecycle actions
(for example, the –value argument to CDFSetAttributeMethod).
• In the Source and Target DQL qualifiers of cd_auto_inherit_config objects, used in the
CDFApplyAttributeInheritanceMethod
The following extended attribute expression functions are supported.

Table 27. Extended attribute expressions

Extended Attribute Expression Functions Description


$user Returns the context username. This is the
name of the user who caused this function to
be invoked. This is not necessarily the same as
the session user name, as the method could be
running as a privileged server method on behalf
of the user. In this case, the D2 username should
be passed to the method in the -context_user
argument so that it can be passed to this method
for use in this function.
$value(<attribute>) Extracts the value of the specified attribute of
the context object. For repeating attributes,
a value index can be specified. For example,
$value(r_version_label[0]) refers to the
version number of the object in Documentum. If
an index is not specified, a comma-delimited list
of repeating attribute values is returned.
$quotedvalue(<attribute>) Similar to $value(<attribute>), with the
value(s) being surrounded by single-quotes
and with internal quotes doubled-up. Use this
form when referring to attributes that may have
embedded quotes in them in DQL statements.


Extended Attribute Expression Functions Description


$dqlvalue(<query>) Runs the specified DQL query and returns
the result in the first row/column position, if
any. Else, returns a blank string. For example,
$dqlvalue(select count(*) from
ls_tmf_doc where clinical_trial_id
= $value(clinical_trial_id) and
artifact_name = $quotedvalue
(artifact_name) and is_placeholder =
false) returns the current number of documents
of the same type as the document/placeholder
itself in the same trial.
$datevalue(<attribute>, <format>) Returns a date/time attribute value in the
specified format. In the format pattern, use
MM to denote the month in year, and mm
to denote the minute in the hour. The time
part can be omitted if necessary. For example,
$datevalue(r_modify_date,MM/dd/yyyy)
returns the last modify date in US-style date
format.
$upper(<str>) Converts <str> to upper-case.
$lower(<str>) Converts <str> to lower-case.
$firstupper(<str>) Converts <str> to first-upper format, or
proper-case; that is, the first letter of each word
is capitalized. For instance, $firstupper("an
example subject") returns “An Example
Subject”.
$left(<str>,<n>[,<suffix>]) Extracts the leftmost <n> characters from
<str>, and appends the optional <suffix> (if
specified) if the string is truncated. For example,
$left("ABCDEFG",3,"...") returns
“ABC...”.
$right(<str>,<n>) Extracts the rightmost <n> characters from <str>.
$before(<str>,<separator>) Extracts the left part of <str> up to but not
including the sub-string <separator>. For
example, $before("ABC.DEF",".") returns
"ABC".
$after(<str>,<separator>) Extracts the right part of <str> after but not
including the sub-string <separator>. For
example, $after("ABC.DEF",".") returns
"DEF".
$trim(<str>) Removes leading/trailing spaces from <str>.
$length(<str>) Returns the number of characters in <str>.


Extended Attribute Expression Functions Description


$maxlength(<attr>) Returns the maximum number of characters
permitted for the specified string-valued
attribute <attr> according to the object model.
Not valid for non-string data types.
$substring(<str>,<start>[,<n>]) Extracts up to <n> characters from <str>
starting from position <start>, indexed from
1. If the third parameter is omitted, all
characters from the specified start position
to the end of the string are extracted. For
example, $substring("ABCDEFG",3) returns
"CDEFG", and $substring("ABCDEFG",3,2)
returns "CD".
$replace(<str>,<find>,<replace>) Replaces all occurrences of <find>
with <replace> in <str>. For example,
$replace("ABC.DEF.GHI",".","-")
returns "ABC-DEF-GHI".
$if(<dql-qualifier>,<true-value>[,<false-value>]) Evaluates a condition expressed as
<dql-qualifier> against the context object,
and returns either <true-value> or <false-value>
depending on whether or not the condition
is satisfied. If the third parameter is
omitted, either <true-value> or a blank
string is returned, as appropriate. For
example, $if("is_placeholder =
true","Placeholder","Document")
returns "Placeholder" if the context object is a
placeholder, or "Document" if it is a document.
$integervalue(<str>) Returns the numeric integer value of <str>; 0 if
<str> is not a numeric value. This can be used to
remove leading zeroes from a string of digits.
For instance, $integervalue("0000256")
returns "256".
$inc(<n>) Returns the integer value n+1.
$dec(<n>) Returns the integer value n-1.
$add(<a>,<b>) Returns the sum of the integer values of a and b.
For example, $add("256","1") is "257".
$subtract(<a>,<b>) Returns the difference of the integer values of a
and b. For example, $subtract("256","1")
is "255".
$position(<str>,<substr>) Returns the first position of <substr> in <str> in
a left-to-right scan, indexed from 1; zero if the
specified substring is not found. For example,
$position("ABRACADABRA","BR") is "2".


Extended Attribute Expression Functions Description


$lastposition(<str>,<substr>) Returns the first position of <substr> in <str> in
a right-to-left scan, indexed from 1; zero if the
specified substring is not found. For example,
$lastposition("ABRACADABRA","BR") is
"9".
$pad(<str>,<n>[,<padding-chars>]) Pads <str> to <n> characters by inserting the
first character of the specified <padding-chars>
string (which defaults to the digit 0) at the start
of the string as many times as necessary; for
example, $pad("123","6") returns "000123".
If <padding-chars> comprises two characters,
then the second character is used to signify
an overflow if the string exceeds the specified
length. For example, $pad("100","2","0*")
returns "**".
$default(<str>,<default-value>) Returns <default-value> if <str> is a blank
string; otherwise it returns <str>. For example,
$default($value(title),"Title is
undefined") returns the value of the title
attribute if it is non-null, otherwise it returns
"Title is undefined".
$lookup(<str>,<dictionary>,<alias|locale>) Retrieves the value of <str> from the specified
D2 dictionary, using the specified alias or locale
code to retrieve the value. Use this function to
convert strings to localized values or encoded
values via D2 dictionaries. For example,
$lookup("FR","TMF Countries","en")
checks the en locale value (that is, English
language value) of the country code FR in the
TMF Countries dictionary. In this case, the result
is "France".

$list(<attribute>,<expression>[,<separator>]) Evaluates the specified expression for an attribute
value or series of repeating attribute values
and combines the results into a list. Within the
expression string, the "~" (tilde) character must
be used to prefix embedded attribute expression
functions instead of the "$" (dollar) character.
This is to prevent the attribute expression
functions from being evaluated prematurely.
Use ~item or ~item() to refer to the list item
value in each case. The expression string must
also be quoted if it contains embedded commas.
For example, $list(keywords,"~upper
(~left(~item,3))",",") takes the first
three characters of each keyword attribute value
defined for the context object, and converts
them to uppercase, combining the values into
a comma-separated list. If a separator is not
defined, the values are concatenated without
separators.

$case(<expression>,<matching-pattern-list>,<result-list>) Selects the ith value from <result-list>,
where the value of <expression> matches the ith value in <matching-pattern-list>, comprising a
comma-delimited list of literal values or regular expressions. If no matching pattern is found, a
blank string is returned. If the last item in <matching-pattern-list> is "*" (the wildcard character),
it will match any expression; in this way, the last value of <result-list> is returned by default. For
example, $case($value(domain),'Clinical,Non-Clinical,*','Clinical Trial,Pre-Clinical Study,Other')
returns "Clinical Trial" if domain = Clinical, "Pre-Clinical Study" if domain = Non-Clinical, and
"Other" if domain is some other value.
$lookup(<key-path>,<taxonomy>,<dictionary>[,<alias|locale>]) Extends the dictionary lookup
function to retrieve a value from a D2 taxonomy. The first parameter begins with the path prefix
character "/", and specifies the location of a parent node in the taxonomy as a series of key values
separated by "/" characters. The path "/" denotes the root node in the taxonomy. The second
parameter is the D2 taxonomy configuration name. The third parameter denotes the subordinate D2
dictionary name used in the taxonomy for the required value, which may be several levels below the
parent node, provided that the intervening levels in the taxonomy contain only one value. If multiple
values are defined at the specified level, only the first value is returned. If an alias or locale is not
specified in the fourth parameter, the subordinate key value itself is returned. For example, to look
up the corresponding artifact number for a particular TMF artifact in the DIA TMF 2.0 reference
model, use $lookup('/DIA TMF 2.0/$value(artifact_name)',TMF Classification by Artifact,TMF
Artifact Numbers).
Characters outside of function calls remain unchanged. Function names are not case-sensitive.
Nested subexpressions can be specified in function argument lists, which are then evaluated
recursively. There is no defined limit on the level of nesting that can be used.
Function arguments should be quoted if they contain embedded commas, so that
they are parsed correctly. Either matching single or double quotes can be used for
these. For example, specify $substring('$value(object_name)',10,20) or
$substring("$value(object_name)",10,20) if the object_name attribute value can
potentially contain commas. Unrecognized functions are not processed, and are passed into the
resulting string unchanged.
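As an illustration of nesting and quoting (a hypothetical expression composed only of the functions
described above, assuming an object_name value such as "SOP-0042_Cleaning Procedure"):

$pad($integervalue($before($after('$value(object_name)',"-"),"_")),"6")

Evaluated from the inside out, $after returns "0042_Cleaning Procedure", $before returns "0042",
$integervalue returns "42", and $pad left-pads the result to "000042". The $value(object_name)
argument is quoted with single quotes so that a value containing embedded commas is still parsed
correctly.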
Method arguments containing attribute expressions can be specified in D2 lifecycle configurations.
However, D2 evaluates them where functions are recognized (for example, $value(...) and
$dqlvalue(...) expressions) prior to passing the argument values to the method. To prevent
D2 from doing this, the alternative function prefix symbol “@” is supported – for example,
"@dqlvalue(select ... where ...)”. These expressions are not recognized by D2 and are
passed to the method unresolved, so they can be evaluated by the method itself.
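For example, a method argument in a lifecycle configuration might be written as follows (a sketch
only; the argument name -related_id, the object type, and the query are illustrative and are not
taken from the default configurations):

-related_id "@dqlvalue(select r_object_id from cd_change_request where object_name = 'CR-000123')"

Because the expression uses the "@" prefix, D2 passes the string through unresolved and the CDF
method evaluates the embedded DQL query itself.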
The Documentum Controlled Document Foundation Developer API Javadocs provides more details.

Context User Support in CDF Methods


Certain CDF methods, when called by a lifecycle action, do not pass the context user argument.
Instead, they use the Administrator session to run the method. The following table lists the CDF
methods that support context user, the operations performed by the context user, and whether the
context user is passed in the Lifecycle configuration.


Table 28. Context User Support in CDF Methods

CDFAppendRepeatingAttributesMethod (context user: Yes)
Description: D2 Server method that appends values from one set of attributes to another, ensuring that the data is maintained in a table format (no missing cells).
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFApplyAttributeInheritanceAsynchMethod / CDFApplyAttributeInheritanceMethod (context user: Yes)
Description: D2 server method that copies or merges attributes from arbitrary source objects to arbitrary target objects associated with the currently-selected object (or the selected object itself), according to pre-configured attribute inheritance rules defined in the cd_auto_inherit_config configuration objects in the Documentum repository.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFApplyD2ConfigurationsMethod (context user: No)

CDFApplyDynamicSecurityMethod (context user: No)
Description: D2 server method that applies the dynamic security to the documents based on the security model settings set on the document.

CDFAuditMethod (context user: Yes)
Description: D2 Server method that logs a specified event in the Documentum audit trail.
Lifecycle configuration setup: Context user is mandatory.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFCloseChangeRequestMethod (context user: Yes)
Description: The method is invoked by a user action or a lifecycle batch to close the CIP Change Request based on the status of its associated documents. Currently, there is no support for the context user in the method; therefore, when the Change Request is closed by a Coordinator, the audit shows Administrator instead of the user name.

CDFCopyRelatedObjectAttrsMethod (context user: Yes)
Description: D2 Server method that copies specific attributes to or from other repository objects related to the selected document through dm_relations, for example, to synchronize attributes from a document to a Change Request or from a Change Request to a document.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

CDFCopyRelationMethod (context user: Yes)
Description: D2 Server method that copies relationships between the selected document and a parent document, for example, to repurpose a D2_COPY_OF relation, which is created by D2 when a document is created with another document selected.
Lifecycle configuration setup: Not used in any Lifecycle action in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is not used in the logic.

CDFCreateObjectAsynchMethod / CDFCreateObjectMethod (context user: Yes)
Description: D2 Server method that creates a new object of a specified type, optionally inheriting content from a specified template document and attributes from the currently-selected object, establishing a relation to that object, encapsulating that object as a child of a new parent virtual document, and adding the new object to existing virtual documents.
Lifecycle configuration setup: CDFCreateObjectAsynchMethod is not used in any Lifecycle action in the default configurations; CDFCreateObjectMethod does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

CDFCreateRelationMethod (context user: Yes)
Description: D2 Server method that establishes relationships between the selected object and a set of target objects identified by a DQL query, or an individual target object identified by object type and object name.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFD2ContextOrderExportMethod (context user: No)

CDFD2ContextReorderMethod (context user: No)

CDFDeleteObjectAsynchMethod / CDFDeleteObjectMethod (context user: Yes)
Description: D2 Server method that deletes a specified object, or a set of objects associated with it (for example, related objects).
Lifecycle configuration setup: CDFDeleteObjectAsynchMethod does not pass the context user in any Lifecycle method call in the default configurations; CDFDeleteObjectMethod passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFDeleteRelationMethod (context user: Yes)
Description: D2 Server method that removes relationships for the selected document, for example, to detach change requests from an "Effective" version of a document.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFGroupGeneratorMethod (context user: No)
Description: D2 server method that generates the security groups for a selected group generator object as per the configurations set on the group generator object.

CDFGroupMaintenanceMethod (context user: No)
Description: D2 server method that generates the Group maintenance spreadsheet for a selected group generator object according to the configurations set on the group generator object.

CDFHardDeleteMethod (context user: Yes)
Description: CDF method to perform a hard delete on documents by a user, based on the cd_delete_config specified in the docbase. Audit trail entries are captured prior to the hard delete, along with the reason code.
Lifecycle configuration setup: Not used in any Lifecycle action in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

CDFInitializeArtifactMethod (context user: Yes)
Description: D2 Server method that is used for enabling individual documents or batches of documents to be initialized correctly in the Life Sciences solutions in an efficient and scalable manner, through a single server method call.
Lifecycle configuration setup: Passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create an audit entry.

CDFLoadAttributesFromContentFileMethod (context user: No)

CDFNotifyUsersAsynchMethod / CDFNotifyUsersMethod (context user: Yes)
Description: Sends notification messages to users in the appropriate roles on a set of objects through the D2 InBox, optionally routing messages through email.
Lifecycle configuration setup: CDFNotifyUsersAsynchMethod does not pass the context user in any Lifecycle method call in the default configurations; CDFNotifyUsersMethod is not used in any Lifecycle action in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used in resolving attribute expressions.

CDFReAssignWorkflowPerformers (context user: No)

CDFSaveAttributesInContentFileMethod (context user: No)

CDFSetAttributeMethod (context user: Yes)
Description: D2 server method that sets a specified attribute on an object, or a set of objects identified by a query, optionally using D2 dictionary lookups, attribute expressions, and re-application of D2 configurations to the specified objects. Note that this method is used to update objects with attribute values that can be computed up-front, in terms of attribute expressions. To set attribute values based on DQL query results, you can use the UpdateAttrsViaQuery method as an alternative.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

CDFSetVersionNumberMethod (context user: No)

CDFShareRelatedContentMethod (context user: Yes)
Description: D2 Server method that is used to share the content of documents between domains. It is used to copy the content of the master document from TMF to the RD (Clinical) child document when the cross-domain functionality is enabled.
Lifecycle configuration setup: NA.
Comments: Supports context_user. Method is executed with an Administrator session.

CDFSetWFLists (context user: Yes)
Description: The method is responsible for updating the workflow attributes of the document. The method is invoked when the document is created in the (init) state of its lifecycle. Therefore, it must run in a context user session to capture the audit correctly.

SSVIndexSubmissionFoldersMethod (context user: Yes)
Description: SSV method to index or reindex a series of submission folders in the repository, to enable them to be viewed in the Submission Viewer tool.
Lifecycle configuration setup: Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session.

SSVRefreshSubmissionViewsMethod (context user: Yes)
Description: After a Regulatory Application Registration Form (RARF) has been updated for Application Description, the new description is reflected in the Imported Submissions View XML files for the title of the RARF.
Lifecycle configuration setup: Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. This method is also supported with an Async version for parallel processing capabilities.

SSVUpdateCorrespondenceDocumentsMethod (context user: Yes)
Description: SSV Server method that updates the Internal Correspondence Documents for post-processing steps with regard to Regulatory Activity Package, Submissions, Application, and so on.
Lifecycle configuration setup: Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session.

CDFUpdateAttrsViaQueryMethod (context user: Yes)
Description: D2 Server method that runs a DQL "select" query and uses the results to populate the attributes of the currently-selected object, objects returned by the query itself, or objects returned by a separate query. Note that this method is designed to be used to update objects with attribute values computed through DQL queries. If they can be computed up-front, you can use the SetAttribute method as an alternative.
Lifecycle configuration setup: Does not pass the context user in a few Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

CDFUpdateChangeRequestAttributesMethod (context user: Yes)
Description: The method is responsible for updating the change request attributes when the change request is checked in by adding SOP documents to it. The method is invoked when the document is checked in by the user. Therefore, it must run in a context user session to capture the audit correctly.

CDFUpdateRepeatingAttribute (context user: Yes)
Description: D2 Server method that updates a repeating attribute in place, adding, updating, or deleting a specific value as indicated.
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

CDFVirtualDocumentMethod (context user: Yes)
Description: D2 server method that enables virtual document structures to be processed in various ways:
• component documents can be added or removed;
• "snapshots" (assemblies) can be taken;
• lifecycle transitions can be applied selectively to the constituent documents;
• attributes can be propagated from the root to the relevant subordinate documents; and
• validation can be applied on the structure prior to invoking lifecycle operations on it.
Lifecycle configuration setup: Passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method runs in a user session if the context user is passed. Context user is used to create audit entry.

CDFWorkflowManagerMethod (context user: Yes)
Description: D2 Server method that halts, resumes, restarts, or terminates D2 workflows.
Lifecycle configuration setup: The method is no longer in use in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

SSVDiscoverSubmissionFoldersMethod (context user: Yes)
Lifecycle configuration setup: Does not pass the context user in any Lifecycle method call in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

SSVImportSubmissionMethod (context user: Yes)
Lifecycle configuration setup: Passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to create audit entry.

TMFAdminMethod (context user: No)

TMFBulkUploadMethod: Not used anymore.

TMFCreateLinks (context user: No)

TMFImportExportPackageAsynchMethod / TMFImportExportPackageMethod (context user: Yes)
Description: Provides methods for initializing, zipping, and unzipping TMF document packages containing TMF document content and metadata, suitable for exporting, offline editing, and re-importing.
Lifecycle configuration setup: Passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method runs in a user session if the context user is passed. Context user is used to create audit entry.

TMFNotifyDocumentsNeedIndexing (context user: No)

TMFReconcileArtifactsAsynchMethod / TMFReconcileArtifactsMethod (context user: Yes)
Description: D2 Server method that reconciles a set of TMF documents against a file plan, as defined in a Clinical Trial Registration Form (CTRF) in Documentum. As a result of this process, any missing TMF placeholders for required, recommended, or optional artifacts specified in the file plan are generated as required, and the existing placeholders with matching documents are updated or removed, depending on whether or not they refer to repeatable or non-repeatable artifacts, respectively. Redundant placeholders referring to unplanned artifacts (for example, those that have been removed from the file plan) are also deleted. The current progress statistics for the CTRF are then updated for each planned stage, plus the trial overall; the validation status and current progress of each planned artifact is also recorded in the file plan, if file plan validation or progress tracking is enabled.
Lifecycle configuration setup: The asynchronous method does not pass the context user in a few Lifecycle method calls in the default configurations; the synchronous method passes the context user in all the Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.

TMFUpdateContributorGroupsMethod (context user: No)

TMFUpdateFolderAccessForDocumentMethod: Not used in TMF methods.

TMFUpdatePlaceholderForDocumentMethod (context user: Yes)
Description: D2 Server method that copies, updates, or deletes the TMF artifact placeholder for a document that has been uploaded by the user, either as a new version of the placeholder itself, or as a stand-alone document.
Lifecycle configuration setup: Passes the context user in a few Lifecycle method calls in the default configurations.
Comments: Supports the -context_user argument. Method is executed with an Administrator session. Context user is used to update last modifier.
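As the Comments entries above indicate, most of these methods accept a -context_user argument. For
example, the context user can be supplied explicitly when a method is invoked manually through
do_method (a sketch modeled on the CDFApplyAttributeInheritanceMethod invocation shown later
in this chapter; the repository name, rule name, object ID, and user name are placeholders):

execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name <repository-name> -user_name dmadmin -password ""
-auto_inherit_config "<rule-name>" -id <object-id> -context_user <user-name>'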


Special Naming Conventions


The Special Naming Conventions alias is supported in the TMF Unique Artifacts dictionary
configuration in D2. The alias can be used to override the default naming conventions for placeholders
and/or documents on a per-artifact basis. The alias values are applied in the following ways:
• By the reconciliation process when placeholders are generated.
• By document lifecycle actions when TMF documents are created, as and where applicable.
If an alias value is not defined for a particular artifact, the standard D2 auto-naming rules apply. The
specified naming convention can be applied later so that it can include references to properties
generated by D2 auto-naming, such as document ID numbers. The uses_special_naming_conv flag
is also set to true for these items to enable subsequent D2 auto-naming to be suppressed for these
documents.

Artifact-based Autonaming and Attribute Lists (LSTMF)

To add flexibility to the LSTMF system, the UI can display a specific set of attributes depending
on the TMF 3.0 reference model artifact being worked on. Additionally, the naming convention of
the object is based on the values of the attributes on the object. The TMF Unique Artifacts Names
3.0 dictionary uses a Document Naming Convention alias to set the object name of the document
during certain lifecycle transitions using the artifact_name attribute as the key. The Attributes and
RequiredAttributes aliases are used to get the attribute values through the optional and mandatory
text fields. These fields appear on the property page of the document during creation or indexing.
The attribute, variable_part_of_name, captures any additional information that should be appended
to the name of the document in the D2 Client. The Instructions alias provides the instructions to
update the Variable Part of Name field, which is available in the property pages. The following image
shows the aliases in a sample TMF Unique Artifacts Names 3.0 dictionary:

The system sets a new attribute, attribute_list, that contains a pipe-delimited list of attributes that
are appropriate for the selected document. This attribute is also set by the CDFSetAttributeMethod,
using the TMF Unique Artifact Names dictionary Attributes alias to set the attribute_list during
certain lifecycle transitions, using the artifact_name attribute as a key. During indexing, the system
obtains the attribute_list value in real time from the dictionary, using the artifact_name of the
placeholder selected during indexing.
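For example, a purely hypothetical attribute_list value for a site-level artifact (the actual attribute
names for each artifact are defined in the dictionary) might look like:

attribute_list = site_id|site_name|country_code|variable_part_of_name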


Each attribute that is part of this attribute_list is placed on the property page(s) with a visibility
condition resulting in the attribute being displayed only if the attribute appears in the attribute
list defined for that artifact.
Note: Attributes should not be editable when the document is in the Effective (Approved or Final)
state as this can cause the object name to change. If there are temporal attributes that are always
editable, care should be taken that these are not part of the naming convention because after a
document is approved, its object name should not change.
In the case of Bulk Import/Export (BIE), the functionality has been updated to incorporate the
RequiredAttributes aliases for an artifact and use them accordingly in the autonaming process. The
BIE spreadsheet has been updated with 41 additional columns and the schema has been updated
accordingly to store the values for required attributes for the artifacts during BIE. When creating the
BIE spreadsheet, the user has to manually add the required attributes to each of the artifacts listed in
the spreadsheet to ensure that the autonaming works as expected after BIE. For the list of required
attributes for a particular artifact, refer to the TMF reference model.

Configuring the Existing Dictionary for Artifact-based Autonaming

Although the artifact-based autonaming and attribute list is available out-of-the-box in the TMF
Unique Artifacts Names 3.0 dictionary, this configuration can be extended to the existing dictionaries
such as the TMF Unique Artifacts Names 2.0 dictionary. To configure the existing dictionary to
support artifact-based autonaming and attribute list:

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select TMF Unique Artifact Names 3.0 and click Export.
5. Select Excel as the format and click OK.
6. Under Dictionaries, select the existing dictionary such as TMF Unique Artifact Names 2.0
and click Export.
7. Select Excel as the format and click OK.
8. Open both dictionaries and copy the Document Naming Convention, Attributes,
RequiredAttributes, and Instructions columns from TMF Unique Artifact Names 3.0 to TMF
Unique Artifact Names 2.0.
9. For each artifact listed in the TMF Unique Artifact Names 2.0 dictionary, update the values in
these columns as required.
10. Under Dictionaries, select TMF Unique Artifact Names 2.0 and click Import.
11. Import the updated TMF Unique Artifact Names 2.0 dictionary Excel sheet and click Save.


Preconfigured Cascading Attributes Rules


The following table describes the preconfigured cascading attribute rules included in Documentum
for eTMF, Documentum for Research and Development, and Documentum Submission Store and
View:

Each entry below lists the configuration rule name, the solution modules in which it is included
(LSTMF, LSRD, LSSSV, LSQM), its description, and its parameters (method arguments).

Change Activity Reference Number (LSRD, LSSSV)
Description: Propagates Activity Reference Number changes to the relevant documents.
Parameters: reg_activity_reference_num = $arg(new_value)

Change Application Description for Imported Submission Documents (LSSSV)
Description: Propagates Application Description changes to the relevant submission documents.
Parameters: $arg(old_value) = previous Application Description; $arg(new_value) = new Application Description

Change Application Description for Imported Submission Folders (LSSSV)
Description: Propagates Application Description changes to the relevant submission folders.
Parameters: $arg(old_value) = previous Application Description; $arg(new_value) = new Application Description

Change Application Description for Imported Submission Subfolders (LSSSV)
Description: Propagates Application Description changes to the relevant submission subfolders.
Parameters: $arg(old_value) = previous Application Description; $arg(new_value) = new Application Description

Change Application Description for Regulatory Documents, Correspondence and Registration Forms (LSRD, LSSSV)
Description: Propagates Application Description changes to the relevant documents.
Parameters: $arg(old_value) = previous Application Description; $arg(new_value) = new Application Description

Change Application Number (LSRD, LSSSV)
Description: Propagates Application Number changes to the relevant documents.
Parameters: $arg(old_value) = previous Application Number; $arg(new_value) = new Application Number

Change Application Number for Submission Elements (LSSSV)
Description: Propagates Application Number changes to the relevant submission elements.
Parameters: $arg(old_value) = previous Application Number; $arg(new_value) = new Application Number

Change Application Number for Submission Folders (LSSSV)
Description: Propagates Application Number changes to the relevant submission folders.
Parameters: $arg(old_value) = previous Application Number; $arg(new_value) = new Application Number

Change Application Number for Submission Registration Form (LSSSV)
Description: Propagates Application Number changes to the relevant submission registration form.
Parameters: $arg(old_value) = previous Application Number; $arg(new_value) = new Application Number

Change Application Number for Submission Sub Folders (LSSSV)
Description: Propagates Application Number changes to the relevant submission sub folders.
Parameters: $arg(old_value) = previous Application Number; $arg(new_value) = new Application Number

Change Product Code (LSTMF, LSRD, LSSSV)
Description: Invoked through the Change Product Code lifecycle action for Product Registration Forms. It propagates product code changes to the relevant documents and registration forms.
Parameters: $arg(old_value) = previous product code value; $arg(new_value) = new product code value

Combine Product Codes (LSTMF, LSRD, LSSSV)
Description: Invoked through the Combine Product Codes lifecycle action for Product Registration Forms. It merges documents and associated registration forms for one product into another, and deletes the current Product Registration Form.
Parameters: $arg(old_value) = previous product code value; $arg(new_value) = new product code value

get Source information to Copy (LSQM)
Description: Inherits the source document information to the newly-created document using technology transfer.
Parameters: None

Inherit Controlled Folder Role Changes To Sub-Folders (LSSSV)
Description: Applies role-based access controls to the subordinate controlled folders of a parent controlled folder, in a top-down manner.
Parameters: -event "Update"

Inherit Controlled Folder Role Changes To Subordinate Documents (LSSSV)
Description: Applies role-based access controls to the subordinate controlled documents of a parent controlled folder, in a top-down manner (including historic versions).
Parameters: -event "Update"

Inherit Clinical Trial Registration Form Info (LSTMF, LSRD)
Description: Invoked automatically whenever a new Documentum for Research and Development Control Category 1-3 Clinical or Clinical TMF document is created. It copies trial-related attributes from the relevant Clinical Trial Registration Form to the new document.
Parameters: None

Inherit Non-Clinical Study Registration Form Info (LSRD)
Description: Invoked automatically whenever a new Documentum for Research and Development Control Category 1-3 Non-clinical document is created. It copies non-clinical project-related attributes from the relevant Non-clinical Study Registration Form to the new document.
Parameters: None

Inherit Product Registration Form Info (LSTMF, LSRD, LSSSV)
Description: Invoked automatically whenever a new Control Category 1-3 document is created in any domain. It copies product-related attributes from the relevant Product Registration Form to the new document, where applicable.
Parameters: None

Inherit Project Registration Form Info (LSRD)
Description: Copies project-related information to the Trial Registration Form based on the project settings.
Parameters: None

Inherit Quality Project Registration Form Info (LSRD)
Description: Invoked automatically whenever a new Documentum for Research and Development Control Category 1-3 Quality document is created. It copies project-related attributes from the relevant Quality Project Registration Form to the new document, where applicable.
Parameters: None

Inherit RAP Application And Submission Role Changes To RARFs (LSRD, LSSSV)
Description: Propagates role changes for Regulatory Activity Package Registration Forms (RAPs) to the associated Regulatory Application Registration Forms (RARFs).
Parameters: -event "Update"; -if "reg_activity_permit_reqd = 'WRITE'"

Inherit Regulatory Activity Package Info To Correspondence (LSSSV)
Description: Copies RARFs related to the Activity Package to a new Correspondence document.
Parameters: replace corresponding application_descriptions, application_numbers, regions, country_codes, submission_procedure_types, submission_types, submission_numbers, submission_info_object_id from cd_reg_admin_activity_info; replace product_codes

Inherit Regulatory Application Registration Form Info To Activity Package (LSRD, LSSSV)
Description: Copies product-related attributes from the relevant Regulatory Application Registration Forms of the activity package to the Activity Package.
Parameters: replace corresponding dosage_form, dosage_strength, indication from cd_reg_admin_info; replace corresponding product_generic_name, product_compound_id, product_chemical_names, inn_names from cd_reg_admin_info; replace corresponding product_trade_name, product_trade_country from cd_reg_admin_info

Inherit Regulatory Application Registration Form Info To Correspondence (LSSSV)
Description: Copies Regulatory Application related attributes from the relevant Regulatory Application Registration Forms to a new Correspondence document, based on the application_description setting.
Parameters: None

Inherit Regulatory Application Registration Form Info To Regulatory Administrative (LSRD, LSSSV)
Description: Copies Regulatory Application related attributes from the relevant Regulatory Application Registration Forms to a new Regulatory Administrative document, based on the application_description setting.
Parameters: None

Inherit Regulatory Application Registration Form Roles To RD Documents (LSRD, LSSSV)
Description: Copies role-based attributes from the relevant Regulatory Application Registration Forms to new regulatory documents, based on the application_description setting.
Parameters: None

Inherit Single Regulatory Application Registration Form Info (LSSSV)
Description: Copies Single Regulatory Application-related attributes from the relevant Regulatory Application Registration Form to a new document, based on the application_description setting.
Parameters: None

Inherit SRF Role Changes To Submission Documents (LSSSV)
Description: Applies role-based security to archived submission documents based on the associated Submission Registration Form (SRF).
Parameters: -event "Update"

Inherit SRF Role Changes To Submission Folders (LSSSV)
Description: Applies role-based security to archived submission folders based on the associated Submission Registration Form (SRF).
Parameters: -event "Update"

Inherit SRF Role Changes To Submission Subfolders (LSSSV)
Description: Applies role-based security to archived submission subfolders based on the associated Submission Registration Form (SRF).
Parameters: -event "Update"

Inherit TMF Placeholder Info (LSTMF)
Description: Invoked automatically whenever a new Clinical Trial Master File (TMF) Control Category 1-3 document is created. It copies attributes from the relevant placeholder to the new document, where applicable.
Parameters: None

MD Inherit Clinical Trial Registration Form Info (LSRD)
Description: Copies trial-related attributes from the relevant Clinical Trial Registration Form to a new document, based on the clinical_study_num setting.
Parameters: None

MD Inherit Regulatory Application Registration Form Info To Regulatory Administrative (LSRD)
Description: Copies Regulatory Application-related attributes from the relevant Regulatory Application Registration Forms to a new Regulatory Administrative document, based on the application_description setting.
Parameters: None

Merge Product Info to PRF (LSTMF, LSRD, LSSSV)
Description: Invoked as part of the Combine Product Codes lifecycle action for Product Registration Forms. It copies product-related information from one Product Registration Form to another.
Parameters: $arg(old_value) = source product code value; $arg(new_value) = target product code value

Move Clinical Trial to New Product (LSTMF, LSRD)
Description: Invoked through the Reassign to Product lifecycle action for Clinical Trial Registration Forms. It enables a set of clinical trial documents to be moved to a different product.
Parameters: $arg(old_product_code) = previous product code value; $arg(new_product_code) = new product code value

Move Site Registration to New Location (LSTMF)
Description: Invoked through the Change Site Location lifecycle action for Site Registration Forms. It updates a TMF site registration to refer to a different country.
Parameters: $arg(site_id) = site identifier code; $arg(site_name) = site name / label; $arg(old_country_code) = previous country code value; $arg(old_area_code) = previous area code (U.S. state code, where applicable); $arg(new_country_code) = new country code value; $arg(new_region) = new region code value; $arg(new_area_code) = new area code (U.S. state code, where applicable)

Move TMF Site Documents to New Location (LSTMF)
Description: Same as Move Site Registration to New Location. (It is part of the same operation.)
Parameters: Same as Move Site Registration to New Location.

Re-Apply Security on Documents (LSTMF, LSRD, LSSSV)
Description: This configuration is used in the Protect or Unprotect product scenarios. It is used to move all the documents that belong to a particular product from the Open security model to the Product security model (Protect scenario) and from the Product to the Open security model (Unprotect scenario).
Parameters: None

Re-Apply Security on Folders (LSTMF, LSRD, LSSSV)
Description: This configuration is used in the Protect or Unprotect product scenarios. It is used to recalculate the security on all the folders and subfolders for a particular product when the product is protected or unprotected.
Parameters: None

Rename Site (LSTMF)
Description: Invoked through the Change Site Name lifecycle action for Site Registration Forms. It updates a TMF site registration to refer to a different site name.
Parameters: $arg(site_id) = site identifier code; $arg(old_site_name) = previous site name / label; $arg(new_site_name) = new site name / label

Rename Site-level TMF Documents (LSTMF)
Description: Same as Rename Site. (It is part of the same operation.)
Parameters: Same as Rename Site.

Update Clinical Trial Info (LSTMF, LSRD)
Description: Invoked through the Update Trial Info lifecycle action for Clinical Trial Registration Forms. It reapplies updated inherited clinical trial information to the relevant documents.
Parameters: None

Update Product Info (LSTMF, LSRD, LSSSV)
Description: Invoked through the Update Product Info lifecycle action for Product Registration Forms. It reapplies updated inherited product information to the relevant documents and associated registration forms.
Parameters: None

Update Reference from Document (LSTMF)
Description: Copies metadata from the selected product document to its references.
Parameters: None

Update Reference New Study (LSTMF)
Description: Updates the metadata of a reference from its study.
Parameters: None

Update Regulatory Activity Product Info To Activity Package (LSRD, LSSSV)
Description: Copies product-related attributes from the relevant RARFs of the activity package to the Activity Package.
Parameters: replace corresponding dosage_form, dosage_strength, indication from cd_reg_admin_info; replace corresponding product_generic_name, product_compound_id, product_chemical_names, inn_names from cd_reg_admin_info; replace corresponding product_trade_name, product_trade_country from cd_reg_admin_info

Update Regulatory Activity Package Info To Correspondence (LSSSV)
Description: Copies Regulatory Application Registration Forms related to the Activity Package to a Correspondence document.
Parameters: replace corresponding application_descriptions, application_numbers, regions, country_codes, submission_procedure_types, submission_types, submission_numbers, submission_info_object_id from cd_reg_admin_activity_info; replace product_codes

Update Regulatory Activity Product Info To Correspondence (LSSSV)
Description: Copies product-related attributes from the relevant RARFs of the activity package to Correspondence documents.
Parameters: replace corresponding application_description as trade_name_applications, product_trade_name, product_trade_country from cd_reg_admin_info; replace corresponding indication, application_description as indication_applications from cd_reg_admin_info
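Most of these rules are invoked by the lifecycle actions noted in their descriptions, which supply the
$arg(...) values as method arguments. As a sketch only (the argument names are inferred from the
Parameters entries above, and the repository name, object ID, and product codes are placeholders),
an equivalent manual invocation of the Change Product Code rule through the attribute inheritance
method might look like:

execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name <repository-name> -user_name dmadmin -password ""
-auto_inherit_config "Change Product Code" -id <product-registration-form-id>
-old_value "<previous-product-code>" -new_value "<new-product-code>"'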

Using a Custom Attribute Inheritance Rule to Reapply D2 Configurations to Selected Objects

Occasionally it may be necessary to reapply D2 configurations, such as autonaming, autolinking,
and security, to existing objects in the repository; for example, after you change the auto-filing rules
or upgrade from a previous release of the Life Sciences solution that uses new auto-filing rules. To
reapply the D2 configurations, you can create a custom attribute inheritance rule and invoke it
through a DQL statement.

1. To create a custom attribute inheritance rule, log in to D2 Client as a member of the Controlled
Document Administrators group (cd_admingroup) or the installation owner (for example,
dmadmin).
2. Select New > Content from the menu bar.
3. In the Creation profile field, select System Administration.


4. In the Document Type field, select Auto Inherited Attributes Rule and click Next.
5. On the Edit properties page, in the Configuration name field, type a name for the attribute
inheritance rule. For example, Reapply D2 Configurations. Type an optional description to
explain its purpose. The system automatically enables the rule by default.

6. On the Rule applicability tab, configure the rule as described in the following table:

Field Description
Automatically applies to new Do not select this option. You should only enable it for
objects rules that the system automatically applies when users
create documents. For this rule, you invoke it only when
necessary using a DQL query.
Order of precedence Leave this option set to 1- High. You can ignore this
option because it only applies when multiple rules apply
to the same object to ensure that the rules apply in the
appropriate order.
Applies to object types Select the wildcard symbol * (any object type). This field
defines the scope of the rule by identifying the documents
in the repository that apply to the rule.

When you execute the rule, you specify the cabinet and
folder path of the top-level folder containing the relevant
documents explicitly in a -folder_path method
argument. It is not necessary to specify an -id argument
(a context object) in this field.


7. Skip the Inherit from tab. This tab specifies the source objects used for copying inherited
attributes and this rule does not apply inherited attributes.
8. On the Inherit to tab, specify the target objects to process as described in the following table:

Field Description
Inherit to Type a DQL qualifier (of the form <object-type>[(all)] where <criteria>) that
identifies the target objects to update.

For this rule, type:

cd_controlled_doc(all) where folder('$arg(folder_path)', descend)

which selects all controlled documents at or below a specified folder, including
historic versions. (You specify the folder in the -folder_path method argument
when you execute the rule.)

To only update the CURRENT versions, omit (all) after the object type.
For checked-out objects Specify how to handle checked-out documents.

For this rule, select Skip update (default). Any currently checked-out documents
do not update until the relevant users check them in. At that time, D2 reapplies
the configurations.

You can also elect to force check-ins or temporarily bypass the check-out locks
for these documents. If you use the force check-in or bypass options, they could
impact users who are currently working on those documents.

9. Skip the Attributes and Folder updates tabs. They do not apply to this rule.
10. On the Deletion tab, select Delete empty folders. When documents auto-file to new locations,
the system deletes any remaining empty folders automatically, including any empty parent
folders along the chain towards the cabinet level.
11. On the Post-processing tab, type the wildcard symbol * in the following Reapply D2
configurations fields:
• Reapply D2 autonaming where
• Reapply D2 autolinking where
• Reapply D2 security where
Leave the Custom processing fields blank.
12. On the Auditing tab, type or select values in the fields.
13. Click Next. The system creates the rule in the /System/CDF/Auto Attribute Inheritance Config
folder with the other rules.


14. To execute the rule, log in to Documentum Administrator or IDQL as the repository installation
owner (for example, dmadmin) and issue a DQL statement of the following form:
execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name <repository-name> -user_name dmadmin
-password "" -auto_inherit_config "Reapply D2 Configurations"
-folder_path "<repository-folder-path>"'
You do not need a password if you execute the query from the Documentum Server as the
installation owner.
For example:
execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name documentum -user_name dmadmin -password
"" -auto_inherit_config "Reapply D2 Configurations" -folder_path
"/Clinical/Cardiology-Vascular/Vasonin"'
The method should return a success code when it is completed. The Java Method Server
Applications log file provides additional details. For example:
%DOCUMENTUM%\jboss7.1.1\server\DctmServer_MethodServer\logs\ServerApps.log
You can tune the performance of the method by adjusting the max_threads and content_servers
settings in the Documentum D2 System Parameters dictionary.

Extensions to Cascading Attributes and Auto-Inheritance Rules to Support Auditing

To support auditing of role changes at the individual object level, extensions to the existing CDF
cascading attributes functionality and the cd_auto_inherit_config object type have been made. The
new options enable auditing to be applied selectively to objects modified by a cascading attributes
rule, or a sub-set of those objects, based on DQL qualifiers.
The audited event code and optional audit trail arguments are specified in the cd_auto_inherit_config
rule as and where appropriate, and can include attribute values of the source folder or target object in
each case, via $-prefixed and @-prefixed attribute expressions. This mechanism can be extended
to other cascading attribute rules as required.
When applying D2 auto-naming, the event code is taken into account, that is, the auto-naming rule for
new objects is used for the Create events, and the auto-naming rule for existing objects is used for the
Update events. This enables auto-numbering to be applied selectively to newly-created objects only.
The following table lists the new attributes added to the cd_auto_inherit_config object type to
support the auditing of role changes:


apply_auditing_to (CHAR(255)): Enables auditing to be applied to a subset of the impacted objects based on a DQL qualifier, just as for the existing apply_d2_auto_naming_to, apply_d2_auto_linking_to, apply_d2_security_to, and apply_plugin_to settings. If left blank, auditing is disabled. If set to "*", it applies to all impacted objects; otherwise it applies to those impacted objects that meet the specified criteria (for example, r_object_type = 'cd_reg_submission_info').

audited_event_name (CHAR(64)): Specifies the event name to be recorded in the audit trail for each impacted item, where applicable. If null or undefined, auditing is disabled.

audited_event_args (CHAR(200), REPEATING): Specifies a list of arguments to be recorded in the audit trail. Up to five arguments can be specified, and they may contain attribute expressions prefixed by "$" symbols referring to attributes of the context object (the first source object, by default), and attribute expressions prefixed by "@" symbols referring to attributes of the target (impacted) object in each case, as and where applicable.

automatic_events (CHAR(32), REPEATING): Specifies the optional event code(s) for which this rule is applicable. This is used to filter "automatic" rules; for example, the applicable events can be Create or Update (or both).
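For example, auditing can be enabled on an existing rule with a DQL update (a sketch only; the
event name is illustrative, and it is assumed that the rule's configuration name is stored in the
object_name attribute):

UPDATE cd_auto_inherit_config OBJECT
SET apply_auditing_to = '*', SET audited_event_name = 'd2_role_change'
WHERE object_name = 'Inherit SRF Role Changes To Submission Documents'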

Extensions to the CDF ApplyInheritedAttributes Method


The existing Auto Inheritance Configuration properties page includes an Auditing tab, where new
auditing properties can be configured as shown in the following image:


The automatic_events setting has been added to the main Rule Applicability tab, after the
Automatically applies to new objects setting. This setting becomes visible only if the Automatically
applies to new objects option is selected. A D2 dictionary, Auto Inheritance Event Codes, has been
defined for this, with the values “Create” and “Update”.
In the CDF ApplyAttributeInheritance server method:
• An optional -event argument enables the running of automatic rules based on specific events. The
default value for this is “Create”. This is to simplify the lifecycle configuration. It enables lifecycle
transitions to invoke the method for either “Create” or “Update” events, as appropriate, without
having to specify the rule names explicitly. If the rules to apply are not specified explicitly in the
-auto_inherit_config argument, it identifies all rules applicable to the selected object type that
are enabled, designated as automatic, and have a matching event code in the automatic_events
list. For backwards-compatibility, automatic rules without automatic_events defined for them
are considered as “Create” events.
• The auditing precondition rules are applied to each modified target object, and a post-processing
task is generated for each qualifying object. These tasks can then be submitted for processing in a
parallel or distributed manner.
• The existing task processing method for each object has been extended to generate an audit
trail entry for each modified object.
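For example, a lifecycle transition (or a manual do_method call, sketched here with placeholder
values) can run every enabled automatic rule that declares the Update event for the selected object,
without naming the rules explicitly:

execute do_method with method = 'CDFApplyAttributeInheritanceMethod',
arguments = '-docbase_name <repository-name> -user_name dmadmin -password ""
-id <object-id> -event "Update" -context_user <user-name>'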

Chapter 8
Workflows

Workflows comprise tasks that provide business logic to the lifecycle phases and pass content from
one state to another.

Workflow Roles
The Life Sciences solution provides a number of predefined, generic workflow templates and
associated D2 workflow configurations to process controlled documents. The number of states and
workflow task performers depends on each workflow.
Note: Workflow availability is determined by the type of document.
The typical workflow initiator for a workflow is a Document Author or Coordinator. A workflow
initiator is also the workflow supervisor. When an initiator starts a workflow, a dialog box appears
where the workflow task performers must be specified. It is not possible to start a workflow without
specifying the performers.
The workflow task performers must have one of the following roles:
• Document Author
• Reviewer
• Format Reviewer
• Approver
• QO Approver (LSQM only)
• Coordinator
• TBR Reader (LSQM only)

Workflow Diagrams
The Life Sciences solutions provide generic workflows to process Control Category 1-3 documents.
Workflows are composed of tasks that provide business logic to the lifecycle phases and pass content
from one state to another.


For Collaborative Editing (Categories 1-3)


Use this workflow to send a Control Category 1-3 document for collaborative editing before sending
it to one of the review and approval workflows. This workflow is configured for all solutions.

In the workflow diagrams, tasks marked with the system icon are performed automatically by the
system, while tasks marked with the user icon are performed by users with specific roles. Users
perform the tasks described in the following table:

User Task Description


Edit document Reviewers review and edit the document.

Incorporate changes Authors incorporate the changes from each Reviewer.

Submit for Review and Approval (Category 1)


Use this workflow to send Control Category 1 documents for formal review and approval. This
workflow is configured for Documentum for Quality and Manufacturing only.

Users perform the tasks described in the following table:


For Review: Reviewers review and annotate the content.

Draft: Authors review feedback on the document. After the Approver demotes the document to Draft, the Author can skip the review task after making the changes and can send the document directly for approval.

Draft – Do not allow skip: Authors review feedback on the document and resend the edited draft document for another round of review. Authors cannot skip the review of the document.

For Approval: Approvers approve the document. The document must be electronically signed on approval.

For Approval no Sig: Approvers approve the document. Electronic signature is not required on approval.

QO Approval: Quality Organization (QO) Approvers approve the document.

Correct Invalid Approvers: The system validates the list of participants assigned as Approvers/QO Approvers of the document and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the document is rejected back to the workflow initiator who must correct the invalid list of Approvers.

Ready to Make Effective: After all of the review and approval tasks are finished, the Document Coordinator releases the document to the Release Pending state.

Submit for Approval (Category 1)


Use this workflow to send Control Category 1 documents directly for approval and skip the formal
review. This workflow is configured for Documentum for Quality and Manufacturing only.

Users perform the tasks described in the following table:

Draft: Authors review feedback received from Approvers on the document.

For Approval: Approvers approve the document. The document must be electronically signed on approval.

For Approval no Sig: Approvers approve the document. Electronic signature is not required on approval.

QO Approval: Quality Organization (QO) Approvers approve the document.

Correct Invalid Approvers: The system validates the list of participants assigned as Approvers/QO Approvers of the document and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the document is rejected back to the workflow initiator who must correct the invalid list of Approvers.

Ready to Make Effective: After all of the review and approval tasks are finished, the Document Coordinator releases the document to the Release Pending state.

Periodic Review (Category 1)


Use this workflow to send Control Category 1 documents that are in the Effective state for a periodic
review. This workflow is configured for Documentum for Quality and Manufacturing only.

Coordinator: When the workflow is triggered on the defined periodic review date, the assigned Document Coordinator receives a notification to initiate the review process. The Document Coordinator assigns the list of reviewers for the periodic review and initiates the review process.

Review: Reviewers review the document. All reviewers in the list of participants must review the document and accept the task for the review to complete successfully. Reviewers have the option to either accept the task using the Reviewed and No Revision Required option or reject the task using the Reviewed and Revision Requested option. If any of the Reviewers reject the document, the system creates a new minor version of the document and a task for the Author to acknowledge.

Acknowledge: The Author accepts the document rejected by the Reviewers and acknowledges that revisions need to be made to the document, which is in the Draft state.


Withdraw Document (Category 1)


Use this workflow to send Control Category 1 documents that are in the Effective state for
approval before being Withdrawn. This workflow is configured for Documentum for Quality and
Manufacturing only.

Approvers: The list of Approvers must approve the withdrawal of a document and provide the withdrawal reason before the document can be Withdrawn. If any one of the Approvers rejects the task, the document does not change to the Withdrawn state and the workflow completes.

Recall Document
There are two recall workflows that trigger when a recall operation is initiated through the
Controlled Print widget, based on whether the recipient of the controlled copy of the selected
document is internal or external. These workflows are configured for Documentum for Quality and
Manufacturing only.
• Internal Recipient Recall Workflow
If the controlled copy being recalled was sent to an internal recipient, then this workflow is
initiated. In this workflow, the internal recipient is required to sign off on the recall operation
by providing an electronic signature. On completion, the Requestor Acknowledgment task is
sent to all members of the cd_controlled_print group. After any member of cd_controlled_print
acknowledges the recall task, the workflow ends and the controlled copy is considered recalled.

• External Recipient Recall Workflow


If the controlled copy being recalled was sent to an external recipient, then this workflow is
initiated. In this workflow, only a requestor sign-off is required. After the requestor signs off, the
workflow ends and the controlled copy is considered recalled.


Submit for Review and Approval (Change Request)


Use this workflow for Change Requests. This workflow is configured for Documentum for Quality
and Manufacturing only.

Users perform the tasks described in the following table:

User Task | Description
Review CR | Document Coordinators review the Change Request. When finished, Document Coordinators promote or reject the document.
Edit CR | Authors review the Change Request rejected by the Coordinator, make the necessary changes, and promote the document for a re-review or directly for approval.
Correct Invalid Approvers | The system validates the list of participants assigned as Approvers of the Change Request and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the Change Request is rejected back to the workflow initiator, who must correct the invalid list of Approvers.
Approve CR | Document Approvers review the Change Request. When finished, Document Approvers approve or reject the document.
Acknowledge Changes | Users assigned as Acknowledgers acknowledge that the Change Request is approved and accept the task to complete the workflow. The state of the Change Request changes to CIP.


Submit for Approval (Change Request)


Use this workflow for Change Requests. This workflow is configured for Documentum for Quality
and Manufacturing only.

Users perform the tasks described in the following table:

User Task | Description
Approve CR | Document Approvers review the Change Request. When finished, Document Approvers approve or reject the document.
Edit CR | Authors review the Change Request rejected by the Approver, make the necessary changes, and promote the document for approval.
Correct Invalid Approvers | The system validates the list of participants assigned as Approvers of the Change Request and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the Change Request is rejected back to the workflow initiator, who must correct the invalid list of Approvers.
Acknowledge Changes | Users assigned as Acknowledgers acknowledge that the Change Request is approved and accept the task to complete the workflow. The state of the Change Request changes to CIP.

Submit for Review and Approval (Category 2)


Use this workflow to send Control Category 2 documents for formal review and approval. This
workflow is configured for all solutions.


Users perform the tasks described in the following table:

User Task | Description
Draft | Authors review the feedback on the document. After the Approver demotes the document to Draft, the Author can skip the review task after making the changes and send the document directly for approval.
Draft – Do not allow skip | Authors review the feedback on the document and resend the edited draft document for another round of review. Authors cannot skip the review of the document.
Review | Reviewers review the content.
Approval | Approvers approve the document. The document must be electronically signed on approval.
Approval-no Sig | Approvers approve the document. Electronic signature is not required on approval.
Correct Invalid Approvers | The system validates the list of participants assigned as Approvers of the document and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the document is rejected back to the workflow initiator, who must correct the invalid list of Approvers.
Document has been issued | After the review and approval tasks are complete, the Author receives a notification to set the review dates.

Submit for Review-Format Approval (Category 2)


Use this workflow to send Control Category 2 documents for review, format review, and approval.
This workflow is configured for Documentum for eTMF and Documentum for Research and
Development.

Users perform the tasks described in the following table:


User Task | Description
Draft | Authors review the feedback on the document. After the Format Reviewer demotes the document to Draft, the Author can send the document to the Reviewer, Format Reviewer, or Approver. After the Approver demotes the document to Draft, the Author can skip the review task after making the changes and send the document directly for approval.
Draft – Do not allow skip | Authors review the feedback on the document and resend the edited draft document for another round of review. Authors cannot skip the review of the document.
Review | Reviewers review the content.
Format Review | Format Reviewers review the content.
Approval | Approvers approve the document. The document must be electronically signed on approval.
Approval-no Sig | Approvers approve the document. Electronic signature is not required on approval.
Correct Invalid Approvers | The system validates the list of participants assigned as Approvers of the document and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the document is rejected back to the workflow initiator, who must correct the invalid list of Approvers.

Submit for Approval (Category 2)


Use this workflow to send Control Category 2 documents directly for approval. This workflow is
configured for all solutions.

Users perform the tasks described in the following table:

User Task | Description
Approval | Approvers approve the document. The document must be electronically signed on approval.
Approval-no Sig | Approvers approve the document. Electronic signature is not required on approval.
Correct Invalid Approvers | The system validates the list of participants assigned as Approvers of the document and checks if the workflow initiator is assigned as the sole participant in these roles. If so, the document is rejected back to the workflow initiator, who must correct the invalid list of Approvers.
Draft | Authors review the feedback on the document.
Document has been issued | After the approval tasks are complete, the Author receives a notification to set the review dates.

Expiry Review (Category 2)


Use this workflow to send Control Category 2 documents for an expiry review. This workflow
is configured for Documentum for eTMF , Documentum for Research and Development, and
Documentum Submission Store and View.

Users perform the tasks described in the following table:

User Task Description


Review Document Coordinator reviews the document and can withdraw the
document, expire the document, or send it to the Author for revision.
Acknowledge Author acknowledges that revisions are required for the document, which
is in the Draft state.

Submit for Review (Category 3)


Use this workflow to process a Control Category 3 document. This workflow is configured for all
solutions.


Users perform the tasks described in the following table:

User Task Description


For Review Authors or Reviewers review the document.
Draft Authors review the feedback on the document.

Submit for Delegated Approval (Category 3)


Use this workflow to send reviewed Control Category 3 documents directly for approval. This
workflow is configured for all solutions.

Users perform the tasks described in the following table:

User Task | Description
Draft | Authors review the feedback from Approvers on the document.
Delegated Approval | Delegated Approvers approve the document. The document must be electronically signed on approval.
Delegated Approval – no Sig | Delegated Approvers approve the document. Electronic signature is not required on approval.

Content Template Approval


Use this workflow to send a newly created content template for approval. This workflow is
configured for all solutions.


Users perform the tasks described in the following table:

User Task Description


Draft Domain-specific template Authors review the feedback on the document.
For Approval Domain-specific template Approvers approve the document. The document
must be electronically signed on approval.

Review Ingested Document


Use this workflow to review ingested documents and set their status. This workflow is configured
for all solutions.

Document Coordinators perform the tasks described in the following table:

User Task Description


Review The Document Coordinator reviews the newly ingested Category 1, 2, or 3
documents.

Configuring Workflows
Each workflow has a workflow template and a corresponding configuration in D2-Config. To
configure a workflow, follow these steps:
1. Log in to D2-Config.
2. Select Go to > Workflow.
3. Select the workflow you want to configure.
4. In the Task configuration section, you can update the following field:


Field Description
Subject Subject of the task that appears in the inbox of the performer.
Message Detailed message that explains the task.
Manual acquisition Acquire the task with or without a click action from the user.
Electronic signature Include an electronic signature with the message.
Audit Enable or disable auditing.
Task follow up Task duration can be defined on this tab. Reminder notifications
can be configured by specifying the number of days before which the
reminder has to be sent and to whom.
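Each workflow listed above is backed by a workflow template (a dm_process object) in the repository. The following minimal DFC sketch, which is illustrative and not part of the product, lists the installed templates so that you can confirm the names shown in D2-Config:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class WorkflowTemplateList {
    // List the workflow templates (dm_process objects) available in the repository.
    public static void printTemplates(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT object_name FROM dm_process ORDER BY object_name");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (rows.next()) {
                System.out.println(rows.getString("object_name"));
            }
        } finally {
            rows.close();
        }
    }
}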

Assigning Workflows to Artifacts


You can assign workflows for specific artifacts through taxonomies. The following figure explains
the parts of the taxonomy sheet where you can assign workflows:

In the figure, the highlighted header row indicates the various workflow dictionaries for each Control
Category. For an artifact, a dictionary value for the workflow can either be N (No) or Y (Yes).
For example, for the Debarment Certification artifact, the Collaboration workflow for Category 1
documents is set to N, indicating that the workflow is not enabled. If the value is set to Y, when you
right-click the document in the D2 Client, the Collaboration workflow option appears in the menu.
Before assigning workflows, start with the Object Type to Taxonomy mapping dictionary. For each
document object type, this dictionary holds the taxonomy or dictionary that is used to configure the
workflow mappings. The "is dictionary" alias specifies whether the value mentioned in the value alias
is the dictionary name or the taxonomy name. Check the value specified for the object type and its
dictionary alias setting, and then check that dictionary or taxonomy for the workflow mappings.
To assign workflows:
1. Log in to D2 Config as an administrator.
2. On the main toolbar, select All elements in the filter list.
3. Click Data > Taxonomy.
4. Under Taxonomies, select a Classification by Group taxonomy.
5. In the dialog box, select Modify and save a copy of the sheet on your machine.
6. In the taxonomy Excel sheet, enable or disable the workflow dictionaries for the required artifacts
by setting the values as N or Y.
7. Import the updated sheet.
8. Click Save.


Modifying the Task Outcome Labels


The Life Sciences solution uses a set of predefined labels configured out of the box to replace the
default ‘Accept’ and ‘Reject’ labels for the various workflows. The user guide for each of the solutions
provides a list of these preconfigured task labels for each of the workflows. However, you can change
these labels in D2 Config according to your specific requirements. To modify the workflow task
outcome labels, follow these steps:

1. Log in to D2 Config as an administrator.


2. On the main toolbar, select All elements in the filter list.
3. Select Go to > Workflow.
4. In the Workflow List, select a workflow for which you want to modify the labels.
5. Under Task configuration, select a task.
6. On the Task parameters tab, under Workflow labels, in the Replace “Accept” with and Replace
“Reject” with fields, replace or add the new labels.
7. Click Save.

Configuring Workflow Messages


To meet business requirements, Administrators can modify when task participants receive workflow
reminder messages in D2 Config.
1. Log in to D2 Config as an administrator.
2. Select All elements from the configuration filter.
3. Select Go to > Workflow from the menu bar.
4. In the Workflow List, select the workflow to change.
5. Navigate to the Task configuration area and select the task to modify.
6. On the Task follow up tab, modify these settings and the messages as needed:
• Task duration in days: Type the total number of days allotted to the task.
• Remaining days before the end of task: Type the number of days remaining in the task
to send a workflow message. When the number of days remaining until the task deadline
reaches the specified number, the selected notification recipients receive a workflow message.
7. Save the configuration.

Workflow and Non-Workflow Attributes


Each workflow task performer value is stored as an attribute of the document. When the workflow is
started, values from these attributes are used to populate the workflow properties. These attributes
are referred to as non-workflow attributes or document attributes.


In addition to task performers stored in the document attributes (Author, Reviewer, and so on),
another set of attributes is provided, which is unique to workflows. These attribute names include a
“wf” prefix such as wf_authors, wf_reviewers, wf_approvers, and so on. These attributes
are referred to as the actual workflow performers. The non-workflow performers or attributes do
not include the “wf” prefix.
When a workflow is started, the non-workflow attribute values are copied to the corresponding actual
workflow attributes. Thereafter, these new attributes take part in the workflow. Any change to the
actual workflow attributes does not affect the corresponding non-workflow attributes. The actual
workflow performers have the necessary permissions on the document only when the document is in
the state where the performer can perform the task.
You can use the Update performers option to update the actual workflow attributes. The modified
values are not stored in the non-workflow attributes and cannot be viewed in the document
properties. Instead, they are stored only as part of the workflow attributes for the workflow.
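For troubleshooting, it can help to compare the document-level performer values with their workflow copies. The following minimal DFC sketch assumes that both sets are stored as repeating attributes on the controlled document object (for example, approvers and wf_approvers on cd_controlled_doc, as the description above suggests); adjust the type and attribute names to your object model, and query the workflow object instead if the "wf" attributes are held there in your configuration:

import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;

public class PerformerAttributeCheck {
    // Print the document-level approvers next to the workflow ("wf_") copies.
    // Attribute names are assumptions based on the description above.
    public static void printApprovers(IDfSession session, String objectId) throws Exception {
        IDfSysObject doc = (IDfSysObject) session.getObject(new DfId(objectId));
        for (int i = 0; i < doc.getValueCount("approvers"); i++) {
            System.out.println("document approver: " + doc.getRepeatingString("approvers", i));
        }
        for (int i = 0; i < doc.getValueCount("wf_approvers"); i++) {
            System.out.println("workflow approver: " + doc.getRepeatingString("wf_approvers", i));
        }
    }
}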

Configuring Workflow Notification


A mailing list configuration, Workflow Completion, is included in the Life Sciences product,
which enables a workflow supervisor to receive an email notification as soon as a workflow
completes. This configuration is triggered by the WorkflowNotification method that is added under
com.documentum.cdf.methods. This method is deployed as a workflow method and is called from
the Finish state of all the workflow process templates. This method triggers the D2 mail notification
to the workflow supervisor as soon as the workflow is completed for a document.
To configure the email message for the notification:

1. Log in to D2-Config.
2. Select Go to > Mailing list.
3. Under Mailing lists, select Workflow Completion.
4. In the Recipients field, you can add or remove the user groups that should receive the notification.
5. In the Email subject en and Email message en field, you can modify the notification message.
6. Click Save.
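To confirm that the WorkflowNotification method object is registered in the repository, you can query dm_method. The following minimal sketch is illustrative only:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class NotificationMethodCheck {
    // Verify that the WorkflowNotification method object exists and print its verb.
    public static void printMethod(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT object_name, method_verb FROM dm_method "
                   + "WHERE object_name = 'WorkflowNotification'");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (rows.next()) {
                System.out.println(rows.getString("object_name") + " -> "
                        + rows.getString("method_verb"));
            }
        } finally {
            rows.close();
        }
    }
}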

Configuring Workflow Task Follow-up Notifications


By default, the Life Sciences solution can send follow-up notifications for workflow tasks for
documents in the For Review and For Approval states across domains. Users can configure these
notifications according to their business needs.
To configure or update the default notification configurations:

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Workflow.


4. Under Workflow list, select the workflow you want to update.


5. Under Task configuration, select the manual task, such as Review or Approval, for which you
want to update the follow-up notification message.
6. On the Task follow-up tab, update the follow-up message configurations.
7. Click Save.

Enabling the Submit for Self Approval Workflow Menu Option

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Menu D2.
4. Under Menus, select CDF Contributor Menu.
5. Under Contextual menus, click <Right click> and select Submit for Approval.
6. Under Actions, in the Action field, select Workflow management action :
"D2_ACTION_WORKFLOW_LAUNCH" (using Configuration "Cat 2 Self Approved
[2-SELFAPPROVE]" for "launchWorkflow" only).
7. Under Parameters, in the Action field, select D2_ACTION_WORKFLOW_LAUNCH.
8. In the Configuration field, type Cat 2 Self Approved [2-SELFAPPROVE].
9. Under Conditions, select User is connected.
10. Click Save.

Disabling the Self-Approve Lifecycle Menu Option

With the Submit for Self-Approval workflow configured, you might want to disable the Self-Approve
menu option. Follow these steps:

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Menu D2.
4. Under Menus, select CDF Contributor Menu.
5. Under Contextual menus, click <Right click> and select Self-Approve.
6. Under Menu properties, select Hide this entry.


7. Click Save.

Chapter 9
Lifecycles

This section describes the lifecycle state of a document in the Life Sciences solution.

Document Lifecycle
A document created within the Life Sciences solution has predefined states in which it can be present
at any given point of time. These states, as a whole, define the lifecycle of the document. Lifecycles are
an important part of this solution, as security and workflows are directly linked to them. Standard D2
lifecycle configurations are provided as part of the CDF layer to support each of the four security
categories (Category 1-4). Additionally, lifecycles are defined for Change Request, Product and
Project registration forms, and so on.
The lifecycle models for Category 1-3 controlled documents are designed to support the authoring,
review, approval, and controlled release of documents, and also post-release management operations
such as suspension, expiry, superseding, and withdrawal of previously-released versions. They are
closely-related to the corresponding review / approval workflows. The following table shows the
document lifecycle states associated with the document control categories:


Solution Control Categories Lifecycle States


LSQM Category 1 Draft
For Review
For Approval
Release Pending
Effective
Suspended
Superseded
Withdrawn
Expired
Category 2 Draft
For Review
For Approval
Approved
Suspended
Superseded
Withdrawn
Category 3 Draft
For Review
For Approval
Approved
Suspended
Superseded
Withdrawn
Change Request Draft
For Review
For Approval
CIP
Closed


Solution Control Categories Lifecycle States


LSRD Category 2 Draft
For Review
Format Review
For Approval
Approved
Suspended
Superseded
Withdrawn
Expired
Category 3 Draft
For Review
Reviewed
For Approval
Approved
Suspended
Superseded
Withdrawn
LSSSV Category 2 Draft
For Review
For Approval
Approved
Suspended
Superseded
Withdrawn
Expired
Category 3 Draft
For Review
Reviewed
For Approval
Approved
Suspended
Superseded
Withdrawn


Solution Control Categories Lifecycle States


LSTMF Category 2 Draft
Index
Quality Check
For Review
Format Review
For Approval
Final
Suspended
Superseded
Withdrawn
Category 3 Draft
Index
Quality Check
For Review
Reviewed
Final
Suspended
Superseded
Withdrawn

The following table provides a description of each of the lifecycle states:

Lifecycle State Description


Draft Indicates new documents and versions added and prepared
by Authors.
For Review Indicates documents submitted for review by Reviewers.
Reviewed Indicates that the review phase is completed.
Format Review Indicates documents submitted for format review by Format
Reviewers.
For Approval Indicates documents submitted for sign-off by Approvers.
Release Pending Indicates documents that have been signed-off by all designated
Approvers. Document Coordinators make the document
Effective.
Effective/Approved/Final Indicates documents that are approved for use and current.
Indicated by a major version number. Readers can view the
Effective version of documents.
Suspended Indicates an Effective document that was changed to Suspended
by a Document Coordinator. This state prevents the document
from being used while a modified version is being prepared,
reviewed, and approved. Suspended documents can be
reinstated to Effective if necessary, or changed to Superseded
when the next version or replacement document becomes
Effective. Alternatively, the entire document can be Withdrawn.


Lifecycle State Description


Superseded Indicates a previously-Effective version of a document that
was replaced by a more recent version that was Reviewed,
Approved, and made Effective. In general, Superseded
versions are not recommended because they are, by definition,
out-of-date. However, it is useful to retain them as historical
records.
Expired Indicates a document that was previously Effective but is now
past its expiration date. Typically, Expired documents are not
used, as they can be invalid. Before they expire, the documents
are reviewed and the expiration date is rescheduled or revised
to generate a new version.
Withdrawn Indicates retired documents. All versions are withdrawn
together. Cannot be versioned, but can be copied to create a
document. Document Coordinators can withdraw a document
at any time, which affects all versions. Users are not allowed
to create new versions of a Withdrawn document. However, it
can be reverted to Draft if necessary, to enable its content to be
reused, or deleted. Retain Withdrawn documents as historical
records.
Not Issued Indicates a work-in-progress version of an uncontrolled
document that Authors are preparing and is not yet ready for
general consumption. Like the Draft state of Control Categories
1-3 documents. Default state for all new versions of Control
Category 4 documents.
Issued Indicates a Control Category 4 document that is ready for
general consumption by the Readers. This state is like the
Approved state of Control Category 1-3 controlled documents.
Historic Indicates a Control Category 4 document that is obsolete.
This state is like the Withdrawn state of Control Category 1-3
documents. Authors can delete these documents or retain them
as historic copies.

Document Lifecycle Models


This section describes the state transitions for the control category documents.

LSQM Document Lifecycle Models


This section shows the state transitions for the documents in Documentum for Quality and
Manufacturing.


Control Category 0 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 0 Change
Request Lifecycle Model:

Control Category 1 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 1 Documents
Lifecycle Model:


Control Category 2 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 2 Documents
Lifecycle Model:


Control Category 3 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 3 Documents
Lifecycle Model:


LSTMF Document Lifecycle Models


This section shows the state transitions for the documents in Documentum for eTMF.

Control Category 2 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 2 Documents
Lifecycle Model:


Control Category 3 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 3 Documents
Lifecycle Model:


LSRD/LSSSV Document Lifecycle Models


This section shows the state transitions for the documents in Documentum for Research and
Development and Documentum Submission Store and View.

Control Category 2 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 2 Documents
Lifecycle Model:


Control Category 3 Documents Lifecycle

The following figure illustrates the lifecycle state transitions in the Control Category 3 Documents
Lifecycle Model:


Using Uniqueness Checks to Validate Transitions

Documents go through state transitions very often. A draft document may become Effective (so that
it is available for everyone to read), whereas an Effective document might need to be withdrawn
when it goes out of date. The state transitions can be defined in the D2 lifecycle configuration for each
and every state. There might be a large number of states for a document, but from a given state, the
states a document can transition to are defined in the “Next States” section of the lifecycle configuration.
The “Next State” section is available for all states of the document. Users can click on the “+” sign to
add a new transition state. The following options are possible in the next state configuration:
• The actions to perform, for example, check-in and check-out
• The transition type, for example, Promote, Demote
• The label to be displayed for the transition (all possible lifecycle states can be viewed by
right-clicking the state)
• Inclusion of electronic signature, if required
• If there is a need to launch a property page on transition, that can be defined in a dialog box
• Inclusion of a confirmation message, if required


The transition to any state can be restricted with the help of different checks. Uniqueness checks,
which are conditions, can be applied to a document, such as “validate document is not checked out”.
Any number of uniqueness checks can be created and restrictions can be applied based on any of
them.
There are two types of checks available:
• Entry Condition: This is a generic check that can be applied to enter a state. In order to enter
this target state, the checks specified in the “Entry conditions” section must be met. Here, the
source state is any other state.
• Transition Condition: This is a specific check that can be applied from a fixed source state to a
fixed target state. This option is present at the bottom of the lifecycle configuration page. First the
target state has to be selected from the “Next State” section, and then the transition condition can
be selected from the drop-down list.

Normal States and Pseudo States


Normal states are the states which are set to the a_status attribute of the document. These are
visible in the UI and are used where the document state actually changes.
Pseudo states are not actual states. These are not set to the a_status attribute and are not shown
in the UI. They are used to perform actions that do not require a state change on the document. For
example, if you want to update the security of a document, you can create a pseudo lifecycle action
such as “(Update Security)”. The pseudo state will be added as the “Next State” for the normal states.
In the “Action Type” section of the lifecycle configuration, you can specify some actions. Also in the
Dialog box field, you can launch a property page that displays the current security settings. Users can
change the security and save it. None of these changes affect the actual state of the document. Life
Sciences follows the convention of enclosing pseudo states within brackets.
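Because only normal states are written to a_status, a simple DQL report can show how documents are distributed across them. The following minimal sketch is illustrative; cd_controlled_doc is used as the type name, as in the Security chapter:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class LifecycleStateReport {
    // Count documents per lifecycle state. Only normal states appear here,
    // because pseudo states are never written to a_status.
    public static void printStateCounts(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT a_status, count(*) AS doc_count "
                   + "FROM cd_controlled_doc GROUP BY a_status");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (rows.next()) {
                System.out.println(rows.getString("a_status") + ": "
                        + rows.getString("doc_count"));
            }
        } finally {
            rows.close();
        }
    }
}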

Creating or Modifying a New Lifecycle Configuration

There are two ways of creating a new lifecycle:
• By using the New button: From the D2-Config home page, select Go To > Lifecycle. Click the
New button. A blank lifecycle creation page appears. You can provide the details required in the
configuration on this page.
• By using Create from button: This is a simpler way of creating a new lifecycle configuration. You
can select an existing lifecycle and click the Create from button. The entire configuration from
the selected lifecycle is copied to the new one; only the name needs to be different. You need
to change the configuration based on the requirements. This option is useful when there are
commonalities between different lifecycle configurations.
After the lifecycle is created, it needs to be associated with the correct set of documents. Click
the Matrix button to open the home page of D2-Config. In the rows section, expand the
lifecycle group. In the columns section, locate the desired context. Once both are located, double-click
the box that marks the intersection of the desired row and the column. Save the changes.


Custom Business Logic Using Lifecycle Actions

Custom methods can be called from lifecycle actions. Follow these steps to configure custom business
logic for lifecycle actions:

1. Create a custom method.


2. Compile it and create a JAR file.
3. Place the JAR file in the serverApps.ear/lib directory of the Java Method Server.
4. Log in to Documentum Administrator.
5. Go to Methods.
6. Select File > New method.
7. Specify the name, verb, and other fields, and then save the changes.
The newly added method appears in the “Apply Method” list in the lifecycle configuration
page in D2-Config.
8. Select the method and pass the necessary arguments in the Extra Arguments field.
9. Click Save.
The Life Sciences solution provides JAR files that have several dm_methods built in. Refer to the
javadocs and use any of them based on your requirements.
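The following minimal DFC sketch illustrates the kind of business logic such a custom method might contain. The class, method, and attribute names are assumptions made for this example, and the wrapper through which the Java Method Server invokes the class should follow the Documentum Server development documentation rather than this sketch:

import com.documentum.com.DfClientX;
import com.documentum.com.IDfClientX;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;
import com.documentum.fc.common.IDfLoginInfo;

public class CustomLifecycleLogic {

    // Illustrative business logic for a lifecycle "Apply Method" call: fetch the
    // document that is changing state and stamp a value on it. The attribute
    // name "custom_remark" is a placeholder and must exist on your object type.
    public static void stampDocument(IDfSession session, String objectId,
                                     String remark) throws Exception {
        IDfSysObject doc = (IDfSysObject) session.getObject(new DfId(objectId));
        doc.setString("custom_remark", remark); // placeholder attribute name
        doc.save();
    }

    // Example of obtaining a session outside the method server, for example when
    // testing the logic from a standalone client.
    public static IDfSession openSession(String docbase, String user, String pwd)
            throws Exception {
        IDfClientX clientx = new DfClientX();
        IDfSessionManager sessionManager = clientx.getLocalClient().newSessionManager();
        IDfLoginInfo login = clientx.getLoginInfo();
        login.setUser(user);
        login.setPassword(pwd);
        sessionManager.setIdentity(docbase, login);
        return sessionManager.getSession(docbase);
    }
}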

Chapter 10
Security

This section describes the security model used in the Life Sciences solution.

Controlled Document Foundation Security Model

The CDF Security Model applies security at four different levels:
• At the document level, through Role-Based Access Controls (RBAC) and Role-Based Lifecycle
Operations (RBLO). Depending on the role of the user on a particular document (if any), and
the current state of the document in its lifecycle, the user has a specified level of access to that
document, and may or may not be able to carry out certain operations on it, such as making the
document “Final”. Without a defined role, the user cannot access the document.
• At the group level, through control groups. Control groups are used to restrict the users
and subgroups that can be assigned to particular roles on a document. For instance, in
order for a user to act as an Approver for a clinical document, the user must belong to the
cd_clinical_doc_approvers control group.
• At the project, study, trial, or procedure level, through Registration Forms. The Managers of these
forms can change the status of the registration to an Inactive state if necessary. This prevents
documents from being made Final, temporarily freezing the current set of Final documents. The
Manager can reactivate the registration form to enable documents to be made Final again.
• At the product level, through Product Registration Forms. Products can have multiple active
projects, studies, or trials associated with them. Product Managers can make product registrations
inactive to prevent documents from being made Final across all projects, studies, or trials.

Permissions
The Life Sciences solutions configure the permissions of documents based on the user role and the
document category. Permissions are defined for each item in the repository. Permissions identify the
security level needed for a group or user to access the item and their allowed actions.


The basic permissions in D2 are:


• NONE: Cannot access any object or object attributes.
• BROWSE: Can view the properties of the object.
• READ: Can view the properties of the object and content.
• RELATE: Same as READ, plus users can add object annotations.
• VERSION: Same as RELATE, plus users can change object content.
• WRITE: Same as VERSION, plus users can alter attributes and change content without updating
the version.
• DELETE: Same as WRITE, plus users can delete any object.
The Documentum D2 User Guide provides additional information on permissions.
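When scripting against the repository, the permit held by the current user on an object is returned as an integer that maps onto the levels above. The following minimal DFC sketch, with an illustrative class name, translates that integer into the labels used in this guide:

import com.documentum.fc.client.IDfACL;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;

public class PermitCheck {
    // Translate the integer permit of the current session user on an object
    // into the permission labels listed above.
    public static String describePermit(IDfSession session, String objectId) throws Exception {
        IDfSysObject obj = (IDfSysObject) session.getObject(new DfId(objectId));
        switch (obj.getPermit()) {
            case IDfACL.DF_PERMIT_NONE:    return "NONE";
            case IDfACL.DF_PERMIT_BROWSE:  return "BROWSE";
            case IDfACL.DF_PERMIT_READ:    return "READ";
            case IDfACL.DF_PERMIT_RELATE:  return "RELATE";
            case IDfACL.DF_PERMIT_VERSION: return "VERSION";
            case IDfACL.DF_PERMIT_WRITE:   return "WRITE";
            case IDfACL.DF_PERMIT_DELETE:  return "DELETE";
            default:                       return "UNKNOWN";
        }
    }
}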

Permissions in Documentum for eTMF


The Documentum for eTMF permissions for TMF documents (Control Category 2 and 3) are listed in
the following table for the contributors, authors, document coordinators, and reviewers roles:

State | Contributors | Authors | Document Coordinators | Reviewers | Format Reviewers | Approvers
Index | DELETE | DELETE | WRITE | WRITE | WRITE | NONE
Draft | DELETE | DELETE | WRITE | WRITE | WRITE | NONE
For Review (Cat 2 only) | RELATE | RELATE | RELATE | RELATE | NONE | NONE
Reviewed (Cat 3 only) | VERSION | VERSION | READ | READ | NONE | NONE
For Approval (Cat 2 only) | READ | READ | READ | READ | READ | READ
Format Review (Cat 2 only) | RELATE | RELATE | RELATE | RELATE | RELATE | NONE
Effective/Approved/Final | VERSION | VERSION | READ | READ | READ | READ
Quality Check | VERSION | VERSION | READ | READ | READ | READ
Superseded | READ | READ | READ | READ | READ | READ
Suspended | VERSION | VERSION | READ | READ | READ | READ
Withdrawn | READ | READ | READ | READ | READ | READ
The Documentum for eTMF permissions for TMF documents (Control Category 2 and 3) are listed in
the following table for the readers and auditor roles:


State | Readers | Auditors
Index | NONE | NONE
Draft | NONE | NONE
For Review (Cat 2 only) | NONE | NONE
Reviewed (Cat 3 only) | NONE | NONE
For Approval (Cat 2 only) | NONE | NONE
Format Review (Cat 2 only) | NONE | NONE
Effective/Approved/Final | READ | READ
Quality Check | NONE | NONE
Superseded | NONE | READ
Suspended | NONE | READ
Withdrawn | NONE | NONE
The permissions for Product Registration Forms, Clinical Trial Registration Forms, and Clinical Trial
Country and Site Registration Forms are listed in the following table:

User Group Permission


Form Managers WRITE
Form Users RELATE
External Readers BROWSE
Admin Group DELETE
Default NONE

Permissions in Documentum for Quality and Manufacturing

The permissions for Control Categories 1–3 documents are listed in the following table for the
authors, document coordinators, reviewers, and approvers roles:

State | Authors | Document Coordinators | Reviewers | Approvers | QO Approvers (Cat 1 only)
Draft | DELETE | WRITE | WRITE | NONE | NONE
For Review | RELATE | RELATE | RELATE | NONE | NONE
For Approval | READ | READ | READ | READ | READ
Release Pending (Cat 1 only) | VERSION | READ | READ | READ | READ
Effective (Effective/Approved/Final) | VERSION | READ | READ | READ | READ
Expired (Cat 1 only) | VERSION | READ | READ | READ | READ
Superseded | READ | READ | READ | READ | READ
Suspended | VERSION | READ | READ | READ | READ
Withdrawn | READ | READ | READ | READ | READ

The permissions for Control Categories 1-3 documents are listed in the following table for the
recipients, readers, and auditors roles:

State | Recipients – TBR List | Readers | Auditors | Recall (Cat 1 only)
Draft | NONE | NONE | NONE | NONE
For Review | NONE | NONE | NONE | NONE
For Approval | NONE | NONE | NONE | NONE
Release Pending (Cat 1 only) | READ | NONE | READ | NONE
Effective (Effective/Approved/Final) | READ | READ | READ | NONE
Superseded | BROWSE | NONE | READ | NONE
Suspended | BROWSE | NONE | READ | READ
Withdrawn | BROWSE | NONE | NONE | NONE

The permissions for Change Requests (CRs) are:

State | Authors | Document Coordinators | Reviewers | Approvers | QO Approvers | Readers | Auditors | Regulatory Affairs | Notification User
Draft | DELETE | DELETE | NONE | NONE | NONE | NONE | NONE | NONE | NONE
For Review | READ | READ | RELATE | NONE | NONE | NONE | NONE | NONE | NONE
For Approval | READ | READ | NONE | READ | READ | NONE | NONE | NONE | NONE
CIP | READ | READ | READ | READ | READ | NONE | NONE | NONE | READ
Closed | READ | READ | NONE | NONE | READ | READ | READ | READ | READ

Permissions in Documentum for Research and Development and Documentum Submission Store and View

The permissions for Control Categories 2-3 documents are listed in the following table for the
Authors, Reviewers, and Approvers roles:

State Authors Document Reviewers Approvers Format


Coordina- (Cat 2 only) Reviewers
tors (Cat 2 only)
Draft DELETE WRITE WRITE NONE WRITE
For Review RELATE RELATE RELATE NONE NONE
Reviewed VERSION READ READ NONE NONE
(Cat 3 only)
For Approval READ READ READ READ READ
Format Review RELATE RELATE RELATE NONE RELATE
(Cat 2 only)
Effective VERSION READ READ READ READ
/Approved/Final
Expired VERSION READ READ READ READ
(Cat 2 only)
Superseded READ READ READ READ READ
Suspended VERSION READ READ READ READ
Withdrawn READ READ READ READ READ

The permissions for Control Categories 2-3 documents are listed in the following table for the
Readers, Auditors, and Regulatory Affairs roles:

State | Readers | Auditors | Regulatory Operations (cd_regulatory_publisher)
Draft | NONE | NONE | WRITE
For Review | NONE | NONE | WRITE
Reviewed (Cat 3 only) | NONE | NONE | WRITE
For Approval | NONE | NONE | WRITE
Format Review (Cat 2 only) | NONE | NONE | WRITE
Effective/Approved/Final | READ | READ | WRITE
Expired (Cat 2 only) | NONE | READ | READ
Superseded | NONE | READ | READ
Suspended | NONE | READ | READ
Withdrawn | NONE | NONE | READ

The permissions for Control Category 4 documents are listed in the following table:

State Authors Readers


Not Issued DELETE NONE
Issued VERSION READ
Historic READ BROWSE

The permissions for Registration Forms are listed in the following table:

Access Control Groups Registration Form Permission


Managers WRITE
User Groups RELATE
Default NONE

These permissions apply to the access control groups on all registration forms. Form managers define
the access control groups on the Access Control tab of the Registration Form properties.
To manage the registration form, the manager must be listed on the Managers list. For example, in the
Product Registration Form, the Product Managers product_mgr1, product_mgr2, and product_mgr3
have the DELETE permission.
To access the registration form, the user group must be on the Primary User Groups list. The
cd_clinical, cd_non_clinical, and cd_quality groups have the RELATE permission.
Users or groups not listed on the Access Control tab of the Registration Form have the NONE
permission.


Assignment of Control Categories


Each document type (or artifact) that can be created in the repository is assigned an appropriate
control category automatically. This is done by setting the category attribute (which is defined for the
cd_controlled_doc object type) to the relevant value, 1-4, in the default value template. Examples of
this can be found in the Clinical Documents creation matrix. This contains a row for each artifact
in the DIA Reference Model: for example, Clinical Literature References is assigned to Category 2
through the Clinical Cat 2 Default Values template as shown in the following figure:

You can reconfigure the security categories for the various artifacts by changing the Default values
template settings in the creation matrix. However, you must ensure that the corresponding lifecycle
model is assigned in the Lifecycle column in each case.
After the correct category value is assigned, the corresponding lifecycle and security model can be
applied through the configuration matrix as shown in the following figures:


Note that there are four separate lifecycle configurations defined, one for each of the four control
categories, but only two control configurations are defined: one for Category 1-3 controlled
documents and a separate one for Category 4 documents. This simplifies the reconfiguration process
and maintains a consistent security model, so that access to documents in specific lifecycle states is
defined uniformly. For example, Authors always have DELETE permits on “Draft” documents,
irrespective of the security category. However, these documents may have different lifecycles and
workflows associated with them, depending on the security category in each case.
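To verify how artifacts were categorized after creation, you can query the category attribute directly. The following minimal sketch is illustrative; whether the category value is stored as a string or a number can vary, so inspect the results rather than filtering on a literal:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class CategoryReport {
    // List controlled documents with their assigned control category and state.
    public static void printCategories(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT r_object_type, object_name, category, a_status "
                   + "FROM cd_controlled_doc ORDER BY category, r_object_type");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (rows.next()) {
                System.out.println(rows.getString("category") + " | "
                        + rows.getString("r_object_type") + " | "
                        + rows.getString("object_name") + " | "
                        + rows.getString("a_status"));
            }
        } finally {
            rows.close();
        }
    }
}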

Role-Based Access Control


The system automatically imposes role-based access controls on all documents created in the
repository and grants the relevant level of access to the appropriate users and groups automatically,
according to the role member settings and the document’s status in its lifecycle in each case. This is
achieved through security configurations in D2.
Internally, D2 uses the security configuration applied to a document to construct an ACL in
Documentum and applies it to the document, granting the role members the appropriate level of
access in each case. Where a user has two or more roles on the same document, the effect is additive,
so that the highest applicable permit level is granted to that user. For example, if a user is designated
as both an Author with WRITE access and a Reader with READ access, the higher permit (WRITE)
takes precedence. If the roles or lifecycle state is updated for a particular document, D2 updates the
underlying ACL automatically. For example, if the document is in a “Draft” state, only the Document
Authors (generally) can access it, but when it becomes “Effective”, the Authors’ permit is reduced to
VERSION, and Readers should be given at least READ access (in practice, they may be given RELATE
access, which enables them to cross-reference the document in change requests, as well as being able
to access the document on a read-only basis).
D2 also has a built-in ACL re-use mechanism to reduce the number of distinct ACLs generated by
the system. This provides certain performance advantages in normal operations but can adversely
affect mass update operations in which the role members are updated for multiple documents. To
avoid these updates, the use of groups instead of individual users as role members is recommended.
The group members can be updated independently of the documents and ACLs that refer to them,


without incurring performance penalties. This also makes the system easier to maintain as changes to
role group members immediately affect all documents that use them. If a very large number of role
groups are defined, this can also lead to performance issues, particularly for users belonging to many
groups because of the way in which the Documentum Server filters out objects that the user should
not see while navigating and searching the repository. (As a general rule, if a user belongs to no more
than 20-30 groups this does not cause any significant performance problems, but if they belong to 200
groups or more the performance degradation can be significant.)
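Given the note above about group counts and performance, it can be useful to check how many groups a user belongs to. The following minimal sketch counts direct memberships only, using the standard dm_group type; nested membership would require a different query:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class GroupMembershipCheck {
    // Count the groups in which the user is a direct member.
    public static int countDirectGroups(IDfSession session, String userName) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT count(*) AS group_count FROM dm_group "
                   + "WHERE ANY users_names = '" + userName + "'");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            return rows.next() ? rows.getInt("group_count") : 0;
        } finally {
            rows.close();
        }
    }
}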

Ownership of Category 1-3 Documents


In standard Documentum, documents are owned by the user who creates them, and this user has
special privileges, as they are able to change the ACL permissions assigned to the document in
the repository. This is known to cause problems in role-based access control systems, where the
permissions should be enforced in accordance with business rules, and where the “ownership”
of objects changes as the document progresses through its lifecycle, and as users join or leave the
organization or change roles within it.
To circumvent this problem, the system automatically transfers ownership of all Category 1-3
documents to the admingroup (the Documentum Administrators group), and adds the Creator as
an initial Author/Document Coordinator by default, so that they are given the appropriate level of
access according to the document state. When the document becomes “Final”, the Authors are given
VERSION access but as they are not the owners of the document they created, they cannot override
the permits assigned to them by the role-based access control policy. The Document Creator has no
special privileges; the documents are effectively “owned” by the system, as opposed to individual
users.
To accomplish this, the Category 1-3 lifecycle models contain a dummy initial state, “(init)”, which
transfers ownership to the admingroup, assigns the creating user to the Authors role, sets the initial
lifecycle state to “Draft”, and applies the role-based security accordingly. Thus, although the initial
lifecycle state is set to “(init)”, the actual initial lifecycle state is “Draft”. (The “Draft” state cannot
be used to transfer ownership, because documents are sometimes reverted back to “Draft” from
other states, for example, as a result of creating a new version of an “Effective” document, and this
would fail if the user does not belong to the admingroup. Therefore, a dummy initial state must be
used in the D2 lifecycle configuration.)
In the Cat 1-3 security model, users who can bring about state changes (that is, Authors and
Document Coordinators) are given the “change state” and “change permit” extended permits, so
they can affect state transitions even though they are not the document owners. The owners (that is,
admingroup) are given full permissions, so they can fix up documents as and where necessary. It is
not necessary to reassign Category 1-3 documents to new owners as users leave the organization
or change roles, or for the admingroup to change permissions in order to make adhoc changes,
making the repository easier to maintain.
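A quick way to confirm that the ownership transfer described above is working is to look for controlled documents whose owner is not the administrators group. The following minimal sketch is illustrative; the group name admingroup is taken from the text above, and Category 4 documents may legitimately appear in the output:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class OwnershipCheck {
    // List controlled documents that are not owned by the administrators group.
    // With the "(init)" transfer working as described, Category 1-3 documents
    // should not appear in this list.
    public static void printUnexpectedOwners(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT object_name, owner_name, a_status FROM cd_controlled_doc "
                   + "WHERE owner_name <> 'admingroup'");
        IDfCollection rows = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (rows.next()) {
                System.out.println(rows.getString("object_name") + " owned by "
                        + rows.getString("owner_name") + " ("
                        + rows.getString("a_status") + ")");
            }
        } finally {
            rows.close();
        }
    }
}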


Folder Security
Cabinets and folders may be created automatically by D2 as a result of auto-filing/auto-linking
rules. Access to these folders is governed by separate security configurations that are referenced in
the auto-filing/auto-linking path configuration at each level. This enables access to be restricted to
users in the appropriate functional area master groups, and for cabinets/folders to be hidden to users
in other groups. For example, the auto-linking path for Clinical documents use the Clinical Folder
Security Model configuration, which grants DELETE access to members of the cd_clinical group and
sets the default (dm_world) permit to NONE for all other users, so that only users in the cd_clinical
group can see these folders (apart from the admingroup, whose members, being privileged users, can
always access all folders in the repository). Note that this does not mean that cd_clinical users can
delete any folder in the “Clinical” cabinet – they can only delete empty folders.
Similarly, the auto-filing path for Change Requests (CRQs) uses the Change Request Folder Security
Model configuration at each level, which grants public DELETE access to all users. This enables any
user to create a change request and access it through the Change Requests cabinet/folder structure. It
does not necessarily mean they can delete any change request folder; they can only delete empty
folders. For example, by deleting CRQs that they themselves have raised but not yet submitted, then
deleting the now-empty folder(s) that were created for them.

TMF Dynamic Role-Based Access Control (LSTMF)

In the Documentum for eTMF solution, external trial participants involved in a clinical trial can be
registered for access to specific parts of the Trial Master File (TMF) related to that trial. This enables
them to access the relevant documents and folders in the repository depending on their role, and
the scope of access required.
In Documentum for eTMF, registration forms represent the TMF. There is a Product Registration
Form for each product and a Clinical Trial Registration Form for the trial. There is also a separate
registration form for each country in which the trial is conducted (Country Registration Forms) and
each site in each country (Site Registration Forms), which can be added to the repository as the
trial is rolled-out. These registration forms are used to build out the TMF in accordance with a
prescribed file plan.
However, managing access to external users centrally in an adhoc manner is not a workable solution.
A trial may involve many sites spread over many countries, involving thousands of external users,
such as Clinical Investigators, Inspectors, and external Authors/Reviewers, who are not part of the
sponsoring organization. Given the large-scale, global nature of clinical trials and the relatively
short-term engagement of external users in each site, coupled with the need to maintain strict
controls over access to the trial documentation, an automated system is required using devolved
administration, to safeguard against IP theft, protect patient confidentiality and to ensure Corporate
governance and regulatory compliance.
To this end, extensions are provided in Documentum for eTMF to enable external users to
be registered for access to a TMF at the appropriate level, that is against a trial, country or
site-registration form, and in an appropriate role, such as Clinical Investigator. Registrations are
valid for a designated period only. They can be set up in advance by local Administrators, and
are activated and deactivated automatically by the system accordingly. The external user roles to


be supported are centrally-configurable, and the access levels to be granted to each role on specific
artifacts (document types) in the TMF are also centrally-configurable, according to the security
requirements of the business. In this way, the registration of external users is simplified and devolved
to local Administrators, and the predefined security model is enforced automatically by the system.

External User Registration


External users must be named Documentum users. They must have Documentum user accounts
created for them in the repository. The user accounts for external users can be managed by
conventional means, for example, through Active Directory, and synchronized with Documentum
through the standard LDAP synchronization job.
They are then registered in the system by associating them with a suitable role, such as Inspector or
Contributor, against the appropriate registration form:
• For site-level access, users are registered against the relevant Site Registration Form.
• For country-wide access to all sites in a particular country, users are registered against the relevant
Country Registration Form.
• For study-wide access to all sites in all countries, they are registered against the relevant Trial
Registration Form.
The access level that a registered external user is granted on a particular document – whether they
have read-write access, read-only access, or no access at all – is then governed automatically by the
system, and depends on the following factors:
1. The level at which the user is registered in the system such as for a specific site, country, or
an entire trial (study).
2. The role assigned to the user in their registration such as Inspector, Investigator or Contributor
(the roles are configurable, although these are the standard roles that are provided out-of-the-box).
3. The time period during which the user registration is to be active. This enables access to be
planned in advance, and revoked automatically when the time period expires.
4. The position of the document in the TMF folder structure such as site-specific, country-specific,
trial-level, or product-level.
5. The artifact name of the document such as Investigator Brochure. External users can be given
different levels of access to different artifacts, according to their role. For example, as a
Contributor, they may be able to edit certain TMF artifacts, but not others.
6. The current status of the document in its lifecycle – whether it is a Draft or Final/Effective version.
The system also grants access to the relevant folders in the TMF structure automatically, enabling
external users to navigate the cabinet/folder structure and locate documents in the relevant areas of
the TMF while their registration is active. However, they cannot see the folders for products, trials,
countries and sites that they are not registered to access. They are also provided browse-level access
to the relevant registration forms, so that they can search for documents based on product codes,
trial IDs, countries and sites for which they are registered. However, they cannot search on product
codes, trial IDs, countries and sites that they are not registered to access – access is strictly on a
need to know basis.


If an external user is registered for access at a higher level than for an individual site, that is at the
country or trial level, the registration is assumed to apply to all of the relevant sites for that country or
trial. In other words, their registration fans down to the site level. Likewise, if a TMF document is
created at the country, trial or product level, then it is assumed to apply to all of the relevant sites. So,
if a user has been granted access to a particular artifact at the site-level, they also have the same level
of access to the same artifact at the country, trial and product level. In other words, their access also
fans up to the relevant artifacts at the peer levels.
Note that for logistical reasons it is not possible to register external users for compound or
product-level access to all sites across all studies or trials associated with a particular product using
this mechanism. Doing so would impact the system extensively whenever a product-level registration
is updated. However, it may be possible to accomplish this through other means, for example, by
managing access to individual documents manually on an adhoc basis.

Granular Security for Submissions (LSSSV)


For regulatory submission folders and documents, the system assigns role-based security to the
documents and folders in the archived submission folder based on the roles assigned in the
Regulatory Application Registration Form for the associated application. This security is established
at the time of the import of the regulatory submission. However, to address requirements such as
restricting access to imported submissions until they have been verified, or providing access to specific
parts of the submission to the relevant users and groups, the
system enables managers to assign role-based security at a granular level, that is, at the Submission
Registration Form, submission folders, subfolders, and documents level.
Granular security on submissions provides the following benefits:
• Allows the default security settings for submissions to be specified in advance for individual
pre-registered Submission Registration Forms or at the application or regulatory activity level,
prior to importing them.
• Enables authorized users to change the roles assigned to an individual archived submission
document after it has been imported.
• Enables authorized users to change the roles assigned to an archived submission subfolder or
individual submission document after it has been imported, and these roles are propagated to
the subfolders and documents within that folder automatically.
• Enables authorized users to change the submission roles assigned to a Regulatory Application
Registration Form and have these changes applied automatically by default to any new
Submission Registration Form created for the application. However, this does not impact existing
Submission Registration Forms and existing archived submission folders and submission
documents associated with the application.
• Enables authorized users to change the submission roles assigned to a RAP and have these
changes applied automatically to the related Regulatory Application Registration Forms so that
the new roles apply to all new submissions associated with the activity. However, existing
Submission Registration Form, submission folders, and documents associated with the activity
are not impacted.


Security Settings in the Regulatory Application Registration Form

The following table lists the security attributes that can be set through the Regulatory Application
Registration Form:

Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Application <current-user> cd_regulatory No Access Control
Managers _managers
(form_managers)
Application Users cd_regulatory cd_ad_promo, No Access Control
(form_users) _users cd_clinical,
cd_corres,
cd_labeling,
cd_non_clinical,
cd_quality,
cd_regulatory,
cd_safety
Submission <current-user> cd_regulatory Yes Access Control
Managers _managers
(submission
_managers)
Submission Users cd_submission cd_ad_promo, Yes Access Control
(submission _users cd_clinical,
_users) cd_corres,
cd_labeling,
cd_non_clinical,
cd_quality,
cd_regulatory,
cd_safety
Document cd_regulatory cd_regulatory No Default users
Coordinators _doc _doc /Groups Control
(doc _coordinators _coordinators
_coordinators)
Reviewers cd_regulatory No Default users
(reviewers) _doc_reviewers /Groups Control

Approvers cd_regulatory No Default users
(approvers) _doc_approvers, /Groups Control
cd_regulatory
_managers
Readers (readers) cd_ad_promo cd_ad_promo No Default users
_doc_readers, _doc_readers, /Groups Control
cd_clinical cd_clinical
_doc_readers, _doc_readers,
cd_corres cd_corres
_doc_readers, _doc_readers,
cd_labeling cd_labeling
_doc_readers, _doc_readers,
cd_non_clinical cd_non_clinical
_doc_readers, _doc_readers,
cd_quality cd_quality
_doc_readers, _doc_readers,
cd_regulatory cd_regulatory
_doc_readers, _doc_readers,
cd_safety_doc cd_safety_doc
_readers _readers
The security changes in the Regulatory Application Registration Form cascade to new Submission
Registration Forms only.

Security Settings in the Submission Registration Form


The following table lists the security attributes that can be set through the Submission Registration
Form:


Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Submission <Derived from cd_regulatory Yes Access Control
Managers Regulatory _managers
(form_managers) Application
Registration
Form-(submission
_managers)>
Submission Users <Derived from cd_regulatory, Yes Access Control
(form_users) Regulatory cd_clinical,
Application cd_non_clinical,
Registration cd_quality,
Form-(submission cd_safety,
_users)> cd_labeling,
cd_corres,
cd_ad_promo
The security changes in the Submission Registration Form cascade to the submission folder,
subfolders, and documents.

Security Settings in the Regulatory Activity Package


The following table lists the security attributes that can be set through the Regulatory Activity
Package:

Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Regulatory <current_user> cd_regulatory Yes Access Control
Activity _managers
Managers
(form_managers)
Regulatory cd_regulatory cd_regulatory Yes Access Control
Activity Users _users _users
(form_users)
Submission <current-user>, cd_regulatory Yes Access Control
Managers cd_regulatory _managers
(submission _managers
_managers)
Submission Users cd_submission cd_submission Yes Access Control
(submission _users _users
_users)
The security changes in the Regulatory Activity Package cascade to the Regulatory Application
Registration Form.


Security Settings on Submission Folders


The following table lists the security attributes that can be set on submission folders:

Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Folder Managers <Derived from cd_regulatory No Access Control
(folder_managers Submission _managers (non-editable)*
) Registration
Form-(submission
_managers)>
Folder Users <Derived from cd_ad_promo, No Access Control
(folder_readers) Submission cd_clinical, (non-editable)*
Registration cd_corres,
Form-(submission cd_labeling,
_users)> cd_non_clinical,
cd_quality,
cd_regulatory,
cd_safety
* — For submission folders and their children, security can only be adjusted through the Submission
Registration Form. Therefore, these settings are non-editable in this Properties page.

Security Settings on Submission Subfolders


The following table lists the security attributes that can be set on submission subfolders:

Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Folder Managers <Derived from cd_regulatory Yes Access Control
(folder_managers Submission _managers
) Registration
Form-(submission
_managers)>
Folder Users <Derived from cd_ad_promo, Yes Access Control
(folder_readers) Submission cd_clinical,
Registration cd_corres,
Form-(submission cd_labeling,
_users)> cd_non_clinical,
cd_quality,
cd_regulatory,
cd_safety
The security changes on the submission subfolders cascade to their subfolders and documents.


Security Settings on Submission Documents


The following table lists the security attributes that can be set on submission documents:

Attribute Default Users Can be Set with Used in LSSSV Tab Name
Label/Name /Groups the following Security Cascade
Users/Groups
Coordinators <Derived from cd_regulatory Yes Process Info
(doc Submission _managers
_coordinators) Registration
Form-(submission
_managers)>
Readers (readers) <Derived from cd_ad_promo, Yes Process Info
Submission cd_clinical,
Registration cd_corres,
Form-(submission cd_labeling,
_users)> cd_non_clinical,
cd_quality,
cd_regulatory,
cd_safety
Other role-based attributes such as Authors, Reviewers, Approvers, and Auditors are not available in
the Properties page for submission documents.

Dynamic Security Framework


In earlier versions of Life Sciences, security on documents was applied through various sources such
as the default D2 security, the CDF security, default value templates, domain-specific roles that were
not common across solutions, and so on. This security implementation was specific to a domain and
hardcoded into the system making it unpredictable and difficult to maintain. The following diagram
illustrates attributes set on the document by various mechanisms throughout the system:


To streamline security and make it consistent across solutions, the dynamic security framework has
been implemented. This framework allows multiple security models to be in place at the same time. It
also enables users to define additional models with a limited amount of customization and regression.
The framework combines all modifications to the security of a document into a single entry point,
while allowing for document-specific flexibility to be managed by an authorized set of users.
Note: The dynamic security framework is only applicable to LSTMF, LSRD, and the Correspondence
domain in LSSSV. In LSTMF, external roles such as Investigator, Inspector, External Contributor, and
External Reviewer do not use this security framework.
The following diagram illustrates the implementation of dynamic security on documents:


Layers of the Security Framework


The security framework includes the following objects that only an Administrator can create:
• Tag
• Group generator
• Folder security model
• Security model
• Shared attribute objects

Tags

Tags are the lowest-level objects in the framework hierarchy. They define how group names are
constructed and how role attributes are resolved. They can be specific to a set of roles that are used in
the creation of user groups. For example, secTag_roles defines the role values such as doc_author,
doc_reviewer, doc_approver, and so on. The secTag_group_names tag defines the domain group name
values such as cd_clinical, cd_regulatory, and so on.
Tags can refer to a custom list of values, attributes in the repository, or a DQL query that generates
values. This allows the system to know not only how to construct the required strings for group
names, but also the appropriate value combinations.
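For example, a tag whose values come from a DQL query simply needs a query that returns the desired
values in its first column. The following is a minimal sketch only (not one of the shipped tag
definitions); it uses the standard dm_group type to return existing domain group names:
SELECT group_name FROM dm_group WHERE group_name LIKE 'cd_%' ORDER BY group_name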

Group Generators

Group generator objects enable the system to define how to create a set of user groups to be utilized
by the security models. Group generators are used to:
• Generate groups
• Set up group parents
• Identify patterns for group names used in security models
Using the generator to generate the groups within the definition of the security models ensures
that the groups that are needed to support the model exist.
The group generator is generally called with a set of tag definitions to generate the required groups.
These tag definitions can be refined so as to limit the range of values used, for example, for the initial
creation of the groups in the system.
The default group generators are run by the system during the installation of the solution to generate
the set of user groups. You can also manually run the generator in a more limited scope to generate
additional groups as needed when the range of available values changes (for example, additional
products are created when using a product-based group model). To generate the groups, right-click
the generator and select the Generate Groups menu option.
The product groups are not generated on creation of a new product; they are generated when the
product is protected, based on the Protect lifecycle action on the product registration form. This helps
avoid a large number of groups being created in the system that may not be required by the application.
You can also create product groups by selecting the group generator secGroup_ProductGroups and
selecting the Generate Groups lifecycle action. This creates the product groups for all the products
in the system at that point in time. However, this is not a recommended approach, as it may create a
large number of groups in the system that may not be required by the application.

Security Model

The security model specifies an attribute set along with a set of group generators that define how
each attribute is populated with groups based on the values of attributes identified by the tags
used in the group generator. These attributes can exist on a single common object shared by many
documents (security model template) as well as on the document. The shared object is used in cases
where the values need to be shared across all objects that have the same set of parameters. Examples
include groups for user selection and groups for ACL security.
The attributes can also exist on the document. In this case, the security model is used where values
need to be initialized similarly across objects that have the same set of parameters but are then
manipulated on a document-by-document basis, or in use cases where offset values (that is, values not
stored on the document itself) will not function within the D2 framework. For example, the default
users/groups to be set on participants within a workflow.
Security models are used to:
• Define the rules for setting role attributes on documents and registration forms.
Note: In the current system, the security model for registration forms is not supported.
• Define the model as standard or restriction-based (document specific).
• Define the groups that can change the model of the document.
• Define the groups that can change the role attributes on the document.
• Define which role attributes can be modified.
• Define how to calculate what documents should share the shared attribute object. Shared attribute
objects are contentless objects that contain the group information in their attributes and can be
referred to by many documents. This way the information need only be stored once. The link
between the document and the object is through a foreign key in the document pointing to the
shared object. This connection is calculated through a CDF function. The function is evaluated as
a string. Therefore, if you use repeating attributes and the $list function, the sequence will matter.
The model defines whether the document-specific values are allowed and who can set those
values (which are combined with the values on the shared object). If no group is assigned, then no
one has rights to change it.
• Define role attributes that are used by the model:
— Set on the document or on the shared attribute object
— The group generator is used to resolve the group name(s)

— The values used to resolve any additional tags


— The role attributes on the document that should be cleared before setting (in case they were
set by the previous model)
• Define the folder model to be applied to the containing folders of this document (assuming that
they were created by autolinking when this document was updated). Multiple models can be
listed with filters, such that the first entry whose filter (a DQL expression using CDF functions)
returns rows is used. A blank filter acts as a catch-all entry.
There are different security models that are implemented through the framework. The default
models include the following:
• secModel_Open
• secModel_Manual
• secModel_Product
• secModel_Locked
All the documents within a domain may use a particular security model. For example, all Clinical
documents might use the Open security model. The Open model aligns with the existing domain-level
security. It has a restriction level of 0, meaning it is assigned through the reference model and not
manually through a restriction action on a document. However, if you want to restrict access to a
particular document, you change the security model on that document only, for example, from Open
to Manual or Locked.
The Product security model locks down rights to product groups by domain. This model does not
allow a user to change the product code while the product is protected. The user must first unprotect
the product, which reverts all the security to the domain level (Open model), then change the product
code, and then re-protect the product, which generates new groups for the new code. If you create a
new document under a protected product, the document has the product security applied to it and not
the default security model specified in the artifact dictionary. You can change the security model on
documents only in the Draft or Approved states.
The Manual security model is a restricted model (restriction level of 1) where the users must be
manually selected by the Document Admin. The Locked security model is also a restricted model
where only users who belong to the cd_admin group can view and modify documents.
By default, the security model for each document artifact is defined within the respective artifact
dictionary. For example, the default security models assigned to non-clinical documents can be found
in the Non-Clinical Artifacts dictionary. Each artifact dictionary includes a new SecurityModel
column wherein the default security models are set for each artifact. To change the model for a
particular artifact, you must edit the dictionary in D2-Config. Updating the security on a document
in the D2 Client does not cascade the change to the dictionary.
Each model defines the contexts in which it can be selected, based on:
• Restriction level: The restriction level selected for the document allows only models defined for
that level to be selected. Models that are defined for automatic placement through the dictionary
are set for restriction level 0, and cannot be selected as a restricted model.
• Type of object: Models are defined for the Documents objects.
This prevents the models from being applied to disparate types of objects that would generally
have different assignment rules.


The security model also defines two administrative groups:


• Model Admin: Allows the user to change the security model of the document.
• Document Admin: Allows the user to change the document attributes to add additional users
to the security settings. For the Manual security model, this will be restricted_attribute_admin,
who will have Coordinator access to the documents and can use the Update Roles action to
add any users to any roles.
Any change that you make to the security model object after its creation, such as removing a role
attribute or adding a new role attribute, applies only to documents that are created after the change.
Changes to the security model are not backward compatible. In addition, some of the changes to the
security model object will require a restart of the Java Method Server.

Folder Security Model

The folder security model specifies, using a tag value that is set on the folders when they are created,
the groups that are to be set in the Readers attribute of the folder. The folder security information is
stored in the keywords attribute, which in turn drives the security of the folder. The folder security
model is specified in each security model definition that is assigned to a document. Security at the
folder level is applied based on the first document that the user creates.
In D2-Config, the folder security model is defined in a single security template, Generic Folder
Security Model, that applies across domains. This eliminates the need to define multiple security
templates for each domain, which can be cumbersome.
To enable documents that are linked to multiple folders to identify which of the repeating values
to use, the definition also includes the attribute that drives the folder name, the level at which the
folder is created, and the transformation used to generate the folder name (if not a simple application
of the attribute value).

Shared Objects

Shared objects are used to optimize the data on the document and make it easier to maintain when
the same data is shared across documents.
Shared objects are created automatically by the system when the user creates the first document that
uses a shared security model, that is, a model in which the attribute location is set to Shared instead
of Document. By default, all document data is stored in shared objects and each domain has one
shared object. The Administrator can view all the shared objects in the system using the Shared
Objects widget.

Creating a Tag Object


1. Log in to the D2 Client as the Administrator.
2. Select New > Content.
3. In the Creation profile field, select Security Objects.


4. In the Document Type field, select Security-Tag Definition.


5. Type or select values in the fields described in the following table:

Field Description
Tag Definition Name The name of the tag. This is used to reference
the tag in the other configuration settings such
as group generators and security models.

The value should be surrounded by “fencing”


characters such as [] to eliminate possible false
hits.
Type of Source for Values Source of the values for the tag. It can be a
list of static values or a DQL query that can
generate the values.
List of Values This field appears if you select List of Values
in the Type of Source for Values field. You
can add a list of valid values.
DQL for Tag Values This field appears if you select DQL Query in
the Type of Source for Values field. Provide a
DQL query to generate a list of values – first
column only. Can use other tags in the DQL,
assuming they would be resolved first.
Binding Type Type of object (doc_type) that uses this tag,
which will contain the value to resolve when
the tag is referenced. This is an optional field
and is not needed if tag is resolved explicitly
in the model.
Binding Attributes Attributes on the object (doc_type) that is used
to evaluate for tag replacement when this tag
configuration is used.

These attributes are used to determine the values; if more than one attribute is specified, the union
of the values is used. If the value specified is not a CDF $ function, it is assumed to be an attribute
name and the system resolves the set of values from the document using the appropriate resolve
functions ($value for single values, $list for repeating values). If a more complicated resolution
expression is needed, you can manually enter a CDF $ function expression.
Delimiter to Use in Group Name An extra delimiter to help differentiate the
type of value used when generating group
names. It can be used to surround values to
remove ambiguity if values cross tags.

6. Click Next. The tag object appears in the Security/Tag Definition folder.


You can view the properties of the tag object in the Properties widget. Click Edit if you want to edit
the values in the tag object. To delete a tag object, right-click it and select Delete.
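For reference, the DQL for Tag Values field expects a query whose first column supplies the tag
values. The following is a minimal sketch only; it assumes a tag that lists the primary domain group
values found on existing documents (the cd_common_ref_model type and primary_group attribute
also appear in the reporting queries later in this guide, but your own tags may use different types
and attributes):
SELECT DISTINCT primary_group FROM cd_common_ref_model WHERE primary_group IS NOT NULLSTRING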

Creating a Group Generator


To generate groups, you have to identify:
• The pattern used to create the groups using tags.
• The tags used in generating the groups and the order in which they are processed. If there is a tag
dependency, that is, in order to retrieve the values of one tag, another tag must be resolved, the
order of the tags must take this into account.
• The parent generator if there is one.
• The pattern for assigning parents or an indication to use the pattern of the parent generator.

To create a group generator:


1. Log in to the D2 Client as the Administrator.
2. Select New > Content.
3. In the Creation profile field, select Security Objects.
4. In the Document Type field, select Security-Group Generator.
5. Type or select values in the fields described in the following table:


Field Description
Name of the Generator Name of the group generator. This is the
name that will be used in security models to
identify which groups should be assigned by
combining the pattern in the group generator
and the values of the document attribute set
in the tag.

The convention is to use camel case for the


name and not to include spaces.
Pattern for Group Generation The pattern that drives the creation
of the group names. For example,
[group_name]_[roles] pattern combines the
[group_name] and the [roles] and creates the
user groups, such as cd_clinical_doc_author,
cd_regulatory_doc_reviewer, and so on,
during group generation.

Tags specified in the hierarchy are resolved


and replaced within this string. This then
generates a group (if used during generation)
or resolves to a specific group name to be used
in a model. Some tags may not be resolved by
the document but instead by the security model.

The tags are evaluated as they are defined in


the tag objects. Using brackets in the tags
makes the pattern easier to understand.
Parent Generator Names of the generators that would generate
the parents of any group generated by this
generator. Parent generators generally have
a subset of tags in their hierarchy. You can
specify more than one parent generator. The
system runs the parent generators first to
ensure the parent groups exist.
Use parent Generators for patterns If parent generators are specified, the pattern
for assigning to parents can be implied by the
pattern on the parent generator. Otherwise,
you can add your own parent patterns as
needed in the Parent Patterns field.
Parent Patterns Pattern of the parent that resolves based on
the values for the current group and then uses
that value as a parent group. In general, the
parent pattern should be fully resolved if the
child is fully resolved. Otherwise, tag names
will appear in the parent.

Pattern for group admin group Pattern for the group that can invoke group
administration from this group generator. For
example, [group_name]_groupadmin. This
pattern should be fully resolved if the child
is fully resolved.
Hierarchy of Tag Processing The list of tags used in the generator in the
order they should be processed.
Can be used in Security Models A flag indicating if this generator should
appear in the security model pages. Clear the
selection for grouping groups that are never
used directly in models.

6. Click Next. The group generator object appears in the Security/Group Generator folder.
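After generation, you can confirm that a generated group was attached to the expected parent. The
following is a minimal sketch only; it uses the standard dm_group type and its repeating groups_names
attribute, with cd_clinical_doc_author taken from the pattern example above (substitute a group name
produced by your own generator):
SELECT group_name FROM dm_group WHERE ANY groups_names = 'cd_clinical_doc_author'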

Creating a Folder Security Model


1. Log in to the D2 Client as the Administrator.
2. Select New > Content.
3. In the Creation profile field, select Security Objects.
4. In the Document Type field, select Security-Folder Security Model.
5. Type or select values in the fields described in the following table:

Field Description
Name Name of the folder security model object.
Folder Security Model Type of security model to be applied to
the folders. By default, it is either Open or
Product.
Default Level The folder-level tag to use when the tag
defined on the folder cannot be found in the
model, or if the tag defined on the folder is
null. If a folder has a subject value (level tag)
that does not match any of the existing tags,
this is the tag it should be evaluated as.
Use Default for Empty Select this option if folders with blank folder
tags should use the default folder level.
Attribute for Roles Attribute to store the resolved groups. By
default, the values are stored in the keywords
attribute.
Folder Level This is the level tag that is used to match on
a folder’s subject attribute to determine what
groups to store in the folder’s role attribute.

Group Generator This is the group generator to use to figure out
which groups to place in the role attribute. The
document that triggered the creation of the
folder will be used to evaluate all $ functions.
Tag These are the repeating attributes of the
document that are used in the group generator
to calculate the value that caused the creation
of the folder (the value must be used to create
the folder name) and figure out which groups
to assign. The values can be separated by a
pipe delimiter.
Tag Generator If the autolinking configuration used a
function or dictionary to create the folder
name, a corresponding function must be
defined here to make a match. Use ~tag to
refer to the value of the attribute identified
in the tags list. For example, if the folder name
is a country name resolved using the TMF Countries
lookup: $lookup(~tag.TMF Countries,en). The
values can be separated by a pipe delimiter.
Tag Level For each tag, specify a set of numbers that
indicate the folder level at which the
corresponding attribute is used. The values
can be separated by a pipe delimiter.

6. Click Next. The folder security model object appears in the Security/Folder Model folder.

Creating a Security Model


1. Log in to the D2 Client as the Administrator.
2. Select New > Content.
3. In the Creation profile field, select Security Objects.
4. In the Document Type field, select Security-Security Model.
5. Type or select values in the fields described in the following table:

Field Description
Security Model This is the name that is used to refer to the
model.
Type of Model Type of object on which the model is to be
applied, document or registration form.

Group for Document Admin Pattern to determine group allowed to
change the role attributes on the document.
For the Manual security model, this is
restricted_attribute_admin.
List of Modifiable Document Attributes List of attributes on the documents that can be
edited by the document admin. By default,
only the Authors attribute is editable.
Group for Model Admin Pattern to determine group allowed to change
the model for the document through the
change model menu action. Generally, the
target models would be restricted models.

For the Manual security model, this is


restricted_model_admin.
Shared Key Generator A CDF expression that, when resolved,
produces the same value for all documents
that should share the same shared attribute
object (that is, documents that have the same
group settings for security).
Restriction Level Used to filter models based on the document's
restriction_level. 0 means that this model is
only applied by resolving the reference model.
Other numbers are used to filter the list of
models when the user wants to restrict the
document and picks an associated restriction
level.
Attributes to be cleared from the document List of document attributes to clear out before
applying the model, in case they were set by
the prior model. If an attribute is set by the
model, it is automatically cleared first. There
is no need to include it here.
Attributes to be cleared from the shared object List of shared attributes to clear out before
applying the model, in case they were set by
the prior model. If an attribute is set by the
model it is automatically cleared first. There
is no need to include it here.
Attribute Location Indicates where the role attribute is
located—on the document or on a shared
object.

Role Attribute Name of the attribute that will store the group
values. Preferably should be a repeating
attribute. If the same attribute appears more
than once, it will be updated with a union of
all the rows.

The alldomain_<roles> is a set of users that


have access to documents in all the domains.

The select_<role> attribute is a set of attribute


values to be shared across the documents. For
example, select_authors is a master list group
that comprises all the authors across all
domains.
Group Generator The group generator to use to resolve the
attributes on the document. If the attribute is
repeating and has more than one value, you
get the Cartesian product of all permutations
of all the repeating attributes.
Resolved Tag A list of tags and resolved values to apply
to the group generator for tags that are not
resolved from values on the document. Uses
the format [tag]:value,[tag]:value.
Folder Model Name of the folder model for this document
model.
Filter You can use a DQL query to filter the folder
models. If the expression returns rows, use
this folder model. If not, move to the next row.
If blank, use this row.

6. Click Next. The security model object appears in the Security/Model folder.
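As an illustration of the Filter field described above, the following sketch shows the kind of DQL
expression that could select a folder model only when the current document carries a product code.
This is an assumption-based example only: the product_code attribute and the use of the $value CDF
function here are illustrative and not taken from a shipped configuration:
SELECT r_object_id FROM cd_common_ref_model WHERE r_object_id = '$value(r_object_id)' AND product_code IS NOT NULLSTRING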

Generating the User Groups Manually


To manually run the group generator:

1. Log in to the D2 Client as the Administrator.


2. In the Browse widget, navigate to Security/Group Generator folder.
3. In the Doc List, right-click a group generator and select Generate Groups.
4. In the Values field for a specific tag, type a value or a set of values separated by a pipe delimiter
and click OK. This field can be left blank to iterate all possible values.
The system generates all the groups identified by the generator as well as the parent
configurations to ensure the hierarchy identified can be created as well.


5. To edit the parent generator pattern for a group generator, right-click the generator and select
Copy Parent Generator. This clears the Use parent Generators for patterns option and copies the
parent generator patterns into the Parent Patterns field so that they can be edited. You can use
this to break the pattern link to the parent, while getting an initial setting to modify.
6. Right-click a generator and select Link to Parent Pattern to link to a parent generator. This
action ensures that the Use parent Generator for patterns option is selected and the values in
the Parent Patterns field are cleared.
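To confirm which groups a generation run created, you can list the matching groups directly. The
following is a minimal sketch using the standard dm_group type and the example naming pattern
discussed earlier (adjust the pattern to match your own group generator):
SELECT group_name FROM dm_group WHERE group_name LIKE 'cd_clinical_doc_%' ORDER BY group_name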

User Group Maintenance


Administrators can use the group maintenance utility tool to manage the groups that are used within
the security configurations. The tool simplifies the process of adding and removing multiple users
from multiple groups across domains. For example, you can add 10 users to the Approvers group
and remove 5 users to the Authors group at the same time using the tool instead of manually adding
and removing each user. The group maintenance tool is an Excel spreadsheet that is created from a
group generator object that is reserved for maintenance.

1. In the D2 Client, log in as the model Administrator.


2. In the Browse view, navigate to the Security > Group Generator folder.
3. Right-click a group generator object and select Reserve for Group Maintenance.
4. In the dialog box, specify a value for each tag and click OK. You can specify multiple values
separated by a comma delimiter.
The status of the group generator changes to Reserved and the object is converted to an Excel
spreadsheet. Once reserved, no one else can process updates to this group generator.
5. Open the spreadsheet and make the necessary changes.
In the spreadsheet, the Administrator can update the Add Users and Remove Users fields under
a particular group column with the relevant usernames they want to add or remove. All the other
fields are read-only. You can use a comma delimiter to list multiple users.
6. Right-click the spreadsheet and select Process Changes. This applies the changes to the groups.
This reads the spreadsheet and adds or removes users, as indicated, to or from the groups in the column
headers. All processing is captured in an XML file that shows the success or failure of each
operation. Additionally, a running results XML file is created as well. Both these files are added
as renditions to the group generator in the Renditions widget.
7. Right-click the spreadsheet and select Release Reservation. The status of the group generator
changes to Active.
8. To change the filters or get the latest values without losing your reservation in the spreadsheet,
right-click the spreadsheet and select Refresh Spreadsheet.
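After processing, you can verify the resulting membership of an affected group. The following is a
minimal sketch using the standard dm_group type; substitute the group you updated for the example
group name:
SELECT users_names FROM dm_group WHERE group_name = 'cd_gmp_approvers' ENABLE (ROW_BASED)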

Chapter 11
Workspaces and Welcome Pages

This section describes the workspaces and welcome pages used in the Life Science solutions.

Workspaces
A workspace is a container of widgets that allows you to personalize functionality for availability and
convenience. D2 includes preconfigured workspace templates. These templates contain the layout
and positioning of widget areas. The templates also come with a predetermined set of widgets.
Administrators can configure workspaces to contain workspace views. Views function the same
way as workspaces but provide a method for organizing widgets without losing widget-to-widget
interaction.
In the Life Sciences solution, the organization of the D2 workspace is based on user roles. This means
that all role-based view contexts, widgets, menus, and query forms are assigned to a single workspace
that includes the embedded views necessary for users in that role to perform their tasks and activities.

Workspace Views and Tasks


The following table lists the different workspaces available in the Life Sciences solution and the
tasks that can be performed in each of them.

Table 29. Workspace views and tasks


Business Administrator
View: Browse, Task, Administration, View Submission, Compare
Tasks: The following tasks can be performed in this workspace:
• Navigate the cabinets and folders
• Search for documents using either quick search or saved searches
• View details about documents such as Preview, Properties, Locations, Versions, Renditions, Relations, Audit trails, Workflow overview, and Permissions
• Add or remove group memberships
• Manage dictionaries
• Manage taxonomies
• Manage lifecycle transitions
• Manage the workflow
• Create new content, folder, cabinet, and group (user needs to have Create Group privileges)
• Import content such as file, rendition, or a new version of a document
• Check in, check out, or cancel the checkout of a document
• Edit a document
• View a document
• View native content
• Insert annotations
• C2 views (Audit Report, TBR Audit Report)
• Locate a document
• Manage Favorites
• Export a document or a folder
• Print a document
• Request a rendition
• Delete a document
• Create a relation
• Copy link to clipboard
• Process workflow tasks




Author
View: Welcome, Browse, Task, Dashboard
Tasks: The tasks that can be performed in this workspace are similar to the Coordinator tasks.
Reviewer/Approver
View: Welcome, Browse, Task
Tasks: The tasks that can be performed in this workspace are similar to the Author tasks, except that Reviewers/Approvers cannot create or import documents.
Consumer (read-only)
View: Welcome, Browse
Tasks: The following tasks can be performed in this workspace:
• Navigate the cabinets and folders
• Search for documents using either quick search or saved searches
• View details about documents such as Preview, Properties, Locations, and Relations
• Manage lifecycle transitions
• View a document
• Import a document
• C2 views (Audit Report, TBR Audit Report)
• Locate a document
• Manage Favorites
• Export a document
• Print a document
• Copy link to clipboard
• Process workflow tasks (needed by recipients to process TBR task)



Regulatory Manager
View: Browse, Task, Administration, View Submission, Compare, Dashboard
Tasks: The following tasks can be performed in this workspace:
• Navigate the cabinets and folders
• Search for documents using either quick search or saved searches
• View details about documents such as Preview, Properties, Locations, and Relations
• Manage lifecycle transitions
• View the submissions and also the documents imported in the submission in the PDF Viewer
• Locate a document
• Compare two documents
• Manage Favorites
• View the locations, rendition, versions, relations, audit, and workflow overview of the content
• View the virtual document created as part of regulatory correspondence email import
• Access the tasks assigned to the user and also view its attachment and properties
• Preview the thumbnail of the object and also provide notes to tasks and view tasks details
• Control group, dictionary, and taxonomy administration
Submission Contributors
View: View Submission, Browse, Task, Compare
Tasks: The tasks that can be performed in this workspace are similar to the Regulatory Manager tasks, excluding the administration tasks.


Workspace Groups
Workspace groups provide users with the flexibility to reassign the same workspace to multiple user
roles without being bound to the primary role groups that exist in the default Life Sciences solutions.
Workspace groups provide a mechanism to separate UI-based groups from security groups so that
workspaces can be assigned to the UI groups and not the security groups. The following table lists
the workspace groups and the user roles that are part of that group:


Solution Group Name Description Member Subgroups


Documentum for ws_lsrd_authors LSRD authors Users of the following
Research and workspace group groups:
Development
cd_ad_promo_doc
_authors

cd_ad_promo
_template_authors

cd_clinical_doc
_authors

cd_clinical_template
_authors

cd_labeling_doc
_authors

cd_labeling_template
_authors

cd_md_clinical_doc
_authors

cd_md_doc_authors

cd_md_non_clinical
_doc_authors

cd_md_regulatory
_doc_authors

cd_non_clinical_doc
_authors

cd_non_clinical
_template_authors

cd_quality_doc
_authors

cd_quality_template
_authors

cd_regulatory_doc
_authors

cd_regulatory
_template_authors

cd_safety_doc_authors

cd_safety_template
_authors

ws_lsrd_coordinators LSRD coordinators Users of the following


workspace group groups:

cd_ad_promo_doc


Documentum ws_lsssv_authors LSSSV author Users of the following
Submission Store and workspace group groups:
View
cd_corres_doc_authors

cd_corres_template
_authors

cd_submission
_archivists
ws_lsssv_consumers LSSSV consumers Users of the following
workspace group groups:

cd_corres_consumers
_imp

cd_corres_doc
_auditors

cd_corres_doc_readers

cd_regulatory_activity
_monitors

cd_regulatory
_consumers_imp
ws_lsssv_coordinators LSSSV coordinators Users of the following
workspace group groups:

cd_corres_doc
_coordinators
ws_lsssv_managers LSSSV managers Users of the following
workspace group groups:

cd_corres_managers

cd_product_managers

cd_regulatory_activity
_managers

cd_regulatory
_managers
ws_lsssv_reviewer LSSSV reviewer and Users of the following
_approver approver workspace groups:
group
cd_corres_doc
_approvers

cd_corres_doc
_reviewers

cd_corres_template
_approvers


Documentum ws_lsqm_authors LSQM authors Users of the following
for Quality and workspace group group:
Manufacturing
cd_gmp_authors
ws_lsqm_consumers LSQM consumers Users of the following
workspace group groups:

cd_gmp_auditors

cd_gmp_consumers
_imp

cd_gmp_readers
ws_lsqm_coordinators LSQM coordinators Users of the following
workspace group group:

cd_gmp_coordinators
ws_lsqm_reviewer LSQM reviewers and Users of the following
_approver approvers workspace groups:
group
cd_gmp_approvers

cd_gmp_qo_approvers

cd_gmp_reviewers
ws_lsqm_managers LSQM managers Users of the following
workspace group groups:

cd_md_managers

cd_md_regulatory
_managers

cd_md_submission
_managers



Documentum for ws_lstmf_authors LSTMF authors Users of the following
eTMF workspace group groups:

cd_clinical_doc
_authors_tmf

cd_clinical_tem
_authors_tmf
ws_lstmf_consumers LSTMF consumers Users of the following
workspace group groups:

cd_clinical_consumers
_imp

cd_clinical_doc
_auditors

cd_clinical_doc
_readers
ws_lstmf LSTMF coordinators Users of the following
_coordinators workspace group group:

cd_clinical_doc
_coordinators
ws_lstmf_contributors LSTMF contributors Users of the following
workspace group groups:

tmf_contributors

tmf_external
_contributors
ws_lstmf_managers LSTMF managers Users of the following
workspace group group:

cd_clinical_trial
_managers_tmf

cd_<domain>_ref
_copy_mgrs
ws_lstmf_product LSTMF product Users of the following
_managers managers workspace group:
group
cd_product_managers
ws_lstmf_inspectors LSTMF inspectors Users of the following
workspace group group:

tmf_inspectors
ws_lstmf LSTMF investigators Users of the following
_investigators workspace group group:

tmf_investigators
ws_lstmf_reviewer LSTMF reviewers and Users of the following
_approver approvers workspace groups:
group

Workspace Views for Workspace Roles


For the Life Sciences solutions, each out-of-the-box Life Sciences workspace role is mapped to a
workspace view D2 context as shown in the following tables.


Table 30. Workspace Views for LSTMF Workspace Roles

Workspace Name Welcome Screen Browse Task Quality Check My Sites Dashboard Administration Concurrent View eTMF
WS eTMF Author Y Y Y Y N Y N N N
Workspace
WS eTMF Y Y Y N N N N N N
Consumer
Workspace
WS eTMF Y Y Y Y N N N N N
Contributor
Workspace
WS eTMF N Y Y Y N Y Y N N
Coordinator
Workspace
WS eTMF Inspection Y N N N N N N Y Y
Workspace
WS eTMF N Y Y N N Y N N N
Multi-view
(Coordinators)
WS eTMF N Y Y N N Y Y N N
Product Managers
Workspace
WS eTMF Y Y Y N N N N N N
Reviewer-Approver
Workspace
WS Life Sciences Y N Y N Y N N N N
TMF Investigator
Workspace


Table 31. Workspace View for LSQM Workspace Roles

Workspace Name Welcome Screen Browse Task Dashboard Administration


WS QnM Multi-view (Authors) Y Y Y Y N
WS QnM Multi-view (Consumers) Y Y Y N N
WS QnM Multi-view N Y Y Y Y
(Coordinators)
WS QnM Multi-view Y Y Y N N
(Reviewer-Approver)


Table 32. Workspace View for LSRD Workspace Roles

Workspace Name Welcome Screen Browse Task Dashboard Administration


WS RnD Multi-view (Authors) Y Y Y Y N
WS RnD Multi-view (Consumers) Y Y N N N
WS RnD Multi-view (Managers) N Y Y Y Y
WS RnD Multi-view Y Y Y N N
(Reviewer-Approver)


Table 33. Workspace View for LSSSV Workspace Roles

Workspace Name Welcome Screen Browse Task View Submission Dashboard Compare Administration
WS SSV Multi-view Y Y Y Y Y Y N
(Authors)
WS SSV Multi-view Y Y N Y N Y N
(Consumers)
WS SSV Multi-view N Y Y Y Y Y Y
(Managers)
WS SSV Multi-view Y Y Y Y N Y N
(Reviewer-Approver)


Display Labels
Labels can be set to show different values for attributes than those that are stored in Documentum
Server. The Life Sciences solution utilizes this functionality to change the way the status label for
Effective documents displays for non-Good Manufacturing Practices (GMP) documents. The display
label is set to show Approved or Final instead of Effective for these documents despite the attribute
being saved in Documentum Server with an a_status of Effective.
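The mapping between the stored status value and the displayed label is held in a D2 dictionary. As an
illustration, the Domain Document Status Display dictionary (also used by the reporting queries later
in this guide) holds one such status-to-label mapping; the following query lists its values:
SELECT object_name, alias_value FROM d2_dictionary_value WHERE dictionary_name = 'Domain Document Status Display'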

Welcome Pages
Welcome pages in the Life Sciences solution are .jsp files that are called through an external widget
within D2. When the user first logs in to the D2 Client, the Welcome view is displayed with a single
external widget that points to these .jsp files. In addition, there is a utility .jsp file for each Welcome
page that contains the DQL queries and a CSS file that contains the style information. There also
exists an imgs folder that contains the images displayed on the pages.
There are three basic, identical Welcome pages for Documentum for Quality and Manufacturing,
Documentum for Research and Development, and Documentum Submission Store and View, for the
following roles:
• Authors
• Reviewers/Approvers
• Readers
Documentum for eTMF includes Welcome pages for the following roles:
• Author
• Contributor
• Inspector
• Investigator
• Reviewers/Approvers
• Readers
Each Welcome page includes a menu bar and a quick action bar. The menu bar changes based
on each workspace. The menu bar provides an actionable button for one view available in the
workspace, which is displayed on the top left of the Welcome page. Clicking this button switches the
user to the specified view.
The menu bar also includes the following buttons with numbers next to them indicating the number
of documents the user needs to address:
• Tasks: Number of workflow tasks assigned to the user. This button appears for all solutions.
• Index: Number of documents the user has uploaded but not yet indexed. This button appears
only in Documentum for eTMF.
• My Sites: Number of key documents uploaded to a site. This button appears only in Documentum
for eTMF.


These values do not update dynamically. You must refresh the D2 Client page to update the values.
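As an illustration, the Tasks count reflects the user's workflow inbox. A query of the following form
returns that number; this is a minimal sketch using the standard dmi_queue_item type, not necessarily
the exact query contained in the utility .jsp files:
SELECT count(*) AS task_count FROM dmi_queue_item WHERE name = USER AND delete_flag = 0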
The quick action bar includes the following two buttons:
• Create: Invokes the D2 create_object method and displays the Creation Profile interface
to the user.
• Import: Invokes the D2 import_object method and displays the Import Creation Profile
interface to the user.

Chapter 12
Reports

This section provides an overview of reporting implemented in the Life Sciences solutions.

Information Hub
The OpenText Information Hub (iHub), a part of the OpenText Analytics Suite, is a scalable analytics
and data visualization platform that enables IT leaders and their teams to design, deploy, and manage
secure, interactive web applications, reports, and dashboards fed by multiple data sources. iHub
supports high volumes of users and its integration APIs enable embedded analytic content in any
app, displayed on any device.
OpenText Analytics Designer is a tool used to develop and design the reports for the Life Sciences
solutions with the help of the Documentum for Life Sciences JDBC connector. Using the Analytics
Designer reporting tool, you can also publish reports directly to iHub. The JDBC connector provides
a connection interface between the Documentum platform and the Analytics Designer and iHub. The
JDBC connector is bundled with the Documentum for Life Sciences iHub reports package.

Using the iHub Analytical Designer


You can use the iHub Analytical Designer to update existing reports and add new reports. Before
starting to update/add reports, you must use the Documentum connector to establish a connection to
the Documentum repository using the following steps.

Connecting the iHub Analytical Designer with the Life Sciences Repository

The following optional steps enable the design and deployment of custom iHub reports through
iHub Analytical Designer.

1. One component provided by Documentum for Life Sciences iHub Reports is a Documentum
JDBC connector. This connector enables extraction of repository information through DQL
queries. The Documentum JDBC driver JAR needs dfc.properties to be bundled within the
JAR due to a known limitation. The dfc.properties is bundled inside the Documentum JDBC

235
Reports

jar, lsdmjdbc.jar, and points to C:\Documentum\Config\dfc.properties (on Windows).


You must ensure that the Content Server connection properties are placed in the specified location.
2. Create a file named iHubDocumentumJDBC.properties on the iHub Designer machine
in the following location:
• <iHub Install location>\modules\BIRTiHub\iHub\bin (on Windows)
• <iHub Install location>\modules\BIRTiHub\iHub\encyc (on Linux)
and add the following values in it:
• dctmAdminUserName=<DOCUMENTUM_INSTALL_OWNER_USER_NAME>
• dctmAdminPassword=<DOCUMENTUM_INSTALL_OWNER_ENCRYPTED_PASSWORD>
• numberOfDFCSessions=<NUMBER_OF_SESSIONS>
3. If you want to encrypt the password for <DOCUMENTUM_INSTALL_OWNER_ENCRYPTED
_PASSWORD>, use the following command at a command prompt:
java com.documentum.fc.tools.RegistryPasswordUtils password
4. Place the iHubDocumentumJDBC.properties file in the <Install Directory>/OpenText
/AnalyticsDesigner folder.
5. Place the following JARs in the same location as lsdmjdbc.jar of the JDBC driver to iHub:
• activation.jar
• All-MB.jar
• aspectjrt.jar
• bpmutil.jar
• certj.jar
• ci.jar
• collaboration.jar
• commons-codec-1.3.jar
• commons-lang-2.4.jar
• configservice-api.jar
• configservice-impl.jar
• cryptojce.jar
• cryptojcommon.jar
• dfc.jar
• DmcRecords.jar
• dms-client-api.jar
• jaxb-api.jar
• jaxb-impl.jar
• jcifs-krb5-1.3.1.jar
• jcmFIPS.jar


• jsr173_api.jar
• krbutil.jar
• log4j.jar
• messageArchive.jar
• messageService.jar
• questFixForJDK7.jar
• subscription.jar
• vsj-license.jar
• vsj-standard-3.3.jar
• workflow.jar
• xtrim-api.jar
• xtrim-server.jar
These JARs must be copied from the Documentum Server. Some of the specific JAR file versions
may vary with the Documentum Server version.
6. In the iHub Analytics Designer, create a new project and Data Object.
7. Navigate to the .datadesign file and in the Data Sources section, right-click and select New
Data Sources.
8. On the New Data Source page, select JDBC Data Source and click Next.
9. On the New JDBC Data Source Profile page, select Manage Drivers.
10. On the Manage JDBC Drivers page, on the JAR Files tab, click Add.
11. Select the lsdmjdbc.jar file provided with this package and click Add.
12. Select all the required JARs listed in Step 5 and add them as part of the JAR Files. Click OK.
13. On the Create a new data source page, in the Driver Class list, select
com.documentum.ls.oca.jdbc.jdbc20.ext.DjdbcDriverExt (v7.2).
14. In the Database URL field, type jdbc:documentum:oca:docbaseext@<REPOSITORY NAME>,
where <REPOSITORY NAME> is the Documentum Server repository name.
15. In the User Name and Password fields, you can either provide the Administrator user name
and password (this causes all the reports to be executed using the Admin Session) or configure
as report parameters:
a. If you need to replace the User Name and Password at runtime, remove the default values
from the report parameters.
b. In the Data Design, create two report parameters, username and password with Data type
as String.


c. On the Edit Data Source page, under Property Binding, in the User Name and Password
fields, map the user name and password as newly created report parameters.

d. In the .datadesign file, to replace the user name and password at runtime, place the following
script in the beforeOpen script:
extensionProperties.odaUser = params["username"].value
extensionProperties.odaPassword = params["password"].value


16. Create new Data Sets that provide the DQL queries required to fetch the report data from the
repository. The following sample shows a dynamic mandatory parameter (with Is Required
check box selected) that needs to be passed when executing the report. The parameter should be
the i_chronicle_id of the document selected in D2.
a. In the Edit Parameters dialog box, create another report parameter named chronid, which
represents the Selected Document Chronicle ID and passes it to the document query
retrieving the required document data for the report.

b. Click the data set to open the Edit Data Set dialog box and then click Query.


c. Under Query Text, add the following query for retrieving the required document data
needed by the report:
SELECT
r.object_name AS document_name,
r.title AS title,
r.r_creation_date AS creation_date,
r.r_modify_date AS last_modified_date,
v.alias_value AS status,
r.r_version_label as version_label,
r.primary_group AS "group",
r.subgroup AS subgroup,
r.artifact_name AS artifact,
r.r_object_id as object_id,
r.i_chronicle_id as chron_id
FROM
cd_common_ref_model(ALL) r, d2_dictionary di, d2_dictionary_value v
WHERE
r.i_chronicle_id = '?' and
di.object_name = v.dictionary_name AND
di.object_name = 'Domain Document Status Display' AND
di.alias_name=r.r_object_type AND
v.object_name = r.a_status AND
v.i_position=di.i_position and
r_version_label != ' '
ORDER BY
r.r_creation_date
ENABLE
(ROW_BASED)

d. On the Outline tab in the iHub Analytics Designer, select the newly created data source.
e. For the selected data source, on the Scripts tab, add the following script:
this.queryText=this.queryText.replace("?",params["chronid"].value);
This script ensures that the dynamic content from the Query Text (in this case ‘?’) is replaced
with the added param required to execute the query.
17. To create a new report design, you need some draft data to be available in the report designer,
preloaded for designing. In the report parameters username, password, and chronid, provide
default values to create the data object and then save the report design.
Note: These values should be reverted after the .data object has been created; otherwise, the packaged
reports will bundle these values as default values. The Default value field must be empty.
18. On the Project Navigator tab, navigate to the newly created .datadesign object, right-click it and
select Generate Data Objects to create the new .data object used in the report design.

Creating an iHub Report


A new iHub Report design has the extension RPTDESIGN added to the name of the file. To create
the report:

1. On the Project tab, right-click the new project and select New > Report. To create a new report
design, see the OpenText Information Hub Designer Guide. For the above example query, the sample
report layout is shown in the following figure:


In the report design, the data object points to the created data object. This needs to be changed to
the Data Design object after the designing is done.

2. After completing the report design, point the data source for the Report Design to the .datadesign
object so that the values are replaced based on the document selected at run time.
3. The Data Object parameters, both new and existing, must be defined as shown in the following
figure:


4. In the report design object, in the beforeOpen field for the .datadesign object, replace the existing
script with the following:
extensionProperties.odaUser = params["username"].value;
extensionProperties.odaPassword = params["password"].value;

5. Publish the new report to the iHub Report Engine.


6. After the report has been published to the iHub Server, the report reference must be updated
in the ReportViewer.html file in the XMLViewer.war deployed location on the application
server.
7. Create a new D2 external widget configuration and reference the report in the Widget URL
with the dynamic parameters.

myInsight
AMPLEXOR myInsight for Documentum (myInsight) is a third-party reporting tool developed by
AMPLEXOR and is integrated with the Life Sciences solutions to provide reporting functionality.
myInsight can be integrated into the Documentum Administrator, Webtop, or D2 user interfaces,
enabling reports to be run directly from those clients. For the D2 integration, myInsight includes
a web application that can be deployed along with the D2-based Life Sciences solution to provide
reporting widgets.
The myInsight web application dynamically generates reports based on Report Definitions. A Report
Definition is a Documentum object that defines one or more variables and DQL queries that will
generate data for the report. The look and feel and the format of a report are defined through Report
Presentation .xsl files, which are referenced in the Report Definition. Report Presentation .xsl files
can be stored as Documentum objects or can be present on the file system where the JMS (Java
Method Server) is running.


The reports enable users to determine the following:


• Documents and placeholders that exist for a specified site (that is, site inventory)
• Trials that exist for a given product
• Missing documents for a selected trial
• Progress of a trial
• Progress of each stage of a trial
• Status of each site for a selected trial
• List of audited events
• Progress of the workflow tasks for a document
• Open tasks in a workflow
• List of document templates
• List of documents assigned for periodic review
• Completed TBR tasks
• Regulatory Submissions for a region
Some Life Sciences reports use the Fusion Charts JavaScript library, which is bundled with myInsight
4.x (http://www.fusioncharts.com/) to generate graphical reports.

Report Widgets
The Life Sciences solution includes reporting widgets to display reports in the D2 user interface.
When a user selects a document that matches the type of document required for a reporting variable
(such as a Clinical Trial, Site, or Product Registration Form), any reporting widgets that are displayed
in the user’s workspace update to display the report defined by the widget.
The following figure shows the configuration of a reporting widget.

The Widget type is External Widget. The Widget url references the myInsight web application and
the URL parameters include login information, the object ID of the selected object, and the object ID
of the Report Definition object. The D2_EVENT_SELECT_OBJECT communication channel event is
selected, which enables the widget to be notified when the user selects an object.

Report Generation Process


When you display a report in a Life Sciences solution, the following processing takes place:
1. The Life Sciences solution issues an HTTP request to the myInsight web application. The object
ID of the selected object is passed as the objectid URL parameter. For example:
http://localhost:8080/myInsight/?user=$LOGIN&repository
=$DOCBASE&objectid=$value(r_object_id)&openreport
=090002da8007b664&ticket=$TICKET
2. The myInsight web application populates the parameters of the Report Definition queries with
the URL parameter values and executes the queries against the Documentum repository.
3. myInsight converts the query results to XML.
4. myInsight fetches the corresponding Report Presentation XSL and applies the defined
transformations to the XML result in order to output the report with the look and feel defined
by the Presentation XSL.

Configuring External Widgets for myInsight Reports


The external widget URL must be associated with a specific report so that the widget displays the
actual report. To configure the External Widget for viewing myInsight reports, follow these steps:
1. Query LSTMF Report Definitions.
SELECT r_object_id, object_name, report_name FROM my_report_definition
(ALL) WHERE FOLDER ('/myInsight/Categories/Life Sciences/eTMF')


Each resulting r_object_id represents a specific report. The object ID of a Report Definition must
be associated with the External Widget corresponding to that report. For consistency, the object
name and location of the report are used to identify specific reports.
2. Specify the object_id in the External Widget URL, as shown in the example after this list. You can
also specify the report location instead of the r_object_id.
This configuration is only required if the report is needed in a standalone widget. Other options
include:
• Run the report from the myInsight Reporting widget.
• Run the report from the "Reports by Object Type" widget.
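For illustration, a widget URL for a standalone report widget might look like the following sketch; it
reuses the URL parameters shown earlier in this chapter, with the openreport value standing in for
the r_object_id of the Report Definition returned by the query in step 1:
http://localhost:8080/myInsight/?user=$LOGIN&repository=$DOCBASE
&objectid=$value(r_object_id)&openreport=<Report Definition r_object_id>
&ticket=$TICKET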

Accessing the Reports


The myInsight reports in the Life Sciences solution are categorized as follows:
• Standard reports: These reports are packaged with the Life Sciences solution and include folder
reports, repository reports, and user reports.
• Life Sciences Generic reports: These reports enable you to monitor active workflows, audit trails,
document status, and so on. These reports are common across all Life Sciences solutions.
• Solution-specific reports: These reports show information related to a specific solution.
You can access the reports in the following ways:
• Dashboard widget: You can view the Life Sciences generic reports through the Dashboard widget.
The Dashboard widget reports do not require user input or a specific document to be selected to
generate the report. These reports provide data for all documents in the repository.
• Report List widget: The Report List widget is available for all Managers and Coordinators in the
Life Sciences solution. You can double-click a report in this widget, provide the required user
input value or artifact, and generate the report for that artifact.

Chapter 13
Change Request

The following sections provide steps to disable the Change Request functionality in Documentum
for Quality and Manufacturing, to configure Release Pending as the final state for Category 1
documents, and to prevent a Change Request document from being sent to a workflow when a
non-current version of a document is attached to it.

Disabling Change Request for Category 1 Documents

The following procedure enables you to send a Category 1 document to a workflow without
associating it to a Change Request:

1. Log in to D2-Config as Administrator.


2. Select All elements from the configuration filter on the main toolbar.
3. Select Go to > Workflow.
4. Under Workflow List, select Cat 1 Approve Site Performers [1-GMP-APPROVE].
5. Under Workflow entry conditions, remove the Condition checked by method entry condition that
has the value CDFValidateAffectedDocumentHasApprovedCR and click Save.
6. Perform steps 4 and 5 for the Cat 1 Review Approve Site Performers [1-GMP-REVIEW
-APPROVE] workflow.
7. Select Go to > Lifecycle.
8. Under Lifecycles, select QM Cat 1 Controlled Document Lifecycle.
9. Under Properties, in the Lifecycle state table, select (launch withdraw wf).
10. Under Entry conditions, remove the Condition checked by method entry condition that has the
value CDFValidateAffectedDocumentHasApprovedCR and click Save.
11. Perform steps 9 and 10 for the (Reinstate as effective) lifecycle state.
To disable the Change Request functionality:

1. Log in to D2-Config as Administrator.


2. Select All elements from the configuration filter on the main toolbar.
3. Select Creation > Creation profile.


4. Under Profiles, select Change Request.


5. Under Properties, in the Users group field, replace cd_gmp_authors with admingroup.
6. Click Save.

Configuring the Change Request Properties


If you want to configure the properties page of a Change Request, such as setting the Change Description
and Impact Assessment fields as optional or removing these fields entirely, follow these steps:

1. Log in to D2-Config as Administrator.


2. Select All elements from the configuration filter on the main toolbar.
3. Select Go to > Property page.
4. Under Property page, select Change Request Document Properties.
5. Under Structure, expand Properties, and then expand Affected Documents.
6. To make a field optional, in the left pane, select the particular field and in the right pane, on the
Required condition tab, clear the Creation mode, Edit mode, and Import mode checkboxes.
7. To remove the fields, select a field and click Delete.
8. Click Save.

Configuring Release Pending as the Final State for Category 1 Documents

You can configure Category 1 documents to have Release Pending as the final state instead of
Effective by using the following steps:

1. Log in to D2-Config as Administrator.


2. Select All elements from the configuration filter on the main toolbar.
3. Select Go to > Lifecycle.
4. Under Lifecycles, select QM Cat 1 Controlled Document Lifecycle.
5. Under Properties, in the Lifecycle state table, select the Release Pending row.
6. In the Action Type table, click the + icon to add an action.
7. Select Make version from the list and under Action Parameters, in the Version type list, select
Same document with major version number.
8. Select the Keep symbolic version label checkbox.
9. Make the new action as the first action in the Action Type table.
10. Click Save.
11. In the Next state table, go to the (Cancel release) row.


12. Under Transition parameters, remove the values in the Menu label en field and click Save.
13. In the Lifecycle state table, select the Effective row.
14. In the Entry condition table, delete the Condition checked by Method entry condition row that
has the parameter CDFValidateAffectedDocumentHasApprovedCR and click Save.
15. In the Action Type table, delete the Make version action type row and click Save.
16. Select Creation > Default values template.
17. Under Values templates, select GMP Change Request Default Values.
18. In the Properties table, for the cr_document_final_state property, in the Default values field,
add |Release Pending at the beginning of the existing default value. The resulting value should be
|Release Pending|Effective|Withdrawn|.
19. Click Save.

Binding Rules for Change Requests


A Change Request workflow can be started only when there is at least one document associated with
it and when all the mandatory fields in the affected document tab are filled. The following table lists
the binding rules configured in the system for Change Requests:

Document Version    Default (out-of-the-box) Binding Rule    CURRENT Binding Rule
0.X                 0.1                                      0.X
X.Y                 X.0                                      X.Y
When the CURRENT binding rule is configured, the Change Request workflow cannot be started
unless the affected documents information is updated with the current version of the child documents.
Update the Change Request properties for the affected documents and save the property page.
A Change Request in the CIP state can be reverted to Draft when the following conditions are met:
• The user requesting the state change must be the Author or Document Coordinator of the Change
Request.
• The associated documents must not be participating in any active workflows. Any active
workflows must be aborted.
• None of the associated documents on the same Change Request has already reached its final
status. For the change types New or Revise, the final status is the Next Major Version. For Reinstate
and Withdraw, it is the Bound Major Version, with status Effective for the Reinstate change type and
Withdrawn for the Withdraw change type.


Note:
— Next Major Version: If an X.Y version document is bound to a Change Request, the final
status is Effective or Suspended with the next major version, that is, the document version
will be X.0 + 1.0.
— Bound Major Version: If an X.Y version is bound to Change Request, final status is Withdrawn
with the major version, that is, the document version will be X.0.
A CIP Change Request can be closed when the following conditions are met:
• The user requesting the state change must be the Document Coordinator of the Change Request.
• The associated documents must not be involved in any active workflows.
• All of the associated documents must have reached their final status. For the change types New or
Revise, the final status is the Next Major Version. For Reinstate and Withdraw, it is the Bound Major
Version, with status Effective for the Reinstate change type and Withdrawn for the Withdraw change type.
Note:
— Next Major Version: If an X.Y version document is bound to a Change Request, the final
status is Effective or Suspended with the next major version, that is, the document version
will be X.0 + 1.0.
— Bound Major Version: If an X.Y version is bound to Change Request, final status is Withdrawn
with the major version, that is, the document version will be X.0.
Documents can be sent to the Review/Approval, Approval, or Withdrawal workflows only when
there is a valid Change Request created and approved (indicated by the CIP status) for the document.
For example, one CIP Change Request can be used to make the SOP document final only once. Once
the document reaches its final status, a new CR can be created and approved to make any revisions to
the document or to withdraw the document irrespective of the old CR status.

Exceptions: An X.0 version of a document attached to a CIP Change Request is sent to the Withdrawal
workflow. If the Approvers reject the workflow, the version of the document remains in the X.0
Effective state. The Change Request cannot be closed until the document is Withdrawn.

Configuring the CURRENT Binding Rule for a Change Request

You can make this configuration at any time in such a way that the system has some Change Requests
that follow the default binding rule and other Change Requests that follow the CURRENT binding
rule. The Change Requests follow the binding rule as defined in the Change Request at the time
of creation.

1. Log in to D2-Config as Administrator.


2. Select All elements from the configuration filter on the main toolbar.
3. Select Creation > Default values template.
4. Under Values templates, select GMP Change Request Default Values.
5. In the Properties table, for the cr_binding_at_creation property, type CURRENT in the Default
values field.


6. Click Save.

Chapter 14
Controlled Print and Issued Print

The Controlled Print and Issued Print overlay and cover page templates can be created using PDF
acro-forms. PDF acro-forms can be created by creating an Open Office document and then exporting
it to PDF or by creating the PDF using Adobe Acrobat. The following sections describe the permitted
form fields in the templates.

Note:
• By default, Controlled Print and Issued Print are configured for use with Documentum for
Quality and Manufacturing only.
• The Controlled Print functionality is printer driver-agnostic. If you experience any issues during
printing, make sure first that the printer driver is the latest version. If problems still occur,
contact OpenText Global Technical Services and provide the PDF that is failing with the exact
Printer Model and driver information.

Cover Page Configuration


The following table lists the tags that are allowed on the cover page. These tags represent the
acro-fields of the PDF template. In the generated PDF these tags are replaced by the actual values.

Tag Description Controlled Print Issued Print


#controlled_copy_number Controlled Copy Number Y
#issuedPrintNumber Issued Print Number Y
#requestor Requestor of the print Y Y
#recipient Recipient of the print Y
#selectedAction Print or reprint Y Y
#printSite Issued print site Y
#printLocation Issued print location Y
#printer Name of the printer Y Y
#profile Name of the print profile Y Y

#pageRange Page range printed. Works only for reprints Y
#profileAttributes Profile attributes and values separated by a semicolon Y
#reason Print reason Y
#reprintReason Issued print reprint reason Y
#datetime Prints the current system date and time Y Y
#isInternal Controlled Print Internal or External User indicator Y

To configure the date/time format for #datetime or any date/time data type property, the
D2ControlledPrint.properties file provides a DATE_FORMAT value that can be
used to specify a Java SimpleDateFormat date/time format. It is also possible to use the
$datevalue(<attribute>,"<Java Date Format>") syntax (for example, $datevalue(r_modify_date,
"dd-MMM-yyyy hh:mm:ss a z")) rather than just the name of the attribute in order to specify different
date formats for various properties. However, this only works for properties on the document object
itself and does not support the #datetime computed value.
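For example, a minimal sketch of the corresponding entry in D2ControlledPrint.properties, assuming
the SimpleDateFormat pattern shown above:
DATE_FORMAT=dd-MMM-yyyy hh:mm:ss a z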

Overlay Configuration
Overlay configuration is a two-step process:
1. Configure the profile attributes—Attribute is a descriptive label. The profile attributes section in
the Print widget displays the labels as specified in the attribute. Value represents the look-up
parameter. For example, $object_name retrieves the document name. These parameters are used
to retrieve the values from the repository and present them to the user on the profile attributes
section in the Print widget. These values will be printed on the overlay template.
In the current configuration, this step is no longer required, with the exception of #user_input,
which represents the user-input field and needs to be preconfigured for the overlay template in the
profile. Instead, the system can retrieve any property directly from the document and make it
available in the overlay.
Therefore, you can put the document properties directly in the PDF template and avoid
configuring them in the Print Profile. See Creating the Overlay PDF Template, page 259 for the
steps to create the PDF. This follows the same principles of C2 overlays, which is to put the
Documentum property name as the field name, <property_name>, for example, object_name.
For backward compatibility, it also supports property names from the profile. In addition, it
supports $<property_name>, for example, $object_name, and all of the # coverpage variables as
well. In addition, CDF functions that display document status can be added as values in the PDF
template field. For more information about the CDF functions, see the AttributeExpression
JavaDocs. For an example of how to add a CDF function in the PDF field, see Creating the Overlay
PDF Template, page 259.
If you want to specify a value on the Print widget at run-time, you need to configure that value
in the Print Profile.
2. Configure the PDF template—The overlay template acro-fields need to have the same strings
as the attribute labels. In the rendered PDF, the attribute specified in the acro-field is replaced
with either the value retrieved from the repository or with the value specified by the user. This
process is described in the following diagram:

Note that the Profile Attributes are now optional but supported for backwards compatibility in the
overlay as mentioned in Step 1. For example, instead of the form having Modify Date as the field
name, you can use r_modify_date as the field name and not have the Modify date in the profile.


Note: The "controlled_print_stamp" System Parameter must be manually added to force the system
to apply overlays as an overlay and not as an underlay.

Creating the Print Profile


Documentum for Quality and Manufacturing provides three Print Profile configurations, Print
Profile #1, Print Profile #2, and Issued Print Profile #1, each with its own cover page and overlay
configurations, which you can use out-of-the-box. The configurations for the Print Profiles are as
follows:
• Print Profile #1 — This profile does not include a cover page. It includes an overlay and the
attributes, Comments and #user_input.
• Print Profile #2 — This profile includes a cover page and an overlay. The cover page contains the
information, Document Name <$object_name>, Title <$title>, and Version <$r_version_label>.
• Issued Print Profile #1 — This profile does not include a cover page. The overlay includes the
following print information, Print Site <#printSite>, Print Location <#printLocation>, Printer
<#printer>, Issued Print Number <#issuedPrintNumber>, Version <$r_version_label>, status
<$a_status>, Print Date <#datetime>, Print Profile <#profile>, and Batch Number <Batch Number
print profile attribute>.
In both Print Profile #1 and Print Profile #2, the overlay contains the following information:
Printed by <#requestor>, Print Date and Time <#datetime>, CC# <#controlled_copy_number>,
and Printed For <#recipient>.
Using these default Print Profile configurations as samples, you can create your own custom Print
Profiles. The supported Print Profile attributes that can be added through the D2-Config UI are:
• #user_input
• $<property>, for example, $object_name
Note: CDF functions are not supported in the overlay in the Print Profile attributes through the
D2-Config UI.
Follow these steps to create a Print Profile:
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Controlled Printing > Profiles.
4. On the Profiles page, click New.
5. In the Profile Name field, specify the profile name.
6. In the Overlay File field, click the ellipsis button and browse to the template. Select the file and
click OK.
7. In the Cover Page File field, click the ellipsis button and browse to the template. Select the file and
click OK. This field is optional and can be left blank if you do not want to include the cover page.
8. Select the C2 Print Profile you want to associate with the Controlled Print Profile.
9. Specify the Attribute and Value for the overlay template.


10. Click Save.

Importing Print Profiles


If you import the Controlled Print profiles using the command-line D2-Config import, the profiles do
not get imported unless you copy the ControlledPrint JAR files from D2-Config to the Documentum
Server where the import is performed.
You must copy the D2-Config\WEB-INF\lib\ControlledPrintConfig-API.jar and
D2-Config\WEB-INF\classes\plugins\ControlledPrintConfig-Plugin.jar files to
the JMS ServerApps.ear\lib folder before running the Config Import. They can be removed
afterwards or left as they will not cause any harm.
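For example, on a Windows Documentum Server host the copy might look like the following sketch;
both paths are placeholders for your own D2-Config extraction and JMS deployment locations:
copy "<D2-Config>\WEB-INF\lib\ControlledPrintConfig-API.jar" "<JMS>\ServerApps.ear\lib"
copy "<D2-Config>\WEB-INF\classes\plugins\ControlledPrintConfig-Plugin.jar" "<JMS>\ServerApps.ear\lib"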

Creating the Overlay PDF Template


Follow these steps to create a custom controlled print overlay PDF template.

1. Create a blank Word document in Microsoft Word and convert it to PDF.


2. Open the PDF in Adobe Acrobat.
3. Select File > Create > PDF Form.
4. In the Create or Edit Form dialog box, select Use the current document or browse to a file
and click Next.
5. Select Use the current document and click Next.
6. In the Tasks pane, click Add New Field and select Text Field.
7. Place the Text Field on the PDF.
8. In the Field Name box, type the name of the controlled print property or document property,
such as #<controlled print property>, $<Documentum Object Property>, <Print Profile Attribute
Name>, and so on. For example, $object_name, #datetime, $version_label, and so on.

CDF functions can also be added as a value in the PDF Field Name box. If you want to display
the document status by retrieving the information from the Dictionary alias, you can use the
following syntax:


$lookup(<property>, <dictionary name>, <dictionary alias>)


For example:
$lookup($value(a_status), Document Domain Status Display, $value(r_object_type))
For a document with a_status = ‘Effective’, r_object_type = ‘cd_quality_gmp_approved’, and the
corresponding value in the dictionary, the expression entered in the Field Name box resolves to:

$lookup(Effective, Document Domain Status Display, cd_quality_gmp_approved)
In the PDF overlay, Approved is displayed as the result.
9. To add text to the PDF, add another Text Field and in the Text Field Properties dialog box, in the
Name field, type the required text.
10. Fill the property field in the other tab in the dialog box and click Close.
11. Save the PDF.
To import the PDF template to an existing print profile:

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Controlled Printing > Profiles.
4. In the Profiles page, select a profile.
5. In the Overlay File field, click the ellipsis button and browse to the PDF template. Select the
file and click OK.
6. Click Save.

Mapping the Contexts to the Controlled Print Profiles

1. Log in to D2-Config as Administrator.
2. In the filter on the main toolbar, select All elements.
3. Under Configuration elements, expand Controlled Print Profiles.
4. Under Contexts, expand Controlled Documents.
5. For the Cat 1 Controlled Documents context, select the Print Profile 1 and Print Profile 2
configuration element.
Note: You can extend the functionality to Category 2 and 3 documents if required.
6. Click Save.
The client URLs for D2 and D2-Config must be configured in D2-Config for the Controlled Print
functionality to work properly.


Setting Up the Printers


1. Add or configure the required physical network printers on the Application Server machine
using the Control Panel > Add Printers option.
2. In D2-Config, add the printer name as it appears on the Application Server to the Controlled
Print Approved Printers dictionary.
3. For Issued Print, additionally perform these steps:
a. In D2-Config, update the Print Locations dictionary as appropriate. The Print Locations
dictionary in conjunction with the GMP Applicable Sites and Controlled Print Approved
Printers dictionaries is used when creating an issued print or reprint.
b. Update the Printer by Site and Location taxonomy as appropriate.

Print Reasons
Refer to the following table and update appropriate dictionaries to add or remove the reasons for
Print, Reprint, and Recall.

Action Dictionary
Print reasons for internal users Controlled Print Reason Internal
Print reasons for external users Controlled Print Reason External
Reprint reasons Controlled Print Reprint Reason
Recall reasons Controlled Print Recall Reason

Issued Print Site-based Groups


Any time the GMP Applicable Sites dictionary is updated, the site-based groups must be generated:
• To create the Issued Print <Site> Approvers groups, right-click the /Security/Group
Generator/secGroup_IssuedPrintSiteApprovers group generator, and select Generate Groups.
This creates the cd_issued_print_<site>_approvers groups for all GMP Applicable Sites. These
cd_issued_print_<site>_approvers groups are placed within the cd_issued_print_approvers
group.
• To create the Issued Print <Site> Reconcilers groups, right-click the /Security/Group
Generator/secGroup_IssuedPrintSiteReconcilers group generator and select Generate Groups.
This creates the cd_issued_print_<site>_reconcilers groups for all GMP Applicable Sites. These
cd_issued_print_<site>_reconcilers groups are placed within the cd_issued_print_reconcilers
group.
Note: These groups are automatically created during the installation of the solution. However,
if a new Site is added to the system, the preceding steps must be executed. This generates the
new site-based issued print groups. Users or groups can also be populated into these new groups
through the Group Generators using the Reserve for Group Maintenance menu option, editing the
spreadsheet, and then selecting the Process Changes menu option. This is the generic Security
Group Generator processing.

Configuring Auto-Recall
Printed documents can be recalled automatically based on the lifecycle state. The following steps
outline the configuration needed for auto-recall:
1. Log in to D2-Config.
2. Select Go to > Lifecycle.
3. Under Lifecycles, select the lifecycle that must be configured.
4. Under Lifecycle state, select the lifecycle state on which the auto-recall needs to be configured.
5. Under Action Type, select Apply method.
6. In the Method field, select ControlledPrintRecallLCMethod.
7. In the Extra arguments field, add the following code:
-objectId "$value(r_object_id)" -service_url "ControlledPrint
/rest/controlledprint/doprint?_username=<<iouser>>&_docbase
=<<docbase>>&objId=<<objid>>" -dql "SELECT document_id as
objectId, requestor, recipient, controlled_copy_num as ccNum FROM
dm_dbo.controlled_print WHERE document_id = '<<objid>>' AND event_name
IN ('print') AND print_status = 'Printed'" -reason "Recall reason"
Note: "Recall reason" can be substituted with any valid reason.
8. Click Save.

Updating the Print Variables in System Parameters

1. Log in to D2-Config.
2. Click Data > Dictionary.
3. Under Dictionaries, select System Parameters.
4. Update the following print variables:
• controlled_print_server_base_url: URL to the Print Application, for example,
<protocol>://<server>:<port> - http://server123:8080.
• controlled_print_statuses: List of comma-separated statuses that a document must be in
to create a controlled print. Defaults to "Approved,Effective,Release Pending".
• issued_print_statuses: List of comma-separated statuses that a document must be in
to create an issued print. Defaults to "Approved,Effective".


Print Number Prefix Configuration


When creating a Controlled Print, a unique controlled print number is assigned for the document.
The first time a controlled print occurs for a document version, the controlled copy number
is defaulted to 1. Each time a new Controlled Print is created for that document version, the
number is incremented. To more easily distinguish between Issued Print and Controlled Print
numbers, a prefix is included with the number, for example, IP-<#>, CP-<#>. The prefixes for
Issued Print and Controlled Print are configured in the web application in the <Print Web
Application>\WEB-INF\classes\D2ControlledPrint.properties file:
• CONTROLLED_PRINT_NUMBER_PREFIX: By default, this value is not set to a value for
backward compatibility with previous installations of Controlled Print.
• ISSUED_PRINT_NUMBER_PREFIX: By default, this is set to “IP-”.
Users can blank out the prefix setting if required but the system does not allow both settings to be
blank.
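For example, a minimal sketch of these entries in D2ControlledPrint.properties; CP- is a hypothetical
choice, while IP- matches the documented default:
CONTROLLED_PRINT_NUMBER_PREFIX=CP-
ISSUED_PRINT_NUMBER_PREFIX=IP-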

Issued Print Virtual Document Publishing Security

The Administrator configured to log in to the repository for the Print web application must have
WRITE access to virtual document root objects that may participate in Issued Print PDF publishing.
The security for any virtual documents must be configured correctly for this processing to succeed.
Typically, if this user is the repository owner, this will work without any security changes since the
owner of documents is changed to the repository owner.

Chapter 15
Virtual Document Templates

A virtual document (VDoc) is a document composed of other documents. VDocs are used to organize
component documents that are located in various folders across functional areas, such as Clinical
Study Report components, CTD Module 2 or 3 components, and so on. Documentum for Research
and Development provides virtual document (VDoc) support where users can create documents,
convert them to a virtual document, and add child documents to it to make a virtual document
structure.
In the out-of-the-box solution, a Clinical Study Report (CSR) Assembly virtual document is shipped
with a predefined structure to accommodate the most common use case in the industry. This section
provides instructions on how to create custom VDoc templates.
Currently, the Life Sciences solution only supports creating virtual documents in the Clinical
template management. In other domains, template authors can still create a template by dragging and
dropping documents into the Virtual Document widget and approve it; however, such a template
cannot be used when creating documents of that artifact, because VDoc template configurations are
currently available only for the Clinical domain.

Installing the Clinical Study Report VDoc Template

The following steps are run automatically as part of the standard Documentum for Research and
Development installation to install the sample CSR VDoc template. See the Documentum for Research
and Development Installation Guide for more information.
1. The predefined CSR document / folder structure is installed in the /Templates/D2/Study
Report Assembly folder through the LSRD DocApp.
2. The ProcessVirtualDocument command-line tool is used to assemble the folder structure into a
VDoc. Documentum Composer does not support VDoc assembly.
You can use the same technique to install your own VDoc templates. The following figure shows the
default CSR VDoc template:


Each folder represents a level in the template VDoc structure and has a corresponding no-content
document with the same name alongside it. The order in which the subordinate documents are
inserted at each level is governed by the outline heading numbers stored in the a_special_app
attribute (not necessarily by object_name). The resulting template VDoc structure can be previewed
in D2 through the Virtual Doc widget.
Note: For performance reasons, the D2 Virtual Doc widget does not refresh automatically whenever
you select a VDoc in the browser. Right-click the top-level VDoc and select Display Virtual
Document to preview it.

D2 Configuration for the VDoc Template


The default D2 configuration for the CSR VDoc template is as follows:

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Go to > VD Template.
4. Under VD templates, select Clinical Virtual Document Template.
5. The following table lists the fields in the right pane that must be filled or selected:

Field Description
Asynchronous build This option builds the VDoc asynchronously and displays a message
to the user to wait until the process is complete.
Copy/Link This behavior is governed by a D2 VDoc Template configuration. This
defines whether to copy the child documents from the template or link them. For
documents created from a predefined template, they must be copied.
New object type The VDoc components are created as objects of type cd_clinical.
Inherit content The template structure can include placeholder documents providing
initial content for each node.

Default values The default values are derived from the category attribute on the content
template. If it is set to 2, Cat 2 default values are used. The same rule is
applicable for the lifecycle as well: if the category is set to 2 on the template
component, then the Cat 2 lifecycle is used on the document. The logic for this
is defined using Qualification DQL.
Version label The initial version is 0.1.
If the components of the VDoc template are a combination of Category 2 and 3 documents,
two separate building rules must be defined for them. This causes D2 to apply the appropriate
default values template, inherited attribute rules, and lifecycle to each component. D2 also
applies the initial lifecycle actions to each component, so that they are initialized correctly. Since
the creation of the VDoc is configured to be asynchronous, users can perform other actions till
the VDoc is created.
6. Select Go to > Inheritance.
7. Under Inheritances, select Clinical Virtual Document Template Component Inheritance.
The Clinical Virtual Document Root Inheritance configuration is also applied to each node in
the VDoc copy, which specifies the attributes to be inherited from the root VDoc node to the
subordinate nodes in the copy. This enables the VDoc child documents to inherit the same
properties and default roles as the root VDoc when the VDoc structure is created.
8. In the Selection field on the right pane, add the following values from the Source pane so that the
VDoc inherits these attributes from the VDoc template node:
• artifact_name
• category
• subject ( = sub-folder path)
• title
Note: Do not include the template-inherited attributes in this list.
9. Click Matrix.
10. Under Contexts, expand Content Templates.
11. Under Configuration elements, expand VD Template.
12. Ensure that the Clinical Virtual Document Template configuration element is mapped to the
Clinical Virtual Document Templates context.
The D2 context for this is defined as:
• Object type: cd_content_template
• Qualifier: domain = ‘Clinical’ and r_is_virtual_doc = 1
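For reference, a DQL sketch (illustrative, not shipped with the product) that lists the templates
matching this context, using the object type and qualifier above:
SELECT r_object_id, object_name FROM cd_content_template
WHERE domain = 'Clinical' AND r_is_virtual_doc = 1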

VDoc Template Approval


VDoc templates must be approved before they are used. They use the same Content Template
Lifecycle Model and Template Approval Workflow (see Content Template Approval, page 165). In the
workflow lifecycle transition tasks, additional D2 actions are used to transition the VDoc child
documents as well as the root VDoc. To configure the template approval in D2-Config:

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Go to > Lifecycle.
4. Under Lifecycles, select Content Template Lifecycle Model.
5. In the Lifecycle state table, select the Effective row.
6. In the Action Type table, add the Apply Method action.
7. In the Method field, select CDFVirtualDocumentMethod.
8. In the Extra arguments field, type:
-child_objects ALL -if_child "a_status not in ('Effective', 'Withdrawn', 'Suspended', 'Superseded',

9. Click Save.
On template approval (promotion to “Effective” state), the child VDoc components are also promoted
to “Effective” in a recursive fashion. For large or complex VDoc templates, it may be more efficient to
use the CDF Virtual Document Method (server method) to accomplish this. This method is more
flexible and configurable, for example, you can transition child documents conditionally. See Review
and Approval of VDocs, page 272 for more information.

Creating a Custom VDoc Template


1. In the D2 Client, log in as a member of the template author group for a particular domain (for
example, cd_clinical_template_author).
2. Select New > Content.
3. On the Edit properties page, select or type the required information for each of the fields as
described in the following table:

Field Description
Name Provide a name for the template
Artifact Name Select the relevant artifact for the template. Although this is not a
mandatory field, it is important to specify an artifact for a virtual
document template so that all the child documents in the VDoc
structure inherit this artifact name from the template.

Classification Applicable Artifacts: Select the artifact for which the template
is applicable. For artifact-specific templates, ensure that the
relevant Artifact Name is selected in the Applicable Artifacts list.
The top-level VDoc node is defined as a cd_content_template
object associated with the relevant artifact. In the case of the CSR
VDoc, it is Study Report Assembly.

Virtual Document Parent: Select Yes if you want to create this
document as the no-content parent node that will contain child
nodes as part of the VDoc template, or as a VDoc with content.
Select No if you want to create a child document. The Virtual
Document Parent option must be enabled on each VDoc node in
the template and disabled for the leaf document templates (it
is disabled by default).

Process Info Specify the default users in the Authors and Approvers fields
for this template.
Template-Inherited Properties Category: Select the control category that will be inherited to
the documents to use the correct configurations like default
values, lifecycles, and so on, as configured in the VDoc Template
configuration. Therefore, it is very important to select the
category correctly.

Sub-folder Name: Specify the name of the folder under
the Templates/D2 folder where you want to store this
template. This option enables you to organize your VDoc
template nodes into an easily accessible structure under
the Templates/D2 folder. For example, the figure shows
the folder structure for the Study Report Assembly VDoc template:

When specifying a subfolder within a subfolder, use the syntax
<parent subfolder name>/<child subfolder name>. For example,
in the preceding Study Report Assembly VDoc template, for
a child document in the 16.1-Study Information folder,
the value in the Sub-folder Name field will be Study Report
Assembly/Study Report Appendix Assembly/16.1-Study
Information as shown in the following figure:

When the template is instantiated, the entire VDoc structure
is copied and the template-inherited properties applied to
the corresponding nodes in the copy. This ensures that each
component in the VDoc has the correct control category and
lifecycle associated with it and a suitable default title.


The value in the Title field is inherited for the component document when the VDoc is created.
4. Click Next.
5. On the Choose template page, select either the Word document template to create a document
or no content to create a no-content placeholder. Click Next.
6. Repeat steps 2 through 5 to create all the required artifacts for the VDoc template components.
7. After creating all the artifacts, navigate to the Templates/D2 folder and in the Doc List,
right-click the parent virtual document, and click Display Virtual Document. The template
appears in the Virtual Doc widget.
8. Drag and drop the artifacts from the Doc List to the Virtual Doc widget as child documents.
9. In the Virtual Doc widget, right-click the parent document and click Checkin.

Creating an Instance of a Clinical Study Report Assembly

A new D2 creation profile Clinical Virtual Document Artifacts (Compound Artifacts) is provided for
creating VDocs. This is restricted to the Clinical Doc Authors group (cd_clinical_doc_authors) as
for other clinical documents. In the default configuration, there is only one VDoc artifact defined
here – the Clinical Study Report Assembly. You can add others.

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Creation > Creation profile.
4. Under Profiles, select Clinical Virtual Document Artifacts (Compound Artifacts).
5. Under Properties, select the VD Structure Inheritance option.
This enables the user to select an existing CSR VDoc and create a copy of that VDoc. Otherwise, the
associated VD Template is used (according to the D2 configuration matrix). If you always want to
create the new VDoc from the predefined template (regardless of the current selection), you can
turn this off.
D2 creates a copy of the template VDoc structure and applies the attribute inheritance rules. The
component docs inherit the artifact_name, title, and subject from the template, and the initial lifecycle
status (Draft) from the root VDoc as shown in the following image.


The usual D2 auto-naming rules apply to the component docs (Clinical Document Auto Naming in
this case). Each component document gets its own unique document number (just as for other
Clinical documents).
D2 auto-linking also applies to the component document – Clinical Study Documents (study-Specific)
in this case as shown in the following image:

These documents are auto-filed in /Clinical/<product-code>/<clinical-trial-Id>/.

Review and Approval of VDocs


The standard RD-SSV Cat 2 Lifecycle Model and associated review and approval workflows can be
applied to VDoc artifacts as well as non-VDocs. For VDocs, additional lifecycle transition actions are
used to keep the component documents in-sync with the root as it progresses through its lifecycle.
However, the standard D2 Action on VD Children action is not used here because it applies to all
child nodes indiscriminately.
The system should apply transitions to component documents in the appropriate states only, for
example, those that are in the Draft, For Review, or For Approval states, but not those that are already
Effective (approved) or Withdrawn. This would then enable subsections of the VDoc and individual
leaf documents to be preapproved or withdrawn if necessary, without affecting other parts of the
VDoc structure.


To accomplish this in a configurable way, the CDF Virtual Document server method is invoked in the
RD-SSV Cat 2 lifecycle (which applies to the root VDoc). For example, in the “For Review” lifecycle
transition, argument for the CDFVirtualDocumentMethod field is set to:
-id <root-Vdoc-object-id> -child_objects ALL -if_child
"a_status not in ('Effective', 'Withdrawn')" -copy_attrs
"a_status,authors,reviewers,approvers,readers,auditors,wf_authors,
wf_format_reviewers,wf_reviewers,wf_approvers,wf_doc_coordinators"
-apply_d2_security true -context_user "$USER"

Based on the argument, all child documents except those with the status of Effective or Withdrawn
will have the listed attributes and D2 security applied based on the context of the user. Similarly
for VDoc templates, the CDF Virtual Document server method is invoked in the Content Template
Lifecycle Model. For example, in the “For Approval” lifecycle transition, argument for the
CDFVirtualDocumentMethod field is set to:
-child_objects ALL -if_child "a_status not in ('Effective', 'Withdrawn',
'Suspended', 'Superseded', '$value(a_status)') and r_lock_owner = ' '"
-apply_d2_security true -transition_child "For Approval" -context_user
"$USER" -run_as_server true -bind_children true
The CDFVirtualDocumentMethod has a number of significant enhancements over the standard D2
“Action on VD children” feature that can potentially be useful in your application. See the CDF
JavaDocs for details.

Versioning of Virtual Documents


Even though each document in the virtual document structure is treated as an individual document
and can be processed individually in a workflow, the virtual document itself can also be sent to
a workflow and approved by the workflow participants. In the process of sending individual
documents or the VDoc to a workflow, binding rules define what version of the child document is
bound to the approved and non-approved version of the virtual document root.
At the time the VDoc is approved, the VDoc is bound to the latest implicit version (version number)
of the children at the time of approval. When the VDoc is in progress (for example, Draft, For Review,
and other states), the VDoc is bound to the CURRENT version of the children, that is, the version
label CURRENT.


Currently, the VDoc binding rules are configured in the Documentum for Research and Development
solution. However, the CDF VirtualDocumentMethod is available in the CDF layer and is exposed for
all the solutions. The CDF VirtualDocumentMethod allows binding to be set when adding children to
the virtual document. To handle setting the binding outside of adding children, the method has been
updated to support a new command line parameter indicating that the binding of the children should
be set (-bind_children). When -bind_children is set to true, it checks the -binding_type passed to
the method. If the -binding_type is early, the system checks the -version_label. The following
table provides a description of these parameters:

Parameter Description
-binding_type Either early or late to denote the VDoc binding
mode to use for new child documents (not
case-sensitive). Early binding means that the
VDoc structure refers to predefined versions
of each version with a specific version label at
any given time. Late binding means that the
VDoc is not bound to a defined version of each
component and instead the relevant version
is selected as and when the VDoc structure is
traversed, as stipulated in the “with ...” clause
of the relevant DQL query. OPTIONAL - early
binding is assumed by default.
-version_label Specifies the child component version
label to use for early binding. For static
bindings, specify an explicit version number
such as 1.0 or use an attribute expression
such as @value(r_version_label[0]) to
fix the virtual document to the current
version of the child document in each case.
$value(r_version_label[0]) refers to the version
number of the parent virtual document node,
which may not be appropriate. For dynamic
bindings, specify a symbolic version label such
as "CURRENT". OPTIONAL - if undefined or
blank, the "CURRENT" version label is assumed
by default.
Based on the values passed for -binding_type and -version_label, the binding rules are enforced on
the VDoc.
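As an illustration only, these parameters could be combined using the argument syntax shown
earlier in this chapter; the values below are examples rather than a shipped configuration:
-child_objects ALL -bind_children true -binding_type early
-version_label "@value(r_version_label[0])"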

Chapter 16
Configuration Tasks

This section explains how to perform certain common configuration tasks for the Life Sciences solution.

Configuring Credential Enforcement


Credential enforcement is required to prevent unauthorized use of passwords or identification codes,
and to detect and report in an immediate and urgent manner any attempts at their unauthorized
use to the system security unit and the organizational management. This can be enforced at the
repository level, as the dm_docbase_config object for any Documentum repository provides the following
configuration options regarding account lockout:

Configuration Option Description


max_auth_attempt Maximum number of unsuccessful login
attempts allowed.
auth_failure_interval Length of time, in minutes, in which consecutive
failed login authorizations will cause a user
account to be deactivated. The number of failed
attempts that must occur within the interval
to trigger deactivation is determined by the
max_auth_attempt property. The default is 0,
meaning that deactivation always occurs when
the maximum number of consecutive failed
login attempts is reached, regardless of how
long that takes.
auth_deactivation_interval Length of time, in minutes, between account
deactivation and automatic reactivation. If this
is 0, the account is not automatically reactivated.
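To review the current settings before making changes, a simple DQL sketch using the attributes
listed above:
SELECT object_name, max_auth_attempt, auth_failure_interval, auth_deactivation_interval
FROM dm_docbase_config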
If a user account in the repository was created through synchronization with an LDAP directory (for
example, Active Directory), and the Documentum Server delegates authentication to this directory
(either through LDAP or Kerberos authentication), and the LDAP directory implements more
stringent policies regarding account lockout, these policies would override the dm_docbase_config
settings.


The following changes have been made in the Life Sciences solutions to address the preceding process:
• The default value for consecutive failed login attempts is set to 3. If the user fails to provide the
correct password 3 times consecutively, the system automatically makes the user inactive. This
also applies while providing the credentials for electronic signatures. The default value can be
changed by modifying the DM_DOCBASE_CONFIG object:
update dm_docbase_config objects set max_auth_attempt = <NEW_VALUE>
where object_name = <DOCBASE_NAME>
where <NEW_VALUE> is the value for the failed login attempts and <DOCBASE_NAME> is
the name of repository used
Note: Setting the value to 0 disables the locking out of the accounts.
Note: If required, you can configure the other options listed in the preceding table.
• An LSReportDeactivatedUser job is created to send a report of the deactivated users to the users in
the cd_report_locked_users role. This job is disabled by default, runs on a 24-hour cycle, and
sends the list of deactivated users in HTML format. To enable this job:
1. Log in to Documentum Administrator.
2. Navigate to Job Management > Jobs.
3. Right-click the LSReportDeactivatedUser job and select Info.
4. Change the stage of the job from Inactive to Active and then click OK.
The frequency of the job can be changed in the Schedule tab according to your requirements.
The LSReportDeactivatedUser job calls CDFCreateObjectMethod, which generates the report of
deactivated users and saves it as an object in the repository. The following parameters are used:

Parameter Name Value Description


type dm_document Type of object created by
method.
object_name Locked User Account Report Name of the created object.
acl System Admin Report ACL to be applied on the
report.
folder /Temp Location where report will be
saved.
mail_config Send locked user report Mailing configuration that is
used to send the mail.
report_query select deactivated_utc_time, user_name, user_address,
failed_auth_attempt, first_failed_auth_utc_time
from dm_user where user_state >= 1 order by 1 desc, 2
Query used to generate the report.

report_style /System/CDF/Report Stylesheets/Default Report Stylesheet
Stylesheet to be used for generating the report HTML.
report_title Deactivated user accounts Title of the report.
report_text List of Documentum user accounts that have been locked out
due to repeated failed login attempts:
Text for the report.
report_hyperlinks user_address:email

• You can use the Find all deactivated users search query to search the list of inactive users. This is
enabled for users in the Controlled Document Admins Role, that is, cd_admingroup.
• Inline passwords are not used and authentication is typically handled by the platform (NT/LDAP).
User authentication and lockout can be enforced at the platform level. The Life Sciences solution
relies on the underlying operating system to enforce lockout and notification. Refer to the
corresponding documentation for more information.

Electronic Signature Signoff


Electronic signature signoff operations in the Life Sciences solution are configured to work with
SSO (Single Sign On) authentication such as inline password or LDAP. This requires that users who
access the Documentum Server repository through an LDAP connection or as inline users have
their dm_user.user_source attribute set to "LDAP" or "inline password", respectively. By default,
the dm_user.user_source attribute is set to LDAP. During the e-sign operation, Documentum
Server invokes the ElectronicSignOff docbase API to validate the supplied login name and
clear text password, provided the corresponding user’s dm_user.user_source value is "LDAP"
or "inline password". In addition, Documentum Server performs the desired auditing. If the
dm_user.user_source attribute is set to dm_krb (Kerberos), the e-sign operation will not function
properly and will always return as invalid, regardless of the password entered. Note that the
dm_krb value is a legacy value and should be updated to LDAP. To resolve this, follow the steps
in the “Configuring LDAP synchronization for Kerberos users” section in the Documentum Server
Administration and Configuration Guide.
When the ElectronicSignOff docbase API is executed on the Documentum Server with a password
credential that does not begin with any prefix, the system interprets it as a clear text password and
validates it accordingly (against an LDAP directory for an LDAP user, or against the repository itself
if the user is an inline password user). If the password credential has no prefix but the corresponding
user’s dm_user.user_source attribute has the value "dm_krb", then the password credential is
interpreted as a Kerberos credential, which is not the desired behavior as it requires Documentum
Server to invoke the Kerberos authentication plugin to validate the credentials.
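As an illustration, the following DQL sketch lists user accounts that still carry the legacy dm_krb
value and therefore need to be updated:
SELECT user_name, user_source FROM dm_user WHERE user_source = 'dm_krb'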


D2 Mailing Configurations
Email configuration in D2 can be used for the following purposes:
• Create mailing lists that end users can use to send preconfigured batch email messages.
• Send email messages directly from the D2 Client.
• Create email distributions based on events.
• Create subscriptions-based events such as workflows, lifecycle transitions, and so on.

Types of Mailing Configurations


The following table lists the object types used in the configuration of D2 mailing:

Table 34. Object Types for Configuring D2 Mailing

Object Type: d2_mail_config
Description: You can configure the mailing server, event-based mailing, and subject and message for
triggering event mails.
Configurations: D2-Config > Tools > Email. Ensure that the email server is configured and the relevant
email events are configured based on the business requirement.
Result: End users receive notifications when a workflow is initiated for a document. You can enable
hyperlinks for enabling the user to directly navigate to the task and approve or reject it based on the
business use case.

Object Type: d2_sendmail_config
Description: You can configure the email subject, message, and attachment behavior to enable end users
to send email messages concerning a selected object.
Configurations: D2-Config > Go to > Send Email. Email templates can be configured and mapped with
different role-based contexts, enabling those users to use the template when they use the Send Email option.
Result: End users want the ability to send messages to both internal and external users of the D2 system
for a selected object. Using the Send Mail configuration, you can predefine the message and subject of the
email to streamline this task.

Object Type: d2_mail_attachment_config
Description: You can configure content import such that when an end user imports an email message with
an attachment, the attachment is processed and saved in the repository as a rendition. The supported
content types are .eml or Outlook formats.
Configurations: D2-Config > Creation > Mail Attachments. This is a global setting that allows users to
import attachments in email messages and allows the system to create a rendition of the imported email
message and its attachments.
Result: End users need to import email messages with attachments, retain the same format of the email,
and create renditions of the email messages and their attachments.

Object Type: d2_mailing_config
Description: You can configure the recipients, subject, message, and attachments that should be included
in an email message.
Configurations: D2-Config > Go to > Mailing list. Ensure mailing lists are configured for the specific set
of users in the system; they can be used when configuring lifecycle transitions and context mapping based
on conditions, and enable email triggers to the users based on the conditions.
Result: End users need to send messages to users notifying or reminding them of a document that needs
to be reviewed. The end user selects the document and chooses the mailing list that has been configured,
which is then automatically sent to the predefined recipients with the applicable message and attachment.

Object Type: d2_subscription_config
Description: You can define specific events and associated email messages that end users can subscribe to,
so that they can be alerted when the event occurs.
Configurations: D2-Config > Go to > Subscription. Subscription templates can be created for events and
mapped with different role-based contexts. This enables users to subscribe to events to get email
notifications using the Subscribe option and select templates based on the business requirement.
Note: You cannot preset subscriptions for users. It is based on user preference.
Result: End users want to be notified any time someone checks in a new version of a particular document.
By creating a subscription, they can subscribe to this event for the document and be notified by email
when a new version has been checked in.

Object Type: d2_distribution_config
Description: You can configure a simple workflow-like process where you can define attachments, property
pages, recipients, email messages, and any electronic signature requirements when accepting or rejecting
the requested distribution actions.
Configurations: D2-Config > Go to > Distribution. Enables users to send bulk email. The distribution
templates or user email messages can be preconfigured and mapped to a context, based on which users
can use them in the system through the Distribution option.
Result: End users want a quick way of sending a document for review to reviewers but want to make sure
that the reviewers accept or reject the new changes in the document. Further, any approval must be
electronically signed and the approver must indicate the reason for approval.

Configuring Mailing Configurations


The Documentum D2 Administration Guide provides the information for configuring the Mail
Server, Mailing list configurations, Send mail configurations, subscriptions, and distributions
configurations in D2.

List of Task Variables


The following table lists the task variables that you can use in the D2 email configuration, for example, for the dm_startedworkitem event.


Table 35. List of Task Variables

Variable Syntax
docbase_name: $value(docbase_name)
due_date: $value(due_date)
event_name: $value(event_name)
message_text: $value(message_text)
object_name: $value(object_name)
package_id: $value(package_id)
planned_start_date: $value(planned_start_date)
task_priority: $value(task_priority)
router_id: $value(router_id)
router_name: $value(router_name)
sender_name: $value(sender_name)
supervisor_name: $value(supervisor_name)
task_name: $value(task_name)
task_number: $value(task_number)
recipient_login_name: $value(recipient_login_name)
recipient_os_name: $value(recipient_os_name)
recipient_name: $value(recipient_name)
platform: $value(platform)
mail_user_name: $value(mail_user_name)
stamp: $value(stamp)
date_sent: $value(date_sent)
link_cnt: $value(link_cnt)
package_type: $value(package_type)
content_type: $value(content_type)
content_size: $value(content_size)
dos_extension: $value(dos_extension)
web_server: $value(web_server)
web_server_type: $value(web_server_type)
temp_file_name: $value(temp_file_name)
mailScript: $value(mailScript)
bulk_mail_file: $value(bulk_mail_file)
smtp_server: $value(smtp_server)
Subject $value(Subject)
Message $value(Message)

task.subject $value(task.subject)
task.message $value(task.message)
user_name $value(user_name)
password $value(password)
domain_name $value(domain_name)
launch_async $value(launch_async)
document.object_name $value(document.object_name)
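For illustration only, a minimal subject and message built from some of these variables might look like the following; which variables actually resolve depends on the event being configured:

Subject: New task: $value(task_name) for $value(object_name)
Message: You have received the task "$value(task_name)" from $value(sender_name) in repository $value(docbase_name). The task is due on $value(due_date).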

Message Configuration
When mapping the appropriate events in the Email Configuration, you can specify the subject and
message for events. You can embed HTML messages with hyperlinks and images in the email
messages. The following sample code shows HTML messages with hyperlinks and images in the
email messages:
<html>
<head>
<meta http-equiv=Content-Type content="text/html; charset=windows-1252">
<meta name=Generator content="Microsoft Word 14 (filtered)">
<style></style>
</head>
<body lang=EN-US link=blue vlink=purple>
<div class=WordSection1>
<p class=MsoNormal>
<span style='font-family: "Arial", "sans-serif"'>The following workflow task has been assigned
to you ($value(recipient_os_name)) by $value(sender_name):</span></p>
<p class=MsoNormal>
<span style='position: absolute; z-index: 251659264; margin-left: -7px; margin-top: 18px; width:
778px; height: 3px'><img width=778 height=3 src="notification%20message%204_files/image001.png">
</span>
<span style='font-family: "Arial", "sans-serif"'><br> <br></span></p>
<table class=MsoTableGrid border=0 cellspacing=0 cellpadding=0 style='border-collapse: collapse;
border: none'>
<tr style='height: 31.9pt'>
<td width=101 valign=top style='width: 76.1pt; padding: .05in 5.75pt .05in 5.75pt; height:
31.9pt'>
<p class=MsoNormal align=right style='text-align: right'>
<b><span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>Task:</span></b></p></td>
<td width=537 valign=top style='width: 402.7pt; padding: .05in 5.75pt .05in 5.75pt; height:
31.9pt'>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>$value(task_name) for
$value(document.object_name)</span></p></td></tr>
<tr style='height: 30.1pt'>
<td width=101 valign=top style='width: 76.1pt; padding: .05in 5.75pt .05in 5.75pt;
height: 30.1pt'>
<p class=MsoNormal align=right style='text-align: right'><b>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>Subject:</span></b></p></td>
<td width=537 valign=top style='width: 402.7pt; padding: .05in 5.75pt .05in 5.75pt; height:
30.1pt'>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>$value(task.subject)
</span></p></td></tr>


<tr style='height: 31.0pt'>


<td width=101 valign=top style='width: 76.1pt; padding: .05in 5.75pt .05in 5.75pt;
height: 31.0pt'>
<p class=MsoNormal align=right style='text-align: right'><b>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>Description:
</span></b></p></td>
<td width=537 valign=top style='width: 402.7pt; padding: .05in 5.75pt .05in 5.75pt;
height: 31.0pt'>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>$value(task.message)
</span></p></td></tr>
</table>
<p class=MsoNormal>
<span style='font-size: 10.0pt; font-family: "Arial", "sans-serif"'>&nbsp;</span>
</p>
<p class=MsoNormal>
<span style='font-family: "Arial", "sans-serif"'>&nbsp;</span>
</p>
<p class=MsoNormal>
<span style='font-family: "Arial", "sans-serif"'>Please click
</span><a href="http://rbauv148.bas.roche.com:8180/D2/?docbase=$value(docbase_name)
&amp;amp;locateId=$value(stamp)
&amp;amp;locateTarget=TaskFoldersWidget">
<span style='font-family: "Arial", "sans-serif"'>here</span></a>
<span style='font-family: "Arial", "sans-serif"'> to open task. </span>
</p>
<p class=MsoNormal>&nbsp;</p>
</div>
</body>
</html>

Font definitions for the preceding sample:


@font-face {
font-family: Calibri;
panose-1: 2 15 5 2 2 2 4 3 2 4;
} /* Style Definitions */
p.MsoNormal,li.MsoNormal,div.MsoNormal {
margin: 0in;
margin-bottom: .0001pt;
font-size: 12.0pt;
font-family: "Times New Roman", "serif";
}
a:link,span.MsoHyperlink {
color: blue;
text-decoration: underline;
}
a:visited,span.MsoHyperlinkFollowed {
color: purple;
text-decoration: underline;
}
.MsoChpDefault {
font-family: "Calibri", "sans-serif";
}
.MsoPapDefault {
margin-bottom: 10.0pt;
line-height: 115%;
}
@page WordSection1 {
size: 8.5in 11.0in;
margin: 1.0in 1.0in 1.0in 1.0in;
}
div.WordSection1 {
page: WordSection1;
}


All aliases are resolved at run time in the actual D2 environment. Images can be included in the HTML. This can be done by deploying them with the D2 WAR file or any custom WAR file and then using an alias set to point to the correct URL and file in each environment.

Disabling Email Notifications


1. To turn off notifications globally for all events, update the Documentum Server $DOCUMENTUM\dba\config\documentum\server.ini file with the value mail_notification = F.
2. To turn off notifications for certain users, update the dm_user object and clear the user_address attribute value for each user to whom you do not want to send email notifications, as follows:
UPDATE dm_user OBJECT set user_address = '' where user_name = '$user_name$';
3. To turn off notifications for certain events, update the dm_event_sender.ebs script and add exit sub for the events for which you do not want to send email notifications. For example, to stop email alerts for the dm_startedworkitem event:
Case "dm_startedworkitem"
exit sub
object_info_flag = "false"
task_event_flag = "false"
router_event_flag = "false"
subject_line = "Started workitem: " & CleanQuote(package_id) & " in docbase " &
CleanQuote(docbase_name)

Auditing Events
D2 auditing is used to capture and record key events as documents progress through their lifecycle.
It is possible to reconfigure the audited events for each category of documents independently, if
necessary. In the default configuration, the following events are audited.
For Cat 1-3 controlled documents and Change Requests, the following events are audited:
• Document creation
• Document deletion (including D2 recycle bin delete/restore events)
• Document versioning (check-in events)
• Document property updates, including changes to role members and/or Effective, Review, and
Expiry dates, as and where applicable (audited explicitly)
• Document lifecycle state changes
• Creation of relations
• Removal of relations
• Workflow initiation
• Workflow termination (abort events)


• Workflow task acquisition


• Workflow task forwarding (completion)
• Workflow task rejection
• Workflow task delegation
For Cat 4 documents, the same events as Cat 3 are audited except for workflow events, since there
are no defined workflows for these documents in the default configuration. (Auditing of Cat 4
events can be disabled if preferred.)
In the case of Registration Forms, the following events are audited:
• Form creation
• Form deletion
• Form property updates, including changes to the primary key values such as product codes
(audited explicitly)
• Form lifecycle state changes
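To review what the system has captured for a particular document, the audit trail can also be queried directly. The following DQL statement is a minimal sketch; the object ID is a placeholder for the r_object_id of the document being inspected:

SELECT event_name, user_name, time_stamp
FROM dm_audit_trail
WHERE audited_obj_id = '09xxxxxxxxxxxxxx'
ORDER BY time_stamp DESC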

Configuring Audit Events


Administrators can specify the events that the system audits in D2-Config. Administrators can
also restrict the audited events shown to users.
Auditing an event creates an audit trail entry, which captures the process of creating, reviewing,
approving, and releasing information in an electronic record. Audit trails facilitate the reconstruction
of events relating to an electronic record. The Life Sciences solutions are preconfigured to audit
events according to best practices, but you can modify the events based on your own requirements
or preferences.
1. Log in to D2-Config as an administrator.
2. Select a Life Science solution from the configuration filter.
3. Select Audit from the Configuration elements list.
4. Select the document group to audit.
5. Specify the events to audit:
• Select events from the Auditable events list and add them to the Audited events list. The
system audits the events in the Audited events list.
• To no longer audit events, select the events from the Audited events list and move them to
the Auditable events list.
6. Specify the audited events shown to users:
• Select events from the Audited events list and add them to the Displayed events list. Users
can only see events in the Displayed events list.
• To hide audited events from users, select events from the Displayed events list and move
them to the Audited events list.
7. Save the configuration.


8. To view the audit events in D2 Client, administrators grant users the View Audit extended
privilege in Documentum Administrator.

Configuring Automated Delegation


Use the automated delegation tool to create automated delegation on behalf of other users. For example, if a user goes on vacation for two weeks and forgets to set up self-delegation using the Delegations widget, another user from the team can set up the delegation on their behalf to automatically delegate all workflow tasks from that user to the specified users and groups.

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Tools > Manage Delegations.
4. Click New to create a delegation control.
If you want to create a child delegation control that inherits the properties of an existing
delegation control, select a delegation control and click Create from. See the Documentum D2
Administration Guide for more information about child configurations.
5. Under Properties, in the User/Group name field, select a user or group who can set up a delegation for other D2 Client users.
6. In the Manage user/group delegations field, select the list of users or groups for whom the
automated delegations can be targeted. Click Browse and then use the list controls to add or
remove automated delegation targets.
7. Click Save.

Date-Time UI Implementation
The date-time UI implementation in the Life Sciences solutions is based on the D2 time zone awareness feature. The timestamp displayed in the solutions is based on the time zone set on the client machine.
By default, the timestamps displayed in widgets such as Doc List, Versions, Renditions, and so on have their values converted to the client machine's time zone settings. However, the timestamp values stored in the database are stored in UTC format.
When applying dates to PDF documents during C2 Rendition processing, that is, fusing a PDF page
to a PDF rendition, the time zone of the server where the C2 processing occurs is used. For example,
if the PDF rendition processing is running on the Web Server which is in the Pacific time zone, the
date will be converted to the Pacific time zone when reading from the repository and applying
to the PDF. This also applies to C2 View, Export, and Print processing. Depending on where the
processing takes place, that server’s time zone is used. For example, for C2 View, Print, and Export
from the D2 Client, the web server’s time zone is used. For C2 Rendition processing which could
take place on the Documentum Server, the Documentum Server’s time zone is used. For the date
used in the autolinking configuration, the Documentum Server’s time zone is considered when
processing autolinking.


It is recommended that all servers be set to the same time zone and if possible set to UTC. This
removes any confusion in that all dates within PDF renditions are UTC and any dates on the UI are in
the client’s local time zone.
Note: In the D2 Client, if you select Today in the Calendar for the date input field, the system takes
the same date at 12:00 AM irrespective of the time you performed the operation on that day. For
example, if you selected Today in the Calendar at 2:00 PM IST on 24-MAY-2018, the system displays
the timestamp as 24-MAY-2018 12:00 AM.
For more information about the time zone feature, see the following whitepapers:
• The Timezone Feature in Documentum Server
• Content Transfer Timeout Configuration for WDK-based Applications

Media Files
Media files, such as video, can be previewed in Documentum through the Life Sciences Document
Preview widget. Any media file format supported by Windows Media Player can be previewed: the
formats that are supported can be specified in the optional "mediaFormats" URL parameter to the
Life Sciences Document Preview widget as a comma-delimited list of dm_format names. Where
unspecified, the default formats are: wmv, avi, mpg/mpeg/mpg-4v, quicktime, and wav files.
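For example, to restrict previews to WMV and WAV content only, the widget URL could carry a parameter of the following form (the rest of the URL is omitted and the placement of the parameter is illustrative):

&mediaFormats=wmv,wav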

Search Configuration Changes


The legacy search configurations, that is, Basic Search, Advanced Search, and Facets, have been changed so that each Life Sciences solution now has one configuration that is mapped to the correct context. Previously, each user-based context was configured with one consolidated search configuration. D2 would run multiple queries for each configuration mapped for the user and for the facets, which had a significant performance impact on the system. For example, if there are two search configurations mapped for a single context, the search runs the queries twice. Now, each role has one search configuration mapped to it, which includes the types used in the solution along with the relevant attributes.
The number of predefined (default) facets that apply to Basic Search has been limited, resulting in improved search performance. In addition, each solution is configured with additional facets that can be selected in the Advanced search dialog box.

Configuring Cross-Domain Document Sharing


The Life Sciences solutions support the sharing of documents across domains. Any document artifact
can be shared across multiple domains, using different object types in each domain but sharing the
same underlying content. Artifacts to be shared across domains are defined in the artifact reference
model dictionary. For more information, see the Documentum for eTMF User Guide or the Documentum
for Research and Development User Guide.


Note: For this release, only cross-domain sharing of documents from TMF to LSRD is supported.
Ensure that the browser is Java or CTF-enabled so that the Reference Copy does not appear
contentless.
Specific individual document artifacts in any domain in any Life Sciences solution can be shared
automatically with another domain when they are created, according to predefined rules, without
any user intervention. The same document can potentially be shared across multiple domains. The
shared document in each domain may have its own domain-specific object type, properties, lifecycle
model, role-based security, and workflows. To share an artifact across domains, the target Life Sciences solution must contain the same domain and the same artifact within that domain. For example, if you want to share a Clinical document artifact in the LSTMF Clinical domain with LSRD, the same domain (Clinical) and the same artifact must exist in LSRD.
Limitation:
During the creation of the reference document, if the document is not created because of invalid
inputs, then the parent document is retained without a reference document. Post creation, it is not
possible to generate a reference document for that parent document.

Configuring Cross-Domain Document Sharing in D2-Config
You can configure the cross-domain sharing of documents within the Life Sciences solution through
D2-Config using the following steps:

1. Log in to D2-Config as Administrator.


2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select Object Type Mapping.
5. Under Properties, on the Alias tab, identify the domain for which you want to share documents and note the dictionary name in the Artifact Dictionary column. For example, if you want to share documents from the Clinical domain, in the Key column, go to cd_clinical and note the Clinical Artifacts Name Inventory dictionary. For the Clinical domain in LSTMF, you must check the TMF Models dictionary, identify which model is used (that is, TMF Unique Artifact Names 2.0 or 3.0), and then navigate to that dictionary.
6. In the Key column, go to cd_clinical_tmf_doc and ensure that the Auto Create Related artifacts
column is updated with the value, Y.
7. Under Dictionaries, select the dictionary for the domain from which you want to share documents, that is, TMF Unique Artifact Names 2.0 or 3.0.
8. In the Auto Create Reference Copies column, add the query using the following syntax for each
artifact that you want to share:
relationtype=Reference Copy;objecttype=<Reference Object type>;artifactname=<Reference Artifact Name>;
setattributeslist=[key1|value1:key2|value2:key3|value3:key4|value4…];
copyattrs=attr1:attr2:attr3;defaultvalues=<default value template>


where,
• relationtype specifies the Documentum relation type to be established, which in this case
is Reference Copy.
• objecttype is the Documentum object type of the artifact in the target domain.
• artifactname specifies the related artifact name, which must be a valid entry (dictionary key
value) in the relevant artifact dictionary (reference model). If omitted, it is assumed to be the
same artifact name as that of the master document.
• setattributeslist is an optional series of <attribute-name>|<attribute-value> pairs, where the pairs are separated by the colon operator and the attribute name and value within each pair are separated by the pipe operator. It enables specific attributes of the related artifact to be set explicitly or indirectly through the D2 default values template configurations. These can use $-attribute expressions referring to attributes of the original document.
• copyattrs specifies the attributes of the master document that need to be copied to the reference document. If an attribute is not present in the reference document, the user has to set it manually.
• defaultvalues specifies the default value template to be set on the reference document.
For example, for the cd_clinical domain, the query will be:
relationtype=Reference Copy;objecttype=cd_clinical;artifactname=Informed Consent Form;
setattributeslist=[doc_coordinators|cd_clinical_doc_coordinators:notification_list|cd_clinical_doc_coordinators:authors|cd_clinical_ref_copy_mgrs:domain|Clinical];
copyattrs=product_code,primary_group,subgroup,indication;
defaultvalues=Clinical Cat 2 Default Values
9. Click Save.

Configuring Search Criteria


Administrators can change the Advanced Search Configuration in D2-Config. This feature allows
administrators to define types and properties from the repository that appear in the D2 Client
Advanced Search interface.
1. Log in to D2-Config as an administrator.
2. Select a Life Science solution from the configuration filter.
3. Select the Search element from the Configuration elements list and then select the Advanced
Search Configuration element.
4. In the Search configuration & properties mapping area, select a type from the Type list. Use the
arrows to add or remove items from the selection list.
5. In the Properties of selected type area, add or remove properties to refine the search.
6. Save the configuration.
If additional attributes are added as facets, configure the attributes for indexing by Documentum
xPlore as described in the Documentum xPlore Administration Guide.


Configuring XML DocViewer to Display PDFs in Excel Format
By default, the XML DocViewer widget does not display PDFs in Excel format. To enable this, you
must configure the XML DocViewer widget using the following procedure:
1. In D2-Config, select All elements from the configuration filter.
2. Select Widget view > Widget from the menu bar.
3. Under Widgets, select WG EXT LSCI LSS Doc Viewer (Browse).
4. In the Widget url field, remove the following value from the query:
&excludeFormats=excel8book,excel12book
5. Perform step 4 for the following widgets:
• WG EXT LSCI LSS Doc Viewer (Initial Browse)
• WG EXT LSCI LSS Doc Viewer (QC Index)
• WG EXT LSCI LSS Doc Viewer (Tasks)
6. Save the configuration.
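As an illustration (the URL fragment is hypothetical apart from the parameter itself), a Widget url value containing ...&excludeFormats=excel8book,excel12book... would, after step 4, simply omit that parameter so that PDFs of Excel content are displayed in the widget.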

Hard Delete (LSTMF)


Documentum for eTMF enables users to permanently remove documents using the Hard Delete
functionality. When users perform a Hard Delete of a document, Documentum for eTMF audits
the event. You can search for these events in the audit trail. For example, if a user improperly
scans or imports a document into the TMF, users with permissions can perform a Hard Delete on
the document to remove it. By default, the system is configured to only allow clinical document
coordinators (members of the cd_clinical_doc_coordinators group) to perform a Hard Delete of
documents in the Withdrawn state.
Prerequisites for performing a Hard Delete of document(s):
• The selected document(s) should not be in the checked-out state.
• The document should match the criteria specified through the cd_delete_config object.
Hard Delete of placeholders is not supported. The Delete menu item can be disabled/hidden.
The Hard Delete feature can be configured through the docbase object cd_delete_config. A
default instance of this object is provided with the Documentum for eTMF DAR file that is configured
as shown in the following table.


Table 36. Configuring the Hard Delete feature

Attribute Name: doc_type
Single/Repeating: Single
Purpose: Document type on which users can perform a Hard Delete.
Default Value: cd_clinical_tmf_doc

Attribute Name: doc_statuses
Single/Repeating: Repeating
Purpose: Document statuses (refers to valid values for the a_status attribute, such as Draft, Withdrawn, Expired, and so on).
Default Value: Withdrawn

Attribute Name: roles
Single/Repeating: Repeating
Purpose: User roles who can perform a Hard Delete operation.
Default Value: cd_clinical_doc_coordinators

Attribute Name: auditted_attributes
Single/Repeating: Repeating
Purpose: Attributes of documents selected for Hard Delete that must be added in the dm_audit_trail table for the cdf_hard_delete event (maximum limit 6). Note: This attribute is not used to validate whether the document qualifies for hard delete.
Default Value: NIL
You can adjust the default configuration of the Hard Delete feature to meet your business
requirements by following these steps.
1. Log in to D2 Client as a member of cd_admingroup.
2. From the Repository browser, navigate to System/CDF/Delete Config.
3. Right-click the TMF Doc Delete Config object and click Properties.

Caution: Only one instance of this object can exist at any point in time. If you create
multiple instances of this object, Hard Delete may not work correctly.

4. Configure the TMF Doc Delete Config properties as defined in the following table:

Field Description
Name Shows TMF Doc Delete Config.
Title Shows TMF Doc Delete Config.
Document Type Select the document type that selected
user roles can hard delete. For example,
cd_clinical_tmf_doc.

Document Statuses Select the statuses that the document must be
in before users can hard delete it.
User Roles Select the user roles that can perform the hard
delete.
Attributes to Audit Select the document attributes to capture in
the audit trail for the cdf_hard_delete
event. The system audits a maximum of six
attributes.

5. Click OK.
You can view hard deleted documents in D2 Client using the Find Delete Audit Events search query
form.
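If you prefer DQL over the search form, a query along the following lines returns the hard delete events; the selected attributes are illustrative:

SELECT object_name, user_name, time_stamp
FROM dm_audit_trail
WHERE event_name = 'cdf_hard_delete'
ORDER BY time_stamp DESC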
Note: In the repository, you can delete related documents by setting the integrity_kind attribute
in the dm_relation_type table for that relationship type to 2. The system deletes both the original
document (referred to by parent_id in the dm_relation table) and the related document (child).
However, deleting the child does not delete the parent in the repository. The system records only the
dm_destroy event (not the cdf_hard_delete event) for the related document. If the integrity
attribute is set to 0 or 1, the hard delete fails with an error message.
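For example, assuming a relation type named my_relation_type (a placeholder for the relation type actually used in your repository), the integrity setting could be changed with a DQL statement similar to the following:

UPDATE dm_relation_type OBJECT set integrity_kind = 2 where relation_name = 'my_relation_type'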
The standard Documentum functionality is that after an object is deleted, the content file relating
to it is not deleted until the dm_DMClean job is run. Therefore, prior to execution of dm_DMClean,
it is possible to recover the content. Configuring the dm_DMClean job as part of Hard Delete is
not supported.

Bulk Import-Export (LSTMF)


A document package is an arbitrary collection of documents, folders and/or virtual documents in
Documentum that can be packed into a ZIP file (including content files, renditions and metadata)
and unpacked from a ZIP file back into the repository. Using document packages, users can easily
package up and export a set of documents and send these to an external agency for offline editing and
completion. Any changes can then be imported back into the repository as a ZIP file and unpacked.
The Documentum for eTMF solution uses document packages to import and export TMF
placeholders, documents, and associated document information (metadata). A document
pack is represented as an object of type cd_document_pack in the repository (a sub-type of
cd_controlled_doc) with a series of dm_relations linking it to the various objects included in the
package. The following figure illustrates this representation.


The document package is the parent object in a Package Item relation, and the various items are child
objects. The child objects can be individual documents, folders, or virtual document nodes. In the
case of folders and virtual documents, the subordinate objects are included automatically in the
document pack, in recursive fashion.
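As a sketch of how this structure can be inspected, the following DQL lists the relation records for one document pack. The parent ID is a placeholder, and the relation name should be checked against the dm_relation_type actually used for Package Item relations in your repository:

SELECT child_id, relation_name
FROM dm_relation
WHERE relation_name = 'Package Item' AND parent_id = '09xxxxxxxxxxxxxx'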
The document pack has a main content file that is either a Microsoft Excel spreadsheet (when new
or unpacked) or a ZIP file (when packed). The Excel spreadsheet is derived from a template and is
used to record the Documentum metadata for each item when the document pack is packed. The
Excel spreadsheet template is part of the default installation and is defined as a cd_content_template
object in the repository, with the domain set to Document Pack.

Lifecycle Model for Document Packages


A document pack has a D2 lifecycle configuration associated with it, comprising the following states.

Table 37. Lifecycle model for document packages

State Description
New Newly-created document pack.
Refreshing A transient state representing a document pack that is being refreshed (that
is, the spreadsheet is being regenerated with the appropriate metadata).
When the refresh process is complete, it reverts to New.
Packing A transient state representing a document pack that is in the process of being
packed (through an asynchronous server method).

Packed A document pack that has been packaged up into a ZIP file and is ready for
export or unpacking, as appropriate.
Unpacking A transient state representing a previously packed document pack (or one
that has been imported as a ZIP file) that is in the process of being unpacked
(through an asynchronous server method).
Unpacked A document pack that has been successfully unpacked into the repository,
that is, any changes such as new or modified documents have been applied.
Invalid A document pack that could not be processed for some reason; for example,
it refers to document files that do not exist.
The “Importing and Exporting Multiple Documents” section in the Documentum for eTMF User Guide
provides more information about the bulk import-export process.
Note: In the Bulk Import-Export Excel spreadsheet, the LIFECYCLE_STATE is set to Read-only
(‘Y’) by default. This setting does not allow lifecycle transition to be considered when importing
documents. All documents will be imported in the “Draft” state even if the user specifies a different
state in the Status column, such as Final, To be Reviewed, and so on. To import documents to the
repository in the specified document status and perform lifecycle transitions, you must set the
Read-only column to ‘N’.

Configuring the Bulk Import-Export Spreadsheet


Documentum for eTMF provides the ability to import and export Trial Master File (TMF)
placeholders, documents, and associated document information (metadata). The package contains a
Microsoft Excel spreadsheet that contains metadata, known as the bulk import-export spreadsheet.
You can configure the spreadsheet to use any properties that exist on the cd_clinical_tmf_doc type
or your own custom type. The system synchronizes any dictionaries associated to properties on
the Schema worksheet of the spreadsheet. This procedure shows you how to add a Documentum
attribute to the bulk import-export spreadsheet. You can find detailed information within the
spreadsheet template to make additional configurations.

1. Log in to D2 Client as a member of cd_admingroup.


2. From the Repository browser, navigate to Templates/D2.
3. Right-click TMF BulkUpload.xls and select Edit.
4. Add a column to the File List worksheet. For example, add a Subject column.


5. On the Schema worksheet, add a row for the data field definition as described in the following
table:

Column Description
Worksheet Type File List to indicate that the data
resides in the File List worksheet.
Column Heading Type the column name for your attribute. For
example, Subject.
Data Field Type the Documentum attribute name or your
custom attribute name. For example, subject.
Default Value (Optional) Type a default value for blank cells.
Read-only Type Y if it is a read-only attribute. During an export, the system reads the attributes from the repository and writes them to the spreadsheet. The system ignores this column on import. Consider highlighting the column in light blue so that the end users know that it is a read-only column.
Type N to enable the system to read the metadata in this column and write it to the repository on import.
Do not change the information for the pre-populated system data fields.
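For example, a new row on the Schema worksheet for a hypothetical Subject column might contain the following values (shown here as label/value pairs rather than spreadsheet cells):

Worksheet: File List
Column Heading: Subject
Data Field: subject
Default Value: (blank)
Read-only: N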

The following example shows information added to the Schema worksheet for the Documentum
attribute column:

The following example shows the Documentum attribute column highlighted in light blue to
indicate that it is a read-only column:


6. Save the Microsoft Excel Spreadsheet in Microsoft Excel 97-2003 format (.xls format) or Excel
2007-2010 (.xlsx format).
An Alias column has been added to the Schema worksheet to enable dictionary aliases to be specified.
For example, you can add a "Synonym" alias to the "TMF Unique Artifact Names" dictionary and
override the default artifact display names in this column. If a dictionary alias is not specified or
blank, the locale setting is used for dictionary lookups. The default locale is "en" (English) although
this can also be specified in the schema settings.

Configuring Roles and Permissions for External Participants (LSTMF)
Documentum for eTMF allows you to provide direct, tailored access to TMF documents and
placeholders for external trial participants such as Inspectors or Contract Research Organizations
(CROs). Documentum for eTMF contains four preconfigured roles, but you can add additional roles
or disable the default roles as needed. You can configure each role defined in the system to have
different permissions at the artifact level. For example, an External Contributor can have Reader
access to some artifacts, Author access to other artifacts, and no access to others.
Configuring the roles and their permissions in the Documentum for eTMF system is a two-step
process. First, you define the role and then you assign the role permissions at the artifact level. An
IT administrator performs this configuration based on business requirements. Changes to the role
definitions do not affect existing trials and related registration objects unless the security is refreshed
explicitly, so it is important to define your business rules as part of your implementation project. In
this way, the roles and their permissions are consistent across your system, regardless of the country
or site assigned to an external participant.
This section describes how to create the external roles in your system and assign the roles security at
the artifact level. Once defined, the roles are available to Trial Managers so they can assign a user as
an external role to a country or site. For example, if a regulatory authority is inspecting two sites, the
Trial Manager can add the user as an External Inspector to those sites. By doing so, the user receives
the configured access to only the documents in those sites (including all country, trial, and product
documents related to those sites), and nothing more in the system.


The preconfigured roles are:


• External Contributor
• External Reviewer
• Inspector
• Investigator
To define the roles and permissions, complete the following procedures:
• Defining External Trial Participant Roles, page 297
• Defining Permissions for External Participant Roles, page 300
The following procedure provides an example of how to add a new role in your system:
• Adding an External Participant Role Example, page 301

Defining External Trial Participant Roles


Documentum for eTMF includes predefined roles that enable you to provide direct access to your
Documentum for eTMF system for external trial participants. You can configure the roles based on
your business requirements.

1. Log in to D2-Config as an administrator.


2. Select OpenText Life Sciences eTMF Solution from the configuration filter.
3. Select Data > Dictionary from the menu bar.
4. In the Dictionaries list, select TMF External Contributor Roles.
5. On the Alias tab, configure the external participant roles as defined in the following table:

(Checkbox): Select the checkbox for each role to include as an external participant. To add a role, type information in a blank row with the checkbox selected. To remove a role, clear the checkbox for that role.

Key: Type a user-friendly role name. These participant role names appear as selections for users in drop-down lists. For example, you can type Contractor in the Key column for a new role.

Group Name Suffix: Type a unique suffix for your participant roles. For example, you can type _contract for a new document Contractor role. The system uses this suffix to configure the dynamic role group naming conventions. The Documentum for eTMF User Guide provides additional information on external trial participant groups. It is not necessary to change the Group Name Suffix for the default roles. Note: By convention, Documentum group names use lower-case letters and underscore symbols. Do not specify excessively long group name suffixes, as the group names are limited to 48 characters in total.

Taxonomy Dictionary Level: Type the participant names to use for the dictionary referenced by the taxonomy. This name must be identical to the name of the dictionary created for a particular participant role. For example, if you type TMF Contractor Access for a new role in this column, the system requires a dictionary for this role with the same name. It is not necessary to change the Taxonomy Dictionary Level name for the default roles. If you decide to change it, the system requires a dictionary for the role with the same name.

File Plan Column Alias: Type a column alias in upper case letters with no spaces. This alias is for system use. For example, you can type CONTRACTOR for a new role.

Context Group: Type the name of the top-level group associated with this role. For example, for the preconfigured External Contributor role, the top-level group is tmf_external_contributors; for a new Contractor role, you can type tmf_contractor. This group is used to associate a Documentum D2 workspace with the members of this group so that users working in a particular role can access the appropriate workspace. You can modify the predefined workspaces and configure additional workspaces if necessary, and associate them with the relevant groups. If a user has only one workspace available, it is preselected automatically when they log in to D2 Client. Otherwise, they can choose the appropriate workspace when they log in. The Documentum for eTMF User Guide provides additional information on external trial participant groups. Note: This setting is not currently used. It is reserved for future use.

6. Save the configuration.


7. For each context group that you add:
a. In Documentum Administrator (DA), create a role with the same name as the context group.
For example, if you create the context group tmf_contractor, create a role in DA named
tmf_contractor.
b. In D2-Config, select Go to > Context from the menu bar and click New to create a context with a similar name.
c. In the Group area, move the context group to the Selection area of the context and save the
configuration.
d. Click Matrix and map the context to a workspace. Click to place a checkmark at the
intersection of the context and the workspace.
8. For each participant role that you add, copy the TMF Document Roles dictionary to create a
new dictionary:
a. Select Data > Dictionary from the menu bar.
b. In the Dictionaries list, select TMF Document Roles and click Create from on the toolbar.


c. Type a name for this copy of the dictionary. This name should be the same as the name
defined for the participant role in the Taxonomy Dictionary Level column of the TMF
External Contributor Roles dictionary. For example, TMF Contractor Access.
d. Do not make any other changes and save the configuration.
9. Update the TMF Classification by Artifact taxonomy with your participant roles:
a. Select Data > Taxonomy from the menu bar.
b. In the Taxonomies list, select TMF Classification by Artifact.
In the dialog box, select Excel as the file format and click Modify. Save the file to your local
machine.
c. Adjust the default TMF External Contributors, TMF External Reviewers, TMF Inspector
Access, and TMF Investigator Access columns of the spreadsheet based on your role
definitions. These columns must match the name of the dictionaries created for each role. For
example, TMF Contractor Access. Add a column for each additional role. If you remove a
role, remove the column for that role.
d. Save the file, import it, and save the configuration.
Related topic:
• Adding an External Participant Role Example, page 301

Defining Permissions for External Participant Roles


Documentum for eTMF provides predefined document permissions for each external trial participant
role. If necessary, you can define individual document access permissions (for example: None,
Reader, Author, Reviewer) for each external trial participant role.

1. Log in to D2-Config as an administrator.


2. Select OpenText Life Sciences eTMF Solution from the configuration filter.
3. Select Data > Taxonomy from the menu bar.
4. In the Taxonomies list, select TMF Classification by Artifact and edit the taxonomy.
In the dialog box, select Excel as the file format and click Modify. Save the file to your local
machine.
5. In the TMF External Contributors, TMF External Reviewers, TMF Inspector Access, and
TMF Investigator Access columns of the spreadsheet, type the document-level roles for each
artifact in the cells of the columns. If you adjusted or added additional role columns, type the
document-level roles for each artifact for your defined roles.
The following figure shows an example of document-level access changes in the TMF External
Reviewers column and an additional TMF Contractor Access column added for a new role:


In practice, these role settings are most likely to be Author to provide read and write access and either Reader or Auditor for read-only access. The available roles are defined in the following table:

Role Description
Document Coordinator Has full access and can reassign
document-level roles and define the
effective period of a Category 1 document
Author Has read and write access to work-in-progress
versions and can self-approve Category 3
documents
Reviewer Can participate in a review and approval
workflow
Approver Electronically-signs a Category 1-2 document
through a review and approval workflow
Reader Has read-only access to Effective versions
Auditor Has read-only access to release-ready versions,
including historic release-ready versions.
For example, Release Pending, Effective,
Superseded, but not work-in-progress and
Withdrawn versions.
None Has no access to this artifact (the artifact is
hidden)

6. Save the file, import it, and save the configuration.


The Documentum D2 Administration Guide provides more information on configuring D2-Config.

Adding an External Participant Role Example


This procedure provides an example of how to add an external trial participant role, Contractor,
to your system.

1. Log in to D2-Config as an administrator.


2. Select OpenText Life Sciences eTMF Solution from the configuration filter.
3. Select Data > Dictionary from the menu bar.


4. In the Dictionaries list, select TMF External Contributor Roles.


5. On the Alias tab, configure the blank row for the new role as defined in the following table:

Column Configuration example


(Checkbox) Select the checkbox.
Key Type Contractor.
Group Name Suffix Type _contract.
Taxonomy Dictionary Level Type TMF Contractor Access.
File Plan Column Alias Type CONTRACTOR.
Context Group Type tmf_contractor.

6. Save the configuration.


7. In Documentum Administrator (DA), create a role with the same name as the context group.
In this example, create a role in DA named tmf_contractor since the context group is named
tmf_contractor.
8. Create a context for the role:
a. In D2-Config, select Go to > Context on the menu bar, click New, and create a context with a
similar name, such as TMF Contractors.
b. In the Parents area, select TMF Roles.
c. In the Group area, select and move the context group to the Selection area of the context
and save the configuration.


9. Click Matrix and map the context to a workspace:


a. Expand the TMF Roles context.
b. Click to place a check mark at the intersection of the context and a workspace.


c. Save the configuration.


10. Copy the TMF Document Roles dictionary to create a new dictionary:
a. Select Data > Dictionary from the menu bar.
b. In the Dictionaries list, select TMF Document Roles and click Create from on the toolbar.
c. Type TMF Contractor Access as the name for this copy of the dictionary. This name should
be the same as the name defined for the participant role in the Taxonomy Dictionary Level
column of the TMF External Contributor Roles dictionary.
d. Do not make any other changes and save the configuration.
11. Update the TMF Classification by Artifact taxonomy with the participant role:
a. Select Data > Taxonomy from the menu bar.
b. In the Taxonomies list, select TMF Classification by Artifact.
In the dialog box, select Excel as the file format and click Modify. Save the file to your local
machine.
c. Add a column with the label TMF Contractors next to the other external participant role
columns. The column name must match the name of the dictionary that you created for the
role.


d. In the cells of the TMF Contractors column, type the document-level roles for each artifact
for your defined roles. For this example, type Reviewer in the cells to provide read-only
access for each artifact to the role.

e. Save the file, import it, and save the configuration.

Configuring Quality Check (LSTMF)


Documentum for eTMF provides a scalable Quality Check (QC) process for documents that are
imported into the system. Documents imported into the system in the Index state may be associated
with a specific QC group as described in Quality Check (LSTMF), page 34 based on the QC type
defined for that artifact in the TMF Unique Artifact Names dictionary in D2-Config. This QC group
will be responsible for inspecting the document for quality and finalizing the document. By default,
documents that need to undergo QC have the QC type defined in the TMF Unique Artifact Names
dictionary:
• Technical—Indicates that a document requires a technical quality check. Only users belonging to
the Technical QC group can access these documents.
• Business—Indicates that a document requires a business quality check. Only users belonging to
the Business QC group can access these documents.
• To Draft—Indicates that the imported document moves to the Draft state.
• <Blank>—Indicates that the document is not assigned to any QC process and can be moved to
the Final state.
QC types are set on placeholders when they are created and do not change if the reference model is updated. As such, a document's QC process is based on the QC process for the artifact that was in place when the placeholder was created, not on its current value in the reference model.
To view the QC types that are currently defined in the system, you can open the TMF QC Types
dictionary in D2-Config. During the QC process, when you reject a document, you must provide a
QC reject reason. The reject reasons are defined in the TMF QC Reject Reasons dictionary, where you
can define additional reject reasons for each QC type.

If you do not want a document to go through the QC process, or if you want to change the QC type, perform the following steps:
1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.


3. Select Data > Dictionary.


4. Under Dictionaries, select TMF QC Types.
5. Under Properties, on the Alias tab, you can define additional QC types.
6. Under Dictionaries, select the dictionary, for example, TMF Unique Artifact Names 2.0 or
TMF Unique Artifact Names 3.0.
7. To export the dictionary to an Excel sheet, click Export and select Excel as the file format.
8. In the dictionary Excel sheet, in the QC Type column for a particular artifact, change the QC type
or leave it blank if you do not want the document to go through the QC process.
9. To import the dictionary, click Import and select the updated dictionary Excel sheet.
10. Click Save and then click Update repository.

Distributed Server Method Processing


In the Life Sciences solution, a concurrent server method processing framework is provided to
improve the efficiency and scalability of server methods. Life Sciences supports both multi-threading
and distributed processing for the following operations:
• Applying attribute inheritance rules to multiple documents and registration forms in
Documentum for Quality and Manufacturing, Documentum for eTMF, and Documentum for
Research and Development. For example, to reapply Documentum D2 auto-naming to affected
documents and registration forms when a Manager changes a product code.
• Generating missing TMF placeholders on trial activation and refresh in Documentum for eTMF.
• Updating the dynamic access control groups for external participants in a TMF in Documentum
for eTMF.
For these operations, the standard settings allocate up to five threads for local multi-threaded
processing, but disable distributed processing. To use distributed processing, set up a multi-node
Documentum Server architecture with at least two Documentum Servers. Using D2-Config, enable
the distributed processing options in the System Parameters dictionary. Because of the caching of
the dictionary settings, ensure that you restart the relevant Java Method Server services to make
the configuration changes effective.
If there is sufficient CPU capacity, you can also increase the number of processing threads per server.
Do not over-optimize the system solely for TMF processing, because it can result in diminished
performance of the repository overall. It is important to remember that Documentum has to serve
interactive users, Documentum D2 lifecycle operations, workflows, and other batch processes at the
same time. To balance the loads effectively in multi-node architectures, it is possible to allocate
some of the servers for interactive use and others for batch processing. Contact OpenText Global
Technical Services for assistance with analyzing your TMF system architecture and performance
tuning. For improving the performance of server methods, see Improving Performance of Server
Methods, page 428.


Enabling Distributed and Multi-threaded Processing


Distributed processing enables large batch jobs to execute across multiple Documentum Servers.
The system balances the workload over the available Documentum Servers and uses distributed
processing only for large datasets.

1. Verify that multiple Documentum Servers are available. The repository should have more than
one dm_server_config object registered.
2. Log in to D2-Config as an administrator.
3. Select All elements from the configuration filter.
4. Select Data > Dictionary from the menu bar.
5. In the Dictionaries list, select System Parameters.
6. On the Alias tab, configure the system parameters as defined in the following table:

Parameter: content_servers
Description: Type the Documentum Servers available for distributed processing. Use a comma-separated list of dm_server_config names, or * to use all available servers. To enable distributed processing, specify two or more Documentum Servers. To disable distributed processing, set this value to a blank string.

Parameter: distributed_processing_threshold
Description: Type the minimum data size for invoking distributed server methods in multi-node Documentum Server architectures. To enable distributed processing, set this parameter to at least 2. For example, if you set it to 1000, batches of 1000 objects or more distribute across the available servers, but batches with fewer than 1000 objects process locally. To disable distributed processing, set this parameter to 0 (zero). Distributed processing is disabled by default. Note: Set this value to a reasonably high level to avoid the overhead of distributing processing tasks when there are relatively few objects for processing; in that case it is more efficient to process them locally.

Parameter: max_threads
Description: Type the maximum number of server method threads per Documentum Server for local processing. The minimum value is 1, in which case the tasks are processed sequentially in single-threaded mode. The default setting is 5. There is no defined maximum value, but increasing it up to or beyond the point at which the Documentum Server CPU becomes saturated is not recommended, as this can be detrimental to the performance of the repository overall.

Parameter: shared_folder
Description: (Optional) Type a dm_location name or explicit path of a network-shared folder used to interchange data efficiently between Documentum Servers for distributed processes. If you do not define this value, data exchanges occur through temporary binary objects stored in the repository.

7. Save the configuration.


8. Restart the Java Method Server.
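To support step 1, you can list the Documentum Servers registered for the repository with a simple DQL query; distributed processing is only meaningful when this returns more than one row:

SELECT object_name, r_host_name FROM dm_server_config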

Disabling Parallel Processing for CDF Methods


Distributed parallel processing has been implemented for the following CDF methods and utilities, which create multiple threads across the available Documentum Server instances to share the processing load:
• CDFVirtualDocumentMethod
• CDFApplyAttributeInheritanceMethod
This distributed parallel processing logic reads the System Parameters dictionary and processes the content_servers configured in the dictionary, assuming that all the server names configured in the dictionary are up and running. In addition, it uses the default threshold value of 100. You can enable or disable distributed processing by configuring the following parameters in the dictionary:
• content_servers
• distributed_processing_threshold
To disable the parallel processing, the parameters must be set to null.


Creating a dm_method Object


To support distributed processing in multi-node Documentum Server architectures, you must create
a dm_method object in Documentum to enable remote processing:
1. Log in to Documentum Administrator (or equivalent tool) as the Documentum installation owner.
2. Run the following DQL statement:
create dm_method object
set object_name = 'CDFApplyD2ConfigurationsMethod',
set title = 'CDF',
set method_verb = 'com.emc.d2.api.methods.D2Method -class_name com.documentum.cdf.methods.ApplyD2Configurations',
set method_type = 'java',
set use_method_server = true,
set launch_async = false,
set run_as_server = true,
set timeout_min = 3600,
set timeout_max = 86400,
set timeout_default = 7200

This creates the required dm_method object with timeout set to 7200s (2 hours) by default (adjust
the timeout parameters if necessary).
Due to security enhancements in Documentum Server 7.1 and 7.2, the bindfile API no longer works.
Although this does not affect the Life Sciences system, it may have an impact if the user changes the
run_as_server property to false on the following dm_method objects:
• CDFCreateObjectAsyncMethod
• CDFCreateObjectMethod
• TMFCreateLinks
Ensure that the run_as_server attribute is always set to True (1) for these dm_method objects.
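A quick way to confirm the setting on these methods is a DQL query such as the following:

SELECT object_name, run_as_server
FROM dm_method
WHERE object_name IN ('CDFCreateObjectAsyncMethod', 'CDFCreateObjectMethod', 'TMFCreateLinks')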

Enabling the Trace Level of the D2 Core Method


Whenever the ApplyD2Configurations method is invoked internally through server methods, the
D2 core method is called for each object. By default, the D2 core method called through the server
methods is in INFO mode. If you want to enable DEBUG for the D2 core method, follow these steps:

1. Log in to D2-Config.


2. In the filter on the main toolbar, select All elements.
3. Select Data > Dictionary.
4. Under Dictionaries, select System Parameters.
5. On the Alias tab, for the d2_core_method_trace_level key, set any value between 0 and 10 to
enable the tracing level.
6. Click Save.
7. Clear the Documentum cache both on the server and client side.


Note: This D2 core method tracing is only for core method calls invoked through Life Sciences server
methods. For setting the DEBUG level for the default D2 Core Method calls, see the Documentum D2
Administration Guide.

Enabling the “Convert to virtual document” Menu Option
In certain Life Sciences solutions, such as Documentum for Quality and Manufacturing, the Convert
to virtual document menu option is disabled by default. If you want to enable this option, follow
these steps:

1. Log in to D2-Config.
2. In the filter on the main toolbar, select All elements.
3. Select Go to > Menu D2.
4. Under Menus, select CDF Contributor Menu.
5. Under Contextual menus, click <Right click> and click Convert to virtual document.
6. Under Conditions, ensure that Selection is not object type "....." is selected.
7. Under Condition parameters, for the Type field, click the ellipsis (...) button.
8. Under Selection, remove the following values:
• cd_quality_gmp_approved
• cd_quality_gmp_effective
9. Click OK and then click Save.

Updating the PDF DocInfo Parameter in the D2 Dictionary

The source_object_pdf_docinfo_field parameter specifies an optional PDF Document Information
(DocInfo) field used by the submission publishing tool to record the Documentum r_object_ids of
the original source documents in the published PDF documents. When publishing documents from
Documentum, it is useful to embed the source object IDs in the corresponding published PDFs in
a designated DocInfo field.
In some instances, multiple source documents are combined to generate a single published output
document in the publishing tool. In these cases, the source_object_pdf_docinfo_field parameter can
be updated with multiple r_object_ids of the source documents separated by a delimiter (any special
character such as a comma, semicolon, backslash, and so on). This enables navigating to multiple
source documents from a single archived submission element to provide a complete view.
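For example, if two source documents were combined into one published PDF and a comma is configured as the delimiter, a custom SourceObjectId DocInfo field in the published PDF might contain a value such as the following (the object IDs shown are illustrative only):
SourceObjectId: 0900000180001a2b,0900000180001a2c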
Any of the standard DocInfo fields, such as Subject, can be used for this purpose. Alternatively,
a custom DocInfo field can be used, such as SourceObjectId, which will not be shown on the
document properties page of Adobe Acrobat. Configuring the Document Information parameter
enables the system to track the original source documents and relate them to the corresponding


published submission documents when they are imported into the repository through the Import
Submission function.
Follow these steps to update the PDF document information in the System Parameter D2 dictionary.

1. Log in to D2-Config.
2. Select Data > Dictionary.
3. On the Dictionary page, select All elements in the filter on the toolbar.
4. In the Dictionaries list, select System Parameters.
5. On the Alias tab, add a standard or custom DocInfo field, such as SourceObjectId or Subject, as
the value of source_object_pdf_docinfo_field.
6. In the source_object_pdf_multipleIDs_delimiter field, add the delimiter value that is used to
separate the list of source document r_object_ids in source_object_pdf_docinfo_field such as
comma, semicolon, backslash, colon, and so on.
7. Click Save.
On the Relations tab in the D2 Client, the user can click an imported submission document and
view the original source document that was used to generate it. The published version usually
contains additional markups such as headers, footers, watermarks, bookmarks, and hyperlinks
generated by the publishing tool, but it is sometimes useful to refer back to the original document
for publishing fidelity checking or to enable the original document to be identified and repurposed
for other applications.
If the publishing tool does not support embedding of Documentum attributes into PDF DocInfo
fields, or where the original documents are stored in a different repository, leave this setting blank;
source document relations will not be created at submission import time in that case.

Chapter 17
Regulatory Submissions

This section provides an overview of regulatory submissions and how they are configured in the Life
Sciences solution.

Submission Overview
A regulatory submission is a collection of documents (submission elements) sent to a regulatory
Authority with respect to an application, for example, an Investigational New Drug (IND)
application to the US Food and Drug Administration (FDA) for approval to commence clinical trials
in humans, or a New Drug Application (NDA) to the FDA for approval to market a drug in the US. The
application type denotes the purpose of the application (IND or NDA, in the preceding examples).
An application may require several submissions to be made before it is approved. Various
amendments, queries, and requests for supplementary information may be requested by the
Authority, and post-approval, additional submissions may be necessary from time to time, such
as Periodic Safety Update Reports (PSURs). The submission type indicates the purpose of each
submission, for example, an Initial Filing, Amendment, or PSUR. Both application types and
submission types are regional – different application and submission types are used in different
geographic regions. For example, the IND and NDA application types pertain to the US, whereas
the European equivalents are the CTA (Clinical Trial Application) and MAA (Marketing Authorization
Application), respectively.
The following figure illustrates the relationship between the various submission-related objects.


An application is typically made to one health authority in one country, in accordance with a
National Procedure (NP). The exception is for applications to European member states, which can
follow either a National Procedure for specific member states (a separate application being made
to each member state in that case), or a Centralized Procedure (CP), in which case the application
is made directly to the European Medicines Agency (EMA), the central European regulatory
authority. If the application is approved by the EMA, it is approved for use across all EU member
states. For certain types of applications, such as Biological License Applications (BLAs), approval
by the EMA through the CP is mandatory; for others, it is discretional. There is also an option to
use the Mutual Recognition Procedure (MRP) in the EU, which enables the same application to be
made to two or more member states simultaneously. Once one member state decides to evaluate
the product, it becomes the Reference Member State. The others become Concerned Member States,
acting in a reviewing or monitoring capacity. In this way, the MRP is designed to share the workload
in evaluating medicinal products across national regulatory authorities within the EU, without
compromising safety or regulatory scrutiny. To support MRP applications, it must be possible for an
application to be associated with multiple health authorities within the EU region.


Electronic Common Technical Document Submission

An application can adopt the electronic Common Technical Document (eCTD) format, as defined
by the International Council for Harmonisation (ICH). In an eCTD submission, each submission
represents a sequence of the eCTD, enumerated from sequence number 0000 for the initial filing. An
XML backbone file (index.xml) is included within the top-level sequence folder, referencing the
documents included in that sequence. The documents in a sequence are organized into the prescribed
ICH eCTD module structure:
• Module 1 (Regional)
• Module 2 (Summaries)
• Module 3 (Quality)
• Module 4 (Non-Clinical)
• Module 5 (Clinical)
The structure of the regional module m1 is not defined by the ICH, but is defined by the regulatory
authorities in each region. It has its own regional XML file in the appropriate format (for example,
us-regional.xml for US, eu-regional.xml for Europe) referring to the regional-specific
documents included in module m1 of the sequence. Document-specific metadata may be included in
the XML backbone and regional XML files, such as the drug substance and manufacturer to which a
particular document pertains and submission-specific metadata may also be included in the regional
XML file, such as the product trade names intended to be used in each country.
The initial submission for an eCTD comprises only new documents. Each subsequent submission
(numbered 0001, 0002, and so on) contains only incremental changes, that is, additional
(new) documents, replacements, and supplements to previously-submitted documents, and
previously-submitted documents to be considered as withdrawn. Each sequence includes a new
version of the XML backbone file, specifying the alterations to be made for each leaf document in
terms of operations (new, replace, append or delete).

Regional XML Files for Other Agencies


Additional regional XML files may be included in the standard installation, or downloaded from the
relevant agency websites and manually-imported into the repository post-installation, to support
other regions.

Additional XSL Style Sheets


The XSL style sheets provided by the ICH and regional agencies are installed on the application
server as part of the Documentum Submission Store and View installation. However, they are not
required for XML processing and import purposes.
These files can be used to preview regional XML files in the Submission Viewer if the HTML renditions
are not available. However, HTML renditions are usually generated during import, in which case the


XSL stylesheet files are not required for previewing. They may be useful as examples if additional
stylesheets need to be provided to support new regions or new DTD formats for existing regions.

Submission History XML Files


The submission history view (model) is compiled and maintained automatically by the system as each
submission or sequence is imported into the repository. The model is represented in several XML files:
• The overall submission history file, or Table of Contents view, is stored as the primary content file of
the relevant Regulatory Application Registration Form in XML format. This is used to render
the initial table of contents view when a Regulatory Application Registration Form is selected for
previewing in the Submission Viewer.
• A separate submission view file is also generated for each imported submission or eCTD sequence,
which is stored as the primary content file of the relevant Submission Registration Form (SRF)
in each case. SRFs can be pre-registered prior to importing. Otherwise, new SRFs are created
by the system during import. The SRF XML file is used to render the individual sequence or
submission views in the Submission Viewer when they are selected for previewing. Hyperlinks
are also provided in the left column of the Table of Contents view to facilitate navigation to
the individual submission or sequence views.
• In the case of eCTD submissions, a Current SRF object is also generated as a hidden object in the
repository, in which the cumulative view XML file is stored. This is used to render the Current
and Cumulative views in the Submission Viewer, showing the consolidated set of files in each
eCTD section across all sequences. The Current view shows only the latest versions of each file,
hiding replaced or deleted files, whereas the Cumulative view shows all files.
• Cross-reference files are also generated and attached as xml_rarf_xref renditions to the SRF, current
SRF, and individual submission document objects. These are used internally by the Submission
Viewer to open the view in the appropriate mode when an SRF or archived submission document
is selected for previewing in the Submission Viewer. For example, you can search for a submission
or archived submission document based on certain criteria using the search widgets provided,
and select the appropriate SRF or document to view it in the Submission Viewer.

Submission Import Progress Monitoring and Error Handling

During submission import operations, a progress report and summary of the outcome of the last
import operation is recorded in the log_entry attribute of the Regulatory Application Registration
Form. This is labelled as User Comments in Documentum. It is useful to add the User Comments
field to the column settings in D2, so that the current progress and last outcome can be tracked by
refreshing the display from time to time. When the import operation is completed, the status of the
Regulatory Application Registration Form is changed from Importing to Active or Import Failed.
If errors or warnings are reported in the User Comments field, the submission import log file can
then be opened to obtain further information. This file contains a rolling log of the import operations
carried out for the regulatory application, including details of any errors or validation warnings
encountered during the import process, for example, PDF files containing invalid cross-reference


hyperlinks. The submission log file is attached to the Regulatory Application Registration Form as a
text or crtext rendition that can be opened in Notepad from the Renditions tab in D2.
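As an alternative to the D2 column display, the progress report can also be read directly with a DQL query such as the following, where the object name is a placeholder for the name of the relevant Regulatory Application Registration Form:
select r_object_id, object_name, log_entry
from cd_reg_admin_info
where object_name = '<application-name>'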
The selected submissions or eCTD sequences for the application are downloaded, imported, and
committed in the repository one by one, so that the submission view XML files (including the eCTD
cumulative view) are kept up-to-date and are valid for the last successfully-imported submission
or eCTD sequence. In the event of an irrecoverable error, such as when the submission files cannot be
downloaded and imported due to a network outage, the changes for the current submission are not
applied to the submission view files. In some cases, the SRF and submission files for the failed
submission or sequence may exist in the repository, even though that submission or sequence is not
visible in the Submission Viewer. However, the SRF itself is not marked as imported until the import
process is complete. This enables the failed submissions or sequences to be re-imported after the
error has been rectified and the system restored. On re-importing the remaining submissions or
sequences, any files that currently exist in the repository are automatically deleted prior to importing
the new files.
To enable previously-imported submissions or sequences to be re-imported, it is necessary to update
the relevant SRFs to mark them as not imported (or the SRFs can be deleted completely). For security
reasons, this can only be accomplished by a Documentum Superuser account, for example in
Documentum Administrator, through the following DQL statement:
update cd_reg_submission_info object
set is_imported = false
where application_description = '<application-description>'
[and submission_number = '<submission-number>']
To enable all imported submissions or sequences for the application to be re-imported, omit the last
and submission_number ... clause, so that all SRFs for the application are updated. It is not
necessary to delete the existing submission folders, subfolders, and documents from the repository,
or to reset the XML view files. This is done automatically when the submissions or sequences are
re-imported.
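For example, to mark only a single previously-imported sequence of an application for re-import, the statement might look like the following; the application description and submission number shown are placeholders:
update cd_reg_submission_info object
set is_imported = false
where application_description = 'Wonder Drug IND 123456'
and submission_number = '0002'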

Diagnostic Files for Submission Import


After importing submissions or eCTD sequences, an additional ZIP file rendition is now attached to
the RARF along with the import log file (text rendition). The ZIP file contains additional diagnostic
files in Comma-Separated Values (CSV) format, which can be downloaded and opened in Microsoft
Excel:
• A Metadata Dump File for each imported eCTD sequence or NeeS submission, containing details
of the Documentum metadata applied to each submission folder or document, including metadata
inherited from the RARF or SRF or extracted from eCTD XML files.
• PDF cross-reference hyperlink information showing the source and destination files for each
hyperlink identified by the system and whether or not the links were resolved successfully in
each case.
• In the case of eCTDs, the modified file information for each sequence extracted from the eCTD
XML backbone files.
• For eCTDs containing Study Tagging Files (STFs), the metadata extracted from the STF XML files,
including the file tags, study-related metadata, and leaf document references, as and where
applicable.


These files are useful for diagnostic purposes and can be used to validate the outcome of a submission
import operation. The dump files are preserved across import operations, and either replaced or
augmented in subsequent imports. However, if the initial eCTD sequence 0000 is re-imported, the
submission views are rebuilt and any existing dump files are discarded and regenerated in that case.
Note: It is possible to disable this feature by specifying the argument -attach_dump_files as false
in the "(Import Submissions)" transition of the RARF lifecycle configuration. This option is enabled
by default.

Supporting New eCTD XML Formats


Documentum Submission Store and View supports the following eCTD XML formats by default:


Module M1 (regional XML files):
• Australia, Therapeutic Goods Administration (TGA): DTD/Schema version 0.9 (specification version 0.9), predefined XML schema configuration name "eCTD Australian Regional XML file v 0.9"; DTD/Schema version 3.0 (specification version 3.0), "eCTD Australian Regional XML file v 3.0"
• Canada, Health Canada: DTD/Schema versions 1.0 and 2.2 (specification versions 1.0 and 2.2), "eCTD Canada Regional XML file v x.y"
• Europe, EMA and all EU Member State Health Authorities: DTD/Schema versions 1.0, 1.1, 1.2.1, 1.3, 1.4, 2.0, 3.0, and 3.0.1 (matching specification versions), "eCTD EU Regional XML file v x.y"
• Gulf Cooperation Council, Saudi Food and Drug Authority: DTD/Schema version 1.0 (specification version 1.2), "eCTD Gulf Cooperation Council Regional XML file v 1.2"; DTD/Schema version 1.1 (specification version 1.5), "eCTD Gulf Cooperation Council Regional XML file v 1.5"
• Japan, Ministry of Health, Labour and Welfare (MHLW): DTD/Schema versions 1.0, 1.0A, and 1.0B (specification version 1.0), "eCTD Japan Regional XML file v 1.0"
• South Africa, Medicines Control Council (MCC): DTD/Schema version 1.0 (specification version 1.0), "eCTD South African Regional XML File v 1.0"
• Switzerland, Swissmedic: DTD/Schema version 1.0.1 (specification version 1.0.1), "eCTD Swiss Regional XML file v 1.0.1"; DTD/Schema versions 1.1, 1.2, and 1.3 (matching specification versions), "eCTD Swiss Regional XML file v x.y"
• Thailand, Bureau of Drug Control (BDC): DTD/Schema version 0.92 (specification version 0.92), "eCTD Thailand Regional XML file v 0.92"; DTD/Schema version 1.0 (specification version 1.0), "eCTD Thailand Regional XML file v 1.0"

Modules M2-M5:
• Global (ICH)*: DTD/Schema version 3.2 (specification version 3.2), "eCTD ICH XML Backbone File v 3.2"

Modules M4-M5 Study Tagging Files (STFs):
• US (FDA): DTD/Schema version 2.2 (specification version 2.2), "ICH Study Tagging File v 2.2"

Structured Product Labeling file:
• US (FDA): DTD/Schema version 1.0 (specification version 1.0), "FDA Structured Product Labeling File"

*The ICH is not a health authority, but an international industry body responsible for defining and
maintaining standard models such as the format used for eCTD module M2-M5 XML backbone files.
Documentum Submission Store and View uses the XML schema configuration objects in the
repository to represent these eCTD module formats. The predefined XML schema configuration
objects are created automatically in the repository as objects of type cd_ectd_xml_schema, which
are stored in the /System/SSV/XML Schemas folder. These objects are used by Documentum
Submission Store and View to identify and handle XML files that are encountered in eCTD sequences
during the submission import process.
To support additional eCTD XML formats, such as M1 regional XML formats for other regions or
different eCTD versions for existing regions, it is possible to extend the standard configuration
as follows:
1. Using Documentum Administrator, copy and rename an existing cd_ectd_xml_schema
configuration object in the /System/SSV/XML Schemas folder that is similar to the format
required. For example, for US FDA v 2.3 regional M1 XML files, use the eCTD US-FDA
Regional XML file v 2.3 configuration as a starting point. It is also possible to create a new
cd_ectd_xml_schema configuration object through DQL although it is generally easier to
copy and adjust the existing objects. It is recommended that the existing naming conventions
are followed, including the format version number suffix, so that the applicability of each
configuration object can be easily identified.
2. Adjust the properties of the configuration object. See XML Schema Configuration Object Settings,
page 321 for more information.
3. Create an XSL transformation script, if required, to convert the XML into a standard format and
store it in the main content file of the XML schema configuration object as an xsl content file.
Then, enable the XSLT processing option for this schema. This is only required for handling XML
files that use a format or element naming convention that is dissimilar to the standard ICH
format, such as Japanese regional M1 files. See Processing Standard XML Files, page 329 and
Transforming Non-Standard XML Files, page 332 for more information.
4. On the application server, set up an XSL script to extract the regional M1 XML metadata for
previewing in Documentum Submission Store and View. The easiest way to do this is to copy
and edit one of the predefined XSL scripts or adapt an agency-supplied XSL script to enable
the regional XML metadata to be previewed (see Previewing and Processing eCTD Module 1

Regional XML Files, page 335). Link this script to the XML schema configuration object by
adjusting the xml_envelope_stylesheet property accordingly.
5. If necessary, add entries to the D2 dictionary, Submission Regional Metadata Conversions, to
convert regional XML-encoded values into suitable values for storage in Documentum. See
Mapping XML Values to Documentum Attributes, page 347 for more information.
6. Import a sample submission to test the new configuration and ensure that it can be navigated
and previewed correctly.
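The following DQL sketch illustrates the alternative of creating a new configuration object directly through DQL. The attribute values shown (format code, XPath qualifier, envelope, and leaf settings) are hypothetical examples for a US FDA v 2.3 regional format and must be verified against the actual DTD or specification before use:
create cd_ectd_xml_schema object
set object_name = 'us-regional.xml v 2.3',
set title = 'eCTD US-FDA Regional XML file v 2.3',
set schema_category = 'eCTD Regional XML File',
set xml_format_code = 'US-2.3',
set xpath_qualifier = '/fda-regional[@dtd-version=''2.3'']',
set contains_envelope = true,
set xml_envelope_element = 'admin',
set xml_envelope_stylesheet = 'us-regional.xsl',
set contains_leaf_docs = true,
set xml_extract_leaf_docs = true,
set xml_extract_leaf_docs_from = '/fda-regional/m1-regional'
link '/System/SSV/XML Schemas'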
When working with different locales, you must adjust the operating system and the database with
appropriate locale settings as suggested in the Documentum Server Installation Guide and make the
necessary changes in the Documentum Server. In addition, you must also ensure that JMS is adjusted
to the appropriate locale settings to be in sync with the Documentum Server and the
database. For example, in the JMS startup script, add the following line in JAVA_OPTS and restart
the service:
'-Dfile.encoding=UTF-8'

XML Schema Configuration Object Settings


The attributes of the cd_ectd_xml_schema objects must be configured in Documentum as follows:

The example value given for each attribute below is for the eu-regional.xml Version 2.0 schema.

• object_name (Char(255)): An arbitrary unique identifier for this XML schema configuration object in the repository. Example: eu-regional.xml v 2.0
• title (Char(400)): An optional description of the XML format that this schema object represents. Example: eCTD EU Regional XML file v 2.0
• origin_url (Char(255)): An optional URL referring to the website of the organization or Health Authority that owns the specification of this XML format. Example: http://esubmission.ema.europa.eu/eumodule1/index.htm
• xpath_qualifier (Char(255)): An XPath expression identifying the root element in an XML file that conforms to this format. This is used by SSV to identify and select the appropriate XML schema configuration object to use for each XML file discovered in the eCTD sequence. Example: /eu-backbone[@dtd-version='2.0']
• schema_category (Char(32)): Indicates the type of this XML file: eCTD Regional XML File, eCTD ICH XML Backbone File, or eCTD ICH Study Tagging File. Example: eCTD Regional XML File
• xml_format_code (Char(32)): An abbreviated display label denoting this XML format, used in the SSV Viewer to show the eCTD module formats or versions used in a particular submission. Example: EU-2.0
• enable_xslt_preprocessing (Boolean): Indicates whether this XML file needs to be pre-processed through an XSLT transformation script when loaded, in order to convert it into a form that can be processed more easily. The XSLT script itself should be stored as a main content file or rendition of the XML schema configuration object, using the Documentum format code "xsl". See Transforming Non-Standard XML Files, page 332. Example: F
• preview_stylesheet (Char(32)): Specifies the name of the XSL stylesheet to be used to render the XML as HTML in the relevant preview widget (optional). To support server-based prerendering of XML to HTML, the specified stylesheet must be installed in the /System/SSV/Stylesheets folder as a standard dm_document object (with format xsl). To support client-based rendering of XML to HTML, the stylesheet must also be installed on the application server in the %WEB-APPS%/XMLViewer/style folder. Example: eu-regional_2-0.xsl
• preview_widget_id (Char(32)): Specifies the widget ID of the target widget that should be used for previewing, for example, SSVStudyTaggingFileViewer. Optional – if undefined or blank, the default preview widget, SSVLeafDocumentViewer, is used. Example: SSVLeafDocumentViewer
• contains_leaf_docs (Boolean): Indicates whether or not this XML file contains references to "leaf elements," that is, documents to be imported into the repository. Example: T
• xml_extract_leaf_docs (Boolean): Indicates whether or not leaf elements should be extracted from this XML file and incorporated into the main Submission History View. This setting should be enabled for Regional M1 files and study tagging files, and disabled for ICH backbone files. Example: T
• xml_extract_leaf_docs_from (Char(128)): Specifies the XPath of the top-level XML nodes that contain leaf documents. Required if contains_leaf_docs is enabled. The wildcard value "*" indicates that all elements should be included. Example: /eu-backbone/m1-eu
• xml_leaf_doc_element (Char(32)): Specifies the name of the XML elements representing leaf documents (after XSLT pre-processing, where necessary). Required if contains_leaf_docs is enabled and xml_extract_leaf_docs_from is not "*". Example: leaf
• xml_leaf_doc_attrs (Char(128), repeating): Specifies the document-level XML attributes to be extracted from leaf document elements into the submission history view or Documentum attributes, using the notation <target-Documentum-attribute-name>=<XPath-expression-for-XML-leaf-element>. Applies to eCTD ICH XML Backbone Files only (not regional M1 XML files). See XML Metadata Extraction, page 338. Example: (none)
• is_required_leaf_doc_attr (Boolean, repeating): For each preceding value, indicates whether a defined, non-blank attribute value is required for each leaf element (T), or whether it is optional (F). This is not currently used. Example: (none)
• override_leaf_attrs_on_rarf (Boolean, repeating): For each preceding value, specifies whether the value extracted from the XML file can be used as a default attribute value for the Regulatory Application Registration Form. This is not currently used. Example: (none)
• contains_envelope (Boolean): Indicates whether or not this XML file contains submission-level metadata. Example: T
• xml_envelope_element (Char(32)): Specifies the name of the top-level XML elements that contain metadata to be extracted from regional XML files. Required if the contains_envelope flag is enabled. Example: envelope
• xml_envelope_stylesheet (Char(32)): The file name of the XSL stylesheet to use to render the regional XML metadata as HTML in the SSV document preview screen. The specified XSL stylesheet must be installed in the %WEBAPPS%/XMLViewer/style folder on the D2 application server. See Previewing and Processing eCTD Module 1 Regional XML Files, page 335. Example: eu-regional.xsl
• xml_envelope_attrs (Char(128), repeating): Specifies the XML attributes to be extracted from the envelope elements into the submission history view or Documentum attributes, using the notation <target-attribute>=<XPath-expression>, where <XPath-expression> is evaluated in relation to the envelope element in each case. Example: health_authority=agency-name; country_code=@country; application_number=application/number; …
• is_required_envelope_attr (Boolean, repeating): For each preceding value, indicates whether a defined, non-blank attribute value is required for each leaf element (T), or whether it is optional (F). This is not currently used. Example: T;T;T;…
• override_env_attrs_on_rarf (Boolean, repeating): For each preceding value, specifies whether the value extracted from the XML file can be used as a default attribute value for the Regulatory Application Registration Form. This is not currently used. Example: T;T;T;…

The system can support multiple versions of the same XML format. A separate XML schema
configuration object must be defined for each format or version. The xpath_qualifier setting is
used to identify and select the appropriate XML schema configuration object for each XML file
encountered during the eCTD import process, depending on whether the specified XPath expression
matches an element in that XML file. The xpath_qualifier should include a qualifier in each case
so that it only matches XML files of the appropriate format and version, for example, the XPath
expression /eu-backbone[@dtd-version='1.2.1'] only matches XML files containing a root element
named “eu-backbone” (ignoring the ”eu:” XML namespace prefix) where the “dtd-version” attribute
value of the root element is set to “1.2.1”. In other words, this schema only applies to XML files that
use version 1.2.1 of the EU regional M1 XML format, such as the following:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE eu:eu-backbone SYSTEM "../../util/dtd/eu-regional.dtd">
<?xml-stylesheet type="text/xsl" href="../../util/style/eu-regional.xsl"?>
<eu:eu-backbone xmlns:eu="http://europa.eu.int" xmlns:xlink="http://www.w3c.org/1999/xlink"
dtd-version="1.2.1">
<eu-envelope>
<envelope country="uk">
<application>
<number>N123456</number>
</application>
<applicant>Acme Pharma Inc.</applicant>
<agency-name>MHRA</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="national" />
<invented-name>My Wonder Drug</invented-name>
<inn>wonderdrug</inn>

<sequence>0000</sequence>
<submission-description>Submission of registration dossier</submission-description>
</envelope>
</eu-envelope>
<m1-eu>
<m1-0-cover>
<specific country="uk">
<leaf operation="new" xlink:href="10-cover/uk/en-cover-wonder-drug-50mg.pdf"
xlink:type="simple" checksum-type="md5" application-version="PDF 1.4"
checksum="b132fc1e9e0c5c9f5401a4288f20f60f">
<title>Cover Page (English)</title>
</leaf>
</specific>
</m1-0-cover>
…etc.
</m1-eu>
</eu:eu-backbone>

In this way, different XML formats or versions can be matched to different XML schema configuration
objects with different settings. The file system filename of the XML file itself is not significant (an
EU M1 regional XML file does not need to be named eu-regional.xml, for instance); neither is the
location of the XML file within the folder structure – the XML file recognition depends only on the
contents of the XML file. If none of the XML schema configuration objects defined in the repository
return a match for a particular XML file, the system logs a warning indicating that the XML file is
not recognized, and treats it as a standard leaf document. It is not possible to extract metadata from
unrecognized XML files. It is also not possible to preview them in the Documentum Submission
Store and View Document Viewer.

Processing Standard XML Files


If the eCTD XML file conforms to a standard format, it is possible for Documentum Submission Store
and View to process it directly by setting up XML schema configuration objects in the repository, as
described in the Supporting New eCTD XML Formats, page 318 section.
To conform to the standard format:
1. The XML file should encapsulate all regional metadata (if any) directly or indirectly in one
XML node, known as the envelope node, and all leaf documents in a separate document node.
For example, in US regional M1 files, the regional metadata is defined in the envelope node
/fda-regional/admin, and the leaf documents in the document node /fda-regional/m1-regional.
Similarly, in EU regional M1 files, the envelope node is /eu-backbone/eu-envelope, and the
documents node is /eu-backbone/m1-eu. ICH eCTD M2-M5 XML files do not have an envelope
node because the regional metadata is stored in a separate M1 XML file. The document node
for these is the root element, /ectd.
2. Each leaf document in the document node must be represented as a <leaf> element with the
following attributes defined for it (see the example after this list):
• operation: Denotes incremental modifications to previously-submitted documents, as
and where necessary. Expected values are “new” for new documents; “replace” for new
versions of previously-submitted documents; “append” for extensions (supplements) to
previously-submitted documents; or “delete” for previously-submitted documents to be

regarded as withdrawn. In the initial sequence 0000, only “new” documents are permitted.
(Required)
• xlink:href: Specifies the document content file in terms of a relative file system path from the
sequence-level folder. (Required except where operation = “delete”)
• xlink:type: Specifies the type of the xlink:href value, which is expected to be “simple”, where
defined. (Optional)
• checksum: Specifies a checksum for the content file, which can be used to verify that the
content file has not been modified since the XML file was generated. (Optional)
• checksum-type: Specifies the algorithm used to generate the checksum; usually “md5”
(where defined). This is not currently used by LSSSV. (Optional)
• application-version: Denotes the application name and version number associated with the
content file, that is, the content file format. For example, “PDF-1.4”. (Optional)
• modified-file: Denotes the previously-submitted file affected by the operation (if any) as a
relative file path from the top-level folder containing the sequence folders; the first path
element being the previous sequence number. (Required where operation code is “replace”,
“append” or “delete”; not applicable where it is “new”)
Each <leaf> element should also have a <title> sub-element, in which the document title is
specified. In practice, it is possible to use a different XML element name for leaf documents,
provided the element has attributes and a <title> sub-element defined as above – the actual leaf
document element name is configured in the XML schema configuration object.
3. Non-leaf nodes in the document section are used to represent standard eCTD modules, sections,
and subsections, nested to an arbitrary depth. These usually represent the file system folder
structure of the eCTD sequence. Non-leaf nodes can have the following additional attribute
values defined for them, indicating sections pertaining to specific contexts:
• substance—Name of drug substance
• product-name—Name of drug product
• manufacturer—Manufacturer of drug substance or product
• indication—Specific therapeutic indication
• dosageform—Specific dosage form
• excipient—Name of excipient (inactive ingredient)
These attributes are defined in the relevant ICH DTD. In practice, Documentum Submission Store
and View enables arbitrary metadata to be specified for non-leaf nodes.
4. <node-extension> elements may also be used to represent custom extensions to the standard
eCTD folder structure. Each <node-extension> element has a <title> sub-element defining the
section title (display label), and one or more <leaf> document elements or <node-extension>
sub-elements recursively.
5. In M1 regional modules, <specific> elements can be used to denote country-specific sections,
where the country attribute value denotes the ISO country code in each case. For example,
<specific country="de"> represents a section that is applicable to Germany only. In practice, these
elements only appear in EU regional M1 XML files. The country code value eu or the aliases ema,

emea, or common can be used to denote elements pertaining to the central European Medicines
Agency (EMA) or to the EU region in general.
6. In M1 regional modules, <pi-doc> elements can be used to denote package information
documents, such as labeling documents. This may be country-specific (with a country code
defined earlier). These elements may also specify a type attribute value indicating the type of
labeling document in each case, although Documentum Submission Store and View does not use
this. These elements only appear in EU regional M1 XML files.
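For illustration, a leaf element in a later sequence that replaces a document submitted in sequence 0000 might look like the following; the file paths, title, and checksum values are illustrative only, and the xlink prefix is assumed to be declared on the backbone root element as in the ICH backbone files:
<leaf operation="replace" xlink:href="m2/24-nonclin-over/nonclinical-overview.pdf"
xlink:type="simple" checksum-type="md5" checksum="0123456789abcdef0123456789abcdef"
modified-file="0000/m2/24-nonclin-over/nonclinical-overview.pdf">
<title>Nonclinical Overview</title>
</leaf>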
All the XML files supported in the out-of-the-box configuration conform to this standard format
except for the Japanese regional M1 XML files.
Note: Whenever a large amount of processing is involved during the import of an eCTD submission
that has multiple submission folders, it is better to allocate more resources to the Java Method Server.
For example, memory of 1024 MB or greater is preferable.


Transforming Non-Standard XML Files


For Documentum Submission Store and View to process non-standard-format XML files, it is
necessary to provide a transformation script written in extensible stylesheet language (XSL) to enable
Documentum Submission Store and View to convert them into the standard format so that they can be
processed. For example, Japanese regional M1 XML files use a non-standard format as shown below:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="../../util/style/jp-regional-1-0.xsl"?>
<universal xmlns="universal" xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="universal ../../util/dtd/jp-regional-1-0.xsd" lang="ja" schema-version="1.0">
<document-identifier>
<title>Japanese title</title>
<doc-id>Document identifier</doc-id>
</document-identifier>
<document>
<content-block param="admin">
…Japanese envelope metadata
</content-block>
<content-block param="m1">
<block-title>Japanese section label </block-title>
<content-block param="m1-03">
<block-title> Japanese sub-section label </block-title>
<doc-content xlink:href="relative content file path">
<title>1.3.5. Japanese document title</title>
<property name="operation" info-type="jp-regional-m1-toc">new</property>
<property name="checksum" info-type="jp-regional-m1-toc"> content file checksum
</property>
<property name="checksum-type" info-type="jp-regional-m1-toc">md5</property>
</doc-content>
…etc.
</content-block>
</content-block>
</document>
</universal>

To convert Japanese regional M1 XML files into the standard format, the following XSL transformation
script is used:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<xsl:transform version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ectd="http://www.ich.org/ectd"
xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:gen="universal">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
<xsl:template match="/">
<xsl:element name="jp-backbone">
<xsl:attribute name="dtd-version"><xsl:value-of select="/gen:universal/@schema-version"/>
</xsl:attribute>
<xsl:apply-templates select="*"/>
</xsl:element>
</xsl:template>
<!--Japanese M1 root XML element-->
<xsl:template match="gen:universal">

<!--Generate "admin" section containing document title, ID and Japanese MHLW envelope
information-->
<xsl:element name="admin">
<xsl:element name="title">
<xsl:value-of select="gen:document-identifier/gen:title"/>
</xsl:element>
<xsl:element name="document-id">
<xsl:value-of select="gen:document-identifier/gen:doc-id"/>

</xsl:element>
<xsl:element name="properties">
<xsl:attribute name="label"><xsl:value-of select="/gen:universal/gen:document/gen:
content-block[@param = 'admin']/gen:block-title"/></xsl:attribute>
<xsl:for-each select="/gen:universal/gen:document/gen:content-block[@param = 'admin']/*">
<xsl:choose>
<xsl:when test="name() = 'doc-content'">
<xsl:element name="property">
<xsl:attribute name="item-number"><xsl:value-of select="@param"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="./gen:property/@name"/></xsl:attribute>
<xsl:attribute name="label"><xsl:value-of select="./gen:title"/></xsl:attribute>
<xsl:attribute name="value"><xsl:value-of select="./gen:property"/></xsl:attribute>
</xsl:element>
</xsl:when>
<xsl:when test="name() = 'content-block'">
<xsl:element name="property">
<xsl:attribute name="item-number"><xsl:value-of select="@param"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="./gen:doc-content/gen:property/@name"/>
</xsl:attribute>
<xsl:attribute name="label"><xsl:value-of select="./gen:block-title"/></xsl:attribute>
<xsl:attribute name="value"><xsl:value-of select="./gen:doc-content/gen:property"/>
</xsl:attribute>
</xsl:element>
</xsl:when>
</xsl:choose>
</xsl:for-each>
</xsl:element>
</xsl:element>

<!--Generate "m1-jp" section containing M1 modules and leaf documents-->


<xsl:element name="m1-jp">
<xsl:attribute name="label"><xsl:value-of select="/gen:universal/gen:document/gen:
content-block[@param = 'm1']/gen:block-title"/></xsl:attribute>
<xsl:apply-templates select="/gen:universal/gen:document/gen:
content-block[@param = 'm1']/gen:content-block"/>
</xsl:element>
</xsl:template>

<!--M1 module / sub-module-->


<xsl:template match="gen:content-block">
<xsl:element name="{@param}">
<xsl:attribute name="label"><xsl:value-of select="./gen:block-title"/></xsl:attribute>
<xsl:apply-templates/>
</xsl:element>
</xsl:template>

<!--M1 leaf document-->


<xsl:template match="gen:doc-content">
<xsl:element name="leaf">
<xsl:attribute name="operation"><xsl:value-of select="./gen:property[@name = 'operation']"/>
</xsl:attribute>
<xsl:attribute name="checksum-type"><xsl:value-of select="./gen:property[@name =
'checksum-type']
"/></xsl:attribute>
<xsl:attribute name="checksum"><xsl:value-of select="./gen:property[@name = 'checksum']"/>
</xsl:attribute>
<xsl:attribute name="xlink:href"><xsl:value-of select="@xlink:href"/></xsl:attribute>
<xsl:if test="string-length(./gen:property[@name = 'modified']) > 0">
<xsl:attribute name="modified-file"><xsl:value-of select="./gen:property[@name = 'modified']
"/>
</xsl:attribute>
</xsl:if>
<xsl:element name="title">
<xsl:value-of select="./gen:title"/>
</xsl:element>

</xsl:element>
</xsl:template>

<xsl:template match="*"/>
</xsl:transform>

This script is included in the standard installation, and is stored in the repository as the main content
file of the XML eCTD schema configuration object for Japanese regional M1 files. The XSL file is then
downloaded and applied automatically to Japanese M1 XML files at import time as they are loaded,
prior to processing them. A copy of this XSL transformation script can also be found on the Application
Server in the file %WEBAPPS%/XMLViewer/style/jp-regional-1-0-normalize.xslt for
reference, although it is not used by the Viewer.
The output from the XSL transformation script is another XML file, of the following form:
<jp-backbone dtd-version="Japanese schema version">
<admin>
<title>Japanese title</title>
<document-id>document identifier</document-id>
<properties label="Japanese section label">
<property item-number="nn" name="property-code" label="Japanese property label"
value="Property value"/>
...
</properties>
</admin>
<m1-jp label="Japanese section label">
<m1-01 label="Japanese sub-section label">
<m1-01-03>
<leaf operation="new" checksum-type="md5" checksum="content file checksum"
xlink:href="relative content file path">
<title>Japanese document title</title>
</leaf>
...
</m1-01-03>
</m1-01>
...
</m1-jp>
</jp-backbone>

This now conforms to the standard XML format required by Documentum Submission Store and
View. The XML schema configuration settings are configured accordingly for this XML file to
enable it to be processed as normal. For example, the xml_envelope_element XPath setting is set
to /jp-backbone/admin in the eCTD schema configuration object for Japanese M1 files; similarly,
xml_extract_leaf_docs_from is set to /jp-backbone/m1-jp, and so on.
It is possible to use a similar technique to support new eCTD formats that do not conform to the
standard XML format, as follows:
1. Create an XSL transformation script to convert the eCTD format into a standard format, just as
in the preceding example.
2. Test the XSL transformation script by linking a sample eCTD XML file to it, for example, add an
XML preprocessing instruction of the form <?xml-stylesheet type="text/xsl" href="filename.xsl"?>
to the XML header in the sample file, and opening it in the browser.
3. After verifying that the XSL transformation script is generating XML output appropriately in
the standard format, create a new XML configuration object in the repository representing
the new eCTD format, and configure the settings as described previously. Ensure that the
enable_xslt_processing option is enabled and import the XSL transformation script as the

main content file of the XML configuration object in the repository, using the Documentum
format code xsl.
4. If the transformed XML file is not correct (that is, ‘leaf’ nodes are missing or the XML document is
empty), use an alternative transformation through the xsl_ns rendition format of the eCTD schema
object, which specifies a namespace. Then, verify whether the resulting XML file is correct.
5. Test the installation by importing a sample eCTD that contains the XML files that use the new
format.
You can also add XSL scripts to the existing XML schema configuration objects to transform the
metadata in XML files, for example, to convert values to lower-case, translate country names into
corresponding country code values, truncate values that are too long, and so on. However, in the
default installation, XSL transformation is only used for converting Japanese regional M1 XML
files and study tagging files. XSL scripts can also be used on the application server to generate
XML preview renditions.

Previewing and Processing eCTD Module 1 Regional XML Files

XML preview renditions (of format xml_preview) are generated and stored for each module M1 XML
file at import time, alongside the original XML file. This enables the metadata contained in these
XML files to be previewed in the Submission Document Preview panel in the same way as for PDF
documents. At the same time, any leaf documents referenced in the regional XML file are extracted
and incorporated into the main submission history view, so that these files can be navigated and
previewed alongside the main eCTD module 2-5 files. To enable this, the following settings must
be configured in the relevant eCTD schema configuration object:

• contains_envelope: Set to T (enabled).
• xml_envelope_element: Specifies the elements to be included in the XML preview rendition, after having transformed the XML file into the standard format. This is also used for XML metadata extraction purposes.
• xml_envelope_stylesheet: Specifies the name of the stylesheet used to render the XML file into HTML in the Submission Document Viewer. Predefined stylesheets are provided for each of the supported regions (US, Europe, Canada, and Japan). Custom stylesheets can be created to support other regions.
• contains_leaf_docs: Set to T (enabled) if the XML file contains <leaf> elements as well as metadata, or F (disabled) otherwise.
• xml_extract_leaf_docs_from: Specifies the top-level document element containing <leaf> elements (indirectly), after having transformed the XML file into the standard format, where necessary. The specified element, together with all of its subordinates, is removed from the XML preview rendition, so that the XML preview rendition contains only envelope metadata. This is because the leaf document nodes themselves cannot be navigated from the Submission Document Viewer pane used for previewing. Instead, these elements are incorporated directly into the Submission History XML, so they can be navigated through the Submission Navigator view. In this way, the Submission Navigator provides a consolidated view across all of the M1-M5 eCTD modules, even though the M1 module is actually defined as a separate XML file.

For example, the following settings are preconfigured in the schema configuration object named
“us-regional_2-01”, which can be found in the /System/SSV/XML Schemas folder, and is used to generate
previews of “us-regional.xml” files:

XML Schema Configuration Settings: us-regional_2-01
• contains_envelope: T
• xml_envelope_element: admin
• xml_envelope_stylesheet: us-regional.xsl
• contains_leaf_docs: T
• xml_extract_leaf_docs_from: /fda-regional/m1-regional

This tells the system to extract the XML metadata from the admin element of us-regional.xml
files below the root element, fda-regional, to generate the xml_preview rendition, and extract
the leaf document elements from the m1-regional element below the root element into the main
submission view. If the user clicks the us-regional.xml element in module 1 in the Submission
Viewer, the xml_preview rendition is rendered into HTML by the relevant stylesheet (in this case, the
“us-regional.xsl” stylesheet is used) and displayed in the document preview panel.
Note: The xml_preview rendition is generated and stored in the repository automatically at import
time. It is attached to the regional XML file as a rendition. This is used only for previewing the XML
metadata in Documentum Submission Store and View. The primary content of the regional XML file
is not affected, and preserves a record of the XML file that was included in the submission. If the
imported submission folder is exported back out to the local filesystem, only the original primary

content files are exported, and not the Documentum Submission Store and View renditions, so that
an exact copy of the submitted files is exported.
If a new region or eCTD version for an existing region is to be supported, a new XSL stylesheet for it
must be provided to enable the XML preview to be generated. These stylesheets must be installed
in the %WEBAPPS%/XMLViewer/style folder on the application server, with the predefined
stylesheets. In some cases, the XSL stylesheets provided by the relevant Authority itself can be used
directly, or adapted as necessary. For example, the standard eu-regional.xsl stylesheet provided with
Documentum Submission Store and View is based on the stylesheet provided by the EMA. However,
in other cases, a custom stylesheet will need to be developed, where the Agency does not provide one
(such as Canada), or where it is unsuitable for direct use, for example, it contains JavaScript or XSL
functions that refer to M1 leaf elements, which are not included in the XML preview (such as the US
FDA stylesheet). In these cases, you can use one of the standard stylesheets that are pre-installed with
Documentum Submission Store and View as a guideline for developing new stylesheets.
The following is a transcript of the us-regional.xsl script that is provided for this purpose:
<?xml version="1.0" encoding="UTF-8"?>
<!--U.S. Regional Stylesheet (us-regional.xsl) for previewing XML metadata in SSV-->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fda-regional="http://www.ich.org/fda">
<xsl:template match="/">
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8"/>
</head>
<body>
<h3>
<img src="icons/countryflags/us.gif" style="padding-right: 10px;"/>
US Food and Drugs Administration - Regulatory Information
</h3>
<table border="0" cellspacing="20px">
<tr><td>Applicant:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/company-name"/></td></tr>
<tr><td>Product Name:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
product-description/prod-name"/></td></tr>
<tr><td>Application Number:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
product-description/application-number"/></td></tr>
<tr><td>Application Type:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/@application-type"/></td></tr>
<tr><td>Submission Type:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/submission/@submission-type"/></td></tr>
<tr><td>Date of Submission (<xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/date-of-submission/date/@format"/>):</td><td>
<xsl:value-of select="/fda-regional:fda-regional/admin/
applicant-info/date-of-submission/date"/></td></tr>
<tr><td>Sequence Number:</td><td><xsl:value-of select="/fda-regional:fda-regional/admin/
application-information/submission/sequence-number"/></td></tr>
</table>
</body>
</html>
</xsl:template>

</xsl:stylesheet>

Note that this stylesheet is only used for previewing the XML metadata in Documentum Submission
Store and View: the standard agency-supplied regional stylesheets can still be included in the
eCTD submission itself (in the util folder) and used for navigating the submission when it has
been exported to the local file system.


XML Metadata Extraction


Submission-level metadata can be extracted from eCTD regional M1 XML files into Documentum,
in order to facilitate browsing and searching for submissions and regulatory applications. This
metadata is extracted from the envelope section of the XML file and stored in the corresponding
Documentum attributes of the top-level cd_submission_folder object representing the sequence folder
in the repository, the cd_reg_submission_info object representing the Submission Registration form,
and the cd_reg_admin_info object representing the Regulatory Application Registration Form for the
application. The following attributes are defined in the default installation for each of these object
types, for XML metadata extraction purposes:

• country_code (Char(2)): The ISO country code for the country to which the application is being submitted, for example, “ca” for Canada. For European submissions, the country code “eu” should be used for Centralized Procedure (CP) submissions to the EMA; otherwise the country code for the relevant target EU Member State or Reference Member State for National Procedure (NP) or Mutual Recognition Procedure (MRP) should be used, for example, “se” for Sweden. By convention, this is assumed to be the country code of the first “envelope” element in the EU Regional XML file.
• concerned_member_states (Char(2), repeating): ISO country codes for other EU Member States acting as reviewers of European submissions using the MRP. By convention, this is assumed to be the country codes for all except the first “envelope” element in the EU Regional XML file. Applies only to MRP submissions to the EU.
• dosage_form (Char(128), repeating): The forms in which a drug product is provided for a defined administration route.
• dosage_strength (Char(32), repeating): For each dosage form, specifies the amount of active ingredient in terms of a defined unit of measure, for example, the amount of active ingredient contained in one dose as specified in the label.
• drug_product_manufacturer (Char(128), repeating): The business entity or entities engaged in making, assembling, processing, or modifying a finished drug product; or packaging, repackaging, or otherwise changing the container or wrapper for a drug product.
• drug_product_name (Char(64), repeating): For each drug product manufacturer, the name of the manufactured drug product in each case.
• drug_substance_manufacturer (Char(128), repeating): The business entity or entities engaged in making, assembling, processing, or modifying the pharmaceutical ingredients used in a drug product.
• drug_substance_name (Char(64), repeating): For each drug substance manufacturer, the name of the manufactured drug substance in each case.
• excipients (Char(128), repeating): A list of excipients (inactive ingredients) used in the manufacture of a drug product.
• indication (Char(128), repeating): The indications (diseases) that a particular drug product is designed to treat.
• product_generic_name (Char(128), repeating): The unique non-proprietary or common name for a drug, defined by establishing logical nomenclature classifications based on pharmacological or chemical relationships and approved by USAN or INN.
• product_trade_name (Char(128), repeating): The proprietary name (brand name) of a drug product designated for regulatory approval/marketing.
• submission_type (Char(48)): The type of individual submission within the application, as defined by the relevant Agency.
Some of these attributes, such as region, health_authority, submission_number, submission_type,
and submission_date, may be specified in the Import Submission dialog box at import time, or they
may be overridden with metadata extracted from the XML file, where defined. To enable XML
extraction for these attributes:
1. Ensure that the relevant attributes are defined for the cd_submission_folder and
cd_reg_admin_info object types in Documentum; add them to both object types if necessary.
2. In the relevant XML schema configuration object for the regional M1 XML file:
a. Enable the contains_envelope setting.
b. Set xml_envelope_element to the name of the “envelope” elements containing the regional
metadata in the XML file. In most cases there will be only one such element; however, EU
regional XML files for submissions using the MRP contain multiple “envelope” elements
– one for each EU Member State to which the submission is sent. In that case, the first
“envelope” element is assumed to pertain to the Reference Member State, and the remainder
to Concerned Member States.
c. Specify the attributes to be extracted in the xml_envelope_attrs repeating
attributes of the XML schema configuration object, using values of the form
“<Documentum-attribute-name>=<XML-XPath-expression>”.
The specified XPath expression is evaluated for each envelope element in the regional XML
file and the first defined, non-null value found is used. In the case of repeating Documentum
attributes, the values in each envelope element are combined into a set of distinct values. For
instance, in the EU regional XML files, the submission type is defined in the type attribute of
the <submission> sub-element below the <envelope> element:
<envelope>
<submission type=”initial-maa”/>
</envelope>

To extract this value into Documentum, the xml_envelope_attrs repeating attribute settings
for the EU regional schemas contain the following entry:
submission_type=submission/@type

In this case, the extracted value is initial-maa. It is possible to convert the extracted values
using attribute expressions or through an XSLT transformation script. See Mapping XML
Values to Documentum Attributes, page 347 for details.
d. For each of the above, set the corresponding flags in the is_required_envelope_attr and
override_env_attrs_on_rarf repeating attributes to T (true) or F (false), accordingly.
If an attribute is marked as “required” in the XML envelope but is missing from the XML file or has a
blank value specified for it in the XML file, this is logged as a warning. If the “override on RARF”
setting is enabled, the extracted attribute is copied to the relevant Regulatory Application Registration
Form as well as the submission folder, even if it is already defined at that level. If this flag is disabled,
the extracted attribute is only copied to the Regulatory Application Registration Form if the form
itself does not already have a defined value, that is, it is used as a default value for the Regulatory
Application Registration Form in that case.
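For illustration only, the following shows a hypothetical alignment of these repeating attributes for one extracted attribute (index 0); the values are examples, not shipped defaults:

xml_envelope_attrs[0]           = submission_type=submission/@type
is_required_envelope_attr[0]    = T
override_env_attrs_on_rarf[0]   = F

With these settings, a missing or blank submission type in the regional XML file is logged as a warning at import time, and the extracted submission type is copied to the Regulatory Application Registration Form only if the form does not already define a value.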


Example 1: US Submission to the FDA


The following code shows part of a sample us-regional.xml file included in module M1 of an eCTD
submission to the US FDA:
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE fda-regional:fda-regional SYSTEM "..\..\util\dtd\us-regional-v2-01.dtd">
<?xml-stylesheet type="text/xsl" href="..\..\util\style\us-regional.xsl"?>
<fda-regional:fda-regional xmlns:fda-regional="http://www.ich.org/fda" xmlns:
xlink="http://www.w3c.org/1999/xlink" dtd-version="2.01" xml:lang="en">
<admin>
<applicant-info>
<company-name>Acme Pharmaceuticals Inc.</company-name>
<date-of-submission>
<date format="yyyymmdd">20141205</date>
</date-of-submission>
</applicant-info>
<product-description>
<application-number>123456</application-number>
<prod-name type="proprietary">Wonder Drug</prod-name>
</product-description>
<application-information application-type="ind">
<submission submission-type="original-application">
<sequence-number>0000</sequence-number>
</submission>
</application-information>
</admin>
<m1-regional>
…etc.
</m1-regional>
</fda-regional:fda-regional>

The standard XML schema configuration object for the above is called “us-regional_2-01”, and it
has the following predefined settings:

Setting Name and Value(s) (the Documentum attribute value extracted from the example XML file, if any, is shown in parentheses):

xml_envelope_element: admin
xml_envelope_attrs:
  country_code='us' (extracted value: us)
  health_authority='FDA' (extracted value: FDA)
  submission_date=./applicant-info/date-of-submission/date (extracted value: 20141205)
  submission_date_format=./applicant-info/date-of-submission/date/@format (extracted value: yyyymmdd)
  application_num=./product-description/application-number (extracted value: 123456)
  application_type=./application-information/@application-type (extracted value: ind)
  submission_number=./application-information/submission/sequence-number (extracted value: 0000)
  submission_type=./application-information/submission/@submission-type (extracted value: original-application)
  product_trade_name=./product-description/prod-name[@type='proprietary'] (extracted value: Wonder Drug)
  product_generic_name=./product-description/prod-name[@type='established'] (no value extracted)

In the preceding settings, XPath expressions are used to extract the relevant metadata from the
admin node of the XML file. For more information on using XPath expressions, refer to:
http://www.w3schools.com/XPath/.

Example 2: EU Submission to Multiple EU Countries


This next example is for an eu-regional.xml file included in a submission to three EU Member States
(Sweden, Germany, and France) using the MRP. It has three envelope sections, one for each country:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE eu:eu-backbone SYSTEM "../../util/dtd/eu-regional.dtd">
<?xml-stylesheet type="text/xsl" href="../../util/style/eu-regional.xsl"?>
<eu:eu-backbone xmlns:eu="http://europa.eu.int" xmlns:xlink="http://www.w3c.org/1999/xlink"
dtd-version="1.2.1">
<eu-envelope>
<envelope country="sv">
<application>
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>SV-MPA</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Wonder Drug</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Submission of registration dossier</submission-description>
</envelope>
<envelope country="de">
<application>
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>DE-BfArM</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Wunderdroge</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Einreichung des Registrierungsdossiers </submission-description>
</envelope>
<envelope country="fr">
<application>
<number>pending</number>
</application>
<applicant>ACME Pharmaceuticals Inc.</applicant>
<agency-name>FR-AFSSAPS</agency-name>
<atc>C10AB05</atc>
<submission type="initial-maa" />
<procedure type="mrp" />
<invented-name>Médicament miracle</invented-name>
<inn>cureimol monosulphate</inn>
<sequence>0000</sequence>
<submission-description>Présentation d'un dossier d'inscription</submission-description>
</envelope>
</eu-envelope>
<m1-eu>
…etc.
</m1-eu>
</eu:eu-backbone>

In this case, since the XML file contains multiple envelope elements, multiple values can be
extracted for repeating attributes, such as concerned_member_states, product_trade_name, and
product_generic_ name. Duplicate values are ignored. The first envelope element pertains to the
Reference Member State, and the others pertain to Concerned Member States (this only applies to EU
submissions using the MRP; otherwise there would be only one envelope element). The standard
XML schema configuration object for the above is “eu-regional_1-2-1.xml” in this case, and has the
following predefined settings:

Setting Name and Value(s) (the Documentum attribute value extracted from the example XML file, if any, is shown in parentheses):

xml_envelope_element: envelope
xml_envelope_attrs:
  country_code=@country (extracted value: sv)
  concerned_member_states=.[position()>1]@country (extracted values: de; fr)
  application_number=application/number (extracted value: pending)
  health_authority=agency-name (extracted value: SV-MPA)
  atc_code=atc (extracted value: C10AB05)
  submission_type=submission/@type (extracted value: initial-maa)
  submission_procedure_type=procedure/@type (extracted value: mrp)
  product_trade_name=invented-name (extracted values: Wonder Drug; Wunderdroge; Médicament miracle)
  product_generic_name=inn (extracted value: cureimol monosulphate)
  sequence_number=sequence (extracted value: 0000)
  title=submission-description (extracted value: Submission of registration dossier)

Note the use of the [position()>1] condition in the XPath expression for concerned_member_states,
which means that all country codes except that of the first envelope element are extracted
and combined together into the concerned_member_states repeating attribute. In the case of
single-valued attributes such as country_code, application_number, and title, where there are
multiple envelope elements, the first element that has a defined, non-null value takes precedence;
usually those that are specified in the first envelope element. Therefore, in this example, country_code
is set to sv (the code for Sweden, as specified in the country attribute of the first envelope element)
and title is set to “Submission of registration dossier” (the first title value; the German and French
title values are ignored in this case, because title is a single-valued attribute in Documentum).
Submission-level metadata is also recorded in the Submission History XML file for each submission,
enabling this metadata to be displayed by the Documentum Submission Store and View Viewer. By
default, the following attributes are recorded for each submission or eCTD sequence:
• submission_number—Recorded in the id XML attribute.
• submission_date—Recorded in separate day, month, and year XML attributes, in order to
facilitate localization.
• submission_type–Recorded in the type XML attribute.
• submission_procedure_type—Recorded in the procedure-type XML attribute (applies to EU
submissions only).

• health_authority—Recorded in the health-authority XML attribute.


• concerned_member_states—Recorded in the concerned-member-states XML attribute (applies to
EU MRP submissions only).
In the standard XSL stylesheet, which is provided as part of the Documentum Submission Store
and View web application (submission-navigator-4.0.xsl), these attributes are listed in the Table of
Contents (TOC) view when an application is selected. You can extend the standard XSL stylesheet to
include additional custom submission-level metadata in this view. However, only metadata that is
recorded in the Submission History XML file can be displayed in this view. To arrange for custom
metadata to be recorded in the Submission History XML file at import time, you can change the D2
lifecycle configuration settings for the Regulatory Application Registration Form lifecycle to include a
–record_submission_attrs argument, overriding the default settings.
Note: Including submission-level metadata in the Submission History XML file that can change
after the submission has been imported is not recommended. This would mean that the values
recorded in the XML file may be different to the corresponding values in Documentum, leading to
inconsistencies between the TOC view in the Submission Navigator and the Submission Registration
Form properties page in Documentum. For this reason, properties such as submission_status are
omitted from the Submission History XML file in the default installation. However, it is safe to
include properties that are read-only after importing.
Document-level metadata can also be extracted for each <leaf> element, from both regional
M1 XML files and ICH M2-M5 eCTD backbone files, and populated on the corresponding
cd_submission_element objects representing the archived submission documents in Documentum.
These attributes are defined in the xml_leaf_doc_attrs setting of the ICH eCTD XML Backbone
schema configuration object. The default settings for this are as follows:
• country_code=@country
• dosage_form=ancestor::*[@dosageform != '']/@dosageform
• drug_product_manufacturer=ancestor::*[@manufacturer != '' and @product-name !=
'']/@manufacturer
• drug_product_name=ancestor::*[@manufacturer != '' and @product-name != '']/@product-name
• drug_substance_manufacturer=ancestor::*[@manufacturer != '' and @substance !=
'']/@manufacturer
• drug_substance_name=ancestor::*[@manufacturer != '' and @substance != '']/@substance
• ectd_application_version=@application-version
• ectd_checksum_type=@checksum-type
• ectd_checksum=@checksum
• ectd_element_name=../name()
• ectd_operation=@operation
• ectd_prev_submitted_file=@modified-file
• excipients= ancestor::*[@excipients != '']/@excipient
• indication=ancestor::*[@indication != '']/@indication
• title=title

Note that some of these attributes, such as dosage_form, drug_product_manufacturer, and so on, are
defined in the parent XML elements containing the <leaf> element, which is why the ancestor XPath
expression function (or axis) is used for those items.
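For illustration, the following simplified fragment (element names and file paths are illustrative only, not taken from a real backbone file) shows how these ancestor-based expressions resolve for a leaf document nested below a drug product element:

<m3-2-p-drug-product product-name="Wonder Drug" manufacturer="Acme Pharmaceuticals Inc." dosageform="tablet">
  <m3-2-p-1-description-and-composition>
    <leaf operation="new" xlink:href="32p-drug-prod/description.pdf" xlink:type="simple">
      <title>Description and Composition</title>
    </leaf>
  </m3-2-p-1-description-and-composition>
</m3-2-p-drug-product>

With the default settings above, the cd_submission_element object created for this leaf would have drug_product_name set to “Wonder Drug”, drug_product_manufacturer set to “Acme Pharmaceuticals Inc.”, and dosage_form set to “tablet”, because the ancestor:: expressions walk up from the <leaf> element until they find an ancestor element carrying the relevant attributes.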

Mapping XML Values to Documentum Attributes


To address the issue of importing a submission that contains XML metadata that does not match the
metadata in Documentum, it is possible to transform the XML before it is processed so that the
extracted XML values are mapped to the corresponding Documentum attributes. To enable this, an
optional D2 dictionary, Submission Regional Metadata Conversions, has been provided.
The keys in this dictionary are of the syntax, "<attribute-name>.<XML-value>", for example,
"health_authority.ukmhra". The Converted Value alias specifies the value to be used in Documentum
Submission Store and View, for example, “MHRA”. This enables Documentum Submission Store and View
to process regional XML files that use different data encodings correctly.
It is also possible to use CDF attribute expressions in the XML schema "attributes" configuration
settings, using the syntax:
<attribute-name>=<XPathExpression> => <attribute-expression>

For example,
health_authority=submission/@agency-code=>$upper($arg(xmlvalue))

In the example, the expression converts the agency code attribute in the regional XML file into
uppercase. The $arg(xmlvalue) function refers to the raw extracted XML value.
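For illustration, assuming a hypothetical dictionary entry (the key and alias shown are examples only, not part of the default dictionary):

Key: health_authority.SV-MPA
Converted Value alias: Swedish MPA

an EU regional XML file containing <agency-name>SV-MPA</agency-name> (as in Example 2 above) would then have its extracted health_authority value converted to “Swedish MPA”.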

Worked Example: Extending LSSSV to Support the US Regional 2.3 eCTD Format (DTD Version 3.3)

In June 2015, the US FDA announced the acceptance of a new regional eCTD M1 format, version 2.3,
which uses a revised XML Document Type Definition (DTD) version 3.3. This format is supported in
Documentum Submission Store and View as standard, except for grouped applications (in which
multiple related sequences are included in one submission relating to different drug products).
You can use the following steps as a guide to extend the Documentum Submission Store and
View configuration to support this new format and to extend the configuration further to support
additional regions or new DTD formats for existing regions:
1. Download the regional eCTD XML specification and stylesheet from the relevant Agency
website and obtain a sample XML in the relevant DTD format. For example, the US
regional eCTD M1 specification and stylesheet can be downloaded from the FDA website
(http://www.fda.gov/Drugs/DevelopmentApprovalProcess/FormsSubmissionRequirements
/ElectronicSubmissions/ucm153574.htm).
A transcript of a sample us-regional.xml file using the DTD 3.3 format:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE fda-regional:fda-regional SYSTEM "http://www.accessdata.fda.gov/static/eCTD/
us-regional-v3-3.dtd">
<?xml-stylesheet type="text/xsl" href="http://www.accessdata.fda.gov/static/eCTD/
us-regional.xsl"?>

<fda-regional:fda-regional dtd-version="3.3" xml:lang="text" xmlns:fda-regional=
"http://www.ich.org/fda" xmlns:xlink="http://www.w3c.org/1999/xlink">
<admin>
<applicant-info>
<id>123456789</id>
<company-name>Wonder Drug Company</company-name>
<submission-description>Pre-Launch Video Sample</submission-description>
<date-of-submission>
<date format="mmddyy">062315</date>
</date-of-submission>
<applicant-contacts>
<applicant-contact>
<applicant-contact-name applicant-contact-type="fdaact4">Regulatory Manager X
</applicant-contact-name>
<telephones>
<telephone telephone-number-type="fdatnt1">1-123-456-7890</telephone>
</telephones>
<emails>
<email>[email protected]</email>
</emails>
</applicant-contact>
<applicant-contact>
<applicant-contact-name applicant-contact-type="fdaact1">Regulatory Manager Y
</applicant-contact-name>
<telephones>
<telephone telephone-number-type="fdatnt1">1-123-456-7891</telephone>
</telephones>
<emails>
<email>[email protected] </email>
</emails>
</applicant-contact>
</applicant-contacts>
</applicant-info>
<application-set>
<application application-containing-files="true">
<application-information>
<application-number application-type="fdaat1">123456</application-number>
</application-information>
<submission-information>
<submission-id submission-type="fdast8">0000</submission-id>
<sequence-number submission-sub-type="fdasst1">0000</sequence-number>
</submission-information>
</application>
</application-set>
</admin>
<m1-regional>
<m1-2-cover-letters>
<leaf operation="new" xlink:href="102-cover-letters/cover.pdf" xlink:type="simple"
checksum-type="md5" ID="id123" application-version="PDF 1.4" checksum=
"8241a7c5bb98dee28b1385dc261345b2">
<title>Cover Letter - Sample</title>
</leaf>
</m1-2-cover-letters>
<m1-15-promotional-material promotional-material-audience-type="fdapmat1">
<m1-15-2-materials promotional-material-doc-type="fdapmdt2">
<m1-15-2-1-material promotional-material-type="fdapmt28" material-id="A">
<m1-15-2-1-1-clean-version>
<leaf operation="new" xlink:href="115-promo-material/1152-materials/a/
tvc-storyboard-0814.pdf" xlink:type="simple" checksum-type="md5" ID="ddjwv124" application-
version=
"PDF 1.4" checksum="8241a7c5bb98dee28b1385dc261345b2">
<title>TV Storyboard</title>
</leaf>
<leaf operation="new" xlink:href="115-promo-material/1152-materials/a/
tv-ad-video-draft.wmv" xlink:type="simple" checksum-type="md5" ID="ddjwv1124" application-

348
Regulatory Submissions

version=
"PDF 1.4" checksum="a28a808f9b7cb01dee5e11779a70c51c">
<title>Proposed Television Advertisement</title>
</leaf>
</m1-15-2-1-1-clean-version>
<m1-15-2-1-2-annotated-version>
<leaf operation="new" xlink:href="115-promo-material/1152-materials/a/
annotated-tvc-storyboard-0814.pdf" xlink:type="simple" checksum-type="md5" ID="ddjwv125"
application-version="PDF 1.4" checksum="8241a7c5bb98dee28b1385dc261345b2">
<title>Annotated TV Storyboard</title>
</leaf>
</m1-15-2-1-2-annotated-version>
</m1-15-2-1-material>
</m1-15-2-materials>
</m1-15-promotional-material>
</m1-regional>
</fda-regional:fda-regional>

2. Formulate an XPath expression that can be used to identify the new XML files and distinguish
them from other XML files, including other versions of the regional XML file for the same region.
In this example, a suitable XPath expression is:
/fda-regional[@dtd-version='3.3']

This XPath expression only matches XML files with a root element named
fda-regional (ignoring the fda-regional: namespace prefix) and a dtd-version attribute on that
root element with the value 3.3. This is used to set the xpath_qualifier setting in the new eCTD
XML schema configuration object, so that it only applies to US Regional v 2.3 XML files.
Note: The DTD version number is specified as 3.3 within the XML file, and not 2.3. In addition,
the XML namespace prefix, fda-regional:, should not be included in the XPath expression.
Otherwise, it will not be evaluated correctly.
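To make the note concrete, the first of the following qualifiers is the form used for this schema; the second, prefixed form would not be evaluated correctly and must not be used:

/fda-regional[@dtd-version='3.3']
/fda-regional:fda-regional[@dtd-version='3.3']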
3. Determine whether or not an XSL preprocessing transformation script needs to be applied to the
XML file to convert it into a form that can be processed by Documentum Submission Store and
View. Develop one if necessary. See Processing Standard XML Files, page 329 and Transforming
Non-Standard XML Files, page 332 for more information.
In this case, the XML conforms to the standard eCTD format (apart from the fact that it contains
FDA-specific metadata in the <admin> section), so it would appear that an XSL transformation
script is not required and Documentum Submission Store and View can process these XML files
directly. However, special encodings are used to denote the metadata in the XML file, instead of
literal values: for example,
• applicant-contact-type="fdaact4"
• telephone-number-type="fdatnt1"
• application-type="fdaat1"
• submission-type="fdast8"
• submission-sub-type="fdasst1"
It is necessary to convert these encodings into meaningful values in the XML preview for display
purposes. For example, according to the US FDA guidance documents, the applicant contact
type code “fdaact4” should be displayed as “Promotional Labeling and Advertising Regulatory
Contact”. One way to convert XML values is to add entries to the D2 value mapping dictionary,
Submission Regional Metadata Conversions. However, this only applies to XML metadata
extracted from the XML file into Documentum. Moreover, some of the XML metadata, such as

349
Regulatory Submissions

applicant contact type codes, are not extracted into Documentum attributes at all, as they are not
part of the standard object model.
An XSL transformation script can be used to convert the encoded values into literal values in the
XML preview used in the Submission Viewer. A partial transcript of this script is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:transform version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ectd=
"http://www.ich.org/ectd"xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xlink-ectd=
"http://www.w3c.org/1999/xlink">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>

<xsl:variable name="APPLICATIONCONTACTTYPES" select="concat(
'fdaact1=Regulatory', ';',
'fdaact2=Technical', ';',
'fdaact3=US Agent', ';',
'fdaact4=Promotional / Advertising', ';'
)"/>

<!-- Default rule to copy all XML elements unchanged -->
<xsl:template match="@* | node()">
<xsl:copy>
<xsl:apply-templates select="@* | node()"/>
</xsl:copy>
</xsl:template>

<!-- Translate contact type codes -->
<xsl:template match="//@applicant-contact-type">
<xsl:attribute name="applicant-contact-type">
<xsl:value-of select="substring-before(substring-after($APPLICATIONCONTACTTYPES,
concat(., '=')), ';')"/>
</xsl:attribute>
</xsl:template>

</xsl:transform>

A copy of the complete XSL transformation script can be found on the application server, in the
%WEBAPPS%/style/us-regional_2-3-normalize.xslt file. This file is provided for
reference purposes only and is not used by Documentum Submission Store and View itself.
In the preceding sample script, the application contact type encoding mappings to corresponding
display labels are defined in the XSL variable “APPLICATIONCONTACTTYPES” as a series of
name=value pairs delimited by semicolon characters. A default XSL processing rule is then
defined, which applies to all XML nodes except those that match other, more specific rules. This
copies the XML elements, attributes, and textual data values to the output XML file unchanged.
Then, a rule is defined for converting the “applicant-contact-type” attribute values (in any
XML element) from the encoded forms into the corresponding display values. For example,
the element:
<applicant-contact-name applicant-contact-type="fdaact4">Regulatory
Manager X</applicant-contact-name>
in the input XML file is converted to:
<applicant-contact-name applicant-contact-type="Promotional /
Advertising">Regulatory Manager X</applicant-contact-name>
in the output XML file. This enables the Submission Viewer to display applicant contact type
codes as meaningful values in the XML preview. Similar variables and XSL processing rules for
the other FDA value encodings can be defined in the same way, for converting application type
codes, submission type codes, submission sub-type codes, and so on.
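For example, a similar rule for submission type codes might look like the following sketch (the variable name and the display labels are placeholders; take the actual label for each code from the FDA guidance documents):

<xsl:variable name="SUBMISSIONTYPES" select="concat(
'fdast1=Submission Type 1 label', ';',
'fdast8=Submission Type 8 label', ';'
)"/>

<!-- Translate submission type codes -->
<xsl:template match="//@submission-type">
<xsl:attribute name="submission-type">
<xsl:value-of select="substring-before(substring-after($SUBMISSIONTYPES, concat(., '=')), ';')"/>
</xsl:attribute>
</xsl:template>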


Another issue with the US regional XML files is that, according to the FDA regional M1 eCTD
specification, certain documents such as administrative forms can appear in the <admin> section,
as well as in the <m1-regional> section. However, Documentum Submission Store and View
requires all of the leaf documents to be encapsulated within a single XML element, separate to the
XML metadata section (the <admin> element, in this case) so that they can be extracted into the
consolidated module m1-m5 submission tree view. To address this, an additional rule can be
added to the XSL transformation script to copy the <form> elements to a dummy section, named
<m1-1-admin-forms>, below the <m1-regional> node in the output XML file, as follows:
<xsl:template match="//m1-regional">
<xsl:element name="m1-regional">
<xsl:element name="m1-1-admin-forms">
<xsl:for-each select="//form">
<xsl:element name="form">
<xsl:attribute name="form-type">
<xsl:value-of select="substring-before(substring-after($FORMTYPES, concat(@form-type,
'=')), ';')"/>
</xsl:attribute>
<xsl:copy-of select="node()"/>
</xsl:element>
</xsl:for-each>
</xsl:element>
<xsl:apply-templates/>
</xsl:element>
</xsl:template>

These forms are then extracted along with the other <m1-regional> leaf documents, appearing in the
“m1-1-admin-forms” section in the submission tree view. (The form type codes are also converted
into meaningful display values as part of this process, using the technique described previously.)
It is possible to test a new XSL transformation script before installing it by linking a sample XML
file to it. To do this, copy the sample XML file and XSL script to a temporary folder in the local
file system, and open the sample XML file in Notepad or any other XML Editor. Insert an XML
processing instruction into the XML sample file, immediately after the XML header, to link it
to the XSL stylesheet script file, as follows:
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="us-regional-2-3-normalize.xslt"?>
<fda-regional:fda-regional xmlns:fda-regional="http://www.ich.org/fda" xmlns:xlink=
"http://www.w3c.org/1999/xlink" dtd-version="2.01" xml:lang="en">

</fda-regional:fda-regional>

In this example, the XSL script file to be tested is named “us-regional-2-3-normalize.xslt”,
and must reside in the same folder as the XML file itself – a relative path to a script file in
a sub-folder can be specified if preferred. (If the sample XML file already contains an XML
stylesheet processing instruction, change the file name or URL to refer to the local XSL script
file in the href parameter).
Save the changes, then use the Open with menu option in Windows Explorer to open the
XML file in your preferred HTML browser such as Internet Explorer. The transformed output
appears as garbled text in the main browser window because it is not formatted as an
HTML document. However, the transformed XML output can be navigated via the browser’s
Developer Tools view, in the HTML tab.


In the Developer Tools HTML window, it is possible to drill down into the transformed XML
output (which is itself an XML document, and not an HTML document in this case) to verify that
the XML structure has been passed through correctly and the encoded attribute values have been
translated into the corresponding display values. If the XML output appears not to have been
transformed, check the Console tab for errors and make sure the stylesheet script file name in
the sample XML file is specified correctly in the XML header.
When the XSL stylesheet is working correctly, it can be installed as the primary content file
of the new schema configuration object and the enable_xslt_preprocessing option turned on
for that schema. Documentum Submission Store and View then downloads and applies the
transformation script automatically to the relevant XML files at import time, prior to any further
processing, such as XML metadata extraction or extraction of module 1 leaf documents into
the submission tree view. Note that this process does not alter the original regional XML file
included in the m1 section of the eCTD submission: instead, the original XML file (the input
file) is saved as the primary content of the regional XML document in the repository, and the
transformed XML output file is saved as a separate xml_preview rendition of this document. This
rendition is only used internally by Documentum Submission Store and View to generate the
XML preview when the regional XML file is selected in the Submission Viewer web application.
If the imported submission files are subsequently exported from Documentum back out to the
local file system, only the original primary content files are exported. Therefore, in this case, the
original regional XML file that was submitted to the FDA is exported, containing the original
encoded values, and not the transformed version.
4. Identify the top-level XML elements containing:
a. the regional XML metadata, and
b. the leaf document references.
In this case, the regional XML metadata is contained in the <admin> element and the leaf
document references reside in the <m1-regional> element (including the admin forms after
transformation, as described previously). These are recorded in the xml_envelope_element and
xml_extract_leaf_docs_from configuration settings in the new schema configuration object,
accordingly.
5. Determine the XML metadata that can be usefully extracted from the regional XML file into
Documentum. Formulate XPath expressions to extract them in each case. These will be used to
set up the xml_envelope_attrs repeating attribute of the schema configuration object. In this case,
from the FDA eCTD regional M1 XML specification, it can be determined that the following
metadata can be extracted:

Documentum Attribute: XPath Expression (Relative to the <admin> Node)

submission_date: applicant-info/date-of-submission/date
submission_date_format*: applicant-info/date-of-submission/date/@format
application_num: application-set/application/application-information/application-number
application_type: application-set/application/application-information/application-number/@application-type
submission_number: application-set/application/submission-information/sequence-number
submission_type: application-set/application/submission-information/submission-id/@submission-type
* The submission_date_format attribute is not defined explicitly in the Documentum object
model, but it is recognized by Documentum Submission Store and View as a special case. This
is used to define the format in which the submission date value is specified in the XML file.
Documentum Submission Store and View uses this to convert the specified date value in the
XML into a standard date format so that the correct date value is stored in the submission_date
attribute in Documentum.
As part of this process, it may be necessary to define attribute expressions or D2 dictionary
mappings to be applied as the metadata is extracted (see Mapping XML Values to Documentum
Attributes, page 347 for details). In this case, the XSL pre-processing transformation script is
responsible for converting the metadata. Additional data conversion is not required.
6. Devise an XSL preview stylesheet for converting the XML file into HTML. Note that this is a
separate stylesheet from the XSL preprocessing transformation stylesheet. If used, the preprocessing
transformation stylesheet converts the input XML file (the regional XML file) into another XML
file that can be processed by Documentum Submission Store and View, whereas the preview
stylesheet converts the XML file (transformed if necessary) into HTML. This is used to generate
the XML preview in the Submission Viewer.
In some cases, an Agency-supplied stylesheet can be adapted for this purpose or else one of the
existing LSSSV preview stylesheets can be copied and adapted to suit the new XML format.
These can be found in the %WEBAPPS%/XMLViewer/style folder in the repository. When
developing or adapting stylesheets for previewing in Documentum Submission Store and View, the
following points should be noted:
a. The stylesheet only needs to render the “envelope” section of the XML file, containing the
regional XML metadata. The leaf document references will be extracted from the XML file
into the main submission tree view to combine them with the main eCTD backbone file
references for modules 2-5.
b. The HTML page should include a top-level <div class="envelope"> element in the HTML
body, in order to provide scroll bars in the Submission Viewer document preview panel.
See the following example.
c. Predefined icons for various country flags are provided as part of the Documentum
Submission Store and View XMLViewer web application and can be added to the title bar if
required. See the following example.
<?xml version="1.0" encoding="UTF-8"?>
<!-- U.S. Regional Stylesheet (us-regional.xsl) for previewing XML metadata in SSV -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:
fda-regional="http://www.ich.org/fda">
<xsl:template match="/">
<html> <head> <meta http-equiv="content-type" content="text/html; charset=UTF-8"/> </head>
<body>
<div class="envelope">
<h3> <img src="icons/countryflags/us.gif" style="padding-left: 10px; padding-right:
10px;"/>US Food and Drugs Administration - Regulatory Information</h3>
<!-- Application summary info table -->
<table border="0" cellspacing="20px">
<tr><th align="left">Applicant ID:</th><td colspan="2"><xsl:value-of select=
"/fda-regional:fda-regional/admin/applicant-info/id"/></td></tr>
<tr><th align="left">Company Name:</th><td colspan="2"><xsl:value-of select=
"/fda-regional:fda-regional/admin/applicant-info/company-name"/></td></tr>
<tr><th align="left">Submission Description:</th><td colspan="2"><xsl:value-
of select="/fda-regional:fda-regional/admin/applicant-info/submission-description"/></td>
</tr>
</table>
<p/>
<!-- Contact info table -->
<table width="100%" border="1" bordercolor="#564742">
<th width="150px" bgcolor="#FEF4F0" style="color=#333333">Contact Information
</th>
<td>
<xsl:for-each select="/fda-regional:fda-regional/admin/applicant-info/
applicant-contacts/applicant-contact">
<table width="100%" border="0" cellpadding="2">
<tr> <td style="border:0;"><strong><xsl:value-of select="applicant
-contact-name"/>
</strong> - <xsl:value-of select="applicant-contact-name/@applicant-contact-type"/>
</td> </tr>
<xsl:for-each select="telephones">
<tr> <td style="border:0;">Tel.: <font color="red"><xsl:value
-of select="telephone"/>
</font> - <xsl:value-of select="telephone/@telephone-number-type"/></td> </tr>
</xsl:for-each>
<tr> <td style="border:0;">E-mail: <a href="mailto:{emails/email
[1]}?subject={concat(/fda-regional:fda-regional/admin/application-set/application[1]
/application-information/application-number/@application-type, ' application ',
/fda-regional:fda-regional/admin/application-set/application[1]/application-
information/application-number,' - ', /fda-regional:fda-regional/admin/applicant-info/
submission-description)}"><xsl:value-of select="emails/email[1]"/></a></td> </tr>
<xsl:if test="position()!=last()"> <tr> <td style="border:0;">
<hr/> </td>
</tr> </xsl:if>
<xsl:if test="position()=last()"> <tr> <td style="border:0;">
<p/> </td>
</tr> </xsl:if>
</table>
</xsl:for-each>
</td>
</table>
</p>
<!-- Application info table -->
<table width="100%" border="1" bordercolor="#564742">
<th width="150px" valign="middle" bgcolor="#FEF4F0" style="color=#333333">
Application Information</th>
<td>
<table width="100%" border="0">
<xsl:for-each select="/fda-regional:fda-regional/admin/application-set/application">
<tr><td><table width="100%" border="0" bordercolor="#564742" cellpadding="2">
<xsl:choose>
<xsl:when test="@application-containing-files='true' or @application-containing
-files='false'">
<tr> <td style="border:0;">Application Containing Files: <xsl:value-of select=
"@application-containing-files"/></td> </tr>
</xsl:when>
<xsl:otherwise>
<tr> <td style="border:0;">Application Containing Files: This
value must be set to either true or false.</td> </tr> </xsl:otherwise>
</xsl:choose>
<tr> <td style="border:0;">Application Type: <xsl:value-of select="descendant::
application-number/@application-type"/></td></tr>
<tr><td style="border:0;">Application Number: <xsl:value-of select="descendant::
application-number"/></td></tr>
<tr><td style="border:0;">Submission Type: <xsl:value-of select="descendant::
submission-id/@submission-type"/></td></tr>
<xsl:if test="(string-length(/.//submission-id/@supplement-effective-date-type)
&gt; 0) and (.//submission-id/@supplement-effective-date-type != ' ')">
<tr><td style="border:0;">Supplement Effective Date: <xsl:value-of select=".//
submission-id/@supplement-effective-date-type"/></td></tr>
</xsl:if>
<tr><td style="border:0;">Submission Id: <xsl:value-of select="descendant::
submission-id"/></td> </tr>
<tr><td style="border:0;">Submission Sub-Type: <xsl:value-of select=".//
sequence-number/@submission-sub-type"/></td></tr>
<tr><td style="border:0;">Sequence #: <xsl:value-of select='substring
(format-number(descendant::sequence-number, "0000"),1, 4)'/></td></tr>
<xsl:if test="(string-length(/descendant::cross-reference-application
-number) &gt; 0) and (descendant::cross-reference-application-number != ' ')">
<xsl:for-each select=".//cross-reference-application-number">
<tr><td style="border:0;">Cross Reference Number:
<xsl:value-of select="."/></td></tr>
<tr><td style="border:0;"><xsl:value-of select="@application-type"/>
</td></tr>
</xsl:for-each>
</xsl:if>
<xsl:if test="position()!=last()"> <tr><td style=
"border:0;"> <hr/></td></tr> </xsl:if>
<xsl:if test="position()=last()"> <tr> <td style="border:0;"> <p/>
</td> </tr> </xsl:if>
</table>
</td>
</tr>
</xsl:for-each>
</table>
</td>
</table>
</div>
</body>
</html>
</xsl:template>
</xsl:stylesheet>


To test the preview stylesheet prior to installing it, link a sample XML file to it using the technique
described previously for preprocessing transformation stylesheets. For example:
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="us-regional-2-3.xsl"?>
<fda-regional:fda-regional xmlns:fda-regional="http://www.ich.org/fda" xmlns:xlink=
"http://www.w3c.org/1999/xlink" dtd-version="2.01" xml:lang="en">…
</fda-regional:fda-regional>

In this case, the preview stylesheet is named us-regional-2-3.xsl and is expected to reside in the
same folder as the XML document itself. When the XML sample file is opened in the browser,
the transformed HTML output should be displayed correctly.

Note that in this case, the encoded XML metadata values are displayed because the HTML output
is generated directly from the original XML file and not the pre-transformed XML file, but this still
enables the preview format to be verified. When the preview stylesheet is used in Documentum
Submission Store and View, it is applied to the pre-transformed XML file and shows the decoded values.
7. Install the XSL preview script in the %WEBAPPS%/XMLViewer/style folder on the D2 web
application server. If the stylesheet refers to external files such as images or auxiliary scripts,
make sure these are also installed in the appropriate locations. There is no need to restart the
application server.
Note: It is not necessary to install the new DTD file or stylesheet file provided by the Agency.
These files are included automatically by the eCTD publishing tool as part of the submission,
in the util/dtd and util/style folders below the eCTD sequence folder. Documentum
Submission Store and View imports these into the repository alongside the other submission
files, so that the entire sequence folder can be exported back out to the local file system from
Documentum if necessary. The DTD and stylesheets can then be used to validate and preview
the eCTD XML files in the local file system, outside of Documentum. However, they are not
processed or required by Documentum Submission Store and View itself.
8. To represent the new regional XML format, create a new XML schema configuration object of
type cd_ectd_xml_config in the /System/SSV/XMLSchemas folder in the repository. This is
not currently possible in D2 itself because a D2 creation profile is not defined for these objects.
However, a new schema configuration can be created in Documentum Administrator as follows:
a. Log into Documentum Administrator as the Documentum installation owner, for example,
dmadmin, with Superuser privileges in Documentum.
b. Navigate to the /System/SSV/XMLSchemas folder in the repository.
c. Select an existing XML schema configuration object that is most similar to the new schema,
for example, the us-regional_2-01 schema, or select an arbitrary schema.
d. Create a copy of this schema.
e. Modify the new schema properties as described in the following list (the example settings shown are for the US Regional 2.3 XML schema):

• Object Name: Specify a unique object name for this schema configuration object based on the standard naming conventions. Example: us-regional_2-3
• Title: Enter a descriptive summary of the XML files to which this schema applies. Example: eCTD US-FDA Regional XML File v 2.3 (DTD v 3.3)
• Origin URL: Specify the web site of the Agency from which the XML specification was obtained. Example: http://www.fda.gov/Drugs/DevelopmentApprovalProcess/FormsSubmissionRequirements/ElectronicSubmissions/ucm328835.htm
• XML Format Code: Specify an abbreviated format code for this XML file, displayed in the Submission Navigator bar when an eCTD sequence that uses this format is viewed. Example: US-2.3
• XPath Qualifier: Enter the XPath expression used to recognize these XML files, identified in step 2. Example: /fda-regional[@dtd-version='3.3']
• Schema Category: Denotes the type of this XML file (should be preset to the example value). Example: eCTD Regional XML File
• Enable XSLT Pre-processing: Enable (set to T) if an XSL pre-processing script is required for these XML files, as identified in step 3. Disable (set to F) if XSL pre-processing is not required. Example: T
• Preview Stylesheet: Specify the XSL file name used to convert the XML file (transformed if necessary) into HTML to render the XML preview in the Submission Viewer. Example: us-regional_2-3.xsl
• Preview Widget ID: Denotes the widget used for previewing this XML file in the Submission Viewer (should be preset to SSVLeafDocumentViewer or left blank). Example: SSVLeafDocumentViewer
• Contains Leaf Documents: Enabled (set to T) if the XML file contains leaf document references to be extracted; disabled (set to F) if it only contains regional metadata. Example: T
• Extract Leaf Documents: Whether or not the leaf document references should be extracted into module 1 of the main submission tree view, combined with modules 2-5 for the main eCTD backbone file. Example: T
• Extract Leaf Documents From: Specify the path of the top-level XML element containing leaf document references to be extracted (from the transformed output identified in step 4). Example: /fda-regional/m1-regional
• XML Leaf Document Element: Specify the name of the XML element used for leaf document references (identified in step 4; usually “leaf”). Example: leaf
• Contains Envelope: Enabled (set to T) if the XML file contains regional metadata to be extracted. Example: T
• XML Envelope Element: Specify the name of the top-level XML element containing the regional metadata to be extracted (identified in step 4). Example: admin
• XML Envelope Attributes: Specify the Documentum attribute names and corresponding XPath expressions to be used to extract the regional metadata from the envelope element(s), as identified in step 5. Example:
  submission_date=./applicant-info/date-of-submission/date
  submission_date_format=./applicant-info/date-of-submission/date/@format
  application_num=application-set/application/application-information/application-number
  application_type=application-set/application/application-information/application-number/@application-type
  submission_number=application-set/application/submission-information/sequence-number
  submission_type=application-set/application/submission-information/submission-id/@submission-type
• Required: For each preceding XML envelope attribute, specify T (enabled) if the attribute is mandatory, or F (disabled) if it is optional. If an attribute is marked as mandatory and there is no defined value in the regional XML file for it, or the specified value is blank, and a value is not specified explicitly in the SRF (or inherited from the RARF), a warning message is logged in the import log file at import time. Example: T
After the XML schema configuration properties have been set up correctly, click OK to save
the changes in Documentum.
f. If an XSL preprocessing script is to be applied, check out the new XML schema configuration
object, and use the check-in from file option to upload the preprocessing script to it from
the local file system as the primary content file. Select the XSL Stylesheet format for this file
(Documentum format code xsl) and use the Check-in as same version option.
Note: It is possible to check-in the changes as a new version of the schema configuration
object. Documentum Submission Store and View always uses the latest (“CURRENT”)
version in each case.
This completes the Documentum Submission Store and View configuration process and it is
now possible to import eCTD sequences using the new format. It is not necessary to restart the
Documentum Server, Java Method Server, or Application Server. Documentum Submission Store and
View should pick up the new XML schema configuration automatically on the next import cycle.

Non-eCTD Electronic Submission


Applications that do not adopt the eCTD format use a non-eCTD electronic submission (NeeS) format.
For these, each submission has a manually-specified, distinct submission number, and is assumed to
be a complete copy of the files sent to the authority in that submission. Incremental changes are not
supported by this model. After the first submission has been imported into Documentum Submission
Store and View, it is not possible to change the application from NeeS to eCTD format. If the Sponsor
decides to adopt the eCTD format midway through an application, a new application must be made.

Submission Filestore
Submissions are generated by some kind of publishing tool and stored in a network filestore that
is accessible to both the users and the Documentum Server. This enables submission folders to be
selected in the D2 browser, and the contents of those folders to be retrieved asynchronously through
the submission import process, which runs as a back-end server method on the Documentum Server
(Java Method Server).


The folder path from the Documentum Server to the submission filestore can be different to the folder
path from the desktop of the user to the same submission filestore. For example, if the Documentum
Server is running on Unix, the submission filestore path could be /mnt/ext/eSubmissions,
depending on where it is mounted, whereas on the desktop of the user, it could be a UNC path to
the relevant server, for example, \\FSVRM1\eSubmissions. Windows-mapped drives can also
be used for the client path, but the same drive letter must be mapped on each user desktop for the
path to be valid in all cases, for example, S:/eSubmissions. However, Windows-mapped drives
cannot be used for the Documentum Server path, because Windows does not allow them to be used
by processes running as a service, such as Documentum. A UNC path from the server is required.
If the Documentum Server is behind a firewall, it may be necessary to replicate the filestore at the
storage level, or through a sync-and-share utility, such as Syncplicity, so that the same content is
accessible from both sides. WRITE access is required to select the submission filestore because of a D2
limitation. Documentum Submission Store and View does not write to the submission filestore (write
access is only required for submission publishing). If necessary, enable sharing on the submission
filestore itself, so that the appropriate users (including the Documentum installation owner account
used by the Documentum Server, such as dmadmin) have at least Read access to the appropriate
folder. You may need to set up a local user account for dmadmin on the machine hosting the filestore,
so that the Documentum Server can connect to it as that user.
The Documentum for Life Sciences Solution Installation Guide provides the steps for registering
submission filestores.
For eCTDs, the user must select either an individual sequence folder, or a parent folder containing a
series of sequence folders (that is, an application-level folder), in which case those sequences that have
not already been imported are imported into the repository in increments. For NeeS, the selected
folder must be a submission-level folder to be imported in its entirety. Existing submission folders in
the repository are not updated. To replace existing submission folders in the repository, you must
delete the existing submission folder from the repository and reimport it from the external filestore.

Updating the D2 and XMLViewer URLs in D2 Dictionary

Follow these steps to update the D2 URL in the System Parameters D2 dictionary, in case it has been
changed.

1. Log in to D2-Config.
2. Select Data > Dictionary.
3. On the Dictionary page, select All elements in the filter on the toolbar.
4. In the Dictionaries list, select System Parameters.
5. On the Alias tab, replace the localhost value of the d2_url key with the correct URL address
of D2 on the application server.
6. On the Alias tab, replace the value of the xmlviewer_url key with the XML Viewer URL on the
application server (for example, http://<app server host>:<default port>/XMLViewer).
Note: You need to set the XML Viewer URL in case it has not been set during the installation.
7. Click Save.


8. Select Tools > Refresh cache.


Note: If the D2 application server URL is changed subsequently, it will break hyperlinks in existing
pdf_preview renditions. To avoid this, consider using network-address translation (NAT) to provide
a virtual application server URL instead of referencing the IP address or hostname of the application
server directly. You can also use host-address translation (HAT), for example, through the /etc/hosts
file (on UNIX) or C:\Windows\System32\drivers\etc\hosts file (on Windows), to map virtual server
names to physical servers. This enables physical servers to be changed in the future if necessary,
without affecting the URL.
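For illustration, a hosts-file entry mapping a virtual application server name to a physical host might look like the following (the host name and IP address are placeholders):

10.0.0.15    d2appserver.mycompany.local

The D2 and XML Viewer URLs in the System Parameters dictionary can then reference d2appserver.mycompany.local, and the underlying physical server can later be changed by updating only the hosts entry (or the corresponding NAT rule).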

Configuring the LSSSV Viewer Widget URLs


The Documentum Submission Store and View Viewer uses the following external widgets that
connect into the D2 client:
• The Submission Navigator widget provides a navigation panel in the “View Submission”
workspace view, enabling the user to select the required view for the application or submission,
drill down into the submission folder structure and select documents for previewing.
• The Submission Document Preview widget displays the currently-selected document in a
separate panel in the “View Submission” workspace view, enabling the user to browse individual
documents and switch between documents easily. This widget can also be used to display
regional metadata in XML files selected in the Submission Navigator.
• The WG EXT SSV Compare Viewer 1 and WG EXT SSV Compare Viewer 2 widgets provide a
side-by-side view of two selected documents in the “Compare” workspace view to enable them
to be compared easily. These are separate instances of the same widget used by the Submission
Document Preview widget, but with their own configurations in D2, enabling each widget to
respond to different document selection events and display the relevant content in its own panel.
These widgets may need to be configured to use the widget parameters in the URL. Follow these steps:
1. Log in to D2-Config as Administrator.
2. In the filter on the main toolbar, select OpenText Life Sciences Submission Storage and Viewing.
3. Select Widget view > Widget.
4. Under Widgets, select WG EXT Submission History View.
5. In the Widget url field, ensure the Widget URL setting is configured appropriately for your
environment. The following parameters can be specified in the URL:

Submission Navigator Widget URL Parameters (“WG EXT Submission History View”);
Default URL Path: /XMLViewer/XMLViewer.html

• type: List of object type(s) to be rendered by this widget. Default: cd_reg_admin_info, cd_reg_submission_info, cd_submission_element
• style: XSL stylesheet to use to render the XML file for the relevant application into HTML in the widget panel. (The specified stylesheet must be installed in the %WEBAPPS%/XMLViewer/style folder on the Application Server.) Default: submission-navigator.xsl
• docViewerWidget: Widget ID of the Submission Document Preview widget. Default: SSVLeafDocViewer
• docViewerEvent: D2 event code to send to the Submission Document Preview widget when a document is selected in the navigation panel. Default: D2_CUSTOM_EVENT
• initView: Initial display mode to use for application-level views, as follows: toc (Table of Contents view); current (eCTD “current” view or first NeeS submission, as appropriate); cumulative (eCTD “cumulative” view or first NeeS submission, as appropriate); first (first eCTD sequence or NeeS submission, as appropriate). This setting only applies if a Regulatory Application Registration Form is selected. If a Submission Registration Form or Submission Element, that is, an imported submission document, is selected, the appropriate submission or sequence view is opened automatically. Default: toc
• initMessage: Message to display when the widget is initially loaded and nothing is selected. Underscore characters can be used here to represent spaces, to make it easier to specify the message in the URL. Default: Select_a_Regulatory_Application_Registration_Form,_Submission_Registration_Form_or_archived_submission_document
• refreshLoginTicketInterval: Periodic interval, in seconds, to refresh the D2 login ticket, to prevent tickets from expiring. The specified value should be less than the Documentum login ticket timeout value, as configured in the dm_server_config object for the repository (default = 5 minutes). It should allow a margin of at least 10 seconds for D2 to respond to new ticket requests. Default: 240
• workspaceView: D2 view labels in the workspace in which this widget is installed. A comma-delimited list of view labels can be specified if necessary; use “%20” to denote a space character in the URL. To avoid unnecessary processing, the currently-selected Regulatory Application Registration Form or Submission Registration Form XML file is not downloaded and rendered until the relevant view is selected. Default: Submission%20View
• active: Indicates whether or not the widget is active in the default workspace view. In single-view workspaces, the value should be set to true; for multi-view workspaces, specify active=true if the widget is in the initial (default) view, and active=false otherwise. Default: false
• docbase: Repository name. The D2 token $DOCBASE can be used here. Default: $DOCBASE
• username: Username to use for the current session. The D2 token $LOGIN can be used here. Default: $LOGIN
• password: Password to use for the current session. The D2 token $TICKET can be used here. Default: $TICKET
The following URL is an example widget URL for the Submission Navigator widget:
/XMLViewer/XMLViewer.html?workspaceView=View%20Submission&active
=false&format=xml&type=cd_reg_admin_info,cd_reg_submission_info,cd
_submission_element&style=submission-navigator&docViewerWidget
=SSVLeafDocViewer&docViewerEvent=D2_EVENT_CUSTOM&initMessage=Select_a
_Regulatory_Application_Registration_Form,_Submission_Registration_Form
_or_archived_submission_document&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET
6. Click Save.
7. Under Widgets, select WG EXT SSV Leaf Element View and configure the widget URL using
the following URL parameters:

Submission Document Preview Widget URL Parameters ("WG EXT SSV Leaf Element View");
Default URL Path: /XMLViewer/DocViewer.html

• widget_id: The ID of this widget, as specified in the docViewerWidget parameter of the Submission Navigator URL. Default: SSVLeafDocViewer
• workspaceView: D2 view labels in the workspace in which this widget is installed. A comma-delimited list of view labels can be specified if necessary; use "%20" to denote a space character in the URL. To avoid unnecessary processing, the currently-selected Regulatory Application Registration Form or Submission Registration Form XML file is not downloaded and rendered until the relevant view is selected. Default: Submission%20View
• active: Indicates whether or not the widget is active in the default workspace view. In single-view workspaces, the value should be set to true; for multi-view workspaces, specify active=true if the widget is in the initial (default) view, and active=false otherwise. Default: false
• autoSelect: Indicates whether or not the widget should track and preview the currently-selected document in the D2 Doc List widget. If false, the widget responds to custom menu events or events fired by some other widget (for example, the Submission Viewer widget) targeted at this widget ID. Default: false
• type: Comma-delimited list of expected object types. Where specified, only selected documents of the specified types are rendered. Optional; if unspecified, all object types are rendered, provided they have suitable content. Default: null
• useC2Overlays: Set to true to display PDF renditions with watermarks or signature pages, according to the C2 view configuration; false to display PDFs without C2 watermarks or signature pages. Default: false
• imageFormats: Comma-delimited list of in-line image formats supported by the browser. Default: jpeg,gif,png
• docCompareLeftWidgetId: Widget ID of the left document preview panel used in the "Compare" view. Optional; if undefined or left blank, the comparison view is disabled. Default: lsDocViewer1
• docCompareRightWidgetId: Widget ID of the right document preview panel used in the "Compare" view. Optional; if undefined or left blank, the comparison view is disabled. Default: lsDocViewer2
• docCompareEvent: D2 event code to send to the comparison widgets to initiate a side-by-side comparison operation. Default: D2_EVENT_CUSTOM
• initMessage: Message to display when the widget is initially loaded and nothing is selected for previewing. Underscore characters can be used to represent spaces here, to make it easier to specify the message in the URL. Default: Select_a_document_in_the_submission_viewer.
• docbase: Repository name. The D2 token $DOCBASE can be used here. Default: $DOCBASE
• username: User name to use for the current session. The D2 token $LOGIN can be used here. Default: $LOGIN
• password: Password to use for the current session. The D2 token $TICKET can be used here. Default: $TICKET

8. Click Save.
The use of the standard D2 PDF Preview widget is not recommended in multi-view workspaces
because it responds to all document selection events in the Doc List widget irrespective of whether
or not the widget is visible. This can lead to performance issues when navigating the repository.
Instead, it is recommended that the custom Life Sciences document preview widget is used for
previewing. In the standard workspaces, the Life Sciences document preview widget is used by
default in several places, with different URL parameters passed to it in each case, so that the widget
only downloads and renders content when it is in the currently-active view.
The following table summarizes the default widget configurations used for document previews:


D2 Widget: WG EXT LSCI LSS Doc Viewer (Browse)
Applies to: Browse and My Sites views in workspaces with a Welcome page
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsDocViewerBrowse&active=true&config=WG_EXT_LSCI_LSS_Doc_Viewer_(Browse)&workspaceView=Browse,My_Sites&autoSelect=true&useC2Overlays=true&excludeTypes=dm_cabinet,dm_folder,cd_submission_folder,cd_submission_subfolder,cd_clinical_trial_info&excludeFormats=excel8book,excel12book&initMessage=Select_a_document&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&d2WebContext=D2

D2 Widget: WG EXT LSCI LSS Doc Viewer (Initial Browse)
Applies to: Browse view in workspaces without a Welcome page
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsDocViewerInitialBrowse&config=WG_EXT_LSCI_LSS_Doc_Viewer_(Initial_Browse)&active=true&autoSelect=true&useC2Overlays=true&excludeTypes=dm_cabinet,dm_folder,cd_submission_folder,cd_submission_subfolder,cd_clinical_trial_info&excludeFormats=excel8book,excel12book&initMessage=Select_a_document&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&d2WebContext=D2

D2 Widget: WG EXT LSCI LSS Doc Viewer (Tasks)
Applies to: Tasks view
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsDocViewerTasks&active=true&config=WG_EXT_LSCI_LSS_Doc_Viewer_(Tasks)&workspaceView=Tasks&autoSelect=true&excludeTypes=dm_cabinet,dm_folder,cd_submission_folder,cd_submission_subfolder,cd_clinical_trial_info&excludeFormats=excel8book,excel12book&useC2Overlays=true&initMessage=Select_a_document&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&d2WebContext=D2

D2 Widget: WG EXT LSCI SSV Doc Compare Viewer 1
Applies to: Compare view on the left pane
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsDocViewer1&config=WG_EXT_LSCI_SSV_Doc_Compare_Viewer_1&workspaceView=Compare&initMessage=Select_a_document_for_comparison_in_Viewer_1&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2

D2 Widget: WG EXT LSCI SSV Doc Compare Viewer 2
Applies to: Compare view on the right pane
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsDocViewer2&config=WG_EXT_LSCI_SSV_Doc_Compare_Viewer_2&workspaceView=Compare&initMessage=Select_a_document_for_comparison_in_Viewer_2&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2

D2 Widget: WG EXT LSCI SSV Leaf Element Viewer
Applies to: Right document preview panel in the View Submission view
Default URL configuration:
/XMLViewer/DocViewer.html?widget_id=lsSSVLeafDocViewer&config=WG_EXT_LSCI_SSV_Leaf_Element_Viewer&workspaceView=View_Submission&useC2Overlays=false&docCompareLeftWidgetId=lsDocViewer1&docCompareRightWidgetId=lsDocViewer2&initMessage=Select_a_document_in_the_Submission_Viewer&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2
If you decide to remove the Welcome pages, change the standard view labels or allow users to add
the Life Sciences Document Preview widget to other workspace views, it may be necessary to adjust
the widget URL parameters so that the widget is enabled correctly for the new view labels. Do not
activate the same widget in more than one view within a multi-view workspace, as this will cause
performance issues. If necessary, create a new D2 widget configuration in D2-Config for the new
view, with the appropriate URL parameters, alongside the standard widget configurations described
above. This must then be enabled in the D2 configuration matrix for the appropriate roles and added
to the default workspace view configuration XML files as necessary. In addition, make sure that the widget description is set appropriately in D2-Config in each case, so that users who want to add preview widgets to the standard workspaces can identify the correct widget in the gallery.
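For illustration, a new preview widget configuration for a hypothetical additional view labeled "QC Review" (the view label and widget ID in this example are placeholders, not shipped configurations) might use a URL such as the following, built from the parameters described earlier in this chapter:
/XMLViewer/DocViewer.html?widget_id=lsDocViewerQCReview&workspaceView=QC%20Review&active=false&autoSelect=true&useC2Overlays=true&initMessage=Select_a_document&docbase=$DOCBASE&username=$LOGIN&password=$TICKET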


Updating the XML Viewer D2 External Widgets


The XML Viewer configuration enables users with Documentum Submission Store and View-related roles to view imported submissions in their workspaces. The automatic installation sets these widget URLs; you can change them as required.

1. In D2-Config, select All elements from the configuration filter.


2. Select Widget view > Widget from the menu bar.
3. Select the WG EXT LSCI SSV Leaf Element Viewer widget and in the Widget url, type the
following:
/XMLViewer/DocViewer.html?widget_id=lsSSVLeafDocViewer&config
=WG_EXT_LSCI_SSV_Leaf_Element_Viewer&workspaceView=View
_Submission&useC2Overlays=false&docCompareLeftWidgetId
=lsDocViewer1&docCompareRightWidgetId=lsDocViewer2&initMessage=Select_a
_document_in_the_Submission_Viewer&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2
4. Select the WG EXT LSCI Submission History View widget and in the Widget url, type the
following:
/XMLViewer/XMLViewer.html?widget_id=lsSSVSubmissionViewer&config
=WG_EXT_LSCI_Submission_History_View&workspaceView=View
_Submission&format=xml&type=cd_reg_admin_info,cd_reg_submission
_info,cd_submission_element&style=submission-navigator&docViewerWidget
=lsSSVLeafDocViewer&docViewerEvent=D2_EVENT_CUSTOM&stfViewerWidget
=lsSSVStudyTaggingFileViewer&initMessage=Select_a_Regulatory
_Application_Registration_Form,_Submission_Registration_Form_or
_archived_submission_document&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET&d2WebContext=D2
5. Select the WG EXT LSCI SSV Doc Compare Viewer 1 widget and in the Widget url, type the
following:
/XMLViewer/DocViewer.html?widget_id=lsDocViewer1&config=WG_EXT_LSCI
_SSV_Doc_Compare_Viewer_1&workspaceView=Compare&initMessage=Select
_a_document_for_comparison_in_Viewer_1&docbase=$DOCBASE&locale
=en&username=$LOGIN&password=$TICKET&refreshTicketInterval
=240&d2WebContext=D2
6. Select the WG EXT LSCI SSV Doc Compare Viewer 2 widget and in the Widget url, type the
following:
/XMLViewer/DocViewer.html?widget_id=lsDocViewer2&config=WG_EXT_LSCI
_SSV_Doc_Compare_Viewer_2&workspaceView=Compare&initMessage=Select
_a_document_for_comparison_in_Viewer_2&docbase=$DOCBASE&locale
=en&username=$LOGIN&password=$TICKET&refreshTicketInterval
=240&d2WebContext=D2
7. Select the WG EXT LSCI Study Tagging File Navigator widget and in the Widget url, type
the following:
/XMLViewer/DocViewer.html?widget_id=lsSSVStudyTaggingFileViewer&config
=WG_EXT_LSCI_Study_Tagging_File_Navigator&workspaceView
=View_Submission&useC2Overlays=false&docViewerWidget
=lsSSVLeafDocViewer&initMessage=Select_a_study_tagging_file
_in_the_Submission_Viewer&docbase=$DOCBASE&locale=en&username
=$LOGIN&password=$TICKET&refreshTicketInterval=240&d2WebContext=D2
8. Save the configuration.
Note: If you use Documentum D2 in HTTPS mode, then you must also specify the Documentum
Submission Store and View URLs in HTTPS mode. This is also applicable in cases where the HTTP
login gets automatically redirected to the HTTPS mode for D2.

Processing of PDFs and Inter-Document Hyperlinks


Documentum Submission Store and View generates a PDF preview rendition (represented by
the pdf_preview object in the Documentum Submission Store and View object model) for each
document in an eCTD or NeeS submission that has the eCTD PDF Document or NeeS PDF Document
element type, respectively, in the file info map. The generated renditions include hyperlinks to other
submission documents. The document is opened for reading through the itext PDF library, which is
included with D2 and is used by the C2 plug-in to process PDF files.
If the document contains a PDF Document Information (DocInfo) field, which represents the source
document object ID with a non-null value referring to a valid dm_document object in the current
repository, a relation of type Source Document is created in the repository between the parent
document and the imported child document in the repository. A Used in Submission relation is also
created between the source parent document and the child submission folder.
The source_object_pdf_docinfo_field parameter specifies an optional PDF DocInfo field used by the
submission publishing tool to record the Documentum r_object_ids of the original source documents
in the published PDF documents. When publishing documents from Documentum, it is useful to
embed the source object IDs in the corresponding published PDFs in a designated DocInfo field.
Any of the standard DocInfo fields can be used for this purpose such as Subject. Alternatively, a
custom DocInfo field can be used, such as SourceObjectId, which will not be shown on the document
properties page of Adobe Acrobat. Configuring the Document Information parameter enables the
system to track the original source documents and relate them to the corresponding published
submission documents when they are imported into the repository through the Import Submission
function. The Documentum for Life Sciences solution Installation Guide provides the steps for configuring
the PDF DocInfo parameter in the D2 dictionary.
If the document contains inter-document hyperlinks (that is, relative filepath links, also known as
“/GoToR” links in PDF terminology), they are converted into D2 hyperlinks (“/URI” links) to the
relevant object in the repository. The resulting document, if modified, is saved as a new PDF file and
is checked-in as a pdf_preview rendition of the relevant document in the repository. The original
PDF content file/rendition is preserved. The format attribute of the corresponding XML DOM object
is updated to the pdf_preview object accordingly. If the file does not contain any inter-document
hyperlinks, the system does not generate a pdf_preview rendition; the primary content file (pdf
format) is previewed. Note that this processing does not apply to other types of PDF links, such as
intra-document links (“/GoTo” links) or web links (“/URI” links). Such links are preserved as is in the
PDF rendition. Note that bookmarks in PDF documents can also refer to other documents within the
same sequence folder or in a different sequence folder.


When a submission is imported, the inter-document PDF hyperlinks and bookmarks are resolved
during import based on the d2_url and xmlviewer_url parameter values specified in the System
Parameters dictionary. However, if you change the Application Server URL after importing the
submission, it impacts the PDF hyperlink navigation on the imported submission documents because
they still reflect the old Application Server URLs. In such a scenario, the user can run the Index
Submission Folders Migration utility. For more information, see Using the Index Submission Folders
Migration Utility, page 386.
Another option is to use an LBR (load balancing router) or proxy server as it enables the target
app server hosts to be reconfigured easily. You then configure the base URL for D2 to point to the
LBR or proxy server, and it forwards the HTTP requests to the relevant app server. It should be
configured to use “sticky sessions” so that the same app server is used for a particular client session,
rather than spreading it across multiple servers. For example, you can set up an Apache Tomcat
proxy server. For setting up the proxy server, refer to the “Proxy Support HOW-TO” article on
the Apache Tomcat website.

Resolving Broken Hyperlinks


In Internet Explorer and Mozilla Firefox, broken hyperlinks in the content of PDF documents (page
link annotations) are highlighted in the SSV preview renditions with a red border. When you click
these hyperlinks, a "document not available" message is displayed. In Google Chrome, the red border
is not displayed; instead, the link is deactivated and no longer produces an HTTP 404 error when
clicked.
Broken hyperlinks are also reported in the LSSSV import log file, attached to the text rendition of the
RARF. Documents with unknown or unrecognized content formats are also logged as warnings in the
log. If the -attach_dump_files option is enabled, additional details on the hyperlinks detected in each
submission and whether the links were resolved can be found in the ZIP file attached to the RARF.
Note: In Internet Explorer and Firefox, it may be necessary to change the browser settings on the
user's desktop to use the Acrobat plugin to render the PDF content instead of using the native PDF
viewer. For example, in Firefox, click Open menu > Options and, under Applications, for PDF file,
select Use Adobe Acrobat (in Firefox).
The "document not available" message is localizable. You can create a D2 dictionary named System
Messages in the repository and add the key Invalid Hyperlink to the dictionary (the key is case-sensitive)
with translations for various supported locales. LSSSV uses the locale of the user who requested the
import to determine which language code to use. The message can include the "$PATH" symbol as
a placeholder for the missing target document's file system path, relative to the current document.
If the System Messages D2 dictionary is not defined, or the Invalid Hyperlink key is missing, or a
localized value for the user's locale is not defined in the dictionary, LSSSV reverts to using the default
system error message in English.
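For example, the dictionary might contain an entry along the following lines (the message texts shown are illustrative only; adapt them to your own wording and supported locales):
Key: Invalid Hyperlink
en: The document $PATH is not available in this submission
fr: Le document $PATH n'est pas disponible dans cette soumission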
In all browsers, broken hyperlinks in the PDF bookmark tree (PDF outlines) are disabled, with the
“document not available” error message appended to the bookmark title. An X is also prepended to
the bookmark title to make it clear that the link is defunct.
If you pass the argument -enable_pdf_javascript_for_links true to the import method, which is
invoked through the RARF lifecycle transition “(Import Submissions),” then the target documents of
resolved hyperlinks are opened in a separate tab or window within the browser through a JavaScript
action, instead of replacing the current page with the target document through a D2 download URL.


This makes it easy for users to switch back and forth between the source and target documents.
However, JavaScript in embedded PDF documents is only supported in Internet Explorer and
Firefox; it does not work in Chrome. In the latest version of Chrome (62.0.3202.6), it is no longer
possible to replace the built-in PDF viewer with the Acrobat plugin. Therefore, for compatibility
with the Chrome browser, this option is disabled by default. If Chrome does not need to be
supported, enabling this option is recommended for improved usability. However, existing imported
submissions may need to be reindexed or reimported in order to take advantage of this functionality.
Note: These changes only apply to the PDF preview renditions used internally by LSSSV for
inter-document hyperlink navigation within D2. They do not affect the original submitted PDF
content files. If the submission folder is exported, the original submitted PDF files (only) are included
in the export.

Study Tagging Files


Study Tagging Files (STFs) are XML files that may be included in the m4 (Non-Clinical) and m5
(Clinical) sections, primarily for submissions to the US FDA. These are optional for European
submissions, and are not allowed for Japanese submissions. Each STF relates to a specific study, and
may refer to several study documents (for example, Clinical Study Reports) in a particular section of
the eCTD. By convention, the STF XML file is named “stf-<study-id>.xml”, where <study-id> is a
unique study identifier, and is located in the appropriate folder in the eCTD, alongside the documents
it references. There may be several STFs in one folder, one for each study.
The purpose of an STF is to:
• Provide additional metadata about each study, for example, the study ID, title, duration, species
and type of control used in each case
• Enable the individual study documents to be grouped by study
• Provide additional information about the contents of each study document, known as its file
tags. Usually, there is one file tag per study document indicating the purpose of that document,
if defined; for example, “synopsis” or “study-report-body”. However, it is possible for a study
document to have multiple file tags in an STF.
If an STF is not provided for a particular study, the LSSSV Viewer displays the study documents in
the relevant eCTD section as a flat list, exactly as they appear in the eCTD XML backbone file.
Each section in modules M4 and M5 can contain multiple study report documents pertaining to
multiple studies. However, where there are many study documents, it may be difficult for users to
navigate the submission structure and find documents related to a particular study. STFs facilitate
this process by providing additional metadata for each study, over and above that which is provided
in the standard ICH XML backbone file. The following is an example STF:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE ectd:study SYSTEM "../../util/dtd/ich-stf-v2-2.dtd">
<?xml-stylesheet type="text/xsl" href="../../util/style/ich-stf-stylesheet-2-2a.xsl"?>
<ectd:study xmlns:ectd="http://www.ich.org/ectd" xml:lang="en" dtd-version="2.2"
xmlns:xlink="http://www.w3.org/1999/xlink">
<study-identifier>
<title>Wonderdrug Phase III Clinical Study Report: SN-3.001</title>
<study-id>3.001</study-id>
<category name="species" info-type="ich">human</category>
<category name="route-of-admin" info-type="ich">oral</category>
<category name="type-of-control" info-type="ich">dose-response-without-placebo</category>
</study-identifier>
<study-document>
<doc-content xlink:href="../../../../index.xml#id106245">
<title>SN-3.001.01 – Overview of study objectives</title>
<file-tag name="synopsis" info-type="ich"/>
</doc-content>
<doc-content xlink:href="../../../../index.xml#id106246">
<title>SN-3.001.02 – Long-term dose response results in human trials for Wonderdrug 50mg
capsules</title>
<file-tag name="study-report-body" info-type="ich"/>
</doc-content>
<doc-content xlink:href="../../../../index.xml#id106247">
<title>SN-3.001.03 – Amendments to Clinical Study Protocol</title>
<file-tag name="protocol-or-amendment" info-type="ich"/>
</doc-content>
</study-document>
</ectd:study>

The STF XML file resides in the same folder as the study report documents; in this case, the
5312-compar-ba-be-stud-rep folder (used for comparative bioavailability/bioequivalence studies).
Documentum Submission Store and View uses a built-in XSL script, stf-2-2-normalize.xslt, to convert
the STF XML files into a standard format that is easy to process (just as for other non-standard XML
files, such as Japanese regional M1 XML files). This script is installed as the primary content file of the
XML schema configuration object named stf_2-2, which applies to v 2.2 Study Tagging Files; XSL
transformation is enabled for this schema, so that all STF XML files are transformed automatically by
this script.
This information can then be manifested in the main Submission Viewer panel, with links to the three
study reports arranged underneath the study tagging file node itself. If the version 2.2 study tagging
file format is revised in the future, a new XSL transformation script may need to be developed to
transform it into the same format as above, so that it can be processed by LSSSV. To accomplish this:
1. Copy the /XMLViewer/style/stf-2-2-normalize.xslt script provided, rename it accordingly, and
edit it to convert the new format into the same “normalized” XML format described above.
2. In Documentum Administrator, copy the stf_2-2 XML schema configuration object and rename it
accordingly. Adjust the xpath_qualifier setting to refer to the new version of the STF.
3. Use the check-out/check-in from file functionality in Documentum Administrator to replace the
main content file of the new XML schema configuration object with the new XSL transformation
script developed in step 1.
4. Import a sample eCTD submission using the new STF format and verify that it can be navigated
correctly.

Previewing of Media Files


Media files, such as video, can be included in submissions and previewed in Documentum
Submission Store and View. See Media Files, page 287 for more information.


Adding Custom Format Icons


If new content formats are to be supported, it may be necessary to provide icons for them in the
%WEBAPPS%/XMLViewer/icons/format folder, so that the correct icons are displayed in the
Submission Viewer. Icons for the standard formats are pre-installed. Additional icon files must
be provided as GIF (Graphics Interchange Format) files of size 16x16 pixels, and the file name
must be the same as the corresponding dm_format name in each case (for example, mpeg.gif). In
many cases, GIF files for new formats can be obtained from the Documentum Administrator web
application folder, %WEBAPPS%/da/wdk/theme/documentum/icons/format. Copy the relevant
f_<format>_16.gif file to the %WEBAPPS%/XMLViewer/icons/format folder, and rename
the file as <format>.gif. Do not use the larger 32x32-pixel icon files as this can disrupt the tree view
in Documentum Submission Store and View.
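For example, on Windows, the MPEG format icon could be copied and renamed as follows (the paths assume the default deployment locations described above):
copy %WEBAPPS%\da\wdk\theme\documentum\icons\format\f_mpeg_16.gif %WEBAPPS%\XMLViewer\icons\format\mpeg.gif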

Chapter 18
Migration and Integrity Check Utilities

This section describes how to install and use migration and integrity check tools included in the
Life Sciences solutions.

D2 Configuration Migration Tool


The D2 Configuration (ApplyD2Configurations) Migration tool is a command-line tool that can be
used by System Administrators to apply D2 auto-linking, auto-naming and/or security configurations
selectively to specific objects in the repository. You may need to run this tool in the following cases:
• After migrating a set of new or legacy documents into the repository into a staging area folder, D2
auto-linking may need to be applied to them.
• After making changes to the D2 auto-linking, auto-naming, or security configurations that are to
be applied to the existing documents.
• After a bulk DQL update operation that changes the users and groups assigned to various roles, in
order to update the security on those documents.
The tool supports multi-threaded parallel processing, and also distributed processing across multiple
Documentum Servers. The tool can be scaled-out both horizontally, by providing more servers, and
vertically, by increasing the number of parallel processing threads on each server.
You may face significant challenges when migrating legacy repositories to D2-enabled applications
such as the Life Sciences solution. Although the Enterprise Migration Appliance (EMA) tool can be
used to convert legacy object types and metadata between object models and migrate content
efficiently between repositories, it does not apply D2 context-based auto-naming, auto-linking,
security, and initial lifecycle state transition actions to migrated objects. To accomplish this, the
ApplyD2Configurations command-line utility (provided as part of the Life Sciences Controlled
Document Foundation (CDF) package) can be used to apply the relevant D2 configurations selectively
to repository objects, after having processed them in EMA.
The EMA tool can migrate objects at a rate of approximately 300,000 objects per hour. However, even
though the ApplyD2Configurations tool supports multi-threaded and distributed processing, it is
difficult to achieve processing rates of more than 50,000 objects per hour per server, and in many
cases, depending on the number and complexity of the D2 configurations that need to be applied,
the processing rate can be less than 25,000 objects per hour. This is therefore a significant bottleneck
in the migration process. For example, processing 1 million documents at 25,000 documents per hour
would take 40 hours of continuous processing.


The root cause is the way in which the underlying D2 core method is implemented. This
is the server method (part of the D2 core product) that is responsible for applying the relevant D2
configurations to an individual repository object. It processes objects one by one in serial fashion,
and in each case it evaluates the D2 context rules, as defined in the D2 configuration matrix for the
application (which are essentially DQL qualifiers), to determine which auto-naming, auto-linking,
and security configurations apply to the object. This does not scale up well when applied to a large
set of repository objects. It does not support multi-threaded processing, and causes redundant
processing for objects with similar properties. The processing rate when using the default D2 core
method in this way is very poor – typically around 3000 objects per hour.
The ApplyD2Configurations utility provides a wrapper around the D2 core method that supports
multi-threaded processing, which can boost the processing rate by a factor of 10 or more. With a
multi-node Documentum Server architecture (comprising two or more active Documentum Servers),
it is possible to scale the process out horizontally, using the distributed processing options. However,
to match the processing rate of EMA, some 10 Documentum Servers or more may be required.
Another limitation of the ApplyD2Configurations utility concerns Documentum Server failover setups:
the method cannot check whether a Documentum Server instance is available to process a batch, and
assumes that all Documentum Server instances are up and running and available to process the batch.
Therefore, enable distributed parallel processing only when all the Documentum Server instances are
available, and not in a failover setup.
The Documentum Controlled Document Foundation Developer API Javadocs provides more information
about the ApplyD2Configurations tool.

Installing the D2 Configuration Migration Tool


The D2 Configuration Migration tool is shipped as part of the Life Sciences solution. Batch scripts for
both UNIX and Windows installations are provided within the utils/migration folder of the
Life Sciences solution installation package. These files can only be executed on a Documentum Server
host that has the Life Sciences solution server components installed on it. To install the scripts, copy
the relevant files to an arbitrary folder on the Documentum Server. On UNIX, ensure that the shell
scripts are set to be executable through the chmod o+x command.
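For example, to make the UNIX shell script executable:
$ chmod o+x ApplyD2Configurations.sh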
To install the tool on a system that has D2 but does not have any of the Life Sciences Solution
modules installed:
1. Copy the CDF-Methods.jar file (the Controlled Document Foundation server method
library) and the CDFBase.jar file to the Java Method Server lib folder on each Documentum
Server, alongside the D2 JARs, in the following location: %Documentum%\jboss7.1.1\server
\DctmServer_MethodServer\deployments\ServerApps.ear\lib
2. Restart the Java Method Server on each node after installing the JAR file, so that it loads the
new JAR.
3. To support distributed processing in multi-node Documentum Server architectures, see Creating
a dm_method Object, page 309.


Configuring the D2 Configuration Migration Tool


To view the usage notes for the tool:
• On Windows, double-click the ApplyD2Configurations.bat script in Windows Explorer.
• On UNIX, run the ApplyD2Configurations.sh script without any parameters, or with the
help parameter.
You can configure the default settings for the tool parameters in the D2 System Parameters dictionary
in D2-Config, which contains the default settings for the distributed/parallel processing methods,
rather than specifying them explicitly on the command line.
The following is an example of a Windows command that applies D2 configurations
(auto-naming, auto-linking, and security) to all controlled documents below the
/Clinical/Cardiology-Vascular/Vasonin folder, including subfolders of this folder:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true

In this example, C:\Users\dmadmin> is the Windows command prompt, indicating the folder where
the scripts are installed. The parameters are the same for UNIX:
$ sh ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true

D2 auto-naming, auto-linking and security are applied by default. To apply them selectively, specify
the appropriate flags on the command line explicitly; for example,
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true -auto_naming false -auto_linking false -security true

To process new documents, include the -create true argument, so that D2 applies auto-naming
and auto-numbering correctly. Otherwise, it retains the existing auto-numbering values. You can also
specify a D2 lifecycle state transition to apply to each document, through the -state_transition
argument. For example, -state_transition "(init)" applies the initial lifecycle state actions
to each document, as specified in the D2 lifecycle configuration for the "(init)" state.
The -delete_empty_folders option is also useful for cleaning up any empty folders below the
specified top-level folder. For example,
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical/Cardiology-Vascular/Vasonin', descend)"
-verbose true -auto_naming true -auto_linking true -security true -delete_empty_folders
"/Clinical/Cardiology-Vascular/Vasonin"

When using the -auto_linking option, empty folders are sometimes generated as a result of
conflicts between parallel processing threads. These are prefixed by dm_fix_me labels in the folder
names. Such folders can be cleaned out by enabling the -delete_empty_folders option.
Note that the use of the -auto_linking and -state_transition options, in particular, can
increase the processing time required by the tool substantially. If the documents can be migrated
into the relevant folders directly and initialized with the appropriate metadata in advance, it is
better to do so, in order to avoid lengthy processing. If this cannot be avoided, and there are many
documents to be processed, consider increasing the number of processing threads available through
the -max_threads parameter. If multiple Documentum Servers are available, you can enable
distributed processing as follows:


C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum -user_name dmadmin -password ""
-qualifier "cd_controlled_doc where folder('/Clinical', descend)"
-verbose true -content_servers "^*" -max_threads 8 -distributed_processing_threshold 1000

In this example, assuming there are more than 1000 documents to be processed (the -distributed_
processing_threshold value), the tool splits the processing tasks across all available servers.
Note the use of the caret symbol before the “*” symbol, which prevents the Windows command-line
shell from expanding this symbol into a list of files in the current directory. On UNIX, use
-content_servers "\*" to achieve the same results.
To improve the processing rate, specify the -use_private_sessions parameter in the D2 System
Parameters dictionary. If the value of this parameter is null or undefined, 'false' is taken as the
default value.
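For example, the corresponding dictionary entry might look like the following (the value shown is illustrative; set it according to your environment):
• Key: use_private_sessions
• Value (alias): true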

TMF Admin Integrity Checking and Repair Tool


The TMF Integrity Checking and Repair tool (also known as the TMF Admin tool) is a command-line
system administration utility. This tool enables System Administrators to validate the external role
group security settings for TMF documents and folders and optionally repair them.
This tool can:
• Examine the TMF documents and subfolders in a specified top-level folder, and highlight those
that have incorrect role groups assigned to them. Optionally, it can automatically fix those
documents and folders and reapply D2 security to them, create any missing groups as required,
and delete groups that are no longer used.
• Force a refresh of the external group members for specific trials, countries, and sites. This ensures
that appropriate levels of access are granted to the external users on the relevant TMF artifacts. It
also ensures that read-only access is granted to the TMF folders and registration forms to enable
navigation and searching on specific products, trials, countries and sites.
• Dump selected group hierarchies to enable group members to be examined more closely, and
optionally purge or delete them completely.
You may need to run this tool in the following scenarios:
• After migrating a set of new TMF documents into the repository for a currently-active trial.
• After upgrading from a previous version of the Documentum for eTMF solution to migrate the
correct TMF role-based security settings to the existing documents.
• After updating the external user registrations in a registration form outside of the D2 Client, for
example, through the TMF SDK or bulk DQL update.
• After changing the TMF role group naming conventions in D2-Config.
• After changing product codes, combining products, moving a trial to a different product, or
moving a site to a new trial, requiring the role groups to be reconstructed.


Installing the TMF Admin Tool


The TMF Admin tool is provided in the form of batch scripts in the utils/tmf subfolder within the
Life Sciences solution package. Batch scripts for both UNIX and Windows installations are provided.
These scripts must be installed on a Documentum Server host that has the Life Sciences solution server
components installed on it. Copy the relevant files to an arbitrary folder on the Documentum Server.
On UNIX, ensure that the shell scripts are set to be executable through the chmod o+x command.
To view the usage notes:
• On Windows, double-click the TMFAdmin.bat script in Windows Explorer.
• On UNIX, run the TMFAdmin.sh script without any parameters, or with the help parameter.

Examining Access Control Groups


The -show command can be used to dump the contents of a group, including the group members
and its sub-groups. For example, the following Windows command dumps the entire dynamic
role group hierarchy:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password "" -show
tmf_contributors | more

In the preceding example, C:\Users\dmadmin> is the Windows command prompt, indicating the
folder where the scripts are installed. The parameters are the same for UNIX:
$ sh TMFAdmin.sh -docbase_name documentum -user_name dmadmin -password "" -show tmf_contributors
| more

In both cases, the output is piped to the more command to enable it to be scrolled through.
The tool can generate a lot of output if you dump the entire group structure in this way. It is more
useful to focus on a particular subgroup. For example, the groups for clinical trial AMX001 only
can be obtained using the command:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-show tg_amx001 | more

Checking and Repairing the TMF Security Settings for External Users


The -show command can be used to perform an integrity check on the external access control groups
currently assigned to documents and folders at and below a specified TMF folder. In this case, you
should specify the cabinet or folder path of the top-level folder to check. For example,
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
–show "/Clinical/Cardiology-Vascular/Vasonin" –verbose true


The output shows that the access control groups on all of the TMF documents and folders for the
product named "Vasonin" are configured correctly. However, the groups themselves may still
need to be refreshed.
In the next example, another product named “AIR” is selected, and this time, the tool reports a
number of issues. To focus on the errors, the -verbose option is turned off:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
–show "/Clinical/Cardiology-Vascular/AIR" –verbose false


The output shows that there are 95 documents and one folder that must be repaired, and one
redundant group “tg_A001” that should be deleted. These issues were brought about by bulk DQL
updates to the repository, which caused the trial “A001” to be deleted, and changes to the TMF
configuration in D2 affecting the way the external roles and folder levels are interpreted. To fix this,
run the tool again in repair mode, as follows:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
–repair "/Clinical/Cardiology-Vascular/AIR" –verbose false

The parameters are the same except that -show has been changed to -repair. The tool then
fixes all documents and folders that need repairing, reapplying D2 security in each case to ensure
the access controls are corrected. You can combine both the -show and -repair arguments in
one operation. For example,
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-repair "/Clinical/Cardiology-Vascular/AIR" -show "/Clinical/Cardiology-Vascular/AIR"
-verbose false

In this case, the repair phase is always carried out first followed by the revalidation
phase. The repair phase uses the same distributed/parallel processing mechanism as the
D2-Configuration Migration tool and supports the same -max_threads, -content_servers,
and -distributed_processing_threshold parameters, enabling potentially very large-scale
repairs to be carried out in extreme scenarios.
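For example, a large repair could be distributed across all available Documentum Servers using a command along the following lines (the thread count and threshold values are illustrative only):
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-repair "/Clinical/Cardiology-Vascular/AIR" -verbose false -content_servers "^*"
-max_threads 8 -distributed_processing_threshold 1000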


Refreshing TMF Access Control Groups for Registered External Users


External user registrations are stored:
• at the trial (study) level, in the Clinical Trial Registration Forms
• at the country (regional) level, in the Country Registration Forms for a particular trial
• at the site level, in the Site Registration Forms for a particular trial/country
The registrations are stored in repeating attributes associated with these forms. Whenever these
attributes are updated through the Manage External Users lifecycle action in D2, the corresponding
role groups are updated automatically to include only those users with active registrations. This
immediately grants or revokes access to the relevant TMF documents and folders to the external
users. It is not necessary to update the documents and folders themselves; only the group members
must be updated.
The group refresh operation is also carried out automatically on a daily basis for those registrations
that are active through the D2 batch lifecycle job. This operation takes account of registrations that
are active for a specified time interval. However, it is also possible to trigger the refresh operation
on-demand through the TMF Admin tool. To do this, specify the -refresh parameter on the
command line, followed by a DQL qualifier identifying the registration forms to be processed.
For example, to refresh the external user groups for a specific site, with site ID “CH000001”, use:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_site_info where tmf_site_id = 'CH000001'"

To refresh the external user groups for all sites in a specific country, it is only necessary to identify
the relevant Country Registration Form. For example, the following refreshes the groups for trial
“AMX001” and country code “CH” (Switzerland), both at the country level and at the site level:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_country_info where clinical_trial_id = 'AMX001' and country_code = 'CH'"

Likewise, to refresh all site/country groups across all countries for a particular trial, it is only
necessary to identify the relevant Clinical Trial Registration Form. For example,
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where clinical_trial_id = 'AMX001'"

To refresh all groups for all currently-active trials for a particular product, for example, “Vasonin”,
use:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where product_code = 'Vasonin' and a_status = 'Active'
and use_dynamic_access_ctrls = true"

The use_dynamic_access_ctrls = true qualifier avoids unnecessary processing of TMFs that do not
have dynamic access controls enabled.


Finally, to refresh all external role groups throughout the repository, use:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where a_status = 'Active' and use_dynamic_access_ctrls = true"

Note: This can potentially involve a lot of processing if there are many active trials in the system,
and it should be used with caution. It utilizes the distributed/parallel processing mechanism to
process the groups when a large number of groups must be updated, and you can use the
-max_threads, -content_servers, and -distributed_processing_threshold parameters.
In each case, the -reset true option can be specified in order to purge the relevant groups. For
example, the following command can be used to revoke all external access to a particular trial:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-refresh "cd_clinical_trial_info where clinical_trial_id = 'AMX001'" -reset true

Purging and Deleting Specific Groups


The -purge and -delete commands can be used to forcibly revoke external user access as a last
resort:
• -purge <group-name> clears out all of the user members from the specified group, including
its sub-groups in recursive fashion, leaving the group hierarchy empty but intact.
• -delete <group-name> detaches the specified group from any ACLs that refer to it, then
deletes the group from the repository, including its subgroups recursively.
These commands can be used to purge and drop any Documentum group and not just TMF access
control groups. However, they should be used with caution, as the effects may be irreversible. A
backup of the repository database should be taken before making such changes.
The following example purges the external access control groups at and below the trial level for
the Clinical Trial “AMX001”:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
–purge tg_amx001"

The same effect can be brought about by refreshing the relevant Clinical Trial Registration Form with
the -reset option enabled as described previously, which is the preferred technique, as this ensures
the correct groups are purged. However, it may occasionally be necessary to revoke external user
access to all trials for a particular product. The most efficient way to do this is to purge the relevant
product group. For example, for the product “Vasonin”:
C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
–purge pg_vasonin"

The relevant trial, country, and site registration forms may also need to be deactivated to prevent
them from being refreshed and reinstating external user access. The following DQL query can be
used to achieve this:
IDQL> update cd_common_ref_model objects set a_status = 'Inactive'
where r_object_type in ('cd_clinical_trial_info', 'cd_clinical_country_info',
'cd_clinical_site_info') and product_code = 'Vasonin'

The -delete option can be used where the group hierarchy needs to be rebuilt, for example, if
the group structure has been reorganized. This is usually only necessary for migration/upgrade
purposes. It is possible to rebuild the entire TMF group hierarchy using the following technique:
• Delete the top-level tmf_contributors group.
• Run the tool in repair mode against the entire /Clinical cabinet. This rebuilds all of the required
groups, although they will not be populated at this stage.
• Refresh all active trials to repopulate the groups.
• Verify that the groups have been created successfully.


The following command accomplishes this in one operation:


C:\Users\dmadmin> TMFAdmin -docbase_name documentum -user_name dmadmin -password ""
-delete tmf_contributors -repair "/Clinical" -refresh
"cd_clinical_trial_info where a_status = 'Active' and use_dynamic_access_ctrls =
true" -show "/Clinical"

This command should only be used as a last resort to rebuild the access control groups, as it is likely
to involve very substantial processing, especially in mature repositories.

Using the Index Submission Folders Migration Utility


The Index Submission Folders Migration Utility is a command-line tool that can be used by
Administrators to process submission folders and documents that already reside in a Documentum
repository, so that they can be navigated and viewed in LSSSV. There are three primary use-cases
for this tool:
• After migrating a set of new or legacy submissions into the repository through some process, such
as the Enterprise Migration Appliance (EMA), to enable those submissions to be viewed in LSSSV
as if they had been imported through the native Import Submissions function in LSSSV.
• After upgrading LSSSV to a new version that requires the existing imported or previously-
indexed submissions to be reprocessed. For example, if the latest version has updated style sheets,
or uses a different internal model to represent the submission views.
• After updating application or submission-specific metadata that is manifested in LSSSV, such as
application numbers, submission dates, and so on.
The utility is designed to index or re-index submission folders and documents in an efficient manner
by avoiding the need to export the entire submission folders from Documentum and re-import
them through the Import Submission method. The following is a list of the key features supported
by this tool:
• Multithreaded parallel processing, and also distributed processing across multiple Documentum
Servers that enable the tool to be scaled-out both horizontally, by providing more servers, and
vertically, by increasing the number of parallel processing threads on each server. This is
particularly useful in large-scale submission migration exercises.
• Incremental processing that enables submissions to be processed in batches and new submissions
to be indexed through a repeatable process.
• PDF hyperlink reprocessing that enables PDF documents containing inter-document links to
be reprocessed as part of the indexing operation, for example, following a change to the D2
download URL format, or to enable inter-document links in legacy PDFs to be navigated in the
LSSSV Submission Viewer widget.
• HTML re-rendering that enables new XSL stylesheets to be applied to the existing Submission
Viewer XML models, for example, following an upgrade to LSSSV that provides new features
in the Submission Viewer widget.
• Automatic generation of Submission Registration Forms (SRFs), for submission folders that
do not have SRFs defined for them in the repository. Note that auto-generation of Product
Registration Forms (PRFs), Quality Project Registration Forms (QPRFs), and Regulatory
Application Registration Forms (RARFs) is not supported. These must be created as part of the
migration process prior to indexing with the relevant metadata, as required.
• Automatic metadata inheritance from submission folders to the associated subfolders and
documents and to the SRF itself, including metadata extracted from eCTD XML backbone or
regional XML files, for example, application numbers, submission dates, and so on. This includes
an option to merge the extracted metadata back onto the RARF, for example, for new dosage
forms, dosage strengths, and so on. Alternatively, the submission-level metadata can just be
compared against the RARF and any discrepancies logged as warnings.
• Report generation, including a summary of the indexing process for each application (similar to
the log file generated by the existing submission import method), and optionally, a zip file that
contains further details which may be useful for validation purposes.
The parameters supported by the tool are described subsequently in this document, and are also
documented in the Developer JavaDocs for the SSV-Methods.jar package, which is the definitive
reference guide.

Installing and Configuring the Tool


Sample batch scripts for both UNIX/Linux and Windows installations are provided, together with
the required Java archive (JAR) files containing the indexing method code. Note that these scripts
can only be executed on a Documentum Server host that has LSSSV server components (version
4.1 or later) installed on it.
To install the scripts, copy the relevant files to an arbitrary folder on the Documentum Server. If you
are running it on UNIX, ensure that the shell scripts are set to be executable through the chmod
o+x command.
To use the utility on a pre-16.4 installation (LSSSV 4.1, 4.2, or 4.3), you must also:

1. Download LSSSV 16.4 from OpenText My Support.


2. Unzip the LSSSV 16.4 binaries package.
3. Edit the IndexSubmissionFolder.bat (for Windows) or IndexSubmissionFolder.sh (for
UNIX/Linux) script:
• For Windows, append the path of the folder containing the 16.4 method server JARs to the
set LOCAL_JARS_FOLDER= statement in the IndexSubmissionFolder.bat script. For
example, set LOCAL_JARS_FOLDER=c:\installs\LSS16.4\LSSSV\method_server_lib
• For UNIX/Linux, append the path of the folder containing the 16.4 method server JARs to the
LOCAL_JARS_FOLDER= statement. For example, LOCAL_JARS_FOLDER=/opt/installs
/LSS16.4/LSSSV/method_server_lib
This will configure the batch scripts to use the latest LSSSV 16.4 jars instead of the current JARs
for submission indexing only.
4. Launch the Documentum idql utility and log in to the target repository as the Documentum
installation owner (for example, dmadmin). Run the following DQL statements:
alter type cd_reg_admin add submission_error_count integer,
submission_warning_count integer publish

alter type cd_submission_element add ectd_element_alias_ids char(48)
repeating, has_submission_errors boolean, submission_errors char(255)
repeating publish
This adds new attributes to the regulatory application / submission registration form object
types and submission document object type, which were introduced in LSSSV 16.4 to improve
error tracking and to enable documents that are used in more than one section of the eCTD
to be processed correctly.
5. Log in to D2-Config as the Documentum installation owner.
6. Navigate to the System Parameters D2 dictionary, and add a new dictionary entry:
• Key: xmlviewer_url
• Value (alias): <D2-webapps-URL>/XMLViewer
where <D2-webapps-URL> is the URL used to access the D2 web applications. This can be
copied from the existing d2_url System Parameters value, replacing the “/D2” suffix with
“/XMLViewer”. For example, if the d2_url value is set to https://myappserver:8080/D2, then the
value for xmlviewer_url should be set to https://myappserver:8080/XMLViewer, accordingly.
To enable distributed processing in multi-node Documentum Server architectures on LSSSV 16.4,
or to enable the method to be invoked through a DQL execute do_method statement, you need to
create a dm_method object in Documentum. To do this, log into idql (or equivalent tool) as the
Documentum installation owner and run the following DQL statement:
create dm_method object
set object_name = 'SSVIndexSubmissionFoldersMethod',
set title = 'Indexes or re-indexes a series of submission folders in the repository,
to enable them to be viewed in the Submission Viewer tool.',
set method_verb = 'com.emc.d2.api.methods.D2Method -class_name com.documentum.ssv.methods.IndexSubmissionFolders',
set timeout_min = 3600,
set timeout_max = 86400,
set timeout_default = 86400,
set run_as_server = true,
set method_type = 'java',
set use_method_server = true

This creates the required dm_method object with timeout set to 86400 seconds (24 hours) by default.
Adjust the timeout parameters if necessary.
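
For illustration, once the dm_method object has been created, the method can also be launched from idql with an execute do_method statement along the following lines. The argument string shown here is an example only; it reuses parameters that are described later in this section and should be adjusted to suit the environment:

execute do_method
with method = 'SSVIndexSubmissionFoldersMethod',
arguments = '-docbase_name documentum -user_name dmadmin -id 09de75d180154cec -reindex true'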
Note: Distributed processing is only supported on LSSSV version 16.4 onwards. This enables a
master process to be invoked on one server, which will automatically distribute processing tasks to
other available servers. Distributed processing takes place at the regulatory application level, so it
is only useful when indexing submission folders for multiple applications, and when more than
one Documentum Server is available.
To emulate distributed processing on previous versions of LSSSV, the tool can be installed and
scripted to run concurrently on multiple Documentum Servers, provided that each instance indexes
submissions for a distinct set of regulatory applications, so that they do not overlap and conflict
with one another. However, a master script that can launch remote server batch processes is not
provided by default.
Note: The -apply_d2_configs option is not supported on previous versions of LSSSV (versions 4.1,
4.2, or 4.3) and must be explicitly disabled on those versions (with -apply_d2_configs false). This
means that if the -create_missing_srfs option is enabled, any new SRFs generated by the indexing tool
will be created in the specified -temp_repository_folder location (or the “/Temp” cabinet by default)
and will not have D2 autonaming, autolinking, and security applied to them. Likewise, any existing
SRFs that are updated by the indexing tool (for example, to synchronize the properties with regional
XML metadata) will not have D2 auto-naming reapplied. However, the ApplyD2Configurations
command-line utility can be used to post-process these SRFs.
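
For example, a post-processing run over the staging folder might look like the following sketch. The <SRF-object-type> placeholder stands for the SRF object type used in your installation (it is not specified here), and the remaining arguments mirror those described later in this chapter:

C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "<SRF-object-type> where
folder('/Temp')" -auto_naming true -auto_linking true -security true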

Server Method Arguments


On Windows, if you double-click on the IndexSubmissionFolders.bat script in Windows
Explorer, the usage notes for the tool are displayed.

On UNIX, running the IndexSubmissionFolders.sh script without any parameters, or with the
help parameter, has a similar effect:
$ sh IndexSubmissionFolders.sh help

The following table lists the parameters that can be passed when running the IndexSubmissionFolders
script.

Parameter Description
–docbase_name Specifies the name of the repository to connect
to. Required.
–user_name The Documentum user name to use to connect
to the repository. The user account must
have Documentum superuser privileges. It
is recommended that the installation owner
account be used, since a password does not
need to be specified in that case (trusted
authentication is used if the script is running
on the Documentum Server as the installation
owner). Required.
–password Optional password to use to login to
Documentum, if the installation owner account
is not used. The password is not masked.
–id Object ID of a context object, that is, an SRF,
RARF, or PRF associated with the submission or
eCTD sequence folders to be indexed.

Optional — the submission or eCTD sequence folders to be indexed can be specified in terms of a DQL query if necessary.
–if Precondition in the form of a DQL qualifier
to apply to the context object. If the specified
precondition (if any) is not satisfied, no action
is taken. For example, if the context object is a
RARF, the precondition -if "a_status = 'Active'
and enable_submission_import = true" ensures
processing does not apply to inactive regulatory
applications, or those for which submission
importing is disabled.

Optional — if undefined, or where a context object is not specified, processing continues.

–submission_folders DQL "select" statement identifying the top-level
submission folders in the repository containing
the documents to be indexed. Where specified,
the query must return the r_object_id values of
the top-level submission folders to be indexed
only (that is, the eCTD sequence folders or
top-level non-eCTD submission folders) in the
appropriate submission number or sequence
number order.

Required if a context object is not specified in the -id argument; otherwise, it is Optional — in that case, by default, all submission or sequence folders associated with the relevant registration form are processed.
Note:
• Any submission folder object types can be
indexed although they must have the relevant
required attributes defined for them. The
standard submission folder object types used
by LSSSV are cd_ectd_submission_folder,
for eCTD sequence folders, and
cd_nees_submission_folder for non-eCTD
electronic submissions (NeeS).

• An application (indicated by the application_description key attribute value) cannot have both eCTD sequences and NeeS folders associated with it. If necessary, create two distinct applications (RARFs) with distinct application description keys, one for the eCTD sequences and one for NeeS folders, referring to the same health authority, application number, and so on. Separate the eCTD sequence folders from the NeeS folders accordingly.

–required_attrs Comma-delimited list of mandatory attributes
that must be defined as non-null or non-blank
values on each submission folder, in addition to
the standard required attributes, which are:

• product_code

• application_description

• application_num

• application_type

• region

• country_code

• health_authority

• is_ectd_submission

• submission_number

• submission_type

These are the minimum required attributes. To include additional required attributes, specify them in this argument. Submission folders with missing required attribute values are logged as erroneous and skipped.

Optional — by default, only the standard attributes are required.
–create_missing_srfs If true, SRFs are generated automatically
for each submission folder where necessary.
Otherwise, the SRFs are expected to exist in
the repository, and any submission folders that
do not have corresponding SRFs are logged as
warnings and skipped.

Optional — the default is false. Note that irrespective of this setting, the associated PRFs and RARFs must exist. These are not auto-generated. However, "$CURRENT" SRFs representing eCTD current or cumulative views are always auto-generated as hidden objects where necessary (used internally by LSSSV).

–apply_d2_configs If true, D2 autonaming, autolinking, and security
are applied to newly-created SRFs and re-applied
to modified SRFs. Otherwise, any new SRFs are
created in the temporary repository folder, with
the default object name set to <application-no>
- <submission-no> - <submission-type> or
<application-no> - Current - Cumulative (the
"ApplyD2Configurations" utility can be used to
process these subsequently).

Optional — the default is true.


Note: This option can only be used on LSSSV
16.4 installations or later. On 4.x versions, it is
not supported and must be explicitly disabled.

–custom_rarf_inherited_attributes Comma-delimited list of custom attributes to
be inherited from the RARF to the imported
submission folders, subfolders, and documents
as default values, in addition to the standard
RARF-inherited attributes, as and where
applicable. Optional.

The standard RARF-inherited default attributes are:

• application_status

• submission_procedure_type

• concerned_member_states

• submission_date

• submission_status

• submission_pub_path

• product_chemical_names

• drug_substance_name

• drug_substance_manufacturer

• drug_product_manufacturer

• dosage_form

• dosage_strength

• indication

• product_generic_name

• product_trade_name

• product_trade_country

• inn_names

• product_compound_id

–override_rarf_attributes Whether or not the attributes are to be updated
automatically on the RARF if they differ from
the submission-specific values extracted from
the eCTD backbone or regional XML files.
In either case, the discrepancies between the
currently-defined RARF attributes and those of
the imported submission are logged.

Optional — the default is false.


–case_sensitive_folders Whether or not folder names for custom
<node-extension> folders (including Study
Tagging File sub-folders used to group study
reports) are regarded as case-sensitive when
combining or merging them into the cumulative
or current views.

Optional — the default is true.


–reindex If true, the specified submission folders are
re-indexed even if they have already been
indexed or imported through the Submission
Import process to regenerate the Submission
Viewer XML models and preview renditions.
This may be necessary as part of an LSSSV
upgrade process, for example, following a
change to the internal XML format or stylesheets
used by the Submission Viewer, requiring legacy
Submission Viewer XML models to be updated.

Optional — the default is false, meaning that submission folders that have already been indexed or imported into LSSSV are skipped from processing. Leaving this option disabled enables new submissions to be indexed incrementally in an efficient way.

–out_of_sequence If true, the eCTD sequence folders are not
necessarily specified in sequence number order,
and may have other following sequences that
have already been indexed which are not
included in the sequences to be indexed in this
batch. In that case, the submission views for the
following sequences may need to be updated, if
they contain “modified-file” references to files
in the sequences to be re-indexed. Note that
this option only applies to eCTD sequences that
have already been at least partially indexed. If
all sequences are included in the appropriate
order, and none have yet been indexed (or the
-reindex option is enabled), this option should
be disabled, to eliminate redundant processing.

Optional — the default is false.

–process_pdf_links Determines how submission documents with
PDF files as their primary content format should
be processed, as follows:

• all: All PDF files are exported and processed. Those that contain inter-document hyperlinks ("/GoToR" or "/Launch" actions) are rewritten to use D2 download URL links instead, which are navigable in the LSSSV document preview widget. The modified PDFs are added as pdf_preview renditions, and the LSSSV view models are adjusted accordingly. The original PDF files are preserved as the primary content files in the repository, for submission export purposes. Use this option if the original PDF files contain, or potentially contain, inter-document hyperlinks that have not been processed. Enabling this option can incur significant additional file processing overheads.

• refresh: PDFs are exported and reprocessed only if there is already a pdf_preview rendition for the associated document in the repository. Use this option to migrate existing PDF links that have previously been processed by LSSSV. For example, following an upgrade to LSSSV that requires legacy PDF links to be updated to use a new D2 download URL format.

• none (default): The existing PDF files are not reprocessed. Any documents that have pdf_preview renditions in the repository are assumed to contain navigable D2 URL links that are still valid. Those that do not are assumed not to contain any inter-document links at all and the native PDF content files are used for previewing in those cases.

–index_documents_and_folders Whether or not the individual submission
documents and folders are also to be indexed.
If true, metadata inherited from the RARF or
SRF, or that which is extracted from eCTD XML
backbone or regional M1 XML files, is applied
to the submission documents and folders in
the repository. Not only does this facilitate
searching, but it also enables eCTD lifecycle
operations, such as modified-file references,
to be resolved correctly. XML cross-reference
renditions are also attached to each document
to enable selected documents to be opened
in LSSSV in the relevant submission view.
If the -reindex option is also enabled, the
existing Xref renditions (if any) are replaced.
Otherwise, they are added only where they
are currently missing. Indexing of submission
documents and folders may be required when
migrating legacy submissions to ensure that the
document or folder properties in the repository
are consistent with the top-level submission
folders or SRFs, and to enable eCTD lifecycle
operations to be displayed and navigated
correctly in the Submission Viewer. This is
usually a one-off requirement at migration time.
After the submission documents and folders
have been indexed, this metadata is generally
static, and subsequent re-indexing is not usually
required unless the submission itself has been
re-imported with revised metadata.

Optional — the default is false.


–document_object_type Specifies the object type used for submission
documents.

Optional — the default is cd_submission_element (the standard document object type used by LSSSV).
–subfolder_object_type Specifies the object type used for submission
sub-folders below the top-level eCTD or NeeS
submission folder.

Optional — the default is cd_submission_subfolder (the standard folder object type used by LSSSV).

–delete_empty_folders Whether or not empty subfolders should be
deleted from the repository prior to indexing, so
they do not appear in the Submission Viewer.

Optional — the default is false.


–submission_element_creation_profile Specifies the name of the creation profile to be
used to create imported submission elements
in the repository, such as missing Submission
Registration Forms.

Optional — if unspecified, the default creation profile, Submission Elements, is used.
–regional_metadata_conversions_dictionary Specifies the name of a D2 dictionary used to
translate metadata values extracted from eCTD
regional M1 XML files into corresponding
values used in the Life Sciences solution, for
example, to convert Health Authority codes and
submission type codes extracted from the XML
file into valid dictionary key values.

Optional — the default dictionary is Submission Regional Metadata Conversions. The keys in this dictionary should be specified in the form "<attribute-name>.<XML-value>" (for example, "health_authority.FR-ASSAPS"), and the "Converted Value" alias provides the translated value (for example, "AFSSPS").

–rerender_views_only If true, the existing Submission Viewer XML
models for previously imported or indexed
submissions are assumed to be up-to-date
and are not regenerated. However, the
precached HTML files that provide the views
are regenerated using the latest XSL stylesheets.
Use this option to update the submission views
when new versions of the LSSSV XSL stylesheets
are installed but no other changes are required.
Note: This option overrides all other settings and cannot be used in conjunction with the –create_missing_srfs, –override_rarf_attributes, –out_of_sequence, –process_pdf_links, -index_documents_and_folders, –delete_empty_folders, or –attach_dump_files options. The –reindex option is implied when this setting is used. However, submission folders that have not yet been indexed are not processed.

Optional — the default is false.


–temp_dir The path of a temporary directory in the
backend file system on the Documentum Server
to use for temporary working files. If distributed
processing is used, the same temporary directory
path must be valid on each Documentum Server.

Optional — if unspecified, the temporary working directory of the Documentum Server operating system is used.
–temp_repository_folder The cabinet or folder path of a staging folder
in the Documentum repository in which new
registration forms are created initially. D2
auto-linking is then applied to move these to the
appropriate location.

Optional — the /Temp cabinet is used by default.
–append_logs Whether to append the index log files to the
existing RARF logs in the repository, or to
overwrite them. The indexing logs are attached
as "crtext" or "text" renditions to the RARF for
each application.

Optional — the default is false (replace existing logs).

–attach_dump_files If true, additional diagnostic files are generated,
zipped, and attached to the RARF for each
application as additional renditions. In each
case, the zip file contains the following dump
files, as and where applicable:

• A Submission Metadata Info file for each submission folder processed, containing details of the Documentum metadata applied to each document or folder in the repository.

• A PDF Xref Info file for each submission, containing details of inter-document hyperlinks between PDF files, where applicable (including broken or unresolved hyperlinks).

• An eCTD Modified File Refs file for eCTD sequences containing modifications to files in preceding sequences (including unresolved references).

• An eCTD Study Tagging File Info file for eCTD sequences containing Study Tagging Files.

These dump files are written in the CSV format (comma-separated values) which can be viewed in Microsoft Excel. The information provided in these files, together with the text log file rendition summarizing the indexing process, may be useful for reporting, validation, and diagnostic purposes.

Optional — the default is false.

–max_recorded_errors_per_file Limits the number of errors recorded in the
repository for each document to the specified
number, to prevent swamping where files have
many errors (for example, numerous broken
hyperlinks). If the limit is exceeded, a summary
message is recorded, indicating the number
of additional errors omitted; however, all of
the errors are recorded in the import log file
attached to the RARF.

Optional — the default limit is 10 errors per file.
Note: Setting the error limit to 0 disables
error recording in the repository completely,
which may improve the performance of import
operations slightly, but will make it harder for
users to identify submission documents with
errors.
–debug If true, the diagnostic dump files are generated
and retained in the working directory, along
with any other temporary files generated as
part of the indexing process. This includes
any submission files that were exported for
processing, to facilitate debugging. Otherwise,
temporary files are deleted automatically on
completion of the indexing process even if the
process fails.

Optional — the default is false. Enabling this option is not recommended in production environments, as it will cause temporary files to accumulate in the working directory, which may eventually fill the available disk space and cause system failures.

The following additional parameters can be used to control distributed or parallel processing:

Parameter Description
–max_threads The number of threads to use for local parallel
processing of submission folders for distinct
regulatory applications on each server. The
default is 5.
Note: The submission folders for each
application are processed serially by one master
thread, in the appropriate order. As each
submission or eCTD sequence is indexed, the
changes are committed to the repository within a
database transaction. This protects the integrity
of the submission view models and enables the
process to be restarted in an incremental fashion
if necessary.
–max_file_processing_threads Specifies the number of parallel processing
sub-threads to use for local file processing of
each submission folder including XML file
processing and PDF hyperlink processing,
where enabled.

Optional — the default is 1.


–use_private_sessions Whether or not each parallel processing thread
should establish its own DFC session to the
repository. If disabled (the default setting),
a single DFC session is used by all threads
running on the local Documentum Server.
Enabling this setting can improve performance
by reducing conflicts between threads, at the
expense of additional memory, CPU, and
database resources.
–content_servers Specifies a list of Documentum Server names
(that is, dm_server_config object names) to
use for distributed processing in multi-node
environments. The wildcard symbol “*” can
be used to denote all available Documentum
Servers. If undefined, the local Documentum
Server only is used (the default setting).

–distributed_processing_threshold Denotes the minimum batch size for distributed
processing in multi-node environments. Batches
that are smaller than the specified size are always
processed locally, avoiding the overheads in
invoking distributed processes when there are
relatively few repository objects to be processed.
If set to 0, distributed processing is disabled (the
default setting).
–shared_folder Specifies an optional network folder path or
dm_location object name of a shared folder to
use for transmitting data efficiently between
remote processes when distributed processing
is used. The shared folder must be accessible
from each Documentum Server in read/write
mode. If undefined, process data is transmitted
via binary objects stored temporarily in the
Documentum repository. This is not as efficient
but obviates the requirement for a shared folder
to be established in the Documentum server
environment.
It is possible to configure the default settings for these parameters in the System Parameters dictionary
in D2-Config to suit the environment, rather than specifying them explicitly on the command line.
The exception is –max_file_processing_threads, which is not currently supported by the System
Parameters dictionary as it is specific to this method.
The following is an example of a Windows command that indexes all submissions below the
/Submissions/Legacy folder:
C:\Users\dmadmin> IndexSubmissionFolders.bat -docbase_name documentum
-user_name dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy',
descend) order by application_description, submission_number"
-reindex true -create_missing_srfs true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true
-override_rarf_attributes true -attach_dump_files true
In the above, C:\Users\dmadmin> is the Windows command prompt, indicating the folder where
the scripts are installed. The parameters are the same for UNIX/Linux:
$ sh IndexSubmissionFolders.sh -docbase_name documentum -user_name
dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy',
descend) order by application_description, submission_number"
-reindex true -create_missing_srfs true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true
-override_rarf_attributes true -attach_dump_files true
To process submissions incrementally (skipping those that have already been indexed), disable or
remove the -reindex argument.

To index or re-index submission folders related to a particular product or application, or the
submission folder for an individual pre-registered submission, you can specify the Documentum
object ID of the relevant PRF, RARF, or SRF in the –id argument. A –submission_folders query does
not need to be specified in that case, as it is implied. For example:
C:\Users\dmadmin> IndexSubmissionFolders.bat -docbase_name documentum
-user_name dmadmin -password "" -id "09de75d180154cec" -reindex true
-create_missing_srfs true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true -override_rarf_attributes
true -attach_dump_files true
In this example, if the –id "09de75d180154cec" argument refers to a RARF, the relevant submission
folders for that RARF (with the appropriate product_code and application_description key values)
are processed, in submission_number order.
When passing an SRF in the –id argument in order to re-index an individual submission folder, it
may be necessary to enable the –out_of_sequence option in case there are following sequences
which have already been indexed containing “modified-file” references to files in that sequence. The
affected following sequences are then also re-indexed automatically to update the “modified-file”
references as necessary. For example:
C:\Users\dmadmin> IndexSubmissionFolders.bat -docbase_name documentum
-user_name dmadmin -password "" -id "09de75d180152479" -if "is_imported
= true" -reindex true -out_of_sequence true -process_pdf_links all
-max_file_processing_threads 3 -index_documents_and_folders true
-override_rarf_attributes true -attach_dump_files true
In this case, –id "09de75d180152479" refers to an SRF for an intermediate eCTD sequence that may
have following sequences modifying its files, so the -out_of_sequence option is enabled, accordingly.
The –if "is_imported = true" argument is an example of a precondition applied to the context object
specified in the –id argument. In this case, it is to avoid unnecessary processing if the SRF is for a
pre-registered submission that has not been imported or indexed. The –create_missing_srfs option is
not required here, because the SRF already exists.
If PDF hyperlinks do not need to be processed, specify –process_pdf_links none (or remove that
argument completely). This will improve the performance of the indexing process significantly, as
it will not have to download and analyze the PDF files in the affected submissions in that case.
However, if the PDFs contain inter-document hyperlinks that have not been processed, the links will
not be navigable in the LSSSV Submission Viewer. Users will need to export the relevant submission
folders to the desktop in order to navigate them. Using –process_pdf_links refresh is a good
compromise when the hyperlinks have already been identified, but the existing “pdf_preview”
renditions need to be updated for some reason, for example, to take account of broken hyperlinks
referring to previously-missing files that can now be resolved, changes to the D2 application server
host name, or changes to the D2 download URL format used in LSSSV, and so on. In that case, only
PDF documents with existing “pdf_preview” renditions are reprocessed.
Increasing the –max_file_processing_threads parameter can improve the indexing rate, although
it tends to become I/O-bound very quickly, at which point increasing the number of threads any
further is counter-productive. If the submission folders to be indexed span multiple regulatory
applications, they can be processed in parallel. Increasing the –max_threads parameter and enabling
the –use_private_sessions option can improve the indexing rate in that case.
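
As an illustration only, a run that indexes submissions for several applications in parallel and allows distributed processing across all available Documentum Servers might combine these options as follows (the thread count and threshold are example values, not recommendations):

C:\Users\dmadmin> IndexSubmissionFolders.bat -docbase_name documentum
-user_name dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy',
descend) order by application_description, submission_number"
-max_threads 10 -use_private_sessions true -content_servers "*"
-distributed_processing_threshold 5 -index_documents_and_folders true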

Preparing Legacy Submissions for Indexing


The following guidelines should be considered when migrating legacy submissions into a form
that can be indexed and processed by LSSSV:

1. Legacy submissions can reside anywhere in the repository. However, it is useful to organize
them into a specific cabinet or folder so that they can be identified easily; for example,
/Submissions/Legacy. They can co-exist with submissions that have been imported by
LSSSV, which in the standard installation are auto-filed as follows:
a. European submissions using the National submission procedure:
/Submissions/<product_code>/<region>/<country>/<application
_description>
b. European submissions using the Mutual Recognition Procedure (MRP):
/Submissions/<product_code>/<region>/Reference Member State
/<country>*/<application_description>; /Submissions/<product
_code>/<region>/Concerned Member State/<country>*/<application
_description>
*Cross-linked to each applicable country folder.
c. All other submissions:
/Submissions/<product_code>/<region>/<application_description>
2. Each submission or eCTD sequence should reside in a self-contained folder, directly below
the regulatory application level folder. It is recommended that the submission number or
eCTD sequence number is used for the top-level folder name for consistency with LSSSV. The
subfolders and documents within each submission or sequence folder should reflect the content
of the original submission. However, it is not necessary to include the Windows file extensions in
document object names (such as “.pdf” suffix) as this is implied by the content format code in
Documentum, which should be set appropriately in each case, for example, “pdf” for Acrobat
PDF files; “xml” for XML files, and so on. The original submitted files must be provided as the
primary content file of each document in Documentum. Additional preview renditions may be
generated at indexing time for internal use by LSSSV, but these are not considered to be part of
the original submission. They are not exported to the user’s desktop when the D2 “export folder”
function is used (only the primary content files are exported in that case).
3. The object types used for legacy submission folders and documents should be converted to the
appropriate LSSSV object types if necessary, as follows:
a. The top-level submission folder for an eCTD sequence should be converted to
cd_ectd_submission_folder, and the top-level submission folder for a non-eCTD electronic
submission (NeeS) should be converted to cd_nees_submission_folder.
b. In both cases, all subfolders below the top-level submission or sequence folder should be
converted to cd_submission_subfolder.
c. All submission documents should be converted to cd_submission_element.
This can be accomplished through a series of “change object” DQL statements.
Application-level folders (and above) can be standard dm_folder object types, as they do not
require any specific metadata in LSSSV. It is recommended that the application_description
value for the corresponding RARF be used for application-level folder object names for
consistency with LSSSV. It is also recommended that the submission number or eCTD sequence
number is used for top-level submission or sequence folder object names.
4. The minimum required metadata should be populated for each top-level submission or sequence
folder, as follows:
• product_code
• application_description
• application_num
• application_type
• region
• country_code
• health_authority
• is_ectd_submission
• submission_number
• submission_type
When indexing eCTD sequences, some of the required metadata may be specified in regional
XML files, such as application_number, application_type, submission_type and submission_date,
which may be difficult to extract at migration time as it depends on the XML DTD format used in
each case. However, the indexing tool supports automatic extraction of metadata from eCTD
XML files, just as for submission import operations in LSSSV. If the XML-extracted metadata or
sequence folder metadata differs from the SRF, the discrepancies are logged as warnings, and
the SRF is updated accordingly. It is also possible to apply the changes to the RARF as well,
through the –override_rarf_attributes setting. Thus, at migration time, the relevant fields can
be populated on the RARFs and top-level sequence folders with dummy values initially, for
example, application_num = “TBD”, to be replaced at indexing time with the appropriate values
from the regional XML files. However, in the case of non-eCTD electronic submissions (NeeS), all
of the required metadata must be provided on the RARF and top-level submission folders, since
there are no standardized XML files for these.
It is not strictly necessary to populate this metadata on the submission subfolders and
individual submission documents, as this can be done at indexing time through the
–index_documents_and_folders option.
5. The relevant PRFs and RARFs must be created in the repository for each legacy submission
or eCTD sequence, with the appropriate metadata, prior to indexing. However, it is
not necessary to create SRFs as these can be generated automatically at indexing time
by enabling the –create_missing_srfs option. The SRF metadata is populated from the
submission folder metadata in that case. It is also not necessary to link the RARF to the
corresponding application-level folder containing the legacy submission / sequence folders,
as this is done automatically at indexing time if necessary. Submission folders with invalid
application_description values cannot be indexed. These are logged as errors and skipped at
indexing time. However, it is acceptable for submission or sequence folders to refer to RARFs that
are in the “Inactive” state. In that case, LSSSV does not allow additional submissions / sequences
to be imported until the Regulatory Managers re-activate the application by reverting the RARF
to the “Active” state. This enables legacy submissions to be locked down as archived copies.

6. Role-based security for legacy submissions should be set up as follows:


• The default role-based security settings are configured at the application level on the RARF,
as follows:
— form_managers: Users and groups acting as Application Managers (these users can
modify the properties of the RARF, including the role settings).
— form_users: Users and groups acting as Application Users (with read-only access to the
RARF, they can create documents associated with the application but cannot change the
application-level properties).
— submission_managers: Users and groups acting as Submission Managers (these users can
import submissions, modify the properties of the SRFs, and manage access to individual
submissions).
— submission_archivists: Users and groups acting as Submission Archivists (these users can
import submissions, but cannot modify the properties of the SRFs or manage access to
individual submissions).
— submission_users: Users and groups with read-only access to the submissions related to
this application by default.
• The default role-based security settings for individual submissions can be overridden by
creating SRFs explicitly and setting attributes as follows:
— form_managers = Submission Managers (as above)
— form_users = Submission Users (as above)
• The relevant Submission Managers and Submission Users roles should also be applied to the
folder_managers and folder_users roles on the top-level submission or eCTD sequence folder
and its subfolders, respectively. If necessary, access to specific parts of the submission can
be modified by changing these settings as and where appropriate; for example, to restrict
access to the m3-quality section of an eCTD.
• At the document level, the doc_coordinators role should be set to the Submission Managers,
and the readers role to the Submission Users. In practice, both roles are currently given
read-only access to the submission documents, as they are immutable archive copies, although
support for importing Draft submissions may be provided in the future. In each case, the
category value (control category) should be set to 3, and the a_status value should be set to
Effective.
The D2 security configurations should then be applied to the relevant registration forms, folders
and documents, as appropriate.
Note: The indexing method itself does not support an option to apply D2 security to legacy
submission folders and documents, although it does apply D2 security to any new SRFs generated
as a result of the indexing process when the –create_missing_srfs option is used. However, the
Apply D2 Configurations utility can be used to apply D2 security to legacy submission folders
and documents as part of the migration process, prior to indexing.
7. To support XML DTD schemas in eCTDs that are not supported by LSSSV by default, it is
necessary to define additional cd_ectd_schema_config objects and configure them as appropriate,
so that the XML files can be recognized and processed correctly. This may be necessary in
order to index legacy eCTDs using older DTD versions, or for regions that are not supported
by LSSSV as standard. It may also be necessary to add entries to the standard Submission
Regional Metadata Conversions dictionary in order to transcode metadata extracted from legacy
regional XML files into the appropriate values expected by LSSSV. For more information on these
procedures, see the Documentum for Life Sciences Administration Guide.

Example
The following example demonstrates how submissions can be imported from the local file system
into Documentum as standard dm_document and dm_folder object types initially, which are then
converted and prepared for indexing. In practice, these steps would normally be carried out as part
of the submission migration process.
In this example, the DQL Tester tool is used to import the sample submission folders, although
migration tools such as the Enterprise Migration Appliance (EMA) may be more suitable in practice.
Note: It is recommended that the repository is set up to use parent folder ACLs for new repository
objects by default, instead of the user’s default ACL. Ensure the top-level cabinets have the correct
ACLs assigned to them, so that any new sub-folders created during the migration process inherit
the same ACLs by default. For more information, see the Documentum Server Administration and
Configuration Guide.

1. Create the necessary PRFs, QPRFs, and RARFs for each distinct application for the submissions
to be processed. One way to accomplish this is to use DQL statements to create the registration
form objects in the “Temp” cabinet initially, and then use the “Apply D2 Configurations” utility to
apply D2 auto-linking and security to them, in order to build out the folder structure and move
the registration forms into the relevant folders in accordance with the D2 configurations.
For example, to create a new RARF, you could use a DQL statement such as:
create cd_reg_admin_info object
set object_name = 'ExternalRARF01',
set product_code = 'WonderDrug',
set application_description = 'ExternalRARF01',
set is_ectd_submission = true,
set application_num = 'TBD',
set application_type = 'CTA',
set region = 'EU',
set country_code = 'FR',
set health_authority = 'AFSSPS',
set a_status = 'Active',
append form_managers = 'cd_admingroup',
append form_users = 'docu',
link '/Temp'

Then, run the Apply D2 Configurations utility with the appropriate arguments:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "cd_reg_admin_info where
folder('/Temp')" -auto_naming false -auto_linking true -security true
It is not necessary to apply D2 auto-naming here because the RARF object name is set explicitly
to the application_description value, following the standard D2 naming conventions. When
applying D2 configurations to documents, it is recommended to apply D2 auto-naming with the
–new flag enabled, even if the document object names are set up correctly, in order to initialize
any sequence counters used in the D2 auto-naming rules such as document_id values. However,
sequence counters are not used in the standard configurations for registration forms.

It is not strictly necessary to create SRFs for each submission or eCTD sequence folder, as
these can be created automatically at indexing time through the –create_missing_srfs option.
However, doing so can facilitate application of the correct metadata and role-based security to
the submission folders and documents, as this can be inherited from the SRFs in each case,
as and where appropriate.
If the relevant target folders can be identified and created in advance as part of the migration
process, the new registration forms can be linked directly to the relevant folders at creation
time (for example, /Regulatory Application Library/WonderDrug/EU in this case),
obviating the need to apply D2 auto-linking in the above. This is recommended for large-scale
migration exercises where it is feasible to do so, as it is much more efficient than applying D2
auto-linking en masse.
2. Create the top-level folders and sub-folders in the repository into which the sample
submissions will be stored, including folders for each distinct application, for example,
/Submissions/Legacy/WonderDrug. These can be created as standard dm_folder object
types; use the application_description value as the application folder name, following
the standard LSSSV folder naming conventions. Note that if a regulatory application has
both NeeS and eCTD sequences associated with it, it is necessary to create two RARFs –
one for the NeeS folders, and a separate one for the eCTD sequence folders, with distinct
application_description key values, although the RARFs can refer to the same application
number – and to create two corresponding application-level folders, accordingly; for example,
/Submissions/Legacy/WonderDrug (eCTD) and /Submissions/Legacy/WonderDrug
(NeeS).
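
For instance, an application-level folder can be created with a simple DQL statement such as the following sketch, which assumes that the /Submissions/Legacy folder structure already exists:

create dm_folder object
set object_name = 'WonderDrug',
link '/Submissions/Legacy'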
3. Import the sample submission folders into the appropriate application-level folders as necessary,
using the standard Documentum dm_folder and dm_document object types initially, to simulate
migration from a legacy repository that does not use the LSSSV object types.
Using the DQL Tester utility, this can be accomplished as follows:
a. Launch DQL Tester and connect to the repository as the Documentum installation owner
(for example, dmadmin).
b. Open the Docbase Browser tool and navigate to the top-level folder into which the
submissions are to be imported.
c. Right-click on the top-level folder and select Import File.
d. In the File Importer dialog box, click Folder, and select the relevant file system folder
containing the submission folders or eCTD sequence folders to be imported. Verify that the
list of files to be imported is correct.
e. In the Object Name list, select Use File Name.
f. Ensure that the Create Folder Structure option is selected.

g. Click Import to initiate the import process.


When the import process is complete, you can verify that the submission folder structure has
been imported correctly through the Docbase Browser.

4. Verify that the primary content format (a_content_type) settings for each document are set to
the appropriate Documentum content format names in each case. For example, DQL Tester
assumes that the most recently-created dm_format object with the appropriate DOS extension
is the one that applies to each file, which can cause incorrect content type assignments. For
example, for XML documents, a_content_type may be set to “d2_bin_dump” instead of “xml”,
and for Acrobat PDF documents, it may be set to “pdflink” instead of “pdf” (this can be verified
by clicking on imported files of various types in the Docbase Browser and checking the Content
Type settings). To fix these, use bulk DQL update statements such as:
update dm_document objects
set a_content_type = 'xml'
where folder('/Submissions/Legacy/WonderDrug', descend)
and a_content_type = 'd2_bin_dump'
go

update dm_document objects
set a_content_type = 'pdf'
where folder('/Submissions/Legacy/WonderDrug', descend)
and a_content_type = 'pdflink'
go
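
To review which content formats have been assigned, both before and after running the updates above, a grouped DQL count can be used. This is a sketch that assumes the same legacy folder path:

select a_content_type, count(*)
from dm_document
where folder('/Submissions/Legacy/WonderDrug', descend)
group by a_content_type
go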

5. Convert the object types to the types expected by LSSSV using a series of DQL statements, as
follows:

a. First, convert all folders and subfolders below the application-level folder from dm_folder to
cd_controlled_folder object types (the common parent folder type used for all submission
folder types in LSSSV). For example:
change dm_folder object to cd_controlled_folder where folder('
/Submissions/Legacy/WonderDrug', descend)
b. Next, change the top-level submission folders immediately below the application-level folder
(only) from cd_controlled_folder to cd_submission_folder:
change cd_controlled_folder object to cd_submission_folder where
folder('/Submissions/Legacy/WonderDrug')
c. Next, change the same top-level folders to either cd_ectd_submission_folder or
cd_nees_submission_folder. In this case, the RARF is for an eCTD, so we use:
change cd_submission_folder object to cd_ectd_submission_folder
where folder('/Submissions/Legacy/WonderDrug')
d. Next, change the submission folders below the submission or sequence-level folders from
cd_controlled_folder to cd_submission_subfolder:
change cd_controlled_folder object to cd_submission_subfolder where
folder('/Submissions/Legacy/WonderDrug', descend)
Note: This does not include the top-level submission or sequence folders, which are no
longer cd_controlled_folder object types, but sub-types of that type.
e. Finally, change the submission documents from dm_document to cd_submission_element:
change dm_document objects to cd_submission_element where
folder('/Submissions/Legacy/WonderDrug', descend)
6. Set up the minimum required metadata at the top-level submission or sequence folder level:
update cd_ectd_submission_folder objects
set product_code = 'WonderDrug',
set application_description = 'A1234567',
set submission_number = object_name,
set submission_type = 'Initial MAA',
set is_ectd_submission = true,
set application_num = 'TBD',
set application_type = 'CTA',
set region = 'EU',
set country_code = 'FR',
set health_authority = 'AFSSPS'
where folder('/Submissions/Legacy/WonderDrug')

Note:
• This assumes that the top-level submission folder names are the same as the corresponding
submission or sequence numbers, following the LSSSV folder naming conventions.
• It is only strictly necessary to apply this metadata to the top-level submission or sequence
folders, as the indexing tool supports an –index_documents_and_folders option, which can
be used to synch-up the metadata on the relevant submission folders and documents at
indexing time.
• For eCTDs, some of this metadata can be extracted from regional XML files at indexing time
(for example, application_num, submission_type, and so on) in which case the relevant
values can be set to “TBD” or “Pending” initially; they are overridden when the submissions
are indexed. For NeeS, or where the metadata is not defined in the eCTD regional XML files,
the values must be specified explicitly in advance, or copied from the relevant RARF.
7. To set up role-based security on the submission folders, subfolders, and documents:
a. For the submission folders and subfolders, set the folder_managers and folder_users roles to
the appropriate users and groups:
update cd_controlled_folder objects
truncate folder_managers,
append folder_managers = 'cd_admingroup',
truncate folder_readers,
append folder_readers = 'docu'
where folder('/Submissions/Legacy/WonderDrug', descend)

This assumes that the legacy submissions are available to all users across all domains as
read-only artifacts, and that the Controlled Document Administrators (cd_admingroup) are
responsible for managing access to them. Adjust these groups as necessary. It also assumes
that the same role-based security applies to all folders and documents in the submission. If
necessary, it can be refined for specific folders within a submission in top-down fashion.
b. For the submission documents, set the doc_coordinators and readers roles to the appropriate
users and groups, and also set the document control category to 3 and the lifecycle state (a_status
value) to Effective:
update cd_submission_element objects
truncate doc_coordinators,
append doc_coordinators = 'cd_admingroup',
truncate readers,
append readers = 'docu',
set category = 3,
set a_status = 'Effective'
where folder('/Submissions/Legacy/WonderDrug', descend)

c. Run the Apply D2 Configurations utility with the relevant parameters to set up D2 role-based
security on these submission folders, subfolders, and documents:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "dm_sysobject where
folder('/Submissions/Legacy/WonderDrug', descend)" -auto_naming
false -auto_linking false -security true
Note: An alternative approach is to configure the default roles on the RARF in advance
and then invoke the Update Security lifecycle state transition on the RARF. This can be
accomplished by running the Apply D2 Configurations utility:
C:\Users\dmadmin> ApplyD2Configurations -docbase_name documentum
-user_name dmadmin -password "" -qualifier "cd_reg_admin_info
where application_description = 'A1234567'" -auto_naming false
-auto_linking false -security false -state_transition "Update
Security"
This mimics the effect of an Application Manager changing the default submission security
at the application level, causing the changes to be cascaded down to the related SRFs,
submission folders, and documents. However, this technique only works if SRFs already exist
for the relevant submission or eCTD sequence folders, so it can only be used after indexing,
or where the SRFs are created explicitly as part of the migration process prior to indexing.

8. The submission folders and documents should now be ready for indexing. Run the indexing tool
to process them with the appropriate parameters:
C:\Users\dmadmin> IndexSubmissionFolders -docbase_name documentum
-user_name dmadmin -password "" -submission_folders "select r_object_id
from cd_submission_folder where folder('/Submissions/Legacy', descend)
order by application_description, submission_number"
-create_missing_srfs true -index_documents_and_folders true -override_rarf_attributes
true -process_pdf_links all -max_file_processing_threads 1
-attach_dump_files true
This indexes all submissions or sequences for all applications in the /Submissions/Legacy
folder that have not already been indexed (if necessary, add –reindex true to the command line
to force them to be reprocessed). You can increase the number of file processing threads to
improve the processing throughput, and if multiple applications are involved, the standard
CDF distributed or parallel processing options can also be used to reduce the overall indexing
time, as described in the preceding section.

Validating the Indexing Process


After indexing, the relevant applications and submissions should be searchable and navigable in the
Submission Viewer, just as if they had been imported through the native LSSSV Submission Import
functionality. Previews of recognized eCTD regional XML files and Study Tagging Files should also
be provided, and inter-document links between PDF files should be navigable in the Submission
Document Preview Widget, as normal.
To validate the results of an indexing operation, allow the indexing process to finish. Then examine
the rendition files attached to the RARFs for the affected regulatory applications. For example, in
DQL Tester, drill down and select a RARF in the /Regulatory Application Library cabinet
for an application that has been indexed, and click the Renditions menu (they can also be accessed
from D2 through the Renditions tab).
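
If preferred, the renditions can also be listed with a simple DQL query. The following sketch assumes that the parent_id value is replaced with the object ID of the RARF to be checked:

select full_format, rendition, set_time
from dmr_content
where any parent_id = '09de75d180154cec'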

The primary content format (rendition number 0) would be xml if the RARF has been indexed (this
provides the internal table of contents view model, with links to each SRF), and there is an html
rendition for this, comprising the pre-rendered HTML table of contents view displayed by the
Submission Viewer (obviating the requirement for the browser itself to render the XML into HTML
on demand, which has been found to be unacceptably slow on some browsers, especially when
viewing large submissions).
The crtext or text rendition (depending on whether the Documentum Server is running on
Windows or UNIX) contains a summary of the most recent indexing operation for each application.
Double-click it to view it in Notepad.
Any errors or warnings logged in this file should be investigated, in case they are significant. In
most cases, warnings relate to incorrect or missing metadata or PDF cross-reference links that could
not be resolved.
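
Where the 16.4 error-tracking attributes described earlier in this chapter (has_submission_errors and submission_errors) have been added to the submission document type, the affected documents can also be located directly with a query along the following lines; this is a sketch, and the folder path should be adjusted to match the repository:

select r_object_id, object_name, submission_errors
from cd_submission_element
where folder('/Submissions/Legacy/WonderDrug', descend)
and has_submission_errors = true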
If the –attach_dump_files option is enabled at indexing time, an additional zip file rendition is
attached to the RARF, containing a set of dump files in Comma-Separated Values (CSV) format:

These CSV files can be opened in Microsoft Excel, providing further details about the indexing of
specific submissions or eCTD sequences. The Submission Metadata Info files, for instance, tabulate
the Documentum metadata associated with each folder and document for a particular submission or
eCTD sequence folder.
This information can be useful for verifying that the correct metadata has been identified in each case,
including metadata extracted from eCTD XML files or inherited from the RARF or SRF, as well as
metadata defined implicitly by the system (for example, eCTD element names). The first data row, for
the relative file path “.”, refers to the top-level submission or eCTD sequence folder itself, and the last
column lists out any indexing errors or warnings for each file, such as broken hyperlinks.
Note: The metadata is not actually applied to the corresponding submission folders and documents in
the Documentum repository unless the –index_documents_and_folders option is enabled. Usually,
this only has to be done once as the metadata is generally static. However, the SRF metadata is always
updated as and where necessary, and likewise the RARF metadata, if the –override_rarf_attributes
option is enabled. Any discrepancies between the current versus expected SRF or RARF metadata
values are logged as warnings.
The PDF Xref Info files contain details about any cross-reference hyperlinks discovered in each
submission or sequence, and whether or not they were successfully resolved:

In the above example, it can be seen that the drug-product.pdf file in the m2/23-qos (Quality
Overall Summary) folder has many cross-reference hyperlinks, including two unresolved links.
On closer inspection of the Target Object Path column, it can be seen that for the two unresolved
links, the target file is specified incorrectly as an absolute path in the PDF file, and not as a relative
path from the folder containing the source PDF file. This is indicative of a publishing error of some
kind. Such links are not generally navigable, even if the files are exported from Documentum to the
user’s desktop.
Note: PDF Xref Info files are only generated for submissions or sequences containing at least one
recognized inter-document hyperlink. Where they are missing, it can be assumed that no hyperlinks
were found, or PDF hyperlink processing was not enabled at indexing time. Also, since Documentum
for Life Sciences 16.4*, inter-document hyperlinks in the PDF bookmark tree (that is, the PDF
“outlines”) are now also processed, just as for hyperlinks represented as page annotations in the PDF
content. In the Xref Info file, bookmark hyperlinks are indicated by a source page number of 0,
whereas page annotation hyperlinks have source page numbers starting from 1.
*Hotfixes or patches may also be available to retro-fit this functionality on previous Life Sciences 4.x
releases. See OpenText My Support for more information.
In the case of eCTDs, for sequence numbers other than 0000 (the initial sequence), an eCTD Modified
File Info file may also be generated, if the sequence contains leaf elements with eCTD lifecycle
operations other than new (for example, replace, append, or delete operations).


In this example, it can be seen that sequence 0002 modifies 6 files submitted in sequence 0000 — 3 files
have new appendices or supplements, and 3 have replacement versions. The last column, Resolved,
indicates whether or not the modified file reference is valid, that is, whether or not the modified file in
the preceding sequence exists in the repository. For resolved references, a dm_relation is created
automatically in the repository between the modified file in the preceding sequence and the new
file in the following sequence, reflecting the eCTD lifecycle operation (just as in the native LSSSV
submission import process), and the links can be navigated in the Submission Viewer.
Note: In the case of eCTD delete operations, a file is not actually provided in the current sequence,
so a dummy placeholder (content-less document) is created in the current sequence folder in the
repository to act as an anchor-point for the dm_relation, and to enable Documentum metadata to be
associated with it. However, these dummy placeholders are only used internally by LSSSV, and are
not included in submission export operations.
If eCTD sequence folders are not indexed in the appropriate sequence number order, it can result
in unresolved modified file references. The links for these will not be navigable in the Submission
Viewer, and the dm_relation objects will be missing. To fix this, the sequences containing the
modified file references need to be re-indexed once the referenced sequences have themselves
been indexed. Enabling the –out_of_sequence option at indexing time forces the indexing tool to
automatically re-index following sequences that have already been indexed. However, this requires
additional processing, so it should only be used when needed (for example, when re-indexing
individual sequence folders). Otherwise, it is best avoided by including a suitable “order by” clause
in the -submission_folders DQL query argument.
If intermediate sequence folders are missing entirely from the repository, it can also result in
unresolved modified file references. In this case, the missing sequences must be imported through
the native LSSSV Submission Import function post-indexing, as and when they become available.
This takes out-of-sequence imports into account and automatically fixes unresolved modified file
references in previously-imported following sequences.
Finally, if an eCTD sequence contains Study Tagging Files (STFs), an eCTD Study Tagging File Info
CSV file is also included in the ZIP, tabulating the metadata extracted from each file.


STFs provide additional metadata relating to non-clinical and clinical study reports. In LSSSV, this
information is used to reorganize the study documents in the relevant study folders to make them
easier to navigate (they are grouped by study, document type and site ID, where specified, instead of
presenting them as a flat list of files) and to manifest the additional study-related metadata in the
Submission Viewer. The CSV file records all of the extracted metadata, and the last column,
Resolved, indicates whether or not the links in the STF (the leaf document references) are valid.
Note: An STF can refer to multiple study documents. In that case, the common study-level metadata
is recorded only for the first document (with leaf number 0), in order to avoid duplication and to
make it easier to interpret the spreadsheet. Subsequent rows are then written for each additional
study document, showing the document-level metadata only in the right-hand columns.

Identifying Applications, Submissions, and Submission Documents with Errors

From 16.4 onward, LSSSV records the number of submission errors and warnings detected for each
submission or eCTD sequence in the submission_error_count and submission_warning_count
attributes on the SRF, respectively. The total number of submission errors and warnings for the
application are also recorded in these same attributes on the RARF. This applies to both import and
indexing operations.
These attributes are not shown on the standard SRF or RARF properties pages. To identify
applications with imported submissions that may need attention, you can run a DQL query such
as follows:
select r_object_id, application_description, submission_error_count from
cd_reg_admin_info where submission_error_count > 0 order by 2
Similarly, to identify individual submissions with errors, you can use a DQL query of the form:
select r_object_id, application_description, submission_number, submission_error_count
from cd_reg_submission_info where submission_error_count > 0 order by 2, 3
Submission errors may also be recorded on the imported submission documents themselves,
depending on the -max_recorded_errors_per_file server method argument (described previously).
If this is set to a value greater than zero, errors are recorded in the submission_errors repeating
attribute (for example, invalid hyperlinks). The has_submission_errors flag is also set to true for
the relevant documents. Therefore, to identify individual submission documents with errors, use
a DQL query such as the following:
select r_object_id, application_description, submission_number,
ectd_element_name, filename, submission_errors from cd_submission_element
where has_submission_errors = true order by 2, 3, 4, 5
If a document has more than the specified maximum number of errors, the submission_errors
repeating attribute is truncated. A summary message is appended indicating the number of
additional errors that were omitted for brevity. This helps to prevent an excessive number of errors
from being recorded for files that are badly broken (for example, those that contain many broken


hyperlinks). However, all the errors are recorded in the indexing or import log file attached to the
RARF as a text rendition, for reference.

Chapter 19
Life Sciences Software Development Kit

This section provides an overview of the Life Sciences solution Software Development Kit (SDK).

Overview
The Life Sciences solution Software Development Kit (SDK) enables System Integrators and
Developers to integrate an individual solution such as Documentum for eTMF, Documentum for
Research and Development, or Documentum Submission Store and View with an existing external
application or management system. Developers can develop plugins for the management system in
Java and use the SDK to synchronize the Documentum repository in response to the management
system events.
The SDK exposes basic functions of the solution, such as creating registration forms, through a Java
API and uses standard Documentum Foundation Services (DFS) Web Service calls for integration
purposes. Detailed JavaDocs on the SDK are provided in the javadocs folder of the SDK package.

Web Services Integration


DFS provides a set of Web Services for integrating applications with Documentum in a standard way.
This enables external applications to store, retrieve, and manipulate objects in the repository, execute
queries, and download or upload content files from and to the repository. DFS is a separate WAR file
that has to be deployed for the SDK to use the DFS Web Services. By default, DFS is hosted by the
Documentum Java Method Server (JMS) running under JBoss. However, it is possible to host DFS
Web Services on a separate host if necessary. The DFS documentation available on OpenText My
Support (https://support.opentext.com/) provides more information about installing and using DFS.
DFS provides basic repository services and uses a generic data model for representing repository
objects such as documents and folders. It does not use D2 configurations implicitly, although
it is possible to query D2 configurations and apply them through DFS service calls. It does not
provide data models for Life Sciences solution-specific entities, for example, TMF-specific entities
such as Clinical Trial Registration Forms, TMF documents, and so on. It does not support Life
Sciences solution-specific functionality either. Therefore, to facilitate the development of external
applications that integrate with a Life Sciences solution, a Java Client Library (JCL) is provided as
part of the solution.


Java Client Library


The JCL acts as a productivity layer on top of the standard DFS SDK client library, simplifying access
to the repository and providing a set of solution-specific Java classes. The following diagram shows
the DFS Web Service architecture for the Life Sciences solution:

The JCL productivity layer is packaged as a JAR file that can be included in Java applications.
Although it is possible for client applications to directly call the DFS Web Services through the
SDK, it is recommended that operations be carried out through the JCL to safeguard the application
against future changes to the Life Sciences solution.

Supported Functionality in the SDK


The JCL in the Life Sciences solution SDK supports the following functionality:
• General convenience functions for managing connections to the Documentum repository,
retrieval of login credentials and other application-specific parameters from properties files,
running DQL queries, and so on.
• Creation, retrieval, modification, and deletion of eTMF registration forms of various types,
including Product Registration Forms, Country / Site Registration Forms, and Clinical Trial
Registration Forms, including template registration forms for each of the above.
• Retrieving, merging, and modifying file plans associated with registration forms of various
types, that is, lists of expected artifacts (document types) to be created at the product, trial, country
and/or site level in the TMF structure. This includes Product, Country, Trial, and Site-level file
plans and associated templates.
• Validation of file plans to ensure the relevant fields are defined and valid in the context of the trial,
for example, to ensure country codes or site IDs refer to registered countries or sites for the trial.
• Storage and retrieval of file plans in Microsoft Excel format, which is predefined with built-in
value-assistance provided in the form of lookup tables.


• Activation and deactivation of file plans to create or remove placeholders for the expected
documents. File plans can be planned in stages if preferred, and activated in incremental fashion;
an initial trial setup stage followed by an active trial stage, then a final close-out stage. A rollover
function is provided to enable the next stage to be activated (this can also be configured to occur
automatically if necessary). Alternatively, all defined stages can be activated at once if preferred,
so that stages can be used to break up the overall file plan into manageable units.
• Update and retrieval of progress statistics indicating the current level of completion of a trial (up
to and including the currently-active stage).
• D2 configuration lookup functions for dictionary entries, default value sets, creation matrices,
and taxonomies, to enable country codes to be translated into locale-specific names, default
values to be applied to new repository objects, and so on. Currently, this is restricted to read-only
operations. D2 configuration changes are not supported through the JCL; this must be done
through D2-Config.
• TMF placeholder retrieval/update functions, enabling placeholders in the repository to be
identified and replaced with document content programmatically, for example, to transfer a Site
Management Report (SMR) document from a Clinical Trial Management System (CTMS) into
Documentum for eTMF in response to a particular CTMS event such as SMR review/approval.
• Ability to invoke D2 lifecycle transition actions on TMF documents and placeholders, enabling
the relevant actions to be configured through D2-Config and invoked through the SDK just as
in the D2 Client.
• Registration of external users for access to the TMF documents at the trial, country and site
levels in the appropriate roles, with automatic refresh of the corresponding role groups when
the forms are saved.
• Creation, retrieval, modification, and deletion of Non-Clinical Study Registration Forms
including updating the Group, Subgroup, and other information specific to the registration form.
• Creation, retrieval, modification, and deletion of Non-clinical documents including updating
the metadata of the Non-clinical documents during creation or modification.
• Creation, retrieval, modification, and deletion of Regulatory Application Registration Forms
including updating the Group, Subgroup, and other information specific to the registration form.
• Creation, retrieval, modification, and deletion of Regulatory documents including updating the
metadata of the Regulatory documents during creation or modification.
• Creation, retrieval, modification, and deletion of Quality Project Registration Forms including
updating the metadata during the creation or modification of the registration form.
• Creation, retrieval, modification, and deletion of Regulatory Application Labeling documents
including updating the metadata during the creation or modification of the document.
• Creation, retrieval, modification, and deletion of Regulatory Core Labeling documents
including updating the metadata during the creation or modification of the document.
• Creation, retrieval, modification, and deletion of Regulatory Correspondence documents
including updating the metadata during the creation or modification of the document. This also
includes importing Correspondence email containing attachments.
• Creation, retrieval, modification and deletion of Regulatory Clinical documents including
updating the metadata during the creation or modification of the document.


• Creation, retrieval, modification, and deletion of Regulatory Safety documents including
updating the metadata during the creation or modification of the document.
• Creation, retrieval, modification and deletion of Regulatory Quality documents including
updating the metadata during the creation and modification of the document.
• Creation, retrieval, modification and deletion of GMP documents including updating the
metadata during the creation and modification of the document.
• Creation, retrieval, modification, and deletion of Generic Life Science documents including
updating the metadata during the creation or modification of the document. Domain-specific
methods are not included because this is a generic implementation. Therefore, there are no
implied CDF Auto Inheritance calls wrapped inside convenience methods similar to those in other
domains. It is up to the user to ensure that the additional updates are done either manually, or
through a lifecycle transition that supports the additional calls.
For more information, see the Working with the Life Sciences Software Development Kit whitepaper
on OpenText My Support.

Supporting External Registration Forms


Currently, registration forms are created and maintained within the Life Sciences framework.
Through the SDK, the Life Sciences solution allows for registration forms (product, project, regulatory
application, and submission registration forms only) to be created and updated by external systems
(such as Insight) and added to the Life Sciences system. However, registration forms created
externally cannot be edited in the Life Sciences application.
A new attribute, system_external_attribute, is used to flag whether the registration form is internal or
external. The flag helps the system prevent users from manipulating the registration form from
within the application and limits changes to calls made by the source system through the SDK. For registration
forms created within the Life Sciences solution, the default value of the attribute is set to false, which
enables Life Sciences users to modify the registration forms. However, for external registration forms,
the value is set to true, which prevents the Life Sciences users from modifying these registration
forms from within the Life Sciences application.
Only external systems (owners of the registration forms) can modify the registration forms through
SDK methods. Additionally, lifecycle actions on externally created registration forms can only
be triggered through SDK and not the menu options. Similarly, external systems cannot modify
registration forms created within the Life Sciences application through the SDK.
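For example, assuming the system_external_attribute flag can be queried through DQL in the same
way as the error-count attributes described earlier, a query similar to the following could list the
Regulatory Application Registration Forms that are owned by an external system and are therefore
locked against editing in the Life Sciences application:
select r_object_id, application_description from cd_reg_admin_info
where system_external_attribute = true order by 2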

Chapter 20
Configuration Settings to Improve Performance

This section provides various configuration settings that you can make to improve the performance of
the Life Sciences solutions.

D2 4.7 Performance Best Practices


Refer to the D2 4.7 Best Practices whitepaper, available at OpenText My Support, which summarizes
the best practices and guidelines to get better performance with D2 4.7.

Overriding the Cache Time-out


In the D2 configuration, the current cache time-out setting may lead to performance delays. Therefore,
users can override the cache time-out setting in D2 by modifying the following default values in the
cache tag in the <Application Server>\webapps\D2\WEB-INF\classes\d2-cache.xml file:
<cache name="workflow_config-cache"
maxElementsInMemory="10000"
timeToIdleSeconds="600"
timeToLiveSeconds="600"
overflowToDisk="false"
eternal="false"
memoryStoreEvictionPolicy="LFU"/>

Deactivating the myInsight Agent Job


The myInsight Agent job is configured to run once every five minutes, which can impact the overall
performance. To improve performance, you can reduce the run frequency to hourly or deactivate
the job.
1. Log in to Documentum Administrator.
2. On the left pane, expand Job Management, and click Jobs.
3. On the right pane, right-click myInsight Agent, and click Properties.


4. In the Job Properties dialog box, select Inactive and click OK.
Deactivating the myInsight Agent job does not affect report generation.

Disabling RSA Security Providers for myInsight Reports

To ensure good performance of myInsight reports, it is recommended that you disable the
RSA security providers in the Java Method Server's JDK. Once myInsight reports are enabled,
the following two RSA-related JDK security providers can introduce a severe performance impact on
any JMS-related events, mostly related to lifecycles:
• security.provider.2=com.rsa.jsafe.provider.JsafeJCE
• security.provider.3=com.rsa.jsse.JsseProvider
Comment out these two security providers in the java.security file on the Documentum Server (for
example, in <documentum>/java64/<JDK version>/jre/lib/security).
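After the change, the relevant lines of the java.security file would look similar to the following sketch
(the provider numbers shown are illustrative and may differ in your JDK; if you comment out providers,
renumber the remaining security.provider.n entries so that the sequence stays consecutive):
# security.provider.2=com.rsa.jsafe.provider.JsafeJCE
# security.provider.3=com.rsa.jsse.JsseProvider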

Internet Explorer Browser Settings (LSSSV)


The response time in Internet Explorer is slow when a user selects the Current View tab of the
Submission Viewer for a Regulatory Application Registration Form that contains a large submission.
For better performance, it is recommended that you disable any virus scanning add-ons such as the
scriptproxy add-on from McAfee. To improve the response time in Internet Explorer, follow these
steps:

1. In Internet Explorer, go to Tools > Internet options.


2. In the Internet Options dialog box, under Browsing history, click Delete.
3. In the Delete Browsing History dialog box, ensure that only the History option is selected
and click Delete.
4. Click OK.
5. Go to Tools > Manage add-ons.
6. In the Manage Add-ons dialog box, disable the virus scanning add-on and then click Close.

Improving Performance of Server Methods


Server methods that support distributed or parallel processing now support an additional argument,
-use_private_sessions true|false, which is disabled by default. The -use_private_sessions option is a
new feature of the parallel task-handling mechanism in the Life Sciences solution.
When it is enabled, every thread is able to create its own Documentum Server session for handling
the tasks in the thread. The benefit is that each session holds its own connection between JMS and
the Documentum Server, and therefore, its process does not block processes being handled by other


threads at the same time. Enabling this option can improve the performance of process-intensive
operations, such as TMF trial activation, submission import, and so on, at the expense of additional
Documentum Server or JMS resources. A default value for the -use_private_sessions argument can
be configured in the System Parameters dictionary, or it can be specified explicitly in the method
arguments.
If allowed, remove the RSA providers from the JRE security file, as this maximizes the benefit of the
parallel handling mechanism in the Life Sciences solutions.
This argument applies to the following server methods:
• ApplyD2Configurations (all solutions)
• ApplyAttributeInheritance (all solutions)
• ImportSubmissionMethod (LSSSV only)
• ReconcileArtifacts (LSTMF only)
• TMFAdmin (LSTMF only)
• UpdateContributorGroups (LSTMF only)
Note: Avoid using max_file_processing_threads and max_import_processing_threads
in the Import Submissions Lifecycle parameters when use_private_sessions is set to true as it
can exhaust the sessions at a faster rate than they are released.

Database Settings
This section provides the different configuration settings that you can make to improve performance
for the following databases that you use in your environment.

SQL Server
As a general rule, the performance on SQL Server can degrade if statistics and fragmentation are not
kept in check. There are a number of tables that are rapidly modified, which make the fragmentation
and index maintenance a key issue. Client Database Administrators (DBAs) must identify their
most volatile tables in the day-to-day operation to determine the best schedule for daily, weekly, or
monthly maintenance.
Recommended settings:
• Increase the memory allocated to SQL Server to 20 GB for small to mid-sized clients
• Run the DB Statistics job as outlined in the Documentum Server Administration and Configuration
Guide.
The following table lists the parameters that you can modify for each of the components:


Component: SQL Server
• Max Degree of Parallelism = 0
• Auto Create Statistics = False
• Auto Update Statistics = True
• Auto Update Statistics Asynchronously = True
• Is Read Committed Snapshot On = True
• Parameterization = Forced

Component: Documentum Server (server.ini)
• return_top_results_row_based = F
• concurrent_sessions = 1000
System environment variable:
• DM_LEFT_OUTER_JOIN_FOR_ACL=T

Component: Application Server for D2 (D2FS.properties)
• maxResultSetSize=1000
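As a sketch, the SQL Server settings listed above can be applied with T-SQL similar to the following;
the database name LSREPO is a placeholder for your repository database, and changing
'max degree of parallelism' requires the 'show advanced options' setting to be enabled first:
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 0; RECONFIGURE;
ALTER DATABASE [LSREPO] SET AUTO_CREATE_STATISTICS OFF;
ALTER DATABASE [LSREPO] SET AUTO_UPDATE_STATISTICS ON;
ALTER DATABASE [LSREPO] SET AUTO_UPDATE_STATISTICS_ASYNC ON;
ALTER DATABASE [LSREPO] SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [LSREPO] SET PARAMETERIZATION FORCED;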

Oracle
Changes to the default SPFILE properties (a sample ALTER SYSTEM script follows the list):
• processes = 1000 (2x concurrent sessions)
• sessions = 500
• open_cursors = 2000
• cursor_sharing = FORCE
• optimizer_mode = ALL_ROWS
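As a sketch, these changes can be applied with ALTER SYSTEM statements similar to the following;
processes and sessions are static parameters and take effect only after a database restart:
ALTER SYSTEM SET processes=1000 SCOPE=SPFILE;
ALTER SYSTEM SET sessions=500 SCOPE=SPFILE;
ALTER SYSTEM SET open_cursors=2000 SCOPE=BOTH;
ALTER SYSTEM SET cursor_sharing=FORCE SCOPE=BOTH;
ALTER SYSTEM SET optimizer_mode=ALL_ROWS SCOPE=BOTH;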

Documentum Server Settings


This section provides the different configuration settings that you can make to the Documentum
Server to improve performance.

Scheduled Jobs
The dm_LogPurge job cleans up the myInsight TEMP logs. This job has to be scheduled to run in order
to clean the JOB logs from the repository. When reports are generated, a repository copy of the HTML
file is created, which needs to be purged. This can be done by writing a custom job to clean the
HTML files generated while running myInsight reports.
You can also disable unnecessary jobs by running the following commands:
update dm_job object set is_inactive=True where object_name='D2JobWFFollowUpTaskNotif'
update dm_job object set is_inactive=True where object_name='D2JobImportMassCreate'
update dm_job object set is_inactive=True where object_name='D2JobWFReceiveTaskMail'
update dm_job object set is_inactive=True where object_name='D2JobWFSendTaskMail'
update dm_job object set is_inactive=True where object_name='dm_Initialize_WQ'
update dm_job object set is_inactive=True where object_name='dm_QmThresholdNotification'
update dm_job object set is_inactive=True where object_name='dm_QmPriorityNotification'
update dm_job object set is_inactive=True where object_name='dm_QmPriorityAging'
update dm_job object set is_inactive=True where object_name='dm_bpm_XCPAutoTaskMgmt'
update dm_job object set is_inactive=True where object_name='myInsight Agent'
update dm_job object set run_mode=1,set run_interval=60 where object_name='D2JobSubscription'

In the Life Sciences solutions, the D2JobWFLaunchScheduledWorkflows job is made Inactive by
default as it can cause conflicts in launching workflows.

Server.ini File Settings


Make the following changes in the server.ini file (a consolidated example follows this list):
• DM_LEFT_OUTER_JOIN_FOR_ACL=T
• DM_GROUP_LIST_LIMIT=500
• concurrent_sessions = 500 (The maximum setting is based on the server operating system. For
Windows – 3275, for Linux – 1020.)
• return_top_results_row_based = F
• validate_database_user = F
• gethostbyaddr = F
• check_user_interval = 0
• upd_last_chg_time_from_db = F
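A consolidated sketch of these entries, assuming they belong in the [SERVER_STARTUP] section of
server.ini (the section that holds these keys in a default installation), is shown below; the repository
must be restarted for the changes to take effect:
[SERVER_STARTUP]
DM_LEFT_OUTER_JOIN_FOR_ACL=T
DM_GROUP_LIST_LIMIT=500
concurrent_sessions=500
return_top_results_row_based=F
validate_database_user=F
gethostbyaddr=F
check_user_interval=0
upd_last_chg_time_from_db=F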

JBOSS Access Log


You can configure the JBOSS log by editing the standalone.xml located in the
<DOCUMENTUM_HOME>/jboss7.1.1/server/DctmServer_MethodServer/configuration/
folder. The log files are created in the specified location with the format access_log.<date>.
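The exact XML elements depend on the JBoss version hosting the method server; for the JBoss AS 7.1
configuration referenced above, a sketch of an access-log definition added to the existing web
subsystem of standalone.xml could look similar to the following (the pattern and directory shown
are illustrative):
<!-- inside the web subsystem (urn:jboss:domain:web) of standalone.xml -->
<virtual-server name="default-host" enable-welcome-root="true">
    <access-log pattern="%h %t &quot;%r&quot; %s %b" prefix="access_log." rotate="true">
        <directory path="." relative-to="jboss.server.log.dir"/>
    </access-log>
</virtual-server>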

Java Server Method Settings


The JMS memory setting should be 40 to 50% of the system's available memory. Run the following
command:


set USER_MEM_ARGS=-Xms2048m -Xmx2048m -XX:PermSize=128m -XX:MaxPermSize=256M -Xss192k -XX:+DisableExplicitGC -Xrs -verbose:gc -Xloggc:D:/shared/Documentum/MethodServer/jms_gc.log

D2 Web Server Settings


This section provides the configuration settings that you can make to the D2 Web Server to improve
performance.

JVM Configuration
For a mid-size client (1000 to 3000 seat count), run the following command to configure the
Application Server JVM settings:
set USER_MEM_ARGS=-Xms4096m -Xmx4096m -XX:PermSize=64m -XX:MaxPermSize=1024m -Xss256k
-XX:+DisableExplicitGC -Xrs -Xloggc:gc.log -verbose:gc -XX:+UseParNewGC

For a small-size client (100 to 300 seat count), configure the following Application Server JVM settings:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Apache Software Foundation\Procrun 2.0\Tomcat8080\
Parameters\Java
JvmMs = 2048 (Mb)
JvmMx = 2048 (Mb)
JvmSs = 256 (k)
Options =
-XX:PermSize=256m
-XX:MaxPermSize=256m
-XX:+UseParNewGC
-XX:NewRatio=4
-XX:NewSize=256m
-XX:-UseAdaptiveSizePolicy
-XX:SurvivorRatio=8
-XX:MaxTenuringThreshold=0
-XX:+UseConcMarkSweepGC
-XX:+CMSClassUnloadingEnabled
-XX:+CMSPermGenSweepingEnabled
-verbose:gc
-Xloggc:D:/shared/Tomcat8080/tomcat_gc.log

D2 Caching
Open the <App_Server>/webapps/D2/WEB-INF/classes/d2-cache.xml file and add the
following lines before the <cache> tag named xml-cache:
<cache name="object_ids-cache"
maxElementsInMemory="10000"
eternal="false"
overflowToDisk="false"
memoryStoreEvictionPolicy="LFU">
<cacheExtensionFactory class="com.emc.common.java.cache.PropertiesExtensionFactory"
properties="type=d2_documentset, d2_documentset_switch, d2_filter_config,
x3_preferences, x3_image, d2_query_config, d2c_preferences" propertySeparator=";"/>
</cache>


D2 Client Configurations
This section provides the Life Sciences-specific UI configurations and recommendations to improve
performance.

Optimizing the D2 GUI Performance


Minimize the number of widgets in your workspace views. All widgets in all views are loaded into
the browser at login time. A larger number of widgets results in D2 taking longer to open and
load the widgets. When more widgets in a workspace view respond to object
selection, it takes longer for all the widgets to refresh.
Avoid using high-impact widgets that are active by default, because they impact browsing by
responding to every selection event. The D2 Properties Preview widget is particularly high
impact. Move these widgets away from the first tab position so that they are not active unless explicitly
selected by the user.
The D2 PDF Preview widget is not recommended for use in multi-view workspaces. It is always
active; it downloads and renders the selected document even when in an inactive view. If used in
multiple views, all instances will respond to each selection event, causing multiple downloads,
which degrades performance of the D2 client.
The D2 Thumbnail Preview widget is not recommended for large documents as it requires the Thumbnail
Server and Advanced Document Transformation Services (ADTS) to generate additional thumbnail
renditions.
Use the Life Sciences Document Preview widget instead as it is an external widget that is included
with all Life Sciences solutions and used in all Life Sciences workspaces by default. It can be
configured to respond in active views and only for selected objects of the appropriate type or format.
This helps avoid unnecessary downloads or renderings. It is used for multiple purposes in the
Life Sciences solutions as a standard document preview widget, left/right compare widget, and as
a Documentum Submission Store and View leaf document viewer. It supports additional formats
such as JPEG, GIF, BMP images, and media files. It avoids D2 login ticket timeout by requesting
new tickets automatically.

Life Sciences Document Preview Widget Configuration


To configure the Life Sciences Document Preview widget, pass the following arguments in the D2
widget URL configuration setting:

• widget_id: Unique ID for this widget instance. Used for event filtering.
• workspaceView: View (label) in which this widget instance is used, such as Browse. Used to
prevent unnecessary downloads.
• active: Whether or not the widget is in the initial view, that is, active by default.
• autoSelect: true if the widget should respond to selection events (for example, standard doc
preview); false if the widget should respond to custom menu events or on-click events raised by
other widgets (for example, the Documentum Submission Store and View Submission Viewer)
for the specified widget_id.
• useC2Overlays: Whether or not to render PDFs with dynamic C2 watermarks.
• type, excludeTypes, excludeFormats: Optional filters.
• refreshLoginTicketInterval: Ticket auto-refresh period (default = 240 seconds = 4 minutes).
There are multiple D2 widget configurations predefined for this widget with different URL
parameters. The relevant widget configuration is referenced in the default workspace view definition
files in each case. For example, WG EXT Doc Viewer (Browse) is used in the Browse view of the
workspaces that have a welcome page for the default view. Similarly, WG EXT Doc Viewer (Initial
Browse) is used in the Browse view of the workspaces that do not have a welcome page; Browse” is
the default view.
The WG EXT SSV Doc Compare Viewer 1 widget configuration is used in the left-hand side-by-side
comparison view. This is configured to respond to the Show in Viewer 1 menu events. The WG EXT
SSV Doc Compare Viewer 2 is configured similarly for the right-hand comparison view (widget ID
lsDocViewer2). Accordingly, the D2 menu configuration for Show in Viewer 1 is configured to pass
the appropriate widget ID in the widget_id parameter of the event message. In this case, the widget
only responds to D2_EVENT_CUSTOM events targeted at lsDocViewer1 and ignores other events.
The right-hand view menu is configured similarly, with events sent to widget ID lsDocViewer2.

Appendix A
Configuration Planning Checklist

Table 38. Configuration planning checklist

Configuration Components and Steps

1. Security
   1. Determine mapping to existing roles in the solution.
   2. Determine if additional roles must be added.
   3. Identify the users or groups that should be added to each solution role. Determine the
      authors, reviewers, coordinators, managers, and so on in each department within your
      organization.
   4. Determine if security configuration needs to be adjusted.
   5. Determine if access to creation/administration configurations must be adjusted.
   6. Determine if preconditions on menus must be adjusted.

2. Object model
   Review the Life Sciences standard object model to identify how your organization's documents
   and metadata will map to the object model.

3. D2 processing
   Review the following Life Sciences configurations and identify any changes needed for alternate
   processing (documents and registration forms):
   • Creation Profiles
     — Addition or removal of creation profiles
     — Addition or removal of individual document types within existing creation profiles
     — Changing the control category for artifacts
     — Registration forms
   • Default values
   • Lifecycles
   • Workflows
   • Check in/check out
   • Rendition requests

4. Artifacts
   1. Review the Life Sciences set of artifacts and identify any additions or modifications needed.
   2. Identify the attributes applicable to the artifact.
   3. Identify the scopes applicable to the artifact (some organizations make each artifact scope
      specific, for example, artifact name (site), artifact name (country)).
   4. Identify default templates for each artifact if the artifact is to be created within LSTMF
      (not needed if all documents are created elsewhere and imported).
   5. Identify naming conventions if using the CDFSetAttribute mechanism for object names
      (if relying on D2 auto-naming, there will be one configuration for all documents; this needs
      to be identified as well).

5. Review the procedures and best practices in Chapter 2, Customizing D2-Based Solutions.

Appendix B
Troubleshooting

This appendix provides information on the log files that you can refer to in case you want to
troubleshoot any issue in the Life Sciences solution.

Log Files
The logs are a combination of D2-specific logs, Life Sciences-specific logs, and the supporting
component logs.

D2 Log Files
D2 logs can be accessed during runtime when using the application. The files can also be accessed
during design time when making configuration changes. The D2 logs on the Application
Server machine are as follows:
• C:\logs\D2-config.log
• C:\logs\D2.log
The D2 logs on the Documentum Server are as follows:
• C:\logs\D2-CS.log
• C:\logs\D2-JMS.log
You can use both D2 Documentum Server and Application Server logs to troubleshoot many issues,
including D2 Configuration-related issues and workflow and lifecycle-related problems.

Life Sciences Log Files


During the automated installation of the Life Sciences solution, installation log files are created.
These logs are available on both Windows and Linux in the <Temp Extracted Installation
Package>/<Installation package name>/working/logs folder.
The log files for the individual solutions are listed below:


Life Sciences solution
• iHub_LSSuite_config.log
• iHub_LSSuite_UrlUpdate.log
• LSSuite_applyD2Configurations.log
• LSSuite_configImport.log
• LSSuite_Config_backup_export.log
• LSSuite_copy_bpmlibs.log
• LSSuite_copy_serverlibs.log
• LSSuite_dars.log
• LSSuite_eCTDSchema.log
• LSSuite_eSubmission.log
• LSSuite_index.log
• LSSuite_populateroles.log
• LSSuite_populateroles_myInsight.log
• LSSuite_postInstall.log
• LSSuite_preInstall.log
• LSSuite_updateversion.log
• LSSuite_UrlUpdate.log
• LSSuite_virtualDocTemplate.log
Documentum for eTMF
• iHub_LSTMF_config.log
• iHub_LSTMF_UrlUpdate.log
• LSTMF_applyD2Configurations.log
• LSTMF_Config_backup_export.log
• LSTMF_ConfigImport.log
• LSTMF_copy_serverlibs.log
• LSTMF_createregtables.log
• LSTMF_dars.log
• LSTMF_generategroups.log
• LSTMF_index.log
• LSTMF_populateroles.log
• LSTMF_populateroles_myInsight.log
• LSTMF_postInstall.log
Documentum for Quality and Manufacturing
• iHub_LSQM_config.log
• iHub_LSQM_UrlUpdate.log
• LSQM_applyD2Configurations.log
• LSQM_configImport.log
• LSQM_Config_backup_export.log
• LSQM_copy_bpmlibs.log
• LSQM_copy_serverlibs.log
• LSQM_dars.log
• LSQM_index.log
• LSQM_populateMDroles.log
• LSQM_populateroles.log
• LSQM_populateroles_myInsight.log
• LSQM_postInstall.log
• LSQM_preInstall.log
• LSQM_updateversion.log
• LSQM_UrlUpdate.log
Documentum for Research and Development
• iHub_LSRD_config.log
• iHub_LSRD_UrlUpdate.log
• LSOnlyRD_postInstall.log
• LSRD_applyD2Configurations.log
• LSRD_configImport.log
• LSRD_Config_backup_export.log
• LSRD_copy_serverlibs.log
• LSRD_createregtables.log
• LSRD_dars.log
• LSRD_generategroups.log
• LSRD_index.log
• LSRD_preInstall.log
• LSRD_populateroles.log
• LSRD_populateroles_myInsight.log
• LSRD_postInstall.log
• LSRD_updateversion.log
• LSRD_UrlUpdate.log
• LSRD_virtualDocTemplate.log
Documentum Submission Store and View
• iHub_LSSSV_config.log
• iHub_LSSSV_UrlUpdate.log
• LSSSV_applyD2Configurations.log
• LSSSV_configImport.log
• LSSSV_Config_backup_export.log
• LSSSV_copy_bpmlibs.log
• LSSSV_copy_serverlibs.log
• LSSSV_dars.log
• LSSSV_eCTDSchema.log
• LSSSV_eSubmission.log
• LSSSV_index.log
• LSSSV_populateroles.log
• LSSSV_populateroles_myInsight.log
Underlying Products Log Files


You can use the log files for the underlying products, such as Content Transformation Servers, xPlore,
Documentum Server, Thumbnail Server, and Java Method Server, to troubleshoot issues related
to Life Sciences.

Content Transformation Services

The default location of the log file on the Content Transformation Services host is the
%CTS_HOME%\logs folder. The Content Transformation Services log files include:
• CTS_Log.txt: Contains the errors and exceptions that are specific to the server.
• <Plug-In>_Log.txt: Contains individual plug-in-related errors and exceptions.
• IMAGE3_log.txt: Contains the errors that are specific to the Image3 plug-in. The Image3
plug-in logs errors when generating storyboards because the PDFStoryBoard plug-in cannot
produce images at a resolution higher than 96 dpi.
Note: If separate logging is enabled, log files can be found in the %CTS_HOME%\docbases
\<docbasename>\config\logs folder.
The Documentum Content Transformation Services Administration Guide provides more information
about the log files.

xPlore

You can view indexing, search, content processing service (CPS), and xDB logs in xPlore
Administrator. To view a log file:
1. In xPlore Administrator, select an instance and click Logging.
2. Click the tabs for dsearch, cps, cps_daemon, or xdb to view the last part of the log. Indexing
and search messages are logged to dsearch.
3. Click Download All Log Files to obtain download links for each log file.
The Documentum xPlore Administration and Development Guide provides more information about
the log files.

Documentum Server

Documentum Server logging and tracing provides information about Documentum Server
operations. This logging information and tracing information is recorded in the following files:
• Repository log file: Contains information about root server activities. This file is also sometimes
referred as the Documentum Server log file.
• Session log files: Contains all information, warning, error, and fatal error messages and, by
default, all SQL commands generated from DQL commands.


Session log files are stored in the %DOCUMENTUM%\dba\log\hex_repository_id\username
($DOCUMENTUM/dba/log/hex_repository_id/username) directory, where hex_repository_id is
the repository ID expressed as a hexadecimal value and username is the user account under which
the session is running. The session log name is the session ID.
The server creates a separate session log for each new connection. Sessions that are connected to the
primary Documentum Server create their session logs under the primary server. Sessions that are
connected to one or more remote Documentum Servers create their session logs under the remote
server(s). Because sessions are assigned using a round-robin method, you must look in both places
for session logs. Some features, such as jobs, also record tracing and logging information specific
to those features in log files.
The Documentum Server Administration and Configuration Guide provides more information about
the log files.

Java Method Server

The CDF-Methods.jar, CDFBase.jar, TMF-Methods.jar, SSV-Methods.jar, and
ControlledPrintServerMethods.jar files run on the Java Method Server. The server logs
errors for these JMS method executions in the server.log file located in the %DM_HOME%\
jboss[version]\server\DctmServer_MethodServer\log\ folder.

Third-Party Log Files


You can use the log files for the supported third-party products, such as myInsight, to troubleshoot
issues related to these products.

myInsight

When a myInsight report is generated manually or as a scheduled task, a log file is created and
placed in the Temp cabinet in the Life Sciences solution. If the myInsight Agent job is run, the log file is
created in the /Temp/jobs/myInsight Agent folder.
The AMPLEXOR myInsight User Guide provides more information about the messages that appear in
the log file.

Enabling Logging, Debugging, and Tracing


You can enable logging for D2, the underlying products, and the third-party products.


Configuring Logging for D2


For D2 Client issues:
• To troubleshoot D2 Client-related (4.x) issues, set the trace level to 5 for the Java Console log.
• Charles/Fiddler Traces can help in troubleshooting Client-Server or AppServer-Server related
issues.
For D2-Config related issues:
• Refer to the D2-Config.properties file located in the <webapp_root>\WEB-INF\classes
folder.
• To enable tracing, set logLevel=trace. Restarting the application server should not be required.
If the new logLevel trace setting does not take effect, then log in to D2-Config and select Tools >
Reload D2 Options. Note that this requires the client URL information to be correctly filled out in
the D2-Config > Tools > Client URL box, that is, http://<url>/D2-Config. Incorrect or unreachable
URLs result in errors being reported in the D2 logs.
• An alternative method of enabling debug tracing for D2-Config is through the
<webapp_root>\WEB-INF\classes\logback.xml file.
• Edit the <level>${logLevel:-warn}</level> and set to the desired logging level such
as debug, that is <level>debug</level>.
• Reproduce an issue and make sure it is captured in the trace output.
For D2 (4.x client) related issues:
• The Logback.xml file can be found in the <webapp_root>\WEB-INF\classes folder.
• To enable tracing, find the <level>xxx</level> tags in the log file and set the value to
<level>debug</level>. Changes to the logging level should be picked up automatically
within 60 seconds.
• Enable Java Console logging (level 5) on the client side through the Java Console window.
• Reproduce an issue and make sure the problem is captured in the trace output.

Configuring Logging for Controlled Print


Debug logging should be enabled for the Controlled Print functionality to retain the generated
prints and to see the finer details of controlled print execution steps. The logging level for controlled
print can be updated by changing the following line in the log4j.properties file found in the path,
<webapp_root>\ControlledPrint\WEB-INF\classes:
log4j.logger.com.emc.services=DEBUG

The generated prints are saved in the path, <<root_directory>>/tmp/<<object_id of the document>>, if
the log level is set to DEBUG. These files will be cleaned up by the application when the log level
is set to INFO.


Configuring Debugging for Custom External Widgets


For all the default external widgets provided by Life Sciences (other than myInsight external
widgets), you can set the debug = true parameter in the external widgets URL. This will show
the URL parameters passed and will help in debugging. If required, you can explicitly add the URL
parameter debug=true in the required external widgets for debugging.
For example:
/XMLViewer/DocViewer.html?debug=true&initMessage=Select_a_document_for_comparison_in_Viewer_1
&docbase=$DOCBASE&locale=en&username=$LOGIN&password=$TICKET&widget_id=viewer1
&useC2Overlays=true&refreshTicketInterval=240

You can set the debug parameter for the following external widgets:
• WG EXT SSV Leaf Element Viewer
• WG EXT PDF Viewer 1
• WG EXT PDF Viewer 2
• WG EXT Submission History View
• WG EXT LSCI LSS Doc Viewer (Browse)
• WG EXT LSCI LSS Doc Viewer (Initial Browse)
• WG EXT LSCI LSS Doc Viewer (Tasks)
• WG EXT LSCI SSV Doc Compare Viewer 1
• WG EXT LSCI SSV Doc Compare Viewer 2
• WG EXT LSCI Study Tagging File Navigator
• WG EXT LSCI LSS Doc Viewer (QC Index)

Configuring Logging for Underlying Products


You can configure logging for the underlying products, such as Content Transformation Servers,
xPlore, Documentum Server, Thumbnail Server, and Java Method Server, to troubleshoot issues
related to Life Sciences.

Documentum Server

For Documentum Server 7.1 and later, because of JBOSS changes, certain configuration changes must
be made to get proper D2 Java Method Server log information:
1. Delete the jboss-log4j.xml file from the <dctm_home>\jboss7.1.1\server
\DctmServer_MethodServer\deployments\ServerApps.ear\APP-INF\classes
folder.
2. Create a logback.xml file in the <dctm_home>\jboss7.1.1\server\DctmServer_MethodServer
\deployments\ServerApps.ear folder. You can use the sample logback.xml sketch shown after
these steps as a starting point for your log file.


3. Stop the Documentum JMS Service.


4. Delete the contents of the following folders:
• {jBoss Home}/server/DctmServer_MethodServer/log
• {jBoss Home}/server/DctmServer_MethodServer/logs
• {jBoss Home}/server/DctmServer_MethodServer/tmp
• {jBoss Home}/server/DctmServer_MethodServer/work
5. Start the Documentum JMS Service.
6. To enable tracing, in the logback.xml file, find the <level>#####</level> tag and set
the value to DEBUG. Changes to the log file are automatically picked up within 60 seconds. If
not, restart the Java Method Server.
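The following is a minimal logback.xml sketch for the method server; the appender name, log
location, rollover sizes, and pattern are assumptions and should be adapted to your environment
(it assumes the jboss.server.log.dir system property is available to the method server JVM, the scan
settings let level changes be picked up without a restart, and the level here is set through the level
attribute on the root logger):
<configuration scan="true" scanPeriod="60 seconds">
  <appender name="SERVER_LOG" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${jboss.server.log.dir}/server.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
      <fileNamePattern>${jboss.server.log.dir}/server.log.%i</fileNamePattern>
      <minIndex>1</minIndex>
      <maxIndex>10</maxIndex>
    </rollingPolicy>
    <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
      <maxFileSize>10MB</maxFileSize>
    </triggeringPolicy>
    <encoder>
      <pattern>%date %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="SERVER_LOG"/>
  </root>
</configuration>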

Content Transformation Services

Separate logging appenders are added to the log4j.properties file for logging the polling and
capability caching information. Refer to the following entries in the log4j.properties file:
• log4j.category.POLLINGAppender=INFO, POLLINGAppender
• log4j.appender.POLLINGAppender=org.apache.log4j.DailyRollingFileAppender
• log4j.appender.POLLINGAppender.File=$C(CTS, PARENT_DIR)\\logs\\Polling_log.txt
• log4j.appender.POLLINGAppender.Append=true
• log4j.appender.POLLINGAppender.layout=org.apache.log4j.PatternLayout
• log4j.appender.POLLINGAppender.layout.ConversionPattern=%d{HH\:mm\:ss,SSS} %10r %5p [%10t] %-20c - %5x %m%n
• log4j.appender.POLLINGAppender.DatePattern='.'yyyy-ww-dd
• log4j.category.CAPABILITYAppender=INFO, CAPABILITYAppender
• log4j.appender.CAPABILITYAppender=org.apache.log4j.DailyRollingFileAppender
• log4j.appender.CAPABILITYAppender.File=$C(CTS, PARENT_DIR)\\logs\\Capability_log.txt
• log4j.appender.CAPABILITYAppender.Append=true
• log4j.appender.CAPABILITYAppender.layout=org.apache.log4j.PatternLayout
• log4j.appender.CAPABILITYAppender.layout.ConversionPattern=%d{HH\:mm\:ss,SSS} %10r %5p [%10t] %-20c - %5x %m%n
• log4j.appender.CAPABILITYAppender.DatePattern='.'yyyy-ww-dd
The log files, Polling_log.txt and Capability_log.txt, corresponding to these appenders
contain the logs related to polling and capability caching information respectively. This information
is not logged to the main CTS_log.txt file. The log level can be set to DEBUG for more information
to be captured in the log file.

xPlore

Basic logging can be configured for each service in xPlore administrator. Log levels can be set
for indexing, search, CPS, xDB, and xPlore administrator. You can log individual packages
within these services, for example, the merging activity of xDB. Log levels are saved to
indexserverconfig.xml and are applied to all xPlore instances. xPlore uses slf4j (Simple Logging
Façade for Java) to perform logging.
To set logging for a service:
1. In xPlore Administrator, click System Overview.
2. Click Global Configuration.
3. On the Logging Configuration tab, configure logging for all instances. Open one of the services
such as xDB and set levels on individual packages.
To customize the instance-level log setting, edit the logback.xml file in each xPlore instance. The
logback.xml file is located in the WEB-INF/classes folder for each deployed instance war file.
Levels set in logback.xml take precedence over log levels in xPlore Administrator.
Note: Logging can slow the system and consume disk space. In a production environment, run the
system with minimal logging.
Each logger logs a package in xPlore or in your custom code. The logger has an appender that
specifies the log file name and location. DSEARCH is the default appender. Other defined appenders
in the primary instance logback configuration are XDB, CPS_DAEMON, and CPS. You can add a
logger and appender for a specific package in xPlore or your custom code. The following example
adds a logger and appender for the package com.mycompany.customindexing:
<appender name="CUSTOM" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>C:/xPlore/jboss7.1.1/server/DctmServer_PrimaryDsearch/logs/custom.log</file>
  <encoder>
    <pattern>%date %-5level %logger{20} [%thread] %msg%n</pattern>
    <charset>UTF-8</charset>
  </encoder>
  <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
    <maxIndex>100</maxIndex>
    <fileNamePattern>C:/xPlore/jboss7.1.1/server/DctmServer_PrimaryDsearch/logs/custom.log.%i</fileNamePattern>
  </rollingPolicy>
  <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
    <maxFileSize>10MB</maxFileSize>
  </triggeringPolicy>
</appender>
<logger name="com.mycompany.customindexing" additivity="false" level="INFO">
  <appender-ref ref="CUSTOM"/>
</logger>

You can add a custom logger and appender to logback.xml. To capture log entries in the
logs in xPlore Administrator, add the custom logger to a logger family; logger families are defined in
indexserverconfig.xml. This is an optional step: if you do not add your custom logger to a
logger family, it still logs to the file that you specify in your appender. Logger families are used to
group logs in xPlore Administrator. You can set the log level for the family, or expand the family
to set levels on individual loggers.
The log levels include TRACE, DEBUG, INFO, WARN, and ERROR. The levels are in increasing
severity and decreasing amounts of information. Therefore, TRACE displays more than DEBUG,
which displays more than INFO.

Thumbnail Server

To activate thumbnail logging:


1. Log in to the Thumbnail Server host as an Administrator.
2. Stop the Thumbnail Server service.
3. Navigate to the %DM_HOME%\thumbsrv\container\webapps\thumbsrv\web-inf folder
and open the web.xml file in any text editor.
4. Change the <debug> flag to TRUE.
5. Save and close the web.xml file.
6. Start the Thumbnail Server service.

Java Method Server

To enable logging for CDF, LSTMF, and LSSSV server methods, set the logging level to DEBUG on
the Java Method Server by adding the following settings to the log4j.properties file, located
in \Documentum\jboss[version]\server\DctmServer_MethodServer\deployments
\ServerApps.ear\APP-INF\classes:
• log4j.logger.com.documentum.d2=INFO
• log4j.logger.com.documentum.cdf=DEBUG
• log4j.logger.com.documentum.tmf=DEBUG
• log4j.logger.com.documentum.ssv=DEBUG
• log4j.logger.com.documentum.utils=DEBUG
• log4j.logger.com.emc.documentum.ls.utils=DEBUG**
** Only for LSTMF and Life Sciences solution setups where lock mechanism is used while creating
dynamic groups during trial activation and reactivation.


Life Sciences SDK

The log4j logging framework is used in the Life Sciences SDK. The log4j properties can be set for the
logs specific to the SDK by setting the log levels. For example:
log4j.logger.com.documentum.ws=DEBUG.

Configuring Logging for Third-Party Products

myInsight

To troubleshoot issues related to myInsight reports, myInsight logs can be collected by placing
the logging.properties file in the WEB-INF/classes folder of the myInsight web application
and restarting the Application Server. The log levels can be set to FINE or FINEST. The following
entries can be added to the logging.properties file:
handlers = org.apache.juli.FileHandler
############################################################
# Handler specific properties.
# Describes specific configuration info for Handlers.
############################################################
org.apache.juli.FileHandler.level = FINE
org.apache.juli.FileHandler.directory = ${catalina.base}/logs
org.apache.juli.FileHandler.prefix = myInsight.
java.util.logging.ConsoleHandler.level = FINE
java.util.logging.ConsoleHandler.formatter = java.util.logging.SimpleFormatter

Connection Issues
The Life Sciences solution is a web-based application. If the repository does not appear or other
connection issues occur, verify that the required services are running.

To start the required services:


1. Log in to the Windows system using the login credentials.
2. Open the Services console.
3. Start the following services in the listed order:
a. Documentum Docbroker Service Docbroker
b. Documentum Docbase Service <Repository Name>
c. Documentum Java Method Server
4. Start the respective Application Server where D2 and other web applications are running.


D2 Performance Issues
D2 is taking time to load the workspace, widgets, creation profiles, and other configurations.
Resolution

Follow these steps to improve the performance of the application with respect to the workspace,
widgets, creation profiles, and other configurations.
1. Stop the application server.
2. Update the D2FS.properties file located in the /WEB-INF/classes folder in D2.war by
turning on maxResultSetSize by uncommenting the line (remove the #). For example:
maxResultSetSize=1000

3. Configure the d2-cache.xml file located in the /WEB-INF/classes folder in D2.war, by adding
more d2-cache configurations:
<?xml version="1.0" encoding="UTF-8"?>
<ehcache xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="d2-cache.xsd"
updateCheck="false" monitoring="autodetect">
<cache name="xml-cache" maxElementsInMemory="10000" eternal="false" overflowToDisk=
"false"/>
<cache name="dictionary_dico-cache" maxElementsInMemory="10000" eternal="false"
overflowToDisk="false"/>
<cache name="dictionary_dql-cache" maxElementsInMemory="10000" eternal="false"
overflowToDisk="true" maxElementsOnDisk="10000000" diskPersistent="false"
memoryStoreEvictionPolicy="LFU"/>
<cache name="object_ids-cache" maxElementsInMemory="10000" eternal="false"
overflowToDisk="false" memoryStoreEvictionPolicy="LFU">
<cacheExtensionFactory class="com.emc.common.java.cache.PropertiesExtensionFactory"
properties="type=d2_acl_config,d2_documentset,d2_documentset_switch,x3_widget_config,
d2c_preferences,d2_create_config" propertySeparator=";"/>
</cache>
<cache name="user_group-cache" maxElementsInMemory="10000" eternal="false"
overflowToDisk="false"/>
</ehcache>

4. Start the application server.

Exporting Large Folders Issue


When exporting folders that contain a large number of files from the D2 Client, the operation may
sometimes fail to complete successfully.
Resolution
To resolve this issue, you need to adjust the following parameters in both D2 and Documentum
Server to enable the export operation to succeed and also enhance the performance of the operation:
• contentTransferUrlTicketTimeout=15 (in the /WEB-INF/classes/D2FS.properties
file of the D2 Client deployment in the Application Server.)
• download.folderexport.batchsize=250 (in the /WEB-INF/classes/settings
.properties file of the D2 Client deployment in the Application Server.)
• concurrent_sessions=100 (in the server.ini file on Documentum Server.)


As these values are interconnected, it is advisable not to provide large variations over the
recommended settings.

Static Objects in XMLViewer not Cached


When Tomcat is accessed via HTTPS, static objects under /XMLViewer are not cached. This is because a no-cache directive is added to the response headers if no other cache-control directives are sent. To allow these static objects to be cached, create a file called context.xml in the <App Server>/WEB-INF folder and add the following lines:
<Context>
<Valve className="org.apache.catalina.authenticator.BasicAuthenticator"
disableProxyCaching="false" />
</Context>
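
After restarting Tomcat, you can verify the change by inspecting the response headers of a static resource under /XMLViewer over HTTPS, for example with curl; the host, port, and resource path below are placeholders for your environment:

curl -kI https://<host>:<port>/XMLViewer/<static-resource>

If the valve has been applied, the response should no longer include the no-cache directives in the Pragma and Cache-Control headers.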

Appendix C
Visual Representation of Attribute Cascading in Life Sciences

During document creation or import, a document inherits a set of attributes from a corresponding
registration form. For example, if you create a non-clinical document, the system requires you to
associate the document with a Non-clinical Study Registration Form. The individual document inherits
a set of product and study information from the registration form.
The following diagrams represent how information is cascaded from the registration forms to the
individual documents in each of the solutions.
Legend:
• TXT — Free Form Text Entry
• DQL — DQL-based set used for the drop-down list
• DQL-prod — DQL-based set tempered by selected product to be used for the drop-down list
• DICT — Dictionary-based set used for the drop-down list
• DICT+FREE — Dictionary-based set used for the drop-down list; unlisted values accepted
• DICT-DQL-prod — Dictionary-based value set through DQL tempered by the selected product
• RO — Read only value; set earlier
• DQL-RO — Read-only value set through a DQL query on another field
• DQL-Dep — DQL-based set tempered by the first value in the set
• DQL-Dep-RO — Read-only value set through DQL tempered by the first value in the set
• DQL-Dep-Free — DQL-based set tempered by the first value in the set; unlisted values accepted
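
As an illustration of the DQL-prod pattern, a dependent drop-down list is typically backed by a DQL query in the D2 value-assistance configuration that references the already selected product through D2's $value() substitution. This is a sketch only; the object type and attribute names below are hypothetical and do not match the actual Life Sciences type names:

SELECT study_id FROM my_study_registration_form
WHERE product_name = '$value(product_name)'
ORDER BY study_id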


Figure 1. Product and Project Registration Form Attributes

Figure 2. Cascading of Attributes in the Clinical Domain


Figure 3. Cascading of Attributes in the Non-clinical Domain


Figure 4. Cascading of Attributes in the Regulatory Domain


Figure 5. Cascading of Attributes in the Safety and Quality Domain

Appendix D
D2 Configurations

The following list shows the D2 configuration elements that are used by the Life Sciences solutions.
When customizing D2 configurations to address customer-specific business requirements, do not
rename or remove these configurations from the Life Sciences system. They are used internally by
the Life Sciences Server Methods to perform specific business operations.

Configuration Category: Creation Profiles
• Submission Elements
• TMF Documents by TMF Unique Artifacts

Configuration Category: Default Value Templates
• Clinical Trial Country Registration Form Default Values
• Clinical Trial Registration Form Default Values
• Clinical Trial Site Registration Form Default Values
• Non-Clinical Study Registration Form Default Values
• Product Registration Form Default Values
• Quality Project Registration Form Default Values
• Regulatory Submission Registration Form Default Values
• Regulatory-Administrative Registration Form Default Values
• TMF Placeholder Defaults
• TMF Template Default Values

Configuration Category: Dictionary
• Clinical Study Phases
• Clinical Trial Control Types
• Geographic Regions
• GMP Artifacts
• Object Type to Taxonomy mapping
• Regulatory-Admin Artifact
• Route of Administration List
• States of America
• Submission Element Types
• Submission Filestores
• Submission Regional Metadata Conversions
• System Parameters
• TMF Artifact Scopes
• TMF Artifact Scopes for Bulk Upload Spreadsheets
• TMF Artifact Scopes for Individual Documents
• TMF Countries
• TMF Document Roles
• TMF External Contributor Roles
• TMF Folder Groups
• TMF Geographic Regions
• TMF Lifecycle States for Bulk Upload Spreadsheets
• TMF Models
• TMF Unique Artifact Names 2.0
• TMF Unique Artifact Names 3.0
• TMF Site Facility Types
• wf_rule_attributes

Configuration Category: Extended Creation Profile
• Manufacturing (Copy from Quality)
• TMF Clinical Trial Management Artifacts
• TMF Documents by TMF Unique Artifacts
• TMF Document Create
• TMF Document Create Cat 2
• TMF Document Create Cat 3
• TMF Document Import

Configuration Category: Lifecycle Config
• TMF Cat 2 Controlled Document Lifecycle Model
• TMF Cat 3 Controlled Document Lifecycle Model
• TMF Clinical Trial Lifecycle Model
• TMF Controlled Document Lifecycle Model
• TMF Document Indexing Lifecycle

Configuration Category: D2 mailing config
• Send Contributor Change Notification
• Send user index notification

Configuration Category: Registered Table Config
• notify_message
• tmf_progress_history

Configuration Category: D2 Taxonomy
• GMP Artifacts
• GMP Artifacts by Group
• TMF Geographic Regions
• Regulatory-Admin Classification by EU Region
• Regulatory-Admin Classification by Region
• Regulatory-Application type by Region
• Regulatory-Submission type by Region
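
After customizing D2 configurations, you can quickly confirm that the required configuration objects are still present by querying them by name. The following DQL is a sketch only; it queries dm_sysobject so that it does not depend on the specific D2 configuration object types, and you can extend the name list to cover all of the entries above:

SELECT r_object_id, r_object_type, object_name FROM dm_sysobject
WHERE object_name IN ('Submission Elements', 'TMF Controlled Document Lifecycle Model', 'Send Contributor Change Notification')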

