Informatica Basic Study
1 Basics
Education Services
Version PC7B-20040608
Introduction
Course Objectives
By the end of this course you will:
Understand how to use the major PowerCenter components for development
Be able to build basic ETL mappings and mapplets
Be able to create, run and monitor workflows
Understand available options for loading target data
Be able to troubleshoot most problems
About Informatica
Founded in 1993
Leader in enterprise solution products
Headquarters in Redwood City, CA
Public company since April 1999 (INFA)
2000+ customers, including over 80% of Fortune 100
Strategic partnerships with IBM, HP, Accenture, SAP, and many others
Worldwide distributorship
Informatica Resources
www.informatica.com provides information (under Services) on:
Professional Services
Education Services
Technical Support
my.informatica.com sign up to access:
Product documentation (under Products, documentation downloads)
Velocity Methodology (under Services)
Knowledgebase
Webzine
devnet.informatica.com sign up for the Informatica Developers Network
Decision Support
[Diagram: source system → Extract, Transform → Data Warehouse]
Source side: transaction-level data; optimized for transaction response time; current; normalized or de-normalized data
Transform side: aggregate data, cleanse data, consolidate data, apply business rules, de-normalize data
PowerCenter 7 Architecture
[Architecture diagram: Sources → Informatica Server → Targets over native connections; client tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Server Administrative Console) communicate with the Repository Server over TCP/IP; the Repository Server accesses the Repository natively]
Not Shown: Client ODBC Connections for Source and Target metadata
These allow companies to directly source from and integrate with a variety of transactional applications and real-time services
PowerExchange (for mainframe, AS/400 and complex flat files)
PowerConnects for:
Transactional Applications: Essbase, PeopleSoft, SAP R/3, SAP BW, SAS, Siebel
Real-time Services: JMS, MSMQ, MQSeries, SAP IDOCs, TIBCO, WebMethods, Web Services
Server group management, automatic workflow distribution across multiple heterogeneous servers
ZL Engine, always-on non-stop sessions, JMS connectivity, and real-time Web Services provider
Data smart parallelism, pipeline and data parallelism, partitioning
Version control, deployment groups, configuration management, automatic promotion
PowerCenter
Server engine, metadata repository, unlimited designers, workflow scheduler, all APIs and SDKs, unlimited XML and flat file sourcing and targeting, object export to XML file, LDAP authentication, role-based object-level security, metadata reporter, centralized monitoring
Watch for short virtual classroom courses on these options and XML!
1. Create Source definition(s)
2. Create Target definition(s)
3. Create a Mapping
4. Create a Session Task
5. Create a Workflow with Task components
6. Run the Workflow and verify the results
Demonstration
Import from:
Relational database
Flat file
COBOL file
XML object
Create manually
[Diagram: Designer client → Repository Server (TCP/IP) → Repository Agent (native) → Repository]
Relational DB Source: table, view, or synonym
Flat File
.CBL File
Data Previewer
Preview data in
Relational database sources
Flat file sources
Relational database targets
Flat file targets
Metadata Extensions
Allows developers and partners to extend the metadata stored in the Repository Metadata extensions can be:
User-defined: PowerCenter users can define and create their own metadata
Vendor-defined: third-party application vendor-created metadata lists
For example, applications such as Ariba or PowerConnect for Siebel can add information such as contacts, version, etc.
Metadata Extensions
Can be reusable or non-reusable
Can promote non-reusable metadata extensions to reusable; this is not reversible
Reusable metadata extensions are associated with all repository objects of that object type
A non-reusable metadata extension is associated with a single repository object
Administrator or Super User privileges are required for managing reusable metadata extensions
Mappings
By the end of this section you will be familiar with:
The Mapping Designer interface
Transformation objects and views
The Source Qualifier transformation
The Expression transformation
Mapping validation
Mapping Designer
Iconized Mapping
Midstream XML Generator: writes XML to a database table or message queue
More Source Qualifiers: read from XML, message queues and applications
Transformation Views
A transformation has three views:
Iconized: shows the transformation in relation to the rest of the mapping
Normal: shows the flow of data through the transformation
Edit: shows transformation ports (= table columns) and properties; allows editing
Source Qualifier Transformation
Usage:
Convert datatypes
For relational sources:
Modify SQL statement
User Defined Join
Source Filter
Sorted ports
Select DISTINCT
Pre/Post SQL
Expression Transformation
Perform calculations using non-aggregate functions (row level)
Ports: mixed; variables allowed
Create expression in an output or variable port
Usage: perform the majority of data manipulation
Expression Editor
An expression formula is a calculation or conditional statement for a specific port in a transformation
Performs a calculation based on ports, functions, operators, variables, constants and return values from other transformations
Expression Validation
The Validate or OK button in the Expression Editor will:
Parse the current expression
Remote port searching (resolves references to ports in other transformations)
Parse default values
Check spelling, correct number of arguments in functions, other syntactical errors
Character Functions
Used to manipulate character data
CHRCODE returns the numeric value (ASCII or Unicode) of the first character of the string passed to this function
CONCAT is for backward compatibility only; use || instead
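The two functions above can be sketched in Python for illustration (a hedged analogy, not PowerCenter's implementation; the variable names are invented):

```python
# Python analogues of the character functions described above.

def chrcode(s: str) -> int:
    """Like CHRCODE: numeric (ASCII/Unicode) value of the first character."""
    return ord(s[0])

# CONCAT(a, b) is kept for backward compatibility; the || operator is
# preferred.  Both behave like plain string concatenation:
first, last = "Jane", "Smith"
full = first + " " + last   # equivalent of  first || ' ' || last

print(chrcode("A"))   # 65
print(full)           # Jane Smith
```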
Conversion Functions
Used to convert datatypes
Used to process data during data cleansing
METAPHONE and SOUNDEX create indexes based on English pronunciation (2 different standards)
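To make the pronunciation-index idea concrete, here is a simplified Soundex sketch in Python (an assumption for illustration: it omits the H/W rule of the full standard and is not PowerCenter's exact SOUNDEX):

```python
# Simplified Soundex: keep the first letter, encode remaining consonants
# as digits, collapse adjacent duplicates, pad/truncate to 4 characters.
CODES = {
    **dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
    **dict.fromkeys("dt", "3"), "l": "4",
    **dict.fromkeys("mn", "5"), "r": "6",
}

def soundex(name: str) -> str:
    name = name.lower()
    out, prev = name[0].upper(), CODES.get(name[0], "")
    for ch in name[1:]:
        code = CODES.get(ch, "")
        if code and code != prev:   # skip vowels and adjacent duplicates
            out += code
        prev = code
    return (out + "000")[:4]

print(soundex("Robert"))   # R163
print(soundex("Jackson"))  # J250
```

Names that sound alike ("Robert", "Rupert") map to the same index value, which is what makes these functions useful for matching during cleansing.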
Date Functions
Used to round, truncate, or compare dates; extract one part of a date; or perform arithmetic on a date
To pass a string to a date function, first use the TO_DATE function to convert it to a date/time datatype
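The string-to-date step above can be sketched in Python (a hedged analogy: `strptime` stands in for TO_DATE, and the 30-day shift is an invented example of date arithmetic):

```python
# Convert a string to a date/time value before doing date arithmetic on it,
# roughly what TO_DATE enables for the date functions.
from datetime import datetime, timedelta

order_date = datetime.strptime("10/01/2000 00:00:00", "%m/%d/%Y %H:%M:%S")

# Once converted, arithmetic on the date is straightforward:
due_date = order_date + timedelta(days=30)
print(due_date.date())   # 2000-10-31
```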
Test Functions
Used to test if a lookup result is null
Used to validate data
Variable Ports
Use to simplify complex expressions
e.g. create and store a depreciation formula to be referenced more than once
Use in another variable port or an output port expression
Local to the transformation (a variable port cannot also be an input or output port)
Available in the Expression, Aggregator and Rank transformations
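The depreciation example above can be sketched in Python (port and column names are invented for illustration; the point is that the variable is computed once per row and reused by several output ports):

```python
# Sketch of a variable port: compute the depreciation formula once per row,
# then reference it from more than one "output port".
def transform_row(cost: float, salvage: float, life_years: int) -> dict:
    # v_depreciation plays the role of a local variable port
    # (straight-line depreciation, used here purely as an example)
    v_depreciation = (cost - salvage) / life_years

    return {
        "ANNUAL_DEPR": v_depreciation,        # output port 1
        "MONTHLY_DEPR": v_depreciation / 12,  # output port 2
    }

row = transform_row(cost=13000.0, salvage=1000.0, life_years=5)
print(row["ANNUAL_DEPR"])    # 2400.0
print(row["MONTHLY_DEPR"])   # 200.0
```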
Selected port
Informatica Datatypes
NATIVE DATATYPES: specific to the source and target database types; display in source and target tables within Mapping Designer
TRANSFORMATION DATATYPES: PowerCenter internal datatypes based on UCS-2; display in transformations within Mapping Designer
[Diagram: Native → Transformation → Native]
Transformation datatypes allow mix and match of source and target database types
When connecting ports, native and transformation datatypes must be compatible (or must be explicitly converted)
For further information, see the PowerCenter Client Help > Index > port-to-port data conversion
Mapping Validation
Connection Validation
Examples of invalid connections in a Mapping:
Connecting ports with incompatible datatypes
Connecting output ports to a Source
Connecting a Source to anything but a Source Qualifier or Normalizer transformation
Connecting an output port to an output port or an input port to another input port
Mapping Validation
Mappings must:
Be valid for a Session to run
Be end-to-end complete and contain valid expressions
Pass all data flow rules
Mappings are always validated when saved; they can be validated without being saved
The Output Window displays the reason for invalidity
Workflows
By the end of this section, you will be familiar with:
The Workflow Manager GUI interface
Creating and configuring Workflows
Workflow properties
Workflow components
Workflow tasks
Workspace
Status Bar
Output Window
Task Developer
Create Session, Shell Command and Email tasks
Tasks created in the Task Developer are reusable
Worklet Designer
Creates objects that represent a set of tasks
Worklet objects are reusable
Workflow Structure
A Workflow is a set of instructions for the Informatica Server to perform data transformation and load
Combines the logic of Session Tasks, other types of Tasks and Worklets
The simplest Workflow is composed of a Start Task, a Link and one other Task
Link
Start Task
Session Task
Session Task
Server instructions to run the logic of ONE specific mapping
e.g. source and target data location specifications, memory allocation, optional Mapping overrides, scheduling, processing and load instructions
Becomes a component of a Workflow (or Worklet)
If configured in the Task Developer, the Session Task is reusable (optional)
Sample Workflow
Session 1
Command Task
Session 2
Concurrent
Combined
Note: Although only Session Tasks are shown, these can be any type of Task
Creating a Workflow
Select a Server
Workflow Properties
Customize Workflow Properties
Workflow log displays
Workflow Scheduler
Workflow Links
Required to connect Workflow Tasks
Can be used to create branches in a Workflow
All links are executed unless a link condition is used which makes a link false
Conditional Links
Workflow Summary
1. Add Sessions and other Tasks to the Workflow
2. Connect all Workflow components with Links
3. Save the Workflow
4. Start the Workflow
Session Tasks
After this section, you will be familiar with:
How to create and configure Session Tasks
Session Task source and target properties
Or Select menu Tasks | Create and select Session from the drop-down menu
Set connection
Set properties
Set connection
Set properties
Monitoring Workflows
By the end of this section you will be familiar with:
The Workflow Monitor GUI interface
Monitoring views
Server monitoring modes
Filtering displayed items
Actions initiated from the Workflow Monitor
Truncating Monitor Logs
Workflow Monitor
The Workflow Monitor is the tool for monitoring Workflows and Tasks Choose between two views: Gantt chart Task view
Task view
Monitoring Operations
Perform operations in the Workflow Monitor
Stop, Abort, or Restart a Task, Workflow or Worklet
Resume a suspended Workflow after a failed Task is corrected
Reschedule or Unschedule a Workflow
Stopping a Session Task means the Server stops reading data
Aborting a Session Task also stops reading data; if the Server has not finished processing and committing data during the timeout period, the threads and processes associated with the Session are killed
Status Bar
Monitoring filters can be set using drop-down menus; this minimizes the items displayed in Task View
Right-click on Session to retrieve the Session Log (from the Server to the local PC Client)
Filter Toolbar
Select type of tasks to filter
Select servers to filter
View all folders or folders owned only by current user
Filter tasks by specified criteria
Display recent runs
Repository Manager
The Repository Manager Truncate Log option clears the Workflow Monitor logs
Debugger
By the end of this section you will be familiar with:
Creating a Debug Session
Debugger windows and indicators
Debugger functionality and options
Viewing data with the Debugger
Setting and using Breakpoints
Tips for using the Debugger
Debugger Features
Wizard-driven tool that runs a test session
View source / target data
View transformation data
Set breakpoints and evaluate expressions
Initialize variables
Manually change variable values
Data can be loaded or discarded
Debug environment can be saved for later use
Debugger Interface
Debugger Tips
Server must be running before starting a Debug Session
When the Debugger is started, a spinning icon displays; spinning stops when the Debugger Server is ready
The flashing yellow/green arrow points to the current active Source Qualifier; the solid yellow arrow points to the current Transformation instance
Next Instance proceeds a single step at a time; one row moves from transformation to transformation
Step to Instance examines one transformation at a time, following successive rows through the same transformation
Filter Transformation
By the end of this section you will be familiar with:
Filter functionality
Filter properties
Filter Transformation
Drops rows conditionally
Ports: all input / output
Specify a Filter condition
Usage: filter rows from the input flow
Sorter Transformation
By the end of this section you will be familiar with:
Sorter functionality
Sorter properties
Sorter Transformation
Can sort data from relational tables or flat files
Sort takes place on the Informatica Server machine
Multiple sort keys are supported
The Sorter transformation is often more efficient than a sort performed on a database with an ORDER BY clause
Sorter Transformation
Sorts data from any source, at any point in a data flow
Sort Keys
Ports: input/output
Define one or more sort keys
Define sort order for each key
Example of usage: sort data before an Aggregator to improve performance
Sort Order
Sorter Properties
Aggregator Transformation
By the end of this section you will be familiar with:
Basic Aggregator functionality
Creating subtotals with the Aggregator
Aggregator expressions
Aggregator properties
Using sorted data
Aggregator Transformation
Performs aggregate calculations
Ports: mixed; variables allowed; Group By allowed
Create expressions in output ports
Usage: standard aggregations
Aggregate Expressions
Aggregate functions are supported only in the Aggregator Transformation
Conditional Aggregate expressions are supported: Conditional SUM format: SUM(value, condition)
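The conditional SUM(value, condition) form above can be sketched in Python (column names and values are invented for illustration): only the rows that satisfy the condition contribute to the aggregate for the group.

```python
# Sketch of SUM(amount, region = 'EAST') over a group of rows.
rows = [
    {"amount": 100.0, "region": "EAST"},
    {"amount": 250.0, "region": "WEST"},
    {"amount": 50.0,  "region": "EAST"},
]

# Rows failing the condition are simply excluded from the sum.
east_total = sum(r["amount"] for r in rows if r["region"] == "EAST")
print(east_total)   # 150.0
```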
Aggregator Functions
AVG COUNT FIRST LAST MAX MEDIAN MIN PERCENTILE STDDEV SUM VARIANCE
Return summary values for non-null data in selected ports
Use only in Aggregator transformations
Use in output ports only
Calculate a single value (and row) for all records in a group
Only one aggregate function can be nested within an aggregate function
Conditional statements can be used with these functions
Aggregator Properties
Sorted Input Property
Instructs the Aggregator to expect the data to be sorted
Set Aggregator cache sizes for the Informatica Server machine
Sorted Data
The Aggregator can handle sorted or unsorted data
Sorted data can be aggregated more efficiently, decreasing total processing time
The Server will cache data from each group and release the cached data upon reaching the first record of the next group
Data must be sorted according to the order of the Aggregator's Group By ports
Performance gain will depend upon varying factors
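The sorted-input behaviour described above can be sketched in Python (a hedged analogy using invented sample data): with rows sorted on the group key, each group is released as soon as the key changes, instead of caching every group until end of input.

```python
# Group-at-a-time aggregation over input already sorted by key.
from itertools import groupby
from operator import itemgetter

rows = [("EAST", 100), ("EAST", 50), ("WEST", 250)]  # sorted by group key

def aggregate_sorted(rows):
    # groupby only detects runs of equal keys, which is exactly why the
    # input must be sorted on the Group By key beforehand.
    for key, group in groupby(rows, key=itemgetter(0)):
        yield key, sum(amount for _, amount in group)  # released per group

print(list(aggregate_sorted(rows)))   # [('EAST', 150), ('WEST', 250)]
```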
No rows are released from Aggregator until all rows are aggregated
Each separate group (one row) is released as soon as the last row in the group is aggregated
Active transformation
Can operate on groups of data rows AND/OR can change the number of rows on the data flow
Examples: Aggregator, Filter, Source Qualifier
Example holds true with Normalizer instead of Source Qualifier. Exceptions are: Mapplet Input and sorted Joiner transformations
Joiner Transformation
By the end of this section you will be familiar with:
When to use a Joiner transformation
Homogeneous joins
Heterogeneous joins
Joiner properties
Joiner conditions
Nested joins
Homogeneous Joins
Joins can be performed within a Source Qualifier (using a SQL Query) when:
The source tables are on the same database server, and
The database server performs the join
Heterogeneous Joins
Joins cannot be performed within a Source Qualifier when:
The source tables are on different database servers
The sources are heterogeneous, e.g.:
An Oracle table and a DB2 table Two flat files A flat file and a database table
Joiner Transformation
Performs heterogeneous joins on different data flows
Active Transformation
Ports: all input or input / output; M denotes a port that comes from the master source
Examples:
Join two flat files
Join two tables from different databases
Join a flat file with a relational table
Joiner Conditions
Joiner Properties
Join types:
Normal (inner)
Master outer
Detail outer
Full outer
Set Joiner caches
Joiner can accept sorted data (configure the join condition to use the sort origin ports)
Nested Joins
Used to join three or more heterogeneous sources
Lookup Transformation
By the end of this section you will be familiar with:
Lookup principles
Lookup properties
Lookup conditions
Lookup techniques
Caching considerations
Persistent caches
Return value(s)
Lookup Transformation
Looks up values in a database table or flat file and provides data to other components in a mapping
Ports: mixed; L denotes a Lookup port; R denotes a port used as a return value (unconnected Lookup only, see later)
Specify the Lookup condition
Usage:
Get related values
Verify if a record exists or if data has changed
Lookup Conditions
Lookup Properties
Lookup table name
Lookup condition
Native database connection object name
Source type: Database or Flat File
Policy on multiple match:
Use first value
Use last value
Report error
Lookup Caching
Caching can significantly impact performance
Cached:
Lookup table data is cached locally on the Server
Mapping rows are looked up against the cache
Only one SQL SELECT is needed
Uncached:
Each Mapping row needs one SQL SELECT
Rule of thumb: cache if the number (and size) of records in the Lookup table is small relative to the number of mapping rows requiring the lookup
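The cached case above can be sketched in Python (a hedged analogy with invented table contents: a dict stands in for the local cache, and building it stands in for the single SELECT):

```python
# Cached lookup: one "SELECT" up front, then every mapping row is answered
# from the local cache instead of issuing its own query.
lookup_table = {101: "Gold", 102: "Silver"}   # stand-in for the lookup table

def build_cache():
    # the one SQL SELECT: pull the lookup table into local memory
    return dict(lookup_table)

cache = build_cache()
mapping_rows = [101, 102, 101, 103]

# Each row probes the cache; 103 has no match, like a lookup returning NULL.
results = [cache.get(cust_id) for cust_id in mapping_rows]
print(results)   # ['Gold', 'Silver', 'Gold', None]
```

In the uncached case, each of the four rows would instead issue its own query, which is why caching pays off when the lookup table is small relative to the row count.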
Persistent Caches
By default, Lookup caches are not persistent; when the session completes, the cache is erased
The cache can be made persistent with the Lookup properties
When the Session completes, the persistent cache is stored on the server hard disk
The next time the Session runs, cached data is loaded fully or partially into RAM and reused
A named persistent cache may be shared by different sessions
Can improve performance, but stale data may pose a problem
Set prefix for persistent cache file name Reload persistent cache
Target Options
By the end of this section you will be familiar with:
Default target load type
Target properties
Update override
Constraint-based loading
Target Properties
Edit Task, Mappings tab (Session Task):
Select target instance
Target load type
Row loading operations
Error handling
Delete SQL
DELETE from <target> WHERE <primary key> = <pkvalue>
Constraint-based Loading
Target1 (pk1) → Target2 (fk1, pk2) → Target3 (fk2)
To maintain referential integrity, primary keys must be loaded before their corresponding foreign keys: in order Target1, Target2, Target3
Active transformation: can change the number of rows on the data flow
Examples: Source Qualifier, Aggregator, Joiner, Sorter, Filter
Active source: active transformation that generates rows; cannot match an output row with a distinct input row
Examples: Source Qualifier, Aggregator, Joiner, Sorter (the Filter is NOT an active source)
Active group: group of targets in a mapping being fed by the same active source
Example 1
With only one Active source, rows for Targets1, 2, and 3 will be loaded properly and maintain referential integrity
Example 2
With two Active sources, it is not possible to control whether rows for Target3 will be loaded before or after those for Target2
Update Strategy Transformation
Ports: all input / output
Specify the Update Strategy expression
IIF or DECODE logic determines how to handle the record
Example: updating Slowly Changing Dimensions
Router Transformation
By the end of this section you will be familiar with:
Router functionality
Router filtering groups
How to apply a Router in a Mapping
Router Transformation
Rows sent to multiple filter conditions
Ports: all input/output
Specify filter conditions for each Group
Usage: link source data in one pass to multiple filter conditions
Router Groups
Input group (always one)
User-defined groups:
Each group has one condition
ALL group conditions are evaluated for EACH row
One row can pass multiple conditions
Unlinked Group outputs are ignored
Default group (always one): can capture rows that fail all Group conditions
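The group rules above can be sketched in Python (group names and conditions are invented for illustration): every condition is tested for every row, one row can land in several groups, and the default group catches rows that satisfy none.

```python
# Sketch of Router group evaluation.
groups = {
    "HIGH_VALUE": lambda row: row["amount"] > 100,
    "EAST":       lambda row: row["region"] == "EAST",
}

def route(rows):
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():  # ALL conditions, EACH row
            if condition(row):
                out[name].append(row)           # one row can pass several
                matched = True
        if not matched:
            out["DEFAULT"].append(row)          # fails every condition
    return out

routed = route([{"amount": 150, "region": "EAST"},
                {"amount": 10,  "region": "WEST"}])
print(len(routed["HIGH_VALUE"]), len(routed["EAST"]), len(routed["DEFAULT"]))
# 1 1 1
```

The first row passes both conditions (and so appears in both groups), while the second row falls through to the default group.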
Lab 13 Router
Sequence Generator Transformation
Ports: two predefined output ports, NEXTVAL and CURRVAL; no input ports allowed
Usage: generate sequence numbers
Shareable across mappings
System Variables
SYSDATE: returns the system date value; uses the system clock on the machine hosting the Informatica Server
SESSSTARTTIME: returns the session start time
$$$SessStartTime: returns the session start time as a string; the format of the string is database type dependent; used in SQL override; has a constant value
User-defined names
Parameter Files
You can specify a parameter file for a session in the session editor
The parameter file contains the folder.session name and initializes each parameter and variable for that session. For example:
[Production.s_MonthlyCalculations]
$$State=MA
$$Time=10/1/2000 00:00:00
$InputFile1=sales.txt
$DBConnection_target=sales
$PMSessionLogFile=D:/session logs/firstrun.txt
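The layout above (a [folder.session] header followed by name=value lines) can be parsed with a few lines of Python — a hedged sketch for illustration, not how the Informatica Server itself reads the file:

```python
# Minimal parser for the parameter-file layout shown above.
def parse_parameter_file(text: str) -> dict:
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]            # [folder.session] header
            sections[current] = {}
        elif "=" in line and current is not None:
            name, _, value = line.partition("=")
            sections[current][name.strip()] = value.strip()
    return sections

sample = """[Production.s_MonthlyCalculations]
$$State=MA
$InputFile1=sales.txt"""

params = parse_parameter_file(sample)
print(params["Production.s_MonthlyCalculations"]["$$State"])   # MA
```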
Unconnected Lookups
By the end of this section you will know:
The unconnected Lookup technique
Unconnected Lookup functionality
Differences from the Connected Lookup
Unconnected Lookup
Physically unconnected from other transformations; NO data flow arrows lead to or from an unconnected Lookup
Lookup data is called from the point in the Mapping that needs it
The Lookup function can be set within any transformation that supports expressions
Function in the Aggregator calls the unconnected Lookup
IIF(ISNULL(customer_id), :lkp.MYLOOKUP(order_no))
Lookup function
The condition is evaluated for each row, but the Lookup function is called only if the condition is satisfied
Must check a Return port in the Ports tab, else fails at runtime
Connected Lookup:
Part of the mapping data flow
Returns multiple values (by linking output ports to another transformation)
Executed for every record passing through the transformation
More visible: shows where the lookup values are used
Default values are used
Unconnected Lookup:
Separate from the mapping data flow
Returns one value, by checking the Return (R) port option for the output port that provides the return value
Only executed when the lookup function is called
Less visible, as the lookup is called from an expression within another transformation
Default values are ignored
Heterogeneous Targets
By the end of this section you will be familiar with:
Heterogeneous target types
Heterogeneous target limitations
Target conversions
Example: two Oracle tables and a flat file target
The tables are EITHER in two different databases, or require different (schema-specific) connect strings
One target is a flat file load
The two database connections are different
The flat file requires separate location information
Only the following overrides are supported:
Relational target to flat file target
Relational target to any other relational database type
SAP BW target to a flat file target
CAUTION: If target definition datatypes are not compatible with datatypes in the newly selected database type, modify the target definition
Mapplets
By the end of this section you will be familiar with:
Mapplet Designer
Mapplet advantages
Mapplet types
Mapplet rules
Active and Passive Mapplets
Mapplet Parameters and Variables
Mapplet Designer
Mapplet Advantages
Useful for repetitive tasks / logic
Represents a set of transformations
Mapplets are reusable
Use an instance of a Mapplet in a Mapping
Changes to a Mapplet are inherited by all instances
The Server expands the Mapplet at runtime
Unsupported Transformations
Use any transformation in a Mapplet except:
XML Source definitions
COBOL Source definitions
Normalizer
Pre- and Post-Session stored procedures
Target definitions
Other Mapplets
External Sources
Mapplet contains a Mapplet Input transformation Receives data from the Mapping it is used in
Mixed Sources
Mapplet contains a Mapplet Input transformation AND one or more Source Qualifiers
Receives data from the Mapping it is used in, AND from the Mapplet
Passive Transformation
Connected
Ports: output ports only
Usage:
Only those ports connected from an Input transformation to another transformation will display in the resulting Mapplet
Connecting the same port to more than one transformation is disallowed; pass to an Expression transformation first
Resulting Mapplet HAS input ports When used in a Mapping, the Mapplet may occur at any point in mid-flow
Mapplet
Mapplet
Usage:
Only those ports connected to an Output transformation (from another transformation) will display in the resulting Mapplet
One (or more) Mapplet Output transformations are required in every Mapplet
CAUTION: Changing a passive Mapplet into an active Mapplet may invalidate Mappings which use that Mapplet, so do an impact analysis in Repository Manager first
Passive
Active
Multiple Active Mapplets or Active and Passive Mapplets cannot populate the same target instance
Lab 17 Mapplets
Reusable Transformations
By the end of this section you will be familiar with:
Transformation Developer
Reusable transformation rules
Promoting transformations to reusable
Copying reusable transformations
Transformation Developer
Make a transformation reusable from the outset, or test it in a mapping first
Reusable transformations
Reusable Transformations
Define once, reuse many times
Can be a copy or a shortcut
Edit ports only in the Transformation Developer
Can edit properties in the mapping
Instances dynamically inherit changes
Caution: changing reusable transformations can invalidate mappings
3. Drop the transformation into the mapping
4. Save the changes to the Repository
Error Types
Transformation error: the data row has only passed partway through the mapping
Data reject: the data row is fully transformed according to the mapping logic, but due to a data issue it cannot be written to the target
A data reject can be forced by an Update Strategy
Data rejects
Appended to reject file
Not written to reject file (one .bad file per target)
In Session task
In Session task
Workflow Configuration
Workflow Server Connections
Reusable Workflow Schedules
Reusable Session Configurations
FTP Connection
Create an FTP connection
Instructions to the Server to FTP flat files
Used in Session Tasks
Session Configuration
Define properties to be reusable across different sessions
Defined at folder level
Must have one of these tools open in order to access
Reusable Tasks
Three types of reusable Tasks:
Session: a set of instructions to execute a specific Mapping
Command: specific shell commands to run during any Workflow
Email: sends email during the Workflow
Reusable Tasks
Use the Task Developer to create reusable tasks
These tasks will then appear in the Navigator and can be dragged and dropped into any workflow
Reusable
Non-reusable
Command Task
Specify one or more UNIX shell or DOS commands to run during the Workflow
Runs in the Informatica Server (UNIX or Windows) environment
Shell command status (successful completion or failure) is held in the pre-defined variable $command_task_name.STATUS
Each Command Task shell command can execute before the Session begins or after the Informatica Server executes a Session
Command Task
Specify one (or more) UNIX shell or DOS (NT, Win2000) commands to run at a specific point in the workflow
Becomes a component of a workflow (or worklet)
If created in the Task Developer, the Command task is reusable
If created in the Workflow Designer, the Command task is not reusable
Commands can also be invoked under the Components tab of a Session task to run pre- or post-session
Add Cmd
Remove Cmd
Email Task
Configure to have the Informatica Server send email at any point in the Workflow
Becomes a component in a Workflow (or Worklet)
If configured in the Task Developer, the Email Task is reusable (optional)
Emails can also be invoked under the Components tab of a Session task to run pre- or post-session
Non-Reusable Tasks
Six additional Tasks are available in the Workflow Designer:
Decision
Assignment
Timer
Control
Event Wait
Event Raise
Decision Task
Specifies a condition to be evaluated in the Workflow
Use the Decision Task in branches of a Workflow
Use link conditions downstream to control execution flow by testing the Decision result
Assignment Task
Assigns a value to a Workflow Variable
Variables are defined in the Workflow object
Timer Task
Waits for a specified period of time to execute the next Task
General Tab Timer Tab
Control Task
Stop or ABORT the Workflow
Properties Tab General Tab
Events Tab
General Tab
Properties Tab
Worklets
An object representing a set or grouping of Tasks
Can contain any Task available in the Workflow Manager
Worklets expand and execute inside a Workflow
A Workflow which contains a Worklet is called the parent Workflow
Worklets CAN be nested
Reusable Worklets: create in the Worklet Designer
Non-reusable Worklets: create in the Workflow Designer
Reusable Worklet
In the Worklet Designer, select Worklets | Create
Non-Reusable Worklet
1. Create worklet task in Workflow Designer
2. Right-click on the new worklet and select Open Worklet (the workspace switches to Worklet Designer)
3.