Introduction to the IBM Problem Determination Tools
Overview of the Problem Determination Tools offering
Larry Kahm
Anand Sundaram
ibm.com/redbooks
International Technical Support Organization
April 2002
SG24-6296-00
Take Note! Before using this information and the product it supports, be sure to read the
general information in “Special notices” on page 219.
This edition applies to IBM Fault Analyzer for OS/390, Version 1 Release 1 (PTF UQ54113), IBM
File Manager for OS/390, Version 1 Release 1, and IBM Debug Tool, Version 1 Release 2.
When you send information to IBM, you grant IBM a non-exclusive right to use or distribute the
information in any way it believes appropriate without incurring any obligation to you.
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
The team that wrote this redbook . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
Special notice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi
IBM trademarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi
Comments welcome . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi
5.2 File Manager components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
5.2.1 Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
5.2.2 File associations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
5.2.3 Steps toward implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
5.2.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.3 Debug Tool components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.3.1 Load modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.3.2 Listings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.3.3 Side files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.3.4 Steps toward implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.4 Common ground. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
Old and new do not mix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
Perform the conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
Conversion batch job . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
Batch report output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
Data set comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
Results after the conversion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
This IBM Redbook describes the IBM Problem Determination Tools and includes
scenarios that show how to use the tools to recognize, locate, and fix errors in
application programs.
Part 1, “IBM Problem Determination Tools” describes the three program products
that make up the suite: IBM Fault Analyzer, IBM File Manager, and IBM Debug
Tool. It also discusses how you can implement these products at your site.
Part 2, “Scenarios using the Problem Determination Tools” walks you through
detailed scenarios that demonstrate how the tools can be used in a practical
day-to-day manner.
Part 3, “Appendixes” contains code samples, reports and listings, and examples
that are too large to include in the chapters.
Thanks to the following people for their technical contributions to this project:
Marty Shelton and David Tran, IBM Silicon Valley Laboratory, San Jose,
California
Special notice
This publication is intended to help traditional application programmers, who
develop and maintain mainframe-based COBOL application programs, in their
everyday work. It is also intended to help systems programmers by providing hints
and tips regarding the Problem Determination Tools’ post-installation tasks. The
information in this publication is not intended as the specification of any
programming interfaces that are provided by IBM Fault Analyzer for OS/390,
Version 1 Release 1 (PTF UQ54113); IBM File Manager for OS/390, Version 1
Release 1; and IBM Debug Tool, Version 1 Release 2.
See the PUBLICATIONS section of the IBM Programming Announcement for IBM
Fault Analyzer for OS/390, Version 1 Release 1 (PTF UQ54113); IBM File
Manager for OS/390, Version 1 Release 1; and IBM Debug Tool, Version 1
Release 2, for more information about what publications are considered to be
product documentation.
IBM trademarks
The following terms are trademarks of the International Business Machines
Corporation in the United States and/or other countries:
e (logo)®                   Redbooks Logo™
IBM®                        OS/390®
CICS®                       S/390®
DB2®                        SecureWay®
ES/9000®                    SP™
IMS™                        System/390®
Language Environment®       WebSphere®
MVS™                        z/OS™
Comments welcome
Your comments are important to us!
Use the online Contact us review redbook form found at:
ibm.com/redbooks
Send your comments in an Internet note to:
[email protected]
Mail your comments to the address on page ii.
We begin with an overview of the products. For each product, we describe the
software levels that are needed to use it effectively, and some of the
post-installation tasks. Each chapter contains a review of key features and
functions. We end this part with a chapter that discusses how these products can
be implemented in your environment.
By using these tools, an application programmer can more quickly and easily
identify and resolve problems that occur in batch and CICS application programs.
There are many features within this suite of tools that can help you perform
day-to-day tasks. You can enhance your application development skills by
learning how these tools work and by using them effectively.
The code and examples presented in these chapters will work with these
releases, and should (with very little modification) work with future releases of
these products.
Towards the end of our residency, two products were updated and we worked
with them briefly.
IBM Fault Analyzer for OS/390, Version 1 Release 1 (PTF UQ55392)
IBM File Manager for z/OS and OS/390, Version 2 Release 1
Because of the lead time between our work and the publication date, even newer
releases or versions of these products may be available. We invite you to review
the Web sites listed in the Bibliography for the latest available product
information.
Specific system exits are required to allow Fault Analyzer to intercept abends
when they occur. We describe these exits in detail in Chapter 2, “Introduction to
Fault Analyzer” on page 15.
The fault history file is organized in chronological order, with the most recent
abends at the top. Line commands (the ones available for each item are shown
on the right-hand side of the panel) allow you to view the dump information, or to
request additional information in foreground or batch mode. In either mode, you
can customize the level of detail that is reported.
Listing
This is the standard output file from a compile (or the ADATA file from a High
Level Assembler program).
Side file
This is a highly condensed form of the compiler listing, produced by a Fault
Analyzer utility after a compile.
In 2.3.3, “How to create a side file” on page 20, we show you how to create a side
file. We provide you with further insights into their usefulness in “Implementing
the tools in your environment” on page 101.
We provide you with samples of two REXX user exits for you to use as is, or to
modify for your needs.
Version 1 update
The latest Program Temporary Fix (PTF) for Fault Analyzer, UQ55392, provides
the following changes to the product.
Improved the performance of the fault history file
This is accomplished by changing the structure of the file from a VSAM KSDS
file to a partitioned data set (PDS) or PDS/E.
Note: The updated user’s guide, including the documentation that describes the
utility program, is supplied with the PTF.
Appendix B, “Fault Analyzer fault history file conversion” on page 207, describes
our experiences after this PTF was implemented.
Version 2 Release 1
The most recent version of Fault Analyzer contains several product updates,
which include the ones introduced in the last PTF for Version 1, and offers the
following new features:
CICS system abend support, including:
– Trace table analysis
– Last 3270 screen analysis
– CICS domain control block mapping
MQ Series support, including:
– Analysis of abends when calling MQ Series Application Programming
Interfaces (APIs)
– Display of COBOL or PL/I source code that led to the abend
Improved security, including:
– Additional subsystem security options
– Rules-based security administrator options
Two key features of File Manager enable you to perform advanced or very
detailed data manipulation:
Templates
REXX functions
1.3.1 Templates
File Manager uses templates to provide a logical view of your data. To enable
File Manager to determine the record structure of a file, supply a copybook
containing COBOL data description entries. File Manager interprets each
Level-01 group item in the copybook as a record structure, and each elementary
item as a field.
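As a minimal illustration (the copybook name and fields are hypothetical), a template built from the following COBOL data description entries would treat CUSTOMER-RECORD as the record structure and each PIC item as a field:

      * CUSTREC: hypothetical customer record copybook
       01  CUSTOMER-RECORD.
           05  CUST-ID           PIC X(8).
           05  CUST-NAME         PIC X(30).
           05  CUST-BALANCE      PIC S9(7)V99 COMP-3.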
You can save templates (to eliminate the need to recreate them each time you
browse or edit a file) and use them with different File Manager utilities.
You can develop REXX procedures to take the place of repetitive, manual
functions and then save these routines to a common data set.
We describe how to use templates and provide sample REXX routines that you
can use as is, or modify for your use, in Chapter 3, “Introduction to File Manager”
on page 39.
Debug Tool requires you to compile your application program with the TEST
compile option and, depending on the execution environment, link-edit the
appropriate object modules. You use the TEST runtime option to execute your
application program, which starts the Debug Tool session. We describe these
options in detail in Chapter 4, “Introduction to Debug Tool” on page 75.
Because you have the ability to directly manipulate variables in storage during a
debugging session, a variety of different logic paths can be tested within a short
period of time. You can spend more time drilling down into the complex aspects
of your application programs for greater understanding.
The available PF keys are displayed at the bottom of the screen. These provide a
basic set of screen manipulation and debugging commands. You can customize
the screen display and these keys to suit your testing and development needs.
To focus on a problem area, you can step line-by-line through the execution of an
application program. For example, when an application program stops for a
breakpoint, you can carefully examine each line that follows. Single-step
debugging, along with the ability to set dynamic breakpoints, allows you to
monitor, interrupt, and continue through the flow of the application program to
identify errors easily.
Debug Tool lets you count how many times a statement or verb has been
processed in an application program. This allows you to verify the coverage of
your application logic.
Dynamic Debug
The Dynamic Debug feature allows you to debug COBOL for OS/390 & VM
programs compiled without debug hooks. Debug hooks are added into the object
for the programs when you specify the TEST compiler option with any of its
sub-options (excluding NONE). Debug hooks increase the size of the object and
can decrease performance. Dynamic Debug allows you to create smaller objects
by removing the need for compiled-in debug hooks.
Because of the wide variety of environments in which Debug Tool runs, and the
number of programming languages it supports, we provide systems
programmers with a concise table of Authorized Program Analysis Reports
(APARs) in 4.1.1, “APAR information” on page 76.
1.5 Summary
The Problem Determination Tools have powerful functions and features.
Organizations that choose to use them gain the ability to improve the overall
health of their application portfolios.
We have outlined the basic features and functions of the Problem Determination
Tools:
Fault Analyzer
File Manager
Debug Tool
In the remaining chapters of this part, we delve into more detail about each of
these products. We include a chapter that describes how you can implement
these tools in your environment.
We start this chapter with a description of the software levels that are required to
use Fault Analyzer. We take a detailed look at how application programmers can
use the product. We continue by presenting a review of information that systems
programmers need to know to customize Fault Analyzer for their site. We briefly
review the creation and use of user exits, and present some useful information
that was discovered during our research. We conclude with a review of recent
product updates.
You can see the software level in the Fault Analyzer panel heading, as shown in
Figure 2-1.
If your PTF level is lower than the one we have listed, apply the necessary
maintenance. All of the examples in this book were developed at this
maintenance level.
Refer to 2.8, “Product updates” on page 37, for additional information about the
latest Fault Analyzer Version 1 PTF.
Figure (diagram not reproduced): when a user application abends, MVS abend processing
passes control to Fault Analyzer, which records the fault in the fault history file and
produces the fault analysis report; the SYSMDUMP and SYSLOG are additional outputs.
One of Fault Analyzer’s powerful features is its ability to use the application
program’s compiler listing to identify the source statement of the line that caused
the abend. Another feature that benefits you, a typical application programmer, is
its ability to make use of IBM’s vast library of error messages and abend codes.
The history file also shows you the line commands (displayed alphabetically) that
are available to process each entry in the list.
If Fault Analyzer performs a successful analysis, it will suppress the dump from
being written to any of the standard dump output statements. However, if there is
no compiler listing or side file available for use by the analysis process, the dump
will be written.
Fault Analyzer uses the compiler listing to analyze the cause of the abend, list the
statement that caused the abend, and list the data values in the working-storage
section.
For details of the options required for other compilers, refer to the IBM Fault
Analyzer for OS/390 User’s Guide, SC27-0904.
The only reason to recompile an application program for use by Fault Analyzer is
if you did not use one of these options when the application program was originally
compiled.
Note: Fault Analyzer requires that the listings be saved as members in a PDS
or PDS/E.
To speed up the actual analysis, Fault Analyzer uses a side file. This is a
streamlined extract of the compiler listing that is much smaller in size. It contains
only the pertinent information Fault Analyzer needs to perform the fault analysis.
First, it looks for a side file in the data set specified by the IDILANGX option. If one
is not found, Fault Analyzer looks for a compiler listing in the data set specified by
the IDILCOB option. If one is found, a side file is generated; if not, there will be no
source line information — although the dump analysis continues.
When you create a side file, you can take advantage of the following benefits:
Reduced processing time
If a side file is available, Fault Analyzer does not have to generate one
dynamically from the compiler listing.
Decreased storage space
Side files are much smaller than compiler listings.
Example 2-1 contains a portion of a batch COBOL compile job. It has been
modified to invoke the program, IDILANGX, to create a side file for Fault
Analyzer.
Example 2-1 Sample batch compile job including the creation of a side file
//DAVIN6C JOB (12345678),'IDI TEST',CLASS=A,MSGCLASS=H,MSGLEVEL=(1,1),
// REGION=32M,NOTIFY=&SYSUID
// JCLLIB ORDER=(IGY.V2R1M0.SIGYPROC) <== INSTALLATION
//*
//************************************************************/
//* IBM Problem Determination Tools */
//* Sample member IDISAMP1 */
//* */
//* THIS JOB RUNS A COBOL COMPILE and PRODUCES A SIDE FILE */
//* FROM A PROGRAM LISTING THAT FAULT ANALYZER CAN USE FOR */
//* OBTAINING SOURCE INFORMATION. */
//* THE COMPILE OUTPUT IS SAVED FOR USE BY A */
//* CHANGE MANAGEMENT TOOL */
//************************************************************/
//*
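The remaining steps of the example are not reproduced here. As a rough sketch only, the side file creation step might look something like the following; the PARM string, DD names, and data set names are assumptions for illustration, so check the IBM Fault Analyzer for OS/390 User’s Guide for the exact invocation of IDILANGX:

//*------------------------------------------------------------------*/
//* CREATE THE FAULT ANALYZER SIDE FILE FROM THE COMPILER LISTING    */
//* (PARM, DD NAMES, AND DATA SETS BELOW ARE ASSUMED, NOT VERIFIED)  */
//*------------------------------------------------------------------*/
//LANGX    EXEC PGM=IDILANGX,PARM='TRADERB (COBOL ERROR'
//STEPLIB  DD DISP=SHR,DSN=IDI.SIDIMOD1
//LISTING  DD DISP=SHR,DSN=DAVIN6.PDPAK.LISTING(TRADERB)
//IDILANGX DD DISP=SHR,DSN=DAVIN6.PDPAK.SIDEFILE(TRADERB)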
The fourth step, IEBGENR2, copies the listing back to the format that was
previously expected by the site’s change management system.
Note: You can specify the side file or compiler listing location before
re-analysis is done. Refer to 2.4.3, “Specifying listings to Fault Analyzer for
re-analysis” on page 26.
You enter the option number of the report you want, or position the cursor over
the highlighted number, and press Enter to display the details.
Synopsis section
This section lists the following information to get you started with your problem
determination:
The cause of the abend
The statement that caused the abend
The variables involved in the statement and their values at the time of the abend
Point-of-failure section
This section lists more details pertaining to the abend. Values in all the general
purpose registers, a list of open files (for batch only) and working-storage section
details can be viewed in this section. Figure 2-4 contains a display of the bottom
portion of this section. To view the details of the storage areas, the register
values, and the file buffer area, position the cursor over the highlighted text and
press Enter. Press PF3 or PF12 to get back to the original panel.
Events section
This section lists all of the events that occurred up to the point of failure. Figure
2-5 contains one such display. To view the details pertaining to any event,
position the cursor over the highlighted event number and press Enter. Another
panel that contains the details for that event is displayed. The details are similar
to those found in the Point-of-failure section.
This section pertains specifically to CICS abends. It contains the screen buffer
area of the application program. You can view the data that was entered by a
user in the application screen at the time of the abend. It also has CICS trace
details.
Options-in-effect section
This section lists the Fault Analyzer system and user options-in-effect at the time
of the abend.
The batch re-analysis report has exactly the same format as the real-time
analysis report.
You specify the location of a compiler listing or side file in the same way that you
do for interactive re-analysis.
You can perform a batch re-analysis even for CICS application program abends.
You specify the data set containing the compiler listing or the side file via the Fault
Analyzer DATASETS option. A sample is shown in Example 2-2. The
side file library is identified by the IDILANGX sub-option. The compiler listing
library is identified by the IDILCOB sub-option.
Example 2-2 Portion of IDIOPTS member
DATASETS(IDILANGX(DAVIN7.FA.SIDEFILE)
IDILCOB(DAVIN7.FA.LISTINGS))
Enter the data set and member name in the last two fields on the panel. Figure
2-6 shows what a completed Interactive Options panel looks like.
Fault Analyzer has another option to control the abend analysis of CICS
transactions. To enable this control facility, a transaction must be defined that is
associated with the program IDIXFA.
For example, if the transaction is defined as IDCN, the following commands can
be issued:
IDCN INSTALL Enables CICS transaction abend analysis.
IDCN UNINSTALL Disables CICS transaction abend analysis.
IDCN Displays the current status of the Fault Analyzer exit.
IDIXDCAP exit
This exit can be used for both LE and non-LE batch application programs. For
Fault Analyzer to analyze an abend via this exit, a SYSMDUMP, SYSUDUMP, or
SYSABEND DD statement must be coded in the job step.
Note: Fault Analyzer comes with a usermod, IDIUTAB, that eliminates the
need to code a SYSMDUMP, SYSUDUMP, or SYSABEND DD statement in
the JCL.
There is a sample job which includes IDIXDCAP in the IEAVTABX installation exit
list.
IDIXCEE exit
There is a sample job which will add IDIXCEE to the CEEEXTAN CSECT for
Language Environment for OS/390.
All user exits are normally passed two data structures. The first is a common
environment structure. The second is specific to the function being called.
How you use these user exits is entirely up to your site’s requirements, as well as
your imagination. We describe two examples in 2.7, “Hints and tips” on page 32.
For more specific information about these exits, refer to IBM Fault Analyzer for
OS/390 User’s Guide, SC27-0904.
Fault Analyzer comes with a sample batch job to help you with this.
As you can see, Fault Analyzer offers a tremendous amount of flexibility to enable
you to obtain critical information when and where you need it.
In this example, the options are specified as in-stream data. However, they can
also be specified in a data set. Here, we modify the following settings:
DETAIL, where we specify the LONG sub-option.
If the program, MYAPPL, abends, the options specified in the IDIOPTS DD will
be used to override the site’s default and customized settings.
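The JCL for this example is not reproduced here. A minimal sketch of such an in-stream override (the step and data set names are placeholders) might look like this:

//RUNAPPL  EXEC PGM=MYAPPL
//STEPLIB  DD DISP=SHR,DSN=your.application.loadlib
//IDIOPTS  DD *
  DETAIL(LONG)
/*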
We found that the Interactive Options and the Batch Options panels, which are
displayed when you initiate re-analysis, are a little short. As you can see in Figure
2-7, it is impossible to specify a value for the Member Name of the Options data
set, because it is hidden by the PF key values.
Figure 2-7 Fault Analyzer Interactive options panel with PF keys displayed
No command line is available, so you cannot turn off the PF key display on this
panel. Fortunately, you can clearly see that you need to use PF12 to issue
CANCEL to delay the start of the analysis (and turn off your PF key display).
Example 2-4 is a REXX exec that uses the Fault Analyzer Analysis Control user
exit to determine where a fault entry will be directed.
Example 2-4 Analysis control user exit - REXX exec
/* Rexx */
/**********************************************************************/
/* Exec: SendIt2 */
/* Function: Send an abend to the appropriate FA fault history file...*/
/* History: 06/15/2001 - LMK - Created */
/**********************************************************************/
/* */
/* This exit can optionally be used with IBM Fault Analyzer for */
/* OS/390 to direct the output of batch abends to an appropriate */
/* fault history file. */
/* */
/* On entry, two stems are provided: */
/* - ENV */
/* - CTL */
/* Both of these data areas are described in the User’s Guide. */
/* */
/* To use this exit, the name of the EXEC (in this example, */
/* SENDIT2 is used, but this can be any name) must be specified */
/* in an EXITS option as follows: */
/* */
/* EXITS(CONTROL(REXX((SENDIT2)))) */
/* */
/* For the exit to be invoked by Fault Analyzer, it must be made */
/* available via the IDIEXEC DDname: */
/* */
/* IDIEXEC(IDI.REXX) */
/* */
Exit
This REXX exec is identified to Fault Analyzer via the EXITS option in the
IDICNF00 member of SYS1.PARMLIB, as follows:
EXITS(CONTROL(REXX((SENDIT2))))
The USER_ID variable is interrogated to determine who submitted the batch job.
Production OPC jobs are sent to the production history file, while Test OPC jobs
are sent to the test history file. UAT OPC jobs are further distinguished by the
value of the JOB_CLASS variable.
All batch jobs submitted by the systems programmers (whose TSO user IDs all
begin with the letters DAVIN) are sent to individual fault history files.
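The body of the exec is not reproduced here. The fragment below is only a sketch of the kind of logic described above; the stem field names (for example, ENV.USER_ID, ENV.JOB_CLASS, and ENV.IDIHIST as the target history file) are assumptions, so check the data area descriptions in the User’s Guide for the actual names:

/* Sketch only - field names below are assumed, not verified       */
If Substr(ENV.USER_ID,1,5) = 'DAVIN' Then       /* systems programmer */
  ENV.IDIHIST = Strip(ENV.USER_ID)||'.HIST'     /* individual file    */
Else
  If ENV.JOB_CLASS = 'P' Then                   /* assumed prod class */
    ENV.IDIHIST = 'IDI.PROD.HIST'
  Else
    ENV.IDIHIST = 'IDI.TEST.HIST'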
The Fault Analyzer Notification user exit permits a file to be sent via SMTP to a
valid e-mail ID. With this in mind, we crafted such a user exit. The complete text
appears in Appendix A, “Fault Analyzer Notification user exit” on page 192.
Restriction: The ability to use SMTP in the Notification user exit became
available after applying PTF UQ55392. See 2.8, “Product updates” on
page 37 for additional information.
Figure 2-8 depicts a sample e-mail produced by this exit during our testing.
After you apply this PTF, the software level in the Fault Analyzer main panel
heading changes, as shown in Figure 2-9.
We were able to use this utility to convert our fault history file. Refer to Appendix
B, “Fault Analyzer fault history file conversion” on page 207 for our experiences.
We did not have sufficient time during our residency to use the other utility
functions.
We start this chapter with a description of the software levels that are required to
use File Manager. Then, rather than reiterate the contents of the user’s guide, we
take a detailed look at how the product can actually be used. We continue by
presenting examples of utility functions that you can use or modify for your
needs. We briefly review the creation and use of templates, and present some
useful information that was discovered during our research. We conclude with a
review of recent product updates.
You can validate this by starting a File Manager session in ISPF. Type VER on the
command line and review the message that is displayed. The panel should match
the one in Figure 3-1.
If your PTF level is lower than the one we have listed, apply the necessary
maintenance. All of the examples in this book were developed at this
maintenance level.
An explicit reference to the File Manager load library is required only if File
Manager is not installed in LINKLIST. An explicit reference to the COBOL
compiler load library is required only if the COBOL compiler is not installed in
LINKLIST, and when copybooks are processed into templates.
If File Manager and the COBOL compiler are installed in LINKLIST at your site,
your systems programmer should modify the ISPF skeleton, FMNFTEXC. You
may either comment or remove the STEPLIB statement.
We decided to use the File Manager function DSC (Data Set Copy), along with
some simple REXX code, to perform a very selective global find and replace.
Scenario
As a Production Support Specialist, you need to help an application developer
set up a portion of their job stream for a User Acceptance Test (UAT).
You need to take the production job card members (not the procedures) that
were created for production, and convert them to UAT standards. The changes,
depicted in Table 3-1, need to be made.
Table 3-1 Modifications to make in selected members of a PDS
Field        From    To
MSGCLASS     S       J
Note: Any jobs that invoke the program FTP must be copied, but must not be
changed. These jobs contain the string XMIT2 in the accounting information
parameter of the JOB card. To ensure that no transmissions occur, the program
name in the procedure will be changed from FTP to IEFBR14. (How to make this
change is not covered as part of this scenario.)
The File Manager program keyword DSC is used to invoke the Data Set Copy
function. The input and output files are identified, and the keyword PROC is used to
indicate that an in-stream REXX routine is being supplied.
The File Manager control cards indicate that all of the members should be
selected, and that if any already exist in the output file, they should be replaced.
This allows us to run this sample repeatedly.
The first line of the REXX routine selects only non-comment lines for processing.
Otherwise, the appropriate changes are made to the JCL and are written to the
output file.
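The REXX routine itself is not reproduced here. The following is only a sketch of the kind of logic described above, using the File Manager REXX functions covered later in this chapter; the test-job indicator (CLASS=T) and the exact strings are assumptions for illustration:

/* Pass JCL comment lines through unchanged                         */
IF FLD(1,3) = '//*' THEN EXIT
/* Jobs that transmit files (XMIT2 in the accounting field) are     */
/* copied without modification                                      */
IF CO(INREC,'XMIT2') THEN EXIT
/* Members built for test are not copied at all (assumed indicator) */
IF CO(INREC,'CLASS=T') THEN DO
  PRINT('MEMBER NOT COPIED BECAUSE IT IS FOR TEST')
  EXIT 'STOP IMMEDIATE'
END
/* Convert the production message class to the UAT standard         */
IF CO(INREC,'MSGCLASS=S') THEN
  OUTREC = OVERLAY('MSGCLASS=J',INREC,POS('MSGCLASS=S',INREC))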
Note: Each page in the report starts with the title, “IBM File Manager for OS/390.”
This report has been edited (represented by facing sets of slashes) to fit within
the confines of this section.
Example 3-2 Report of global find and replace
IBM File Manager for OS/390
$$FILEM DSC INPUT=DDIN,MEMBER=*,
$$FILEM OUTPUT=DDOUT,REPLACE=YES,
$$FILEM PROC=*
Member SVLD011P - Copied
12 record(s) copied: 0 truncated: 0 fields truncated
MEMBER NOT COPIED BECAUSE IT IS FOR TEST
Member SVLD011T - Copied
0 record(s) copied: 0 truncated: 0 fields truncated
The first page contains a copy of the input commands. This is followed by a
series of status messages that indicate the processing performed during the
copy.
The DSC function writes any of the PRINT statements from the REXX routine
before it writes its own statistics. These contain the name of the member and the
action taken (copied or replaced), followed by the number of records copied.
We found that when the number of records is zero (0), the member is not copied,
despite what the action indicates.
DSC
Copies data from one file to another. The file can be any of the File Manager
supported structures (VSAM, QSAM, or PDS).
CO
If the string being searched for is contained in the input record, then CONTAIN
returns 1. Otherwise, CONTAIN returns 0.
WRITE
Writes a record to the specified data sets. If the WRITE function is successful, it
returns a value of 0. If the WRITE function is unsuccessful, it raises the REXX
syntax error condition.
EXIT
In REXX, you can use the EXIT instruction to leave a procedure. You can
optionally specify a character string as a parameter on the EXIT instruction. This
character string is returned to the caller of the procedure.
STOP IMMEDIATE
The character string STOP IMMEDIATE tells File Manager to terminate the current
function without writing the current record to the primary output data set. When
used with DSC, the member is not copied.
In this example, we use File Manager to create one VSAM file by using another
as a model:
1. Access File Manager in your ISPF session.
2. Go to Catalog Services (Option 3.4) and list the VSAM files for your
application.
3. Select a file that has attributes which resemble those of the file you want to
create.
Note: If you are going to use the pull-down menus, your cursor must be on
the same line as the data set name. You can either scroll the list until the
file you want to work with is the first one displayed, or position your cursor
and press PF6 (PROCESS) to display the process pull-down.
4. Type LIST in the line commands area or select the Process pull-down and
select LIST.
The VSAM Entry Detail panel is displayed with information for the current file.
5. Press PF3 to return to the Data set list panel.
Figure 3-2 VSAM Define panel with model file’s attributes displayed
Important: You need to press a scroll key (PF7 or PF8) to remove the
message that is displayed. Do not press the Enter key. If you do, you will
receive an error message about duplicate catalog entries.
7. Change the data set name as well as the data and the index names to the
new file’s name by typing over the existing information.
Note: At this site, if we did not erase the value in the Catalog ID field, we
could not locate the file without explicitly pointing to the catalog and the
volume. Have your systems programmer validate the rules at your site with
your storage management group during a post-installation review.
If you need to, modify panel FMNPSCKD to set this field to null. A
completed example of this is shown in Appendix A, “File Manager ISPF
panel modifications” on page 198.
In this example, we use File Manager to perform that process in a way that does
not depend on different control cards for each file size.
To start, you need an empty VSAM file. You can use the method described
previously or IDCAMS control cards.
1. Access File Manager in your ISPF session.
2. Go to Data Create Utility (Option 3.1).
3. Enter the name of the new VSAM file.
4. Indicate the number of Records to be created.
5. Specify a Fillchar of x’00’ (binary zeros).
9. Press Enter.
The JCL for the batch job is displayed. It should resemble the code in
Example 3-3.
Example 3-3 Batch step to create low values record in VSAM file using DSG
//FILEMAN EXEC PGM=FILEMGR
//STEPLIB DD DSN=FMN.SFMNMOD1,DISP=SHR
//* DD DSN=IGY.SIGYCOMP,DISP=SHR
//SYSPRINT DD SYSOUT=*
//FMNTSPRT DD SYSOUT=*
//SYSTERM DD SYSOUT=*
//SYSIN DD *
$$FILEM DSG DSNOUT=DAVIN6.VSAM.LO.VALUE,
$$FILEM FILLCHAR=x'00',
$$FILEM DISP=OLD,
$$FILEM NLRECS=1
Save a copy of this JCL. In “Modify the JCL for generic use” on page 51, we
show you how it can be applied to every VSAM file you’ll ever need to initialize.
The fill character is specified as a hexadecimal zero and the number of logical
records is specified as one.
Note: Each page in the report starts with the title, “IBM File Manager for OS/390.”
Example 3-4 Report of DSG low value record creation
IBM File Manager for OS/390
$$FILEM DSG DSNOUT=DAVIN6.VSAM.LO.VALUE,
$$FILEM FILLCHAR=x'00',
$$FILEM DISP=OLD,
$$FILEM NLRECS=1
1 record(s) written
The first page contains a copy of the input commands. This is followed by a
message that states that the requested number of records were written to the
output file.
DSG
Initializes VSAM data sets, sequential data sets, and PDS members.
You specify the output data set name, the disposition, the number of logical
records, and the fill character.
To fill each byte of each record with data, specify one of the following:
char To write a character, such as 0, in each byte
X'cc' To write a binary character, such as X'04', in each byte
AN To write alphanumeric characters (A to Z and 0 to 9)
BIN To write binary characters (X'00' to X'FF')
RAND To write random binary characters (X'00' to X'FF')
We added the PROC statement and changed the DSG parameter DSNOUT to
OUTPUT. This lets you use an override statement in the JCL to point to your file.
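The procedure itself is not reproduced here. As a sketch only (the library and member names are placeholders, and the control statements are assumed to live in a PDS member because in-stream data cannot appear inside a procedure), DSGPROC might look something like this:

//DSGPROC  PROC
//DSGPROC  EXEC PGM=FILEMGR
//STEPLIB  DD DISP=SHR,DSN=FMN.SFMNMOD1
//SYSPRINT DD SYSOUT=*
//FMNTSPRT DD SYSOUT=*
//SYSTERM  DD SYSOUT=*
//DDOUT    DD DUMMY
//SYSIN    DD DISP=SHR,DSN=DAVIN6.WORK.CNTL(DSGINIT)

where the assumed member DSGINIT contains:

$$FILEM DSG OUTPUT=DDOUT,
$$FILEM FILLCHAR=x'00',
$$FILEM DISP=OLD,
$$FILEM NLRECS=1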
You can now have a batch job, similar to the one in Example 3-6, specify a VSAM
file to initialize.
Example 3-6 Invoking the DSG proc using JCL with a file override
//DAVIN6CC JOB ,CLASS=A,NOTIFY=&SYSUID,MSGCLASS=H,MSGLEVEL=(1,1)
//*
//* JOB TO INITIALIZE A VSAM FILE
//*
//PROCLIB JCLLIB ORDER=DAVIN6.WORK.PROCLIB
//*
//VSAMINIT EXEC DSGPROC
//DSGPROC.DDOUT DD DISP=OLD,DSN=any.vsam.file.to.initialize
//
The File Manager step of the batch job, shown in Example 3-7, uses an in-stream
REXX routine to process the records. The complete batch job is shown in
Appendix A, “File Manager batch job to process multi-record file” on page 199.
Example 3-7 Batch step to split a file into multiple parts using DSC
//FM EXEC PGM=FILEMGR
//STEPLIB DD DSN=FMN.SFMNMOD1,DISP=SHR
//* DD DSN=IGY.SIGYCOMP,DISP=SHR
//SYSPRINT DD SYSOUT=*
//RECORDS DD DISP=SHR,DSN=DEMOS.PDPAK.SAMPLES(SAMPFIL1)
//REC01 DD DISP=OLD,DSN=DAVIN6.SPLIT.REC01
//REC02 DD DISP=OLD,DSN=DAVIN6.SPLIT.REC02
//REC03 DD DISP=OLD,DSN=DAVIN6.SPLIT.REC03
//EXTRA DD DISP=OLD,DSN=DAVIN6.SPLIT.EXTRA
//SYSIN DD *
$$FILEM DSC INPUT=RECORDS,
$$FILEM OUTPUT=EXTRA,
$$FILEM PROC=*
DDNAME = 'REC' || FLD(1,2)
IF NCO(FLD(1,2),1,2,3) THEN DO
WRITE(DDNAME)
EXIT 'DROP'
END
/+
/*
The File Manager program keyword DSC is used to invoke the Data Set Copy
function. The input and output files are identified, and the keyword PROC=* is used
to indicate that an in-stream REXX routine is being supplied.
The first line of the routine sets a variable, DDNAME, to the value of RECxx,
where xx matches the two-byte value in the record (using the FLD function)
starting in position 1 for a length of 2.
The second line of the routine checks for the numeric contents of the same
portion of the record, to see if it matches 1, 2, or 3.
The result is that all type 01 records end up in REC01, type 02 records go to
REC02, type 03 records go to REC03, and all other record types go to the file
EXTRA.
Note: Each page in the report starts with the title, “IBM File Manager for OS/390.”
Example 3-8 Report of DSC multiple record split
IBM File Manager for OS/390
DSC WRITE summary report
------------------------------------------------------------------------------
Total records written to REC01 = 20
Total records written to REC02 = 20
Total records written to REC03 = 15
IBM File Manager for OS/390
67 record(s) read
12 record(s) copied: 0 truncated: 0 fields truncated
The first page contains the output of the record split operation (a copy). Notice
that you do not have to do any extra programming to obtain the number of
records sent to each file; File Manager does that automatically.
The second page contains the total number of records processed. In this case,
12 records did not meet any of the selection criteria, and were written to the
default file (EXTRA).
FLD
Returns the value of a field from the current input record (INREC), starting at
start_column, of length number of bytes, interpreted according to the specified
type:
B If the field is binary. If you specify B for type, length must be 2, 4, or 8.
C If the field contains characters.
NCO
If the numeric value of any of the match arguments is equal to the numeric value
of number, then NCONTAIN returns 1. Otherwise, NCONTAIN returns 0.
WRITE
Writes a record to the specified data sets. If the WRITE function is successful, it
returns a value of 0. If the WRITE function is unsuccessful, it raises the REXX
syntax error condition.
EXIT
In REXX, you can use the EXIT instruction to leave a procedure. You can
optionally specify a character string as a parameter on the EXIT instruction. This
character string is returned to the caller of the procedure.
DROP
The character string DROP tells File Manager to not write the current record to the
primary output data set.
Compare this with another product
The code to perform a similar extract of records, using IBM’s DFSORT, is shown
in Example 3-9.
Example 3-9 Batch step to split a file into multiple parts using DFSORT
//SORT EXEC PGM=SORT
//SYSOUT DD SYSOUT=*
//SORTIN DD DISP=SHR,DSN=DEMOS.PDPAK.SAMPLES(SAMPFIL1)
//REC01 DD DISP=OLD,DSN=DAVIN6.SORT.REC01
//REC02 DD DISP=OLD,DSN=DAVIN6.SORT.REC02
//REC03 DD DISP=OLD,DSN=DAVIN6.SORT.REC03
//EXTRA DD DISP=OLD,DSN=DAVIN6.SORT.EXTRA
//SYSIN DD *
SORT FIELDS=(1,2,CH,A)
OUTFIL FNAMES=REC01,INCLUDE=(1,2,CH,EQ,C'01')
OUTFIL FNAMES=REC02,INCLUDE=(1,2,CH,EQ,C'02')
OUTFIL FNAMES=REC03,INCLUDE=(1,2,CH,EQ,C'03')
OUTFIL FNAMES=EXTRA,SAVE
/*
The code to perform this function with File Manager is shown in Example 3-10.
Example 3-10 File Manager string replace batch step
//*
//* FILE MANAGER BATCH: REPLACE A STRING IN A SPECIFIC LOCATION
//*
//STEP01 EXEC PGM=FILEMGR
//STEPLIB DD DSN=FMN.SFMNMOD1,DISP=SHR
//* DD DSN=IGY.SIGYCOMP,DISP=SHR
//SYSPRINT DD SYSOUT=*
//FMNTSPRT DD SYSOUT=*
//SYSTERM DD SYSOUT=*
//DDIO DD DISP=OLD,DSN=YOUR.FILE.TO.EDIT
//SYSIN DD *
$$FILEM DSU INPUT=DDIO,
$$FILEM PROC=*
OUTREC=OVERLAY('VALUE',INREC,11)
/+
The utility reads records sequentially from the input file. When File Manager
processes them, it uses two built-in REXX variables, INREC and OUTREC, to refer to
the input and output records.
The code to perform this function with File Manager is shown in Example 3-13.
Example 3-13 File Manager copy selected variably blocked records batch step
//*
//* FILE MANAGER BATCH: COPY SELECTED VB RECORDS TO TEST
//*
//STEP01 EXEC PGM=FILEMGR
//STEPLIB DD DISP=SHR,DSN=FMN.SFMNMOD1
//* DD DISP=SHR,DSN=IGY.SIGYCOMP
The DDOUT statement describes the new file that is allocated to contain the
records we want.
The File Manager program keyword DSC is used to invoke the Data Set Copy
function. The input and output files are identified, and the keyword PROC is used to
indicate that an in-stream REXX routine is being supplied.
The first line of the routine checks the contents of the record (using the function
FLD), starting in position 14 for a length of 1, to see if it matches one of the listed
values.
If it does, the second line of the routine writes out the records to the default
output file (DDOUT). Otherwise, the records are ignored.
The first page shows the number of records File Manager wrote to DDOUT; the
number of records that were not processed is not displayed.
Note: The COPY parameter RDW=3 indicates that the 4-byte record descriptor word
should be ignored; therefore, the field offset starts at position 1.
The file, DDIN (the default input file for the FCH function), is the PDS you want to
search.
The fifth line ignores any records that do not contain the strings.
12s IF=(1,0,C'UNIT=CART'),
14s IF CO(INREC,'UNIT=CART') | ,
15s CO(INREC,'UNIT=TAPE') THEN DO
833s // DISP=(,CATLG,DELETE),UNIT=CART,EXPDT=99000,
862s // DISP=(,CATLG,DELETE),UNIT=CART,EXPDT=99000,
9s SRCHFOR 'UNIT=CART'
10s SRCHFOR 'UNIT=TAPE'
IF CO(INREC,'UNIT=CART') | ,
CO(INREC,'UNIT=TAPE') THEN
EXIT
ELSE
EXIT 'DROP'
Each of the members in which either one of the strings was found is listed. The
lines on which the strings were found are displayed.
Notice that our test file is still in use; no search was performed on this member
(otherwise, the string would have been found there, as well).
The summary statistics appear at the end of the report, along with a display of
the search commands.
The code to perform this search using ISPF SuperC is shown in Example 3-19.
Example 3-19 ISPF SuperC string find in a PDS batch step
//*
//* ISPF SUPERC BATCH: SEARCH FOR STRING
//*
//STEP01 EXEC PGM=ISRSUPC,
// PARM=(SRCHCMP,'ANYC')
//NEWDD DD DISP=SHR,DSN=YOUR.CHANGE.MGMT.UAT.JCLLIB
//OUTDD DD SYSOUT=*
//SYSIN DD *
SRCHFOR 'UNIT=CART'
SRCHFOR 'UNIT=TAPE'
/*
Figure 3-4 Processing Options section on File Manager Data Set Edit panel
Whenever you use a copybook (or a template) to process a data set, and select
Above as a value for Processing Options, File Manager saves the association
information in a member of your ISPF profile.
That information can be used at a later time. For example, you know you want to
edit a data set, but you do not recall the name of the copybook. If you select
Previous, you can leave the Copybook Data set name and Member fields blank.
File Manager will supply them, based on the retained profile information.
Please note: If the member is no longer in the data set, or if the data set no
longer exists, you will receive one of two messages:
No matching member name
Data set not found
Even with this drawback, this is still a useful feature for development efforts.
In this example, we use File Manager to create a template with the substituted
values.
1. Review the source statements in your application program to ascertain the
value of the pseudo-text and the string that should be substituted.
2. Access File Manager in your ISPF session.
3. Go to Templates (Option 7).
The Template Workbench panel is displayed.
4. Enter the copybook data set name and member.
5. Enter the template data set name and member.
6. Select the Options pull-down.
7. Type 1 to adjust your File Manager processing options, as depicted in Figure
3-5, and press Enter.
8. On the Set Processing Options panel, adjust the COBOL Replacing Options.
a. Enter the From string (the pseudo-text) that is found in the copybook.
b. Enter the To string (the string) that should be placed in the source program
at compile time.
Your panel should look similar to the one displayed in Figure 3-6.
You can have up to five sets of values specified in your profile. The substitution of
values is based on the strings found in your copybook.
File Manager displays data if a record structure matches a copybook file layout.
For File Manager to perform this task with different members from the same data
set, you must provide a single point of reference.
To do that, create a new copybook, and using a valid COBOL construct, include
the other copybooks in it using the COPY command. Figure 3-7 depicts what this
could look like.
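As an illustration (the copybook member names are hypothetical, and the included members are assumed to contain the lower-level items), such a driver copybook could simply consist of one level-01 item per record type, each pulling in an existing copybook:

      * Driver copybook covering every record type in the file
       01  CUSTOMER-RECORD.
           COPY CUSTREC.
       01  ORDER-RECORD.
           COPY ORDRREC.
       01  TRAILER-RECORD.
           COPY TRLRREC.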
File Manager can now manipulate this one copybook and create a template for it
like any other copybook. You now have the ability to view or edit the entire file
without any errors.
The code in this skeleton, shown in Example 3-21, determines if any of the Set
Processing Options (Option 0) values are filled in for a job card. If none of them
are, then a job card is created dynamically from this skeleton.
Example 3-21 ISPF skeleton FMNFTJOB
)CM
)CM ISPF file tailoring job card skeleton.
)CM
)CM IBM File Manager
)CM
)CM 5697-F20
)CM (C) Copyright IBM Corp. 2000 All rights reserved.
)CM
)CM The source code for this program is not published or otherwise
)CM divested of its trade secrets, irrespective of what has been
)CM deposited with the U.S. Copyright Office.
)CM
)CM Modifications:
Pay particular attention to the CLASS and MSGCLASS parameters in the JOB card.
We found that message class A does not produce any output at this site. It took
several phone calls and e-mails to verify the product was working correctly. After
we changed the message class to H, we saw the output we expected.
If the ISPF profile data set does not have enough space, at some point some
program product (not necessarily File Manager) will not be able to update a table.
You should review the default attributes of the ISPF profile data set that is created
for new application programmers and make any appropriate changes. Also,
consider reviewing the size of existing profile data sets to see if they are
approaching either a directory block or extent limit.
APF authorization
The IBM File Manager for OS/390 Installation and Customization Guide,
GC27-0814, has some interesting notes regarding File Manager and APF
authorization.
You can determine if File Manager is APF authorized only by executing a batch
job and including the following input statement:
$$FILEM VER
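A minimal batch job for this check, modeled on the other File Manager batch examples in this chapter (the load library name follows those examples and may differ at your site), might look like this:

//VERCHK   EXEC PGM=FILEMGR
//STEPLIB  DD DISP=SHR,DSN=FMN.SFMNMOD1
//SYSPRINT DD SYSOUT=*
//FMNTSPRT DD SYSOUT=*
//SYSTERM  DD SYSOUT=*
//SYSIN    DD *
$$FILEM VER
/*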
Note: Our recommendation for new users (even though we know it uses up
valuable screen real estate) is to turn on the function key display, until you get
used to the product. To do that, use either one of the following commands:
FKA ON
PFSHOW ON
We discovered File Manager has twenty-nine different keylists. Even to us, this
seemed excessive. But, as you navigate through the product, the function key
values change. You need to be aware of them.
Note: We found only one IBM product that had more keylists. That was IBM
BookManager, with 49. So do not say there is no trivia to be learned from our
redbook.
When you use Data Set Browse (Option 1) to browse a file which has a key and
you are using a copybook (or template), File Manager automatically displays a
screen with a Key field.
To locate a record whose key you know, simply type it (or the starting value) in
the Key field.
An example, shown in Figure 3-8, depicts a VSAM file at the first record of the
file.
Type s in the Key field and press Enter to automatically scroll to records that start
with that key, as shown in Figure 3-9.
This feature is only available when records are displayed in TABL or SNGL format
(which requires the use of a copybook or template). Regrettably, this feature does
not appear in Data Set Edit (Option 2).
We recommend that you always review the compilation listing. This way you can
quickly judge how much work is involved in correcting the error.
After you select Option 1, the compile listing is displayed using the Print Browse
function, as shown in Figure 3-11.
To locate the start of your copybook, issue the FIND command with your
copybook as the string. Then issue the RFIND command. (There is no LAST
operand for the Find command in the Print Browse function).
Review the listing to determine what changes are necessary. Press PF3 twice to
return to the Template Workbench panel.
You must extract the record layout and create a copybook to perform any file
manipulation with File Manager.
We suggest a certain amount of caution when doing this. You should use
whatever resources are available to you so that you do not create multiple
versions of the same file layout.
This may have been related to the Systems Managed Storage (SMS) rules at this
site.
Depending on your file structure, this might not matter to you. Otherwise, you will
have more data in your file than you may have intended. You just have to be
careful.
There are now three different elements of File Manager, all contained in one
program product:
File Manager for z/OS and OS/390 (the base product), for working with z/OS
or OS/390 data sets (QSAM data sets, VSAM data sets and PDS members)
File Manager/DB2 Feature, for working with DB2 data
File Manager/IMS Feature, for working with IMS data
When you type VER on the command line, the panel shown in Figure 3-12 is
displayed.
File Manager Settings (Option 0) have been split into several panels. There are
now more options to specify, as shown in Figure 3-13.
Field reference numbers begin counting from 1 at the start of each record type in
a template; field #1 always refers to the level-01 group item in the current record
type. (In Version 1, field reference numbers continued incrementing between the
record types in a template.)
When browsing or editing data in SNGL or TABL display format, the information
displayed next to (SNGL) or above (TABL) each field now includes: data type,
starting column and length.
Figure 3-14 depicts how Version 1 displays data being edited in SNGL mode,
while Figure 3-15 depicts how Version 2 displays the same data.
Please keep this in mind: Debug Tool requires the TEST option at both
compile-time and runtime. Each use of the keyword has a different set of
sub-options. We include a review of each one so you can see how they are used.
CICS Version 4.1                      PQ36558   Adds support for the Debug Tool side file
CICS Transaction Server 1.2 and 1.3   PQ36683   Adds support for the Debug Tool side file
Additional notes
The following PTFs are currently required (at the time of writing) for the Dynamic
Debug feature:
OS/390 V2R6 and above
– UQ54286, UQ54287, and UQ54288 (or newer)
Important: Please note that these PTFs have been superseded by the
ones listed above:
OS/390 V2R6 through OS/390 V2R9
– UQ43269, UQ43270, and UQ43271
OS/390 2.10 and above
– PTFs UQ49030, UQ49031, and UQ49032
Note: COBOL for OS/390 & VM requires the appropriate PTFs to be applied to
support these options, as described in 4.1.1, “APAR information” on page 76.
TEST causes the compiler to create symbol tables and to insert program hooks at
selected points in your application program’s object module. Debug Tool uses the
symbol tables to obtain information about program variables and the program
hooks to gain control of your application program during its execution. Debug
hooks increase the size of the object module and can decrease runtime
performance.
You can specify any combination of sub-options; however, you can specify
SEPARATE only when SYM is in effect.
Usage notes
When you use TEST with or without any of the sub-options, the OBJECT compiler
option goes into effect.
When you use any TEST sub-option other than NONE, the NOOPTIMIZE compiler
option goes into effect. TEST(NONE,SYM) does not conflict with OPTIMIZE, which
allows you to debug optimized application programs with some limitations.
We recommend that you use LIST and retain the compiler listings, despite the
increase in output file size and the possible performance degradation of
Debug Tool. Refer to “Implementing the tools in your environment” on
page 101 for a complete discussion.
The attributes of the files that can be used by Debug Tool are listed in Table 4-3.
Table 4-3 Files created by the compiler for use by Debug Tool
Data set type   Record format   Record length   Structure   Notes
//DAVIN6CC JOB ,CLASS=A,NOTIFY=&SYSUID,MSGCLASS=H,MSGLEVEL=(1,1)
//*
//********************************************************************
//********************************************************************
//********************************************************************
//*
// PARM=(DYNAM,LIB,RENT,APOST,MAP,XREF,LIST,NOSEQ,
// NONUMBER,'TEST(ALL,SYM,SEPARATE)')
//STEPLIB DD DISP=SHR,DSN=IGY.V2R1M0.SIGYCOMP
//SYSLIB DD DISP=SHR,DSN=DAVIN6.PDPAK.COPYLIB
//SYSIN DD DISP=SHR,DSN=DAVIN6.PDPAK.SOURCE(TRADERB)
//SYSLIN DD DISP=(MOD,PASS),DSN=&&LOADSET,UNIT=SYSALLDA,
// SPACE=(CYL,(1,1))
//SYSUT1 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSUT3 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSUT4 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSUT5 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSUT6 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSUT7 DD SPACE=(CYL,(1,1)),UNIT=SYSALLDA
//SYSPRINT DD DISP=SHR,DSN=DAVIN6.PDPAK.LISTING(TRADERB)
//SYSDEBUG DD DISP=SHR,DSN=DAVIN6.PDPAK.SIDEFILE(TRADERB)
//*
// PARM='LIST,XREF,MAP,RENT'
//SYSLIB DD DISP=SHR,DSN=CEE.SCEELKED
//SYSLMOD DD DISP=SHR,DSN=DAVIN6.PDPAK.LOADLIB
//SYSLIN DD DISP=(OLD,DELETE),DSN=&&LOADSET
// DD DDNAME=SYSIN
//SYSPRINT DD SYSOUT=*
//SYSUT1 DD UNIT=SYSALLDA,DCB=BLKSIZE=1024,
// SPACE=(1024,(200,20))
//SYSIN DD *
NAME TRADERB(R)
In this example, we use the full capabilities of Debug Tool and specify the
SEPARATE sub-option of TEST. As a result, we also include a partitioned data set
and member name in the SYSDEBUG DD statement to contain the symbolic
table output.
4.2.6 Summary
Debug Tool requires the use of the COBOL compile time option TEST.
After Debug Tool is invoked, it gains control of your application program and
suspends execution to allow you to perform tasks like setting breakpoints,
checking the value of variables, or examining the contents of storage.
At the present time, the most effective location to execute a batch application
program is in TSO READY mode. This allows you to enter any of the necessary file
allocations and the application program invocation command.
You can also run your debug session in batch mode with a commands file
(script).
Note: (1) SEPARATE and NOSEPARATE are available only for COBOL for OS/390 programs.
Notes
When an asterisk is used in place of an actual commands file, the terminal is
used as the source of the commands. However, when Debug Tool is run in batch
mode, a commands file is required.
To generate this report, include the parameter when you execute your application
program, as shown in Example 4-2.
Example 4-2 Program execution requesting list of LE runtime options
//*********************************************
//GO EXEC PGM=TRADERB,
// PARM='/RPTOPTS(ON)'
//STEPLIB DD DISP=SHR,DSN=DAVIN6.WORK.LOADLIB
The complete report listing from this example can be found in Appendix A,
“Language Environment runtime options report” on page 201.
At the present time, Debug Tool does not provide a utility or mechanism to
allocate the files required for a debugging session. You must issue the TSO
ALLOCATE commands either at the READY prompt or by using a CLIST or REXX
exec.
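As a sketch only, a REXX exec for the TRADERB batch example used elsewhere in this chapter could perform the allocations and start the program as shown below; the DD names and data set names follow the batch JCL shown later in this chapter and are placeholders for your own files:

/* REXX - allocate the files TRADERB needs, then start it under     */
/* Debug Tool from TSO READY mode (data set names are placeholders) */
"ALLOC FI(COMPFILE) DA('DAVIN6.PDPAK.COMPFILE') SHR REUSE"
"ALLOC FI(CUSTFILE) DA('DAVIN6.PDPAK.CUSTFILE') SHR REUSE"
"ALLOC FI(TRANSACT) DA('DAVIN7.PDPAK.TRANFILE') SHR REUSE"
"ALLOC FI(REPOUT)   DA(*) REUSE"   /* report output to the terminal */
"ALLOC FI(TRANREP)  DA(*) REUSE"
"CALL 'DAVIN7.PDPAK.LOAD(TRADERB)' '/TEST'"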
If you are not familiar with using either CLISTs or REXX execs to perform file
allocations, the following books (depending on your coding language preference)
will come in handy:
OS/390 TSO/E CLISTs, SC28-1973
OS/390 TSO/E User’s Guide, SC28-1974
OS/390 TSO/E Command Reference, SC28-1969
OS/390 TSO/E REXX User’s Guide, SC28-1968
OS/390 TSO/E REXX Reference, SC28-1975
Additional information
Note: You can use a COBOL listing or a side file, but not both.
If you use the SEPARATE sub-option of TEST at compile time, you cannot specify a
compiler listing to Debug Tool at runtime. This is because the side file actually
contains the listing.
The contents of a Log file can be edited and then reused as a member of a
Commands file.
Note: When debugging in batch mode, use QUIT to explicitly end your session.
Example 4-4 Invoking Debug Tool via batch job
//DAVIN7X JOB CLASS=A,MSGCLASS=H,MSGLEVEL=(1,1),
// REGION=32M,NOTIFY=&SYSUID
//*****************************************************************
//* JCL TO RUN A BATCH DEBUG TOOL SESSION
//* PROGRAM TRADERB WAS PREVIOUSLY COMPILED WITH THE COBOL
//* COMPILER TEST OPTION
//*****************************************************************
//*
//STEP1 EXEC PGM=TRADERB,
// PARM='/TEST(,INSPIN,,)'
//*
//STEPLIB DD DISP=SHR,DSN=DAVIN7.PDPAK.LOAD
// DD DISP=SHR,DSN=EQAW.V1R2M0.SEQAMOD
//SYSPRINT DD SYSOUT=*
//SYSABEND DD SYSOUT=*
//COMPFILE DD DISP=SHR,DSN=DAVIN6.PDPAK.COMPFILE
//CUSTFILE DD DISP=SHR,DSN=DAVIN6.PDPAK.CUSTFILE
//TRANSACT DD DISP=SHR,DSN=DAVIN7.PDPAK.TRANFILE
//REPOUT DD SYSOUT=*
//TRANREP DD SYSOUT=*
//*
//INSPIN DD DISP=SHR,DSN=DAVIN7.DT.COMMANDS
//INSPLOG DD SYSOUT=*,DCB=(LRECL=72,RECFM=FB)
//*
To invoke your application program with Debug Tool, you have two options:
Using TSO commands
Using the TSO Call Access Facility
TSO commands
To use the TSO command interface to start executing your application program,
issue the DSN command to invoke DB2. Then, issue the RUN subcommand and
include the TEST runtime option as a parameter.
For example:
CALL 'change.mgmt.test.loadlib(progname)' '/TEST'
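For a program that accesses DB2, the DSN/RUN form described above might look like the following sketch; the subsystem name and plan name are placeholders:

DSN SYSTEM(DB2T)
RUN PROGRAM(progname) PLAN(planname) LIB('change.mgmt.test.loadlib') PARMS('/TEST')
END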
Figure 4-3 depicts the panel displayed after the DTCN transaction is issued.
You enter any one or more of the fields of the application program you want to
debug:
Terminal ID
Transaction ID
Program ID
User ID
Change or enter any applicable LE runtime options that are needed for your
application program. This allows you to modify settings and include runtime
options without adding a user runtime module (CEEUOPT) to your program:
Press PF3 to return to the DTCN primary menu.
Press PF4 to save your changes.
Press PF3 to exit DTCN.
DTCN stores one profile for each DTCN terminal. This profile is retained until it is
explicitly deleted, or CICS is brought down.
Note: When you are finished testing, invoke DTCN again to turn off debugging:
press PF6 to delete the profile, and then press PF3 to exit DTCN.
Figure 4-5 depicts the start of a debugging session with the sample batch
program, TRADERB.
Source window
This window displays the application program source or listing, with the current
statement highlighted. In the prefix area at the left of this window, you can enter
commands to set, display, and remove breakpoints. There is also an optional
suffix area on the right that can be used to display frequency counts.
Log window
This window records and displays your interactions with Debug Tool and,
optionally, shows program output. This window contains the same information as
the log file. You can exclude STEP and GO commands from appearing by
specifying SET ECHO OFF in your Preferences file.
AT
The AT command defines a breakpoint. When the breakpoint is reached, your application
program’s execution is temporarily suspended so that you can review the
processing that has already taken place or issue other Debug Tool commands.
Example:
at line 334 list “about to setup files”;
go;
Result:
AT LINE 334
LIST "About to set up files" ;
GO ;
EQA1140I About to set up files
CLEAR
The CLEAR command removes breakpoints set with AT, or erases the contents of the
Log window.
Examples:
clear at;
clear log;
Note: The last example does not clear the contents of a Log file directed to
SYSOUT in a batch job.
COMPUTE
The COMPUTE command assigns the value of an arithmetic expression to a
WORKING-STORAGE variable.
Example:
compute holdings = dec-no-shares * 10;
DESCRIBE
The DESCRIBE command displays information about the application program,
variables, and the environment.
Example:
describe attributes ws-current-date;
Result:
DESCRIBE ATTRIBUTES WS-CURRENT-DATE ;
EQA1102I ATTRIBUTES for WS-CURRENT-DATE
EQA1105I Its length is 8
EQA1103I Its address is 089826CD
EQA1112I 02 TRADERB:>WS-CURRENT-DATE
EQA1112I 03 TRADERB:>WS-YR XXXX DISP
EQA1112I 03 TRADERB:>WS-MM XX DISP
EQA1112I 03 TRADERB:>WS-DD XX DISP
DISABLE / ENABLE
The DISABLE command makes the AT breakpoint inoperative, but does not clear
it; you can ENABLE it later without typing the entire command again.
Example:
disable at statement 334;
LIST
The LIST command displays information about a program, such as the values of
variables, frequency information, and the like.
Use parentheses around WORKING-STORAGE variables to prevent any confusion with
actual LIST operands.
Example:
list (ws-current-date);
Refer to 4.6.4, “Recording how many times each source line runs” on page 99,
for an example of the use of LIST FREQUENCY.
MONITOR
The MONITOR command allows you to observe changes to WORKING-STORAGE
variables in the Monitor window while the program executes.
Example:
monitor list dec-no-shares;
Result:
The current value of DEC-NO-SHARES is displayed in the Monitor window and is
refreshed as the program executes.
MOVE
The MOVE command transfers data from one area of storage to another. This
allows you to manipulate the contents of WORKING-STORAGE variables, and possibly
alter the flow of the program as it executes.
Example:
move 250 to dec-no-shares;
QUERY
The QUERY command displays the values of Debug Tool settings and information
about the current program. There are more than 30 forms to this command.
Example:
query location;
SET
The SET command sets various switches that affect the operation of Debug Tool.
Example:
set echo off;
Result:
STEP and GO statements do not appear in the Log window, but they do go to the
Log file.
STEP
The STEP command causes Debug Tool to execute your program one (or more)
statements at a time.
Example:
step 5;
Result:
Debug Tool will execute five lines of code, one line at a time.
For a complete description of all the available commands, refer to Debug Tool
User’s Guide and Reference, SC09-2137.
Normally, debug hooks are added into the object module when you specify the
TEST compiler option with any of its sub-options (except NONE). Debug hooks
increase the size of the object and can decrease runtime performance. Dynamic
Debug allows you to create smaller objects by removing the need for compiled-in
debug hooks. It also gives you the benefit of not having to recompile your
COBOL application programs prior to moving them into production.
To prepare your application program for Dynamic Debug, compile it using the
TEST(NONE,SYM) compile-time option.
Important: No debug hooks are created; however, symbolic debug tables are
created.
The symbolic debug tables allow you to access variables and other symbol
information while you are debugging. The symbolic debug tables are placed in
the object by default. To further reduce the size of the load module, consider
using the other new feature of Debug Tool, called Separate Debug File.
To prepare your application program for Separate Debug File, you need to use
the new SEPARATE sub-option of the TEST compiler option. When you use the
SEPARATE sub-option, you also need to include a SYSDEBUG DD statement in
your JCL. The compiler stores the symbolic debug tables in the file or data set
specified on the SYSDEBUG DD statement.
You can use the SEPARATE sub-option with the Dynamic Debug feature to create
the smallest modules that are still debuggable. Use the following compiler option
to create these small modules:
TEST(NONE,SYM,SEPARATE)
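The exact compile JCL depends on your site’s procedures, but a minimal sketch of a
compile step that uses this option follows. The compiler program IGYCRCTL is the
COBOL for OS/390 compiler; the data set names shown are placeholders, and the work
file DD statements from your standard compile procedure are still required.
//* Sketch only: data set names are placeholders; the compiler work
//* files (SYSUT1 through SYSUT7) from your standard procedure are
//* omitted for brevity.
//COMPILE EXEC PGM=IGYCRCTL,
// PARM='LIST,MAP,SOURCE,XREF,TEST(NONE,SYM,SEPARATE)'
//STEPLIB  DD DISP=SHR,DSN=IGY.SIGYCOMP
//SYSLIB   DD DISP=SHR,DSN=DEMOS.PDPAK.COPYLIB
//SYSIN    DD DISP=SHR,DSN=YOUR.COBOL.SOURCE(TRADERB)
//SYSLIN   DD DISP=SHR,DSN=YOUR.OBJECT.LIB(TRADERB)
//SYSPRINT DD SYSOUT=*
//SYSDEBUG DD DISP=SHR,DSN=YOUR.SYSDEBUG.PDSE(TRADERB)
The compiler writes the symbolic debug tables to the data set specified on the
SYSDEBUG DD statement.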
If you have been saving your listings to help debug future production abends, you
can now save the separate debug files. The size of a symbolic debug table is
significantly smaller than a listing.
Note: To take advantage of the smaller load module and separate debug files,
you must modify your compiler procedures. This can have some impact on
your change management process.
CICS set-up
After Debug Tool has been properly installed, the following tasks still need to be
performed to set up the product.
1. Refresh the CICS definitions for Debug Tool.
You can find these definitions in the members EQACCSD and EQACDCT of
the EQAW.V1R2M0.SEQASAMP data set.
Note: If you have CICS Transaction Server, you can omit the EQACDCT
updates and remove the comments from the transient data definitions at the
end of EQACCSD.
2. Update the JCL that starts CICS (a sample fragment follows this list):
a. Include Debug Tool’s load library (EQAW.V1R2M0.SEQAMOD) and the
Language Environment runtime load library (CEE.SCEERUN) in the
DFHRPL concatenation.
b. Include EQA00DYN from Debug Tool’s load library in the STEPLIB
concatenation. You can do this in one of two ways:
• APF authorize the EQAW.V1R2M0.SEQAMOD data set and add the
data set to the STEPLIB concatenation.
• Copy the EQA00DYN module from the EQAW.V1R2M0.SEQAMOD
data set to a library that is already in the STEPLIB concatenation.
c. Ensure that no DD statements exist for:
• CINSPIN
• CINSPLS
• CINSPOT
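As a rough illustration of step 2, the relevant DD statements in the CICS startup
JCL might look like the following sketch. The CICS library names are placeholders
for your existing concatenations, and the sketch assumes that the
EQAW.V1R2M0.SEQAMOD data set has been APF authorized so that it can be added to
the STEPLIB concatenation.
//* Sketch only: the CICS library names are placeholders for your
//* existing STEPLIB and DFHRPL concatenations.
//STEPLIB  DD DISP=SHR,DSN=YOUR.CICS.SDFHAUTH
//         DD DISP=SHR,DSN=EQAW.V1R2M0.SEQAMOD
//DFHRPL   DD DISP=SHR,DSN=YOUR.CICS.SDFHLOAD
//         DD DISP=SHR,DSN=EQAW.V1R2M0.SEQAMOD
//         DD DISP=SHR,DSN=CEE.SCEERUN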
APAR information
As a final reminder, 4.1.1, “APAR information” on page 76, contains a list of the
software updates required to use these features.
Compiler option                   Load module size (hex)
NOTEST                            3770
TEST(ALL,SYM,NOSEPARATE)          BC00
TEST(NONE,SYM,SEPARATE)           38A0
Clearly, the first load module is the smallest; however, there does not appear to
be that much of a difference between it and the third.
Each site should make the determination based on their business practices.
Tip: You can specify the new debug file or data set name using the SET
DEFAULT LISTINGS or SET SOURCE commands in your command file.
Alternatively, you can press PF4 at the start of your debug session to select a
file from a list, or to enter a new location.
This is useful if your change management software moves the file to different
libraries when the load module is promoted.
Example 4-5 shows an extract of a log file from one of our sample application
programs.
Example 4-5 Frequency counts from TRADERB
* Frequency of statement executions in TRADERB
* 330.1 = 1
* 332.1 = 1
* 334.1 = 1
* 336.1 = 1
* 339.1 = 1
* 340.1 = 1
* 341.1 = 1
* 342.1 = 1
* 343.1 = 1
* 344.1 = 1
* 346.1 = 1
* 348.1 = 1
* 351.1 = 0
...
* 806.1 = 2
* 807.1 = 2
* 809.1 = 0
* 811.1 = 2
* 813.1 = 0
* Total Statements=264 Total Statements Executed=137 Percent
* Executed=52
Important: If you have read the preceding chapters, you know that Fault
Analyzer and Debug Tool each have an output component called a side file.
What, you mean you just jumped here for some quick answers? Shame on
you! Go back and (at least) read Chapter 1, “Overview of the Problem
Determination Tools” on page 3.
Please keep in mind: They are not the same file; they do not have the same
construct; they merely share the same name.
For Fault Analyzer to provide the greatest degree of failure analysis at the time an
application program abends, you need a compiler listing or a side file; you do not
need both.
When Fault Analyzer attempts to analyze an abend, it looks for source line
information in the following order:
It looks for a side file.
If one cannot be located, then Fault Analyzer looks for a compiler listing.
– If a listing is found, then Fault Analyzer generates a side file (and places it
in a temporary data set that will be deleted after the analysis is complete).
– If a listing cannot be found, then Fault Analyzer is not able to provide
source line detail, although it can still provide an analysis of the abend.
When application programmers have the source line information available at the
time of an abend, they need less time to correct the problem.
5.1.1 Listings
Fault Analyzer requires the assembler-level instructions contained in a COBOL
application program, and the relationship among, and location of, the data areas,
to determine the exact instruction at which an abend occurred. This information is
available in a compiler listing produced with the SOURCE, LIST, MAP, and XREF
compiler options.
Note: These options are also used by other, third-party debugging and dump
analysis program products to obtain the assembler-level instructions for either
compile-time or post-compile processing.
If you already have the DASD allocated (or designated) to retain these listings,
you can, and should, continue to save them.
Naming conventions
Fault Analyzer requires that the compiler listing be retained as a member of a
partitioned data set (PDS) or a PDS/E. The member name of the listing must
match the name of the program.
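For example, the compile step in your batch job can write the listing directly to
such a member; the listing data set name shown here is only a placeholder.
//* Sketch only: the listing data set name is a placeholder; the member
//* name matches the program being compiled.
//SYSPRINT DD DISP=SHR,DSN=PROD.BATCH.LISTINGS(TRADERB)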
Note: If your site stores compiler listings as sequential files and you want to
use them with Fault Analyzer, there is a sample REXX exec that can help.
It is designed to read multiple sequential files and copy the contents into
members in a PDS. The program name must appear in the data set name
for this routine to work. Refer to Appendix A, “Convert multiple sequential files
to members of a PDS” on page 203.
You use a Fault Analyzer program to create a side file from a compiler listing. To
do this, either add a separate step to the batch job during compile time, or specify
a batch or an interactive re-analysis during a Fault Analyzer ISPF session. We
described this process in detail in 2.3.3, “How to create a side file” on page 20.
We regret that we cannot show you an example, but coding one would have
been beyond the scope of this project.
After you create and store a side file, there is no benefit to Fault Analyzer in
keeping the listing. However, the listing is still beneficial to application
programmers who may need to review the fully expanded contents of their
application programs.
(Size comparison: Listing 38, Side file 7)
Model 1
You need to set up contingency plans, because economic conditions are forcing
you to reduce your application programming staff. Unfortunately, your application
programs abend with a greater frequency than permitted by service level
agreements.
With these changes in place, all CICS transactions in your production regions
and all production batch jobs will have a full analysis performed at the time they
abend.
Because a full analysis at the time of the abend can be performed with a side file
only on a going-forward basis, we include a statement that directs Fault Analyzer
to the existing listing files.
This means that any application programmer responsible for the application
program will not need to spend any time searching for listings or other output in
the event of a failure. Fault Analyzer will display the source code of the line in
error.
For programs that are in development or any level of test, Fault Analyzer displays
a warning message that the side file and the load module do not match. An
example is shown in Figure 5-1.
Example 5-2 and Example 5-3 show how the appropriate statements can be
specified in an Options File for UAT and Development environments, respectively.
Example 5-2 Suggested parameters for UAT environment
IDILANGX(DATASET(UAT.BATCH.SIDEFILE
UAT.CICS.SIDEFILE))
IDILCOB(DATASET(UAT.BATCH.LISTINGS
UAT.CICS.LISTINGS))
Example 5-3 Suggested parameters for development environment
IDILANGX(DATASET(TEST.BATCH.SIDEFILE
TEST.CICS.SIDEFILE))
IDILCOB(DATASET(TEST.BATCH.LISTINGS
TEST.CICS.LISTINGS))
The application programmer can submit the batch job to perform a re-analysis of
the dump and can review the analysis report in the JES spool.
Model 3
A budget freeze has been mandated at your site, but your DASD reserves are
slowly being consumed. You need an alternative method to perform dump
analysis.
In this model, only the Fault Analyzer side file is retained. Source line analysis is
obtained as needed:
Eliminate the retention of compiler listings. Modify the compile step of your
batch process to place the compiler listing into a member of a temporary
PDS.
Add the creation of side files to your compile process.
Add a new step to process the compiler listing and create a side file as a
member of a PDS/E.
Promote this new component through the life-cycle into production level data
sets.
Validate your change management process.
Use the same component type and library names for the side files.
5.1.5 Summary
We offered three models for implementing Fault Analyzer components in your
environment. Here is a summary of the advantages and disadvantages
associated with these models:
Advantages
The main advantage of these implementation plans is the ability to have a full
dump analysis — with complete source line information — at the time of abend or
during re-analysis.
Application programmers may still need to refer to the complete (i.e., fully
expanded) compiler listing for assistance during problem determination. As such,
retaining the compiler listings provides value.
You do not need a separately licensed program product option to translate the
assembler-level instructions to source code format at the time of an abend.
Disadvantages
Additional DASD may be required to store compiler listings or side files,
depending on the model you chose to implement.
Each site should determine whether the cost of additional DASD for storing side
files is offset by the ability to identify and to isolate problems quickly.
A template is the product of a COBOL compile step that reads and processes the
copybook.
5.2.1 Templates
You create templates in two ways:
Use the Template Workbench in File Manager.
In an Edit or Browse session, select the Edit copybook or template field.
Some File Manager functions in batch allow you to use templates; however, they
do not allow you to save them.
Note: IBM recommends that templates be created and saved to avoid the
overhead of compiling copybooks during each edit or browse session.
After a template is created, you can use REXX functions to provide additional
processing or record selection.
Naming conventions
Templates can perform different functions with respect to the records that are
displayed or selected.
File Manager does not mandate any form of naming convention, nor does it
impose any restrictions on how you name templates.
The file association function allows you to browse or edit an application file without specifying a
copybook or template. It does require you to establish this mapping at least once
before you take advantage of the product’s memory.
Model 1
Have your standards group review the naming standards for COBOL
components and establish a similar charter for naming templates.
You can choose to develop naming standards based on the function the template
performs:
Give a template the same name as the copybook from which it was derived.
Store these templates in data sets named like the ones in which copybooks
are contained.
If you choose to do this, establish a standard which states these templates
only perform a mapping function.
Give a template the name of the file it maps when multiple copybooks define
the file.
A specific example would be the creation of one template for multi-record files
that use several copybooks.
After you create an appropriate naming convention for your site, add it to your
standards documentation. Consider updating the File Manager ISPF Tutorial
panels with this information.
There is no utility to let you compile copybooks into templates in batch mode,
either. This means you cannot take your existing copybook library and process it
in its entirety.
Restriction: All template creation must be done one member at a time and
must be done using the File Manager dialog.
Advantages
The main advantage of this implementation plan would be a greater
understanding of the data contents of application files. If you adhere to existing
conventions, new components can be located more quickly.
Templates can use REXX to perform pattern matching and record selection.
Disadvantages
The main disadvantage is the time it takes for your standards group to evaluate
how templates can be used and how they should be named.
Templates use REXX to perform various functions; some may consider this to be
a disadvantage. At issue are the answers to the following questions:
Do your application programmers know REXX?
How much money is in your training budget to provide REXX education?
Additional considerations
Despite providing the ISPF interface in the Template Workbench, the product
lacks the ability to automate template creation. There is no utility to let you
compile copybooks into templates in batch mode. Therefore, there can be no
interface with any change management system to automatically update a
template based on a change to a copybook.
The associations that File Manager builds between data set names and
copybook or template names are made on an individual ISPF user basis. This
function cannot be externalized, although the contents of the ISPF table can be
mapped.
For example, Brenda can create a template for the same application file that
Eddie is working with. Each of their templates may simply map the record layout
of the file. Or, each of their templates may perform different functions with the
data in the file.
Individual file associations can result in duplicated work for each application
programmer. If you develop standard naming conventions and control the
location of templates, you can reduce this re-work.
By using the TEST compile option (and any of the sub-options), you create a load
module that Debug Tool can use.
For Debug Tool to provide the ability to step through an application program, you
need a compiler listing or a side file; you do not need both.
5.3.2 Listings
Debug Tool, like other debugging products (e.g., those from Compuware and
Computer Associates), requires a compiler listing with the following compile-time
options:
SOURCE
LIST
MAP
XREF
However, unlike other tools, it does not use the assembler-level instructions from
the LIST option for its processing. It places its debug hooks directly in the load
module.
Note: These are the same compiler options required by Fault Analyzer; we
recommend that you use them.
Naming conventions
Debug Tool places no restrictions on how a compiler listing should be retained.
However, if it is stored in a PDS, the member name of the listing must match the
name of the program.
Naming conventions
Debug Tool places no restrictions on how a side file should be retained. However,
if it is stored in a PDS, the member name of the side file must match the name of
the program.
This model covers the key environments where you would most likely use Debug
Tool:
For development and test environments, use the full features of Debug Tool.
You can set your compile-time parameter to:
TEST(ALL,SYM)
In a production environment, do not use any Debug Tool features.
You can set your compile-time parameter to:
NOTEST
Alternatively, if you wish to have some level of Debug Tool support in
production, use the Dynamic Debug feature with Separate Side File support.
You can set your compile-time parameter to:
TEST(NONE,SYM,SEPARATE)
Advantages
The primary advantage is full debugging capabilities before the load module is
moved to production.
Disadvantages
Additional DASD is required to store compiler listings or side files to permit
debugging in development and test environments.
Each site should determine whether the cost of additional DASD for storing side
files is offset by the ability to trace through program logic.
Restriction: If you compile a program with the Separate Debug File feature,
and you somehow lose the side file, you must recompile the program. You
cannot use the listing in a debugging session; Debug Tool will not let you.
Fault Analyzer
Fault Analyzer uses these key components:
One (the compiler listing) is geared for application programmers, the other
(the side file) is geared for the tool itself. The component used by the tool is
produced automatically.
The tool is easy for application programmers to understand and is easy for
them to use.
The tool might be time-consuming for systems programmers to customize.
File Manager
File Manager uses two key components:
One (the copybook) is geared for application programmers, the other (the
template) is geared for the tool. The component used by the tool is produced
manually; there is no batch facility to do this.
The tool provides some very useful features. However, it requires application
programmers to be skilled with REXX to be able to obtain the greatest benefit.
Debug Tool
Debug Tool uses three key components:
Again, one (the listing) is geared for application programmers, the other two
(the load module and the side file) are geared for the tool. The components
required by the tool can be produced automatically.
The tool is moderately difficult to use, but is extremely powerful. However, it
requires that the application program load module be modified by a
compile-time option. It also requires a different set of sub-options for full
debugging capability versus production runtime.
The only components that can be used jointly are the compiler listings and the
different side files. To produce these and maintain them with a change
management tool requires significant modification to existing processes.
Each site needs to review their requirements for ongoing development and
independently assess how they need to approach standardized problem
determination.
In part two we present a set of scenarios, in CICS and batch, that demonstrate
how to use the Problem Determination Tools.
Portions of these scenarios were adapted from a tutorial provided by the IBM
WebSphere Application Development Solution (ADS) for OS/390. You do not
need access to an ADS system to use this book. We have provided the means
for you to download the applications and run them, if you wish.
In the chapters that follow, we create scenarios based on the Trader application.
In each scenario, we deliberately introduce errors into the application to allow us
to demonstrate the functionality of the tools. We then describe, in detail, the
steps that you take to isolate the error and to correct the problem.
Product updates
Throughout this chapter we refer to the term product updates. This denotes a
software upgrade to two of the Problem Determination Tools: specifically, a PTF
for Fault Analyzer and a new version of File Manager.
The Company file contains the stock name and the past week’s quotes. The
Customer file contains one record for each combination of customer and company in
which the customer owns shares, including the number of shares held.
(Figure content: the CICS Trader application programs MYTRADM and MYTRADS with the Customer and Company files)
Note: When you invoke this application, you can use any username and
password. But, if you want to see the status from previous trading, use the same
username each time.
(Figure content: the TRADERB batch program at remote sites B and C processes the Tranfile against the Customer and Company files)
Figure 6-2 Trader application: multiple remote site transactions with batch
Note: You should always list the holdings of a username to determine the
number of shares in a portfolio before you begin to trade with it.
Product updates
After the products were updated, the application programs listed in Table 6-2
were created and installed.
Table 6-2 Application programs in the Trader application after product updates
Application program Subsystem Purpose
If you intend to follow these examples on your own, you also need the system
software. Refer to 6.3.1, “S/390 software prerequisites” on page 126. We assume
you have access to an S/390 with a similar configuration.
You use some of the demo files to build the application files, and others to build
the application programs.
Edit the JCL and change all instances of XXXXXXX to your TSO user ID.
After the job finishes, you need to edit the members of the JCL data set to
validate the following information:
DB2 load library and runtime library
COBOL compiler load library
Language Environment (LE) runtime library
CICS load library
In addition, you need to change the string, YOUR-TSO-USERID to your TSO user ID.
If you wish, you can use the File Manager Find/Change Utility to perform this
step.
Note: Make certain you validate the names of all of the product libraries
before you submit these batch jobs.
For some batch jobs, you need to pre-allocate your output data sets.
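If a job expects an output data set to exist already, you can pre-allocate it with
a simple IEFBR14 step such as the following sketch; the data set name, space, and
record attributes are only examples.
//* Sketch only: the data set name, space, and DCB values are examples.
//PREALLOC EXEC PGM=IEFBR14
//NEWDS    DD DSN=YOUR.OUTPUT.DATASET,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(5,5)),
//            DCB=(RECFM=FB,LRECL=80,BLKSIZE=3120)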
3. Create the two VSAM data sets (COMPFILE and CUSTFILE) with the
DEFVSAM1 job.
This loads the VSAM files with sample data.
4. Define all of the necessary application resources to CICS:
a. The four MYTRADxx programs from Step 2
b. The mapset MYTRAD
c. The transactions MYTD and TDB2
d. The two VSAM files from Step 3
These resource definitions are contained in DEMOS.PDPAK.JCL(PDPAK).
Review this file for changes that are applicable for your site’s standards.
5. To add these definitions to the DFHCSD, the CICS definitions list, use
DEMOS.PDPAK.JCL(DEFPDPAK).
Install the defined resources.
6. Create the DB2 plan, MYTRADD with DEMOS.PDPAK.JCL(BIND).
7. Grant execution access to this plan with DEMOS.PDPAK.JCL(GRANT).
8. Define the DB2 tables, CUSTOMER_DETAILS and COMPANY_DETAILS,
with DEMOS.PDPAK.JCL(TABLES).
9. Populate these DB2 tables with DEMOS.PDPAK.JCL(DATA).
Product updates
The following software products were used after two of the Problem
Determination Tools were updated:
DB2 Universal Database for OS/390 V6.1
OS/390 SecureWay Communications Server V2.9, configured with SMTP
The Problem Determination Tools
– IBM Fault Analyzer for OS/390 Version 1, Release 1 (with Program
Temporary Fix, UQ55392)
– IBM File Manager for z/OS and OS/390 Version 2, Release 1
– IBM File Manager/DB2 Feature
It is possible that other levels of these software components may work, but the
applications were tested with the levels listed here.
Before you can start the applications, the subsystems must be started. On our
system, we issue the following chain of commands to display a list of all active
jobs and to sort them alphabetically:
=a;sd;da;prefix *;sort jobname
CICSC001 The CICS demonstration region, in which the CICS COBOL (and
DB2) demo application programs (MYTRADxx) are installed and
configured.
Buy shares
The following steps will buy shares:
1. Enter the number of shares to purchase.
2. A confirmation message is issued.
Sell shares
The following steps will sell shares:
1. Enter the number of shares to sell.
2. A confirmation message is issued.
6.5 Summary
We have presented an overview of the system where we developed the
scenarios that use the Trader application.
We provided you with instructions that helped you install the application software.
We outlined the steps necessary to set-up the applications.
We force the application to abend and describe, in detail, the steps needed to
identify the cause of an abend in the application, using Fault Analyzer. We then
describe how to manipulate the data to correct the problem, using File Manager.
Fault Analyzer
Ensure Fault Analyzer is correctly installed in your CICS region. See 2.5.2, “CICS
set-up” on page 27.
You must have a compiler listing or side file for the programs MYTRADMV and
MYTRADS.
File Manager
You need the copybooks (CUSTFILE and COMPFILE) that contain the record
structures of the VSAM files DEMOS.PDPAK.CUSTFILE and
DEMOS.PDPAK.COMPFILE.
Make sure you run the DEFVSAM1 batch job to load the VSAM files. Refer to
6.2.3, “Set up the applications” on page 124.
Before you start the application, access CICSC001, or your own CICS
application region.
The Logon screen of the application is displayed in Figure 7-1. A username and
a password are required to access the application.
Note: In the Trader application, navigation keys are displayed at the bottom of
each screen. PF3 is used to go back to the previous screen (except on the
Logon screen) and PF12 is used to terminate the application.
After you press Enter, the Company Selection screen is displayed, as shown in
Figure 7-2. This screen lists the companies you can trade.
On this screen, you select the trading option you want to perform:
Obtain real-time quotes for a company.
Buy additional shares of the company.
Sell existing shares of the company.
You continue by selecting each option, in turn.
This screen displays the price of the company’s share over the past seven days,
the number of shares held, and the value of those shares based on the current
day’s price:
The company’s share price is read from the VSAM file
DEMOS.PDPAK.COMPFILE (COMPFILE).
The details of the user’s portfolio, (e.g., the number of shares held), are read
from the VSAM file DEMOS.PDPAK.CUSTFILE (CUSTFILE).
Enter the number of shares you want to buy and press Enter.
The Options screen is re-displayed with a message in the lower, left-hand corner
of the screen indicating the status of the transaction.
If the process is successful, the value of the number of shares held is updated in
the CUSTFILE.
Enter the number of shares you want to sell and press Enter.
The Options screen is re-displayed with a message in the lower, left-hand corner
of the screen indicating the status of the transaction.
If the process is successful, the value of the number of shares held is updated in
the CUSTFILE.
Note: We wanted to perform all of the steps in this example and obtain the
screen shots at the same time. In reality, the process of writing this section
took several days. As a result, there are discrepancies with the dates and
times in some of the figures. These are not deliberate errors, and they are not
meant to mislead you.
Tip: If you need to switch to a different fault history file, do the following:
1. Select Options -> Change Fault History File Options.
2. Move your cursor to the appropriate file on the list and press Enter.
3. Press PF3 to display the fault history records.
Each abend is assigned a fault ID when it is recorded in the fault history file.
You can identify an abend by knowing the transaction ID and the date and
time at which it occurred.
In this example, the fault ID is F00052. An ASRA is listed in the Abend column.
4. Enter v in the line command area next to the fault ID, to view the details of this
abend.
The real-time analysis synopsis report is displayed, as shown in Figure 7-9. It
is generated at the time of the abend.
5. Look more closely at the report in Figure 7-9. You can see:
a. Program MYTRADS experienced the abend.
b. The detail of the abend; in this example it is a data exception.
c. A short explanation of the abend
d. An attempt to identify the instruction that caused the abend
The source statement cannot be identified because the compiler listing (or
side file) for the program MYTRADS was not available to Fault Analyzer
when the program abended.
Refer to Chapter 5, “Implementing the tools in your environment” on
page 101, for a discussion about using side files in production and
development environments.
2. Enter the name of the Options Data Set that contains the name of your
compiler listing or side file. For details about the options file and its contents,
refer to 2.4.3, “Specifying listings to Fault Analyzer for re-analysis” on
page 26.
Recall, the compiler listing or side file is required by Fault Analyzer to identify
the source line instruction in the program MYTRADS.
The summary panel of the Interactive Analysis report, shown in Figure 7-11,
is displayed after re-analysis is complete.
5. You determine the cause of the error by looking at the values of the variables.
9. Place the cursor over the highlighted text and press Enter.
This displays a listing of the WORKING-STORAGE section of the program:
– The data definitions are shown on the left-hand side of the listing.
– The data values, in character format, start on the right-hand side of the
listing.
– The data values, in hexadecimal format, are on the far right-hand side of
the listing.
10.Issue the following command to find the field in error:
Find 'DEC-NO-SHARE'
11.Scroll up until you find the level-01 group item that contains this field, as
shown in Figure 7-14.
12.Split the ISPF screen and view this program’s (MYTRADS) compiler listing to
find out how this level-01 group item is loaded with data values.
Note: If you did not retain a compiler listing as output from the batch compile,
refer to the source code directly.
In this example, CUSTOMER-IO-BUFFER is loaded from a CICS READ INTO
statement as shown in Figure 7-15.
Now that you have determined the problem is with data in the CUSTFILE, you
can correct the error with the help of File Manager.
Note: You must disable and close the CUSTFILE in the CICS region before
you attempt to edit it. If you do not, File Manager will display the following error
message when you edit the file:
5. Press PF2 (ZOOM) to display the record in SNGL format, as shown in Figure
7-19.
File Manager displays the invalid data as a string of highlighted asterisks.
The data is displayed in character format, but to edit a packed decimal field,
you must switch to hexadecimal mode.
6. Enter HEX ON in the command line to display the fields in hexadecimal
format.
7. Scroll down to the field in the record that must be changed. In this example, it
is DEC-NO-SHARES.
The data in DEC-NO-SHARES is character value 100 (displayed as
F0F1F0F0). However, the field is defined as packed decimal.
8. Correct the data so that it matches the characters shown in Figure 7-21:
a. Clear the field of asterisks (ERASE EOF).
b. Enter the correct value in the field (100).
c. Press Enter.
Note: Once any data in the record is changed, all of the fields associated with
the record are highlighted.
9. Enter HEX OFF in the command line to return the fields to character format.
10.Press PF3 to save the changes and to end from the Edit session.
Note: Do not forget to enable and open the CUSTFILE in the CICS region.
This results in a successful execution, and the screen shown in Figure 7-22 is
displayed.
We detailed a process whereby Fault Analyzer was used to identify the cause of
an abend in the application. We continued with a description of File Manager’s
capability to identify and correct the data that caused the problem.
We force the application to produce incorrect output and describe, in detail, the
steps needed to identify the logic error in the application, using Debug Tool in
batch mode. We then describe how to step through the program to isolate and to
correct the problem, using Debug Tool in foreground mode.
Debug Tool
You must have a compiler listing or side file for the program TRADERB.
If you are not using the supplied batch job to compile this program, make sure
you specify the following compiler options:
LIST,XREF,MAP,RENT,TEST
If you prefer to use a side file instead of a compiler listing, include the SEPARATE
sub-option of the TEST compiler option. Recall the side file required by Debug
Tool is different from the one required by Fault Analyzer. See 4.2.3, “Required
output files” on page 79 for more details.
After the program processes the input file, it generates two output reports:
REPOUT, which contains a list of all customer portfolios.
TRANREP, which contains a detailed list of the transaction activity and
processing status.
The record layout for the Transaction file is shown in Table 8-2.
Table 8-2 Transaction file record layout
Column   Description   Field name
36       Dot           FILLER
After the process completes successfully, the program updates the Customer file,
DEMOS.PDPAK.CUSTFILE.
The program also produces a Transaction report, as shown in Example 8-4. This
report lists the transaction file input request and the status of the processing. The
STATUS column in the report lists how the request was processed. If the
processing is successful, the message PROCESSED is printed, otherwise the
message *ERROR* is printed.
Example 8-4 Batch Trader application Transaction report listing BUY shares
-------------------------------------------------------------------------------
CUSTOMER COMPANY QTY REQ-TYP STATUS
-------------------------------------------------------------------------------
RB_DEMO IBM 30 BUY PROCESS
After the process completes successfully, the program updates the Customer file,
DEMOS.PDPAK.CUSTFILE.
The program also produces a Transaction report, as shown in Example 8-5. This
report lists the transaction file input request and the status of the processing. The
STATUS column in the report lists how the request was processed. If the
processing is successful, the message PROCESSED is printed, otherwise the
message *ERROR* is printed.
Example 8-5 Batch Trader application Transaction report listing SELL shares
-------------------------------------------------------------------------------
CUSTOMER COMPANY QTY REQ-TYP STATUS
-------------------------------------------------------------------------------
RB_DEMO IBM 30 SELL PROCESS
RB_DEMO Veck_Transport 25 SELL PROCESS
In this example, you have a Transaction file that contains the day’s trading activity
for the customer, RB_DEMO:
Buy 30 shares of IBM.
Buy 25 shares of Veck_Transport.
List the shares held by RB_DEMO.
The TRADERB application program reads the input from the Transaction file and
processes the requests. The results of the transaction processing are printed as a
report, as shown in Example 8-7.
Example 8-7 The TRANOUT report showing transactions processed
-------------------------------------------------------------------------------
CUSTOMER COMPANY QTY REQ-TYP STATUS
-------------------------------------------------------------------------------
RB_DEMO IBM 30 BUY PROCESS
RB_DEMO Veck_Transport 25 BUY PROCESS
Your business user, who reviews these reports on a daily basis, tells you there is
an error. He shows you the report from July 7th. It only lists the shares held by
the customer RB_DEMO in company Glass_and_Luget_plc, which doesn’t
reconcile with his account.
You check the Transaction Report (Example 8-8), and sure enough, it shows that
the buy requests for IBM and Veck_Transport were processed successfully. To
make sure, you access the CICS Trader application (see “Scenario 1: Using Fault
Analyzer and File Manager” on page 131) to review RB_DEMO’s account. The
shares for both of these companies are listed.
8.3.1 Using Debug Tool in batch mode to try to find the error
You figure that you will use Debug Tool to show you the flow of the program so that
you can find out where the program is experiencing the problem. You can do this
by listing the paragraphs that are performed when the job executes.
To do this, you create a Commands file for Debug Tool commands, and instruct
Debug Tool to use this file at the start of the debug session.
This routine requests a listing of the line number and name of each paragraph
(label) in the program.
You include the TEST runtime option and point it to your Commands file. The
output from the Commands file will be directed to the JES spool (although it
could also go to a sequential file).
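The job is essentially the same as Example 4-4 earlier in this book; a trimmed
sketch of the statements that matter here follows. The data set names are
placeholders.
//* Sketch only: mirrors Example 4-4; data set names are placeholders.
//STEP1    EXEC PGM=TRADERB,
// PARM='/TEST(,INSPIN,,)'
//STEPLIB  DD DISP=SHR,DSN=YOUR.PDPAK.LOAD
//         DD DISP=SHR,DSN=EQAW.V1R2M0.SEQAMOD
//INSPIN   DD DISP=SHR,DSN=YOUR.DT.COMMANDS
//INSPLOG  DD SYSOUT=*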
Submit this job. After the batch job completes, review the output of the Log file.
You recall that the Customer file has one record for every company in which the
customer holds shares.
When a transaction to list shares is processed, the program starts to read the
Customer file. It reads the records one at a time and prints the details, until the
record of a different customer is read.
You review the Transaction file and see the two transactions. You realize that it
does not matter if RB_DEMO had no shares in IBM and Veck_Transport before
the Trader batch job executed, because two records were written to the
Customer file when the program processed these records. One was for IBM and
another was for Veck_Transport.
You recognize that when TRADERB processes the record in the Transaction file
to list the shares held by RB_DEMO, the paragraph READ-CUSTFILE-NEXT should
be executed at least four times (one read past the current Customer record).
You look carefully at the Log file, which shows that READ-CUSTFILE-NEXT is only
executed twice. This proves to you there is a problem with the logic in the section
of the program that reads the Customer file.
Because you determined the problem occurs when reading the Customer file,
you decide to set a breakpoint when the START command is issued on the
Customer file.
5. Enter the string "START-CUSTFILE" (in double quotes) on the command line
and press PF5 (FIND).
This finds the first occurrence of the string START-CUSTFILE. The cursor is
positioned on that line, as shown in Figure 8-2.
Tip: If you already know the line number, you can set the breakpoint by
entering the command explicitly on the command line:
AT 708;
But at this point, you can see the values of both the variables are still equal.
The value of the field, CUST-NM of KEYREC, is RB_DEMO and the value of the field,
COMP-NM of KEYREC, is Glass_and_Luget_plc. Control is transferred to the
CALCULATE-SHARE-VALUE paragraph and the record details are printed.
12.Continue to press PF2 until the next READ statement.
13.Check the values of these variables after the READ statement.
The values in the variables are different, as shown in Figure 8-5, and the READ
process for customer RB_DEMO is terminated.
You can see that the record is for RB_DEMO, because the field CUST-NM of
KEYREC has that value.
But the key value, KEYREC, is different from WS-CUST-KEY because the field
COMP-NM of KEYREC has a new value, IBM, and the variable WS-CUST-KEY still has
the old value.
Because these values are different, control is not transferred to the
CALCULATE-SHARE-VALUE paragraph, as shown in Figure 8-6, and the READ
process for this customer is terminated.
You found that saving the key value of the previously read Customer record and
comparing it with the key value immediately after the next read is causing the
problem.
Because the customer has one record for every company in which he holds
shares, the program logic must be changed to check only the CUST-NM of KEYREC.
Saving the CUST-NM field of KEYREC and checking it just after a READ NEXT should
solve the problem.
After
MOVE CUST-NM OF CUSTOMER-IO-BUFFER TO WS-CUST-NM
After
PERFORM CALCULATE-SHARE-VALUE
UNTIL CUST-NM OF CUSTOMER-IO-BUFFER NOT EQUAL
WS-CUST-NM.
This results in a successful execution, and the report shown in Example 8-13 is
displayed.
Example 8-13 Successful output of TRADERB following program changes
CUSTOMER : RB_DEMO 07/07/2001
-----------------------------------------------------------------
COMPANY SHARES SHARE TOTAL
HELD VALUE COST
-----------------------------------------------------------------
Glass_and_Luget_plc 60 19.00 1,140.00
IBM 60 113.00 6,780.00
We explain the processing that is performed in the CICS DB2 Trader application.
We force the application to encounter an error and describe, in detail, the steps
needed to identify the cause of the problem in the application, using Debug Tool.
We then describe how to manipulate the data to correct the problem, using File
Manager/DB2.
Debug Tool
You must have a compiler listing or side file for the programs MYTRADMD and
MYTRADD.
If you are not using the supplied batch jobs to compile these programs, make
sure you specify the following compiler options:
LIST,XREF,MAP,RENT,TEST
You must also include the load module EQADCCXT in the link-edit step of the
compile job.
File Manager/DB2
You need the dynamically created templates for the DB2 tables
CUSTOMER_DETAILS and COMPANY_DETAILS
Make sure you run the TABLES batch job to create the DB2 tables, and then run
the DATA batch job to load the DB2 tables. Refer to 6.2.3, “Set up the
applications” on page 124.
There is no visual difference between this example and the one presented in
“Scenario 1: Using Fault Analyzer and File Manager” on page 131. Only the
back-end processing is different. This scenario uses DB2 tables instead of VSAM
files. Figure 9-1 depicts the processing that occurs in the CICS DB2 application.
Figure 9-1 Trader application: single user transaction CICS with DB2
5. Select Option 1 on the Options screen to obtain real-time quotes and a listing
of the shares held.
The share details are listed, as shown in Figure 9-3. Note the Number of
Shares Held field has a value of 5.
Clearly, Deanna has pointed out a serious problem with this series of
transactions.
You believe the problem is with the data in the table, CUSTOMER_DETAILS, or
in the program that reads the table.
You decide to look first at the specific customer record in the database to see if
that will help you understand more about the problem.
Note: If your system contains only one active DB2 subsystem, File
Manager/DB2 automatically connects to that subsystem.
However, if you are working in an environment that contains more than one
active DB2 subsystem, you need to select a DB2 subsystem before File
Manager/DB2 can connect to it.
Overtype the ID of the DB2 subsystem currently shown in the DB2 SSID field
with the ID of the active DB2 subsystem you want, and press Enter.
12.Issue the FIND command (just like you would in ISPF) to find the record for
customer RB_DEMO.
Figure 9-8 File Manager/DB2 Table Browse panel after a FIND command
13.Issue the RFIND command until you locate the record that has a value of IBM in
the COMPANY column.
In Figure 9-8, you can see that the value in the NO_SHARES column is -5. This is
incorrect data in the database.
At this point, you believe the problem is due to faulty logic in the program that
updates the CUSTOMER_DETAILS table.
You review the compiler listing to get an overview of the program and to see
where the table is processed.
2. Enter the program name MYTRADMD in the Program ID field and press
Enter.
3. Press PF4 to save the profile.
4. Repeat these two steps for program MYTRADD.
Note: We could have used just the terminal ID to achieve the same results.
5. Press PF3 to exit from this screen.
6. Enter the transaction ID TDB2.
The debugging session starts, as shown in Figure 9-10.
7. Issue the following commands on the command line to stop the program’s
execution when the program MYTRADD is invoked:
AT APPEARANCE MYTRADD;
AT ENTRY MYTRADMD::>MYTRAD
8. Press PF9, to cause the program to run.
9. Press PF9 repeatedly and enter the appropriate values until the Shares-Buy
screen is displayed.
10.In the Shares-Buy screen, enter 5 in the Number of Shares to Buy field and
press Enter.
11.Press PF9 to continue program execution.
The program stops when the program MYTRADD is invoked.
12.Issue the following command to monitor the value of the NO-SHARES field (the
host variable for the column NO_SHARES in the CUSTOMER_DETAILS table):
MONITOR LIST NO-SHARES;
The value of this variable is displayed in the Monitor window, as shown in
Figure 9-11.
13.Press PF2 to step through the program one line at a time. As you do, keep
monitoring the value of NO-SHARES in the Monitor window.
You see that the value in NO-SHARES is -5, as shown in Figure 9-12, after the
record in the CUSTOMER_DETAILS table is read in the READ-CUSTOMER-TABLE
paragraph.
14.Press PF2 to check the program flow before the program updates the
Customer table.
Conclusion 1: The Buy process actually zeros the value; therefore, the display
shows zero number of shares.
You continue the debugging session to review the Sell processing portion of the
program.
15.Press PF9.
The Options screen is displayed.
16.Select Option 3 and press Enter.
17.Enter 5 in the Number of Shares to Sell field.
18.Press PF2 to step through the program one line at a time. You continue to
watch the value of NO-SHARES in the Monitor window.
You can see that the value of NO-SHARES after the READ-CUSTOMER-TABLE
paragraph is executed is 0, as shown in Figure 9-14.
Figure out how to correct this problem. You need to add a validation routine to the
program that encapsulates the following logic:
We detailed a process which used Debug Tool, running under CICS, to identify a
problem with the logic in the application. We continued with a description of File
Manager/DB2’s capability to correct the data that resulted from the problem.
Part 3 Appendixes
Note: SMTP must be established at the site for this exit to work.
Example: A-1 Fault Analyzer Notification user exit RWAKUP3AM
/* Rexx */
/**********************************************************************/
/* Exec: WakUp3AM */
/* Function: Send an e-mail to notify programmer of application abend */
/* History: 06/15/2001 - LMK - Created */
/* 07/09/2001 - LMK - Modified to use DEST parm of IDIALLOC */
/**********************************************************************/
/* */
/* This exit can optionally be used with IBM Fault Analyzer for */
/* OS/390 to notify application developers of a production batch */
/* abend via e-mail. */
/* */
/* On entry, two stems are provided: */
/* - ENV */
/* - NFY */
/* Both of these data areas are described in the User’s Guide. */
/* */
/* To use this exit, the name of the EXEC (in this example, */
/* WAKUP3AM is used, but this can be any name) must be specified */
/* in an EXITS option as follows: */
/* */
/* EXITS(NOTIFY(REXX((WAKUP3AM))) */
/* */
/* For the exit to be invoked by Fault Analyzer, it must be made */
/* available via the IDIEXEC DDname: */
/* */
/* IDIEXEC(IDI.REXX) */
/* */
/**********************************************************************/
/* Processing:
If a batch job abends on any system other than System A, then:
- Obtain all of the system-level variables and formulate a message
- Get the application ID (3rd field in accounting info)
- Read the Contact list; make sure it is sorted; ignore any comments
- Look up the application and find the person’s name and e-mail id
- Use SMTP to send the message to his/her text pager
**********************************************************************/
If Env.Job_Type = 'C' | ,
Env.System_Name = 'ASYS' Then Do
/* You don't want to process anything from CICS or Sys A */
-------------------------------------------------------------------------------
Installation default ABPERC(NONE)
Installation default ABTERMENC(ABEND)
Installation default NOAIXBLD
Installation default ALL31(OFF)
Installation default ANYHEAP(16384,8192,ANYWHERE,FREE)
Installation default NOAUTOTASK
Installation default BELOWHEAP(8192,4096,FREE)
Installation default CBLOPTS(ON)
Installation default CBLPSHPOP(ON)
Installation default CBLQDA(OFF)
Installation default CHECK(ON)
Installation default COUNTRY(US)
Installation default NODEBUG
Installation default DEPTHCONDLMT(10)
Installation default ENVAR("")
Installation default ERRCOUNT(0)
Installation default ERRUNIT(6)
Installation default FILEHIST
Default setting NOFLOW
Installation default HEAP(32768,32768,ANYWHERE,KEEP,8192,4096)
Installation default HEAPCHK(OFF,1,0)
Installation default
HEAPPOOLS(OFF,8,10,32,10,128,10,256,10,1024,10,2048,10)
Installation default INFOMSGFILTER(OFF,,,,)
Installation default INQPCOPN
Installation default INTERRUPT(OFF)
Installation default LIBRARY(SYSCEE)
Installation default LIBSTACK(4096,4096,FREE)
Installation default MSGFILE(SYSOUT,FBA,121,0,NOENQ)
Installation default MSGQ(15)
Installation default NATLANG(ENU)
Installation default NONONIPTSTACK(4096,4096,BELOW,KEEP)
Installation default OCSTATUS
Installation default NOPC
Note: The member name must appear as a qualifier in the data set name for this
routine to work.
Example: A-5 Sequential files to members of a PDS
/* Rexx */
/**********************************************************************/
/* Exec: Seq2PDS1 */
/* Function: Create PDS members from multiple sequential files... */
/* History: 08/07/1997 - LMK - Created */
/* 07/17/2001 - LMK - Modified for RedBook samples... */
/**********************************************************************/
/* Note: Data set/member name pattern is hardcoded - modify as needed!*/
/**********************************************************************/
Copybooks DEMOS.PDPAK.COPYLIB
Samples DEMOS.PDPAK.SAMPLES
JCL DEMOS.PDPAK.JCL
After the PTF was applied and the system was IPLed, we invoked the Fault
Analyzer ISPF dialog to see if we could view the old VSAM fault history file.
Figure B-1 depicts the messages that were issued.
Figure B-1 Fault Analyzer TSO messages during attempt to access old format
We are not at all certain what the second message means, but this screen shot
was sent to the development team.
Even after issuing these messages, the Fault Analyzer ISPF dialog did not end,
but continued processing. Figure B-2 depicts the panel that was displayed. It
contains no records, although it proudly indicates the PTF number in the title.
We used one of our example CICS programs to force a dump, just to see what
would happen. The JES log, displayed in Figure B-3, contains the error message
(IDI0060S) issued by Fault Analyzer. This CICS dump was not added to the file.
Figure B-3 JES log with error message during attempt to dump to old format
Note: The Adobe Acrobat PDF version of the updated user’s guide is included
as a member in the IDI.SIDIBOOK data set.
The input fault history file, IDIHFIN, is one of the test files that was created during
this project. The utility simply reads this file.
The new file has one member, $$INDEX, which contains a pointer to each of
these entries.
As you can see, the actual space occupied by the file remains almost the same.
Note: To access the new file, you must select Options —> 1. Change Fault
History File Options, and enter the file name in the Fault History File field.
Figure B-4 depicts Fault Analyzer with the converted fault history file displayed.
Notes
We also performed a conversion on the existing IDI.HIST file, which was created
at the end of August 2000. We found the following idiosyncrasies:
CICS entries did not have valid transaction IDs listed, although all of the data
was valid.
Entries for new CICS transaction abends were listed correctly.
Fault Analyzer reuses empty fault IDs.
We deleted some entries from the file early in the project. The new fault ID
numbers appear at the top of the list.
We logged these items with the development team. The effect of this behavior is
shown in Figure B-5.
Select the Additional materials and open the directory that corresponds with
the redbook form number, SG246296.
The extracted files are all in binary format. They are the output of the TSO
TRANSMIT command.
Use your mainframe file transfer protocol to upload the binary files. You must use
the following attributes: FB, LRECL=80, BLKSIZE=3120.
After each file is uploaded, issue the following command from the TSO READY
prompt:
RECEIVE INDA(xxxx)
The default high-level qualifier assigned to the file will be your TSO user ID.
Note: You can delete the zipped file and the temporary folder after you finish
uploading all of the files.
The publications listed in this section are considered particularly suitable for a
more detailed discussion of the topics covered in this redbook.
IBM Redbooks
For information on ordering these publications, see “How to get IBM Redbooks”
on page 218.
Other resources
These publications are also relevant as further information sources:
COBOL for OS/390 & VM Language Reference, SC26-9046
COBOL for OS/390 & VM Programming Guide, SC26-9049
Debug Tool User’s Guide and Reference, SC09-2137
IBM Fault Analyzer for OS/390 User’s Guide, SC27-0904
IBM File Manager for OS/390 User’s Guide and Reference, SC27-0815
Language Environment for OS/390 & VM Programming Reference,
SC28-1940
OS/390 MVS JCL Reference, GC28-1757
OS/390 SecureWay Communications Server IP User's Guide, GC31-8514
OS/390 TSO/E CLISTs, SC28-1973
OS/390 TSO/E Command Reference, SC28-1969
OS/390 TSO/E REXX Reference, SC28-1975
OS/390 TSO/E REXX User’s Guide, SC28-1974
OS/390 TSO/E User’s Guide, SC28-1968
Newly released publications are available for the latest versions of some Problem
Determination Tools:
IBM File Manager for z/OS and OS/390 User’s Guide and Reference,
SC27-1315
IBM File Manager for z/OS and OS/390 DB2 Feature User’s Guide and
Reference, SC27-1264
Redpieces are Redbooks in progress; not all Redbooks become Redpieces and
sometimes just a few chapters will be published this way. The intent is to get the
information out much quicker than the formal publishing process allows.
Information in this book was developed in conjunction with use of the equipment
specified, and is limited in application to those specific hardware and software
products and levels.
IBM may have patents or pending patent applications covering subject matter in
this document. The furnishing of this document does not give you any license to
these patents. You can send license inquiries, in writing, to the IBM Director of
Licensing, IBM Corporation, North Castle Drive, Armonk, NY 10504-1785.
Licensees of this program who wish to have information about it for the purpose
of enabling: (i) the exchange of information between independently created
programs and other programs (including this one) and (ii) the mutual use of the
information which has been exchanged, should contact IBM Corporation, Dept.
600A, Mail Drop 1329, Somers, NY 10589 USA.
The information contained in this document has not been submitted to any formal
IBM test and is distributed AS IS. The use of this information or the
implementation of any of these techniques is a customer responsibility and
depends on the customer's ability to evaluate and integrate them into the
customer's operational environment. While each item may have been reviewed
by IBM for accuracy in a specific situation, there is no guarantee that the same or
similar results will be obtained elsewhere. Customers attempting to adapt these
techniques to their own environments do so at their own risk.
Any pointers in this publication to external Web sites are provided for
convenience only and do not in any manner serve as an endorsement of these
Web sites.
Java and all Java-based trademarks and logos are trademarks or registered
trademarks of Sun Microsystems, Inc. in the United States and/or other
countries.
Microsoft, Windows, Windows NT, and the Windows logo are trademarks of
Microsoft Corporation in the United States and/or other countries.
UNIX is a registered trademark in the United States and other countries licensed
exclusively through The Open Group.
SET, SET Secure Electronic Transaction, and the SET Logo are trademarks
owned by SET Secure Electronic Transaction LLC.
Index

A
abend 18, 20, 22, 23, 140
ADATA 6
analysis control 28
analysis of abends when calling MQ Series 7
application programming interfaces 7
Assembler 18

B
batch 17, 20, 25, 48, 86, 104, 120, 154, 160, 173, 210
batch report tailoring 28
batch utility program 7
breakpoint 13
Browse 10

C
C/C++ 5, 11, 18
CBLRUN 21
CICS 11, 17, 18, 24, 25, 27, 48, 75, 80, 85, 86, 88, 97, 104, 120, 121, 124, 125, 128, 131, 132, 133, 139, 145, 151, 171, 173, 187, 209
CICS domain control block mapping 7
CICS system abend support 7
CICS Transaction Server for OS/390 126
COBOL 5, 7, 8, 9, 10, 11, 13, 18, 19, 20, 21, 41, 65, 73, 75, 77, 78, 79, 80, 81, 82, 86, 101, 102, 108, 112, 124
COBOL listing 86
code listings
   components of the demo application 191
   convert multiple sequential files to members of a PDS 191
   Fault Analyzer notification user exit 191
   File Manager batch job to process multi-record file 191
   File Manager ISPF panel modifications 191
   Language Environment run-time options report 191
Commands file 86
compiler listing 102
compiler listing read 28
compiler options 19
convert multiple sequential files to members of a PDS 203
Copy 10
COPY REPLACING 63
copybook 8, 10, 61, 64, 70, 133, 146

D
DATASETS 29
DB2 9, 10, 11, 18, 72, 75, 124, 126, 127, 171, 173, 177, 185
DB2 Universal Database for OS/390 126
Debug Tool
   application program debugging 83
   components 112
   dynamic debug 13
   full screen 11
   implementation 113
   interface 90
   new features 94
   overview 11
   prepare application program 77
   separate debug file 14
   tasks 13
Debug Tool commands
   AT 91
   CLEAR 92
   COMPUTE 92
   DESCRIBE 92
   DISABLE 92
   ENABLE 92
   GO 93
   LIST 93
   MONITOR 93
   MOVE 93
   QUERY 93
   SET 94
   STEP 94
Debug Tool components
   compiler listing 112
   load module 112
   side file 112
demo files 123

M
MAP 19, 102, 112
MAXFAULTNUMBER 30
MAXMINIDUMPPAGES 30
message and abend code explanation 28
monitor window 12, 91
MQ Series support 7
multi-record files 9
MVS 28

N
NCONTAIN 9

S
Save file 86
Scenario 1
   abend 138
   components 132
   Fault Analyzer/File Manager 132
   initiating interactive re-analysis 141
   using File Manager to correct data 146
   viewing abend with Fault Analyzer 139
   walkthrough 133
Scenario 2
   components 154
   Debug Tool in foreground 163
   tracking a problem 158
   using Debug Tool/batch mode 160
   walkthrough 155
Scenario 3
   components 172
   File Manager/DB2 and Debug Tool 171
   tracking a problem 174
   using Debug Tool 179
   using File Manager/DB2 185
   viewing data in File Manager/DB2 176
   walkthrough 173
Scenarios
   applications set up 124
   components 205
   demo code 215
   demo file installation 123
   overview 120
   programs overview 120
   system configuration 125
   validate installation/configuration 127
security 7
separate debug file 13, 94
SEPARATE debug file (side file) 86
side file 6, 19, 22, 102, 103
SOURCE 19, 79, 102, 112
source window 12, 91
SQL 10
subsystem security 7
SYSABEND 28
SYSMDUMP 28
SYSUDUMP 28

T
TALLY 9
Templates 8, 61
terminal ID 88
TEST 14
trace table analysis 7
transaction ID 88
TSO 35, 83, 85, 87, 163, 208, 216
TSO Call Access Facility 87

U
UNIX 18
user acceptance test 42, 107
user exits 6
User ID 18
user ID 88

V
VM 11, 13, 19, 75, 78, 101
VSAM 7, 38, 41, 42, 46, 47, 50, 51, 72, 120, 138, 146, 155, 207

X
XMIT 44
XPCABND 27
XREF 19, 102, 112

Z
z/OS 4, 9, 11, 17, 18, 41, 72
Introduction to the IBM Problem Determination Tools

Overview of the Problem Determination Tools offering
Introduction to Fault Analyzer, File Manager, Debug Tool
Hints and tips for using the tools

This IBM Redbook describes the IBM Problem Determination Tools and includes
scenarios that show how to use the tools to recognize, locate, and fix errors in
application programs. The products included in this suite of tools are:

IBM Fault Analyzer, which helps you find the cause of abends in application
programs. You can use it for problem determination while developing application
programs or while they are in production.

IBM File Manager, which provides powerful functions for you to use as an
application developer or system support person. Utilities provide the ability to:
- Browse or update VSAM data, tape, or disk volumes
- Define, display, change, and delete catalog entries
- Search data sets and records for specific data
- Manipulate DB2 data and IMS data

IBM Debug Tool, which is a robust, interactive source-level debugging tool. It
helps you examine, monitor, and control the execution of programs written in
C/C++, COBOL, PL/I, or Java (each compiled with the appropriate IBM compiler)
on a z/OS, OS/390, MVS, or VM system. Debug Tool supports debugging of
applications in various subsystems, including CICS, IMS, and DB2.

INTERNATIONAL TECHNICAL SUPPORT ORGANIZATION
BUILDING TECHNICAL INFORMATION BASED ON PRACTICAL EXPERIENCE

IBM Redbooks are developed by the IBM International Technical Support
Organization. Experts from IBM, Customers and Partners from around the world
create timely technical information based on realistic scenarios. Specific
recommendations are provided to help you implement IT solutions more
effectively in your environment.

For more information:
ibm.com/redbooks