PS 001 Fatproc R0
CS No: POC250-A2200XX-PS-001
FACTORY ACCEPTANCE TEST (FAT) PROCEDURE EMR No : M1284-EMR-P-002
Cust. No: -
Customer Name : PT PLN (Persero) UPDK Mahakam Rev. No: 0 Rev. Date: 02-Nov-22
PENGADAAN MATERIAL DAN JASA
UPGRADE OVATION UL PLTGU TANJUNG
BATU
DISAPPROVED
- NOT RELEASED FOR FABRICATION; CORRECT AND RE-SUBMIT FOR APPROVAL
This document is the property of PT. Control Systems Arena Para Nusa. It must not be stored, reproduced or disclosed to others without written authorisation from PTCS.
No.   Document/Page Reference   Customer Comments   Engineering Responses   Remarks
FAT PROCEDURE
Revision History
Rev   Date         Description           Developed by    Reviewed by       Approved by
0     31-10-2022   ISSUED FOR APPROVAL   RICHA GAVANE    DHARMVIR SINGH    DHARMVIR SINGH
Distribution List
Name Company Name Company
CONFIDENTIAL
This document, including the information it contains, is the property of Emerson Automation Solutions. It must be held in
strict confidence and properly safeguarded by the recipient at all times. It may not be copied or reproduced, or provided or
revealed to any other party, except with the prior written authorization of Emerson Automation Solutions. Any authorized
copy or reproduction must include this notice and may only be used for the purpose for which Emerson Automation
Solutions has provided it to the recipient. This document and all copies and reproductions must be returned upon request. By
accepting this document, the recipient agrees to the foregoing.
Page 1 of 20
TABLE OF CONTENTS
1. INTRODUCTION ........................................................................................................... 4
2. DCS SYSTEM HARDWARE ......................................................................................... 5
2.1 DCS SYSTEM HARDWARE ......................................................................................... 5
3. DCS STANDARD SOFTWARE FUNCTION ................................................................. 6
3.1 REPORT GENERATION PROCEDURE ....................................................................... 6
4. HARDWARE FAILURE MODE TESTING ..................................................................... 7
4.1 WORKSTATION POWER LOSS .................................................................................. 7
4.2 OVATION WORKSTATION HIGHWAY REDUNDANCY .............................................. 8
5. HARDWARE DISCREPANCY TESTING ...................................................................... 9
5.1 CONTROLLER MEMORY MONITORING .................................................................... 9
5.2 SOFTWARE FAILOVER TO BACKUP ......................................................................... 9
6. DCS CONTROL SYSTEM FUNCTIONAL TEST ......................................................... 10
7. SUPPORTING SYSTEMS ........................................................................................... 13
8. SYSTEM ACCEPTANCE/DISCREPANCY RESOLUTION ......................................... 15
9. FACTORY ACCEPTANCE TEST SIGN-OFF .............................................................. 16
9.1 FACTORY ACCEPTANCE TEST SIGN-OFF (FAT CERTIFICATE) ......................... 16
APPENDIX LIST
1. INTRODUCTION
The Tanjung Batu Evergreen Project includes a Factory Acceptance Test (FAT) for the Ovation
Distributed Control System (DCS). The FAT is to be conducted for both hardware and
software functionality to the furthest extent possible without the other existing plant controllers
and hardware.
This FAT Procedure will be completed to ensure functionality and performance as intended,
based upon project specifications and material from the DCS design review. This plan is not
intended to itemize every test and procedure but to provide a general guide for testing the major
aspects of the Ovation DCS.
To aid the testing, the following documents will be available as references during the FAT:
2. DCS SYSTEM HARDWARE
Before testing begins, a brief familiarization with the system hardware components and peripherals
will be conducted. This will include an overview of the system architecture with brief
descriptions of the major components and peripherals.
The following is a checklist to verify that workstations are assembled according to design.
Refer to the Bill of Material and System Architecture Drawing as required. Items recorded on
the Variance Report (refer to Appendix-1) will not prevent an item from being marked as complete.
3. DCS STANDARD SOFTWARE FUNCTION
In preparation for, and in addition to, the control strategy and graphics testing that will follow,
Emerson is able to provide an introduction to the standard software features and functionality of
the Ovation DCS. This will include an overview of operator and engineering functions.
As most of these features may be familiar, these demonstrations are available upon request
and are not mandatory for FAT completeness.
The Report Service starts automatically, but you must start the Report Manager as follows:
Start -> Programs -> Ovation Process Historian -> Report Manager.
The Report Generation Queue is a table in the Report Manager configuration database. The
queue is updated constantly. Records are deleted from the queue after one week. After you
have defined the properties for your report definition, the next step is to define the reports
that you want to generate, and then generate them. You can generate reports manually or
set them to be generated from the occurrence of a timed, triggered, or demand event.
Access the Report Manager and select Report Definitions. Select the report that you want
to generate and use the Generate icon on the toolbar. You can also generate a report by right-
clicking it and clicking Generate Report. A Generate <Report Definition Name> dialog
box appears. Use the Set option to specify a time span. Select a Historian Server from the
drop-down menu. Use the Report Destination drop-down list to select where you want the
report to go. The Destination Info field is populated with your choice.
4. HARDWARE FAILURE MODE TESTING
Functional testing the DCS hardware will verify that the system meets or exceeds the intent of
the design based upon material from the Ovation DCS hardware drawing review.
4.1 WORKSTATION POWER LOSS
Turn off the power supply for the workstation and wait for 10 seconds. Restore power and verify
the station boots up to the log-in screen. Log in and verify the workstation is functioning as normal.
For server-type workstations, turn off the primary power supply and verify the server remains in
operation. Turn on the primary power supply. Repeat this procedure for the secondary power supply
to prove power supply redundancy. Afterwards, turn off both power supplies and wait for 10
seconds. Restore power and verify the station boots up to the log-in screen. Log in and verify
the workstation is functioning as normal. Verify all power supplies are restored upon completion
of the test.
4.2 OVATION WORKSTATION HIGHWAY REDUNDANCY
Create a fault on one of the Ovation network cables and check for a fault indication on the
respective switch and workstation. Check for data loss or loss of functionality at the workstation
during the failure. Verify the status of the I/O on the EWS and OWS. Restore the control
network communication. Repeat the operation for the other Ovation network cable. Verify that
there is no change in control action and that the system operates correctly on one network.
5. HARDWARE DISCREPANCY TESTING
Different areas of the hardware are monitored for discrepancies from normal conditions. These
discrepancies, although not necessarily failures, may impact functional control logic or
require operator action.
5.1 CONTROLLER MEMORY MONITORING
Function: Mismatch is reported          Complete/Date: ________    Comments: ________

5.2 SOFTWARE FAILOVER TO BACKUP
Function: Resume normal operation       Complete/Date: ________    Comments: ________
6. DCS CONTROL SYSTEM FUNCTIONAL TEST
Functionally testing the DCS control system will verify that the system meets or exceeds the
intent of the design based upon project specifications and material from the design review. It
is also to verify the coordination of each control strategy with graphical representation.
SIMULATION OVERVIEW
Simple tie-back simulation logic has been implemented to assist in the verification of the
control. For digital signals, the associated digital outputs, lagged if necessary, are used to
generate the corresponding input feedback value. For analog signals, the associated analog
outputs, lagged and scaled if necessary, are used to generate the corresponding input
feedback value.
The software simulation logic will be used to test a few (2-3) major modulating control loops and
digital devices (pumps, valves, etc.). The functional control logic, in conjunction with the main
screen and pop-up window graphics, will be used to verify the control software.
The Digital and Analog Device Lists, Appendix 3 and 4, will be used to track the status of the logic
and graphics that have been verified. Control Sheets and HMI Graphics will be referenced
during this test.
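The tie-back behavior described above can be sketched as follows. This is an illustrative Python model only, not the actual Ovation simulation sheets; the lag constant, scan count, and 4-20 mA scaling are assumptions chosen for demonstration.

```python
def lag(previous, target, alpha=0.3):
    """First-order lag: move a fraction of the way toward the target each scan."""
    return previous + alpha * (target - previous)

def scale(demand, demand_range=(0.0, 100.0), input_range=(4.0, 20.0)):
    """Map a 0-100% output demand onto the input's range (assumed 4-20 mA here)."""
    d_lo, d_hi = demand_range
    i_lo, i_hi = input_range
    return i_lo + (demand - d_lo) * (i_hi - i_lo) / (d_hi - d_lo)

# Digital tie-back: the output command, lagged, drives the feedback input.
digital_feedback = 0.0
for _ in range(20):                                  # successive control scans
    digital_feedback = lag(digital_feedback, 1.0)    # command = start (1)
running = digital_feedback > 0.5                     # feedback asserts once the lag settles

# Analog tie-back: the output demand, lagged and scaled, becomes the process input.
analog_feedback = 4.0
for _ in range(50):
    analog_feedback = lag(analog_feedback, scale(75.0))  # 75% demand -> 16 mA
```

The lag keeps the simulated feedback from responding instantaneously, so failure timers and transition alarms in the control logic can be exercised realistically.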
DIGITAL LOGIC TEST
The following general procedure will be used to test the digital logic.
1. Display the main screen graphic for the device to be activated; use one of the screens if
the device appears on more than one screen.
2. If applicable, verify that all permissive conditions are displayed correctly on the pop-up
window. Set all permissions to allow the device to be operated. Verify that the device
cannot be operated if the permissions are not met.
3. Repeat step 2 for every operation, for example, start/open and stop/close. If the start
permit has not been met, the device cannot be started.
4. Display the control window and verify proper functioning of the push buttons. Verify the
corresponding feedbacks are correctly displayed on the main screen and pop-up
window.
5. Disable the simulator logic by turning points off-scan and verify the failure logic and
alarms.
6. If applicable, verify that any alarm conditions for the device are properly displayed on
the graphic, pop-up window and/or on the alarm list.
7. Check off and date the corresponding material when the test is successfully complete.
There may be variances associated with the completed test, but the material can still
be checked off.
ANALOG LOGIC TEST
The following general procedure will be used to test the analog logic.
1. Display the main screen graphic for the device to be activated; use one of the screens if
the device appears on more than one screen.
2. By viewing the logic diagram, clear all conditions that may cause the following:
a. Manual Rejects (MRE)
b. Auto Rejects (ARE)
c. Priority Lower (PLW)
d. Priority Raise (PRA)
e. Raise Inhibit (RAI)
f. Lower Inhibit (LWI)
3. Display the control station and manually position the control output to establish flow.
4. Verify that the corresponding process variable is being calculated, compensated, or
selected correctly by using the graphic and/or control tune screens.
5. Place the control station in automatic and verify that the transfer is bumpless.
6. Verify the operation of any setpoint/bias stations.
7. Ramp the process or setpoint and verify that the controller responds in the correct
direction and provides stable, responsive control action.
8. Verify the following interlocks and overrides:
a. Manual Rejects (MRE)
b. Auto Rejects (ARE)
c. Priority Lower (PLW)
d. Priority Raise (PRA)
e. Raise Inhibit (RAI)
f. Lower Inhibit (LWI)
9. Check off and date the corresponding material when the test is successfully complete.
There may be variances associated with the completed test, but the material can still
be checked off.
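The bumpless transfer checked in step 5 can be illustrated with a minimal sketch. This hypothetical PI controller is not the Ovation algorithm; it only shows the back-initialization idea: at the moment of transfer, the integral term is set so that the first automatic output equals the last manual output.

```python
class PIController:
    """Toy PI controller used only to illustrate bumpless manual-to-auto transfer."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def transfer_to_auto(self, manual_output, error):
        # Back-calculate the integral so the output is continuous at transfer.
        self.integral = manual_output - self.kp * error

    def update(self, error, dt=1.0):
        self.integral += self.ki * error * dt
        return self.kp * error + self.integral

pid = PIController()
manual_out = 42.0                        # output held in manual before transfer
error_at_transfer = 0.0                  # PV tracking SP at the transfer instant
pid.transfer_to_auto(manual_out, error_at_transfer)
first_auto_out = pid.update(error_at_transfer)   # equals manual_out: no bump
```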
Graphical displays show informational points that do not necessarily interface to logic sheets.
Testing of these points should be conducted using the following procedure.
1. Display a graphical display and the point information window.
2. On a point-by-point basis, display the information for each point in the point information
window.
3. Verify that the corresponding point is correctly assigned.
4. Verify that the corresponding point information is correct.
5. Check off and date the corresponding material in the appendices when the test is
successfully complete. There may be variances associated with the completed test,
but the material can still be checked off.
ALARM POINT TEST
Alarms are generated and configured in a manner to reduce unnecessary alarms, display
correct information, and assist the operator in addressing the alarm action. The following
general procedure should be followed to verify alarms.
1. Display the main screen graphic where the alarm indication is assigned.
2. Verify the alarm indication is hidden when not in any alarm condition.
3. If applicable, verify that any alarm cutout is applied, for example by activating the alarm
when the associated device is not in operation.
4. Verify alarm deadband, if applicable, by raising/increasing values within and exceeding
the deadband range.
5. Activate alarm.
6. Verify alarm priority and, if applicable, the alarm sound.
7. Verify the alarm description is meaningful rather than a cryptic message.
8. Verify from the alarm list that the associated summary diagram can be accessed.
9. Verify graphical alarm symbol is displayed.
10. Verify alarm setpoints by taking the points off-scan and increasing/decreasing the value.
11. Verify the alarm acknowledge and alarm reset functions.
12. If applicable, verify the alarm sound turns off after acknowledgment.
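The deadband behavior verified in steps 4 and 5 can be sketched as follows. The high limit and deadband values are hypothetical, and this generic logic is an illustration rather than the Ovation alarm implementation: the alarm asserts above the limit but only clears once the value falls below (limit - deadband), which prevents alarm chatter near the setpoint.

```python
def alarm_state(value, in_alarm, high_limit=100.0, deadband=2.0):
    """Return the new alarm state for a high alarm with a clearing deadband."""
    if value > high_limit:
        return True                      # exceeds the limit: alarm asserts
    if value < high_limit - deadband:
        return False                     # clears only below the deadband
    return in_alarm                      # inside the band: the state is held

state = False
state = alarm_state(101.0, state)             # above limit -> alarm asserts
state_held = alarm_state(99.0, state)         # within deadband -> alarm holds
state_clear = alarm_state(97.5, state_held)   # below the band -> alarm clears
```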
7. SUPPORTING SYSTEMS
The Ovation DCS is supported by a number of additional functions and technologies, including the
Ovation Process Historian, Ovation Security Manager, Object Linking, and more. The following
will step through each supporting piece as supplied.
The following standard functions of the OPH could be demonstrated upon request but are not
required for FAT completeness.
▪ Historical Trending – process historical trends
▪ Historical Review – point, alarm, sequence of events (SOE), operator events
review
The following user accounts and passwords shall be configured and tested accordingly:
ANTIVIRUS
The Anti-Virus system included with Ovation is used to distribute virus definitions within the
Ovation system. New virus definitions are released by Emerson and are to be physically
transferred to the Anti-Virus workstation for import and system update.
Item                  Complete/Date
Antivirus Software    ________
Antivirus License     ________
8. SYSTEM ACCEPTANCE/DISCREPANCY RESOLUTION
MATERIAL SIGN-OFFS
During the FAT, the appropriate sections of this document and any Appendices will be
checked-off by an authorized representative of the customer. Upon completion of each major
section, the customer representative and Emerson will sign-off on the attached FAT Sign-Off
Sheet.
Any discrepancies identified during the execution of the Factory Acceptance Test Plan will be
entered in a Material Punch List.
MATERIAL PUNCH-LIST
The Punch List is used to record significant discrepancies that are identified during the FAT.
Each discrepancy will be recorded on a variance form. At the end of each day, the Punch List
is to be reviewed and a plan of action identified to correct any system discrepancy.
The Punch List for Variance Reports is contained at the end of this document as a reproducible form.
VARIANCE REPORTS
Variance Reports are mainly used to report corrective problems or discrepancies. These
include changes that identify a deficiency in the system software, or a deviation from the 100%
review. These changes may include misinterpretation of existing functional logic, or
misinterpretation of graphical layouts, labels, or colors.
Variance Report Forms may also be used to identify enhancement/design alteration changes.
These changes are items that alter previously supplied engineering documentation, or new
engineering that the customer wishes to have implemented. They may include items such as
alterations to device logic, addition of graphic components or devices, or expansion upon
existing logic or graphical components. The contents of a design alteration will vary from minor
changes to complex ones.
Emerson’s project engineer will characterize all variance reports as corrective or design
alteration. Corrective problems will have priority over alterations for resolution. Design
alterations will be handled as requests and the implementation of such requests will not impact
the completeness of the Site Acceptance Test.
SYSTEM RELEASE
After the conclusion of the FAT, Emerson will provide a plan for resolving and re-testing any
open variances. The Customer and Emerson will perform the re-testing on-site, as mutually
agreed.
Upon successful completion of the Factory Acceptance Test and approval by the Customer,
the system will be shipped to site.
9. FACTORY ACCEPTANCE TEST SIGN-OFF
9.1 FACTORY ACCEPTANCE TEST SIGN-OFF (FAT CERTIFICATE)
Project Title: TANJUNG BATU EVERGREEN PROJECT
Customer Purchase Order:
Project Number: M1284
By signing the Acceptance Certificate, the customer agrees that the system is built according to
specification. Shipment of the equipment to the customer site is scheduled on ________.
*Signatures are provided with the understanding that open variances shown in the variance log
are to be resolved and verified as mutually agreed.
APPENDIX-1
FAT VARIANCE REPORT VARIANCE #
CLASSIFICATION (select one):
Class 1: Testing must stop for immediate evaluation and correction.
Class 2: Testing may continue, but correction procedures will be initiated at the end of the current session.
Class 3: Testing may continue; the variance will be corrected & tested prior to shipping.
Class 4: Testing may continue; the variance is not major and will be corrected & tested during site implementation.
Document Reference
Control Logic Reference
Graphics Reference
Description:
ACTION STATUS
Resolution:
FINAL STATUS
*Corrections are any items absent or clearly wrong according to previously supplied materials.
*Design alterations are any clarifications, changes, or additions made to any previously supplied materials.
FAT VARIANCE LOG APPENDIX-2
Variance #   Class   Alt   Time   Description   Status (Hold / Done)
DIGITAL TEST CHECKLIST APPENDIX-3
Device Name
Faceplate No
□ Checked □ NA □ Checked □ NA
Manual (Reset) & Auto Select Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Tagout & Reset Tagout Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Auto Start & Stop Remarks: NA Remarks: NA
□ Checked □ NA □ Checked □ NA
Stop Override Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Start Permit Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Control Power / AVBL Signal Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Start Fail
Remarks: Remarks:
Fail to Start Time:
□ Checked □ NA □ Checked □ NA
Stop Fail
Remarks: Remarks:
Fail to Stop Time:
□ Checked □ NA □ Checked □ NA
Running Feedback Remarks: Remarks:
□ Checked □ NA □ Checked □ NA
Stopped Feedback Remarks: Remarks:
Name: Name:
Function: Function:
Sign: Sign:
ANALOG TEST CHECKLIST APPENDIX-4
Device Name
Faceplate No
□ Verified □ NA □ Verified □ NA
Manual & Auto Select Remarks: Remarks:
□ Verified □ NA □ Verified □ NA
Output Set, Raise & Lower
Remarks: Remarks:
Output Ramp Rate:
□ Verified □ NA □ Verified □ NA
Setpoint Set, Raise & Lower
Remarks: Remarks:
Setpoint Ramp Rate:
□ Verified □ NA □ Verified □ NA
Bias Set, Raise & Lower
Remarks: NA Remarks: NA
Bias Ramp Rate
□ Verified □ NA □ Verified □ NA
Manual Reject Remarks: Remarks:
□ Verified □ NA □ Verified □ NA
Auto Reject Remarks: NA Remarks: NA
□ Verified □ NA □ Verified □ NA
Priority Lower Remarks: NA Remarks: NA
□ Verified □ NA □ Verified □ NA
Priority Raise Remarks: NA Remarks: NA
□ Verified □ NA □ Verified □ NA
Lower Inhibit Remarks: NA Remarks: NA
□ Verified □ NA □ Verified □ NA
Raise Inhibit Remarks: NA Remarks: NA
□ Verified □ NA □ Verified □ NA
PID Direction Remarks: Remarks:
Name: Name:
Function: Function:
Sign: Sign:
Appendix - 5 GPA FAT Checklist
The Global Performance Advisor (GPA) includes customized modules that meet the specific performance
monitoring requirements for the above project. Depending on the plant configuration and the
information available to evaluate plant equipment and process performance factors, the following
modules are included in the scope.
During the Factory Acceptance Test, enter the date and the values across each column mentioned
in the test procedures. First, run the GPA software in offline mode and verify the results.
• Develop at least one test case based on design and/or performance data.
• Verify the configured GPA performance results against the test case.
• In this manner, the configured GPA can be evaluated and tested in a controlled environment,
independent of the DCS interface and the project database.
The performance software in offline mode receives static (initial) data from the known set of the test
case. The test case is taken from the guaranteed condition as per the HBD (100% GT load). The
initial design values for the guaranteed test conditions are entered manually in the performance
software. With the initial design values as input, the software will generate the output as per the
pre-configured set of algorithms. During the Factory Acceptance Test, offline values will be
simulated in the DCS database.
The following procedure will be done during the FAT to validate the software results.
• Generating the import file (.IMP file) from the performance software.
After completing the test, request the customer to sign the GPA FAT certificate and the minutes of
the meeting, verifying acceptance of the GPA Performance Monitoring System.
Calculated in accordance with the ASME PTC test code, "Performance Test Code on Gas
Turbines," 1997, the combustion turbine calculation adjusts for either natural gas or fuel oil
input. These fuel parameters can be adjusted via the data entry screens so the system can
convert between a volumetric and a mass basis. If the combustion turbine is also fired by liquid
Test Date:
Remarks:
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
Remarks:
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
Test Date:
Remarks:
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
Using design performance data provided by the steam turbine manufacturer, the system
conducts a heat balance around the turbine and calculates the overall turbine efficiency for
single-stage casing, non-extracting turbines. Using PTC 6, "Performance Test Code 6 on
Steam Turbines," 1996, the system calculates the turbine cycle heat rate. These calculations
are corrected to reference conditions using correction curves provided by the steam turbine
manufacturer. For two-stage casing turbines, the system calculates individual high-pressure
and intermediate/low-pressure stage efficiencies and then compares them to design stage
Test Date:
Remarks:
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
Note: NA – No data available
Test Date:
Remarks:
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
This module computes the net & gross turbine cycle heat rate.
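As a generic illustration of a cycle heat-rate computation: heat rate is the fuel heat input divided by the electrical output, with the net figure using gross output less auxiliary load. The actual GPA algorithms, correction curves, and tag assignments are project-specific, and the fuel and load figures below are made up for the example.

```python
def heat_rate(fuel_flow_kg_s, lhv_kj_kg, power_mw):
    """Cycle heat rate in kJ/kWh: fuel heat input divided by electrical output."""
    heat_input_kw = fuel_flow_kg_s * lhv_kj_kg        # kJ/s of fuel = kW thermal
    return heat_input_kw * 3600.0 / (power_mw * 1000.0)

# Hypothetical operating point: 10 kg/s of fuel at 50,000 kJ/kg LHV,
# 180 MW gross generator output, 6 MW auxiliary (house) load.
gross_hr = heat_rate(10.0, 50000.0, 180.0)        # gross heat rate
net_hr = heat_rate(10.0, 50000.0, 180.0 - 6.0)    # net = gross output - aux load
```

A lower heat rate means a more efficient cycle; the net value is always higher than the gross value because the auxiliary load reduces the power delivered to the grid.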
Test Date:
Remarks:
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
Remarks:
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
This is to certify that the subject system was duly verified in accordance with the agreed
factory acceptance test procedure. The set-up of the system software was found to be in
accordance with the specified project requirements.