Interview Questions
If you want to configure TMS, you should log in to the domain controller system with client 000 and the DDIC user; the domain controller should be the development system.
First log in to the domain controller system and go to transaction STMS; a pop-up appears, enter the domain controller details in that pop-up and save.
Then log in to the other system (for example the quality system) with client 000 and the DDIC user and go to STMS; the same pop-up appears. In the pop-up, choose the "Other configuration" option, enter the domain controller's host and instance (system) number, and save.
Then log back in to the domain controller system and approve the quality and production systems.
Transport routes define the path along which objects are moved from one system to another.
Transport layers define which route a package's objects follow from one system to another.
SAP is the standard transport layer.
==============================================================================
TR stuck / not moving:
First I will check whether the RDDIMPDP job is running; if not, I will schedule it by running program RDDNEWPP.
I will check whether tp can connect to the SID by running the tp connect <SID> command at OS level (see the command sketch after this list).
I will check whether the tmp directory under /usr/sap/trans is being updated; if it is not, I will check whether any files with names like .lob and .loc exist there, and if so, rename them.
I will check the TRJOB and TRBAT tables for any old entries.
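As a rough sketch of the OS-level checks above (the paths and the domain profile name are typical values, not taken from these notes):

    # check that tp can reach the target system's database
    cd /usr/sap/trans/bin
    tp connect <SID> pf=/usr/sap/trans/bin/TP_DOMAIN_<DOMAINCTRL>.PFL

    # check whether the tmp directory is being updated and whether stale .lob/.loc files exist
    ls -lrt /usr/sap/trans/tmp

Old entries in TRJOB and TRBAT can then be reviewed from SE16 (or directly on the database) before cleaning them up.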
TR import failed:
I will check the TR log files first; based on the log file and the return code I will investigate further. For example, if the return code is 08, we should contact the TR owner; if it is a higher return code, I will check the ALOG, ULOG, and SLOG files for error information.
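A hedged example of checking the transport logs at OS level (the file-name pattern and grep targets are typical, adjust for your system):

    cd /usr/sap/trans/log
    ls -lrt                     # newest log files last
    grep <TR_number> ALOG*      # import steps and return codes per request
    grep -i error SLOG* ULOG*   # tp system log and tp call log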
========================================================================
Kernel Upgrades:
First I will download the database-dependent and database-independent kernel files from the SAP marketplace; these are SAPEXEDB and SAPEXE.
I will create one folder, copy those files into it, and uncar the files using SAPCAR -xvf.
Whenever you start the SAP system, sapcpe is triggered and copies the kernel files from the global directory into the instance kernel directory.
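A minimal sketch of the kernel exchange at OS level (the directory names and stop/start commands are assumptions for a typical Unix system, not prescribed by these notes):

    # extract the downloaded archives into a new folder
    mkdir /tmp/newkernel && cd /tmp/newkernel
    SAPCAR -xvf /path/to/SAPEXE_<patch>.SAR
    SAPCAR -xvf /path/to/SAPEXEDB_<patch>.SAR

    # stop SAP, back up the current global kernel, copy the new one in
    stopsap r3
    cp -rp /sapmnt/<SID>/exe /sapmnt/<SID>/exe_backup
    cp -rp /tmp/newkernel/* /sapmnt/<SID>/exe/

    # on restart, sapcpe copies the global kernel into the instance exe directory
    startsap r3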
============================================================================
An SAP Note should be implemented in the development system only (transaction SNOTE). Whenever you implement a note, a transport request is created, and that request must be imported into the other systems.
Before implementing the note, check that its status is "Can be implemented".
==========================================================================
SUM upgrade:
Create a new folder under /usr/sap/<SID>/SUM, copy the SUM software into it, and uncar it using the SAPCAR -xvf command.
Once we uncar SUM, some folders are created, such as abap, startup, and jvm.
Once we perform the pre-steps, we launch the SUM tool.
The command is:
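The exact command is not written out in these notes; as an assumption, for SUM 2.0 registered with the SAP Host Agent it typically looks like this (run as root from the SUM directory), after which the tool prints the browser URL:

    cd /usr/sap/<SID>/SUM/abap
    ./SUMSTART confighostagent

    # the UI is then usually reachable at
    # https://<hostname>:1129/lmsl/sumabap/<SID>/doc/sluigui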
Copy the URL and open it in a browser. Once SUM is launched in the browser, the phases start; they are:
Extraction
Configuration
Checks
Preprocessing
Execution
Post processing
In the Extraction phase it asks for the stack.xml file location and the DDIC, SYSTEM (database) user, and <sid>adm passwords.
In the Configuration phase it asks for the configuration mode: single, standard, or advanced. I selected standard at that time. These modes are used to minimize downtime. Then enter the number of SQL, R3trans, R3load, and ABAP processes for uptime and downtime.
It asks for the shadow instance SID and instance number.
Select the SGEN option and the batch job server, and it asks whether any SAP Notes need to be implemented.
In the Checks phase, SUM checks whether any space needs to be increased; the space requirements can be checked in the checks.log file.
In the Preprocessing phase, it asks you to lock the development workbench; if you click OK, the development workbench is locked and users can no longer perform any development tasks.
After that, the shadow instance is created and data is copied to it using R3load processes. During the shadow phase, SPDD is prompted for the data dictionary objects; with the help of a developer, we adjust the data dictionary objects in the shadow instance.
Note: we must log in to the shadow instance with client 000 and the DDIC user, activate (unlock) another user, and then log in again with that user.
Whenever you perform the SPDD adjustment, a transport request is created; that TR can be imported into the other systems.
After that, it asks you to take a backup; click Next to move on to the Execution phase.
In the Execution phase (the downtime), the tool performs the upgrade activities and copies the upgraded data from the shadow instance to the original instance.
In the Postprocessing phase, it prompts for SPAU (repository objects); for this activity we also work with the developers, and it can be completed within 14 days.
SUM log files can be checked on the right-hand side of the SUM screen, and at OS level under /usr/sap/<SID>/SUM/abap/log.
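For example, the main upgrade log can be followed at OS level (assuming the usual SAPup.log file name):

    tail -f /usr/sap/<SID>/SUM/abap/log/SAPup.log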
DB Refresh:
Pre-steps: first we need to log in to the source system (production) and perform two activities: take a backup of the source system and generate the control file trace file (see the sketch below).
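A hedged sketch of generating the control file trace on the source (Oracle) system; the trace file is written to the Oracle trace/diag directory:

    sqlplus / as sysdba
    SQL> alter database backup controlfile to trace;
    SQL> exit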
Then copy these two files (the backup and the trace file) to the target system (quality or test) and perform some pre-steps in the target system, such as:
Take screenshots of t-codes like STMS, WE20 (partner profiles), SCC4, RZ04, SMLG, and so on.
Create one transport request and add the RFC tables (RFCDES and related RFC tables), the USR02 table, and the BDLS-relevant tables to that request.
Check with the functional and ABAP teams which TRs need to be re-imported after the refresh.
Open the trace file copied from the source system, remove the lines above STARTUP NOMOUNT and below the CHARACTER SET UTF8 line, rename the source SID to the target SID, change REUSE to SET and NORESETLOGS to RESETLOGS in the CREATE CONTROLFILE statement, and change ARCHIVELOG to NOARCHIVELOG.
Once we complete all the pre-steps, we can restore the database with the brrestore command, using the backup and the edited trace file.
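A hedged example of the restore call (the backup log name is a placeholder; the exact options depend on the backup type and BR*Tools version):

    brrestore -b <backup_log>.anf -m full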
Post-steps are of two types: one from the DB side and the other from the SAP application side.
First we perform the DB post-steps, i.e. regenerating the control file:
Start the database in NOMOUNT and execute @control.sql (the edited trace script); the control file is recreated and the database is mounted.
Then open the database with the resetlogs option using the command ALTER DATABASE OPEN RESETLOGS.
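A minimal sketch of these DB post-steps (assuming the edited trace script was saved as control.sql):

    sqlplus / as sysdba
    SQL> startup nomount
    SQL> @control.sql                      -- recreates the control file and mounts the DB
    SQL> alter database open resetlogs;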
Then we have to drop and regenerate the OPS$ users if required (for older versions).
First, set the background work process parameter to 0 (zero) at OS/profile level (see the profile sketch below), then start the SAP application and suspend all background jobs by running report BTCTRNS1 in SE38.
Then perform the post activities, reconfiguring the required settings using the screenshots taken in the pre-steps.
Change the background work process parameter back to its previous value, restart the system, and resume the background jobs by running report BTCTRNS2 in SE38.
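The notes do not name the parameter; as an assumption, the usual approach is to set the number of background work processes to zero in the instance profile before the restart and revert it afterwards:

    # instance profile, before starting the system for post-processing (assumed parameter)
    rdisp/wp_no_btc = 0

    # after the post-steps, restore the original value and restart
    rdisp/wp_no_btc = <previous value>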
Perform the logical system name conversion using the BDLS t-code (it can take 6 to 8 hours).
If there is a slowness issue for a transaction or user, we can activate a trace; in the trace we can find details about where the most time is being spent.
Client creation
SCC4 is the t-code to create a client. Before creating the client, we should create a logical system to uniquely identify the client in the landscape; BD54 or SALE is the t-code to create the logical system.
Once you create the client, you can log in to it using user SAP* with password PASS.
If you are not able to log in with SAP*, set the parameter login/no_automatic_user_sapstar = 0 (0 enables the automatic SAP* user, 1 disables it).
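For example, as a profile entry (depending on the release, an instance restart may be needed for it to take effect):

    login/no_automatic_user_sapstar = 0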
Local client copy: used to copy client data between clients within the same system.
Log in to the target client, go to t-code SCCL, select the client profile, and schedule it as a background job.
Remote client copy: used to copy client data between clients in different systems.
To perform a remote client copy, you must create an RFC connection between the systems.
Then log in to the target client, go to t-code SCC9, select the RFC destination and the client profile, and schedule it as a background job.
Export and import client copy: used to copy client data between clients in different systems, but to perform an export/import client copy, TMS must be configured.
First log in to the source client, go to t-code SCC8, select the client profile, and schedule it as a background job.
Then log in to the target system client, go to t-code STMS_IMPORT, select the KT request, and import it.