Sai Terraform
Technical Proficiency
● DevOps tools: Puppet, HPOO, Docker, Git, Jenkins, Ansible, Ansible Tower, Bitbucket, Kubernetes, Chef, Terraform, Terragrunt, Packer
● Database: Oracle DBA 11g, cloning, RAC
● Tools: Commvault (RMAN backups), Netcool, Splunk, BSM, BladeLogic Server Automation, GoldenGate
● Ticketing tools: CA Service Desk, JIRA
● Other software expertise: WebSphere, JBoss, shell scripting, Python scripting, YAML, JSON
● AWS: Glue, Glue Catalog databases, crawlers, Lambda, Step Functions, SNS, SQS, Elastic Beanstalk, S3, EC2 instances, VPC, ECS, Fargate, CloudWatch, CloudFormation, Terraform, boto3, AWS CLI, data lakes, Redshift, Attunity Replicate, Talend, CodeCommit, CodeBuild, CodeDeploy, EMR, EKS, IAM
● AWS big data components: Glue, Glue Catalog databases, RDS, EMR, data pipelines
● GCP: VM instances, firewalls, GKE (Kubernetes), Dataflow
● Azure: VMs and networking
● Big data: batch processing, migration of a data lake from on-premises to AWS, Alteryx
Summary
Cloud activities:
● Creation of EC2 instances and security groups, and preparation of application servers with CloudFormation templates and Ansible
● Creation of a data lake, including provisioning of S3, Glue Catalog databases, Glue jobs, DynamoDB tables, and crawlers, plus Step Functions and Lambdas for each ETL layer
● End-to-end provisioning and deployment of the ETL flow on S3 and Glue using Terraform and Terragrunt
● Designing data flows on AWS based on requirements
● Creation of CI/CD pipelines using CodeBuild
● Creating buildspec files for CodeBuild actions
● Creating API Gateways and integrating them with applications and Route 53
● Accessing Lambdas, RDBMS, SNS, SQS, Secrets Manager, and SFTP services via Terraform
● Provisioned a Redshift cluster and performed admin tasks on it
● Created a GKE (Kubernetes) cluster on GCP using Terraform
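A minimal boto3 sketch of the "Step Function per ETL layer" pattern from the list above; the state-machine ARN, layer names, and input schema are illustrative placeholders, not from any specific deployment:

```python
import json


def build_execution_request(state_machine_arn: str, layer: str, run_date: str) -> dict:
    """Pure helper: shape the start_execution payload for one ETL layer."""
    return {
        "stateMachineArn": state_machine_arn,
        "name": f"{layer}-{run_date}",  # execution names must be unique per run
        "input": json.dumps({"layer": layer, "run_date": run_date}),
    }


def start_etl_layer(state_machine_arn: str, layer: str, run_date: str) -> str:
    """Start one ETL layer's Step Functions execution and return its ARN."""
    import boto3  # deferred so the pure helper above works without the SDK
    sfn = boto3.client("stepfunctions")
    resp = sfn.start_execution(**build_execution_request(state_machine_arn, layer, run_date))
    return resp["executionArn"]
```

Keeping the request-building logic as a pure function makes it unit-testable without AWS credentials.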
Key Responsibilities:
Migration of Data Lake:
● Successfully migrated the batch-processing data lake from on-premises to AWS
● Maintenance of the data lake and provisioning of infrastructure for new data products
● Creating CI/CD pipelines for all data products
● Creating S3 buckets, Glue jobs, databases, and DynamoDB tables for all data products as required
● Scheduling runs and maintaining the SNS and SQS setup with Lambdas to manage dependencies between data products
● Deployment of all services from a single base repository using Terraform
● Creating views in Athena
● Worked on Redshift and on Lambdas that load data into Redshift
● Handled Redshift admin tasks, creating users and granting the required permissions so that the business can view data in Redshift
● Created API Gateways and integrated them with the application and data layers
● Undertook full DevOps activities across 12-15 accounts for these products
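One way the SNS/SQS dependency setup above could look as a Lambda handler; the topic ARN, product names, and event shape are hypothetical examples:

```python
import json

# Hypothetical topic ARN; a real deployment would inject this via an
# environment variable set by Terraform.
TOPIC_ARN = "arn:aws:sns:eu-west-1:111122223333:dataproduct-events"


def build_completion_message(product: str, run_date: str, status: str) -> dict:
    """Pure helper: shape the SNS publish call announcing a product run."""
    return {
        "TopicArn": TOPIC_ARN,
        "Subject": f"{product} {status}",
        "Message": json.dumps({"product": product, "run_date": run_date, "status": status}),
    }


def lambda_handler(event, context):
    """Publish a completion event so dependent data products can start."""
    import boto3  # deferred so the helper above is testable without the SDK
    sns = boto3.client("sns")
    msg = build_completion_message(event["product"], event["run_date"], "SUCCEEDED")
    sns.publish(**msg)
    return {"published": msg["Subject"]}
```

Downstream products would subscribe their SQS queues to the topic and trigger their own pipelines from the queue.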
Job Profile
Key Responsibilities:
Migration of Database:
● The main responsibility was migrating data from one database to another using Attunity Replicate
● Converted the migration from a fully manual process to automation with Ansible playbooks using the boto3 and winrm modules
● Using these scripts, whatever data is required can be migrated through Ansible
● Migration of BladeLogic jobs to Ansible using Ansible Tower
● Creating infrastructure on AWS using CloudFormation and Ansible
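A sketch of the boto3 side of such playbook automation: filtering target instances by tag from a describe_instances-shaped response. The tag names and sample data are illustrative only:

```python
def pick_instance_ids(reservations: list, tag_key: str, tag_value: str) -> list:
    """Pure helper: collect instance IDs whose tags match, from the
    'Reservations' list returned by EC2 describe_instances."""
    ids = []
    for reservation in reservations:
        for instance in reservation.get("Instances", []):
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if tags.get(tag_key) == tag_value:
                ids.append(instance["InstanceId"])
    return ids
```

In practice an Ansible playbook or dynamic inventory script would feed the result of `boto3.client("ec2").describe_instances()["Reservations"]` into this helper to pick migration targets.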
Job Profile
Key Responsibilities:
Problem Management:
● Providing support in technical application management as a problem manager.
● Taking database backups, restoring databases, and handling transaction logs.
● Attending bridge and outage calls and driving them efficiently toward solutions to meet SLAs.
● Finding root causes for the bridges and outages that occurred.
● Following up with the respective teams and technology owners to produce solutions for those root causes.
Deployments and Designing:
● Deploying EARs and scripts on the respective servers and instances for nearly 20 products hosted on WebSphere, Tomcat, etc., across the IAT, UAT, and PROD environments
● Worked on BladeLogic, Puppet, Docker, and Splunk
● Developing YAML playbooks for Ansible
● Implementing changes on WebSphere and Oracle instances
● Implementing deployments with HPOO using a central repository in HPOO Central
● Deploying containers in IAT, UAT, and PROD using Docker and Jenkins
● Checking alerts in Netcool for the database and SiteScope for the application
● Providing 24x7 troubleshooting support for applications and databases using DevOps tools
● Releasing emergency hotfixes using DevOps tools
● Creating HPOO flows to automate implementation tasks
● Creating builds in Jenkins, storing them in GitHub, and deploying them via BladeLogic and Docker
● Creating Splunk queries
● Customizing Jenkinsfiles based on requirements
● Creating Python scripts for API calls
● Creating AWS stacks using CloudFormation and Ansible in POCs
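A small stdlib-only sketch of the kind of API-call script mentioned above; the base URL, path, and parameters are hypothetical:

```python
import json
import urllib.parse
import urllib.request


def build_url(base: str, path: str, params: dict) -> str:
    """Compose a full API URL with encoded query parameters."""
    return f"{base.rstrip('/')}/{path.lstrip('/')}?{urllib.parse.urlencode(params)}"


def get_json(base: str, path: str, params: dict, timeout: int = 10) -> dict:
    """GET a JSON endpoint and decode the response body."""
    req = urllib.request.Request(
        build_url(base, path, params),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Using the standard library keeps such scripts dependency-free on locked-down deployment hosts; `requests` would be the usual substitute where it is available.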
Educational Qualification