
PoojaGCP

Pooja Patel has over 11 years of experience in DevOps, Data Flow, Big Query, and AI technologies, with a strong focus on cloud-native solutions and digital transformation. She has held senior roles at Wipro and Standard Chartered Bank, leading technical execution and implementing DevOps practices while managing complex data architectures. Pooja is certified in various cloud and DevOps technologies and has received awards for her contributions to innovation and leadership in technical solutions.

Uploaded by

Pooja Patel

POOJA PATEL

8826621116
[email protected]

Professional Summary:
 Over 11 years of expertise in DevOps, Dataflow, BigQuery, and Innovation & Gen AI technologies. Skilled in the onboarding, design,
development, testing, and deployment of AI-driven use cases and cloud-native solutions. Proven ability to manage end-to-end delivery in
complex, distributed environments, aligning technical solutions with business growth, risk, and compliance needs. Adept at defining technical
strategies, driving digital transformation, and collaborating with cross-functional teams to deliver innovative, high-quality solutions. Strong
leadership skills with a focus on coaching and guiding teams to achieve technical excellence and meet business objectives.
 Experience in architecting and designing applications, creating multi-tier architectures following microservices and service-oriented architectural
principles, business development, and collaboration with technical teams in cloud environments. Experience in configuring Continuous Integration
(CI) servers such as Jenkins, SonarQube, and CodePipeline. High-level understanding of the Amazon Web Services global infrastructure and service
migrations, cloud orchestration & automation, security, identity & access management, monitoring and configuration, governance & compliance,
application delivery, data protection, and image and patch management, while focusing on core business priorities.
 Well versed in AWS services such as EC2, ECS, CloudFront, Auto Scaling, CloudFormation, CloudTrail, ELB, and SQS.
 Experience in upstream/downstream marketing and product management within the medical device industry. Demonstrated history of analyzing
client needs and translating customer requirements into innovative, value-added product solutions. Collaborative leader skilled in communicating
with people from diverse backgrounds and at all levels. Utilizes a unique blend of strategic, technical, clinical, and marketing skills to drive cross-
functional teams, leading to successful launches of highly technical products.
 Consistently optimizes and improves NLP systems by evaluating strategies and testing changes in machine learning models.
 Leading Data Management and Governance initiatives to ensure optimal data quality, security, and compliance.
 Created and maintained data pipelines in Azure Data Factory using Linked Services to ETL data from different sources such as Azure SQL, Blob Storage,
ADLS, and Azure SQL Data Warehouse. Strong understanding of Hadoop architecture, Hadoop clusters, HDFS, Job Tracker, Task Tracker, NameNode,
DataNode, MapReduce, and Spark. Created and trained models in ML, deep learning, computer vision, and NLP.
 Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading
nested JSON-formatted data into Snowflake tables.
 Proven track record in mutual funds, managed accounts, and separate accounts.
 Experience with Azure Cloud, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Azure Analytical Services, Big Data
technologies (Apache Spark), and Databricks.
 Experienced in working with AWS services such as Auto Scaling, DynamoDB, Route 53, EC2 for compute, S3 for storage, EMR, and CloudWatch to
run and monitor Spark jobs, and with GCP BigQuery; strong understanding of machine learning and statistics.
 Hands-on experience in migrating on-premises ETLs to Google Cloud Platform (BANK) using BigQuery, Cloud Storage, Dataproc, and Composer.
 Experience in writing Map Reduce programs using Apache Hadoop for analyzing Big Data.
 Develop and train generative AI models such as GANs, VAEs, or Transformer-based models (e.g., GPT) using large datasets. Implement state-of-the-art
architectures and algorithms for specific tasks like image generation, text generation, or music composition.
 Stay updated with the latest advancements in generative AI research. Experiment with novel techniques and methodologies to improve model
performance, efficiency, and scalability.
 Hands-on experience in writing ad-hoc queries for moving data from HDFS to Hive and analyzing the data using HiveQL.
 Experience in designing, developing, testing and maintaining BI applications and ETL applications.
 Extensively worked on Spark using Scala on clusters for analytics; installed it on top of Hadoop and performed advanced analytical
applications using Spark with Hive and SQL/Oracle/Snowflake.
 Expertise in Python data extraction and data manipulation, and widely used python libraries like NumPy, Pandas, and Matplotlib for data analysis.
 Built ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL; wrote SQL queries against Snowflake.
 Played a key role in migrating Teradata objects into the Snowflake environment on GCP.
 Created a connection from Azure to an on-premises data center using the Azure Express Route for Single and Multi-Subscription.
 Experience in streaming data using Kafka as a platform in batches and real-time
 Have hands on experience with Snowflake Data warehouse. Created Schemas, Tables and views. Improved the performance by optimizing the
views for data validations.
 Scheduled and automated data pipeline jobs using Airflow, Oozie, and Control-M.
 Implemented data movement from the file system to Azure Blob Storage using the Python API.
 Wrote a Kafka consumer topic to move Adobe clickstream JSON objects into the data lake.
 Experience on working with file structures such as text, CSV, JSON, sequence, parquet, and Avro file formats.
 Experience with list comprehensions and Python built-in functions such as map, filter, and lambda.
 Knowledge of tools like Snowflake, SSIS, SSAS, SSRS to design warehousing applications.
 Expertise in using Sqoop & Spark to load data from MySQL/Oracle to HDFS or HBase.

Public
 Proficient in the integration of various data sources, with multiple relational databases like Oracle 11g/10g/9i, Sybase 12.5, Teradata, and
flat files, into the staging area, Data Warehouse, and Data Mart.
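As a quick illustration of the list-comprehension and built-in-function experience noted above, the following self-contained sketch shows two equivalent transformation pipelines; the sample data is hypothetical and purely for illustration:

```python
# Filter even numbers and square them, two equivalent ways.
nums = [1, 2, 3, 4, 5, 6]  # hypothetical sample data

# List comprehension: filter and transform in one expression.
squares_comp = [n * n for n in nums if n % 2 == 0]

# map/filter with lambda: the same pipeline, composed from built-ins.
squares_map = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, nums)))

assert squares_comp == squares_map == [4, 16, 36]
```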
KEY SKILLS
 DevOps (CI/CD, Automation, Kubernetes, Jenkins)
 Data Flow Management and Big Query
 AI & Machine Learning Deployment
 Cloud Computing (AWS, Google Cloud, Azure)
 Agile & Scrum Methodologies
 Digital Transformation and Innovation
 Stakeholder Management and Expectation Setting
 Solution Design and Architecture
 Risk & Compliance Adherence
 Coaching and Mentoring Teams
 Project and Resource Management
 Technical Debt Management
 Technical Strategy Alignment
 Continuous Integration/Delivery Practices
CERTIFICATIONS
 Google Cloud Professional Data Engineer
 AWS Certified Solutions Architect – Associate
 Certified DevOps Professional (DO101)
 Certified Kubernetes Administrator (CKA)
 Certified Big Data Professional
 Agile Scrum Master Certification
 Certified Information Systems Security Professional (CISSP) (optional)

AWARDS AND RECOGNITION


 Innovator of the Year Award, [Company], [2022] – For outstanding contributions to driving AI and cloud-based innovations.
 Excellence in Leadership Award, [Company], [2020] – Recognized for exceptional coaching and mentorship in delivering complex technical
solutions.

Professional Experience:

Wipro – Senior Consultant, May 2021 to present


Responsibilities:

 Lead the technical execution and delivery of Innovation and Gen AI use cases, overseeing the full lifecycle of new use case onboarding, design, coding,
testing, and deployment into production.
 Collaborate with global Risk and Compliance IT architecture teams to ensure solutions, developments, and integrations adhere to technical
standards, governance, and compliance requirements.
 Define and execute the technical strategy in alignment with Group and Compliance/Enterprise Technology Innovation strategy, ensuring that
technology solutions meet the business needs for growth, control, and innovation.
 Establish a vision for project and people management, providing clear direction and feedback to teams of developers, testers, analysts, and
architects to ensure high-quality deliverables that meet standards and best practices.
 Address technical debt while driving digital transformation and continuous innovation across teams by working closely with stakeholders,
business leaders, and Solution Architects globally.
 Facilitate DevOps practices for continuous integration, delivery, and testing, ensuring seamless collaboration between teams and rapid deployment
cycles.
 Design and implement data flow management solutions leveraging Big Query, ensuring efficient data processing and analytics pipelines.
 Manage the supply and demand pipeline, providing guidance to stakeholders for decision-making that aligns with business objectives and optimizes
resource allocation.
 Foster strong, trustworthy relationships with business and technical stakeholders, managing expectations and delivering solutions that maximize
business value and customer outcomes.
 Spearhead the adoption of cloud technologies and AI-powered solutions, enabling business units to enhance decision-making processes, improve
automation, and drive innovation.
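The BigQuery data-flow work described above can be sketched as a minimal reporting job. The table, column, and function names below are hypothetical assumptions for illustration; the query itself is kept as a pure string-building function so it can be inspected without GCP credentials:

```python
def daily_event_counts_sql(table: str, day: str) -> str:
    """Build an aggregation query over one partition day.

    `table`, `event_type`, and `event_ts` are hypothetical names,
    not taken from any real project.
    """
    return (
        f"SELECT event_type, COUNT(*) AS n "
        f"FROM `{table}` "
        f"WHERE DATE(event_ts) = '{day}' "
        f"GROUP BY event_type ORDER BY n DESC"
    )

def run_report(table: str, day: str):
    # Requires the google-cloud-bigquery package and GCP credentials
    # (GOOGLE_APPLICATION_CREDENTIALS) at runtime.
    from google.cloud import bigquery
    client = bigquery.Client()
    return list(client.query(daily_event_counts_sql(table, day)).result())

if __name__ == "__main__":
    print(daily_event_counts_sql("my_project.analytics.events", "2021-06-01"))
```

Keeping the SQL construction separate from the client call makes the query testable offline and leaves the credentialed part isolated in one function.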

Standard Chartered Bank – 04/2019 to 03/2021

 Led the development and implementation of DevOps solutions, enabling teams to automate deployment pipelines, increase efficiency, and
enhance quality in production environments.
 Designed and optimized data flow architectures, focusing on building scalable and performant systems using Big Query for large-scale data
storage and analytics.
 Collaborated with cross-functional teams to integrate AI models into production environments, ensuring seamless deployment, monitoring,
and continuous learning.
 Mentored junior developers and provided leadership in setting best practices for coding standards, testing procedures, and cloud
infrastructure setup.

 Worked with stakeholders to define use cases and technical requirements, translating business needs into actionable technical solutions.

HSBC Bank – Assistant Manager, 08/2013 to 04/2019


 Managed cloud infrastructure on AWS, including EC2 instances, Lambda functions, and RDS databases, achieving 99.9% uptime.
 Implemented Docker and Kubernetes for container orchestration, improving deployment speed and resource management.
 Supported clients in the design, development, and deployment of cloud-native applications, with a focus on implementing DevOps practices to
streamline development cycles and improve operational efficiency.
 Utilized Big Query to build and optimize data pipelines, ensuring robust and scalable data flow for real-time analytics.
 Coordinated across various business units to understand their objectives, translating these into actionable technical requirements for cloud migration
and digital transformation projects.
 Built and maintained automation scripts for server provisioning, patching, and configuration management.
 Developed monitoring dashboards with Grafana and Prometheus to track infrastructure health and alert on failures, reducing response time to
incidents.
 Collaborated with the development team to build automated testing and release pipelines, improving code quality and reducing deployment errors.
 Managed database backups, replication, and failover strategies to ensure data integrity and disaster recovery.
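The clickstream-to-data-lake pipeline mentioned in the summary can be sketched as a pure transform fed by a Kafka consumer. The field names and the consumer wiring below are hypothetical assumptions, not the actual production schema:

```python
import json

def flatten_clickstream(raw: str) -> dict:
    """Flatten one Adobe-style clickstream JSON event into a flat record.

    The nested field names (visitor.id, page.url, timestamp) are
    hypothetical; real Adobe payloads differ.
    """
    event = json.loads(raw)
    return {
        "visitor_id": event.get("visitor", {}).get("id"),
        "page": event.get("page", {}).get("url"),
        "ts": event.get("timestamp"),
    }

# In production this transform would be fed by a consumer loop, e.g.
# with kafka-python (sink function is hypothetical):
#   for msg in KafkaConsumer("clickstream", bootstrap_servers="..."):
#       write_to_datalake(flatten_clickstream(msg.value.decode("utf-8")))

sample = '{"visitor": {"id": "v1"}, "page": {"url": "/home"}, "timestamp": 1}'
assert flatten_clickstream(sample) == {"visitor_id": "v1", "page": "/home", "ts": 1}
```

Keeping the flattening logic separate from the consumer makes it unit-testable without a running Kafka cluster.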

Education Details

DAVV – IBS, Indore
MBA in Business Decision Making, Feb. 2010

DAVV University – PIMR, Indore, M.P.
BBA – 70%, Feb. 2008

