DBT - Note2024-Roles

This job description is for a Technical Lead - Data Build Tool who will design and develop data integration solutions using tools like Data Build Tool and Azure Data Factory. Some key responsibilities include creating DBT models and maintaining dependencies, building CI/CD pipelines for deployments, integrating various data sources, analyzing data in Snowflake, and collaborating with teams to design effective cloud migration solutions and ensure data quality.

Uploaded by vr.sf99

Job description

As a Technical Lead - Data Build Tool, you will be part of an Agile team building
healthcare applications and implementing new features while adhering to best coding
and development standards.
Responsibilities:

- Design and develop data integration solutions using Data Build Tool (DBT) and
  Azure Data Factory (ADF) ETL tools
- Create DBT models and maintain dependencies across the models
- Build and maintain CI/CD pipelines for DBT deployments
- Write custom test scripts and implement various hook techniques
- Integrate various sources using Snowpipe (GCP, AWS, Azure)
- Analyze and validate data in the Snowflake warehouse
- Build metric models at the semantic layer
- Work with complex SQL functions and enable transformation of large data sets
  in Snowflake
- Design and implement data models, develop pipelines for incremental data
  updates and historical data capture, optimize performance, ensure data quality,
  and collaborate with team members to support data needs
- Collaborate with cross-functional teams to understand business requirements
  and design effective cloud migration solutions
- Perform data validation, testing, and troubleshooting during and after
  migration to ensure data integrity and quality
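To illustrate the incremental-model and hooks responsibilities above, a minimal DBT model might look like the sketch below. The model name, source, hook SQL, and column names are hypothetical, assumed only for illustration; a real project would use its own sources and audit conventions.

```sql
-- models/staging/stg_claims.sql  (hypothetical model and source names)
{{
  config(
    materialized='incremental',
    unique_key='claim_id',
    -- a post-hook runs arbitrary SQL after the model builds,
    -- e.g. logging to a hypothetical audit table
    post_hook="insert into audit.model_runs values ('stg_claims', current_timestamp)"
  )
}}

select
    claim_id,
    member_id,
    claim_amount,
    updated_at
from {{ source('raw', 'claims') }}

{% if is_incremental() %}
  -- on incremental runs, only pull rows newer than what is
  -- already in the target table ({{ this }})
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run DBT builds the full table; on subsequent runs the `is_incremental()` branch restricts the scan to new rows and merges them on `unique_key`, which is the usual pattern for the incremental updates mentioned above.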
