Key Responsibilities

Design, develop, and maintain data pipelines, using dbt for data transformations and Airflow for orchestration

Develop and maintain dbt models for data transformations, ensuring data quality and consistency

Implement and manage Airflow DAGs to orchestrate data pipelines, including scheduling, dependency management, and error handling
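For candidates unfamiliar with the terms above: an Airflow DAG runs tasks in dependency order, on a schedule, with retries on failure. Setting the real Airflow API aside, that core orchestration pattern can be sketched in plain Python (the function and task names below are hypothetical, for illustration only):

```python
from collections import deque


def run_pipeline(tasks, deps, retries=2):
    """Run tasks in dependency order with simple retry-based error handling.

    tasks: dict mapping task name -> zero-argument callable
    deps:  dict mapping task name -> list of upstream task names
    Returns (execution order, dict of task results).
    """
    # Kahn's algorithm: a task becomes runnable once all upstreams finish.
    indegree = {t: len(deps.get(t, [])) for t in tasks}
    downstream = {t: [] for t in tasks}
    for t, ups in deps.items():
        for u in ups:
            downstream[u].append(t)

    ready = deque(t for t, d in indegree.items() if d == 0)
    order, results = [], {}
    while ready:
        t = ready.popleft()
        order.append(t)
        # Error handling: retry a failing task before giving up,
        # analogous to Airflow's per-task `retries` setting.
        for attempt in range(retries + 1):
            try:
                results[t] = tasks[t]()
                break
            except Exception:
                if attempt == retries:
                    raise  # permanent failure after exhausting retries
        for d in downstream[t]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    return order, results
```

In Airflow itself the same structure is declared with a `DAG` object, operators, and `>>` dependency arrows (e.g. `extract >> transform >> load`), and the scheduler supplies the loop above.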

Implement and enforce data quality checks and validation rules using dbt and Airflow
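In practice, dbt expresses such rules as generic tests (`not_null`, `unique`, `accepted_values`), where a test passes when its query returns zero failing rows. As an illustration of that convention only, the same checks can be sketched in plain Python over rows represented as dicts (all function names here are hypothetical, not dbt's API):

```python
def failing_rows(rows, predicate):
    """Indices of rows violating the predicate; an empty list means the
    check passes, mirroring dbt's zero-failing-rows convention."""
    return [i for i, row in enumerate(rows) if not predicate(row)]


def not_null(column):
    """Row-level rule: the column must have a value."""
    return lambda row: row.get(column) is not None


def accepted_values(column, allowed):
    """Row-level rule: the column's value must be in the allowed set."""
    return lambda row: row.get(column) in allowed


def duplicate_rows(rows, column):
    """Indices of rows whose column value repeats an earlier row's value."""
    seen, dupes = set(), []
    for i, row in enumerate(rows):
        value = row.get(column)
        if value in seen:
            dupes.append(i)
        seen.add(value)
    return dupes
```

In a real pipeline these checks run inside the warehouse as SQL (via `dbt test`), with Airflow failing the DAG run when any check reports failing rows.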

Monitor data pipelines for errors and failures, and implement solutions to prevent recurring issues

Collaborate with data analysts, data scientists, and other engineers to understand data requirements and develop solutions

Technical Skills

Proficiency in SQL and experience with relational databases

Strong experience in dbt implementation and best practices

Experience with Apache Airflow and DAG development

Experience with AWS, Snowflake, Redshift, and Python

Skills

Mandatory Skills: Snowflake (Data Science), Snowpark Container Services