Lead Data Engineer (Must have exp. in Fivetran)
Posted on July 1, 2025
Job Description
- Strong hands-on expertise in SQL, DBT and Python for data processing and transformation.
- Lead the design and development of data pipelines (batch and real-time) using modern cloud-native technologies (Azure, Snowflake, DBT, Python).
- Minimum 8 to 10 years of experience.
- Expertise in Azure data services (e.g., Azure Data Factory, Synapse, Event Hub) and orchestration tools.
- Translate business and data requirements into scalable data integration designs.
- Strong experience with Snowflake, including schema design, performance tuning, and security model.
- Guide and review development work across data engineering team members (onshore and offshore).
- Good understanding of DBT for the transformation layer and modular pipeline design.
- Define and enforce best practices for coding, testing, version control, CI/CD, data quality, and pipeline monitoring.
- Hands-on with Git and version control practices: branching, pull requests, code reviews.
- Collaborate with data analysts, architects, and business stakeholders to ensure data solutions are aligned with business goals.
- Understanding of DevOps/DataOps principles: CI/CD for data pipelines, testing, monitoring.
- Own and drive end-to-end data engineering workstreams, from design to production deployment and support.
- Knowledge of data modeling techniques: star schema, Data Vault, normalization/denormalization.
- Provide architectural and technical guidance on platform setup, performance tuning, cost optimization, and data security.
- Experience with real-time data processing architectures is a strong plus.
- Drive data engineering standards and reusable patterns across projects to ensure scalability, maintainability, and reusability of code and data assets.
- Proven leadership experience: should be able to mentor team members, take ownership, and make design decisions independently.
- Define and oversee data quality frameworks to proactively detect, report, and resolve data issues across ingestion, transformation, and consumption layers.
- Strong sense of ownership, accountability, and solution-oriented mindset.
- Act as a technical go-to team member for complex design, performance, or integration issues across multiple teams and tools (e.g., DBT + Snowflake + Azure pipelines).
- Ability to handle ambiguity and work independently with minimal supervision.
- Contribute to hands-on development as well, for end-to-end integration pipelines and workflows.
- Clear and confident communication (written and verbal); must be able to represent design and architecture decisions.
- Document using Excel, Word, or tools like Confluence.
Required Skills
- Azure Data Factory
- Synapse
- Event Hub