- Working on large-scale data engineering projects
- Contract opportunity
- Auckland CBD
About the Role
We’re looking for a Snowflake Data Engineer on a 6-month contract within a large-scale enterprise setting. You’ll play a key role in building and optimising cloud-native data pipelines, transforming datasets, and contributing to a modern data stack using best-in-class tools.
What You’ll Be Doing
- Designing and maintaining robust Snowflake-based data warehouses for high-performance analytics
- Developing and managing transformation models using dbt
- Building and orchestrating ELT pipelines with tools like Meltano
- Collaborating across teams using GitLab for version control and CI/CD workflows
- Working within a modern, scalable data environment, contributing to data quality, governance, and delivery standards
What You’ll Bring
- Deep hands-on experience with Snowflake in large-scale data engineering or warehousing projects
- Solid background in data modelling and transformation using dbt
- Proven experience in ELT workflows, ideally with Meltano or similar tooling
- Comfortable using GitLab for code versioning, automation, and collaboration across teams
- Strong SQL skills and a good understanding of cloud-based data architecture
Nice to Have
- Fivetran for data pipeline automation
- Coalesce or Airflow (or the AWS-managed version, Amazon MWAA)
- Exposure to modern orchestration and scheduling tools
How to Apply
If you're an experienced Data Engineer with a passion for modern data platforms and cloud tooling, apply today and one of our consultants will be in touch.