Data Engineer – DV 2.0 Certified

Website Digital Team

We are seeking a highly skilled Data Engineer with mandatory DV 2.0 Certification to join our growing data engineering team. In this role, you will play a pivotal part in delivering a next-generation, cloud-based data platform on Snowflake (Azure). Your primary focus will be the strategic migration of legacy SQL Server/SSRS solutions into a modern, scalable ecosystem: rebuilding complex transformation logic and implementing Data Vault 2.0 modeling patterns. This is an opportunity for a structured, detail-oriented engineer to drive auditable, high-performance analytics delivery for Power BI and beyond.


Core Responsibilities: Platform Migration & DV 2.0 Implementation

You will be responsible for the end-to-end engineering lifecycle, from raw data ingestion to the creation of reporting-ready information marts.

  • Data Vault 2.0 Architecture: Design and implement robust DV 2.0 structures, including Hubs, Links, and Satellites. You will be tasked with re-modeling legacy reporting data into these scalable patterns and developing Business Vault logic to codify complex business rules.

  • Snowflake Engineering: Build and optimize high-performance ELT pipelines within Snowflake. You will develop advanced SQL-based transformations and stored procedures, ensuring adherence to strict engineering and performance standards.

  • Azure-Based Ingestion & Orchestration: Manage data landing as Parquet/Delta files and utilize Azure Data Factory (ADF) for orchestration. You will be responsible for onboarding new sources and maintaining reliable, secure pipeline operations.

  • Reporting Enablement: Create and maintain downstream reporting layers (Information Marts/Dimensional patterns) specifically optimized for Power BI and automated data outputs via Power Automate.

  • Agile Project Delivery: Operate within a Scrum environment (Jira), contributing to sprint planning, estimation, and peer reviews. You will actively participate in knowledge-sharing sessions to uplift the team’s collective DV 2.0 and Snowflake capability.
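For candidates less familiar with the terminology above, the Hub/Satellite pattern referenced in these responsibilities can be sketched in Snowflake SQL roughly as follows. This is an illustrative sketch only; all table and column names (e.g., hub_customer, hk_customer) are hypothetical and not taken from our actual platform:

```sql
-- Hypothetical DV 2.0 structures for a "customer" business concept.
-- Names and types are illustrative, not from this posting.

-- Hub: one row per unique business key, keyed by a hash of that key.
CREATE TABLE hub_customer (
    hk_customer    BINARY(16)    NOT NULL,  -- hash key over the business key
    customer_id    VARCHAR       NOT NULL,  -- natural (business) key
    load_dts       TIMESTAMP_NTZ NOT NULL,  -- load timestamp
    record_source  VARCHAR       NOT NULL,  -- originating system
    CONSTRAINT pk_hub_customer PRIMARY KEY (hk_customer)
);

-- Satellite: descriptive attributes, historized by load timestamp,
-- with a hash_diff column for change detection on incremental loads.
CREATE TABLE sat_customer_details (
    hk_customer    BINARY(16)    NOT NULL,  -- references hub_customer
    load_dts       TIMESTAMP_NTZ NOT NULL,
    hash_diff      BINARY(16)    NOT NULL,  -- hash of descriptive columns
    customer_name  VARCHAR,
    record_source  VARCHAR       NOT NULL,
    CONSTRAINT pk_sat_customer_details PRIMARY KEY (hk_customer, load_dts)
);
```

In this pattern, satellite loads insert a new row only when `hash_diff` changes, which is what makes the model auditable: history is never updated in place.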


Candidate Profile

We are looking for a “Data Architect-Engineer Hybrid” who understands the rigors of Data Vault methodology as well as the technical nuances of cloud performance tuning.

  • Experience: 4–5 years of dedicated experience in Data Engineering roles delivering enterprise-grade platforms.

  • Mandatory Certification: Data Vault 2.0 Certified (CDVP2) is a strict requirement for this role.

  • Technical Mastery:

    • Deep expertise in Data Vault 2.0 modeling and implementation.

    • Strong SQL foundations, including complex procedural logic and performance tuning.

    • Hands-on experience with Snowflake (Azure ecosystem preferred).

  • Cloud & Tools Proficiency:

    • Experience with Azure Data Factory (ADF) and cloud data orchestration.

    • Familiarity with Parquet/Delta file-based ingestion patterns.

    • Exposure to DV 2.0 automation/modeling tools (e.g., IRIS or similar) is highly desirable.

    • Understanding of CI/CD and DevOps practices for data engineering.

  • Analytical Mindset: Ability to translate fragmented legacy reporting requirements into unified, scalable data models.


Standardized Job Data

Position: Data Engineer (DV 2.0 Certified)
Experience Level: 4–5 Years (Intermediate–Senior)
Mandatory Cert: Data Vault 2.0 (CDVP2)
Tech Stack: Snowflake, Azure, SQL, ADF, Parquet/Delta
Primary Methodology: Data Vault 2.0, Agile/Scrum
Downstream Focus: Power BI Information Marts, Power Automate
Key Metrics: Pipeline Reliability, Data Auditability, Query Performance

Why Join This Team?

  • Greenfield Innovation: Lead the migration from legacy SQL Server to a cutting-edge Snowflake architecture.

  • Methodological Rigor: Work in an environment that prioritizes the structural integrity and auditability of Data Vault 2.0.

  • Professional Growth: Collaborate with architects and domain experts to solve complex data modeling challenges in a high-growth cloud environment.