
Lead Data Engineer

National Archives of Australia

Canberra, ACT · Baseline clearance · Closes 11 May 2026 at 11:59pm

About the Role

The Unified Data View project aims to aggregate, unify and present collection metadata from multiple, disparate sources into a single, cohesive view, facilitating better decision making, improving data accessibility, and enhancing data analysis capabilities for staff across the National Archives of Australia.

The National Archives is seeking a Data Engineer to join the Project Delivery Team to support the delivery of this strategic initiative.

To be eligible for the role, you must be an Australian citizen. If selected, you will be required to successfully undergo a pre-employment check and, due to the urgency of resource requirements, National Archives prefers candidates to already hold a Baseline security clearance.

Our ideal candidate will have proven experience in data engineering and a deep understanding of data analytics, to help the project implement a data warehousing system that follows the business logic and rules dictated by archival control practices at the National Archives of Australia. The candidate must also demonstrate the interpersonal skills to foster and promote National Archives values, highly developed written and oral communication skills, the ability to mentor staff, and the ability to negotiate and problem-solve.

Key duties and responsibilities

Under limited direction, the main duties of the role include:       

  • Data Solution Design & Architecture

    • Enhance and maintain scalable data solutions and infrastructure.

    • Configure and optimise data warehouse architecture, applying best practices and standardised data models.

    • Design and maintain source-to-target data mappings for the data warehouse.

  • Data Integration & Pipeline Management

    • Build and operate ETL/ELT pipelines to ingest, transform, and store metadata from RecordSearch, Mediaflex, Preservica, and other sources.

    • Identify, prioritise, and ingest datasets to support reporting, analytics, and operational needs.

  • Data Quality, Governance & Compliance

    • Implement validation, cleansing, and governance processes to ensure accuracy, consistency, and compliance.

    • Establish and monitor data quality standards, proactively resolving issues.

  • Analytics & Visualisation

    • Develop Power BI reports and visualisations and apply data storytelling to support decision-making.

  • Collaboration & Stakeholder Engagement

    • Work with internal project teams and external stakeholders to align solutions with business and technical requirements.

  • Values & Organisational Alignment

    • Apply APS Values, the Code of Conduct, and WHS principles, and promote the National Archives’ Vision and Strategy 2030.

Required skills and experience

  • Cloud & DevOps

    • Azure DevOps for version control, tasks, branching/merging.

    • Mandatory: Git repositories and Fabric deployment pipelines.

    • Knowledge of Azure architecture, optimisation and cost management.

  • Microsoft Fabric

    • Mandatory: Experience with Fabric data loading, architecture and orchestration.

  • Project Delivery Experience

    • Backlog prioritisation, test scenario creation, UAT support and SQL validation.

    • Strong documentation discipline across mapping, lineage and traceability.

  • Data Governance & Security

    • Mandatory: Data quality, validation, cleansing and governance expertise.

    • Understanding of government security, privacy and compliance regulations (Archives Act, Privacy Act).

  • Analytical & Visualisation Skills

    • Mandatory: Power BI dashboard/report development.

    • Strong analytical and data storytelling skills.

  • Security clearance

    • Able to hold and maintain a Baseline clearance.

  • Desirable

    • Certifications: Microsoft Fabric (DP-600, DP-700), Power BI (PL-300), Azure (AZ-900), Data Engineering, or Business Intelligence.

    • Familiarity with RecordSearch, Preservica, and Mediaflex is advantageous but not essential.

Technical skills

Essential skills:
• Strong applied skills in data engineering, including database fundamentals, data modelling, and ETL/ELT pipeline development.
• Proficiency in Microsoft Fabric is mandatory.
• A minimum of 5 years’ experience in Power BI, including an understanding of data modelling, Power Query, and Data Analysis Expressions (DAX), is mandatory.
• A minimum of 5 years’ experience with Python for data processing and automation is mandatory.
• Familiarity with PySpark, pandas, and Fabric management.
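
As a hypothetical sketch of the kind of Python/pandas data processing and validation work this role describes (the source system, column names, and quality rules below are illustrative only, not taken from the position description):

```python
import pandas as pd

def clean_metadata(records: list[dict]) -> pd.DataFrame:
    """Normalise a raw metadata extract into a tidy, de-duplicated frame."""
    df = pd.DataFrame(records)
    # Standardise identifiers and strip stray whitespace from titles.
    df["item_id"] = df["item_id"].str.strip().str.upper()
    df["title"] = df["title"].str.strip()
    # Coerce dates; invalid values become NaT rather than raising an error.
    df["date_registered"] = pd.to_datetime(df["date_registered"], errors="coerce")
    # Basic quality rule: drop rows with no usable identifier, then de-duplicate.
    return df.dropna(subset=["item_id"]).drop_duplicates(subset=["item_id"])

# Illustrative extract: one duplicate record and one orphan with no identifier.
raw = [
    {"item_id": " a123 ", "title": "Cabinet notebook ", "date_registered": "2021-03-04"},
    {"item_id": "A123", "title": "Cabinet notebook", "date_registered": "2021-03-04"},
    {"item_id": None, "title": "Orphan record", "date_registered": "not a date"},
]
tidy = clean_metadata(raw)
print(len(tidy))  # 1
```

In a production pipeline, the same validation and cleansing rules would typically be codified as reusable checks so that data quality can be monitored continuously rather than applied ad hoc.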

Requirements

The buyer has specified that each candidate must provide a one-page pitch addressing all specified criteria. This is equivalent to 5,000 characters.

Essential criteria

  1. Technical Expertise
    Proven capability in designing and optimising scalable data solutions in Microsoft Fabric and Azure, including the architecture and configuration of enterprise data warehouses.
    Strong expertise in data mapping and modelling, producing accurate ERDs, domain classifications, and table groupings that reflect business processes, while analysing view logic, data lineage, and cross system dependencies to ensure transparency and integrity.
    Skilled in rationalising complex data environments by analysing and consolidating system knowledge into clear, consumable artefacts for technical and leadership stakeholders.
    Knowledge and understanding of data modelling, Power Query, and Data Analysis Expressions (DAX). Familiarity with PySpark, pandas, Microsoft Fabric management CLI tools, and schema design for interoperability is desirable.
    Relevant certifications in Microsoft Fabric, Azure, Data Engineering, or Business Intelligence.

  2. Cloud & DevOps
    Demonstrated experience using Azure DevOps for version control, task management, and branching and merging, utilising Git repositories and Fabric deployment pipelines.
    Understanding of cloud architecture principles, performance optimisation, and cost management in Microsoft Azure environments.

  3. Project Delivery Experience
    Backlog prioritisation such as balancing business value with technical dependencies.
    Testing support such as writing test scenarios, validating data outputs, supporting UAT, and performing SQL based checks.
    Documentation discipline such as maintaining mapping documents, lineage diagrams, and requirements traceability.

  4. Data Governance & Security
    Knowledge of data quality frameworks, validation, cleansing, and governance processes.
    Awareness of security, privacy, and compliance requirements relevant to government data (e.g., Archives Act, Privacy Act).

  5. Analytical & Visualisation Skills
    Experience creating data maps by extracting, analysing, and documenting view logic to form a cohesive view of source systems and source-to-target mapping.
    Use of critical thinking, curiosity and analytical abilities to design and develop interactive dashboards and reports in Power BI
    Strong data analysis and storytelling skills to support decision-making.

  6. Quality Assurance and Deliverable Management
    Ensures that project and product quality reviews are conducted on schedule and according to procedure. Manages deliverables to be completed within agreed cost, time, and resource constraints, and ensures formal acceptance by stakeholders.

Desirable criteria

  1. Information Management and Decision Support
    Captures and disseminates technical and business information effectively. Facilitates business decision-making processes and provides informed feedback to promote understanding across stakeholder groups.

  2. Security Clearance
    Preference is for the candidate to hold a Baseline security clearance due to the tight timeframe for project delivery; however, candidates with the ability to obtain and maintain a Baseline security clearance will be considered.

  3. Desirable Skills and Qualifications
    Relevant certifications in Microsoft Fabric (DP-600, DP-700), Power BI (PL-300), Azure (AZ-900), Data Engineering, or Business Intelligence. Knowledge of metadata management and interoperability standards.
    Understanding of collection data within the Galleries, Libraries, Archives and Museums sector would be advantageous.

  4. Familiarity with RecordSearch, Preservica, and Mediaflex is advantageous but not essential.


Rate on application

If there is mutual interest, we will talk you through the rate structure and next steps in more detail.

Apply for this role

Submit your resume and any supporting information you would like us to consider. If there is a fit, we will be in touch to discuss the role further.

By applying, you agree to our Privacy Policy. Resumes are stored securely and only accepted in supported file formats.