Experience

5+ years

Seniority level

Senior

Employment type

Full-Time

Overview

We are seeking a Senior Data Engineer to design, build, and optimize the data pipelines and streaming infrastructure powering a next-generation Smart City Data Platform. The role focuses on building real-time and batch data pipelines, integrating multiple operational systems, and enabling advanced analytics and AI capabilities across city services. The Senior Data Engineer will play a critical part in building the data ingestion and transformation layer, enabling data to flow from operational platforms such as mobile applications, IoT systems, access control systems, utilities platforms, and external vendor systems into the centralized data platform. The ideal candidate combines strong experience in data engineering, distributed systems, and cloud data platforms with the ability to design scalable data architectures.

Responsibilities

Data Platform Development

  • Design and implement scalable data ingestion pipelines for structured and unstructured data.

  • Build real-time data streaming pipelines using technologies such as Apache Kafka or similar event streaming platforms.

  • Develop ETL / ELT pipelines to transform operational data into analytics-ready datasets.

Data Integration

  • Integrate multiple data sources, including:

      • APIs

      • databases

      • IoT platforms

      • vendor systems

      • event streams

  • Implement Change Data Capture (CDC) pipelines to capture data updates from operational systems.

Data Transformation & Modeling

  • Implement data transformation and aggregation pipelines using distributed processing frameworks.

  • Work with data architects to implement lakehouse data models (Bronze, Silver, Gold layers).

  • Ensure data quality, consistency, and reliability across the platform.

Performance & Scalability

  • Optimize pipelines for large-scale data processing and low-latency ingestion.

  • Monitor and improve pipeline performance and system reliability.

  • Implement best practices for data partitioning, indexing, and storage optimization.

Platform Collaboration

  • Work closely with data architects, analytics teams, AI/data science teams, and platform engineers to ensure data pipelines support analytics and machine learning use cases.

Monitoring & Reliability

  • Implement monitoring and alerting for data pipelines.

  • Troubleshoot pipeline failures and data inconsistencies.

  • Maintain high availability of the data infrastructure.


Qualifications

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Engineering, or a related field.

  • 5+ years of experience in data engineering or big data platforms.

  • Strong experience building data pipelines and distributed data systems.


Technical Skills

Programming

  • Python

  • SQL

  • Java or Scala (preferred)

Data Engineering

  • ETL / ELT pipeline development

  • Data modeling

  • Data transformation frameworks

Streaming & Messaging

Experience with one or more of:

  • Apache Kafka

  • Kafka Connect

  • Event Hub

  • RabbitMQ

Data Platforms

Experience with modern data platforms such as:

  • Databricks

  • Snowflake

  • BigQuery

  • Redshift

  • Azure Data Lake

Data Processing Frameworks

Experience with:

  • Apache Spark

  • Spark Streaming

  • Airflow or similar orchestration tools

Cloud Platforms

Experience with cloud environments such as:

  • Microsoft Azure

  • AWS

  • Google Cloud

Preferred Experience

  • Experience building near real-time analytics platforms

  • Experience working with IoT or smart city data systems

  • Experience implementing CDC pipelines using Debezium

  • Experience with data lakehouse architectures


Soft Skills

  • Strong analytical and problem-solving skills

  • Ability to design scalable and reliable systems

  • Excellent collaboration and communication skills

  • Ability to work in cross-functional engineering teams


Key Deliverables in This Role

The Senior Data Engineer will contribute to delivering:

  • A real-time data ingestion platform

  • Data pipelines integrating city systems

  • A scalable lakehouse data architecture

  • Analytics-ready datasets for BI dashboards

  • Data pipelines supporting AI models


Why Join Unparticle

  • Work on a large-scale Smart City platform

  • Build cutting-edge real-time data infrastructure

  • Collaborate with leading technology teams and data scientists

  • Contribute to transforming urban digital ecosystems
