United Nations Environment Programme (UNEP)
Data Engineer, P3

Nairobi | Full Time | General

Closing in 3 weeks from now

Responsibilities

  • Data pipeline and integration engineering
  • Design, build and maintain secure, performant and reusable data pipelines and ETL/ELT processes to ingest, clean, transform and deliver environmental, administrative and partner data from multiple internal and external sources (APIs, databases, flat files, EO feeds) into UNEP's corporate data platforms and the World Environment Situation Room.
  • Implement internal process improvements: automate manual ingestion, optimize data delivery, and redesign data architectures for greater scalability, observability and cost efficiency.
  • Develop and maintain data catalogues, metadata and lineage that enable users to discover and understand available datasets and ensure compliance with UNEP's data governance policies.
  • Apply infrastructure-as-code and CI/CD practices (e.g., Git-based workflows, automated tests, containerization) in the deployment of data pipelines across development, test and production environments.
  • Platform operations and quality
  • Provide technical support to programme units to onboard new data sources, improve data flows and use self-service analytics safely.
  • Participate in agile ceremonies, technical reviews and UNEP-wide digital communities of practice to share good practices in data engineering and DevOps.
  • Perform other duties as may be required.

Education

  • Advanced university degree (Master's degree or equivalent) in computer science, information systems, data management, engineering or a related field is required. A first-level university degree in combination with two additional years of qualifying experience may be accepted in lieu of the advanced university degree.

Work Experience

  • A minimum of five (5) years of progressively responsible experience in data engineering, data integration, data management or a related area is required. Experience building and operating data pipelines from multiple sources, using ETL/ELT tools, is required.
  • Experience with database programming languages (e.g., SQL) and at least one high-level programming language used in data engineering (such as Python) is required.
  • Experience with version control and CI/CD toolchains (e.g., Git, GitLab/GitHub, Jenkins, Azure DevOps) for deploying data workflows is required.
  • Experience with cloud or hybrid data infrastructure and with the deployment of data solutions in containerized environments is desirable.
  • Experience working in or with UN entities, international organizations or other large, distributed organizations is desirable.
