Data Management Engineer

HQ

Job Details

Job Title: Data Management Engineer
Section: Data Management
Functional Level: Entry
Department: Technology Platform
Sector: Operations

Reporting Lines

Reports directly to: Business Integration & Enterprise Architecture Deputy Director
No. of subordinates: N/A
Budget: N/A

Job Purpose

Design, build, customize, and integrate data from various sources; manage data in big data environments; and design and write complex data pipelines.

Job Responsibilities

Specific objectives and deliverables will be agreed as part of your annual Performance and Development Review (PDR) process; the points below summarize your main duties and responsibilities.

Job-specific responsibilities:

  1. Architect and implement distributed data pipeline systems.
  2. Architect and implement distributed real-time data stream processing systems.
  3. Develop data processing over large data sets in data lakes/big data ecosystems to support data services and analytics requirements.
  4. Find trends in data sets and use, build, and develop algorithms to make raw data more useful to the enterprise.
  5. Identify ways to improve data reliability, efficiency, and quality.
  6. Support our software developers, database architects, data analysts, and data scientists on data initiatives and ensure an optimal data delivery architecture.
  7. Deploy a variety of data mining/data analysis methods.
  8. Use data to discover tasks that can be automated.
  9. Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies.
 10. Assess the effectiveness and accuracy of new data sources and data-gathering techniques.
 11. Apply and support custom data models and algorithms for data sets.
 12. Apply and support predictive models.
 13. Develop the company data testing framework and test model quality.
 14. Develop processes and tools to monitor and analyze model performance and data accuracy.
 15. Coordinate with different functional teams to implement data pipelines and streams and monitor outcomes.

KPIs (Key Performance Indicators)

  • Requirement Document Quality: 16.6%
  • Analysis / Design Quality: 16.6%
  • Customer Experience: 16.6%
  • Reporting and Documentation: 16.6%
  • Mean Time to Repair (MTTR): 16.6%
  • Meeting SLAs / OLAs: 16.6%

Competency Model

Core Competencies (Level)

  1. Working Strategically: 1
  2. Customer Orientation: 1
  3. Critical Thinking: 2
  4. Agility: 1
  5. Innovation: 2
  6. Leadership: 1

Technical Competencies (Level)

  1. Problem Solving: B
  2. Project Management: B
  3. Customer Experience: B

Specifications (Qualifications, Experience, and Skills)

Education      

Specialization:

  • Minimum of a bachelor’s degree in Statistics, Operations Research, Mathematics, Physics, Computer Science, Engineering, or a related field.
  • A drive to learn and master new technologies and techniques.
  • Experience working with statistical programming languages such as R.
  • Solid understanding of and experience with Python, pandas, pyspark, dash, and flask.

Experience                 

  • Minimum 2 years of relevant experience.

Conditions of Employment

  1. Days Off: 2 days/week
  2. PPE / Health, Safety, and Environment requirements: No
  3. Working Environment: 98% standard office work & 2% out of office.
  4. Working Hours: 9:00 AM to 5:00 PM.
