
Specialist Data Engineer (KE)
Absa Bank Limited
Nairobi | Full Time | Banking / Financial Services
Closing in 1 week
Job Summary
- Work embedded as a member of a squad, or across multiple squads, to produce, test, document and review algorithms & data-specific source code that supports the deployment & optimisation of data retrieval, processing, storage and distribution for a business area.
Job Description
Data Architecture & Data Engineering
- Understand the technical landscape and bank-wide architecture that is connected to or dependent on the supported business area in order to effectively design & deliver data solutions (architecture, pipelines, etc.)
- Translate / interpret the data architecture direction and associated business requirements, leveraging expertise in analytical & creative problem solving to synthesise data solution designs that build a solution from its components, going beyond analysis of the problem
- Participate in design thinking processes to successfully deliver data solution blueprints
- Leverage state-of-the-art relational and NoSQL databases, as well as integration and streaming platforms, to deliver sustainable, business-specific data solutions.
- Design data retrieval, storage & distribution solutions and/or components thereof, contributing to all phases of the development lifecycle (e.g. the design process)
- Develop high-quality data processing, retrieval, storage & distribution designs in a test-driven & domain-driven / cross-domain environment
- Build analytics tools that utilise the data pipeline, quickly producing well-organised, optimised and documented source code & algorithms to deliver technical data solutions (an illustrative pipeline sketch follows this list)
- Create & maintain sophisticated CI/CD pipelines, authoring and supporting them in Jenkins or similar tools, and deploy to multi-site environments, supporting and managing your applications all the way to production
- Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef
- Debug existing source code and polish feature sets.
- Assemble large, complex data sets that meet business requirements & manage the data pipeline
- Build infrastructure to automate the delivery of extremely high volumes of data
- Create data tools for analytics and data science teams that assist them in building and optimizing data sets for the benefit of the business
- Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
- Apply general design patterns and paradigms to deliver technical solutions
- Inform & support the infrastructure build required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
- Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
- Implement & align to the Group Security standards and practices to ensure the indisputable separation, security & quality of the organisation’s data
- Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short-term deployment must align to strategic long-term delivery.
- Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices e.g. OLAs, IaaS, PaaS, SaaS, containerisation, etc.
- Monitor the performance of data solution designs & ensure ongoing optimisation of data solutions
- Stay ahead of the curve on data processing, retrieval, storage & distribution technologies and processes, tracking global best practices & trends to ensure best practice
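For illustration only, the sketch below shows the kind of streaming data pipeline these responsibilities describe, assuming Apache Spark 3.x with the Kafka connector on the classpath; the topic name, schema, broker address and S3 paths are placeholders rather than Absa specifics.

```scala
// Minimal sketch of a streaming retrieval -> processing -> storage pipeline.
// All names (topic, broker, buckets, columns) are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TransactionStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-stream") // hypothetical job name
      .getOrCreate()

    // Hypothetical payload schema for events arriving on the Kafka topic.
    val schema = new StructType()
      .add("accountId", StringType)
      .add("amount", DoubleType)
      .add("ts", TimestampType)

    // Retrieval: read raw events from a streaming platform (Kafka here).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
      .option("subscribe", "transactions")              // placeholder topic
      .load()

    // Processing: parse the JSON payload and keep only well-formed records.
    val parsed = raw
      .select(from_json(col("value").cast("string"), schema).as("event"))
      .select("event.*")
      .filter(col("accountId").isNotNull)

    // Storage & distribution: land the curated stream as Parquet for consumers.
    parsed.writeStream
      .format("parquet")
      .option("path", "s3a://example-curated-bucket/transactions/")            // placeholder path
      .option("checkpointLocation", "s3a://example-curated-bucket/_chk/txns/") // placeholder path
      .start()
      .awaitTermination()
  }
}
```

A job of this shape would typically be packaged and deployed through the CI/CD pipeline (e.g. Jenkins) and submitted to the cluster with spark-submit.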
Risk & Governance
- Identify technical risks and mitigate these pre-, during & post-deployment
- Update / design all application documentation aligned to the organisation’s technical standards and risk / governance frameworks
- Create business cases & solution specifications for various governance processes e.g. CTO approvals
- Participate in incident management & DR activity, applying critical thinking, problem solving & technical expertise to get to the bottom of major incidents
- Deliver on time & on budget, always
Must have experience in:
- Spark and Scala development
- Hadoop
- ETL
- AWS S3 buckets
- Data engineering (a brief illustrative ETL sketch follows this list)
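As a rough indication of how these skills fit together, the following is a minimal batch ETL sketch, assuming Apache Spark with the S3A filesystem configured; bucket names, columns and partitioning are illustrative assumptions.

```scala
// Minimal batch ETL sketch tying the listed skills together (Spark, Scala, ETL, AWS S3).
// Paths and columns are placeholders, not real Absa resources.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyBatchEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-batch-etl") // hypothetical job name
      .getOrCreate()

    // Extract: raw CSV dropped into a landing bucket (placeholder path).
    val raw = spark.read
      .option("header", "true")
      .csv("s3a://example-landing-bucket/daily/*.csv")

    // Transform: basic cleansing and typing of a hypothetical amount column.
    val cleaned = raw
      .dropDuplicates()
      .withColumn("amount", col("amount").cast("double"))
      .filter(col("amount").isNotNull)

    // Load: write curated Parquet back to S3, partitioned by business date.
    cleaned.write
      .mode("overwrite")
      .partitionBy("business_date") // assumes this column exists in the source
      .parquet("s3a://example-curated-bucket/daily/")

    spark.stop()
  }
}
```

In practice such a job would be submitted via spark-submit to a Hadoop/YARN or comparable cluster, with S3 credentials supplied through the platform's standard configuration.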
Education
- Bachelor's Degree: Information Technology