Data Engineer job at Absa Bank
Posted by: great-volunteer
Posted date: 2025-Jul-09
Location: Kampala, Uganda
Data Engineer at Absa Bank

With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

My Career Development Portal: Wherever you are in your career, we are here for you. Design your future. Discover leading-edge guidance, tools and support to unlock your potential. You are Absa. You are possibility.

Job Summary
Responsible for designing and maintaining secure, scalable ETL pipelines that integrate data from various banking systems, while managing data warehouses and lakes to ensure efficient storage, backup, and replication. The role supports regulatory compliance through automated reporting and real-time processing for fraud detection, and involves collaborating with analysts and data scientists to deliver clean, high-quality data. It is grounded in strong data governance and architecture principles, ensuring that all systems are aligned, reliable, and optimized for performance and compliance.

Job Description

Accountability: Data Pipeline & Integration – 30%
- Design and implement automated ETL (Extract, Transform, Load) pipelines to collect data from core banking systems, mobile apps, ATMs, and third-party APIs (see the illustrative sketch after this list).
- Standardize and transform raw data into consistent formats for downstream systems.
- Ensure secure, encrypted data transfer and enforce access controls to protect sensitive financial information.
- Contribute to the data architecture by defining how data flows across systems, ensuring scalability, modularity, and maintainability.
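By way of illustration, the sketch below shows the basic shape of such a pipeline in Python: extract records from a third-party API, standardize them into a consistent format, and load them into a downstream store. The endpoint, field names, and SQLite target are hypothetical placeholders, not Absa systems.

```python
# A minimal, illustrative ETL sketch. The API URL, record fields, and the
# SQLite "warehouse" are hypothetical stand-ins for real banking systems.
import json
import sqlite3
import urllib.request

API_URL = "https://example.com/transactions"  # hypothetical third-party API

def extract(url: str) -> list[dict]:
    """Pull raw transaction records from an upstream API."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[tuple]:
    """Standardize raw records into a consistent (id, amount, currency) shape."""
    rows = []
    for r in records:
        rows.append((
            str(r["txn_id"]),                 # hypothetical field names
            round(float(r["amount"]), 2),     # normalize amounts to 2 dp
            r.get("currency", "UGX").upper(),
        ))
    return rows

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Write standardized rows into a downstream store (SQLite as a stand-in)."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions "
            "(txn_id TEXT PRIMARY KEY, amount REAL, currency TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO transactions VALUES (?, ?, ?)", rows
        )

if __name__ == "__main__":
    load(transform(extract(API_URL)))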
Accountability: Data Warehousing & Management – 25%
- Build and manage data warehouses and data lakes to store structured and unstructured data efficiently.
- Apply data modeling techniques and optimize storage using indexing, partitioning, and compression (a minimal partitioning sketch follows this list).
- Implement data lifecycle management, including retention, archival, and deletion policies.
- Set up data backup and replication strategies to ensure high availability, disaster recovery, and business continuity.
- Align storage solutions with the bank's enterprise data architecture, ensuring compatibility with analytics, reporting, and compliance systems.
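As a minimal sketch of the partitioning and compression point above (assuming pandas with the pyarrow engine installed; the table layout and paths are hypothetical):

```python
# Illustrative only: writing a partitioned, compressed Parquet dataset with
# pandas/pyarrow as a stand-in for an actual data-lake storage layer.
import pandas as pd

df = pd.DataFrame({
    "txn_date": ["2025-07-01", "2025-07-01", "2025-07-02"],
    "branch":   ["Kampala", "Entebbe", "Kampala"],
    "amount":   [150_000.0, 72_500.0, 310_000.0],
})

# Partition by date and compress with Snappy so queries for a single day
# scan only that day's files instead of the whole dataset.
df.to_parquet(
    "lake/transactions",
    partition_cols=["txn_date"],
    compression="snappy",
)
```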
Accountability: Compliance & Real-Time Processing – 25%
- Automate data preparation for regulatory reporting (e.g., KYC, AML, Basel III) using governed ETL workflows.
- Build real-time data processing systems using tools like Apache Kafka or Spark Streaming for fraud detection and transaction monitoring (see the consumer sketch after this list).
- Ensure data lineage, auditability, and traceability to support compliance audits and internal controls.
- Design real-time processing components as part of the broader data architecture, ensuring they integrate seamlessly with batch systems and reporting tools.
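The sketch below illustrates the real-time monitoring idea with the kafka-python client. The topic name, broker address, and the single threshold rule are hypothetical stand-ins; production fraud detection would apply far richer rules or models.

```python
# Illustrative stream-monitoring sketch using kafka-python
# (pip install kafka-python). All names below are hypothetical.
import json
from kafka import KafkaConsumer

FLAG_THRESHOLD = 10_000_000  # hypothetical UGX limit for manual review

consumer = KafkaConsumer(
    "card-transactions",                 # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    txn = message.value
    # A trivial stand-in rule: flag unusually large transactions as they arrive.
    if txn.get("amount", 0) > FLAG_THRESHOLD:
        print(f"FLAGGED for review: {txn.get('txn_id')} amount={txn['amount']}")
```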
Accountability: Collaboration, Data Quality & Governance – 20%
- Work with data scientists and analysts to deliver clean, reliable datasets for modeling and reporting.
- Apply validation rules, anomaly detection, and monitoring to maintain high data quality across ETL pipelines (a minimal validation sketch follows this list).
- Maintain metadata catalogs, data dictionaries, and lineage tracking to support transparency and governance.
- Collaborate with data stewards and architects to enforce data governance policies and ensure alignment with the bank's overall data strategy.
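A minimal sketch of rule-based validation plus a simple anomaly screen in pandas; the column names and the 3-sigma threshold are illustrative, and dedicated frameworks (e.g., Great Expectations) would typically handle this at production scale.

```python
# Illustrative data-quality checks; column names and thresholds are hypothetical.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality issues found in df."""
    issues = []
    if df["txn_id"].duplicated().any():
        issues.append("duplicate transaction IDs")
    if df["amount"].isna().any():
        issues.append("missing amounts")
    if (df["amount"] <= 0).any():
        issues.append("non-positive amounts")
    # Simple anomaly screen: values more than 3 standard deviations from the mean.
    z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
    if (z.abs() > 3).any():
        issues.append("outlier amounts (>3 sigma)")
    return issues

sample = pd.DataFrame({"txn_id": ["a", "b", "b"], "amount": [100.0, None, -5.0]})
print(validate(sample))  # ['duplicate transaction IDs', 'missing amounts', ...]
```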
Role/person specification

Preferred Education
- Bachelor's degree in Computer Science, Software Engineering, Information Technology, Data Science, Computer Engineering, Mathematics, Statistics, or a related field (a Master's degree is an added advantage).
- Relevant professional certifications in data engineering, such as Google Cloud Data Engineer, Azure Data Engineer (DP-203), AWS Data Analytics Specialty, Databricks Data Engineer, Snowflake, Kafka, Kubernetes, Analytics, Machine Learning, Artificial Intelligence, and cloud platforms (GCP, AWS, Azure), are considered added advantages.
Preferred Experience
- At least 3–5 years' experience building data pipelines, working with big data and cloud platforms, managing real-time and warehouse data systems, and collaborating with cross-functional teams.
- Financial domain knowledge is an added advantage
Knowledge and Skills
- Technical Proficiency: Skilled in data modeling, ETL/ELT, big data tools, programming (Python, R, SQL), data visualization, and cloud platforms.
- Analytical & Problem-Solving: Able to manage complex datasets, optimize pipelines, and ensure data quality.
- Communication & Collaboration: Effective in documenting workflows and working with cross-functional teams.
Education
- Bachelor's Degree: Information Technology (Required)

Vacancy title: Data Engineer
Jobs at: Absa Bank
Deadline of this Job: Tuesday, July 15 2025
Duty Station: Kampala | Kampala | Uganda
Date Posted: Tuesday, July 8 2025
Base Salary: Not Disclosed
Work Hours: 8
Experience in Months: 36
Level of Education: Bachelor Degree

Job application procedure
Interested and qualified? Click Here to Apply