Overview

We are recruiting!

Our client in Johannesburg, South Africa, is looking for an ICT Data Engineer to join their team within the Information and Communications Technology industry.

The purpose of the Data Engineer role is to support the development and maintenance of data pipelines and systems within our data environment. This role is essential for ensuring smooth data operations by assisting in data extraction, transformation, and loading (ETL) processes. The Data Engineer will work with tools such as Matillion for ETL, Snowflake for data warehousing, and Power BI for data visualization, gaining valuable experience and contributing to data-driven decision-making. The ideal candidate will have a foundational understanding of API integration and data security practices, and will be eager to grow their skills in this cloud-based environment.

  • Communication: Articulate ideas clearly, listen actively, and adapt communication styles to engage diverse stakeholders effectively.
  • Problem-solving: Analyse complex issues, identify root causes, and develop innovative solutions to overcome challenges.
  • Strategic Thinking: Anticipate future trends, evaluate risks, and formulate long-term plans aligned with organisational objectives.
  • Adaptability: Embrace change, remain flexible, and thrive in dynamic environments by quickly adjusting strategies and priorities.
  • Decision-making: Make informed decisions based on data, analysis, and intuition, considering both short-term impacts and long-term implications.
  • Collaboration: Build strong relationships, leverage diverse perspectives, and work effectively in cross-functional teams to achieve shared goals.
  • Emotional Intelligence: Demonstrate empathy, self-awareness, and resilience in managing interpersonal dynamics and navigating challenging situations.
  • Innovation: Encourage creativity, experiment with new ideas, and continuously seek opportunities for improvement and growth.
  • Time Management: Prioritise tasks, manage resources efficiently, and maintain focus on key objectives to meet deadlines and deliver results effectively.
  • ETL Development and Maintenance:
    • Assist in designing and implementing ETL processes using Matillion to manage data flow from various sources to Snowflake.
    • Perform data validation and transformation tasks to ensure data quality and consistency.
    • Support troubleshooting and optimization of ETL workflows to enhance performance and reliability.
  • Data Warehouse Support:
    • Help manage and maintain the Snowflake data warehouse, ensuring efficient data storage and retrieval.
    • Assist in schema design and data modelling to support business intelligence initiatives.
    • Participate in data partitioning, clustering, and indexing tasks to optimize data warehouse performance.
  • API Integration:
    • Collaborate with other technical teams to develop and utilize APIs for data integration between external and internal systems.
    • Assist in testing and documenting API endpoints to ensure seamless data flow and accessibility.
  • Data Security and Compliance:
    • Learn and apply security authentication and authorization protocols to safeguard sensitive data.
    • Support compliance with data governance standards and industry regulations to maintain data privacy and security.
  • Data Visualization and Reporting:
    • Collaborate with business users to develop reports and dashboards using Microsoft Power BI.
    • Apply a basic understanding of UI/UX best practices when building reports and dashboards.
    • Assist in analysing data to extract insights and support informed decision-making processes.
  • Collaboration and Learning:
    • Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.
    • Engage in continuous learning and development to expand technical skills and industry knowledge.
    • Seek mentorship and guidance from senior team members to grow professionally and contribute effectively.
  • Daily & Monthly Housekeeping:
    • Monitor ETL workflows, ensure data quality and system performance, and maintain security by verifying job success and tracking metrics across Matillion, AWS, and Snowflake. Optimise the current platform to ensure data correctness for reporting.
  • Education:
    • Bachelor’s degree/Diploma in Computer Science, Information Technology, Data Science, or a related field.
    • Relevant coursework or certifications in AWS, Azure, data engineering, or data analytics are a plus.
  • Experience:
    • 3-5 years of experience in a data engineering role.
    • Familiarity with Matillion for ETL development and Snowflake as a data warehouse platform is preferred.
    • Familiarity with AWS, ETL processes, and data warehousing is preferred.
    • Basic understanding of API development and integration practices.
  • Technical Skills:
    • Experience with AWS/Azure is desirable.
    • Proficiency in SQL for data manipulation and querying.
    • Exposure to programming languages such as Python or JavaScript is advantageous.
    • Basic knowledge of Microsoft Power BI for data visualization and reporting.
    • Familiarity with DevOps practices, including Git and CI/CD pipelines, for data engineering projects.
    • Understanding of data security principles and practices.