Key Accountabilities/Key Activities:
- Own the data technology stack end to end, including its design, development, and operation.
- Lead the implementation of the data platforms to be adopted by the business.
- Conduct regular assessments to maintain quality and keep the technology aligned with industry standards.
- Develop the automated data pipelines needed to move data from operational systems into the data warehouse.
- Develop a sound data architecture model to support the business's growth strategy.
- Maintain data-related documentation and proactively run knowledge-sharing sessions with colleagues, possibly including training sessions.
- Use data analytics platforms to help develop the reports and real-time dashboards needed to understand the performance of insights and forecasts.
- Continuously review data pipelines to maintain the highest level of data quality.
- Support data-related queries and investigate any data quality issues raised from time to time.
- Follow and apply best practices when building the data streaming framework.
- Maintain transparent communication within the team and with other departments across the company.
- Participate in brainstorming meetings to discuss existing issues and new ideas.
- Work on proofs of concept for selected technologies that have yet to be adopted.
Technical/Professional Expertise:
- Candidates must hold a degree in Information Technology.
- At least three years' hands-on experience as a Data Engineer.
- Experience in the selection and implementation of new data platforms.
- Extensive experience with cloud-based data warehouses, ETL, integration and visualization tools, and data modelling.
- Comfortable working within an agile delivery framework.
- A qualification or certification focused on data engineering would be an asset.
- Thorough knowledge of Scala, Python, or Java.
- Familiarity with orchestration tools such as Talend and Airflow.
- Cloud computing experience on AWS, Microsoft Azure, or GCP.
- Preferably has experience with SQL, SSIS, Power BI, and Azure technologies (Azure SQL, Azure Data Factory, Data Lake, Databricks).
- Stream-processing experience with technologies such as Kinesis Streams, Pulsar, Kafka, Apache NiFi, Flink, and Spark.
- Experience with the ELK Stack or similar technologies.
- Comfortable working in a fast-paced environment while interacting with a variety of disciplines.
- Proven ability to prioritize, deal with problems systematically, and communicate clearly about issues, resolution times, and solutions.