Your new company
A “Tech-first” digital bank venture backed by leading powerhouses.
Your new role
You are expected to play a central role in the build and strategy of the business’s data platform. In addition, you will be expected to work closely with the business across a broad range of domains to deliver business critical data pipelines within the bank. At the same time, you'd be able to work on and solve some of the many interesting challenges, learn new ways of working, and build delightful high-quality products for customers. You will work in project-based sprints in small, interdisciplinary teams.
- Build and manage data pipelines in support of key data and analytics use cases.
- Contribute to the build and maintenance of the data platform – Kafka, RDS Postgres, Apache Airflow.
- Take part in the continuous evolution and improvement of the software development team processes and practices.
- Be an active contributor to the team's performance and the achievement of its outcomes.
What you’ll need in order to succeed
The ideal candidate is a hands-on technologist who is passionate about software engineering and a proven subject matter expert in data engineering. Thrilled to join at the build phase, you are someone who has ideas and wants a platform to implement them in an innovative, tech-first environment.
- At least 7 years' experience in the data management and engineering discipline.
- You are a software engineer at heart and the go-to person for all things data engineering and its latest tech/trends/architecture.
- Proven development experience in Python, SQL and Spark.
- The ability to work with both IT and business in integrating data pipelines to automate key business functions.
- Strong experience building scalable data pipelines.
- Good working knowledge of Git and CI/CD pipelines.
- Excellent communication and interpersonal skills to collaborate within a team.
Bonus if you have:
- DevOps experience managing and maintaining Apache Kafka, RDS Postgres and/or Apache Airflow.
- A proficient understanding of Kubernetes and of AWS or other cloud providers.
- A high-level understanding of how to build and deploy batch and real-time machine learning models.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV. If this job isn't quite right for you but you are looking for a new position within Data & AI, please contact Daen Huang at +65 63030158 or email daen.huang@hays.com.sg for a confidential discussion on your career.
Hays Registration Number: 200609504D, EA License: 07C3924, Registration ID Number: R1658977 #1207628