Job Scope:
In this role, you will develop, maintain, test and evaluate data platform solutions within our clients' next-generation Data and Analytics Platform.
Job Responsibilities:
- Build and enhance data platform solutions, including data virtualization, data visualization and data lakes
- Conduct proof-of-concept and acceptance testing for data platform components
- Develop services and integrate them with internal and external APIs
- Identify and evaluate new technologies that improve the performance, maintainability and scalability of our infrastructure
- Work closely with Architects, Platform Engineers and Product Owners to deliver software in a continuous delivery environment
Job Requirements:
- University degree in Computer Science, Software Engineering or a related field
- At least 3 years of experience in data platform development
- Experience delivering large-scale data infrastructure (at the scale of hundreds of terabytes) is highly desired
- Experience in system design and Infrastructure as Code (IaC) is a must
- Experience designing, implementing and managing DevOps pipelines in a large-scale enterprise environment is preferred
- Experience with public cloud platforms (AWS, Azure, AliCloud); relevant certifications are a plus
- Experience implementing data security capabilities such as encryption and anonymization is a plus
- Experience with Bash scripting for automation and system integration
- Experience with the following technology stack:
  - Programming: any one of Java, Node.js or Python
  - Scripting: Bash or PowerShell
  - CI/CD suite: Jenkins, JIRA, Sonar, Git, Ansible
  - Test frameworks: JMeter, JUnit, pytest
  - Data engineering: Airflow, Spark
  - Data lake frameworks: Apache Hudi / Iceberg / Delta Lake
  - Infrastructure: Docker, Terraform, Kubernetes
- Experience working in an international team, with a solid grounding in financial services, is a plus