We are seeking a highly skilled Senior Java Software Engineer to join our team. The ideal candidate will have extensive experience developing API-based services with Java and Spring Boot, and building microservices and event-driven systems using technologies such as Kafka, Apache Beam, Flink, and AWS. This role focuses on building and optimising high-performance systems, integrating data streaming pipelines, and ensuring strong testing and security practices.
Key Responsibilities:
- Build and maintain API-based services (REST and GraphQL) using Java and Spring Boot, secured with OAuth 2.
- Work extensively with Java and Spring Boot to deliver API services for integrations.
- Design and develop event-driven architectures utilising Kafka and Apache Beam or Flink.
- Design Kafka Avro schemas and manage data serialisation processes.
- Develop Kafka streaming applications that act as both producers and consumers to ensure seamless data flow.
- Containerise applications using Docker for efficient deployment and scalability.
- Implement database solutions using SQL databases as targets for data pipelines.
- Perform thorough unit, functional, integration, and non-functional testing following TDD/BDD principles.
- Participate actively in Agile ceremonies, including grooming user stories, sprint planning, and estimation.
Must Have:
- 5+ years of experience with Java, Spring Boot, and Docker, including API-based service development (REST, GraphQL) secured with OAuth 2.
- Expertise in Kafka, plus Apache Beam or Flink, for event-driven design.
- Strong experience in Kafka Avro schema design and in building Kafka streaming applications using Apache Beam or Flink.
- A strong testing mindset with expertise in TDD/BDD.
- Experience in building and optimising data streaming pipelines for both batch and streaming use cases.
- Knowledge of SQL databases such as PostgreSQL.
- Familiarity with Kubernetes (K8s) and Amazon EKS.
- Experience with AWS services for cloud deployment and management.
- Understanding of data security practices, including encryption and ACLs on Kafka topics.
- Experience in Kafka optimisation techniques (e.g., Kafka compression).
- Knowledge of observability practices, including metrics, tracing, and log generation.
Nice to Have:
- Experience with Kubernetes deployment tooling such as Argo CD, GitOps workflows, and Helm charts.
- Familiarity with Artifactory for artefact management.
- Exposure to DevSecOps tools (e.g., Snyk for vulnerability scanning).
- Experience with GitHub Actions for CI/CD pipelines.
- Knowledge of observability tools such as AWS CloudWatch, ELK, Grafana, and Prometheus.
- Experience with Kafka load testing to ensure system reliability.
Kindly note that experience with either Apache Beam or Apache Flink is mandatory.