Job Title: Talend Data Engineer (Talend ETL, SQL, BigQuery)
Experience: 5+ years
Employment Type: Full Time
Job Mode: Fully Remote
Notice Period: Immediate to 15 Days
Key Skillsets: Talend ETL; design, development, and support of Talend data pipelines, warehouses, and reporting systems; data ingestion and ETL solutions; SQL; Microsoft Azure; GCP; BigQuery; Jira; Scrum and Agile methodology knowledge
About the job:
We are seeking a highly motivated and creative Talend Data Engineer who will raise the bar of our in-house R&I solution for centralizing experiment data and creating outputs that support researcher analysis (reporting, BI, AI). You will be working with internal teams, as well as external partners, to create cohesive end-to-end data ingestion and ETL solutions. You will work in an Agile squad with a product owner, business analyst, architect, data engineers, and full stack developers.
Key Responsibilities:
- Design, develop and support Talend data pipelines, warehouses and reporting systems to solve business needs.
- Analyse business requirements to propose pertinent technical solutions.
- Design data models and define optimized tables/views in BigQuery and SQL queries to improve performance or reduce costs.
- Actively contribute to the architecture of the solutions: propose new design patterns, solutions, technologies to improve existing applications or develop new ones.
- Ensure data security in all layers of the applications.
- Collaborate with squad members to define and evolve development best practices, e.g. source code control, reusability, and coding standards.
- Take ownership of the full implementation, including code quality, unit tests, integration tests, reliability engineering (monitoring and alerting), and solution performance.
Education and Experience:
- University degree in Computer Science or related field of study.
- 5+ years of experience as a Data Engineer.
- Strong competency in data manipulation with SQL (DDL and DML).
- Experience in SQL optimization.
- Proven expertise with the Talend platform and toolset.
- Proven expertise with Google Cloud Platform, specifically BigQuery.
- Proven experience with Agile and DevOps methodologies.
- Proficiency in Jira for task management and Confluence for documentation.
- Ability to troubleshoot issues/incidents.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Attention to detail and a commitment to delivering high-quality results.
- Familiarity with version control systems (e.g., Git).
Nice to have:
- Experience in the chemical industry.
- Familiarity with industry-specific regulations and standards (if applicable).
- Experience with the Microsoft Azure platform.
Share your resume with hr@marsdata.in