Svitla Systems Inc. is looking for a Senior Software Engineer with Data Background for a full-time position (40 hours per week) in India.
Our client is a cloud platform for business spend management (BSM) that helps organizations manage their spending, procurement, invoicing, expenses, and supplier relationships. They provide a unified, cloud-based spend management platform that connects hundreds of organizations representing the Americas, EMEA, and APAC with millions of suppliers globally. The platform provides greater visibility into and control over how companies spend money. Small, medium, and large customers have used the platform to bring billions of dollars in cumulative spending under management.
The company offers a comprehensive platform that helps organizations manage their spending, procurement, invoicing, expenses, and supplier relationships. Founded in 2006 and headquartered in San Mateo, California, they aim to streamline and optimize business processes, driving efficiency and cost savings.
Requirements
- 4–8 years of experience in building and maintaining data pipelines for enterprise/SaaS applications.
- Strong knowledge of Python.
- Solid understanding of relational SQL and query optimization.
- Experience designing and implementing ETL workflows and data transformation processes (PySpark or similar libraries for ETL/data transformation).
- Deep knowledge of Kafka (or similar pub/sub systems) for data streaming.
- Strong experience with Apache Airflow or similar tools to schedule, monitor, and manage complex data pipelines.
- Experience with AWS cloud services (data storage, compute, and managed services).
- Understanding of how to integrate datasets into BI/reporting tools (Tableau, Power BI, or QuickSight).
- Experience with CI/CD tooling for data pipeline deployment.
Nice to have
- Familiarity with AI/ML-based data cleansing, deduplication, and entity resolution techniques.
- Familiarity with microservices and event-driven architecture.
- Knowledge of performance tuning and monitoring tools for data workflows.
Responsibilities
- Implement a cloud-native analytics platform with high performance and scalability.
- Build an API-first infrastructure for data in and data out.
- Build data ingestion capabilities for platform data as well as external spend data.
- Leverage data classification AI algorithms to cleanse and harmonize data.
- Own data modeling, microservice orchestration, and monitoring and alerting.
- Develop comprehensive expertise in the entire application suite and leverage this knowledge to design more effective applications and data frameworks.
- Adhere to iterative development processes to deliver concrete value each release while driving longer-term technical vision.
- Collaborate with cross-organizational teams, including product management, integrations, services, support, and operations, to ensure the overall success of software development, implementation, and deployment.