This is us
At Qinshift and Avenga, we are merging to start a new era of technology that matters. Leveraging the power of innovation, we are on a journey to shape the future of work, and we are inviting you to co-create it with us.
This is you
- 6–8 years of professional experience as a Software Engineer, with strong Python expertise.
- Proven track record of delivering production-grade data/ML services on public cloud platforms (GCP preferred, AWS/Azure also acceptable).
- Deep hands-on experience with orchestration frameworks like Airflow, Dagster, or Prefect.
- Strong knowledge of vector databases and similarity-search tools such as FAISS, Milvus, or Pinecone; experience with Google Vertex AI Vector Search is ideal.
- Solid understanding of the LLM ecosystem: model families (GPT-4o, Gemma 2, Mistral), prompt engineering, embeddings, fine-tuning with LoRA/PEFT, and RAG patterns.
- Experience integrating with LLM APIs (OpenAI, Google GenAI / Vertex AI, or Anthropic) via REST or gRPC.
- Strong grasp of software architecture patterns: event-driven design, microservices, Pub/Sub.
- Comfortable with CI/CD pipelines (GitHub Actions, Cloud Build) and infrastructure-as-code tools like Terraform.
- BS/MS in Computer Science, Electrical Engineering, or equivalent practical experience.
- English level: Upper-intermediate or above.
- Must be based in Argentina.
Nice-to-have skills:
- Experience deploying workloads on Vertex AI Vector Search or Google foundation models (PaLM 2, Gemini Pro).
- Familiarity with serverless and event-driven architectures.
- Exposure to ad-tech data (bid requests, impression logs, CTV graphs).
- Experience with real-time data processing frameworks.
This is your role
- Designing and building core data services that ingest, vectorize, and retrieve large-scale first-party data signals.
- Partnering with data scientists to operationalize open-source and commercial LLMs (e.g., OpenAI, Gemini, Claude) on Google Cloud Platform.
- Implementing scalable, performant ETL/ELT pipelines that enable downstream machine learning and analytics workloads.
- Developing intelligent systems that support RAG-based architectures, semantic search, and campaign personalization.
- Contributing to platform automation and MLOps best practices through robust CI/CD pipelines and infrastructure as code.
- Supporting a modern, cloud-native data architecture aligned with real-time business needs.