Full/part-time, Vienna (Austria) or remote within Austria
We serve government agencies across Europe by monitoring agricultural parcels at country-wide scale. Our data pipelines combine sensitive data with machine learning results to populate frontend applications and BI tools.
You’ll be a vital part of our Agri-Environmental Monitoring Solutions team, helping us implement and automate data processing pipelines, build and optimize Argo workflows, and maintain the underlying cloud deployments (IaC, GitOps).
This role combines data engineering with DevOps, and you'll work closely with our product owner and GIS engineers on delivering continuous value to our clients.
Technologies
Languages
Data formats
- Zarr
- Parquet
- SQLite
- PostgreSQL
Frameworks
- Django
- Jupyter
- Argo Workflows
Infrastructure
Tools
- Flux CD
- Helm
- Grafana
- Prometheus
- GitLab CI
- Terraform
These challenges await you
- Write and own data pipelines using Django models in Python
- Load data into PostgreSQL databases and SQLite/Zarr archives
- Automate data pipeline orchestration for deliveries to government agencies
- Automatically validate data as early as possible (shift-left philosophy)
- Configure data viewers
- Set up, maintain, and improve cloud deployments of backend and frontend applications using GitOps principles
- Update cloud infrastructure software (e.g. Kubernetes, IaC)
- Set up, maintain, and improve CI/CD pipelines
- Create alerts for long-term SLAs and adapt cloud resource allocation accordingly
What excites us about you
- A degree in computer science or an equivalent qualification
- 3+ years of experience designing, implementing, and maintaining complex data pipelines
- 3+ years of experience in DevOps with Kubernetes using GitOps principles
- You have hands-on experience designing or maintaining Argo Workflows or similar workflow engines
- You are familiar with the Python programming language and with the common patterns and practices for writing and maintaining high-quality code
- You are comfortable using Git and CI/CD tools and working in Unix-like environments
- Experience with databases handling millions of records
- You investigate and fix the problems you encounter and feel comfortable reporting to domain experts and non-technical clients via GitLab issues or virtual meetings
- You are open to working remotely, meeting your team colleagues mostly in virtual meetings (in-person team days twice per quarter; the Vienna office is open every day)
- You contribute your own perspective to the crafting and execution of quarterly team objectives, guided by the team lead
Please apply even if you don’t match these requirements 100%. We would still like to get to know you and see whether you would be a good fit for our team.
Nice to have
- Experience with geospatial or remote sensing data
- Familiarity with cloud-native Earth observation (EO) processing
- Background in the agriculture & environment domain
What you can look forward to
- Exciting challenges in a small, dynamic, and international team
- Fringe benefits (monthly contribution to a private retirement fund, public transport ticket, restaurant and supermarket vouchers, various social and group events such as company retreats and family activities)
- Flexible working conditions (working hours, working location)
- An up-and-coming industry with an international community
- Yearly salary starting from €45,000 (for Austrian residents, depending on experience)
Would you like to join the EOX team as a DataOps Engineer?
We look forward to hearing from you! For this position, please contact our CEO Stephan Meißl via stephan.meissl+dataops-engineer@eox.at or Crew Captain Stefan Brand via stefan.brand+dataops-engineer@eox.at.