EOX IT Services GmbH

DataOps Engineer (m/f/d)

Location: Remote restrictions apply
Salary Estimate: N/A
Seniority: N/A
Tech stacks: Data, Cloud, Python, +29
Permanent role, posted 3 days ago

Full/part-time, Vienna (Austria) or remote within Austria

We serve government agencies across Europe by monitoring agricultural parcels on a country-wide scale. Our data pipelines connect sensitive data and machine learning results to populate frontend applications and BI tools.

You’ll be a vital part of our Agri-Environmental Monitoring Solutions team, helping us implement and automate data processing pipelines, build and optimize Argo workflows, and maintain the underlying cloud deployments (IaC, GitOps).

This role combines data engineering with DevOps, and you'll work closely with our product owner and GIS engineers to deliver continuous value to our clients.

Technologies

Languages

  • YAML
  • Python
  • SQL

Data formats

  • Zarr
  • Parquet
  • SQLite
  • PostgreSQL

Frameworks

  • Django
  • Jupyter
  • Argo Workflows

Infrastructure

  • AWS
  • Kubernetes
  • Docker

Tools

  • Flux CD
  • Helm
  • Grafana
  • Prometheus
  • GitLab CI
  • Terraform

These challenges await you

  • Write and own data pipelines using Django models in Python (see the sketch after this list)
  • Load data into PostgreSQL databases and SQLite/Zarr archives
  • Automate data pipeline orchestration for deliveries to government agencies
  • Automatically validate data as early as possible (shift-left philosophy)
  • Configure data viewers
  • Set up, maintain, and improve cloud deployments of backend and frontend applications using GitOps principles
  • Update cloud infrastructure software (e.g. Kubernetes, IaC)
  • Set up, maintain, and improve CI/CD pipelines
  • Create alerts for long-term SLAs and adapt cloud resource allocation accordingly
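
To make the first two bullets more concrete, here is a minimal, self-contained sketch of a Django-model-based load step with early (shift-left) validation. It is purely illustrative, not EOX code: the ParcelResult model, its fields, and the in-memory SQLite database are hypothetical stand-ins for the real schemas and the PostgreSQL/Zarr targets.

# Minimal standalone Django setup so the sketch runs outside a real project;
# in production such a model would live inside the team's existing Django app.
import django
from django.conf import settings

settings.configure(
    INSTALLED_APPS=[],
    DATABASES={"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": ":memory:"}},
)
django.setup()

from django.db import connection, models


class ParcelResult(models.Model):
    """Hypothetical per-parcel machine-learning result served to frontends/BI tools."""
    parcel_id = models.CharField(max_length=32)
    crop_class = models.CharField(max_length=64)
    confidence = models.FloatField()

    class Meta:
        app_label = "monitoring"  # required for a model defined outside an installed app


def load_results(rows):
    """Shift-left validation: reject bad rows before they reach the database."""
    valid = [r for r in rows if 0.0 <= r["confidence"] <= 1.0]
    ParcelResult.objects.bulk_create(ParcelResult(**r) for r in valid)
    return len(valid)


if __name__ == "__main__":
    # Create the table for this ad-hoc model, then run a tiny example load.
    with connection.schema_editor() as editor:
        editor.create_model(ParcelResult)
    loaded = load_results([
        {"parcel_id": "AT-0001", "crop_class": "maize", "confidence": 0.93},
        {"parcel_id": "AT-0002", "crop_class": "grassland", "confidence": 1.70},  # rejected
    ])
    print(f"loaded {loaded} valid rows")

In practice the same load-and-validate pattern would presumably target the team's PostgreSQL databases or Zarr archives and run as a step inside an Argo workflow rather than ad hoc.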

What excites us about you

  • Computer science or equivalent degree
  • 3+ years of experience designing, implementing, and maintaining complex data pipelines
  • 3+ years of experience in DevOps with Kubernetes using GitOps principles
  • You have hands-on experience designing or maintaining Argo Workflows or similar workflow engines
  • You are familiar with Python and with the usual software structures and practices for writing and maintaining high-quality code
  • You are comfortable using Git and CI/CD tools and working in Unix-like environments
  • Experience with databases handling millions of records
  • You investigate and fix problems as they arise and feel comfortable reporting to domain experts and non-technical clients via GitLab issues or virtual meetings
  • You are open to working in a remote environment, mostly meeting your team colleagues in virtual meetings (in-person team days twice per quarter, Vienna office open every day)
  • You contribute your own perspective to the crafting and execution of quarterly team objectives, led by the team lead

Please apply even if you don’t match these requirements 100%. We would still like to get to know you and see whether you would be a good fit for our team.

Nice to have

  • Experience with geospatial or remote sensing data
  • Familiarity with cloud-native Earth observation (EO) processing
  • Background in the agriculture & environment domain

What you can look forward to

  • Exciting challenges in a small, dynamic, and international team
  • Fringe benefits (monthly contribution to a private retirement fund, public transport ticket, restaurant and supermarket vouchers, various social and group events such as company retreats and family activities)
  • Flexible working conditions (working hours, working location)
  • An up-and-coming industry with an international community
  • Yearly salary starting from €45,000 (Austrian residents, depending on experience)

Would you like to join the EOX team as a DataOps Engineer?

We are looking forward to hearing from you! For this position, please contact our CEO Stephan Meißl via stephan.meissl+dataops-engineer@eox.at or Crew Captain Stefan Brand via stefan.brand+dataops-engineer@eox.at.

