Darkroom

Data Engineer - Matter

Location

Remote restrictions apply

Salary Estimate

N/A

Seniority

N/A

Tech stacks

Data · Python · PostgreSQL (+17 more)

Permanent role · Posted a day ago

About The Role

We are seeking an experienced Senior Data Engineer. You’ll work closely with our CTO, CPO, and product teams to architect, build, and deliver robust data pipelines and transformations for our always-on AI platform. Success in this role will have a direct, measurable impact on business outcomes and future product velocity.

Key Responsibilities

  • Architect, build, and optimize scalable end-to-end data pipelines using dbt and our stack (Python microservices, Postgres, BigQuery, and GCP).
  • Design, implement, and maintain ETL/ELT processes to ingest, clean, and transform large datasets from a variety of sources.
  • Collaborate with product and engineering to define data requirements for new product features and analytics initiatives.
  • Manage our data pipeline by onboarding new clients, loading their data, and applying all required formulas.
  • Maintain and troubleshoot workflows by fixing broken formulas, adding new services, and ensuring reliable, high-throughput performance.
  • Ensure data reliability, integrity, and security at scale.
  • Troubleshoot, performance-tune, and document all pipelines and data workflows for smooth handoff.

About You

  • 5+ years of hands-on experience building production-grade data pipelines, ETL/ELT workflows, and transformed datasets at scale.
  • Expert-level Python; strong experience with dbt, Postgres, BigQuery, and core GCP data services (e.g., Dataflow, Pub/Sub, Storage, Composer).
  • Demonstrated experience architecting, optimizing, and troubleshooting cloud-based data infrastructure.
  • Familiarity with analytics, BI tools, and data visualization platforms is a plus.
  • Startup or contract/consulting experience strongly preferred; you move quickly and commit to clear deliverables and deadlines.
  • Excellent communication skills: you read requirements carefully, keep stakeholders in the loop, and document as you go. You can distill complex tasks into plain English that non-data engineers can understand.
  • Low-ego, high-ownership, and passionate about building product-focused solutions.

