1. INTRODUCTION
a. About WAO Hire:
WAO Hire is a collective of tech recruiters and developers. We help fast-growing companies connect with talented developers.
We take a humanized approach to hiring and working with developers, because we know that's the best way to match businesses with the best talent.
2. RESPONSIBILITIES:
We are a lean team based in NYC looking for a part-time software engineer to help us manage data pipelines, dashboards, and lightweight AI tasks. This role is ideal for a junior or mid-level engineer who is smart, motivated, and able to work independently.
1. Dune Analytics (SQL Dashboards)
- Write and maintain SQL queries using Dune Analytics
- Optimize and troubleshoot dashboard performance
- Maintain documentation for internal query logic
2. Data Pipelines (Python + APIs)
- Write scripts to pull data from external APIs (e.g., Binance, CoinGecko)
- Format and load data into PostgreSQL (on Heroku)
- Schedule jobs using Heroku Scheduler or GitHub Actions
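A pipeline of the kind described above might look like the following minimal sketch. CoinGecko's `/simple/price` endpoint is real, but the table name, columns, and coin choices are illustrative assumptions, not part of this posting:

```python
import json
import urllib.request

COINGECKO_URL = "https://api.coingecko.com/api/v3/simple/price"

def fetch_prices(ids, vs_currency="usd"):
    """Pull current prices for a list of coin ids from CoinGecko."""
    query = f"?ids={','.join(ids)}&vs_currencies={vs_currency}"
    with urllib.request.urlopen(COINGECKO_URL + query, timeout=10) as resp:
        return json.load(resp)

def rows_from_payload(payload, vs_currency="usd"):
    """Flatten CoinGecko's {coin: {currency: price}} payload into (coin, price) rows."""
    return [(coin, prices[vs_currency]) for coin, prices in sorted(payload.items())]

# Loading into PostgreSQL on Heroku (psycopg2 is a third-party driver; the
# table name "prices" is a placeholder):
# import os, psycopg2
# rows = rows_from_payload(fetch_prices(["bitcoin", "ethereum"]))
# with psycopg2.connect(os.environ["DATABASE_URL"]) as conn, conn.cursor() as cur:
#     cur.executemany("INSERT INTO prices (coin, usd) VALUES (%s, %s)", rows)
```

A script like this can then be wired to Heroku Scheduler or a GitHub Actions cron trigger, as the list above describes.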
3. Dashboard Support (Metabase)
- Ensure data is correctly structured and available in Metabase
- Maintain clean joins, naming conventions, and ad hoc metrics
- Provide lightweight support for building or editing dashboards
4. Messaging Integrations (Slack / Telegram)
- Build webhook or bot-based updates to send data summaries to Slack or Telegram
- Format messages and manage auth tokens
- Monitor and resolve failures
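For the Slack side, a webhook-based update can be sketched roughly as below. Slack's incoming-webhook JSON shape (`{"text": ...}`) is real; the summary format and environment variable name are assumptions for illustration:

```python
import json
import os
import urllib.request

def format_summary(metrics):
    """Render a metrics dict as a simple Slack-friendly text block."""
    lines = ["*Daily summary*"]
    for name, value in sorted(metrics.items()):
        lines.append(f"- {name}: {value}")
    return "\n".join(lines)

def post_to_slack(text, webhook_url=None):
    """Send a message via an incoming-webhook URL kept in the environment."""
    # Keep the webhook URL out of the codebase; read it from config instead.
    url = webhook_url or os.environ["SLACK_WEBHOOK_URL"]
    body = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        if resp.status != 200:
            raise RuntimeError(f"Slack webhook failed: {resp.status}")
```

Monitoring failures then reduces to catching and logging the raised exception in the scheduled job.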
5. AI & Model Support (Optional)
- Work with OpenAI models and Grok for summarizing investment memos
- Assist with prompt engineering and lightweight fine-tuning using internal scoring data
- Help integrate outputs into Slack or internal tools
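The memo-summarization work could start from a prompt builder like this sketch. The prompt wording, truncation limit, and the commented model call (the `openai` package and model name) are assumptions, not details from this posting:

```python
def build_memo_prompt(memo_text, max_chars=4000):
    """Assemble a summarization prompt for an investment memo.

    The memo is truncated to a character budget as a crude guard
    against exceeding the model's context window.
    """
    return (
        "Summarize the following investment memo in three bullet points, "
        "then give a 1-10 conviction score with one sentence of "
        "justification.\n\n" + memo_text[:max_chars]
    )

# Calling the model with the OpenAI Python SDK (illustrative):
# from openai import OpenAI
# client = OpenAI()  # reads OPENAI_API_KEY from the environment
# resp = client.chat.completions.create(
#     model="gpt-4o",
#     messages=[{"role": "user", "content": build_memo_prompt(memo)}],
# )
# summary = resp.choices[0].message.content  # then post to Slack / internal tools
```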
About Our Stack and Workflow
- Database: PostgreSQL hosted on Heroku
- Scheduling: Heroku Scheduler for all ETL jobs
- Data Flow: Fully script-based pipeline, no dbt or Airflow
- Dashboards: 10 internal Metabase dashboards, no public or embedded charts
- APIs: Data pulled regularly from Binance, CoinGecko, Artemis, Dune (token-based auth)
- AI Usage: GPT-4 used to score and summarize custom investment data; some fine-tuning completed; outputs integrated into Slack
- Dev Tools: Codebase on GitHub, simple deploy to Heroku, communication via Slack, docs in Notion
3. REQUIREMENTS
- Minimum 5 years of hands-on data engineering experience
- Experience working with AI technologies
- Fluent in English; able to write clearly and communicate asynchronously
- Strong academic background (good university, strong grades preferred)
- High attention to detail and data integrity
- Able to work independently with minimal supervision
- Strong sense of responsibility, consistent follow-through, and responsiveness
- Curious, resourceful, and eager to learn new tools or workflows
Tech Stack:
- SQL (Dune Analytics, PostgreSQL)
- Python (APIs, data pipelines, Metabase integration)
- GitHub (version control)
- Slack / Telegram (API integrations)
- Basic AI/ML (OpenAI-based summarization and tagging)
4. BENEFITS:
As a member of our team, you will enjoy a youthful, flexible, and creative work environment
- Work 10–20 hours/week, at your convenience
- Salary: $1,500–$2,000
5. WORKING TIME: 10–20 hours/week
6. INTERVIEW PROCESS:
- 1st round: Assignment test
- 2nd round: Technical interview + Culture fit
7. LOCATION: Remote