About the Role
As a Data Engineer at Wagmi Studio, you will build the pipelines that turn raw chain events and sensor streams into the analytics, indexes, and real-time dashboards our clients depend on. You'll work directly with senior blockchain engineers and IoT specialists to design reliable, low-latency data infrastructure that holds up in production.
Responsibilities:
- Design, build, and operate end-to-end data pipelines for on-chain and IoT data
- Develop and maintain subgraphs (The Graph) and custom indexers (e.g., Ponder or in-house RPC indexing)
- Model multi-chain datasets for dashboards, reporting, and client APIs
- Own data quality, observability, and alerting across production pipelines
- Integrate sensor and device telemetry with blockchain-anchored state
- Partner with backend and dApp engineers on schema and contract design
Requirements:
- 4+ years of professional data engineering experience
- Strong SQL and at least one of: Python, TypeScript, Go, or Rust
- Production experience with Postgres, Kafka, Redis, and ETL orchestration
- Hands-on familiarity with at least one blockchain data stack (The Graph, Dune, Flipside, or custom RPC indexing)
- Comfort operating infrastructure (Docker, one major cloud — AWS, GCP, or Cloudflare)
- An engineering-first, test-driven mindset
Nice to Have:
- Experience with IoT or edge-to-cloud telemetry pipelines
- Familiarity with EVM internals (logs, traces, receipts)
- Prior work with zero-knowledge rollups or L2 data availability layers
- Experience with streaming analytics (Flink, ksqlDB, Materialize)
Benefits:
- Top-of-market full-time salary aligned with senior engineering rates
- Fully remote — work from anywhere
- 25 days paid time off plus local public holidays
- Annual home-office setup stipend
- Annual learning & conference budget (Devcon, EthCC, and more)
- Supplemental healthcare contribution