Job Title: Data Engineer - Crypto Market Data Infrastructure
Job Description:
Build and operate a robust, low-latency, fault-tolerant market data platform that powers trading and analytics. Within our broader data engineering unit, you will champion standardization and reuse, delivering clean, consistent crypto market data and reusable APIs and blueprints that accelerate teams across the organization.
Design, build, and maintain a real-time market data pipeline. Aggregate order books, trades, and funding data from multiple exchanges into a single standardized feed (a schema sketch follows this overview), and harden it with redundancy, rigorous error handling, and validation to ensure reliability. Apply TDD and automation across ingestion, transformation, and storage.
Develop monitoring and alerts for data quality, latency, and system health.
Define reusable APIs, data contracts, and platform blueprints. Collaborate closely with fellow data engineers, developers, quants, and traders, sharing best practices, contributing to unit-wide standards, and ensuring seamless integration into execution and analytics systems.
Continuously document and improve the stack.
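To make "a single standardized feed" concrete, here is a minimal sketch of a unified tick schema and one per-exchange adapter. All names here (NormalizedTrade, the Binance-style field mapping) are illustrative assumptions, not the platform's actual contract:

```python
from dataclasses import dataclass
from decimal import Decimal

# Hypothetical unified schema that every per-exchange adapter maps into.
@dataclass(frozen=True)
class NormalizedTrade:
    exchange: str        # e.g. "binance", "coinbase"
    symbol: str          # unified symbol, e.g. "BTC-USDT"
    price: Decimal
    quantity: Decimal
    side: str            # taker side: "buy" or "sell"
    exchange_ts_ms: int  # event time reported by the exchange
    ingest_ts_ms: int    # time we received the message

def parse_binance_style_trade(msg: dict, ingest_ts_ms: int) -> NormalizedTrade:
    """Map a Binance-style trade payload onto the unified schema
    (field names follow Binance's public trade stream)."""
    return NormalizedTrade(
        exchange="binance",
        symbol=msg["s"].replace("USDT", "-USDT"),  # crude symbol mapping for the sketch
        price=Decimal(msg["p"]),
        quantity=Decimal(msg["q"]),
        side="sell" if msg["m"] else "buy",  # "m": buyer is the market maker
        exchange_ts_ms=int(msg["T"]),
        ingest_ts_ms=ingest_ts_ms,
    )
```

Each exchange gets its own thin adapter like this; everything downstream consumes only the unified type.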
Main Responsibilities
Data Engineering & Market Data
Design, build, and maintain a robust, low-latency, fault-tolerant market data pipeline.
Aggregate order books, trades, and funding data from multiple crypto exchanges into a single standardized feed.
Implement redundancy, error handling, and data validation mechanisms to ensure high reliability of live data.
Develop monitoring tools and alerts for data quality, latency, and system health (a minimal metrics sketch follows this list).
Work closely with developers, quants, and traders to ensure seamless integration of data into execution and analytics systems.
Document and continuously improve data ingestion, transformation, and storage processes.
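As one illustration of the monitoring bullet above, a sketch using the prometheus_client library; the metric names, labels, and validation rules are assumptions for the example, not prescribed standards:

```python
import time
from prometheus_client import Counter, Gauge, start_http_server

# Hypothetical metrics: exchange-to-ingest lag and rejected-tick counts.
FEED_LAG_MS = Gauge("feed_lag_ms", "Exchange event time to ingest lag", ["exchange"])
REJECTED_TICKS = Counter(
    "rejected_ticks_total", "Ticks failing validation", ["exchange", "reason"]
)

def observe_tick(exchange: str, exchange_ts_ms: int) -> None:
    # How far behind the exchange's event clock are we right now?
    FEED_LAG_MS.labels(exchange=exchange).set(time.time() * 1000 - exchange_ts_ms)

def validate_tick(exchange: str, price: float, qty: float) -> bool:
    # Reject obviously bad ticks and record why, so alerts can fire on rates.
    if price <= 0:
        REJECTED_TICKS.labels(exchange=exchange, reason="bad_price").inc()
        return False
    if qty <= 0:
        REJECTED_TICKS.labels(exchange=exchange, reason="bad_qty").inc()
        return False
    return True

if __name__ == "__main__":
    start_http_server(9100)  # expose /metrics for Prometheus to scrape
```

Alerting rules (for example, on sustained feed_lag_ms) would then live in Prometheus/Grafana rather than in pipeline code.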
Strategic Collaboration & Business Alignment
Partner with trading desks, quantitative teams, and risk functions to translate business needs into data solutions that enhance decision-making and operational efficiency.
Act as a senior liaison between engineering and business stakeholders, ensuring alignment on data priorities and delivery timelines.
Prioritize a value-based backlog (e.g., faster close/settlement, improved forecast accuracy, reduced balancing penalties) and measure business impact.
Align data models and domain ownership with business processes (bids/offers, nominations, positions, exposures, outages).
Liaise with Cybersecurity, Compliance, and Legal on sector-specific controls (e.g., REMIT/NERC-CIP considerations, data retention, segregation).
Innovation & Product Development
Incubate and industrialize data products: curated marts, feature stores, real-time decision APIs, and event streams for forecasting and optimization.
Introduce modern patterns (CDC, schema evolution, Delta/Iceberg, stream–batch unification) to improve freshness and resilience; a CDC merge sketch follows this list.
Evaluate and integrate external data (weather, fundamentals, congestion, capacity postings) and internal and external vendor systems (e.g., ETRM) safely and at scale.
Collaborate with quantitative analysts to productionize ML pipelines (forecasting load/renewables, anomaly detection, etc.) with monitoring and rollback.
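For the CDC point above, a minimal sketch of applying a change batch idempotently to a Delta table with MERGE (PySpark with delta-spark). The paths, column names, and the "op" code convention are assumptions for illustration only:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# Spark session configured for Delta Lake.
spark = (
    SparkSession.builder.appName("cdc-apply")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical paths: a curated positions table and one CDC micro-batch.
target = DeltaTable.forPath(spark, "/data/curated/positions")
changes = spark.read.format("json").load("/data/raw/positions_cdc/batch_42")

(
    target.alias("t")
    .merge(changes.alias("s"), "t.position_id = s.position_id")
    .whenMatchedDelete(condition="s.op = 'd'")      # tombstones delete the row
    .whenMatchedUpdateAll(condition="s.op = 'u'")   # updates overwrite matched rows
    .whenNotMatchedInsertAll(condition="s.op = 'i'")
    .execute()
)
```

Because MERGE keys on position_id, replaying the same batch converges to the same table state, which is what makes the apply step idempotent.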
Mentorship & Technical Oversight
Coach engineers through design reviews, pair programming, and clear contribution guidelines; raise the bar on code quality and documentation.
Lead incident reviews and architectural forums; provide pragmatic guidance on trade-offs (latency vs. cost, simplicity vs. flexibility).
Develop growth paths and learning plans focused on energy domain fluency and modern data engineering practices.
Operational Excellence
Implement robust monitoring, alerting, and runbooks.
Ensure security and compliance by design: least-privilege access, secrets management (a retrieval sketch follows this list), encryption, auditability, and disaster recovery testing.
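On secrets management, one way it might look on an Azure stack: fetch credentials from Key Vault at startup via azure-identity rather than embedding them in code or config. The vault URL and secret name below are placeholders, not real resources:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault and secret name, not real resources.
credential = DefaultAzureCredential()  # managed identity in Azure, az CLI locally
client = SecretClient(
    vault_url="https://example-trading-kv.vault.azure.net", credential=credential
)
api_key = client.get_secret("exchange-api-key").value  # use, never log, the value
```

DefaultAzureCredential keeps the same code path working across local development and deployed environments, which supports the least-privilege goal above.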
Profile
Bachelor’s degree or higher in Computer Science, Engineering, or related field.
3+ years’ relevant experience in data engineering, trading systems, or financial technology.
Proven experience building and operating tick-level data pipelines for financial or crypto markets.
Prior experience in low-latency or high-availability systems preferred.
Skills
Azure: ADLS Gen2, Event Hubs, Synapse Analytics, Azure Databricks (Spark), Azure Functions, Azure Data Factory/Databricks Workflows, Key Vault, Azure Monitor/Log Analytics; IaC with Terraform/Bicep; CI/CD with Azure DevOps or GitHub Actions.
Snowflake (on Azure or multi-cloud): Warehousing design, Streams & Tasks, Snowpipe/Snowpipe Streaming, Time Travel & Fail-safe, RBAC & row/column security, external tables over ADLS, performance tuning & cost governance.
Kafka / Streaming: Confluent Platform/Cloud, Kafka Streams/Spring Kafka, ksqlDB, Schema Registry (Avro/Protobuf), Kafka Connect (Debezium CDC), MirrorMaker 2; patterns for exactly-once/at-least-once, backpressure, and idempotency (a consumer sketch follows this list).
Programming & Engineering Practices: Strong OOP in Python and/or Java/Scala;
SDLC, DevOps mindset, TDD/BDD, code reviews, automated testing (unit/integration/contract), packaging and dependency management, API design (REST/gRPC).
Orchestration & Quality: Airflow/ADF/Databricks Jobs, data contracts, Great Expectations (or similar), lineage/catalog (e.g., Purview), metrics/observability (Prometheus/Grafana/Application Insights).
Additional Skills
Highly numerate, rigorous, and resilient in problem-solving.
Ability to prioritize, multitask, and deliver under time constraints.
Strong written and verbal communication in English.
Self-motivated, proactive, and detail-oriented.
Comfortable working under pressure in a fast-paced environment.
Excellent communication skills, with the ability to explain technical topics clearly.
Team player with the ability to collaborate across engineering, quant, and trading teams.
If you think this position is right for you, we encourage you to apply!
Our people make all the difference in our success.
Gunvor Group Singapore Office
12 Marina Blvd, MBFC Tower 3, Singapore

