
Pagos

Data Engineer

Posted 6 Days Ago
Remote
28 Locations
Senior level
As a Data Engineer, you'll design and maintain scalable data pipelines, craft high-quality code, and collaborate with other teams to enhance systems and data quality.

About Us

At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.

About the Role

As a Data Engineer, you’ll play a key part in building and maintaining the platform that powers our products. By collaborating with backend engineers, data analysts, and other engineers, you’ll build and own new features, modules, and extensions of our systems. We’re seeking an action-oriented and collaborative problem solver who thrives in ambiguity and can take on new challenges with optimism in a fast-paced environment. We value team members who are not only skilled in their area of expertise but are also perpetual learners who are committed to growth and contributing to our collective success.


In this role, you will:

  • Craft high-quality code for scale, availability, and performance

  • Design, develop, and maintain scalable data pipelines and processes to extract, process, and transform large volumes of data, both real-time and batched (ELT/ETL); an illustrative sketch of this kind of work follows this list

  • Build and maintain integrations with data providers using various data transfer protocols

  • Drive engineering projects from start to finish with a high level of ownership and autonomy

  • Ensure the quality of our products and data through both manual and automated testing, as well as code reviews
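
For a concrete flavor of the batch pipeline work described above, the sketch below shows a minimal PySpark job that reads raw events, deduplicates them, and writes a daily aggregate. All paths, table names, and fields are hypothetical placeholders, not Pagos systems:

    # Minimal, illustrative PySpark batch transform (all names and paths are placeholders)
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_payments_rollup").getOrCreate()

    # Extract: read one day of raw payment events from object storage (hypothetical path)
    raw = spark.read.parquet("s3://example-bucket/raw/payment_events/dt=2024-01-01/")

    # Transform: deduplicate on event id, then aggregate counts and volume per processor and status
    rollup = (
        raw.dropDuplicates(["event_id"])
           .groupBy("processor", "status")
           .agg(
               F.count("*").alias("event_count"),
               F.sum("amount").alias("total_amount"),
           )
    )

    # Load: write the curated aggregate back out, partitioned by day, for downstream analytics
    rollup.write.mode("overwrite").parquet(
        "s3://example-bucket/curated/daily_payments_rollup/dt=2024-01-01/"
    )

    spark.stop()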

What We’re Looking For

We’re looking for someone with:

  • 8+ years of software engineering experience with an emphasis on Data Engineering

  • Bachelor’s degree or higher in Computer Science or related technical discipline (or equivalent experience)

  • Advanced experience with complex SQL queries and database/lakehouse technologies such as Redshift, Apache Iceberg, and Postgres

  • Deep experience with big data technologies and frameworks such as Apache Spark and dbt, as well as data quality tools like dbt tests

  • Familiarity with cloud platforms like AWS, GCP, or Azure, and common data-related services (e.g. S3, Redshift, EMR, Glue, Kinesis, Athena)

  • A bias for action, where no task is too small, and an eagerness to learn and grow with our industry

Nice to have: 

  • Experience with real-time streaming frameworks like Apache Kafka

  • Experience with Great Expectations and/or Soda

  • Comfort and/or past experience working with and managing big data and ELT pipelines

  • Comfort and/or past experience working with Temporal, Apache Airflow, or similar orchestration tools (a brief sketch follows this list)

  • Experience working in high-growth, venture-backed startup(s)
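
As a brief sketch of the orchestration tools mentioned above, a minimal Apache Airflow DAG chaining an extract step and a transform step might look like the following; the DAG id, schedule, and task bodies are illustrative placeholders only:

    # Minimal, illustrative Airflow 2.x DAG (dag_id, schedule, and task logic are placeholders)
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Pull raw data from a provider API or object store (stubbed out here)
        print("extracting raw data")

    def transform():
        # Kick off a batch transform, e.g. a Spark job or dbt run (stubbed out here)
        print("transforming into curated tables")

    with DAG(
        dag_id="example_daily_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)

        extract_task >> transform_task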

Pagos does not accept unsolicited resumes from third-party recruiting agencies. All interested candidates are encouraged to apply directly. 

Top Skills

Apache Airflow
Apache Iceberg
Apache Kafka
Spark
Athena
AWS
Azure
dbt
EMR
GCP
Glue
Great Expectations
Kinesis
Postgres
Redshift
S3
Soda
SQL
Temporal

