
Sedona Digital

Senior Data Engineer (AWS)

Reposted 2 Days Ago
Remote
Hiring Remotely in Brazil
Senior level
The Senior Data Engineer will design, build, and maintain data pipelines on AWS, implement data lakes and warehouses, and foster collaboration with stakeholders for scalable data solutions.

Accelerate your professional development and gain exposure to high‑performance data platforms and cloud infrastructure. Join Sedona Digital, a fast‑growing scale‑up with the ambition to be recognised as one of the leading technology companies in Romania.

Our global client base needs builders, engineers who enjoy designing and implementing scalable data platforms, have deep expertise in cloud data technologies, and take pride in delivering reliable, well‑governed solutions. 

At Sedona, we: 

  • Obsess about our customers 
  • Build robust, scalable technical solutions 
  • Create an open, collaborative culture 
  • Invest in learning and long‑term careers 

We are looking for a Senior Data Engineer with strong experience in AWS‑based data platforms, data lake and warehouse design, and end‑to‑end data pipeline implementation. The ideal candidate is comfortable working independently, making sound technical decisions, and supporting clients as they modernize their data estates. 


Responsibilities:

  • Design, build, and maintain data pipelines and ingestion frameworks on AWS, using services such as AWS Glue and Amazon S3 alongside Snowflake. 
  • Implement and maintain S3‑based data lakes, including raw, staged, and curated layers, following agreed standards and best practices. 
  • Set up and optimize Amazon Redshift environments, including schema design, distribution and sort key strategies, and performance tuning. 
  • Develop reliable ingestion patterns for file‑based and database sources, including SFTP‑to‑S3 workflows. 
  • Implement secure secrets management using AWS Secrets Manager, ensuring credentials are never stored in code or configuration files. 
  • Build reusable, maintainable transformation logic aligned with agreed modelling and semantic‑layer principles. 
  • Monitor, troubleshoot, and improve pipeline reliability, data quality, and operational stability. 
  • Collaborate with architects, analysts, and client stakeholders to translate requirements into scalable data solutions. 
  • Contribute to defining and enforcing data engineering standards, security practices, and governance patterns. 
  • Support clients during platform adoption, knowledge transfer, and handover. 
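The raw/staged/curated layering described above is often enforced through a consistent S3 key convention. A minimal sketch in Python follows; the zone names, partition scheme, and `lake_key` helper are illustrative assumptions, not Sedona's actual standards:

```python
from datetime import date

# Illustrative lake zones and path scheme; real layer names and
# partitioning standards would follow the client's agreed conventions.
ZONES = ("raw", "staged", "curated")

def lake_key(zone, source, dataset, run_date, filename):
    """Build a partitioned S3 object key, e.g.
    raw/sftp_vendor/orders/ingest_date=2024-01-15/orders.csv
    """
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone!r}")
    return f"{zone}/{source}/{dataset}/ingest_date={run_date.isoformat()}/{filename}"

# Example: key for a file landed from an SFTP source into the raw zone.
key = lake_key("raw", "sftp_vendor", "orders", date(2024, 1, 15), "orders.csv")
```

Keeping the key logic in one place like this makes it easy to audit which zone an object belongs to and to reject writes outside the agreed layers.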

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience). 
  • 5+ years of experience in data engineering, ETL/ELT development, and data integration. 
  • Strong hands‑on experience with AWS, including S3, AWS Glue, Amazon Redshift, and IAM. 
  • Solid understanding of data lake and data warehouse architectures. 
  • Experience designing and operating batch‑oriented data pipelines at scale. 
  • Strong SQL skills and experience with data modelling (facts, dimensions, semantic layers). 
  • Experience with Python for data processing, automation, or Glue jobs. 
  • Familiarity with secure credential handling and secrets management patterns. 
  • Ability to work independently, manage changing priorities, and make pragmatic technical decisions. 
  • Strong communication skills in English, with the ability to explain technical concepts clearly. 
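The data-modelling requirement above (facts and dimensions) can be illustrated with a minimal star-schema load: assigning surrogate keys to a dimension and referencing them from a fact table. The table and column names below are hypothetical, and in practice this logic would live in SQL or a transformation framework rather than plain Python:

```python
# Source rows as they might arrive from an ingestion pipeline.
raw_orders = [
    {"customer": "Acme", "amount": 120.0},
    {"customer": "Globex", "amount": 75.5},
    {"customer": "Acme", "amount": 30.0},
]

# Dimension: one entry per distinct customer, keyed by a surrogate key
# assigned in order of first appearance.
dim_customer = {}
for row in raw_orders:
    if row["customer"] not in dim_customer:
        dim_customer[row["customer"]] = len(dim_customer) + 1

# Fact table: the natural key is replaced by the surrogate key, so the
# fact rows stay stable even if customer attributes change later.
fact_orders = [
    {"customer_sk": dim_customer[r["customer"]], "amount": r["amount"]}
    for r in raw_orders
]
```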

Preferred Skills (Nice to Have) 

  • Exposure to Snowpipe, Athena, or Lakehouse patterns. 
  • Familiarity with Databricks and ability to compare trade‑offs. 
  • Experience working in regulated or security‑sensitive environments. 
  • Understanding of CI/CD practices for data platforms. 
  • Previous experience supporting discovery or early‑phase platform design engagements. 
  • Exposure to or certification in Microsoft Azure is considered a plus. 

Top Skills

AWS
AWS Glue
Python
Redshift
S3
SQL

