
Bertoni Solutions

Lead Data Engineer (Remote – Latin America)

Remote
Hiring Remotely in BR
Senior level

Company Description

We are a multinational team of individuals who believe that, with the right knowledge and approach, technology is the answer to the challenges businesses face today. Since 2016, we have brought this knowledge and approach to our clients, helping them translate technology into their success.

With Swiss roots and our own development team in Lima and across the region, we offer the best of both cultures: the talent and passion of Latin American professionals combined with Swiss organizational skills and mindset.

Job Description

We are seeking a highly skilled Lead Data Engineer with strong expertise in PySpark, SQL, and Python, as well as Azure Data Factory, Synapse, Databricks, and Fabric, and a solid understanding of end-to-end ETL and data warehousing principles. The ideal candidate will have a proven track record of designing, building, and maintaining scalable data pipelines in a collaborative, fast-paced environment.

Key Responsibilities:

  • Design and develop scalable data pipelines using PySpark to support analytics and reporting needs (a minimal sketch follows this list).
  • Write efficient SQL and Python code to transform, cleanse, and optimize large datasets.
  • Collaborate with machine learning engineers, product managers, and developers to understand data requirements and deliver solutions.
  • Implement and maintain robust ETL processes to integrate structured and semi-structured data from various sources.
  • Ensure data quality, integrity, and reliability across pipelines and systems.
  • Participate in code reviews, troubleshooting, and performance tuning.
  • Work independently and proactively to identify and resolve data-related issues.
  • Contribute to Azure-based data solutions, including ADF, Synapse, ADLS, and other services.
  • Support cloud migration initiatives and DevOps practices.
  • Provide guidance on best practices and mentor junior team members when needed.
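
A minimal sketch of the kind of PySpark pipeline described in the first two bullets above. This is illustrative only: the storage account ("examplestorage"), container names, column names, and Delta output path are hypothetical placeholders, not details from this posting:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Ingest semi-structured events from a raw ADLS zone (placeholder path).
    raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/events/")

    # Cleanse and transform: deduplicate, normalize timestamps, drop bad rows.
    clean = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
           .filter(F.col("event_id").isNotNull())
    )

    # Publish to a curated zone as Delta for downstream analytics and reporting.
    (clean.write
          .format("delta")
          .mode("overwrite")
          .partitionBy("event_date")
          .save("abfss://curated@examplestorage.dfs.core.windows.net/events/"))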

Qualifications

  • 8+ years of overall experience working with cross-functional teams (machine learning engineers, developers, product managers, analytics teams).
  • 3+ years of hands-on experience developing and managing data pipelines using PySpark.
  • 3 to 5 years of experience with Azure-native services, including Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics / Azure SQL DB / Fabric.
  • Strong programming skills in Python and SQL.
  • Solid experience building ETL processes and end-to-end data modeling/data warehousing solutions (see the Spark SQL sketch after this list).
  • Self-driven, resourceful, and comfortable working in dynamic, fast-paced environments.
  • Advanced written and spoken English is a must-have for this position (B2, C1, or C2 only).
  • Strong communication skills are a must.
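
A hedged illustration of the SQL and data warehousing skills listed above: a warehouse-style upsert expressed through Spark SQL. The table and column names (dim_customer, staging_customers, customer_id) are hypothetical, and MERGE assumes Delta tables (e.g., on Databricks):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Upsert staged records into a dimension table (all names are placeholders).
    spark.sql("""
        MERGE INTO dim_customer AS t
        USING staging_customers AS s
          ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET
          t.email = s.email,
          t.updated_at = s.updated_at
        WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
          VALUES (s.customer_id, s.email, s.updated_at)
    """)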

Nice to have:

  • Databricks certification.
  • Knowledge of DevOps, CI/CD pipelines, and cloud migration best practices.
  • Familiarity with Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
  • Basic understanding of SAP HANA.
  • Intermediate-level experience with Power BI.

Additional Information

Please note that we will not be moving forward with any applicants who do not meet the following mandatory requirements:

  • 3+ years of experience with PySpark/Python, ETL and data warehousing processes, Azure Data Factory, Synapse, Databricks, Azure Data Lake Storage, Fabric, Azure SQL DB, etc.
  • Proven leadership experience in current or previous projects/roles.
  • Advanced written and spoken English fluency is a MUST HAVE (B2 level to C1/C2).
  • MUST BE located in Central or South America, as this is a nearshore position (please note that we are not able to consider candidates requiring relocation or those located offshore).

More Details:

  • Contract type: Independent contractor (this contract does not include PTO, tax deductions, or insurance; it covers only a monthly payment based on hours worked).
  • Location: The client is based in the United States; however, the position is 100% remote for nearshore candidates located in Central or South America.
  • Contract/project duration: Initially 6 months, with extension possibility based on performance.
  • Time zone and working hours: Full-time, Monday to Friday (8 hours per day, 40 hours per week), from 8:00 AM to 5:00 PM PST (U.S. time zone).
  • Equipment: Contractors are required to use their own laptop/PC.
  • Start date expectation: As soon as possible.
  • Payment methods: International bank transfer, PayPal, Wise, Payoneer, etc.

Bertoni Process Steps:

  • Requirements verification video interview.

Partner/Client Process Steps:

  • CV review.
  • One technical video interview with our partner.
  • One or two video interviews with the end client.

Why Join Us?

  • Be part of an innovative team shaping the future of technology.
  • Work in a collaborative and inclusive environment.
  • Opportunities for professional development and career growth.

Top Skills

Azure Data Factory
Azure Data Lake Storage
Azure Fabric
Azure SQL DB
Databricks
PySpark
Python
SQL
Synapse
