Design and architect data storage solutions using AWS and Databricks, create and optimize data pipelines, and integrate data from various sources while ensuring data quality and consistency.
- Design and architect data storage solutions, including databases, data lakes, and warehouses, using AWS services such as Amazon S3, Amazon RDS, Amazon Redshift, and Amazon DynamoDB, along with Databricks' Delta Lake. Integrate Informatica IDMC for metadata management and data cataloging.
- Create, manage, and optimize data pipelines for ingesting, processing, and transforming data using AWS services such as AWS Glue, AWS Data Pipeline, and AWS Lambda; Databricks for advanced data processing; and Informatica IDMC for data integration and quality.
- Integrate data from various sources, both internal and external, into AWS and Databricks environments, ensuring data consistency and quality, while leveraging Informatica IDMC for data integration, transformation, and governance.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data, making it suitable for analytical purposes using Databricks' Spark capabilities and Informatica IDMC for data transformation and quality.
- Monitor and optimize data processing and query performance in both AWS and Databricks environments, making necessary adjustments to meet performance and scalability requirements. Utilize Informatica IDMC for optimizing data workflows.
- Solid experience in data engineering, with expertise in AWS services, Databricks, and/or Informatica IDMC.
- Proficiency in programming languages such as Python, Java, or Scala for building data pipelines.
- Evaluate potential technical solutions and make recommendations to resolve data issues, with particular focus on performance assessment of complex data transformations and long-running data processes.
- Strong knowledge of SQL and NoSQL databases.
- Familiarity with data modeling and schema design.
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty), Databricks certifications, and Informatica certifications are a plus.
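To give a concrete sense of the ETL responsibilities above, here is a minimal plain-Python sketch of a cleanse/transform/enrich step. It is an illustration only: the sample schema, field names, and functions are hypothetical, and in practice this logic would be written as Spark DataFrame operations in Databricks, with Informatica IDMC enforcing equivalent quality rules.

```python
from datetime import datetime

# Hypothetical raw records as they might arrive from an upstream source.
RAW_ROWS = [
    {"order_id": "A-001", "amount": " 120.50", "ts": "2024-03-01"},
    {"order_id": "A-001", "amount": " 120.50", "ts": "2024-03-01"},  # duplicate
    {"order_id": "A-002", "amount": "bad",     "ts": "2024-03-02"},  # unparseable
    {"order_id": "A-003", "amount": "75",      "ts": "2024-03-03"},
]

def cleanse(rows):
    """Drop duplicate order_ids and rows whose amount does not parse."""
    seen, out = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue
        try:
            amount = float(row["amount"].strip())
        except ValueError:
            continue  # a real pipeline would route this row to a quarantine table
        seen.add(row["order_id"])
        out.append({**row, "amount": amount})
    return out

def enrich(rows):
    """Derive an order_year column for downstream analytics."""
    return [
        {**row, "order_year": datetime.strptime(row["ts"], "%Y-%m-%d").year}
        for row in rows
    ]

clean = enrich(cleanse(RAW_ROWS))
print(clean)  # two surviving rows, each with a derived order_year
```

In a Databricks job the same steps would map to `dropDuplicates`, a cast with null filtering, and `withColumn` on a Spark DataFrame, scaled across the cluster rather than run in-process.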
Top Skills
Amazon DynamoDB
Amazon RDS
Amazon Redshift
Amazon S3
AWS
AWS Data Pipeline
AWS Glue
AWS Lambda
Databricks
Informatica IDMC
Java
NoSQL
Python
Scala
Spark
SQL
Unison Consulting Singapore Office
1 Changi Business Park Crescent, Plaza 8 Tower A #03-06, Singapore 486025
Unison Consulting Singapore Office
63 Market Street #12-00, Bank of Singapore Center, Singapore 048942