
Razer

Associate Data Engineer

Posted 5 Days Ago
Singapore
Entry level
The Associate Data Engineer will build and maintain data processing systems, develop data pipelines, manage data lakes and warehouses, and collaborate with data teams to enhance data usability for analytics and AI.

Joining Razer will place you on a global mission to revolutionize the way the world games. Razer is a place to do great work, offering you the opportunity to make an impact globally while working with a team located across five continents. Razer is also a great place to work, providing the unique, gamer-centric #LifeAtRazer experience that will accelerate your growth, both personally and professionally.

Job Responsibilities:

This role builds and maintains high-volume big data processing systems that enable the organization to collect, manage, and convert raw data into usable information for data scientists and business analysts, while enabling the use of Artificial Intelligence (AI) capabilities. He/she is responsible for developing and maintaining data pipelines, data storage systems, and cloud infrastructure, working closely with data scientists, data analysts, and internal stakeholders to utilize data for analytics and AI.

Essential Duties and Responsibilities

  • Develop and maintain data systems and data pipelines that enable the organization to store, process, and analyze large volumes of data. This involves developing data pipelines, implementing data storage systems, and ensuring that data is integrated effectively to support AI applications.
  • Manage data lakes and data warehouses by populating and operationalizing them. This involves creating and managing table schemas, views, materialized views, including tokenization and vectorization techniques for Gen AI.
  • Develop template data pipelines that enable extract, transform, and load operations from various sources. This involves using cloud computing tools to build streaming and batch processing pipelines that handle large volumes of data which can also be used for AI applications.
  • Monitor and troubleshoot data workflows, ensuring timely resolution of failures and rerunning failed jobs to ensure data completeness.
  • Leverage modern build tools to enhance automation, data quality, testing, and deployment of data pipelines.
  • Develop and implement cloud infrastructure that is in line with the company's security policies and practices, as well as cost-optimization practices.
  • Collaborate with data scientists, data analysts and business users to understand the data needs of various stakeholders across the organization to implement appropriate solutions.
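The extract-transform-load work described above can be pictured with a minimal, stdlib-only batch pipeline. This is an illustrative sketch, not Razer's actual stack: the event schema, function names, and SQLite sink are all hypothetical stand-ins for the real sources and warehouse.

```python
import sqlite3

def extract():
    # In practice this would read from an API, object store, or message queue;
    # here we simulate raw gameplay events (hypothetical schema).
    return [
        {"user_id": "u1", "event": "match_start", "duration_s": None},
        {"user_id": "u2", "event": "match_end", "duration_s": 1340},
        {"user_id": "u1", "event": "match_end", "duration_s": 980},
    ]

def transform(rows):
    # Keep only completed matches and normalize duration to minutes.
    return [
        {"user_id": r["user_id"], "duration_min": round(r["duration_s"] / 60, 2)}
        for r in rows
        if r["event"] == "match_end" and r["duration_s"] is not None
    ]

def load(rows, conn):
    # A real pipeline would target Redshift, Snowflake, or BigQuery;
    # an in-memory SQLite table stands in for the warehouse here.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS match_stats (user_id TEXT, duration_min REAL)"
    )
    conn.executemany(
        "INSERT INTO match_stats VALUES (:user_id, :duration_min)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

In a production setting each stage would be a separate, independently retryable task so that a failed load can be rerun without re-extracting.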

Requirements

  • Bachelor’s or Master’s degree in computer science, engineering, mathematics, or a similar discipline.
  • Ability to write clean, maintainable, scalable, and robust code in Python, SQL, and Java.
  • Experience in building and maintaining ETL/ELT pipelines using Python and optimizing SQL.
  • Exposure to cloud data services such as AWS Redshift, Snowflake, BigQuery, or Azure Data Lake.
  • Experience using pipeline orchestration tools like Airflow and containerization tools like Docker.
  • Proficiency with various data processing techniques (streaming, batch, event-based) and with optimizing data storage (data lakes, data warehouses, vector data stores, and SQL and NoSQL databases).
  • Experience using data transformation tools such as dbt (Data Build Tool).
  • Experience with version control (Git) and CI/CD pipelines for data engineering workflows.
  • Experience using Infrastructure as Code tools such as Terraform and container orchestration platforms like Kubernetes.
  • Excellent problem-solving and analytical skills, with an understanding of AI technologies and their applications.
  • Excellent written and verbal communication skills for collaboration across teams.
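Several of the requirements above (orchestration with Airflow, monitoring workflows, rerunning failed jobs) reduce to the retry-with-backoff pattern that schedulers automate per task. The stdlib-only sketch below shows the underlying idea; the task names are hypothetical, and in Airflow this behavior is configured declaratively via each task's `retries` and `retry_delay` rather than hand-written.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay_s=0.01):
    """Run a task callable, retrying with exponential backoff on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted all attempts; surface the failure
            time.sleep(base_delay_s * 2 ** (attempt - 1))

# Hypothetical flaky extract task: fails twice, then succeeds.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source error")
    return "batch-2024-01-01"

result = run_with_retries(flaky_extract)  # succeeds on the third attempt
```

Backoff spreads retries out so a briefly unavailable upstream source is not hammered, which is why orchestrators expose it as a first-class task setting.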

Are you game?

Top Skills

Airflow
AWS Redshift
Azure Data Lake
BigQuery
dbt
Docker
Git
Java
Kubernetes
Python
Snowflake
SQL
Terraform

Razer Singapore Office

1 One-north Cres, Singapore, 138538


