
Prudential plc

Analytics Engineer

In-Office
Singapore
Mid level
Analytics Engineers at Prudential design, develop, and deploy machine learning models, analyze business problems, and deliver insights while collaborating across teams. Responsibilities include collecting requirements, documenting solutions, ensuring model performance, and enabling self-service analytics.

Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

Analytics Engineers play a critical role in the Data Science workstream within the Analytics Centre of Excellence (CoE). They are multidisciplinary team members who design, develop, deploy, and monitor machine learning models that support business growth, and they undertake general and advanced analysis of business problems using a range of analytical and statistical methods. They support data analytics across the team, and every member of the team grows their skills through a shared interest in using data to solve business problems.

Design Advanced Analytics Solutions

·       Responsible for collecting business requirements from stakeholders and proposing machine learning or analytical solutions to business problems. 

·       Responsible for clearly documenting and articulating how proposed solutions solve business problems. 

 

Development of Advanced Analytics Solutions

·       Responsible for setting up the required technical and non-technical prerequisites for modelling. 

·       Leading model development, including writing and persisting features from structured and unstructured data using a feature store approach.

·       Developing training and testing pipelines for machine learning models with a high degree of code quality (a minimal sketch follows this list).

·       Critically evaluating model performance and model fairness using common frameworks that balance performance against ethical considerations.

·       Writing unit tests, undertaking system integration testing (SIT), and supporting user acceptance testing (UAT) of machine learning solutions to ensure a high degree of reliability.

·       Documenting solutions in accordance with required standards. 
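
To make the bullets above concrete, here is a minimal sketch of a training and testing pipeline with a basic fairness check, using scikit-learn and Fairlearn (both named in this posting). The dataset, column names, target, and sensitive attribute are illustrative assumptions, not Prudential specifics.

    # Illustrative sketch only: train/test pipeline with a fairness check.
    # The file, column names, target and sensitive attribute are assumed.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from fairlearn.metrics import demographic_parity_difference

    df = pd.read_csv("policy_features.csv")        # hypothetical feature extract
    X = df.drop(columns=["lapsed", "gender"])      # "lapsed" = assumed binary target
    y = df["lapsed"]
    sensitive = df["gender"]                       # assumed sensitive attribute

    X_train, X_test, y_train, y_test, s_train, s_test = train_test_split(
        X, y, sensitive, test_size=0.2, random_state=42, stratify=y
    )

    model = Pipeline([
        ("scale", StandardScaler()),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X_train, y_train)

    # Evaluate predictive performance and a simple group-fairness metric side by side.
    pred = model.predict(X_test)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
    print("Demographic parity difference:",
          demographic_parity_difference(y_test, pred, sensitive_features=s_test))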

 

Deploying Advanced Analytics Solutions

·       Writing and refactoring solutions so they are production ready. 

·       Preparing the required artifacts for deployment, such as pipelines, triggers, and orchestrators.

·       Deploying models via various serving layers, including batch and API options (see the API sketch after this list).

·       Writing good quality documentation and presenting solutions to various approval committees. 
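
As one possible shape of the API serving option above, here is a minimal FastAPI sketch (FastAPI is named later in this posting as a plus). The model artifact, payload fields, and route are assumptions made for illustration, not an actual Prudential service.

    # Illustrative sketch only: a minimal model-serving endpoint.
    # The model file, feature names and route are assumed.
    import joblib
    import pandas as pd
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="lapse-model")      # hypothetical service name
    model = joblib.load("model.joblib")     # artifact produced by the training pipeline

    class ScoringRequest(BaseModel):
        age: float
        premium: float
        tenure_months: float

    @app.post("/score")
    def score(req: ScoringRequest):
        # Build a single-row frame in the same column order the model was trained on.
        features = pd.DataFrame([{
            "age": req.age,
            "premium": req.premium,
            "tenure_months": req.tenure_months,
        }])
        proba = float(model.predict_proba(features)[0, 1])
        return {"lapse_probability": proba}

Such a service could be run locally with uvicorn (for example, uvicorn scoring_service:app, assuming the file is saved as scoring_service.py); the batch option would instead score a full table on a schedule through an orchestrated pipeline.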

 

Monitoring of Advanced Analytics Solutions 

·       Undertake automated and manual analysis of a model’s performance after production. 

·       Undertake automated and manual analysis of a model's key ethical and fairness metrics after production (see the monitoring sketch after this list).

·       Critically evaluate the business use of a model post-production to support its transition through the model life cycle (replacement, retirement, etc.).
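
A hedged sketch of what an automated post-production check might look like, reusing the same performance and fairness metrics as at development time; the table name, columns, and alert thresholds are assumptions.

    # Illustrative sketch only: scheduled check on recently scored, matured records.
    # Table name, columns and thresholds are assumed.
    import pandas as pd
    from sklearn.metrics import roc_auc_score
    from fairlearn.metrics import demographic_parity_difference

    recent = pd.read_parquet("scored_outcomes_last_90d.parquet")   # hypothetical extract

    auc = roc_auc_score(recent["actual"], recent["score"])
    dpd = demographic_parity_difference(
        recent["actual"], recent["score"] > 0.5, sensitive_features=recent["gender"]
    )

    if auc < 0.70:                 # assumed tolerance; set per model and use case
        print(f"ALERT: AUC degraded to {auc:.3f} - investigate drift or retrain.")
    if dpd > 0.10:                 # assumed tolerance; set per fairness framework
        print(f"ALERT: demographic parity difference {dpd:.3f} exceeds tolerance.")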

 

Business Insight & Trend Analysis

·       Conduct in-depth studies and trend analyses to support data-driven decision-making.

·       Collaborate with stakeholders to identify analytical needs and translate them into technical solutions.

·       Deliver actionable insights that influence strategic initiatives and operational improvements.

·       Conduct ad hoc analysis of data to better understand drivers of business performance and to support business planning. This includes simple and advanced analysis such as statistical tests (see the example after this list).

·       Use your passion for data and technology to identify new patterns that help the business make decisions. 
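
As an example of the simple statistical tests mentioned above, here is a sketch of a two-sample Welch's t-test comparing an assumed business metric between two channels; the file and column names are illustrative only.

    # Illustrative sketch only: compare a metric between two assumed business segments.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("channel_performance.csv")    # hypothetical extract
    digital = df.loc[df["channel"] == "digital", "conversion_rate"]
    agency = df.loc[df["channel"] == "agency", "conversion_rate"]

    # Welch's t-test (does not assume equal variances between the two groups).
    t_stat, p_value = stats.ttest_ind(digital, agency, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")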

 

Platform Enablement

·       Enable self-service analytics by developing reusable data assets and standardized metrics (see the sketch after this list).

·       Maintain documentation and governance standards to ensure data quality and consistency.
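
One way to enable the standardized metrics mentioned above is to centralise each metric definition in a single reusable function that every report and notebook imports; the metric below (13-month persistency) and its column names are assumed purely for illustration.

    # Illustrative sketch only: one shared definition of an assumed business metric.
    import pandas as pd

    def persistency_rate(policies: pd.DataFrame, months: int = 13) -> float:
        """Share of policies still in force `months` months after issue (assumed definition)."""
        cohort = policies[policies["months_since_issue"] >= months]
        if cohort.empty:
            return float("nan")
        return float((cohort["months_in_force"] >= months).mean())

Keeping the definition in one place means the same number appears in dashboards, ad hoc analysis, and model features.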

  

Who we are looking for:  

 

Competencies & Personal Traits

·       Kindness, openness, and the willingness to make the team and yourself better. 

·       Creating specification documents, attribute mapping documents, and functional specifications.

·       Finding innovative solutions to business problems, including the judgement to advise against AI and machine learning when they are not the appropriate solution.

·       Presenting results and recommendations to non-technical business partners and stakeholders to drive decision-making and actions. 

 

Technical Experience 

 

·       Extensive experience with machine learning frameworks, including scikit-learn, MLflow, PyCaret, Fairlearn, and TensorFlow.

·       Strong analytical thinking and data expertise.

·       Proven knowledge of traditional machine learning techniques such as linear/logistic regression, clustering, classification, principal component analysis (PCA), recommendation systems, and anomaly detection.

·       Proven expertise in cloud development, including designing and implementing data processes for data warehouses and production environments. Hands-on experience with cloud platforms such as Azure is essential.

·       Proficiency in transforming raw data to make it more available, organized, and easier to analyze. This includes improving and optimizing existing data solutions and applying engineering best practices to provide clean, transformed datasets ready for analysis. 

·       Ability to quickly learn and apply advanced methods, including deep learning, Natural Language Processing (NLP), and Generative AI.

·       Proficient in Python and SQL, with a focus on writing clean, reliable, and maintainable code.

·       Familiar with Python-based visualization libraries such as matplotlib and plotly.

·       Experience using Git for version control and platforms like GitHub or Bitbucket.

·       Exposure to PySpark and Scala is an advantage.

·       Knowledge of web frameworks and APIs (e.g., Flask, Django, FastAPI) is a plus.

·       Familiarity with reporting tools such as Power BI, Qlik, or Tableau is a plus. 

 

Education & Professional Qualifications  

 

·       Master's degree, PhD, or equivalent work experience in Mathematics, Statistics, Computer Science, Business Analytics, Economics, Physics, Engineering, or a related discipline.

·       Knowledge of life insurance practices is advantageous.

 

 

Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.

Top Skills

Azure
Django
Fairlearn
FastAPI
Flask
Git
Matplotlib
MLflow
Plotly
Power BI
PyCaret
PySpark
Python
Qlik
Scala
scikit-learn
SQL
Tableau
TensorFlow

