What can you expect as a FedEx team member?
- Career Mobility and Development: When you join FedEx, you’re joining a team with opportunities that span the world, from advancement and location transfers to training and leadership programs.
- Total Compensation and Benefits Package: We want our people to build long-term careers with us, so we offer competitive benefits, flexible work arrangements, and programs that support well-being.
Equal Opportunities
Our greatest asset at FedEx is our people. We are committed to building a diverse, equitable and inclusive workforce, and we offer equal opportunities, fairness and respect to all, regardless of who you are. We encourage you to apply even if you feel your experience does not align with every aspect of the job description, as you could be exactly who we need for this or another opportunity.
We do not tolerate discrimination or harassment based on race, color, ethnicity, national origin, religion, sex, age, genetic information, citizenship, disability, marital status, pregnancy, sexual orientation, gender identity, gender expression, veteran status or any other characteristic protected under national, state or local laws. We will reasonably accommodate team members and third parties with physical and mental disabilities.
Purpose of Position
This role is responsible for designing, building, and maintaining scalable, reliable, and high‑quality data pipelines to support Air Network Planning analytics, digital products, and decision‑support use cases.
You will focus on end‑to‑end ETL/ELT development, data integration, and data platform enablement, ensuring that analytical and operational teams have timely, trusted, and well‑modeled data. The role works closely with Data Analysts, Data Scientists, and business stakeholders to translate planning and engineering needs into enterprise‑grade data solutions.
Areas of Responsibility
1 – Data Engineering & ETL Development
- Design, build, and maintain robust ETL/ELT pipelines ingesting data from multiple internal and external sources
- Develop batch and, where applicable, streaming data pipelines using Azure and Databricks
- Implement data transformations, validations, and enrichment logic to produce analytics‑ready datasets
- Ensure data quality, lineage, observability, and reliability of production pipelines
- Optimize pipeline performance, scalability, and cost efficiency
2 – Data Platform & Architecture Support
- Contribute to the design and evolution of the Air Network data platform on Azure
- Work with Databricks (Spark, Delta Lake) to implement scalable data processing and storage patterns
- Collaborate with enterprise platform and security teams to ensure compliance with data governance, access control, and security standards
- Support migration of manual or semi‑automated processes into fully automated, production‑grade pipelines
3 – Analytics & Business Enablement
- Partner with Data Analysts and Data Scientists to enable downstream analytics, dashboards, and advanced models
- Translate business and planning requirements into well‑structured data models and pipelines
- Support strategic Air Network initiatives by delivering reliable datasets for planning, performance evaluation, and optimization use cases
- Document data pipelines, schemas, and integration logic so solutions are reusable and maintainable
To Be Successful in This Role, You Will Need
- Bachelor’s degree or equivalent in Computer Science, Data Engineering, Engineering, Information Systems, or a related discipline
- 4+ years of hands‑on data engineering experience
- Strong experience building production ETL pipelines and data workflows
Technical Skills (Required / Strongly Preferred)
- Azure data ecosystem (e.g., Azure Data Factory, Azure Data Lake, Azure Synapse or equivalent)
- Databricks / Apache Spark, including Delta Lake concepts
- Strong SQL skills and experience with data modeling (star/snowflake, analytics‑optimized schemas)
- Programming experience in Python (or equivalent) for data engineering workflows
- Experience with orchestration, scheduling, and monitoring of data pipelines
- Understanding of data quality checks, error handling, logging, and alerting
Technical Skills (Nice to Have)
- Experience supporting BI tools (e.g., Power BI) through well‑designed data layers
- Familiarity with CI/CD, version control, and DevOps practices for data pipelines
- Exposure to operations research, optimization, or forecasting data use cases
- Experience working in Agile / SAFe environments
Professional Skills
- Strong problem‑solving mindset and attention to data quality
- Ability to work across technical and business teams
- Clear communication and documentation skills
- Good planning, prioritization, and stakeholder management skills
Job Posting End Date:
2026-03-16