About Agero:
Wherever drivers go, we’re leading the way. Agero’s mission is to rethink the vehicle ownership experience through a powerful combination of passionate people and data-driven technology, strengthening our clients’ relationships with their customers. As the #1 B2B, white-label provider of digital driver assistance services, we’re pushing the industry in a new direction, taking manual processes and redefining them as digital, transparent, and connected. This includes: an industry-leading dispatch management platform powered by Swoop; comprehensive accident management services; knowledgeable consumer affairs and connected vehicle capabilities; and a growing marketplace of services, discounts, and support enabled by a robust partner ecosystem. The company has over 150 million vehicle coverage points in partnership with leading automobile manufacturers, insurance carriers, and many others. Managing one of the largest national networks of service providers, Agero responds to approximately 12 million service events annually. Agero, a member company of The Cross Country Group, is headquartered in Medford, Mass., with operations throughout North America. To learn more, visit https://www.agero.com/.
Role Description & Mission:
We are seeking a Data Engineer who is passionate about data and eager to make a meaningful impact. In this role, you will design, build, and maintain the core data infrastructure that powers our analytics, machine learning, and data science initiatives. Your responsibilities will include optimizing data management processes, ensuring data quality and reliability, and developing scalable, efficient data models to support advanced analytics and data-driven decision-making. Success in this role requires a strong technical foundation, a collaborative mindset, and a drive to deliver innovative and impactful solutions.
Key Outcomes:
- Data Pipelines:
  - Develop and maintain robust ETL/ELT pipelines to ingest data from diverse sources (relational and NoSQL databases, APIs, etc.), including implementing best practices for real-time and batch data ingestion.
  - Create and optimize data workflows using modern orchestration tools (e.g., Apache Airflow, Snowflake Tasks, Dagster, Mage).
- Cloud Cost Optimization:
  - Monitor and optimize cloud costs (e.g., AWS, Snowflake) by analyzing resource usage and implementing cost-saving strategies.
  - Perform query optimization in Snowflake to reduce compute costs and improve performance.
- Data Foundations:
  - Develop and maintain modern data architectures, including data lakes and data warehouses (e.g., Snowflake, Databricks, Redshift), considering the trade-offs of different data storage solutions and ensuring alignment with business requirements and SLAs.
- Data Modeling & Transformation:
  - Apply dimensional modeling techniques (Kimball), star and snowflake schemas, and normalization vs. denormalization strategies based on use cases.
  - Develop transformations using DBT (Core or Cloud), Spark (PySpark), or other frameworks.
  - Collaborate on emerging approaches such as data mesh or specialized templating (e.g., Jinja) to handle complex data needs.
- Coding:
  - Write reusable, efficient, and scalable code in Python, PySpark, and SQL.
  - Integrate serverless computing or modern API frameworks (e.g., FastAPI, Flask) to support data-driven applications.
  - Develop and maintain data-intensive UIs and dashboards using tools like Streamlit, Dash, Plotly, or React.
- Data Quality Control:
  - Establish data governance and data quality frameworks, using either custom solutions or popular open-source/commercial tools (e.g., DBT tests, Great Expectations, Soda).
  - Implement data observability solutions to monitor and alert on data integrity and reliability (e.g., Monte Carlo, Alation, or Elementary).
  - Define SLAs, SLOs, and processes to identify, troubleshoot, and resolve data issues.
- Teamwork:
  - Work cross-functionally with data scientists, analysts, and business stakeholders to translate requirements into robust data solutions.
  - Follow and advocate for best practices in version control and CI/CD.
  - Document data flows, processes, and architecture to facilitate knowledge sharing and maintainability.
Skills, Education & Experience:
Education & Experience: Bachelor's or Master's degree in a technical field and 3+ years of relevant industry experience (2-5+ years of overall experience).
Technical Skills (Essential):
- Extensive experience with Snowflake (preferred) or other cloud-based data warehousing solutions like Redshift or BigQuery.
- Expertise in building and maintaining ETL/ELT pipelines using tools like Airflow, DBT, Fivetran, or similar frameworks.
- Proficiency in Python (e.g., Pandas, PySpark) for data processing and transformation.
- Advanced SQL skills for querying and managing relational databases, plus experience working with NoSQL databases (e.g., DynamoDB, MongoDB).
- Solid understanding of data modeling techniques, including dimensional modeling (e.g., star schema, snowflake schema).
- Knowledge of query optimization and cost management strategies for platforms like Snowflake and cloud environments.
- Experience with data quality and observability frameworks (e.g., Great Expectations, Soda).
- Proven expertise in designing and deploying data solutions in the cloud, with a focus on AWS services (e.g., EC2, S3, RDS, Lambda, IAM).
- Experience in building and consuming data-intensive APIs using frameworks like FastAPI or Flask.
- Familiarity with version control systems (e.g., Git) and implementing CI/CD pipelines.
Soft Skills:
- Strong communication and collaboration skills with the ability to explain technical concepts to both technical and non-technical audiences.
- Ability to manage multiple priorities and work independently.
Bonus Skills:
- Experience with data streaming platforms such as Kafka or Kinesis.
- Familiarity with Agile methodologies (Scrum, Kanban) and IaC tools like Terraform or CloudFormation.
- Knowledge of emerging technologies or frameworks in the data engineering ecosystem, such as Delta Lake, Iceberg, or Hudi.
Hiring In:
- United States: AZ, FL, IL, KY, MA, MI, NM, NH, TN, GA, NC, VA, CA
- Canada: Province of Ontario
- #LI-REMOTE
The base salary range presented represents the anticipated low and high end of the salary range for new hires in this position. Your final base salary will be determined based on factors such as work location, experience, job-related skills, and relevant training and education. The range listed is just one component of the total compensation package provided by Agero to employees.
National Pay Range
$97,482–$140,000 USD
Life at Agero:
At Agero, you'll find a workplace where your unique perspective is not just welcomed, it's celebrated. We believe that our differences make us stronger, and we're committed to creating an environment where every employee feels a sense of belonging. If you're looking for a company that values your individuality, provides opportunities for growth, and champions open communication, Agero is the place for you. Join our team and help us drive the future of driver assistance, while experiencing a workplace where you can truly thrive.
Benefits Built for Well-being:
Agero’s innovation is driven by a workforce where all associates feel like they can truly thrive. Agero offers a wide range of benefits to promote well-being, encourage personal development, and ensure financial stability. Our benefits include:
- Health and Wellness: Healthcare, dental, vision, disability, life insurance, and mental health benefits for associates and their families.
- Financial Security: 401(k) plan with company match and tuition assistance to support your future goals.
- Work-Life Balance: Flexible time off, paid sick leave, and ten paid holidays annually.
- For Contact Center Roles: Accrual of up to 3 weeks of Paid Time Off per year, paid sick leave, and ten paid holidays annually.
- Family Support: Parental planning benefits to assist associates through life’s milestones.
- Bonus/Incentive Programs
Join Agero and experience a workplace that invests in your success both personally and professionally.