Data Engineer III - (Data Platforms)

Cedar
USA · $170k – $215k · Posted 9 March 2026

Job Description

Our healthcare system is the leading cause of personal bankruptcy in the U.S. Every year, over 50 million Americans suffer adverse financial consequences as a result of seeking care, from lower credit scores to garnished wages. The challenge is only getting worse, as high-deductible health plans are the fastest-growing plan design in the U.S. Cedar’s mission is to leverage data science, smart product design and personalization to make healthcare more affordable and accessible. Today, healthcare providers still engage with their consumers in a “one-size-fits-all” approach; Cedar is excited to leverage consumer best practices to deliver a superior experience.

The Role

Cedar’s Data Integration Platforms organization builds the data infrastructure, pipelines and tooling that power our products, analytics, and financial operations. We are looking for a Data Engineer to help evolve our data ecosystem from homegrown ETL scripts to a modern, scalable stack built on tools like dbt, Airflow and Snowflake. You will contribute to the design and implementation of critical data pipelines (e.g., client billing, product analytics, and platform data services), improve data quality and observability, and help implement patterns and standards for how Cedar builds and operates data products. This is a hands-on individual contributor role with meaningful ownership, technical depth, and cross-functional exposure.

What You’ll Do

- Design, build, and maintain scalable ELT/ETL pipelines that power core use cases including client billing, financial reporting, product analytics, and data services for downstream teams (Finance, Data Science, Commercial Analytics, Product).
- Modernize legacy data flows by migrating SQL- and Liquibase-based transformations into dbt, with solid testing, documentation and data contracts.
- Improve reliability and observability of our data platform by applying best practices in testing, monitoring, alerting and runbook-driven operations for pipelines orchestrated via Airflow (and/or similar tools).
- Model data for usability and performance in Snowflake and other systems, applying sound data modeling patterns (e.g., dimensional models, entity-centric designs) for analytics and operational use cases.
- Collaborate closely with product, finance, analytics and integrations teams to understand requirements, define interfaces, and ensure data is accurate, well-documented, and delivered in the right form and cadence for consumers.
- Contribute to Cedar’s data platform vision by implementing standards for governance, metadata and access, and by helping pilot tools like OpenMetadata and data quality frameworks within your projects.
- Participate in code reviews and design discussions, helping to raise the bar on code quality, reliability, and operational excellence across the team.

About You

- 3+ years of hands-on data engineering (or closely related software engineering) experience, including building and supporting production data pipelines.
- Strong SQL and Python proficiency, with experience implementing data transformations, utilities and tooling (e.g., dbt models, Airflow DAGs, internal scripts).
- Experience with modern data stack tools, including some combination of: Snowflake (or a similar cloud data warehouse), dbt, and Airflow/Dagster (or a similar orchestrator).
- Comfort designing and operating reliable pipelines, including applying testing strategies (unit/integration/dbt tests), basic monitoring and alerting, and contributing to incident/root-cause analysis.
- Experience with data modeling and schema design for analytics and reporting use cases (e.g., star/snowflake schemas, event- or entity-centric designs).
- Familiarity with cloud platforms, ideally AWS (e.g., S3, IAM, containerized workloads, or related infrastructure supporting data workloads).
- Strong collaboration and communication skills, with the ability to break down ambiguous business problems into clear technical tasks and work effectively with partners ... (truncated, view full listing at source)
Apply Now
