Principal AI Engineer

Ceridian HCM Holding
Remote | Posted 2 March 2026

Job Description

Req #22096 | Canada
Posted Sunday, March 1, 2026 at 8:00 PM

Dayforce is a global human capital management (HCM) company headquartered in Toronto, Ontario, and Minneapolis, Minnesota, with operations across North America; Europe, the Middle East, and Africa (EMEA); and Asia Pacific Japan (APJ). Our award-winning Cloud HCM platform offers a unified solution database and continuous calculation engine, driving efficiency, productivity, and compliance for the global workforce. Our brand promise, Makes Work Life Better™, reflects our commitment to employees, customers, partners, and communities globally.

Location: Work is what you do, not where you go. For this role, we are open to remote work and can hire anywhere in the United States or Canada.

About the opportunity

We're looking for a Principal AI Engineer to lead the architecture, development, and implementation of cutting-edge AI and Generative AI solutions across our enterprise. In this role, you'll play a critical part in shaping and advancing our Enterprise AI Platform, including ownership of the core tools and infrastructure that power AI innovation company-wide. You will work hands-on with both traditional and generative AI technologies (e.g., LLMs, agents, RAG), designing scalable, secure, and high-performing AI systems that drive real business impact across departments such as Sales, Marketing, Finance, HR, Legal, and IT. If you thrive at the intersection of strategy, engineering, and experimentation, and are excited by the opportunity to work on a modern enterprise AI stack (e.g., Replit, Vercel, cloud-native environments), we'd love to hear from you.
What you'll get to do

Architect and Build AI Solutions
- Design and implement end-to-end AI applications tailored to complex enterprise challenges
- Break down ambiguous use cases into well-defined, actionable solution designs using the latest AI technologies

Hands-On Development
- Lead the engineering and deployment of AI assistants, agents, and applications using both code-first and low/no-code approaches
- Build scalable, real-time AI systems with optimized performance and low latency

AI Platform Ownership
- Own and evolve the underlying enterprise AI infrastructure and toolchain
- Evaluate and integrate new AI frameworks, SDKs, and cloud services into the platform

Experimentation and Innovation
- Rapidly prototype new AI features and solutions; conduct proofs of concept and guide production rollout
- Research and apply cutting-edge techniques in GenAI, LLMs, RAG, agentic workflows, and multi-modal AI

MLOps & Automation
- Implement robust MLOps pipelines, including model training, deployment, testing, monitoring, and drift detection
- Automate lifecycle workflows to ensure reliability, scalability, and speed

Cross-Functional Collaboration
- Work closely with product, engineering, data science, and business stakeholders to align AI capabilities with enterprise priorities
- Translate strategic goals into scalable AI solutions that deliver measurable business value

Governance & Responsible AI
- Champion responsible AI principles, including model interpretability, data privacy, and compliance with security policies

Mentorship & Leadership
- Serve as a technical mentor and thought leader across AI teams
- Foster a culture of experimentation, excellence, and continuous learning

Skills and experience we value
- 10+ years in software engineering, distributed systems, or enterprise architecture
- 5+ years of experience focused on AI/ML engineering, platforms, or applications
- Experience in a lead role, directing and coaching teams of engineers and analysts
- Strong background in enterprise-grade AI solution design and delivery across multiple domains
- Deep understanding of LLMs, GenAI architectures, RAG pipelines, and agent-based systems
- Hands-on expertise with AI/ML frameworks (e.g., PyTorch, TensorFlow, JAX) and cloud AI platforms (e.g., AWS SageMaker, Vertex AI, Az ...

(truncated, view full listing at source)