AI Platform Developer Sr

Ceridian HCM Holding
Remote | Posted 3 March 2026

Job Description

Req #22097 | Canada
Posted Sunday, March 1, 2026 at 8:00 PM | Expires Thursday, April 2, 2026 at 7:59 PM

Dayforce is a global human capital management (HCM) company headquartered in Toronto, Ontario, and Minneapolis, Minnesota, with operations across North America; Europe, the Middle East, and Africa (EMEA); and the Asia Pacific Japan (APJ) region.

Our award-winning Cloud HCM platform offers a unified solution database and continuous calculation engine, driving efficiency, productivity, and compliance for the global workforce.

Our brand promise, Makes Work Life Better™, reflects our commitment to employees, customers, partners, and communities globally.

Effective November 1, 2025, this position is not open to residents of Quebec; applicants must reside in a province or territory of Canada other than Quebec to be considered. Any roles available in Quebec will be posted separately.

Location: Work is what you do, not where you go. For this role, we are open to remote work and can hire anywhere in the United States or Canada.

About the Opportunity

We are looking for a Sr. AI Platform Developer to help shape how AI solutions are delivered and scaled across the organization. Sitting at the intersection of infrastructure, tooling, and enterprise needs, this role focuses on building and evolving the platform capabilities that support the development, deployment, and maintenance of AI applications.

You will work within a modern cloud-based environment, empowering both internal teams and external developers to harness the power of AI, whether they are building code assistants, autonomous agents, or full-stack AI apps. If you are passionate about making AI accessible, fast, and safe for everyone, this is the opportunity to have a broad and lasting impact.
What you will get to do

- Design and build scalable infrastructure for AI workloads (inference, training, fine-tuning) within a multi-tenant, cloud-native development platform
- Develop tools, APIs, and libraries that simplify building and deploying AI/LLM-based applications from a collaborative environment such as Replit
- Integrate and optimize large language models (LLMs), RAG pipelines, embedding stores, and multi-modal systems
- Implement MLOps best practices for model versioning, deployment, monitoring, and performance optimization
- Collaborate with engineering, product, and research teams to ship AI-powered features that enhance developer workflows
- Ensure platform reliability, security, and responsible AI practices at scale
- Continuously explore and evaluate emerging AI tools and frameworks for inclusion in the platform

Skills and experience we value

- Bachelor's degree in Computer Science, Data Science, or a related field
- 6+ years in software engineering, with 4+ years in AI/ML infrastructure or platform roles
- Experience with cloud platforms such as AWS, GCP, or Azure and their machine-learning services
- Knowledge of cloud-native AI platforms (e.g., Replit, Vercel, LangChain, or similar)
- Proficiency in building distributed systems and managing scalable compute infrastructure (Kubernetes, serverless architectures, etc.)
- Solid understanding of LLMs, vector databases, RAG pipelines, and fine-tuning workflows
- Strong programming skills in Python (Go, TypeScript, or JavaScript a bonus)
- Familiarity with modern MLOps tools and practices (e.g., MLflow, Weights & Biases, Azure ML, Vertex AI)
- Solid understanding of security, observability, and performance engineering in a cloud environment

What would make you really stand out

- Direct experience integrating AI capabilities into a live developer platform (e.g., IDEs, code assistants, copilots)
- Experience working with cloud-native platforms (e.g., Replit, Vercel, LangChain, or similar)
- Hands-on experience with LLMs, vector databases, RAG pipelines, and fine-tuning workflows
- Contributions to open-source AI tooling or platforms
- Experience optimizing LLMs for low-lat ... (truncated, view full listing at source)