Software Engineer - Human Alignment, Consumer Devices
OpenAI · San Francisco · Posted 11 March 2026
Job Description
About the Team
The Future of Computing Research team is an applied research team within the Consumer Devices group focused on developing new methods, models, and evaluation frameworks that support our vision for the future of computing. We work at the frontier of multimodal AI, helping turn emerging model capabilities into product experiences that are useful, delightful, and worthy of long-term trust.
Our work explores a new class of AI systems that can learn over time, adapt to individuals, and support people in the flow of daily life. This includes long-term memory, user modeling, and personalization systems that are aligned not just with immediate satisfaction, but with a person’s broader goals, values, and well-being.
We work closely across research, engineering, design, product, and safety to define what it means to build AI systems that know you over time, act at the right moment, and help in ways that are context-aware, respectful, and demonstrably beneficial.
About the Role
We are looking for a Software Engineer to join the Human Alignment Team within Future of Computing Research to build the infrastructure, data systems, and evaluation foundations for next-generation multimodal models.
This role is focused on the systems that make rigorous product-grounded research possible. You will build the pipelines that transform messy real-world signals into usable training and evaluation substrates, the tooling that supports human feedback, and the evaluation platforms that help us measure model behavior with precision and credibility.
The best candidates for this role have strong engineering fundamentals, but also strong research taste: they know that, especially in human-centered AI systems, measurement is often the core problem.
This role is based in San Francisco, CA. We use a hybrid work model of four days in the office per week and offer relocation assistance to new employees.
In this role, you will:
- Build evaluation and data foundations for next-generation personalized and multimodal AI systems.
- Partner closely with researchers to turn fuzzy behavioral questions into rigorous evals, datasets, rubrics, and scorecards.
- Design and implement human-data pipelines, grader systems, and experiment infrastructure for product-grounded research.
- Create evaluation frameworks for subjective, contextual, and long-horizon behaviors.
- Develop reproducible pipelines for collecting, processing, joining, and analyzing multimodal signals from real-world studies and product usage.
- Help define what should count as meaningful progress, and build the systems that let the team measure it with confidence.
- Work across research, safety, design, and engineering to ensure that what we optimize for is both technically sound and human-centered.
- Prototype quickly, iterate on measurement frameworks, and improve the team’s ability to debug, compare, and trust behavioral results.
- Shape the infrastructure and methodology that future OpenAI products will rely on for personalization, adaptation, and evaluation.
You might thrive in this role if you:
- Have strong software engineering fundamentals and experience building data, backend, ML, or evaluation systems.
- Have excellent research taste and strong judgment about what is worth measuring, how to measure it, and when a metric is misleading.
- Enjoy working on ambiguous, early-stage problems where the hardest part is often defining the right evaluation rather than implementing the obvious one.
- Have experience with human-in-the-loop systems, annotation pipelines, experimentation platforms, or evaluation tooling.
- Are rigorous about data quality, reproducibility, metric design, and empirical correctness.
- Are motivated by human-centered AI and excited by the challenge of measuring behaviors that are subtle, contextual, and difficult to benchmark.
- Want to help define the research infrastructure behind a new c ... (truncated, view full listing at source)
Apply Now