Abuse Investigator - Child Safety

OpenAI
San Francisco
Posted 7 March 2026

Job Description

About the Team

OpenAI’s mission is to ensure that general-purpose artificial intelligence benefits all of humanity. We believe achieving this goal requires real-world deployment and continuous iteration based on how our products are used, and misused, in practice. The Intelligence and Investigations team supports this mission by identifying, analyzing, and investigating misuse of our products, particularly novel or emerging abuse patterns. Our work enables partner teams to develop data-backed product policies and build scalable safety mitigations. By precisely understanding abuse, we help ensure OpenAI’s products can be used safely to build meaningful, legitimate applications.

About the Role

As a Child Safety Investigator on the Intelligence & Investigations team, you will identify and disrupt actors attempting to use OpenAI’s products to sexually exploit minors, both online and in the real world. OpenAI maintains strict prohibitions in this area and reports apparent CSAM and other credible child sexual exploitation threats to the National Center for Missing and Exploited Children (NCMEC), consistent with applicable law and our policies.

This role requires domain-specific expertise, technical fluency, and the ability to operate in ambiguous, high-impact situations. You will conduct in-depth investigations into user behavior, analyze product data, identify emerging threat patterns, and support enforcement actions, including escalations requiring legal review and external reporting. You will also help develop detection strategies that proactively surface high-risk behavior, especially in cases that evade existing safeguards.

This role includes responding to time-sensitive escalations. Investigations may involve exposure to sensitive and disturbing material, including sexual or violent content.
In this role, you will:

- Investigate high-severity child safety violations and disrupt malicious actors in partnership with Policy, Legal, Integrity, Global Affairs, Security, and Engineering teams, including through cross-platform and cross-internet research
- Support investigations across other high-risk harm areas where child safety concerns intersect
- Conduct open-source and cross-platform research to contextualize actors and abuse networks
- Develop detection signals, behavioral heuristics, and tracking strategies to proactively identify high-risk users, using tools such as SQL, Databricks, and Python
- Communicate investigation findings clearly and effectively to internal stakeholders through written briefs, data-backed recommendations, and escalation summaries
- Develop a deep working understanding of OpenAI’s products, internal data systems, and enforcement mechanisms
- Collaborate with engineering and data partners to improve investigative tooling, data quality, and analyst workflows
- Support time-sensitive escalations and high-priority investigations requiring rapid analysis and sound judgment
- Represent investigative findings and work externally with the press, governments, NGOs, and law enforcement agencies
- Participate in a rotating on-call schedule to support timely response to high-priority safety incidents and sensitive investigations

You might thrive in this role if you:

- Have deep expertise in online child safety and child exploitation threats
- Are familiar or proficient with technical investigations, especially using SQL, Python, notebooks, and scripts in a government, law enforcement, and/or tech-company setting
- Speak one or more languages in addition to English
- Have at least five years of experience tracking threat actors in abuse domains
- Have worked on time-sensitive escalations involving high-risk harm
- Have presented analytic findings to senior stakeholders or external partners
- Have experience scaling and automating processes, especially with language models

About OpenAI

OpenAI is an AI research and deployment company dedicated to ensuring tha ... (truncated, view full listing at source)