Member of Engineering (Pre-training / Data Research)

Poolside
Applied Research
Posted 24 February 2026

Job Description

ABOUT POOLSIDE

In this decade, the world will create Artificial General Intelligence. Only a small number of companies will achieve this. Their ability to stack advantages and pull ahead will define the winners. These companies will move faster than anyone else. They will attract the world's most capable talent. They will be at the forefront of applied research, engineering, infrastructure, and deployment at scale. They will continue to scale their training to larger and more capable models. They will earn the right to raise large amounts of capital along their journey to enable this. They will create powerful economic engines. They will obsess over the success of their users and customers.

Poolside exists to be this company: to build a world where AI is the engine behind economically valuable work and scientific progress. We believe the fastest way to reach AGI lies in accelerating software development itself, by reshaping the developer experience with agentic systems, coding assistants, and the frontier models that power them. We deploy these systems directly into the development environments of security-conscious enterprises.

ABOUT OUR TEAM

We were founded in the US and have our home there, but our team is distributed across Europe and North America. We get our fix of in-person collaboration (and croissants) in Paris each month for 3 days, always Monday-Wednesday, with an open invitation to stay the whole week. We also hold longer off-sites once a year.

Our team is a multidisciplinary blend of research, engineering, and business experts. What unites us is our deep care for what we build together. We're in a race that requires hard work, intellectual curiosity, and obsession; to balance this intensity, we've assembled a team of low-ego, kind-hearted individuals who have built the special culture Poolside has.
By building collaboratively and with intention, we create a compounding effect that moves the entire company forward towards our mission: reaching AGI through intelligence systems built for software development.

ABOUT THE ROLE

You'll work on our data team, focused on the quality of the datasets delivered for training our models. This is a hands-on role where your #1 mission is to improve the quality of the pretraining datasets by leveraging your previous experience, intuition, and training experiments. This includes synthetic data generation and data-mix optimization. You'll collaborate closely with other teams such as Pretraining, Posttraining, Evals, and Product to define high-quality data needs that map to missing model capabilities and downstream use cases.

Staying in sync with the latest research in dataset design and pretraining is key to success in this role. You will constantly lead original research initiatives through short, time-bounded experiments while deploying highly technical engineering solutions into production. Because the volumes of data to process are massive, you'll have a performant distributed data pipeline together with a large GPU cluster at your disposal.

YOUR MISSION

To deliver large, high-quality, and diverse datasets of natural language and source code for training poolside models and coding agents.

RESPONSIBILITIES

- Follow the latest research related to LLMs, and to data quality in particular. Be familiar with the most relevant open-source datasets and models.
- Design and implement complex pipelines that can generate large amounts of data while maintaining high diversity and making the best use of the resources available.
- Work closely with other teams such as Pretraining, Posttraining, Evals, and Product to ensure short feedback loops on the quality of the models delivered.
- Suggest, conduct, and analyze data ablations and training experiments that aim to improve the quality of the generated datasets via quantitative insights.

SKILLS & EXPERIENCE

- Strong machine learning and engineering background
- Experience with Large Language Models (LLMs), including:
  - Understanding of transformer ... (truncated, view full listing at source)