Research Program Associate in AI Safety jobs in Cambridge – Browse 402 openings on RoboApply Jobs

Research Program Associate in AI Safety jobs in Cambridge

Open roles matching “Research Program Associate in AI Safety” with location signals for Cambridge. 402 active listings on RoboApply Jobs.

Showing 1–20 of 402 jobs
Research Program Associate in AI Safety

Cambridge Boston Alignment Initiative

Full-time|$100K/yr - $125K/yr|On-site|Cambridge, Massachusetts

Join the Cambridge Boston Alignment Initiative

The Cambridge Boston Alignment Initiative (CBAI) is a nonprofit organization dedicated to pioneering research and educational initiatives aimed at ensuring a safe and beneficial transition to advanced AI systems. Our mission focuses on producing original research and accelerating AI safety through comprehensive fellowship programs. Since our initial summer fellowship cohort, we have achieved significant milestones, including published papers at prominent conferences such as NeurIPS and ICLR. As we enter 2026, we are poised for rapid growth, planning multiple fellowship cycles and expanding our team significantly.

Refer candidates to us, and if hired, you will receive a $5,000 referral bonus!

Your Role

As a Research Program Associate, you will collaborate closely with Research Managers, mentors, and program leadership to design and refine the frameworks that empower fellows to excel in their research. This is a pivotal program-building position where you will create systems for mentor matching, research goal tracking, progress assessment, and problem-solving support for fellows.

Program Design & Development (0.6 FTE)
- Enhance CBAI's fellow selection process and program deliverables.
- Identify effective outreach channels and manage outreach campaigns for future iterations.
- Develop evaluation frameworks to assess fellow progress and program effectiveness.
- Implement structural improvements based on feedback from fellows, mentors, and research managers.
- Assist in the planning and execution of fellowship events, such as speaker series and poster days.

Fellow & Mentor Experience (0.4 FTE)
- Design and oversee the onboarding process for mentors, ensuring a positive experience.

Mar 31, 2026
Research Manager in AI Safety

Cambridge Boston Alignment Initiative

Full-time|From $100K/yr|On-site|Cambridge, Massachusetts

We are open to hiring for this role at various levels of expertise. For the right candidate, this position can be structured as a Senior Research Manager role, with compensation tailored to experience and the anticipated scope of work, potentially exceeding the listed pay rate.

About the Cambridge Boston Alignment Initiative

The Cambridge Boston Alignment Initiative (CBAI) is a nonprofit research organization dedicated to promoting research and education aimed at facilitating a safe and beneficial transition to advanced AI systems. Our efforts include generating original research and accelerating AI safety initiatives through our fellowship programs. Our first summer fellowship cohort has already published papers at the Mechanistic Interpretability Workshop at NeurIPS and had papers accepted at ICLR, and some fellows have transitioned to roles at Goodfire and Redwood Research. Following a successful launch in 2025, we are poised for rapid expansion in 2026, with plans to host multiple fellowship cycles (Fall, Spring, and Summer), double our fellowship cohort, and quadruple our team size.

Refer candidates to us and earn $5,000 if they are hired.

The Role

In this role, you will collaborate closely with research fellows and their mentors, renowned researchers from Cambridge and beyond, to support pioneering work on interpretability, AI control, formal verification for provably safe AI, evaluations, and various aspects of AI governance and policy. We are looking for research managers with experience in technical research as well as governance and policy research.

Research Management Responsibilities (0.7 FTE)
- Conduct regular one-on-one meetings with fellows to provide constructive feedback on research progress, help them overcome challenges, and coach them through issues such as debugging research methodologies and preparing literature scaffolds, as well as supporting data collection, analysis, and methodology development for experiments and hypothesis testing.
- Offer feedback on fellows' research and help cultivate an environment that encourages rigorous approaches.
- Connect fellows with relevant resources, literature, and opportunities available during and after the fellowship program.
- Facilitate communication between fellows and their mentors to ensure a supportive research ecosystem.

Mar 30, 2026
Lila Sciences
Full-time|$192K/yr - $272K/yr|On-site|Cambridge, MA USA; San Francisco, CA USA

Lila Sciences is forming a dedicated AI safety team to address the unique risks and challenges posed by scientific superintelligence. The company seeks a Senior or Principal Technical Program Manager to guide the operational side of AI safety research, helping to shape how the team approaches complex and evolving problems.

Role Overview

This Technical Program Manager position connects research, engineering, model development, policy, and executive leadership. The work involves translating fast-moving research into structured, accountable plans. While this is not a research role, curiosity about the technical aspects of AI safety is important. The team values clear communication and the ability to bring clarity and structure as the organization expands.

What You Will Do
- Act as the primary communication link between the AI safety team and technical, research, and scientific groups.
- Share complex results and coordinate resource needs.
- Establish information flows to keep teams connected.
- Promote accountability within cross-functional, distributed teams, building consensus and trust through open communication and sound judgment.
- Support rapid experimentation and iteration by refining and applying effective program management practices.
- Create clear documentation and reports to communicate vision, track progress, and ensure alignment with company objectives.
- Accurately represent program status and risks, even in uncertain or shifting situations.

Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, Life Sciences, or a related discipline.
- Minimum of 6 years of program or project management experience in technology or life sciences.
- Demonstrated success in program management, leading cross-functional teams, and delivering projects.
- Strong analytical and problem-solving abilities, with skill in turning technical requirements into actionable plans.
- Excellent written and verbal communication skills, including experience preparing executive-level documents, roadmaps, and updates.

Location

This position is based in Cambridge, MA or San Francisco, CA, USA.

Apr 24, 2026
Lila Sciences
Full-time|$176K/yr - $304K/yr|On-site|Cambridge, MA USA; London, UK; San Francisco, CA USA

Your Impact at Lila

Join our dynamic and innovative AI safety team at Lila Sciences, where we prioritize talent and agency to mitigate risks associated with scientific superintelligence. Our mission is to craft and execute a tailored safety strategy that aligns with our unique objectives and deployment methods. This role involves creating technical safety strategies, engaging with the broader scientific community, and producing critical technical documentation, including evaluations focused on risk and capability assessments.

What You’ll Be Creating
- Design and implement capability evaluations to assess scientific risks, particularly from cutting-edge scientific models integrated with automated physical laboratories across the biological and physical sciences.
- Lead and coordinate threat modeling sessions with both internal and external scientific experts, keeping abreast of emerging technologies and use cases.
- Develop and manage high-quality training and testing datasets for evaluations and safety systems.
- Analyze risks associated with Lila’s capabilities and their interactions with the broader ecosystem of general-purpose frontier models and specialized scientific tools.
- Contribute to high-quality research initiatives focused on scientific capability evaluation and restriction as needed.
- Assist with external communications regarding Lila’s safety initiatives.

What You’ll Need to Succeed
- A PhD in the biological sciences (e.g., molecular biology, virology, computational biology) or physical sciences (e.g., materials science, physics, chemistry, or chemical engineering), or similar experience.
- Proficiency in scientific computing related to the biological or physical sciences.
- Familiarity with dual-use research and dissemination issues within relevant safety, regulatory, and governance frameworks (e.g., export controls, biological and chemical conventions).
- Exceptional communication skills to convey complex technical concepts to non-expert audiences effectively.
- Proven ability to lead internal and external teams in developing Lila's perspective on biological and physical risks.
- Demonstrated capacity to collaborate with cross-functional stakeholders (science, AI, product, policy) in a complex environment.

Mar 4, 2026
Lila Sciences
Full-time|$268K/yr - $384K/yr|On-site|Cambridge, MA USA; London, UK; San Francisco, CA USA

Your Contribution at Lila

At Lila, we are assembling a highly skilled and proactive AI safety team that will collaborate with all core departments, including science, model training, and lab integration, to effectively address risks associated with scientific superintelligence. The primary mission of this team is to develop and execute a tailored safety strategy that aligns with Lila's unique objectives and deployment methodologies. This will encompass formulating technical safety strategies, engaging with the broader ecosystem, and producing technical documentation such as risk and capability assessments and safety measures.

Your Responsibilities
- Establish the research and development strategy for Lila’s safety framework concerning biological and physical risks.
- Design and implement capability evaluations to identify scientific risks (both recognized and novel) arising from state-of-the-art scientific models integrated with automated physical laboratories across the biological and physical sciences.
- Lead and coordinate threat modeling sessions with both internal and external scientific experts, including monitoring advancements in technologies and their applications.
- Create and curate high-quality training and testing datasets for evaluations and safety systems.
- Assess risks linked to Lila’s capabilities, considering interactions with the broader ecosystem of capabilities (including general-purpose frontier models and specialized scientific tools).
- Contribute to extensive, high-quality research initiatives when needed for scientific capability evaluation and restriction.
- Engage in external communications regarding Lila’s safety initiatives.

Qualifications for Success
- A PhD in a biological sciences field (e.g., molecular biology, virology, computational biology) or a physical sciences field (e.g., materials science, physics, chemistry, chemical or nuclear engineering), or equivalent experience.
- A proven track record in setting research directions for open issues surrounding dual-use risks in the biological and physical sciences.
- Experience in scientific computing within the biological or physical sciences.
- Understanding of dual-use research and dissemination issues in relation to relevant safety, regulatory, and governance frameworks (e.g., export controls, biological and chemical-related conventions).
- Excellent communication skills, capable of articulating complex technical concepts to non-specialist audiences.
- Demonstrated leadership in guiding teams of internal and external collaborators in developing Lila's perspective on biological and physical risks.

Mar 4, 2026
Lila Sciences
Full-time|$228K/yr - $358K/yr|On-site|Cambridge, MA USA; London, UK; San Francisco, CA USA

Your Contribution at Lila

At Lila, we are assembling a dynamic and empowered AI safety team dedicated to proactively addressing the potential risks associated with scientific superintelligence. This team will collaborate closely with all core departments, including science, model training, and lab integration, to craft a customized safety strategy that aligns with our unique objectives and deployment methods. Key responsibilities will encompass the development of technical safety strategies, engagement with the broader ecosystem, and the creation of essential technical documentation, including risk assessments and capability evaluations.

Your Key Responsibilities
- Design and execute evaluations to identify scientific risks, focusing on both established and emerging threats, from state-of-the-art scientific models integrated with automated physical laboratories.
- Develop initial proof-of-concept safety measures, such as machine learning models designed to detect and mitigate unsafe behaviors from scientific AI models and physical laboratory outputs.
- Gain a comprehensive understanding of various model capabilities, primarily within scientific contexts but also extending to non-scientific domains (e.g., persuasion, deception), to shape Lila's overarching safety strategy.
- Engage in high-quality research initiatives as needed to evaluate and restrict scientific capabilities effectively.

Qualifications for Success
- A Bachelor's degree in a relevant technical field (e.g., computer science, engineering, machine learning, mathematics, physics, statistics) or equivalent experience.
- Proficient programming skills in Python and hands-on experience with machine learning frameworks (such as Inspect) for large-scale evaluations and structured testing.
- Demonstrated experience in constructing evaluations or conducting red-teaming exercises pertaining to CBRN/cyber risks or frontier model capabilities, encompassing both unsafe and benign attributes.
- Background in designing and/or implementing AI safety frameworks at cutting-edge AI enterprises.
- Exceptional ability to communicate intricate technical concepts and issues to audiences without technical expertise.

Desirable Qualifications
- A Master’s or PhD in a field pertinent to safety evaluations of AI models within scientific areas, or another technical discipline.
- Publications in AI safety, evaluations, or model behavior at leading ML/AI conferences (such as NeurIPS, ICML, ICLR, ACL) or in model release documentation.
- Experience exploring risks arising from novel scientific advancements (e.g., biosecurity, computational biology) or utilizing specialized scientific tools (e.g., large-scale foundational models in science).

Mar 4, 2026
Research Associate

Integrated Resources, Inc.

Full-time|On-site|Cambridge

We are seeking a detail-oriented and motivated Research Associate to join our dynamic team at Integrated Resources, Inc. As a Research Associate, you will play a pivotal role in conducting research, analyzing data, and supporting various projects that contribute to the advancement of our organization’s goals. Your responsibilities will include collaborating with cross-functional teams, preparing reports, and presenting findings to stakeholders. This position offers an excellent opportunity for professional growth and development in a fast-paced environment.

Apr 22, 2017
Associate Research Scientist

Integrated Resources Inc.

Full-time|On-site|Cambridge

Join our team at Integrated Resources Inc. as an Associate Research Scientist, where you will contribute to cutting-edge research projects that impact real-world applications. We are looking for passionate individuals eager to innovate and collaborate in a dynamic environment. In this role, you will support senior scientists in the design and execution of experiments, analyze data, and contribute to publications. This is a fantastic opportunity to grow your career in scientific research.

Jun 9, 2015
Senior Research Associate

Integrated Resources, Inc.

Full-time|On-site|Cambridge

We are seeking a highly motivated Senior Research Associate to join our dynamic team at Integrated Resources, Inc. In this pivotal role, you will conduct advanced research, analyze data, and contribute to innovative projects that drive our mission forward. If you are passionate about scientific research and eager to make an impact, we want to hear from you!

Aug 22, 2017
Senior Research Associate

Integrated Resources Inc.

Full-time|On-site|Cambridge

We are seeking a talented and dedicated Senior Research Associate to join our dynamic team in Cambridge. In this role, you will contribute to innovative research projects and collaborate with interdisciplinary teams to drive impactful findings.

Aug 21, 2014
Harvard University
Full-time|On-site|Cambridge

Harvard University is seeking an innovative and strategic Associate Director of Research to lead groundbreaking research initiatives. The ideal candidate will possess a deep understanding of research methodologies, strong leadership skills, and a passion for academic excellence. This role involves collaborating with faculty and students to enhance research output and drive impactful scholarship.

Feb 13, 2026
Harvard University
Full-time|On-site|Cambridge

About the Organization

The Middle East Initiative (MEI) at the Belfer Center for Science and International Affairs serves as Harvard University’s leading platform for policy-relevant research and education related to the contemporary Middle East and North Africa. By integrating scholarly research with policy analysis, executive and graduate education, and community involvement, MEI strives to influence public policy and enhance capacity in the Middle East, ultimately improving the lives of its diverse populations.

As part of the Belfer Center, MEI is committed to fostering a diverse and inclusive environment as a fundamental aspect of our mission. We uphold principles of equality and do not discriminate based on race, color, creed, national or ethnic origin, age, sex, gender identity, sexual orientation, marital or parental status, disability, income, or veteran status.

About the Position

MEI is on the lookout for a dynamic and organized Program Coordinator for Fellowships and Research Administration. This role is pivotal in administering MEI’s fellowship programs, research awards, and associated activities. Reporting directly to the Associate Director, the Program Coordinator is responsible for executing vital programs and ensuring operational efficiency while providing faculty, students, and scholars with opportunities to enhance their engagement with the MENA region. This position is suitable for early- to mid-career professionals with 3-5 years of relevant experience who are eager to contribute to these mission-driven initiatives within an academic setting.

Job-Specific Responsibilities

Fellowships:
- Oversee fellowship advertising, recruitment, and nomination processes in line with program priorities under the guidance of the MEI Associate Director.
- Manage application, selection, nomination, and hiring processes on behalf of MEI, collaborating closely with MEI leadership and relevant faculty committees.
- Coordinate the annual fellowship renewal process in partnership with the MEI Associate Director.
- Draft fellowship letters and related correspondence on behalf of the MEI Faculty Chair.
- Collaborate with the Harvard International Office to facilitate visa procurement for fellows when required.
- Provide logistical support for fellows’ onboarding and arrival.
- Maintain records of fellowship status, evaluations, and alumni updates.
- Review fellowship applications and make recommendations to MEI leadership and/or relevant committees as needed.

Research Administration:
- Administer application and award processes for MEI Faculty Research Awards and MEI Student Research & Internship funding opportunities, including collaboration with MEI leadership and relevant faculty committees.

Feb 24, 2026
Lila Sciences
Full-time|On-site|Cambridge, MA USA

AI Resident – 2026 Cohort

The AI Residency Program presents a unique, full-time research opportunity aimed at connecting cutting-edge academic research with practical industry applications, specifically in the realm of AI for materials science. As a resident, you will collaborate closely with leading scientists and engineers at Lila Sciences on impactful, open-science projects, allowing you to either delve into fundamental research or apply innovative solutions to real-world challenges.

Duration: 6–12 months (with the possibility of extension)
Start Dates: Initial cohort members will begin in January 2026, with rolling applications and additional intakes scheduled for Summer and Fall 2026.
Cohort Size: A select group of talented residents
Mentorship: Dedicated pairing with technical mentors and constructive feedback from diverse cross-functional teams
Resources: Access to proprietary datasets, high-performance computing resources, and Lila’s comprehensive research infrastructure

Research areas of focus include ML-accelerated simulations, Bayesian methods, representation learning, generative models, agentic science, and ML-driven automation.

Mar 12, 2026
Graphcore
Full-time|On-site|Cambridge, UK

About Graphcore

At Graphcore, we are pioneering the future of artificial intelligence computing. Our team comprises semiconductor, software, and AI specialists with extensive expertise in developing the complete AI compute stack, from silicon and software to large-scale infrastructure. As a proud member of the SoftBank Group, we benefit from substantial long-term investments, enabling us to contribute essential technology to the rapidly evolving SoftBank AI ecosystem. To capture the immense potential of AI, Graphcore is expanding globally, uniting the brightest minds to tackle the most challenging problems, where every individual is empowered to make a significant impact on our company, our products, and the future of AI.

Job Summary

As a Research Scientist at Graphcore, you will play a vital role in advancing AI research by exploring innovative ideas that address significant AI/ML challenges. The evolution of AI has been primarily driven by specialized hardware over the past decade, and we believe that developing hardware-aware AI algorithms and AI-optimized hardware will remain crucial for progress in this exciting domain. We seek candidates who are not only curious scientists but also proficient engineers, equipped with both the theoretical knowledge and practical skills essential for impactful AI research. We welcome applicants with experience in low-power, edge, and embodied AI applications, including robotics, autonomous vehicles, and augmented/virtual reality. Your expertise will contribute to the training and deployment of multimodal AI models in these contexts, focusing on areas such as world models, real-time computer vision, and reasoning over audio and video streams.

The Team

The Graphcore Research team engages in both fundamental and applied research to define the computational needs of machine intelligence and showcase how hardware advancements can lead to the next generation of innovative AI models. We actively publish in leading AI/ML conferences (NeurIPS, ICML, ICLR) and participate in specialized workshops while collaborating with various research teams and organizations globally. We take pride in fostering a supportive and collaborative environment, where we organize ourselves around individual research interests to collectively solve challenges in domains such as efficient computation, model scaling, and distributed training and inference of AI models across multiple modalities and applications, including sequence and graph-based data. Our teams are spread across London, Cambridge, and Bristol, with projects and discussions that involve all locations.

Mar 13, 2026
Drug Safety Associate

Integrated Resources Inc.

Full-time|On-site|Cambridge

As a Drug Safety Associate at Integrated Resources Inc., you will play a crucial role in ensuring the safety and efficacy of pharmaceutical products. You will be responsible for monitoring and evaluating adverse events, coordinating safety reports, and contributing to the overall pharmacovigilance efforts within our organization. Your attention to detail and analytical skills will be vital in supporting our commitment to patient safety.

Feb 4, 2016
rai
Full-time|On-site|Cambridge, MA

Our Mission

At rai, we are dedicated to addressing the most pressing and foundational challenges in Artificial Intelligence and Robotics. Our goal is to pave the way for future generations of intelligent machines that enhance our daily lives.

Position Overview

We are seeking passionate and innovative Research Scientists with substantial hands-on research experience in one or more of the following areas: Cognitive AI, Athletic AI, Organic Hardware Design, or Robot Ethics. If you're enthusiastic about advancing robotic technology and its applications to improve functionality and effectiveness, we invite you to join our team!

Oct 5, 2022
Senior Neurology Research Associate

Integrated Resources Inc.

Full-time|On-site|Cambridge

Join our dynamic team at Integrated Resources Inc. as a Senior Neurology Research Associate, where you will play a pivotal role in advancing groundbreaking research in neurology. You will collaborate with top-tier professionals, contributing to innovative projects that drive the future of neurological health.

Jun 8, 2015
Pharmaceutical Research Associate

Integrated Resources, Inc.

Full-time|On-site|Cambridge

Join Integrated Resources, Inc. as a Pharmaceutical Research Associate, where you will play a vital role in advancing medical research and drug development. You will collaborate with a team of scientists and researchers to assist in the design and execution of clinical trials, ensuring the integrity and accuracy of data collection. Your contributions will help pave the way for innovative therapies that make a difference in patients' lives. Ideal candidates are detail-oriented, possess strong analytical skills, and have a passion for the pharmaceutical industry.

Jul 13, 2017
Flagship Pioneering, Inc.
Full-time|$81K/yr - $126.5K/yr|On-site|Cambridge, MA USA

COMPANY DESCRIPTION

We are a pioneering start-up dedicated to transforming the landscape of chemical discovery. Our innovative platform integrates artificial intelligence with a state-of-the-art laboratory discovery pipeline to redefine molecular formulation development. Our interdisciplinary team is exploring uncharted chemical territories, facilitating significant advancements in drug delivery, agricultural formulations, and sustainable, high-performance industrial chemicals. Backed by Flagship Pioneering, a leader in biotechnology origination, we have fostered over 115 scientific ventures in 25 years, generating more than $20 billion in aggregate value, 500+ patents, and over 50 clinical trials for groundbreaking therapeutic agents.

THE ROLE

We are looking for a passionate and skilled Senior Research Associate or Associate Scientist to enhance our formulation discovery platform by characterizing biomolecules. This position involves utilizing a range of biophysical and biochemical methodologies to analyze the effects of formulations on biomolecular integrity, aggregation, and functional stability. You will collaborate closely with chemists and machine-learning experts to produce high-quality experimental data that drives formulation design and predictive modeling initiatives. The ideal candidate will possess a solid background in biochemistry, structural biology, or biopolymer engineering, complemented by practical experience with analytical instrumentation for biomolecular characterization.

KEY RESPONSIBILITIES
- Conduct experiments assessing biomolecular stability and solubility across various formulation conditions.
- Evaluate cargo structural integrity and folding using circular dichroism (CD) spectroscopy and complementary biophysical techniques.
- Measure biomolecular interactions and stability through spectroscopy, dynamic light scattering, and electrophoretic methods.
- Prepare samples for diverse external workflows to evaluate molecular integrity and degradation.

EXPERIMENTAL PIPELINE SUPPORT
- Collaborate with formulation chemists to investigate cargo-formulation interactions across different experimental contexts.
- Generate high-quality experimental datasets to assist internal modeling and data analysis processes.
- Support laboratory instrumentation maintenance and troubleshooting.
- Maintain organized laboratory records, analyze results, and communicate findings effectively to cross-functional teams.

Mar 23, 2026
AbbVie
Full-time|On-site|Cambridge

AbbVie is seeking a highly skilled Associate Director to lead our Safety Operations Portfolio. This pivotal role involves overseeing safety operations while ensuring compliance with regulatory requirements. The ideal candidate will possess exceptional leadership skills and a strong background in pharmacovigilance. Join our dynamic team and contribute to advancing healthcare solutions that enhance patient safety and improve health outcomes. This is an exciting opportunity for professionals looking to make a meaningful impact in the pharmaceutical industry.

Apr 7, 2026
