Job offer
Fixed-term contract

Trusted AI Research Engineer F/H



Closing date for applications: 31.08.22


Mathematics, scientific information, software

About the company

The French Atomic Energy and Alternative Energies Commission (CEA) is a public research organization. A major player in research, development and innovation, the CEA carries out four missions:

  • defense and security
  • nuclear energy (fission and fusion)
  • technological research for industry
  • fundamental research (material sciences and life sciences)

With its 16,000 employees - technicians, engineers, researchers, and research support staff - the CEA participates in numerous collaborative projects alongside its academic and industrial partners.

Description of the Department

Within CEA Tech, the technology research arm of the CEA, the List Institute is dedicated to intelligent digital systems with R&D programs in advanced manufacturing, embedded systems, and ambient intelligence. We support our partners in the fields of transport, industry, energy, health, security, and defense, to transfer the technologies resulting from innovation and improve their competitiveness.

Unit description

At the heart of the Paris-Saclay campus, the CEA List Software Safety and Security Laboratory develops analysis tools to verify the safety and security properties of software. Our goal is to ensure confidence in critical and sensitive systems. We are open source supporters and are proud to see our scientific tools used for industrial applications.

Offer description

You will participate in the development of one of the laboratory's tools, PyRAT (Python Reachability Analysis Tool), for the verification of safety and security properties on artificial intelligence-based systems using formal methods such as abstract interpretation:

  • You participate in the various actions related to PyRAT within the framework of the Confiance.AI project.
  • You develop new features for the PyRAT tool and extend the support of existing ones.
  • You contribute to the state of the art on new artificial intelligence techniques and their verification, and to the analysis of how PyRAT could support them.
  • You research new state-of-the-art methods for verifying AI systems and study what they could bring to PyRAT.
  • You support the development of tools related to PyRAT, or of PyRAT front-ends such as AIMOS or CAISAR, and their integration with PyRAT.
  • You improve the visualization features of PyRAT by developing a complete GUI for the tool.

The safety and security of AI-based software is a key objective for new industrial systems, both critical and non-critical, which increasingly rely on AI techniques. Traditional software verification provides a foundation of methods and experience in formal verification that must now be applied to AI.

More generally, AI systems are becoming increasingly diverse: recent advances such as XGBoost, Transformers, and recurrent networks add to older, already varied techniques that are not yet supported, such as decision trees. It is this growing diversity that we must address in order to ensure the safety and security of these new systems.
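To illustrate the kind of verification involved, interval-based abstract interpretation propagates bounds on the inputs through a network, yielding sound bounds on its outputs. The sketch below is a minimal, hypothetical example in NumPy; it is not PyRAT's implementation, and the network weights and perturbation radius are invented for illustration.

```python
import numpy as np

def interval_affine(lo, hi, W, b):
    # Propagate the box [lo, hi] through x -> W @ x + b.
    # Positive weights map lower bounds to lower bounds;
    # negative weights swap the roles of lo and hi.
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    new_lo = W_pos @ lo + W_neg @ hi + b
    new_hi = W_pos @ hi + W_neg @ lo + b
    return new_lo, new_hi

def interval_relu(lo, hi):
    # ReLU is monotone, so it can be applied to the bounds directly.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

# Hypothetical 2-layer ReLU network and an input perturbation ball.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.zeros(2)
W2 = np.array([[1.0, 1.0]]);              b2 = np.zeros(1)
x = np.array([0.5, 0.5]); eps = 0.1

lo, hi = x - eps, x + eps
lo, hi = interval_relu(*interval_affine(lo, hi, W1, b1))
lo, hi = interval_affine(lo, hi, W2, b2)
# Every concrete input in the ball maps to an output inside [lo, hi],
# so a safety property checked on the output interval holds for all of them.
```

Tools like PyRAT refine this basic idea with more precise abstract domains (e.g. zonotopes), since plain intervals lose precision at every layer.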

Candidate profile

  • engineering degree or Master of Science
  • AI basics/experience (TensorFlow/PyTorch/Keras)
  • Python development skills
  • knowledge of GUI development (ReactJS or other)
  • (optional) notions of abstract interpretation/formal methods


Saclay, Palaiseau (France)