Assured Autonomy Tools Portal

AdvoCATE (Assurance Case Automation Toolset) supports the development and management of safety/assurance cases.
CSAF is an analysis framework to describe, simulate, analyze and verify learning enabled closed loop control systems.
Copilot is a stream-based, domain-specific language, embedded in Haskell, for generating hard real-time C code for runtime monitors.
Marabou is a framework for verifying deep neural networks. It can answer queries about a network's properties by transforming them into constraint satisfaction problems, accommodate networks with different activation functions and topologies, and perform high-level reasoning on the network to reduce the search space.
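The constraint-satisfaction reduction Marabou performs can be illustrated with a deliberately tiny example. This is a conceptual sketch in plain Python, not Marabou's actual API: a one-neuron ReLU "network" and a satisfiability-style query answered by bounding the two ReLU phases.

```python
def affine_bounds(a, b, lo, hi):
    """Exact bounds of the affine pre-activation a*x + b over x in [lo, hi]."""
    v0, v1 = a * lo + b, a * hi + b
    return min(v0, v1), max(v0, v1)

def relu_output_bounds(a, b, lo, hi):
    """Bounds of relu(a*x + b): the 'inactive' phase clamps to 0, so both
    bounds are clamped below at 0 (a disjunctive case split that
    verifiers like Marabou manage lazily at scale)."""
    zl, zh = affine_bounds(a, b, lo, hi)
    return max(zl, 0.0), max(zh, 0.0)

def query_exists_output_above(a, b, lo, hi, threshold):
    """Satisfiability query: can relu(a*x + b) exceed `threshold` for some
    x in [lo, hi]?  Exact for this one-neuron network."""
    _, ub = relu_output_bounds(a, b, lo, hi)
    return ub > threshold

# Query: does relu(2x - 1) on x in [0, 1] ever exceed 0.9?  (Yes: x -> 1.)
print(query_exists_output_above(2.0, -1.0, 0.0, 1.0, 0.9))  # True
```

For real networks the case splits multiply exponentially, which is why Marabou combines them with simplex-style reasoning rather than exhaustive enumeration.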

The goal of the Assured Autonomy program is to create technology for continual assurance of Learning-Enabled, Cyber Physical Systems (LE-CPSs).

To ground the Assured Autonomy research objectives, the program will prioritize challenge problems in the militarily relevant autonomous vehicle space. However, the tools, toolchains, and algorithms created are expected to apply to other LE-CPSs as well. The program's resulting technology will take the form of a set of publicly available tools, integrated into LE-CPS design toolchains, for use in the commercial and defense sectors.

  T O O L S  

Vanderbilt NNV Design Studio is an exploration environment for the Neural Network Verification (NNV) tool from Verivital Labs, Vanderbilt University.


HumANav is a framework for navigation around humans combining learning-based perception with model-based optimal control.
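The combination HumANav's description names, learned perception feeding model-based control, can be sketched in a few lines. Everything below is a hypothetical stand-in, not HumANav's code: the "learned" module is a stub, and the controller is a plain proportional tracker.

```python
import math

def perceive_waypoint(robot_xy, human_xy, goal_xy, clearance=1.0):
    """Stand-in for a learned perception module: propose a unit-step
    waypoint toward the goal, sidestepping if it comes within
    `clearance` of the human (a crude proxy for learned avoidance)."""
    gx, gy = goal_xy[0] - robot_xy[0], goal_xy[1] - robot_xy[1]
    norm = math.hypot(gx, gy) or 1.0
    wx, wy = robot_xy[0] + gx / norm, robot_xy[1] + gy / norm
    if math.hypot(wx - human_xy[0], wy - human_xy[1]) < clearance:
        wy += clearance
    return wx, wy

def track(robot_xy, waypoint, gain=0.5):
    """Model-based part: a proportional controller stepping toward the waypoint."""
    return (robot_xy[0] + gain * (waypoint[0] - robot_xy[0]),
            robot_xy[1] + gain * (waypoint[1] - robot_xy[1]))

# One perception/control cycle:
robot = (0.0, 0.0)
wp = perceive_waypoint(robot, human_xy=(0.5, 0.0), goal_xy=(5.0, 0.0))
robot = track(robot, wp)
```

The split matters because the learned half can be retrained without touching the controller, whose behavior stays analyzable.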


KeYmaera X is a hybrid systems theorem prover that can analyze safety properties of a control program or learned policy in a physical model, combining offline proofs with verified runtime monitoring via ModelPlex.
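The offline-proof/online-monitor split that ModelPlex implements can be sketched as follows. This is a hypothetical illustration in Python, not monitor code generated by KeYmaera X: the braking model, monitor condition, and tolerance are all assumptions for the example.

```python
def safety_margin(p, v, B, stop_line):
    """Braking-distance invariant from an assumed hybrid model: the car at
    position p with speed v and braking power B must be able to stop
    before stop_line, i.e. p + v^2/(2B) <= stop_line."""
    return stop_line - (p + v * v / (2.0 * B))

def model_monitor(prev, curr, dt, B, stop_line, tol=1e-6):
    """Return True iff the observed transition prev -> curr both stays
    inside the model's dynamics envelope and preserves the safety
    invariant.  On False, control should fall back to a verified safe
    action -- the runtime-assurance pattern the offline proof justifies."""
    p0, v0 = prev
    p1, v1 = curr
    # Position may not have moved faster than the larger observed speed allows.
    dynamics_ok = abs(p1 - p0) <= max(v0, v1) * dt + tol
    invariant_ok = safety_margin(p1, v1, B, stop_line) >= 0.0
    return dynamics_ok and invariant_ok

# A compliant step: 10 m/s for 0.1 s, far from the stop line.
print(model_monitor((0.0, 10.0), (1.0, 10.0), 0.1, 5.0, 100.0))  # True
```

The point of ModelPlex is that the real monitor condition is *derived and proved* from the hybrid model, rather than hand-written like this sketch.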


Verisig Design Studio is a tool for verifying safety properties of closed-loop systems with neural network components.


Scenic is a domain-specific probabilistic programming language for modeling the environments of cyber-physical systems like robots and autonomous cars.
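A minimal sketch of the idea behind Scenic, written in plain Python rather than Scenic itself: declare distributions over scene parameters, impose hard constraints (Scenic's `require`), and draw concrete scenes by rejection sampling. All parameter names and ranges here are illustrative assumptions.

```python
import random

def sample_scene(rng, max_tries=1000):
    """Sample a simple driving scene: an ego car and a nearby pedestrian."""
    for _ in range(max_tries):
        ego_speed = rng.uniform(5.0, 15.0)    # m/s
        ped_offset = rng.gauss(20.0, 5.0)     # metres ahead of the ego
        ped_crossing = rng.random() < 0.7     # is the pedestrian crossing?
        # Hard constraint (a 'require' in Scenic): the pedestrian must be
        # ahead of the ego and within assumed sensor range.
        if 5.0 <= ped_offset <= 40.0:
            return {"ego_speed": ego_speed,
                    "ped_offset": ped_offset,
                    "ped_crossing": ped_crossing}
    raise RuntimeError("constraints too tight: no valid scene found")

rng = random.Random(0)
scenes = [sample_scene(rng) for _ in range(100)]
```

Scenic's contribution is letting you write the distributions and constraints declaratively, with spatial and temporal relations, instead of coding samplers like this by hand.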


VerifAI is a software toolkit for the formal design and analysis of systems that include artificial intelligence (AI) and machine learning (ML) components.
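The core loop of simulation-based falsification, as toolkits like VerifAI organize it, fits in a few lines. The toy system under test and the specification below are assumptions for illustration, not part of VerifAI.

```python
import random

def simulate(brightness, distance):
    """Toy stand-in for a simulator plus ML perception stack: detection
    degrades in low light and at long range."""
    score = brightness - 0.02 * distance
    return score > 0.3   # True = obstacle detected in time

def falsify(num_samples=200, seed=0):
    """Sample environment parameters, run the simulation, check the
    specification, and collect counterexamples for debugging/retraining."""
    rng = random.Random(seed)
    counterexamples = []
    for _ in range(num_samples):
        env = {"brightness": rng.uniform(0.0, 1.0),
               "distance": rng.uniform(5.0, 50.0)}
        if not simulate(**env):          # spec: detection must succeed
            counterexamples.append(env)
    return counterexamples

cexs = falsify()
```

In a real toolchain the random sampler is replaced by smarter search (e.g. optimization over a quantitative specification robustness), and the counterexamples feed debugging and retraining.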

  P U B L I C A T I O N S  

Run-Time Assurance for Learning-Enabled Systems

A Programmatic and Semantic Approach to Explaining and Debugging Neural Network Based Object Detectors

Scenic: a language for scenario specification and scene generation

Verisig: verifying safety properties of hybrid systems with neural network controllers

OVERT: Verification of nonlinear dynamical systems with neural network controllers via overapproximation

Improved Geometric Path Enumeration for Verifying ReLU Neural Networks

Formal Scenario-Based Testing of Autonomous Vehicles: From Simulation to the Real World

Combining optimal control and learning for visual navigation in novel environments

  N E W S  

  E V E N T S  
* no upcoming events found

Supported by DARPA Assured Autonomy Program Fundamental Research - Contract FA8750-18-C-0089