StarExec: A Web Service for Evaluating Logic Solvers

Abstract:

Ongoing breakthroughs in nationally important research areas like Verification and Artificial Intelligence depend on continuing advances in high-performance automated theorem proving tools. The typical use of these tools is as backends: application problems are translated by an application tool into (typically very large and complex) logic formulas, which are then handed off to a logic solver. Different tradeoffs between linguistic expressiveness and the difficulty of solving the resulting problems give rise to different logics. Solver communities, formed around these different logics, have developed their own community research infrastructures to encourage innovation and ease adoption of their solver technology. Such infrastructure includes standard formats for logic formulas, libraries of benchmark formulas, and regular solver competitions to spur solver advances. The goal of the StarExec project is to develop shared logic-solving infrastructure that can be used by many different solver communities. This infrastructure consists of a custom web service interfacing with a cluster of compute nodes. StarExec provides solver community organizers the ability to manage their benchmark libraries and reserve compute nodes for events like competitions or evaluations. Solver community members can upload solvers and benchmarks to StarExec and execute jobs comparing solvers on benchmark formulas. StarExec also provides fine-grained access control features that support collaboration among organizers and regular users. Results of jobs can be viewed graphically on the system, or downloaded in spreadsheet format for subsequent analysis or custom display. The current status of the project is that we have an implementation of the web service up and running at http://www.starexec.org, backed by 32 compute nodes. We are in the process of purchasing over 150 more compute nodes, which should be available by January 2014.
We also hosted two public events on StarExec during the summer of 2013: the CoCo 2013 Confluence Competition, and the SMT-EVAL Evaluation of SMT Solvers. Both events ran smoothly on the infrastructure, and the organizers were quite appreciative of the system's capabilities.

Tags:
License: CC-2.5
Submitted by Aaron Stump on