Submission
Since SV-COMP 2026, tool submissions consist of the following parts only:
- An archive with the tool, published at Zenodo.
The archive has to meet the following requirements:
- The archive is publicly available and contains a LICENSE that allows reproduction and evaluation of the tool by anybody and does not place any restriction on the tool output (log files, witnesses).
- The archive contains a README file that describes the contents.
- The archive contains a script smoketest.sh that runs the tool on some simple example(s) that are also in the archive.
- The tool is archived in a ZIP file (.zip), which contains exactly one directory (no tarbomb), with the files LICENSE, README, smoketest.sh, and all other files and folders of the tool.
- The tool's output on stdout/stderr does not exceed 2 MB.
- The tool has an option to report its version (ideally --version).
- The archive does not contain large amounts of unnecessary data, such as repository data (.svn, .git), source files, aux folders like __MACOSX, and test files.
- The tool should not require any special software on the competition machines; all necessary libraries and external tools should be contained in the archive. Standard packages that are available as Ubuntu packages can be requested via an entry in the FM-Tools repository.
- The tool should be executable from any path, and should not expect the working directory to match the tool directory.
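To make these requirements concrete, the following sketch builds a toy archive layout with a minimal smoketest.sh. The tool name mytool and all file names are hypothetical, and the stub binary merely stands in for a real verifier; it is a sketch of the required shape, not a definitive template:

```shell
# Hypothetical layout: mytool/ with LICENSE, README, smoketest.sh, bin/, examples/
mkdir -p mytool/bin mytool/examples
printf 'int main() { return 0; }\n' > mytool/examples/simple.c
# Stub standing in for the real tool; it only prints a version string.
printf '#!/bin/sh\necho mytool 1.0\n' > mytool/bin/mytool
chmod +x mytool/bin/mytool
cat > mytool/smoketest.sh <<'EOF'
#!/bin/sh
set -eu
cd "$(dirname "$0")"            # executable from any working directory
./bin/mytool --version          # the required version option
./bin/mytool examples/simple.c  # run on a simple example bundled in the archive
EOF
chmod +x mytool/smoketest.sh
sh mytool/smoketest.sh
```

Zipping the single directory mytool/ (e.g., `zip -r mytool.zip mytool`) then satisfies the "exactly one directory, no tarbomb" requirement.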
- An entry in the FM-Tools repository named <tool_id>.yml.
The structure of the file is described in the repository. In particular, the file has to contain
- a version entry containing the DOI of the archive mentioned above,
- a participation declaration,
- a jury member who is (a) a contributing designer/developer of the submitted tool (witnessed by occurrence of the person’s name on the tool's project web page, a tool paper, or in the revision logs) or (b) authorized by the competition organizer (after the designers/developers of the tool were contacted about the participation),
- the label meta_tool if the tool is a meta-verifier,
- the label ai if the tool uses an LLM or another sophisticated kind of artificial intelligence, and
- a description that briefly explains the principles of the tool and the components it uses.
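Purely for orientation, such an entry could look roughly like the sketch below. The field names here are assumptions for illustration only; the authoritative schema is the one described in the FM-Tools repository:

```yaml
# Illustrative sketch only -- field names are assumptions; see the
# FM-Tools repository for the authoritative schema.
name: MyTool                        # hypothetical tool
versions:
  - version: "1.0"
    doi: 10.5281/zenodo.0000000     # placeholder DOI of the Zenodo archive
competition_participations:
  - competition: SV-COMP 2026       # the participation declaration
jury_member:
  name: Jane Doe                    # contributing designer/developer
labels:
  - meta_tool                       # only if the tool is a meta-verifier
  - ai                              # only if the tool uses an LLM or other AI
description: >
  Short explanation of the principles of the tool
  and the components it uses.
```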
- A tool-info module for BenchExec (to be added to the BenchExec repository). The tool-info module enables BenchExec to abstract from the tool's concrete command-line interface: it assembles the command line, parses the output to determine the result/status, and obtains the tool version.
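As a rough illustration of what such a module does, the sketch below mimics the shape of a tool-info module without depending on BenchExec itself. A real module must subclass benchexec.tools.template.BaseTool2, whose exact method signatures are defined in the BenchExec repository; the tool name, entry point, and output strings here are hypothetical:

```python
# Simplified sketch of a BenchExec tool-info module for a hypothetical
# tool "mytool". Real modules subclass benchexec.tools.template.BaseTool2
# and use its richer method signatures; this stand-alone class only
# illustrates the three responsibilities named above.

class Tool:
    """Tells BenchExec how to call the tool and how to read its output."""

    def executable(self):
        # Locate the tool's entry point inside the unpacked archive.
        return "./bin/mytool"  # hypothetical path

    def version(self, executable):
        # A real module would run `executable --version` and parse the output.
        return "1.0"

    def cmdline(self, executable, options, task_file):
        # Assemble the command line from the configured options and the task.
        return [executable] + options + [task_file]

    def determine_result(self, output):
        # Map the tool's (hypothetical) output lines to a verdict.
        for line in output:
            if "VERIFICATION TRUE" in line:
                return "true"
            if "VERIFICATION FALSE" in line:
                return "false"
        return "unknown"
```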
- A merge request for category-structure.yml stating in which categories the tool wants to participate.
- The tool declares for each meta category (like C.ReachSafety or C.Overall) whether it wants to participate.
- opt-out is used when a tool wants to hide its results for some base or meta categories included in a meta category it participates in. For example, if a tool participates in C.ReachSafety but does not correctly handle floats, it can opt out of C.unreach-call.Floats. The effect is that the tool is still evaluated on all base categories of C.ReachSafety, including C.unreach-call.Floats, but its score for C.unreach-call.Floats is replaced by a dash in the result table on the competition website, and no execution results are reported in that base category. Still, the score is used in the computation of the score for C.ReachSafety.
- opt-in is used when a tool wants to participate in some base categories, but not in the meta category containing them. For example, you can opt in to C.unreach-call.ControlFlow and C.unreach-call.Loops without participating in C.ReachSafety. Similarly, one could opt out of C.Concurrency while participating in C.Overall.
- Additionally, new jury members have to announce their e-mail address to the organizers.
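The two opt-out/opt-in examples above could be declared roughly as in the following sketch. The exact syntax is defined by category-structure.yml itself, so the keys and tool names shown here are assumptions for illustration:

```yaml
# Illustrative sketch only -- see category-structure.yml for the exact syntax.
toolA:                       # hypothetical tool
  participate: [C.Overall]
  opt-out: [C.Concurrency]   # evaluated, but results shown as a dash;
                             # the score still counts toward C.Overall
toolB:                       # hypothetical tool joining two base categories
  opt-in: [C.unreach-call.ControlFlow, C.unreach-call.Loops]
```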
Note that, in contrast to previous years, no paper submission/registration on EasyChair is needed for participation. There will be a separate, optional submission deadline for short papers (called competition contributions) after the competition results are publicly available.
We strongly recommend subscribing to the mailing list and the Zulip stream.
Qualification
To qualify for the competition, each tool needs to have:
- correct flags in <tool_id>.yml in the FM-Tools repository,
- a tool archive that passes the CI checks,
- a license in the tool archive that satisfies the requirements given above,
- conceptual or implementation differences compared to other participating tools, and
- a jury member who contributes to the development of the tool or has approval from the organizers.
Requirements for Competition Contribution Papers
The competition-contribution paper should be structured as follows (the structure is recommended, not mandatory, but the information listed below must be provided):
-
Title, Authors, and Abstract
The format is defined in the usual LNCS style. It is a good idea to mention the name of the tool and/or technique (or a combination thereof) in the title. Please mark the jury member in the paper (e.g., with an asterisk after the name and a footnote).
-
1. Verification/Validation Approach
A short overview of the theory that the tool is based on, a description of the abstract domains and algorithms that are used, and references to the concept papers that describe the technical details.
-
2. Software Architecture
- Libraries and external tools that the tool uses (e.g., parser frontend, SAT solver)
- Software structure and architecture (e.g., components that are used in the competition)
- Implementation technology (e.g., programming language)
-
3. Discussion of Strengths and Weaknesses of the Approach
Evaluation of the results for the benchmark categories: where was the tool successful, where was it not, and why?
-
4. Tool Setup and Configuration
- Download instructions (a public web page from which the tool can be downloaded), including a reference to a precise version of the tool (do not refer to "the latest version" or similar, because that is not stable and not replicable)
- Installation instructions
- Participation statement (a clear statement which categories the tool participates in)
- Configuration definition (there is one global set of parameters for all categories of the benchmark set, a full definition of the parameter set must be provided); check the rules page under Parameters for more details
-
5. Software Project and Contributors
- Contact info (web page of the project, people involved in the project)
- Information about the software project, licensing, development model, institution that hosts the software project, acknowledgement of contributors
-
6. Data-Availability Statement
Publish your tool archive and reference it here via DOI to ensure reproducibility; also provide the URL of the project web site and repository.
References
Bibliographic references to more in-depth papers about the approaches used in the tool.