TACAS 2014
Competition on Software Verification (SV-COMP)

Results of the Competition

This web page presents the results of the 3rd International Competition on Software Verification (SV-COMP'14), held at TACAS 2014. The competition and its results are described in more detail in the competition report. The benchmarks and rules are also available from the competition web site.

All Results

For each competition candidate and category, the results list the score together with the CPU time (rounded to two significant digits) for successful runs. On the original web page, the background color of a cell is gold for the winner, silver for the second, and bronze for the third place.

The entry '--' means that the competition candidate opted out of that category. The definitions of the scoring schema and of the categories are given on the respective SV-COMP web pages.
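For reference, rounding to two significant digits can be sketched as follows. This is an illustration only, not the official evaluation script, and the sample values are made up:

```python
import math

def round_sig(x: float, sig: int = 2) -> float:
    """Round x to `sig` significant digits (0 stays 0)."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, sig - 1 - exponent)

# Sample values in the style of the result tables:
print(round_sig(2314.7))  # 2300.0
print(round_sig(687.2))   # 690.0
print(round_sig(5.678))   # 5.7
```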

Competition candidates, with the representing jury member and affiliation:
BLAST 2.7.2: Vadim Mutilin (Moscow, Russia)
CBMC: Michael Tautschnig (London, UK)
CPAchecker: Stefan Löwe (Passau, Germany)
CPAlien: Petr Muller (Brno, Czechia)
CSeq-Lazy: Bernd Fischer (Stellenbosch, South Africa)
CSeq-MU: Gennaro Parlato (Southampton, UK)
ESBMC 1.22: Lucas Cordeiro (Manaus, Brazil)
FrankenBit: Arie Gurfinkel (Pittsburgh, USA)
LLBMC: Stephan Falke (Karlsruhe, Germany)
Predator: Tomas Vojnar (Brno, Czechia)
Symbiotic 2: Jiri Slaby (Brno, Czechia)
Threader: Corneliu Popeea (Munich, Germany)
UFO: Aws Albarghouthi (Pittsburgh, USA)
Ultimate Automizer: Matthias Heizmann (Freiburg, Germany)
Ultimate Kojak: Alexander Nutz (Freiburg, Germany)
BitVectors
49 tasks, max. score: 86
CBMC: 86 points, 2 300 s
CPAchecker: 78 points, 690 s
ESBMC 1.22: 77 points, 1 500 s
LLBMC: 86 points, 39 s
Predator: -92 points, 28 s
Symbiotic 2: 39 points, 220 s
Ultimate Kojak: -23 points, 1 100 s
Opted out: BLAST 2.7.2, CPAlien, CSeq-Lazy, CSeq-MU, FrankenBit, Threader, UFO, Ultimate Automizer
Concurrency
78 tasks, max. score: 136
CBMC: 128 points, 29 000 s
CPAchecker: 0 points, 0.0 s
CSeq-Lazy: 136 points, 1 000 s
CSeq-MU: 136 points, 1 200 s
ESBMC 1.22: 32 points, 30 000 s
LLBMC: 0 points, 0.0 s
Predator: 0 points, 0.0 s
Symbiotic 2: -82 points, 5.7 s
Threader: 100 points, 3 000 s
Ultimate Kojak: 0 points, 0.0 s
Opted out: BLAST 2.7.2, CPAlien, FrankenBit, UFO, Ultimate Automizer
ControlFlow
843 tasks, max. score: 1261
BLAST 2.7.2: 508 points, 32 000 s
CBMC: 397 points, 42 000 s
CPAchecker: 1009 points, 9 000 s
CPAlien: 455 points, 6 500 s
ESBMC 1.22: 949 points, 35 000 s
FrankenBit: 986 points, 6 300 s
LLBMC: 961 points, 13 000 s
Predator: 511 points, 3 400 s
Symbiotic 2: 41 points, 39 000 s
UFO: 912 points, 14 000 s
Ultimate Automizer: 164 points, 6 000 s
Ultimate Kojak: 214 points, 5 100 s
Opted out: CSeq-Lazy, CSeq-MU, Threader
ControlFlowInteger
181 tasks, max. score: 255
BLAST 2.7.2: 64 points, 7 800 s
CBMC: -298 points, 35 000 s
CPAchecker: 179 points, 4 800 s
CPAlien: 121 points, 3 400 s
ESBMC 1.22: 85 points, 24 000 s
FrankenBit: 149 points, 5 300 s
LLBMC: 74 points, 10 000 s
Predator: -28 points, 2 200 s
Symbiotic 2: -151 points, 22 000 s
UFO: 184 points, 9 500 s
Ultimate Automizer: 33 points, 5 800 s
Ultimate Kojak: 57 points, 5 000 s
Opted out: CSeq-Lazy, CSeq-MU, Threader
Loops
65 tasks, max. score: 99
BLAST 2.7.2: 25 points, 320 s
CBMC: 99 points, 1 100 s
CPAchecker: 68 points, 600 s
CPAlien: -16 points, 91 s
ESBMC 1.22: 88 points, 3 600 s
FrankenBit: 76 points, 50 s
LLBMC: 95 points, 160 s
Predator: 27 points, 14 s
Symbiotic 2: 26 points, 4.9 s
UFO: 44 points, 44 s
Ultimate Automizer: 26 points, 170 s
Ultimate Kojak: 29 points, 150 s
Opted out: CSeq-Lazy, CSeq-MU, Threader
ProductLines
597 tasks, max. score: 929
BLAST 2.7.2: 639 points, 24 000 s
CBMC: 918 points, 6 600 s
CPAchecker: 928 points, 3 500 s
CPAlien: 715 points, 3 100 s
ESBMC 1.22: 928 points, 7 500 s
FrankenBit: 905 points, 950 s
LLBMC: 925 points, 2 600 s
Predator: 929 points, 1 200 s
Symbiotic 2: 347 points, 17 000 s
UFO: 927 points, 4 800 s
Ultimate Automizer: 0 points, 0.0 s
Ultimate Kojak: 0 points, 0.0 s
Opted out: CSeq-Lazy, CSeq-MU, Threader
DeviceDrivers64
1428 tasks, max. score: 2766
BLAST 2.7.2: 2682 points, 13 000 s
CBMC: 2463 points, 390 000 s
CPAchecker: 2613 points, 28 000 s
ESBMC 1.22: 2358 points, 140 000 s
FrankenBit: 2639 points, 3 000 s
LLBMC: 0 points, 0.0 s
Predator: 50 points, 9.9 s
Symbiotic 2: 980 points, 2 200 s
UFO: 2642 points, 5 700 s
Ultimate Kojak: 0 points, 0.0 s
Opted out: CPAlien, CSeq-Lazy, CSeq-MU, Threader, Ultimate Automizer
HeapManipulation
80 tasks, max. score: 135
CBMC: 132 points, 12 000 s
CPAchecker: 107 points, 210 s
CPAlien: 71 points, 70 s
ESBMC 1.22: 97 points, 970 s
LLBMC: 107 points, 130 s
Predator: 111 points, 9.5 s
Symbiotic 2: 105 points, 15 s
Ultimate Kojak: 18 points, 35 s
Opted out: BLAST 2.7.2, CSeq-Lazy, CSeq-MU, FrankenBit, Threader, UFO, Ultimate Automizer
MemorySafety
61 tasks, max. score: 98
CBMC: 4 points, 11 000 s
CPAchecker: 95 points, 460 s
CPAlien: 9 points, 690 s
ESBMC 1.22: -136 points, 1 500 s
LLBMC: 38 points, 170 s
Predator: 14 points, 39 s
Symbiotic 2: -130 points, 7.5 s
Ultimate Kojak: 0 points, 0.0 s
Opted out: BLAST 2.7.2, CSeq-Lazy, CSeq-MU, FrankenBit, Threader, UFO, Ultimate Automizer
Recursive
23 tasks, max. score: 39
CBMC: 30 points, 11 000 s
CPAchecker: 0 points, 0.0 s
ESBMC 1.22: -53 points, 4 900 s
LLBMC: 3 points, 0.38 s
Predator: -18 points, 0.12 s
Symbiotic 2: 6 points, 0.93 s
Ultimate Automizer: 12 points, 850 s
Ultimate Kojak: 9 points, 54 s
Opted out: BLAST 2.7.2, CPAlien, CSeq-Lazy, CSeq-MU, FrankenBit, Threader, UFO
SequentializedConcurrent
261 tasks, max. score: 364
CBMC: 237 points, 47 000 s
CPAchecker: 97 points, 9 200 s
ESBMC 1.22: 244 points, 38 000 s
LLBMC: 208 points, 11 000 s
Predator: -46 points, 7 700 s
Symbiotic 2: -32 points, 770 s
UFO: 83 points, 4 800 s
Ultimate Automizer: 49 points, 3 000 s
Ultimate Kojak: 9 points, 1 200 s
Opted out: BLAST 2.7.2, CPAlien, CSeq-Lazy, CSeq-MU, FrankenBit, Threader
Simple
45 tasks, max. score: 67
BLAST 2.7.2: 30 points, 5 400 s
CBMC: 66 points, 15 000 s
CPAchecker: 67 points, 430 s
ESBMC 1.22: 31 points, 27 000 s
FrankenBit: 37 points, 830 s
LLBMC: 0 points, 0.0 s
Predator: 0 points, 0.0 s
Symbiotic 2: -22 points, 13 s
UFO: 67 points, 480 s
Ultimate Kojak: 0 points, 0.0 s
Opted out: CPAlien, CSeq-Lazy, CSeq-MU, Threader, Ultimate Automizer
Overall
2868 tasks, max. score: 4718
CBMC: 3 501 points, 560 000 s
CPAchecker: 2 987 points, 48 000 s
ESBMC 1.22: 975 points, 280 000 s
LLBMC: 1 843 points, 24 000 s
Predator: -184 points, 11 000 s
Symbiotic 2: -220 points, 42 000 s
Ultimate Automizer: 399 points, 10 000 s
Ultimate Kojak: 139 points, 7 600 s
Opted out: BLAST 2.7.2, CPAlien, CSeq-Lazy, CSeq-MU, FrankenBit, Threader, UFO

Note on meta-categories: The score of a meta-category is not the plain sum of its sub-category scores; it is computed from normalized sub-category scores. The run time is the sum of the run times of the sub-categories, rounded to two significant digits.
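The competition report defines the normalization precisely. Consistent with the numbers above, a meta-category score can be reproduced by normalizing each sub-category score by its task count, summing, and scaling by the average number of tasks per sub-category. A minimal sketch (the function name is ours; the sample data are CPAchecker's ControlFlow sub-category results from the tables above):

```python
def meta_score(sub_results):
    """Meta-category score: normalize each sub-category score by its
    task count, sum the normalized scores, and scale by the average
    number of tasks per sub-category."""
    total_tasks = sum(tasks for _, tasks in sub_results)
    avg_tasks = total_tasks / len(sub_results)
    return avg_tasks * sum(score / tasks for score, tasks in sub_results)

# CPAchecker's (score, tasks) pairs for ControlFlowInteger, Loops,
# and ProductLines, the sub-categories of ControlFlow:
cf = [(179, 181), (68, 65), (928, 597)]
print(round(meta_score(cf)))  # 1009, matching the ControlFlow table
```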

The raw data of the competition are publicly available.

Ranking by Category

BitVectors

1. LLBMC
2. CBMC
3. CPAchecker

Concurrency

1. CSeq-Lazy
2. CSeq-MU
3. CBMC

ControlFlow

1. CPAchecker
2. FrankenBit
3. LLBMC

DeviceDrivers64

1. BLAST 2.7.2
2. UFO
3. FrankenBit

HeapManipulation

1. CBMC
2. Predator
3. LLBMC

MemorySafety

1. CPAchecker
2. LLBMC
3. Predator

Recursive

1. CBMC
2. Ultimate Automizer
3. Ultimate Kojak

SequentializedConcurrent

1. ESBMC 1.22
2. CBMC
3. LLBMC

Simple

1. CPAchecker
2. UFO
3. CBMC

Overall

1. CBMC
2. CPAchecker
3. LLBMC

Demonstration Categories

The demonstration categories of SV-COMP 2014 are defined in the description of demo categories.

As for the main categories, the time limit per verification task is 900 s and the memory limit is 15 GB.
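Such per-task limits can be enforced at the operating-system level; below is a minimal sketch using POSIX resource limits via Python's `resource` module. This is an illustration of the limit values only, not the benchmarking framework actually used by SV-COMP, and the verifier command shown is hypothetical:

```python
import resource
import subprocess

CPU_LIMIT_S = 900         # CPU-time limit per verification task
MEM_LIMIT_B = 15 * 10**9  # 15 GB memory limit

def set_limits():
    # Hard-limit CPU time; the kernel terminates the process on overrun.
    resource.setrlimit(resource.RLIMIT_CPU, (CPU_LIMIT_S, CPU_LIMIT_S))
    # Limit the address space as a proxy for memory consumption.
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_B, MEM_LIMIT_B))

# Hypothetical invocation of a verifier on one task, limits applied
# in the child process before exec:
# subprocess.run(["verifier", "task.c"], preexec_fn=set_limits)
```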

Termination

Each result lists the score (according to the scoring schema of SV-COMP 2014) and the CPU time for solved verification tasks.

Competition candidates: AProVE, FuncTion, T2, Tan, Ultimate

Termination-crafted
47 tasks, max. score: 89
AProVE: 58 points, 360 s
FuncTion: 20 points, 220 s
T2: 46 points, 80 s
Tan: 12 points, 33 s
Ultimate: 57 points, 250 s

Termination-ext
199 tasks, max. score: 265
AProVE: 0 points, 0 s
FuncTion: 0 points, 0 s
T2: 50 points, 64 s
Tan: 23 points, 590 s
Ultimate: 117 points, 4 800 s

Device-Driver Challenges

Competition candidates: CBMC, CPAchecker, ESBMC 1.22

Driver Challenges (solved tasks, max. 15)
CBMC: 3
CPAchecker: 4
ESBMC 1.22: 2