ICCMA
International Competition on Computational Models of Argumentation

This page is an archived version of the original 2017 competition webpage (date of archiving: 2017-10-23).

Results

Partial results were announced at TAFA 2017. The updated version of the slides includes rankings and scores for all tasks and tracks. Each track corresponds to a semantics (CO: complete, PR: preferred, ST: stable, SST: semi-stable, STG: stage, GR: grounded, ID: ideal), and each task within a track to a reasoning problem under that semantics: DC (credulous acceptance), DS (skeptical acceptance), SE (compute some extension), and EE (enumerate all extensions).


CO Track

The winning solver is pyglaf, by Mario Alviano.

  1. pyglaf
  2. cegartix
  3. argmat-sat
  4. goDIAMOND
  5. argmat-dvisat
  6. CoQuiAAS
  7. argmat-mpg
  8. heureka
  9. ConArg
  10. ArgTools
  11. ArgSemSAT
  12. EqArgSolver
  13. argmat-clpb
  14. gg-sts

Results for CO-Track: Tables.

Detailed results¹: DS-CO, DC-CO, SE-CO, EE-CO.


PR Track

The winning solver is ArgSemSAT, by Federico Cerutti, Mauro Vallati, Massimiliano Giacomin, and Tobia Zanetti.

  1. ArgSemSAT
  2. argmat-sat
  3. pyglaf
  4. argmat-dvisat
  5. cegartix
  6. goDIAMOND
  7. ArgTools
  8. ConArg
  9. heureka
  10. argmat-mpg
  11. EqArgSolver
  12. CoQuiAAS
  13. gg-sts

Results for PR-Track: Tables.

Detailed results¹: DS-PR, DC-PR, SE-PR, EE-PR.


ST Track

The winning solver is pyglaf, by Mario Alviano.

  1. pyglaf
  2. goDIAMOND
  3. argmat-sat
  4. cegartix
  5. argmat-mpg
  6. argmat-dvisat
  7. ConArg
  8. heureka
  9. ArgSemSAT
  10. ArgTools
  11. EqArgSolver
  12. argmat-clpb
  13. CoQuiAAS
  14. gg-sts

Results for ST-Track: Tables.

Detailed results¹: DS-ST, DC-ST, SE-ST, EE-ST.


SST Track

The winning solver is argmat-sat, by Fuan Pu, Guiming Luo, and Ya Hang.

  1. argmat-sat
  2. ArgSemSAT
  3. cegartix
  4. goDIAMOND
  5. pyglaf
  6. argmat-mpg
  7. ConArg
  8. ArgTools
  9. gg-sts
  10. CoQuiAAS

Results for SST-Track: Tables.

Detailed results¹: DS-SST, DC-SST, SE-SST, EE-SST.


STG Track

The winning solver is argmat-sat, by Fuan Pu, Guiming Luo, and Ya Hang.

  1. argmat-sat
  2. pyglaf
  3. cegartix
  4. goDIAMOND
  5. ConArg
  6. argmat-mpg
  7. ArgTools
  8. CoQuiAAS
  9. gg-sts

Results for STG-Track: Tables.

Detailed results¹: DS-STG, DC-STG, SE-STG, EE-STG.


GR Track

The winning solver is CoQuiAAS v2.0, by Jean-Marie Lagniez, Emmanuel Lonca, and Jean-Guy Mailly.

  1. CoQuiAAS
  2. cegartix
  3. heureka
  4. goDIAMOND
  5. pyglaf
  6. argmat-dvisat
  7. argmat-clpb
  8. EqArgSolver
  9. argmat-sat
  10. ArgTools
  11. argmat-mpg
  12. ConArg
  13. ArgSemSAT
  14. gg-sts

Results for GR-Track: Tables.

Detailed results¹: DC-GR, SE-GR.
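
To illustrate what the GR track asks of solvers: the grounded extension is the least fixpoint of Dung's characteristic function, and can be computed by starting from the unattacked arguments and repeatedly adding every argument that the current set defends. The following Python sketch (an illustration only, not any competitor's implementation; the argument/attack representation is assumed for the example) computes it:

    def grounded_extension(args, attacks):
        """Compute the unique grounded extension of the framework
        (args, attacks) by iterating the characteristic function
        until a fixpoint is reached."""
        # attackers[a] = set of arguments that attack a
        attackers = {a: set() for a in args}
        for x, y in attacks:
            attackers[y].add(x)

        ext = set()
        while True:
            # Arguments attacked by some member of the current set.
            attacked = {y for x, y in attacks if x in ext}
            # a is defended if every attacker of a is counter-attacked.
            defended = {a for a in args if attackers[a] <= attacked}
            if defended == ext:      # least fixpoint reached
                return sorted(ext)
            ext = defended

    # Example: a attacks b, b attacks c; the grounded extension is {a, c}.
    print(grounded_extension(["a", "b", "c"], [("a", "b"), ("b", "c")]))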


ID Track

The winning solver is pyglaf, by Mario Alviano.

  1. pyglaf
  2. argmat-dvisat
  3. argmat-sat
  4. goDIAMOND
  5. cegartix
  6. ArgTools
  7. argmat-mpg
  8. ConArg
  9. CoQuiAAS
  10. gg-sts

Results for ID-Track: Tables.

Detailed results¹: DC-ID, SE-ID.


Special Track: Dung's Triathlon

The winning triathlete is argmat-dvisat, by Fuan Pu, Hang Ya, and Guiming Luo.

  1. argmat-dvisat
  2. pyglaf
  3. argmat-sat
  4. ConArg
  5. cegartix
  6. EqArgSolver
  7. goDIAMOND
  8. argmat-mpg
  9. gg-sts
  10. CoQuiAAS

The winners received a copy of the seminal paper On the Acceptability of Arguments and its Fundamental Role in Nonmonotonic Reasoning, Logic Programming and n-Person Games, signed by its author, Phan Minh Dung.

Results for D3-Track: Table.

Detailed results¹: D3.


A zipped archive of the detailed results for all tasks can be found here (updated with the EE-task results on 25-09-2017 and with the Dung's Triathlon results on 29-09-2017).


Benchmarks

The note on benchmark selection explains how the benchmark instances were chosen.
Tasks were grouped by compatible computational complexity as follows:

  • A (download, 614MB): DS-PR, EE-PR, EE-CO
  • B (download, 713MB): DS-ST, DC-ST, SE-ST, EE-ST, DC-PR, SE-PR, DC-CO
  • C (download, 856MB): DS-CO, SE-CO, DC-GR, SE-GR
  • D (download, 614MB): DC-ID, SE-ID
  • E: DS-SST, DC-SST, SE-SST, EE-SST, DS-STG, DC-STG, SE-STG, EE-STG
  • T (download, 631MB): D3

Note that groups A, D, and E use the same set of benchmarks (which is why no separate download is provided for E), with the exception that D has different query arguments.
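
The instances themselves use the plain-text formats required of all ICCMA solvers, i.e. the ASPARTIX format (.apx) and the trivial graph format (.tgf). A minimal Python reader for the .apx variant might look as follows (a sketch assuming well-formed arg(...). and att(...,...). facts, one per line):

    import re

    # The two kinds of facts in an .apx file: arg(a). and att(a,b).
    ARG = re.compile(r"arg\(\s*([^),\s]+)\s*\)\.")
    ATT = re.compile(r"att\(\s*([^,\s]+)\s*,\s*([^)\s]+)\s*\)\.")

    def read_apx(path):
        """Parse an .apx benchmark file into (arguments, attacks)."""
        args, attacks = [], []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if m := ARG.match(line):
                    args.append(m.group(1))
                elif m := ATT.match(line):
                    attacks.append((m.group(1), m.group(2)))
        return args, attacks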


¹ The results are provided as comma-separated files, where each line contains the following fields:
  Solver, Instance, Query-Argument, Answer, CPU-time, Score
  (Query-Argument and Answer are empty for SE- and EE-tasks)
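
For example, the per-solver score totals in one of these files can be recomputed with a few lines of Python (a sketch; the file name is hypothetical, and the files are assumed to contain no header row, as described above):

    import csv
    from collections import defaultdict

    # Column layout as documented in footnote 1.
    FIELDS = ["Solver", "Instance", "Query-Argument", "Answer", "CPU-time", "Score"]

    def totals_per_solver(path):
        """Sum the per-instance scores of each solver in one detailed-results file."""
        totals = defaultdict(float)
        with open(path, newline="") as f:
            for row in csv.reader(f):
                rec = dict(zip(FIELDS, row))
                totals[rec["Solver"]] += float(rec["Score"])
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical usage with one of the downloaded result files:
    # for solver, score in totals_per_solver("DS-CO.csv"):
    #     print(solver, score)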



Last updated 14.11.2018, Matthias Thimm