MaxSAT Evaluation 2020
Affiliated with SAT 2020   ·   July 5-9   ·   Alghero, Italy

Submission Instructions

Benchmark Submissions

Please follow the instructions given in the call for benchmarks, and submit benchmarks by email to maxsatevaluation@gmail.com using the subject line "MSE20 benchmark submission" by June 12 at the latest.

Solver Submissions

Solvers need to be submitted by June 12 at the latest. We encourage participants to test their solvers within the execution environment well in advance. MSE 2020 will be using the StarExec system, which supports experimental evaluation of logic solvers. All interaction with StarExec is performed via its web interface at https://starexec.org. To submit a solver you will need to prepare a zip or tar file (the suffix needs to be .zip, .tar, .tar.gz, or .tgz) with the following file and directory structure:

./starexec_description.txt
./bin/<your_solver_binary>
./bin/starexec_run_default
./code/<your_source_tree>
./doc/<your_solver_description>.pdf

  • ./starexec_description.txt is a short text description of your solver (1024 characters max.). This file is optional; you can instead enter the description directly on the upload page.
  • ./bin/<your_solver_binary> is your executable.
  • ./bin/starexec_run_default is a StarExec configuration. It is just a shell script used to invoke your solver. A bare-bones version would be

    #!/bin/bash
    # StarExec passes the benchmark file as the first argument.
    ./<your_solver_binary> "$1"

    which will invoke your solver on the input file passed to the script as the $1 variable. More elaborate versions might supply various parameter settings to your solver; a sketch of such a script appears after this list.
  • ./code/ is where you place your source code (which can be in a directory tree).
  • ./doc/ is where you place your PDF formatted description of your solver.
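As an illustration, a more elaborate starexec_run script might pass fixed parameter settings to the solver. In the sketch below, the options --algorithm and --seed are hypothetical placeholders, not actual solver flags; substitute whatever options your solver accepts:

    #!/bin/bash
    # Hypothetical example: invoke the solver with fixed parameter settings.
    # "--algorithm" and "--seed" are placeholder option names; replace them
    # with the options your solver actually understands.
    ./<your_solver_binary> --algorithm=core-guided --seed=42 "$1"

Once the directory tree is in place, an archive in the required layout can be produced from the top level of the tree with, for example, zip -r solver.zip starexec_description.txt bin code doc.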

Using StarExec

Getting an account

Register for an account in the MaxSAT community (unless you are already registered). Wait until one of the MaxSAT community leaders approves your request (this should not take more than 24 hours).

Uploading a solver for testing

A solver is submitted by uploading an archive as specified above. Note, however, that StarExec runs an older version of Linux, so dynamically linked binaries are likely to fail. A statically linked binary should work.
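For instance, with GCC or Clang a statically linked build can typically be requested with the -static flag. This is only a sketch; the exact flags and source layout depend on your toolchain and libraries:

    # Sketch: build a statically linked binary (adjust flags and paths
    # to your own toolchain and source tree).
    g++ -O3 -static -o my_solver src/*.cpp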

Building your Solver on StarExec
Alternatively, you can modify your archive to include a build script, in which case your solver will be built on the StarExec system on upload. To do this, include a script named "starexec_build" at the top level of your archive. On upload, StarExec will detect this file and execute it to build your solver. Note that your archive must still include the other files specified above (as well as, of course, any Makefile or other files needed to perform the build). It is also recommended that your build produce a static executable. See the StarExec Users Guide for more details.
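A minimal starexec_build script might look like the following. This sketch assumes a Makefile-based build under ./code that produces a binary named as in your run script; adjust it to your own build system:

    #!/bin/bash
    # Hypothetical build script: compile the sources shipped under ./code
    # and place the resulting binary where the run script expects it.
    cd code
    make
    cp <your_solver_binary> ../bin/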

To upload your solver to StarExec
Log in to the StarExec website and click Spaces->Explore (on the top), then in the left frame open root->MaxSAT->MSE2020->Testing. At the bottom of the page click "upload solver" to open the upload dialog. Click "Browse" and select the archive with your solver's code, enter the solver name, change downloadable to "no" (so as to keep your source code hidden until the submission deadline has passed), and click "upload".

Testing your solver

We have put a few testing instances in the MSE2020->Testing->Complete and ->Testing->Incomplete spaces, which you can use after you have uploaded your solver to these spaces. To run a test:

Go to the root->MaxSAT->MSE2020->Testing->Complete->Weighted subspace (other subspaces exist for the other Complete/Incomplete and Unweighted/Weighted combinations). Click "create job" at the bottom of the page. This opens a page where you don't need to change anything, so just click "next". On the next page, select the "choose" option and then "all in MSE2020->Testing->Complete->Weighted". A page with all solvers (and their configurations) comes up; select the solver you want to run and click "submit" at the bottom of the page. You can check the status and results of your test job by clicking on Jobs at the top and selecting your job.

To check the output of your job and the correctness of your solution
When creating a job, the post-processor "MSE2017 Checker" is enabled by default. This checker runs on the output of your solver and checks that: (i) all hard clauses are satisfied, (ii) the cost of the solution provided in the 'v' line matches the value of the last 'o' line, and (iii) the solution cost is greater than or equal to the optimum cost.

Once the job is finished, you can check the logs of your solver by downloading the "job output" on the job webpage. You can also download the "job information", which contains a .csv file with a summary of your job. This .csv file contains default attributes such as "benchmark id", "solver", "wallclock time", and "memory usage", as well as checker attributes such as "optimum_solution", "hard_clauses", "optimum", "checker_solution", "checker", and "given_solution". If the "checker" column reports an error, your solver has a bug that needs to be fixed before submission. If this is the case, we recommend downloading the checker and running it offline on that benchmark for more details.
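As a quick way to spot failures, you can scan the "checker" column of the downloaded .csv offline. The sketch below is hypothetical; the file name, error wording, and exact column header depend on your download, so inspect the header first and adjust accordingly:

    # Hypothetical sketch: print rows whose "checker" column reports an error.
    # Adjust the file name and column name to match your downloaded CSV;
    # this naive comma splitting assumes fields contain no embedded commas.
    head -1 job_information.csv
    awk -F',' 'NR==1 { for (i=1; i<=NF; i++) if ($i == "checker") col = i }
               NR>1 && $col ~ /error/ { print }' job_information.csv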

Uploading the final version of your solver

The submission of the final version of a solver is essentially the same as uploading the solver for testing (see above). To submit the final version of your solver, upload it to the space root->MaxSAT->MSE2020->Evaluation->[track], where [track] is one of Complete->Unweighted, Complete->Weighted, Incomplete->Unweighted, or Incomplete->Weighted, depending on the track you want to participate in. If you want to participate in several tracks, upload your solver to each of them separately.

Don't forget to select "downloadable:no" when uploading your solver to prevent other participants from seeing your source code before the start of the evaluation. After the solver submission deadline we will ask all the participants to update the downloadable option to "downloadable:yes". This will make the source code visible to the organizers and all the participants. The participants must comply with this request in order to participate in the evaluation.