
yicesQS submission to the 2024 SMT comp #70

Merged: 7 commits merged into SMT-COMP:master on Jun 18, 2024

Conversation

disteph (Contributor) commented May 28, 2024

No description provided.

bobot added the "submission" (Submissions for SMT-COMP) label on May 29, 2024

github-actions bot commented May 30, 2024

Summary of modified submissions

YicesQS

disteph force-pushed the master branch 2 times, most recently from aff7075 to 2ac9d83 on June 4, 2024 at 17:07
martinjonas pushed a commit that referenced this pull request Jun 10, 2024
#84: Create cvc5-cloud
#74: Draft STP submission
#70: draft yicesQS submission
#68: Create STP-CNFLS
#66: Yices2 SMTCOMP 2024 Submission
#65: Z3-alpha draft PR
#64: Solver submission: cvc5
#63: submission iProver
#61: OSTRICH 1.4
#60: SMT-RAT submission
#57: Amaya's submission for SMT-COMP 2024
#55: plat-smt submission
#54: Add 2024 Bitwuzla submission.
#53: 2024 solver participant submission: OpenSMT
#52: Z3-Noodler submission
#51: Submission Colibri
#45: Submission for smtinterpol
#42: Adding Algaroba to SMTCOMP 2024
martinjonas (Contributor) commented

@disteph We have executed the latest version of yicesQS on a randomly chosen subset of 20 single-query benchmarks from each logic in which it participates. The benchmarks were also scrambled by the competition scrambler (with seed 1). You can find the results here: https://www.fi.muni.cz/~xjonas/smtcomp/yicesqs.table.html#/table

Quick explanation:

  • Green status means that the result agrees with the (set-info :status _) annotation from the benchmark.
  • Blue status means that the benchmark has annotation (set-info :status unknown).
  • By clicking on the result (e.g. false, true, ABORTED, …) you can see the command-line arguments with which your solver was called and its output on the benchmark.
  • By clicking on the benchmark name (i.e., *scrambled*.yml), you can see the details of the benchmark, including its contents (by clicking on the file link in input_files) and the name of the original benchmark before scrambling (e.g., # original_files: 'non-incremental/AUFBVFP/20210301-Alive2-partial-undef/ph7/583_ph7.smt2').

Please check whether there are any discrepancies, such as missing/extra logics, unexpected aborts or unknowns, and the like. If you update the solver, let me know and I can execute further test runs; we still have plenty of time for several follow-up rounds.
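
For anyone wanting to reproduce this kind of check locally, here is a minimal sketch in Python. It reads the (set-info :status _) annotation from each benchmark, runs the solver on it, and reports agreement in the same spirit as the green/blue colouring in the table. The solver command line, the benchmark layout (plain .smt2 files rather than the *scrambled*.yml task files), and the simple output parsing are assumptions for illustration; the actual test runs use the competition infrastructure linked above.

```python
#!/usr/bin/env python3
"""Minimal sketch: compare a solver's answer against the (set-info :status _)
annotation of SMT-LIB benchmarks. Paths and the solver command line are
illustrative assumptions, not the competition infrastructure."""

import re
import subprocess
import sys
from pathlib import Path

STATUS_RE = re.compile(r"\(set-info\s+:status\s+(sat|unsat|unknown)\)")

def expected_status(benchmark: Path) -> str:
    """Return the annotated status, or 'unknown' if no annotation is present."""
    match = STATUS_RE.search(benchmark.read_text())
    return match.group(1) if match else "unknown"

def solver_answer(solver: str, benchmark: Path, timeout: int = 60) -> str:
    """Run the solver on the benchmark and return the first line of its output."""
    try:
        result = subprocess.run([solver, str(benchmark)], capture_output=True,
                                text=True, timeout=timeout)
    except subprocess.TimeoutExpired:
        return "timeout"
    lines = result.stdout.strip().splitlines()
    return lines[0] if lines else "no output"

if __name__ == "__main__":
    # Hypothetical invocation: python3 check_status.py ./yicesQS benchmarks/
    solver_bin, bench_dir = sys.argv[1], Path(sys.argv[2])
    for bench in sorted(bench_dir.glob("**/*.smt2")):
        expected = expected_status(bench)
        got = solver_answer(solver_bin, bench)
        if expected == "unknown":
            verdict = "BLUE (no reference status)"
        elif got == expected:
            verdict = "GREEN (agrees with annotation)"
        else:
            verdict = f"DISCREPANCY (expected {expected})"
        print(f"{bench.name}: {got} -> {verdict}")
```

Invoked, for instance, as `python3 check_status.py ./yicesQS benchmarks/` (both paths are hypothetical), it prints one verdict line per benchmark.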

bobot (Contributor) commented Jun 12, 2024

The error in TestSolver is on our side.

disteph (Contributor, Author) commented Jun 12, 2024

OK, great! Thanks!

disteph changed the title from "draft yicesQS submission" to "yicesQS submission to the 2024 SMT comp" on Jun 12, 2024
martinjonas merged commit b975f0c into SMT-COMP:master on Jun 18, 2024
5 checks passed