
Automatic conjunct/disjunct simplification #23

Open
haz opened this issue Aug 30, 2020 · 4 comments
Labels
discussing · enhancement (New feature or request) · good first issue (Good for newcomers) · wontfix (This will not be worked on)

Comments

@haz
Collaborator

haz commented Aug 30, 2020

Not sure if this should be optional or the default, but a common pattern in building theories has raised an issue...

T = true
T &= <constraint 1>
T &= <constraint 2>
...

This ends up creating a theory T such as And(<constraint 1>, And(<constraint 2>, And(...))). With any moderately sized theory, the recursion limit is hit in a variety of ways (checking height, simplifying, etc.). The workaround was to just store the constraints in a list of conjuncts and then wrap them as And(conjuncts), but something like the following might be reasonable behaviour (see the sketch after this list):

  • If taking the And of two theories with And as their root, return the And of their children. I.e., And(T1.children, T2.children)
  • If just one theory (wlog, assume T1) is an And, add the other theory as a child of it. I.e., And(T1.children, T2)
  • If neither theory is an And, do as normal. I.e., And(T1, T2)
  • Same for disjunctions.
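For concreteness, a rough sketch of the pattern, the list-based workaround, and the proposed flattening rule. It assumes python-nnf's Var, And, true and the &/| operators; and_flattened is an invented helper name, not an existing API.

```python
# Rough sketch (assumes python-nnf's Var, And, true and the &/| operators;
# and_flattened is an invented name, not part of the library).
from nnf import Var, And, true

constraints = [Var(i) | Var(i + 1) for i in range(1000)]

# The convenient pattern: each &= nests one level deeper, producing
# And(c0, And(c1, And(...))), so checking the height or simplifying it
# recurses roughly 1000 frames deep.
T = true
for c in constraints:
    T &= c

# The workaround: collect the conjuncts in a list and wrap them once, giving
# a single And node with 1000 children instead of a chain 1000 nodes tall.
T_flat = And(constraints)

# Sketch of the proposed combination rule for two sentences:
def and_flattened(t1, t2):
    left = t1.children if isinstance(t1, And) else {t1}
    right = t2.children if isinstance(t2, And) else {t2}
    return And(left | right)
```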
@haz haz added the enhancement, discussing, and good first issue labels on Aug 30, 2020
@blyxxyz
Collaborator

blyxxyz commented Aug 30, 2020

This is tricky.

Making &= combine nodes would give terrible performance. It's currently O(1), but combining nodes is O(n) in the number of children (because it has to construct a whole new frozenset), so repeated use of &= would go from O(n) to O(n²). It's the same problem as repeatedly using += on strings.
(It's in principle possible to solve this at a low level, see e.g. efficient mutation in the frozenmap proposal. But I don't think that's reasonable here.)
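A small illustration of why eager flattening would be quadratic (nothing library-specific; the frozenset just stands in for the accumulated children):

```python
# Why flattening on every &= is quadratic: each step rebuilds a frozenset
# containing every child accumulated so far.
children = frozenset()
for i in range(1000):
    children = children | {i}   # copies all i existing elements: O(i)
# Total work is 1 + 2 + ... + 1000, i.e. O(n^2) for n constraints,
# exactly like building a string with repeated +=.
```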

Merging nodes can also drastically increase the sentence size, at least in contrived cases and possibly even in useful cases. Consider this sentence:
[image: the example sentence before merging, with one Or node shared by the five other Or nodes]
If we run .simplify() on it the single Or node at the bottom is merged with each of the five other Or nodes and we get this:
[image: the same sentence after the shared Or node is merged into each parent]
That increases the sentence size from 20 to 35.
For the same structure with 100 variables it increases the size from 400 to 10200.
(.simplify() takes a merge_nodes=False argument for this reason.)
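A hedged sketch of the blow-up (this is not the exact sentence from the screenshots; it only shows how a shared subtree gets copied into every parent when nodes are merged):

```python
# Hedged sketch (not the exact sentence pictured above): a shared Or subtree
# is stored once in the DAG, but merging copies its children into every
# parent that contains it.
from nnf import Var, And, Or

shared = Or(Var(i) for i in range(100))                    # one node, 100 children
sentence = And(Or({shared, Var(f"y{j}")}) for j in range(100))

# Before merging: 'shared' contributes its 100 children exactly once.
# After merging it into each of the 100 parent Or nodes, every parent holds
# ~101 children of its own, so the flattened form has on the order of
# 100 * 101 leaves -- the same kind of jump as the 400 -> 10200 figure above.
```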
So I'm hesitant to combine nodes automatically. It can make things worse, and the current behavior is at least very easy to analyze.

But it may be reasonable and feasible (though tricky) to restructure .simplify() to use some kind of queue instead of the call stack so that it never hits the recursion limit. Then you could use that after constructing the sentence to "fix" it.
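To illustrate the general idea (not the actual simplify() rewrite), here is a sketch of measuring a sentence's height with an explicit stack instead of recursion; iterative_height is an invented name:

```python
# Illustration only: post-order traversal with an explicit stack, so Python's
# recursion limit is never hit even on very tall sentences. iterative_height
# is an invented name; it is not how simplify() itself would be written.
from nnf import NNF

def iterative_height(sentence: NNF) -> int:
    heights = {}                        # node -> height, filled bottom-up
    stack = [(sentence, False)]
    while stack:
        node, children_done = stack.pop()
        if node in heights:
            continue
        children = getattr(node, "children", ())
        if children_done or not children:
            # All children already measured (or it's a leaf): record height.
            heights[node] = 1 + max((heights[c] for c in children), default=-1)
        else:
            # Revisit this node after its children have been processed.
            stack.append((node, True))
            stack.extend((c, False) for c in children)
    return heights[sentence]
```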

Maybe other methods could be adjusted as well. It would definitely be nice to be able to handle sentences of unlimited depth.

@haz
Collaborator Author

haz commented Aug 30, 2020

Fair enough -- this issue, along with some of the others that have come up, stems from my first medium-scale encoding task with the library. It occurs to me that we might want custom functionality for building theories destined for a SAT solver (rough sketch after the list):

  • custom_object.add_constraint(<NNF>) instead of T &= <NNF>
  • custom_object.to_CNF(demorgan=True)
    • warns if loads of Aux vars are being created
    • special assumptions given the makeup of a set of constraints
    • etc.
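A purely hypothetical sketch of what such a wrapper could look like; Theory, add_constraint, and compile are invented names, not an existing API in python-nnf or elsewhere:

```python
# Purely hypothetical sketch; Theory/add_constraint/compile are invented
# names, not an existing API.
from nnf import NNF, And

class Theory:
    def __init__(self):
        self._constraints = []          # flat list, so adding stays O(1)

    def add_constraint(self, constraint: NNF) -> None:
        self._constraints.append(constraint)

    def compile(self) -> NNF:
        # Build one shallow And over all constraints instead of a deep chain.
        return And(self._constraints)
```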

I'm also thinking of a set of decorators to turn arbitrary object-oriented code into SAT encodings; that would make sense in such a library.

Think it makes sense as part of python-nnf or a separate library that builds on it?

@haz haz added the wontfix label on Sep 11, 2020
@haz
Collaborator Author

haz commented Sep 11, 2020

Unless there are major objections, I think we'll move forward with a companion library that focuses on advanced techniques for modelling. It will build heavily on nnf, and the student (👋 @karishmadaga) will likely contribute fixes/enhancements to this library too.

@blyxxyz
Collaborator

blyxxyz commented Sep 11, 2020

Making simplify() work with these tall sentences turned out to be harder than I thought, because the intermediate results take up too much memory even if you work around the recursion limit.

A higher level companion library sounds like a great idea.
