#407: remove direct model iteration #416
base: master
Conversation
There is no "nice way" to deprecate interface usage, so we remove the interface completely and force the user to switch.
Commits 613a7d6 to 7e6c214
I completely get your reasoning for wanting to remove this. An alternative would be to eagerly create the model when creating the model class.
I would still find it nice to have an incremental model. But if you do not want to support this, I would find it better not to make this hard, incompatible change from one version to the next. An alternative would be to mark …
I thought about this some more. Why can't we split the model into an eager and a lazy model?
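A rough sketch of what such a split could look like (purely hypothetical names, not the current java-smt API):

```java
import com.google.common.collect.ImmutableList;
import org.sosy_lab.java_smt.api.Model.ValueAssignment;

// Hypothetical sketch only: one always-materialized view, plus an optional
// lazy view for solvers whose API can actually produce assignments on demand.
interface EagerModel {
  // All assignments are computed up front and kept alive by the model object.
  ImmutableList<ValueAssignment> asList();
}

interface LazyModel extends Iterable<ValueAssignment> {
  // Assignments are produced on demand while iterating; nothing is materialized eagerly.
}
```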
Approved.
We should maybe take a look at the CachingModel delegate, as we always use it currently. Maybe we can simplify it by simply including it in the regular model?
// We need to generate and save this at construction time as OpenSMT has no functionality to
// give a persistent reference to the model. If the SMT engine is used somewhere else, the
// values we get out of it might change!
model = generateModel(pCreator, pAssertedTerms);
Are we caching the model twice here if we use the CachingModel?
Yes, CachingModel also keeps a reference to that list. CachingModel only caches the call to asList and no other evaluation. The whole CachingModel seems to become redundant, also for some other solvers.
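For context, a minimal sketch of what a delegate that caches only asList amounts to (illustrative names, assuming it wraps an existing Model; not the exact implementation of CachingModel). Once the wrapped model is already built eagerly, this extra layer adds nothing:

```java
import com.google.common.collect.ImmutableList;
import org.sosy_lab.java_smt.api.Model;
import org.sosy_lab.java_smt.api.Model.ValueAssignment;

// Illustrative sketch: memoize only the asList() result, delegate everything else.
final class AsListCachingModel {
  private final Model delegate;
  private ImmutableList<ValueAssignment> cachedAssignments;

  AsListCachingModel(Model pDelegate) {
    delegate = pDelegate;
  }

  ImmutableList<ValueAssignment> asList() {
    if (cachedAssignments == null) {
      cachedAssignments = delegate.asList(); // computed and stored exactly once
    }
    return cachedAssignments;
  }
}
```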
The idea behind direct model iteration was lazy evaluation and the corresponding memory efficiency. However, since at least 2016 no solver has been implemented that way; all of them use eager model evaluation internally. Depending on the solver's internal API, evaluating arrays and uninterpreted functions is not even possible lazily. Removing direct model iteration therefore allows us to simplify the API.
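A small usage sketch of the change from a caller's perspective, assuming a standard SolverContext setup (the surrounding prover code is only illustrative): instead of iterating the Model directly, callers iterate the eagerly built list from asList.

```java
import org.sosy_lab.java_smt.api.BooleanFormula;
import org.sosy_lab.java_smt.api.Model;
import org.sosy_lab.java_smt.api.ProverEnvironment;
import org.sosy_lab.java_smt.api.SolverContext;
import org.sosy_lab.java_smt.api.SolverContext.ProverOptions;
import org.sosy_lab.java_smt.api.SolverException;

class ModelIterationExample {
  // Prints all assignments of a satisfiable constraint via the eager list API.
  static void printModel(SolverContext context, BooleanFormula constraint)
      throws InterruptedException, SolverException {
    try (ProverEnvironment prover =
        context.newProverEnvironment(ProverOptions.GENERATE_MODELS)) {
      prover.addConstraint(constraint);
      if (!prover.isUnsat()) {
        try (Model model = prover.getModel()) {
          // Old style (removed by this PR): for (Model.ValueAssignment a : model) { ... }
          for (Model.ValueAssignment assignment : model.asList()) {
            System.out.println(assignment.getName() + " = " + assignment.getValue());
          }
        }
      }
    }
  }
}
```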