
Belief propagation cache #139

Merged: 16 commits into ITensor:main on Mar 8, 2024
Conversation

JoeyT1994
Contributor

This PR combines the partitioned ITensorNetwork and the message_tensors that were previously passed around separately in belief propagation into a single object, bpc::BeliefPropagationCache, containing both. Relevant functionality is added for bpc, such as getting message tensors, getting environments, updating message tensors (via belief propagation), and updating the tensors in the partitioned ITensorNetwork.

Thus, src/beliefpropagation/beliefpropagation.jl has essentially moved to src/caches/beliefpropagationcache.jl.

test/test_beliefpropagation.jl has been refactored to reflect the new interface.
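A rough usage sketch of the resulting interface (a hypothetical example, not the exact API: `tn` and `region` are assumed to already exist, and accessor names such as `environment` follow the description above but their exact signatures are assumptions):

```julia
using ITensorNetworks

# `tn` is any ITensorNetwork; its construction is elided here.
bp_cache = BeliefPropagationCache(tn)    # bundles the partitioned ITensorNetwork
                                         # together with its message tensors

bp_cache = update(bp_cache)              # run belief propagation on the messages
env = environment(bp_cache, region)      # BP environment of the vertex region `region`
```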

@codecov-commenter

codecov-commenter commented Feb 15, 2024

Codecov Report

Attention: Patch coverage is 86.24339%, with 26 lines in your changes missing coverage. Please review.

Project coverage is 73.61%. Comparing base (17132f2) to head (affa62c).
Report is 1 commit behind head on main.

File                                   Patch %   Missing lines
src/caches/beliefpropagationcache.jl   83.16%    17 ⚠️
src/gauging.jl                         89.85%    7 ⚠️
src/apply.jl                           86.66%    2 ⚠️


Additional details and impacted files
@@            Coverage Diff             @@
##             main     #139      +/-   ##
==========================================
+ Coverage   73.59%   73.61%   +0.01%     
==========================================
  Files          70       71       +1     
  Lines        4117     4146      +29     
==========================================
+ Hits         3030     3052      +22     
- Misses       1087     1094       +7     

☔ View full report in Codecov by Sentry.

(Resolved review comments on src/exports.jl and src/gauging.jl.)
@mtfishman
Member

@JoeyT1994 overall I think this is a big improvement. I like basing the BP code around the interface update(::BeliefPropagationCache), and I can see that this same kind of interface could extend to other cache types. For example, I could imagine extending this to update(::ContractionCache), corresponding to a cache for exact tensor network contraction, where the cache stores the tensor network, a contraction sequence, and a map from a subset of vertices to the result of contracting that subregion of the tensor network.

Then we can define contract(::ITensorNetwork; (cache!)::AbstractContractionCache) as a general interface for performing a contraction, where it uses the cache and updates the cache where appropriate (not saying we should do that in this PR, I think the scope of this PR is good as it is).

I've suggested a number of stylistic changes. One broad comment: I think it would be good to change bpc to bp_cache, since I kept forgetting what bpc stands for.
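The hypothetical ContractionCache could be sketched like this (purely illustrative; none of these names exist in ITensorNetworks.jl, and the field types and `contract` signature are assumptions):

```julia
# Hypothetical cache for exact tensor network contraction.
struct ContractionCache{TN,Seq,Env}
  tn::TN                # the tensor network being contracted
  sequence::Seq         # a contraction sequence over its vertices
  partial_results::Env  # map: vertex subset => contraction of that subregion
end

# The shared interface: `update` refreshes any stale cached contractions,
# reusing `cache.sequence` for the subregions whose tensors changed.
function update(cache::ContractionCache)
  # ... recontract stale subregions, store them in `partial_results` ...
  return cache
end

# `contract` then consults and updates the cache where appropriate.
function contract(tn::ITensorNetwork; (cache!)=Ref(ContractionCache(tn)))
  cache![] = update(cache![])
  # ... assemble the final result from cached partial contractions ...
end
```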

(Resolved review comments on src/gauging.jl and src/apply.jl.)
@JoeyT1994
Contributor Author

@mtfishman I am thinking about how best to improve the logic of this code based on your comments.

I think a helper constructor ITensorNetwork(psiv::VidalITensorNetwork) which absorbs all bond tensors onto the sites symmetrically (by taking square roots) and returns to the arbitrary gauge is going to be helpful.

Then you could have the constructor VidalITensorNetwork(psi::ITensorNetwork, bp_cache::BeliefPropagationCache) which constructs a VidalITensorNetwork via the algorithm described in our paper.

Then you could have update(psiv::VidalITensorNetwork; (cache!) = nothing, cache_update_kwargs), which first converts to the symmetric gauge as you suggested in your comments, then does whatever level of BP you desire, and then calls the constructor above. That way you never simultaneously have bond_tensors and a bp_cache.

What do you think? I know that VidalITensorNetwork(psi::ITensorNetwork, bp_cache::BeliefPropagationCache) doesn't do any cache updating and just takes the cache at face value, but I personally think that is okay and a logical function to have.
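The proposed round-trip could be sketched as follows (hypothetical, mirroring the constructors named above rather than a tested implementation; `psiv` is assumed to be an existing VidalITensorNetwork):

```julia
# 1. Leave the Vidal gauge: absorb the square root of each bond tensor
#    onto both neighboring sites, returning to the arbitrary gauge.
psi = ITensorNetwork(psiv)

# 2. Build a BP cache on the arbitrary-gauge network and run whatever
#    level of belief propagation is desired.
bp_cache = update(BeliefPropagationCache(psi))

# 3. Re-enter the Vidal gauge from the (converged) message tensors.
psiv = VidalITensorNetwork(psi, bp_cache)
```

With this structure, bond_tensors and a bp_cache never coexist: the Vidal gauge owns the bond tensors, and the arbitrary gauge owns the cache.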

(Resolved review comments on src/gauging.jl and src/apply.jl.)
@mtfishman
Member

Thanks, definitely a nice improvement. Looking forward to seeing this in action in the alternating_update code!

@mtfishman changed the title from "Belief Propagation Cache" to "Belief propagation cache" on Mar 8, 2024
@mtfishman mtfishman merged commit 5df202a into ITensor:main Mar 8, 2024
8 of 11 checks passed
@JoeyT1994 JoeyT1994 deleted the caches branch June 16, 2024 13:46
3 participants