High memory usage during the fix step #3

Open
SquirrelKnight opened this issue Oct 20, 2022 · 0 comments
I am using data from two different surveys at one site. The initial procedure works well with Log_Lambda0 = -1 and mu_t = 0.0. For the fix step, however, higher mu_t values (in this case 0.5) lead to a program failure due to excessive memory usage. I have tracked this down to the data correlation step, specifically the lu.solve call at line 263 of setup_model.py. I am uncertain whether there is a more efficient, less memory-intensive way of addressing this issue. Here is the information for matrix E:

<66864x66864 sparse matrix of type '<class 'numpy.float64'>'
with 4331926 stored elements in List of Lists format>

Currently, my RAM is limited to 64 GB. I would be happy to pack the data into a zip file and provide it through Google Drive if you would like to use it as a test case. I understand if this is an impossible issue at the moment, and I greatly appreciate your time.
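One direction that might reduce the footprint, sketched below with a small stand-in matrix since the actual survey data is not attached and I cannot see how `lu` is built in setup_model.py: keep everything sparse end to end. A matrix in List of Lists format should be converted to CSC before solving, and `scipy.sparse.linalg.splu` keeps the LU factors sparse, whereas any path that densifies E would need roughly 36 GB for a dense 66864×66864 float64 array alone, before factorization fill-in.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Small tridiagonal stand-in for the 66864x66864 matrix E from the issue.
n = 2000
E = sp.diags([1.0, 4.0, 1.0], [-1, 0, 1], shape=(n, n), format="lil")
b = np.ones(n)

# LIL is convenient for incremental construction but slow to solve with;
# convert to CSC before factorizing.
E_csc = E.tocsc()

# splu computes a sparse LU factorization (via SuperLU), keeping the
# factors sparse instead of allocating a dense n x n array. A dense
# 66864^2 float64 array by itself would need about 36 GB.
lu = spla.splu(E_csc)
x = lu.solve(b)

# Verify the solution satisfies E x = b.
residual = np.linalg.norm(E_csc @ x - b)
print(f"residual = {residual:.2e}")
```

If E is symmetric positive definite, an iterative solver such as `scipy.sparse.linalg.cg` would avoid factorization fill-in entirely; whether either option applies here depends on the structure of E and the surrounding code, so treat this as a suggestion rather than a drop-in fix.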
