
*** caught segfault *** address (nil), cause 'memory not mapped' #762

Open
motyocska opened this issue Aug 10, 2024 · 4 comments

Comments

@motyocska
Hello Guys,

This is a little bit above my head, but I will try. Have you encountered the error message "*** caught segfault *** address (nil), cause 'memory not mapped'" before? It appears to be a memory-related issue. I am running rxSolve() as part of a shiny app on shinyapps.io with 8 GB RAM as:

    diffeqs <- rxode2(model = "
      cp <- cent/((1.75*pow(weight/2.9,1))*exp(vc))*FREE;
      d/dt(cent) <- -((0.345*pow(weight/2.9,0.75)*(1/(1+pow(AGE/34.8,-4.53)))*pow(1/creatinine,0.267))*exp(cl))/((1.75*pow(weight/2.9,1))*exp(vc))*cent;
      cent(0) = 0;
    ")

plist is a data.frame of 2000 rows containing all parameters needed by the model, and forceData is the corresponding event table with weight and creatinine added as time-dependent covariates.

    modelout <- rxSolve(diffeqs, plist, forceData, maxsteps = 5000000L, method = c("lsoda"))

I can provide more info on the simulation routine if needed. When running it, the system does not crash until about the 3rd repeated run. Reading some of Matt's blog posts, it seems that when supplying all parameters to rxSolve() some sort of memory allocation happens. So I wonder whether it is possible to "reset" the system to baseline after each sim run, i.e. to the state prior to memory allocation, assuming my thinking is correct? I have attached the log file from the time of the crash in case you find it helpful.

I would appreciate any thoughts on how the crash might be prevented without increasing memory. My home machine with 128 GB RAM does not produce the same crash, which makes me believe it is a memory issue.

thanks,

Andars

@motyocska
Author

logs.txt

@mattfidler
Member

Hi @motyocska

You are right; this is likely due to memory issues.

I have some experimental branches for solving systems that are larger than the available memory, though they apply to solving for multiple subjects (ids).

That gives an idea of how to overcome this, but I need to understand whether your system has many ids or is simply one solved system.

In the case of multiple ids I could point you to the branch (and maybe even sync it with the current rxode2) to allow this to be solved.

The solution is to solve as much as you can in memory, save it to disk, clear the memory used, and then solve the next chunk.

In the case of a single-subject ODE, this would require sub-setting the times: solve up to a specified time, save the results, reset the system so that each compartment has the ODE values at that time, and then solve the remaining parts of the system.

It isn't terribly fun to do by hand, and automating it somehow is on my long list of things to do.

I would have to figure out how to calculate a priori how much memory a solve will take and then section it out.

With more and more systems having big data and big models, this is something I really want to do (though there are many, many things I really want to do).

For your case, you will have to figure out where your smaller system crashes so you can split the solve into appropriately sized pieces.

@motyocska
Author

motyocska commented Aug 12, 2024 via email

@mattfidler
Member

> Would you know what the basis is for how rxode2 decides how much memory to allocate?

This is a function of the number of ODEs and parameters in your system, as well as the number of time points:

(# time points * # ODEs) + (# non-time-varying covariates + # parameters) + (# time points * # time-varying covariates) + (# time points * # output items)

There is a bit more that is allocated, but in general this is true.
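As a rough, purely illustrative calculation (every count below is an assumption made for the arithmetic, not a value taken from this issue), that formula can be evaluated per subject and scaled up:

    # Illustrative only: all counts here are assumptions
    nT   <- 1000   # time points per subject
    nODE <- 1      # compartments (cent)
    nPar <- 10     # non-time-varying covariates + parameters
    nTV  <- 2      # time-varying covariates (weight, creatinine)
    nOut <- 2      # output items (e.g. cp and cent)
    perId <- nT*nODE + nPar + nT*nTV + nT*nOut   # 5010 doubles per subject
    perId * 8 * 2000 / 1024^2                    # 8-byte doubles, 2000 rows: ~76 MB

Even modest per-subject counts add up quickly across 2000 parameter sets, and that is before any extra working memory the solver itself needs.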

The number of time points seems the easiest thing to change (though not adding covariates, or suppressing intermediate variables in your model output by using ~ instead of <- or =, could help a bit too).
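For example, here is a minimal sketch of the ~ suppression (the model is a simplified stand-in for the one above, and ke is an illustrative parameter name):

    mod <- rxode2({
      v  ~ 1.75*(weight/2.9)*exp(vc)  # `~` hides v: no per-time-point column is kept
      cp <- cent/v                    # `<-` reports cp at every time point
      d/dt(cent) <- -ke*cent
    })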

> Based on the general idea above, I could break up the simulations into pieces based on time: simulate a chunk, save the conditions at the end of the sim, then recycle those as initial conditions for the start of the next chunk using a for loop or similar, maybe? Do you think that may help?

Indeed that is the general idea.

I would also suggest the following (a sketch of the whole loop is below):

  • Remove the input data from the R environment.
  • Save the solve to disk and then remove it from the R environment.
  • Then use gc() to free the R memory and rxSolveFree() to free the rxode2 memory.
  • Solve the next chunk of the equation.
  • Only save the end-of-chunk conditions and either pass them as the inits or use them as the initial conditions of the next solve.
  • When solving, make sure the solve does not initialize at time zero by using rxSetIni0(FALSE), which initializes at the first observed time instead.
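A minimal sketch of that loop, assuming a single subject and reusing the object names from this thread (diffeqs, plist, forceData); the chunk boundaries, file names, and state-carrying logic are illustrative assumptions, not rxode2 requirements:

    library(rxode2)
    rxSetIni0(FALSE)  # initialize each chunk at its first observed time, not time zero

    chunkEnds <- c(24, 48, 72)  # hypothetical chunk boundaries (hours)
    ini   <- NULL               # end state of one chunk -> inits of the next
    lastT <- -Inf
    for (i in seq_along(chunkEnds)) {
      ev  <- forceData[forceData$time > lastT & forceData$time <= chunkEnds[i], ]
      fit <- rxSolve(diffeqs, plist, ev, inits = ini,
                     maxsteps = 5000000L, method = "lsoda")
      saveRDS(fit, sprintf("chunk_%02d.rds", i))  # save the solve to disk
      ini <- c(cent = tail(fit$cent, 1))          # assumes a single subject
      lastT <- chunkEnds[i]
      rm(fit, ev)
      gc()           # free the R memory
      rxSolveFree()  # free rxode2's C-level memory
    }

With many ids (such as the 2000 parameter sets here), the end state would need to be carried per id (for example, the last cent value within each fit$id) rather than as the single trailing value above.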

Unfortunately, you are using a shiny interface; if you expect snappy performance for your users, perhaps pre-calculating some of the common use cases (or caching them somehow) would be helpful.

> With the above, can we expect the system to "return" the memory that was used to simulate the previous chunk before initiating the next one? If yes, I guess breaking it up would make sense as described above.

If you free the R memory using gc() and the C++ memory with rxSolveFree(), then indeed the memory will be freed so you can tackle the next chunk.
