Hello - I was wondering if there is a way to disable the following warnings, which have started appearing when running pypower on interactive nodes on Perlmutter (and I assume they will also appear when submitting to the queue). I'm assuming they are related to the GPU capabilities of the code.
If it's not possible to disable them, is there a way to have them printed only by the root process? Each rank prints its own copy, so the terminal quickly fills up with these warnings.
[000002.72] 05-21 10:20 absl INFO Unable to initialize backend 'tpu_driver': NOT_FOUND: Unable to find driver in registry given worker:
[000002.72] 05-21 10:20 absl INFO Unable to initialize backend 'cuda': module 'jaxlib.xla_extension' has no attribute 'GpuAllocatorConfig'
[000002.72] 05-21 10:20 absl INFO Unable to initialize backend 'rocm': module 'jaxlib.xla_extension' has no attribute 'GpuAllocatorConfig'
[000002.72] 05-21 10:20 absl INFO Unable to initialize backend 'tpu': INVALID_ARGUMENT: TpuPlatform is not available.
[000002.72] 05-21 10:20 absl WARNING No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
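For reference, this is a minimal sketch of the kind of workaround I have in mind (assuming the standard `logging` module controls the `absl` logger that emits these messages, that the `JAX_PLATFORMS` environment variable is honored by this jax version, and that `mpi4py` is available to check the rank) - I don't know if this is the recommended approach:

```python
import logging
import os

# Assumption: raising the 'absl' logger threshold before jax/pypower are
# imported hides the backend-initialization INFO/WARNING messages.
logging.getLogger('absl').setLevel(logging.ERROR)

# Assumption: forcing the CPU platform up front may avoid the GPU/TPU
# probing that triggers these messages in the first place.
os.environ.setdefault('JAX_PLATFORMS', 'cpu')

from mpi4py import MPI

# Alternatively, keep the messages on the root rank only and silence
# every other rank.
if MPI.COMM_WORLD.rank != 0:
    logging.getLogger('absl').setLevel(logging.ERROR)
```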
Cheers,