OOM - Segmentation fault (not ulimit, not cgroups, not max-space, not exhausted RAM) #4474
Comments
I guess this is applicable here - the V8 array size is limited: https://stackoverflow.com/questions/70746898/why-cannot-v8-nodejs-allocate-a-max-size-array-if-sufficient-memory . Can you please examine the stack trace from a core file generated with
Hello. Thank you for your answer. Moreover, this test reaches the old-space limit on my computer.
@riverego - I was referring to but you say it carries only fewer than 100 entries when the OOM is hit, so apparently that is not the cause. I guess there is a limit on the number of maps (object shape descriptions) in V8, but I am not sure of it; also, that cannot explain why it works on one system and not on another. For these reasons, I would still recommend turning on
@riverego - any updates?
Hello. Sadly, no. I opened a ticket with Outscale; they haven't given me their feedback yet.
Node.js Version
v22.7.0 & previous
NPM Version
v10.8.2 & previous
Operating System
Linux ip-10-8-1-229 6.1.0-23-cloud-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.1.99-1 (2024-07-15) x86_64 GNU/Linux
Subsystem
Other
Description
The code works as expected on my own computer: it crashes when the max-old-space limit is reached, around 32 GB.
But on cloud VMs (Outscale) it always runs OOM around 20 GB.
The problem happens on all images that I have tested: Debian 12, Debian 11 & Ubuntu 20 (Outscale out-of-the-box images), with the same result on VMs with 128 GB and 64 GB of RAM and on all tested Node versions (22, 20 & 16).
I checked ulimits and cgroups (even when cgroups kills a process via the OOM reaper, it doesn't throw a segfault) and found nothing.
I tried setting a fixed 50 GB value in ulimits, in case "unlimited" was hiding a low default value, and it's the same.
I looked at
/proc/sys/vm/overcommit_memory
with the 0, 1 and 2 values, and it's the same. I tried to recompile Node.js on the VM... Same...
I've exhausted ChatGPT's ideas...
I thought maybe this was a host limit applied to processes, so I tried this:
But that test can reach the VM's RAM limit (64 GB or 128 GB) without any problem.
Same for the
stress
command... So I'm running out of ideas. I can't figure out what makes Node.js run OOM around 20 GB on these VMs.
I hope someone here has a clue about what is happening....
Thank you.
Minimal Reproduction
The code just has to reach the OOM point.
Output
Before You Submit