Bun leaks memory in Workers #5709
I think I've run into a similar issue when using Bun's …
Thanks for reporting, this is a known issue that we will be fixing.
Interesting! In my case, Bun never returned an error before crashing.
Any follow-up on when this might be fixed? This is preventing me from using Workers in Bun.
I just re-ran the reproduction code and the memory leak seems to have been mostly fixed somewhere between v1.0.11 and v1.0.12. Memory usage used to double every iteration, but now (in my testing) memory usage went from …
Should this issue be closed (the massive memory leak is fixed), or kept open (there still seems to be a tiny leak left over)?
The issue is still present for me on the most recent released version (1.0.33), and even on canary :(
I was running into issues possibly caused by #5659, so I tried to use workers as a workaround, hoping that the memory might be freed when the worker gets terminated. Unfortunately, this doesn't seem to be the case.
This is currently a major issue impacting our use of Bun in production.
Collab with Deno maybe? denoland/deno#18414 |
Please fix this! Memory just keeps increasing until the server runs out of memory and crashes.
Made an investigation.

Sample code (`main.ts`, per the reproduction steps below):

```ts
import { heapStats } from "bun:jsc";

function spawnWorker() {
  return new Promise((res, rej) => {
    const worker = new Worker(new URL("./worker.ts", import.meta.url).href, {
      type: "module",
      ref: false,
      smol: true,
    });
    worker.onerror = (e) => rej(e);
    worker.addEventListener("close", () => {
      res(worker);
    });
    worker.onmessage = (msg) => {
      if (msg.data === "ready") {
        worker.terminate();
      }
    };
  });
}

Bun.gc(true);
const start = heapStats().objectTypeCounts;
for (let i = 0; i < 1000; i++) {
  await spawnWorker();
  Bun.gc(true);
  const stats = heapStats();
  await Bun.write(`data/${i}.json`, JSON.stringify(stats));
}
```

Worker code (`worker.ts`):

```ts
self.postMessage("ready");
```

Made 1000 iterations. [Charts of heap/object counts per iteration lost in extraction; object types such as `FunctionCodeBlock` keep growing per iteration.] Reproduce: …
What version of Bun is running?
1.0.3+f77df12894e7952ea58605432bf9f6d252fff273
What platform is your computer?
Linux 5.15.90.1-microsoft-standard-WSL2 x86_64 x86_64
What steps can reproduce the bug?
Create a main file which starts up a worker, waits until it completes its work, then logs memory usage, and a worker which allocates a large amount of memory.
Then run main with:

```sh
bun run main.ts
```
What is the expected behavior?
Memory usage should spike as the worker begins, then drop sometime afterwards as garbage collection is run. The script should continue running until the loop completes.
When a (functionally) identical script and worker (code below) is run via Node, memory usage stays around the same every call.
Results in Node:

```
> node node/main.js
971173888
983752704
979537920
988639232
986144768
979836928
984797184
986599424
985792512
^C
```
What do you see instead?
Memory usage continues to rise, eventually crashing (saturates 8gb of RAM and 2gb of swap) at around the 8th iteration.
Results in Bun:

```
> bun bun/main.ts
1050456064
2005811200
2962059264
3915100160
4867907584
5819809792
6765035520
^C
```
Additional information
Passing `smol` to either `bun run` or `new Worker()` has no effect. Calling `Bun.gc(true)` immediately after `self.postMessage()` … in a personal project, but couldn't reproduce the results here.