fs.ReadStream does not emit data events #12099

Open
johnp789 opened this issue Jun 24, 2024 · 5 comments
Labels
bug (Something isn't working), confirmed bug (We can reproduce this issue), node:stream

Comments

@johnp789

What version of Bun is running?

1.1.16

What platform is your computer?

Darwin 23.5.0 arm64 arm

What steps can reproduce the bug?

A Raspberry Pi Pico is connected to a Mac USB port. The Pico shows up as a character special device at /dev/cu.usbmodem1201 and sends a 29-byte line every 250ms or so, which I would like to read with Bun.

With the script read-test.js below, Bun's behavior differs from Node's.

import * as fs from "node:fs"

const inp = fs.createReadStream("/dev/cu.usbmodem1201")
inp.on("data", function(chunk) {
  console.log("in:", chunk.length)
})
inp.on("open", function() {
  console.log("on open")
})
inp.on("close", function() {
  console.log("on close")
})
setTimeout(function() {
  console.log("Closing file now...")
  inp.close()
}, 3000)

What is the expected behavior?

With Node.js v20.11.1, several data events are received before exiting. The device writes 29-character lines about 250ms apart.

$ node read-test.js
on open
in: 27
in: 2
in: 27
in: 2
in: 29
in: 29
in: 29
in: 29
Closing file now...
on close

What do you see instead?

Bun runs without reporting an error, but no data events are emitted.

$ bun read-test.js
on open
Closing file now...
on close

Additional information

The Pi Pico streams data from a sensor. My goal is to read each line from the USB-attached Pico and process each line immediately. Eventually, the script needs to run on Windows, macOS, and Linux.
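For reference, the line-at-a-time processing I'm after would look roughly like this with node:readline, which rides on the same fs.ReadStream and so presumably hits the same problem under Bun:

import * as fs from "node:fs"
import * as readline from "node:readline"

// sketch: emit one "line" event per line read from the device stream
const rl = readline.createInterface({ input: fs.createReadStream("/dev/cu.usbmodem1201") })
rl.on("line", function (line) {
  console.log("line:", line)
})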

When the script reads from the Mac's /dev/random instead, Bun does emit some data events, but the behavior still differs from Node.js: Bun evidently uses a larger read buffer. Is there a way to set a read timeout in Bun, so that it does not wait for the read buffer to fill before emitting a data event?

import * as fs from "node:fs"

// const path = "/dev/cu.usbmodem1201"
const path = "/dev/random"
const inp = fs.createReadStream(path)
let chunkCount = 0
inp.on("data", function(chunk) {
  chunkCount += 1
  console.log(`in: ${chunk.length} ${chunkCount}`)
  if(chunkCount > 5) {
    console.log("Closing file now due to chunk count...")
    inp.close()
  }
})
inp.on("open", function() {
  console.log("on open")
})
inp.on("close", function() {
  console.log("on close")
})
setTimeout(function() {
  console.log("Closing file now...")
  inp.close()
}, 1000)

$ node read-test.js
on open
in: 65536 1
in: 65536 2
in: 65536 3
in: 65536 4
in: 65536 5
in: 65536 6
Closing file now due to chunk count...
on close
Closing file now...
$ bun read-test.js
on open
in: 262144 1
in: 262144 2
in: 262144 3
in: 262144 4
in: 262144 5
in: 262144 6
Closing file now due to chunk count...
in: 262144 7
Closing file now due to chunk count...
on close
Closing file now...
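
For reference, Node's 65536-byte chunks match fs.createReadStream's default highWaterMark of 64 KiB. The buffer size can be lowered per stream, though that only caps the chunk size and is not a read timeout (a sketch; whether Bun honors the option here is exactly what's in question):

import * as fs from "node:fs"

// sketch: request a 16 KiB read buffer instead of Node's 64 KiB default
const inp = fs.createReadStream("/dev/random", { highWaterMark: 16 * 1024 })
inp.on("data", function (chunk) {
  console.log("in:", chunk.length)
})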
@johnp789 johnp789 added bug Something isn't working needs triage labels Jun 24, 2024
@museadam

museadam commented Jun 26, 2024

createReadStream seems to work at least partially, since it reaches the "on open" log; it would help to debug further and see what data the stream is actually receiving.

If you aren't using the Node fs functions mkdir and readdir, you could use Bun's file I/O API (https://bun.sh/docs/api/file-io) instead. Instead of const inp = fs.createReadStream(path), use const inp = Bun.file(path), then await inp.stream(), and get the size with const chunkSize = inp.size.
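
Put together, a minimal sketch of that approach (untested; using the device path from the report):

const file = Bun.file("/dev/cu.usbmodem1201")
const stream = await file.stream() // a web ReadableStream of Uint8Array chunks
for await (const chunk of stream) {
  console.log("in:", chunk.length)
}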

@johnp789
Author

I tried using Bun.file(path) instead, but the stream still does not return chunks, at least not within a reasonable time for this slow device. Reading /dev/random instead, I see chunks of size 262144, but the reported inp.size is Infinity.

// const inp = Bun.file("/dev/cu.usbmodem1201")
const inp = Bun.file("/dev/random")
const stream = await inp.stream()
console.log(`chunk size is ${inp.size}`) // chunk size is Infinity
let chunkCount = 0
for await (const chunk of stream) {
  console.log(chunk.length) // 262144
  if (++chunkCount > 3) {
    break
  }
}

@nektro
Member

nektro commented Sep 13, 2024

Does this still reproduce for you on Bun 1.1.27? (You may need to run bun upgrade.)

@johnp789
Author

Yes, this still reproduces on Bun 1.1.27. The read-test.js script above still works with Node 20.16.0 and still reads nothing with Bun 1.1.27.

@johnp789
Author

Here's a reproducer using a FIFO.

// fifo-writer.mjs
import * as fs from "node:fs"

const outp = fs.createWriteStream("my-fifo")
let i = 1
setInterval(() => {
  outp.write(`Message ${i++}\n`)
}, 10)

// fifo-reader.mjs
import * as fs from "node:fs"

const inp = fs.createReadStream("my-fifo")
let chunkCount = 1
inp.on("data", function (chunk) {
  console.log(`${chunkCount++}: ${chunk.toString().trim()}`)
  if (chunkCount > 5) {
    console.log("Closing file now due to chunk count...")
    inp.close()
  }
})
inp.on("open", function () {
  console.log("on open")
})
inp.on("close", function () {
  console.log("on close")
})

With the writer running in the background, try the reader with Node (twice) and then with Bun:

$ mkfifo my-fifo
$ bun fifo-writer.mjs &
[1] 50572
$ node fifo-reader.mjs
on open
1: Message 1
2: Message 2
3: Message 3
4: Message 4
5: Message 5
Closing file now due to chunk count...
on close
$ node fifo-reader.mjs
on open
1: Message 198
2: Message 199
3: Message 200
4: Message 201
5: Message 202
Closing file now due to chunk count...
on close
$ bun fifo-reader.mjs  # hangs, interrupt with Ctrl-C
on open
^C

@RiskyMH RiskyMH added the confirmed bug We can reproduce this issue label Dec 14, 2024