Partially consuming fs.ReadStream closes file handle despite autoClose: false
#45721
Comments
I suggest you try stream composition and/or iterator helpers, such as:

```js
"use strict";
const fsPromises = require('node:fs/promises');

async function run() {
  const fh = await fsPromises.open('example.txt')
  fh.on('close', () => {throw new Error('handle closed!')})
  const stream = fh.createReadStream({
    autoClose: false,
  })
  for await (const chunk of stream.take(Infinity)) {
    break
  }
  // Give the event loop a breather: this seems to be where the close happens
  await new Promise(resolve => setTimeout(resolve, 10))
  console.log('fd', fh.fd) // PASSES
}
run();
```
Yes, I understand that. The thing I find surprising still stands: I was merely using the loop as an example of the context in which I discovered this problem, and I am aware that there are other ways of interacting with streams. However, as I said in my original post, I don't think your example, as given, solves this problem: it doesn't close the handle, but it still seems to leak the stream. (The easiest way to see this is to wrap lines 7 through 16 in a loop: this results in “…”.)

The docs also state that destroying the stream closes internal resources, which may only be understood as releasing the fd, and that's in fact all it does. According to my understanding, …
Okay, I think this is what you are looking for:

```js
"use strict";
const fsPromises = require('node:fs/promises');
const {setTimeout} = require('node:timers/promises');

async function run() {
  const fh = await fsPromises.open('example.txt')
  fh.on('close', () => {throw new Error('handle closed!')})
  for (let i = 0; i < 10; i++) {
    const stream = fh.createReadStream({
      autoClose: false,
    })
    // Long-lived file handle. Streams needn't be kept around til it closes.
    fh.off('close', fh.rawListeners('close').at(-1));
    for await (const chunk of stream.iterator({destroyOnReturn: false})) {
      break
    }
  }
  await setTimeout(10);
  console.log('fd', fh.fd) // PASSES
}
run();
```

However, removing …
> Support creating a Read/WriteStream from a FileHandle instead of a raw file descriptor. Add an EventEmitter to FileHandle with a single 'close' event.
> Fixes: #35240
> PR-URL: #35922
> Reviewed-By: Anna Henningsen <[email protected]>
> Reviewed-By: Rich Trott <[email protected]>
I don’t pretend to fully understand Node’s internals here, but your code block does seem like a plausible workaround, though I agree it looks like a hack. Also worth noting that, for reasons I'm not entirely clear on, it seems to work fine even without …. I've also commented on #45917 with some additional notes.
Version
v19.2.0, v14.21.1
Platform
Linux wolf-x1c6 5.15.0-53-generic #59-Ubuntu SMP Mon Oct 17 18:53:30 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Subsystem
No response
What steps will reproduce the bug?
How often does it reproduce? Is there a required condition?
Always
What is the expected behavior?
`FileHandle.createReadStream({autoClose: false})` should result in the file handle not being automatically closed when the stream is destroyed.

What do you see instead?

Using `fs.ReadStream[Symbol.asyncIterator]` seems to unconditionally close the file handle (after a small delay; I guess at the end of the event loop or something, but I haven't dug in; I've just noted that the `setTimeout` above is necessary to reproduce) despite `{autoClose: false}` being passed to `FileHandle.createReadStream()`.

This seems to be the result of the underlying `readable.destroy()` also closing the file handle; but it means that, as far as I can tell, there's no way to only partially consume an `fs.ReadStream` created from a file handle without either closing the handle or leaking the stream.

Additional information
No response