Bump readable-stream from 3.6.0 to 4.0.0 #3
Conversation
Bumps [readable-stream](https://github.com/nodejs/readable-stream) from 3.6.0 to 4.0.0.
- [Release notes](https://github.com/nodejs/readable-stream/releases)
- [Commits](nodejs/readable-stream@v3.6.0...v4.0.0)

---
updated-dependencies:
- dependency-name: readable-stream
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
In addition to the Node.js 12 test failure (Level/many-level#7 (comment)), which I intend to "fix" by dropping support of Node.js 12, there are also random failures in the […] test.
It reproduces more consistently with v4, across all node versions and operating systems: https://github.com/Level/rave-level/actions/runs/2601809114. But it also reproduces with v3 on mac: https://github.com/Level/rave-level/actions/runs/2601819174. And also with […]. In short, this is a pre-existing bug.
To summarize (and correct) the above, there are 3 separate issues here: […]

1 and 2 can be fixed by replacing […]. As for 3, I have no idea. We can tackle that separately though, because it's unrelated to `readable-stream`.
Yeah, we could.
The issue isn't in the tests. It's a general problem: before […].
You did give me pause though, and made me look at my test code (with a pipeline replacement) again. I found a potential bug there, which led me to wrong conclusions. Will update.
Waiting for […]. This can be seen in the following repro, which also contains a fix that we could make in […]:

```js
const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')
const net = require('net')
const cp = require('child_process')

const socketPath = process.platform === 'win32' ? '\\\\.\\pipe\\stream-test' : __dirname + '/stream-test.sock'

if (process.argv[2] === 'server') {
  const server = net.createServer(sock => {
    sock.pipe(new PassThrough()).pipe(sock)
  })

  server.listen(socketPath, function () {
    process.send('ready')
  })
} else {
  const child = cp.fork(__filename, ['server'], { timeout: 2e3 })

  child.once('message', function ready () {
    const socket = net.connect(socketPath)
    const encode = new PassThrough()
    const decode = new PassThrough()
    const proxy = Duplex.from({ writable: decode, readable: encode })

    decode.on('data', function (x) {
      console.log('received %s', x)
    })

    // (1) Notice how the encode stream does not end, finish or close
    for (const event of ['close', 'finish', 'end']) {
      socket.on(event, () => console.log('socket', event))
      proxy.on(event, () => console.log('proxy', event))
      encode.on(event, () => console.log('encode', event))
      decode.on(event, () => console.log('decode', event))
    }

    pipeline(socket, proxy, socket, function () {
      // (2) So we never get here
      console.log('done')
    })

    encode.write('x')

    // (3) To fix, uncomment:
    // finished(proxy, { writable: true, readable: false }, () => encode.destroy())
  })
}
```
Ok, great that you were able to reproduce this using core libs only! Do you think it's an issue there, or is adding […] the way to go?
I hope to further simplify the repro (removing `net` and `child_process`).
Cool, I'm interested in this and will take a stab at simplifying it.
I'm seeing the same behavior with TCP sockets, in Codespaces.
Example without child processes:

```js
const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')
const net = require('net')

const cons = []
const server = net.createServer(sock => {
  cons.push(sock)
  sock.pipe(new PassThrough()).pipe(sock)
})

server.listen(8000, function () {
  const socket = net.connect(8000)
  const encode = new PassThrough()
  const decode = new PassThrough()
  const proxy = Duplex.from({ writable: decode, readable: encode })

  decode.on('data', function (x) {
    console.log('received %s', x)
  })

  // (1) Notice how the encode stream does not end, finish or close
  for (const event of ['close', 'finish', 'end']) {
    socket.on(event, () => console.log('socket', event))
    proxy.on(event, () => console.log('proxy', event))
    encode.on(event, () => console.log('encode', event))
    decode.on(event, () => console.log('decode', event))
  }

  pipeline(socket, proxy, socket, function () {
    // (2) So we never get here
    console.log('done')
  })

  encode.write('x')

  // (3) To fix, uncomment:
  // finished(proxy, { writable: true, readable: false }, () => encode.destroy())

  setTimeout(() => {
    server.close()
    cons[0].end()
  }, 2000)
})
```
Also removed `net`:

```js
const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')

const socket = new PassThrough()
const encode = new PassThrough()
const decode = new PassThrough()
const proxy = Duplex.from({ writable: decode, readable: encode })

decode.on('data', function (x) {
  console.log('received %s', x)
})

// (1) Notice how the encode stream does not end, finish or close
for (const event of ['close', 'finish', 'end']) {
  socket.on(event, () => console.log('socket', event))
  proxy.on(event, () => console.log('proxy', event))
  encode.on(event, () => console.log('encode', event))
  decode.on(event, () => console.log('decode', event))
}

pipeline(socket, proxy, socket, function () {
  // (2) So we never get here
  console.log('done')
})

encode.write('x')

// (3) To fix, uncomment:
// finished(proxy, { writable: true, readable: false }, () => encode.destroy())

setTimeout(() => {
  socket.end()
}, 2000)
```
While it is reproducible like this, I'm not sure it still shows the same issue, because we're not ending the […].
If you do […], the writable side of our proxy stream behaves as I expect. The readable side, which is connected to a closed socket stream, does not. If we did […]. But in […].
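To make the `finished()` workaround from the repros above concrete in isolation, here is a minimal sketch (my own example, not rave-level code): watch only the writable side of the proxy, then destroy the inner readable by hand.

```js
const { Duplex, PassThrough, finished } = require('readable-stream')

const encode = new PassThrough() // readable side of the proxy
const decode = new PassThrough() // writable side of the proxy
const proxy = Duplex.from({ writable: decode, readable: encode })

// Fires as soon as the writable side finishes, without waiting for
// the readable side to ever end.
finished(proxy, { writable: true, readable: false }, () => {
  // The readable side won't end on its own, so destroy it explicitly.
  encode.destroy()
})

encode.on('close', () => console.log('encode closed'))

// Finish the writable side; the callback above then destroys encode.
proxy.end()
```

With `{ writable: true, readable: false }`, the callback runs on 'finish' even though the readable side never ends, which is exactly the situation in the repros.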
Is there a bug in Node.js core v18 for this? There might be. If you have a simpler repro just open an issue and we can discuss it there.
Further simplified. Reproduces on 18.4.0 core streams as well. I'll test it on some more versions and then open an issue.

```js
const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')

const remote = new PassThrough()
const local = new Duplex({
  read () {},
  write (chunk, enc, callback) {
    callback()
  }
})

for (const event of ['close', 'finish', 'end']) {
  remote.on(event, () => console.log('remote', event))
  local.on(event, () => console.log('local', event))
}

pipeline(remote, local, remote, function () {
  // We never get here
  console.log('done')
})

// To fix, uncomment:
// finished(local, { writable: true, readable: false }, () => local.destroy())

setImmediate(() => {
  remote.end()
})
```
Thx!
Superseded by #6. |
Bumps readable-stream from 3.6.0 to 4.0.0.
Release notes
Sourced from readable-stream's releases.
Commits
- `49a45a9` Bumped v4.0.0
- `37935df` v4 launch details (#472)
- `431a9af` Evade override mistake (#474)
- `81c5264` Update to Node 18.0.0 (#471)
- `97fd94f` doc: make references to primary branch be main
- `bcbe5e0` Add usage instructions for Webpack 5 (#455)
- `07462e1` Cleanup CI (#446)
- `8c73012` Adds Node v14 to github actions (#445)
- `19381e9` Upgrade airtap and use GitHub Actions (#443)
- `69aa464` tests: add GitHub Actions CI (#418)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)