Release v1.48.0 #4248
Comments
I also just stumbled across comments in the dotnet repo pointing to documentation that seems to suggest a function was being used improperly for ESRCH in libuv. It seemed like a pretty simple change, so I went ahead and made a PR that I think should fix it: #4301. If that PR looks okay to others, should we slip it into this release? |
Libuv in Node CI: https://ci.nodejs.org/view/libuv/job/libuv-in-node/231/ There are two concerning failures:
Looking into those before making the release. |
Let's revert b9421d7 for this release until there is time to look into that. It sounds like a kernel bug (since there is a C reproducer), and while it only impacts the tests, it seems inconvenient to have that be breaking for now. I am also not certain how nodejs detects this issue, since libuv should have been the only one to install signal handlers. |
Will do.
Node.js installs signal handlers both using libuv and using |
For the exact same signals, though? That sounds a bit unreliable (because of exactly this bug), though not entirely unreasonable. The next phase after b9421d7 might be to implement full signal chaining: as soon as the libuv handler finishes, it calls the next one in the chain, hiding the existence of the libuv handler.

The output I could find for the other test seems fairly worthless. Is this all the info nodejs gives when a test crashes?
|
Yes, it may happen for |
This reverts commit b9421d7. Refs: libuv#4299 Refs: libuv#4248
Revert PR: #4302 |
Current status: bisecting, as the coredump I could get didn't provide much info.
|
Frame 10 of thread 1 is a call to

edit: does valgrind report anything? |
It looks more like an unwinder bug to me (probably the gdb-to-V8 integration is disabled, so the unwinding cannot work correctly). I suspect none of those frames are right, given:
|
Is it possible to run nodejs with the v8 JIT configured with the equivalent of |
That would've been my guess too but I don't think that code path calls into JS code. Disassembly of the top most frame will probably tell though; V8-generated machine code has a very specific vibe to it. |
Right. The stack frames past 1 or 2 are probably wrong, so the eventual printing of |
No. V8 uses a completely different calling convention, it's not due to lack of frame pointers. There is gdb integration (the |
In order to reproduce it locally on my linux x64 box I had to really push it:

```shell
# Debug build
$ ./configure --debug && make -j12

# Run 300 instances of the test in parallel a few times until some crashes happened
$ for i in $(seq 1 300); do out/Debug/node --expose-internals --expose-gc test/sequential/test-performance-eventloopdelay.js & done
```

Also, I'm at the moment building node in one of the CI instances where the issue seems to happen consistently. |
Okay, I see segfaults with that too. That makes it really trivial to throw |
It looks like the stack gets pretty badly smashed somewhere, since we start the function call with a frame that looks like this
and then we return from that DrainTasks into no-man's-land
|
That points to commit 51a22f6, which, by the way, is the last bisect step I'm currently running |
It looks like |
Looks like #4250 (review) might have had the correct code, but it didn't get copied into the PR correctly? |
So, are we talking about something like this?
I've tested locally and I couldn't reproduce the crash. |
Yes, that looks like what I was thinking, though it doesn't need to be conditional. I can also confirm that running Ben's while loop a few times no longer crashes with that change. |
As there were instances where this didn't happen and could cause memory corruption issues. Refs: libuv#4248
I opened #4304. PTAL |
As there were instances where this didn't happen and could cause memory corruption issues. Refs: #4248
This is done. Sorry I messed up the last commits, but the GH Security Advisory feature was broken and didn't allow me to merge the advisory with branch protections enabled (even though it said it would bypass them), so I merged it manually; only after the fact did I figure out that I had to disable the protections myself in order to merge the advisory. |
I think it might be because we don't allow admins to bypass branch protections? I usually disable them briefly around a release so that the script can correctly push the release and tag it |
Yes I understand, but the SA UI had a |
Fixed. Thanks! |
To pick up these 24 commits: v1.47.0...v1.x
and also probably the last release with Windows 8 support?
desired:
optional: