
async shutdown in nomad package causes test-only panics #7132

Closed
tgross opened this issue Feb 12, 2020 · 4 comments
Labels
stage/needs-verification, theme/cleanup, theme/testing

Comments

tgross (Member) commented Feb 12, 2020

If a goroutine is running after a test exits, any logging it does will cause a panic on the main test goroutine. This particularly seems to be an issue with the nomad package, because the shutdown of the server isn't synchronous in all places the way it is for clients.

This isn't a production-facing issue, but it makes testing flaky. From a discussion with @schmichael, we want to collect some examples here. We'll then follow up with work to make incremental improvements to server shutdown where we see consistent problems.

@tgross added the theme/cleanup and theme/testing labels Feb 12, 2020
@tgross added this to the unscheduled milestone Feb 12, 2020
tgross (Member, Author) commented Feb 12, 2020

An example from working on #7124:

▶ go test ./nomad -run TestCSI -count=1
[INFO] freeport: detected ephemeral port range of [49152, 65535]
panic: Log in goroutine after TestCSIVolumeEndpoint_Claim has completed

goroutine 716 [running]:
testing.(*common).logDepth(0xc00023c500, 0xc000c80230, 0x67, 0x3)
        /usr/local/Cellar/[email protected]/1.12.15/libexec/src/testing/testing.go:634 +0x3a6
testing.(*common).log(...)
        /usr/local/Cellar/[email protected]/1.12.15/libexec/src/testing/testing.go:614
testing.(*common).Logf(0xc00023c500, 0x59b6401, 0x4, 0xc000739660, 0x2, 0x2)
        /usr/local/Cellar/[email protected]/1.12.15/libexec/src/testing/testing.go:649 +0x7f
github.com/hashicorp/nomad/helper/testlog.(*writer).Write(0xc00033f0c0, 0xc000592480, 0x67, 0x16c, 0x0, 0x404f900, 0x0)
        /Users/tim/go/src/github.com/hashicorp/nomad/helper/testlog/testlog.go:34 +0x106
github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog.(*writer).Flush(0xc000794140, 0xbf8936e100000001, 0x396b239, 0x6f164e0)
        /Users/tim/go/src/github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog/writer.go:29 +0x157
github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog.(*intLogger).log(0xc0007b1d40, 0xc00049af00, 0x17, 0x1, 0x59c476a, 0xd, 0x0, 0x0, 0x0)
        /Users/tim/go/src/github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog/intlogger.go:139 +0x167
github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog.(*intLogger).Trace(0xc0007b1d40, 0x59c476a, 0xd, 0x0, 0x0, 0x0)
        /Users/tim/go/src/github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog/intlogger.go:446 +0x7b
github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog.(*interceptLogger).Trace(0xc000bc9c80, 0x59c476a, 0xd, 0x0, 0x0, 0x0)
        /Users/tim/go/src/github.com/hashicorp/nomad/vendor/github.com/hashicorp/go-hclog/interceptlogger.go:48 +0x8c
github.com/hashicorp/nomad/nomad/drainer.(*drainingJobWatcher).watch(0xc000996680)
        /Users/tim/go/src/github.com/hashicorp/nomad/nomad/drainer/watch_jobs.go:153 +0x1ed7
created by github.com/hashicorp/nomad/nomad/drainer.NewDrainingJobWatcher
        /Users/tim/go/src/github.com/hashicorp/nomad/nomad/drainer/watch_jobs.go:89 +0x1e3
FAIL    github.com/hashicorp/nomad/nomad        1.398s

stale bot commented May 13, 2020

Hey there

Since this issue hasn't had any activity in a while - we're going to automatically close it in 30 days. If you're still seeing this issue with the latest version of Nomad, please respond here and we'll keep this open and take another look at this.

Thanks!

@tgross removed this from the unscheduled milestone Feb 12, 2021
@tgross added the stage/needs-verification label Mar 4, 2021
tgross (Member, Author) commented Jun 24, 2024

I haven't run into these in a long time. Going to close this out.

@tgross closed this as not planned (won't fix, can't repro, duplicate, stale) Jun 24, 2024

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 27, 2024