zaptest: test logger seems susceptible to data races (?) #687
The issue that you are running into: is it perhaps a side effect of opening 100+ …?
The race detector produced the following logs: [log output not captured on this page] and [log output not captured on this page]. If we look at the file:line of the write (testing.go:856), we see the following: [snippet not captured on this page]
Closing the issue, as this doesn't seem like a zap issue. Please let me know if that's not the case.
Ah, thanks @prashantv, that definitely seems like a likely culprit 👍 Appreciate the help! Update: fixed by making sure no goroutines log after a test ends, as suggested.
…d data race (#24899)

**Description:** Fixes a data race on `TestChain` that was caused by logging after the test had ended (see uber-go/zap#687 (comment) for details). To do this, we move the logging outside of the goroutines in the Chain provider. Goroutines can still be running after the function has returned (this is intentional; their return values will just be ignored), but they will no longer log anything. I also made the `Source` method honor cancellation of the context when waiting on providers.

**Link to tracking Issue:** Reported on Slack, see https://cloud-native.slack.com/archives/C01N6P7KR6W/p1691074033049049?thread_ts=1690992010.123699&cid=C01N6P7KR6W
Added log syncing for every service separately, to provide the ability to have a custom logger for each service. The data race in logging is fixed by adding a log sync; this problem is related to uber-go/zap#687.

Close #2973
Close #2974
Close #3112
Close #3217

Signed-off-by: Ekaterina Pavlova <[email protected]>
I've run into this quite a bit. Here's an example: https://travis-ci.com/ubclaunchpad/pinpoint/jobs/183635090#L564 (relevant PR: ubclaunchpad/pinpoint#152; the race condition only appeared after converting test loggers to `zaptest`), and another one: https://travis-ci.com/RTradeLtd/Nexus/builds/103797950#L717

It seems to me that `testing.T.Log()` etc. are not thread-safe? While it kind of makes sense that the test logger isn't thread-safe (it's only meant to be used sequentially in a test, after all), once it is passed to `zaptest.NewLogger()` it enters all sorts of potentially racy conditions, which makes this rather problematic.

I think this can be resolved by just adding a mutex in `testingWriter`. I'd be happy to submit a patch! Let me know if this is known/intended behaviour, or if there's something else I should be doing differently on my end.

Edit: actually, I just saw this note: [quoted note not captured on this page]. Am I doing something wrong here? 🤔

Edit 2: I added a crude test, and under `-race` nothing seems to be off on its own. Any help would be appreciated!