go-metrics
Global State Race
The global state in this library is a problem: it causes data races when multiple servers that use it are spun up at the same time, as can happen in testing: https://github.com/rcrowley/go-metrics/blob/master/runtime.go#L11
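For reference, the linked line sits in a block of package-level variables that every caller in the process shares. Abridged and paraphrased from runtime.go (treat the exact names as approximate, not a verbatim quote):

```go
// Package-level state in go-metrics' runtime.go, shared by every
// Registry in the process (abridged).
var (
	memStats runtime.MemStats // scratch buffer reused on each capture
	frees    uint64           // previous totals, kept to compute deltas
	lookups  uint64
	mallocs  uint64
)
```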
In my code, I start a server like so:

type Server struct {
    config           config.Configuration
    metricsPublisher *pfmetrics.Publisher
    metricsRegistry  metrics.Registry
    ...
In my New method, I set those values like so:

// srv is a *Server and c is the config
srv.metricsRegistry = metrics.NewRegistry()
srv.metricsPublisher = pfmetrics.NewPublisher(fmt.Sprintf("%s:%s", c.CarbonHost, c.CarbonPort), c.CarbonPrefix, srv.metricsRegistry)
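For context, judging from the stack trace below, the publisher's Start loop periodically snapshots runtime stats into the per-instance registry. A minimal sketch of that shape (only the call to metrics.CaptureRuntimeMemStatsOnce is confirmed by the trace; the Publisher fields and the rest of the loop are my reconstruction, not pfmetrics's actual code):

```go
package pfmetrics // hypothetical reconstruction, not the real package

import (
	"time"

	"github.com/rcrowley/go-metrics"
)

// Publisher fields here are assumptions for the sketch.
type Publisher struct {
	registry metrics.Registry
	interval time.Duration
	done     chan struct{}
}

// Start periodically snapshots runtime stats into this instance's
// registry. Despite taking a per-instance Registry, the capture call
// below mutates package-level state inside go-metrics.
func (p *Publisher) Start() {
	ticker := time.NewTicker(p.interval)
	defer ticker.Stop()
	for {
		select {
		case <-p.done:
			return
		case <-ticker.C:
			metrics.CaptureRuntimeMemStatsOnce(p.registry)
		}
	}
}

// Stop ends the capture loop.
func (p *Publisher) Stop() { close(p.done) }
```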
In my tests, I want to run servers in parallel. That results in multiple concurrent calls to the following in my server's serve method:
go srv.metricsPublisher.Start()
defer srv.metricsPublisher.Stop()
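A test shaped like the following is enough to trigger the race under `go test -race`. The names New, testConfig, Serve, and Stop are hypothetical stand-ins for my server's actual constructor and lifecycle methods:

```go
package main_test // illustrative only

import (
	"fmt"
	"testing"
)

// TestServersInParallel spins up several servers concurrently. Each
// subtest gets its own Server and its own metrics.Registry, but the
// runtime-stats capture still shares package globals in go-metrics.
func TestServersInParallel(t *testing.T) {
	for i := 0; i < 4; i++ {
		t.Run(fmt.Sprintf("server-%d", i), func(t *testing.T) {
			t.Parallel() // run the servers concurrently
			srv := New(testConfig(t))
			go srv.Serve() // Serve calls srv.metricsPublisher.Start()
			defer srv.Stop()
		})
	}
}
```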
Even though Start and Stop are methods on my srv instance, they mutate the globally scoped metrics state in the library, leading to data races like the following:
==================
WARNING: DATA RACE
Read at 0x000001a52df8 by goroutine 61:
github.com/sendgrid/mta/vendor/github.com/rcrowley/go-metrics.CaptureRuntimeMemStatsOnce()
/Users/sethammons/workspace/go/src/github.com/sendgrid/mta/vendor/github.com/rcrowley/go-metrics/runtime.go:137 +0xa12
github.com/sendgrid/mta/vendor/github.com/sendgrid/platformlib/pfmetrics.(*Publisher).Start()
/Users/sethammons/workspace/go/src/github.com/sendgrid/mta/vendor/github.com/sendgrid/platformlib/pfmetrics/publisher.go:104 +0x469
Previous write at 0x000001a52df8 by goroutine 14:
github.com/sendgrid/mta/vendor/github.com/rcrowley/go-metrics.CaptureRuntimeMemStatsOnce()
/Users/sethammons/workspace/go/src/github.com/sendgrid/mta/vendor/github.com/rcrowley/go-metrics/runtime.go:138 +0xa4e
github.com/sendgrid/mta/vendor/github.com/sendgrid/platformlib/pfmetrics.(*Publisher).Start()
/Users/sethammons/workspace/go/src/github.com/sendgrid/mta/vendor/github.com/sendgrid/platformlib/pfmetrics/publisher.go:104 +0x469
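The two frames at runtime.go:137-138 point at the unsynchronized read-modify-write inside CaptureRuntimeMemStatsOnce. Paraphrased (names and exact lines approximate), the pattern is:

```go
// Inside metrics.CaptureRuntimeMemStatsOnce, paraphrased: compute a
// delta against a package-level counter, then overwrite that counter.
// Two publishers calling this concurrently race on `frees` (and on
// memStats itself), regardless of which Registry each one passes in.
runtime.ReadMemStats(&memStats)
runtimeMetrics.MemStats.Frees.Update(int64(memStats.Frees - frees)) // read of frees
frees = memStats.Frees                                              // write of frees
```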
I sent a PR for #252 that should fix this: https://github.com/rcrowley/go-metrics/pull/253
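In case it helps review, one straightforward shape for such a fix is to serialize the capture with a package-level mutex. This is a sketch of the general technique, not necessarily what the PR actually does:

```go
var captureMu sync.Mutex // guards memStats, frees, lookups, mallocs, ...

func CaptureRuntimeMemStatsOnce(r Registry) {
	captureMu.Lock()
	defer captureMu.Unlock()
	// ...existing snapshot-and-delta logic, now serialized so that
	// concurrent publishers no longer race on the shared counters...
}
```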
Looked over the PR, seems like it should do it from my perspective! Thanks