
data race

Open · Icarus9913 opened this issue 2 years ago · 1 comment

Describe the version: spiderpool v0.0.5

Describe the bug: One nightly CI run reported a data race during the e2e tests. Here's the e2e test report:

Ginkgo ran 12 suites in 4m51.835640533s
Test Suite Passed
debugEnv.sh : E2E_KUBECONFIG /home/runner/work/spiderpool/spiderpool/test/.cluster/spiderpool0803120355/.kube/config 
output debug information to /home/runner/work/spiderpool/spiderpool/test/e2edebugLog.txt
debugEnv.sh : E2E_KUBECONFIG /home/runner/work/spiderpool/spiderpool/test/.cluster/spiderpool0803120355/.kube/config 
output debug information to /home/runner/work/spiderpool/spiderpool/test/e2edebugLog.txt
debugEnv.sh : E2E_KUBECONFIG /home/runner/work/spiderpool/spiderpool/test/.cluster/spiderpool0803120355/.kube/config 
output debug information to /home/runner/work/spiderpool/spiderpool/test/e2edebugLog.txt
debugEnv.sh : E2E_KUBECONFIG /home/runner/work/spiderpool/spiderpool/test/.cluster/spiderpool0803120355/.kube/config 
output debug information to /home/runner/work/spiderpool/spiderpool/test/e2edebugLog.txt
error, found data race !!!
failed to run e2e test
make[1]: *** [Makefile:215: e2e_test] Error 1
make: *** [Makefile:296: e2e_test] Error 2
make[1]: Leaving directory '/home/runner/work/spiderpool/spiderpool/test'
::set-output name=pass::false
##[debug]steps.dualstack_e2e.outputs.pass='false'
::set-output name=updaloadlog::false
##[debug]steps.dualstack_e2e.outputs.updaloadlog='false'
error, did not find e2e report
::set-output name=upload::true
##[debug]steps.dualstack_e2e.outputs.upload='true'
##[debug]Finishing: Run e2e Test For Dual-stack

Screenshots and log
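(These "WARNING: DATA RACE" reports are produced by Go's built-in race detector, so the e2e binaries were presumably built with the -race flag; the same check can be reproduced locally with go test -race.)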

WARNING: DATA RACE
Write at 0x00c0003975e0 by goroutine 157:
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:667 +0x234
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Previous write at 0x00c0003975e0 by goroutine 152:
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:667 +0x234
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Goroutine 157 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:55 +0xdb
  k8s.io/client-go/tools/cache.(*sharedIndexInformer).Run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:436 +0x744
  k8s.io/client-go/informers.(*sharedInformerFactory).Start.func2()
      /src/vendor/k8s.io/client-go/informers/factory.go:134 +0x59

Goroutine 152 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:55 +0xdb
  k8s.io/client-go/tools/cache.(*sharedIndexInformer).Run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:436 +0x744
  github.com/spidernet-io/spiderpool/pkg/gcmanager.(*SpiderGC).startPodInformer.func2()
      /src/pkg/gcmanager/pod_informer.go:29 +0x59
==================
WARNING: DATA RACE
Read at 0x00c0003bb748 by goroutine 162:
  k8s.io/utils/buffer.(*RingGrowing).WriteOne()
      /src/vendor/k8s.io/utils/buffer/ring_growing.go:55 +0x6a
  k8s.io/client-go/tools/cache.(*processorListener).pop()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:800 +0x228
  k8s.io/client-go/tools/cache.(*processorListener).pop-fm()
      <autogenerated>:1 +0x39
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Previous write at 0x00c0003bb748 by goroutine 139:
  k8s.io/utils/buffer.(*RingGrowing).ReadOne()
      /src/vendor/k8s.io/utils/buffer/ring_growing.go:41 +0x2e4
  k8s.io/client-go/tools/cache.(*processorListener).pop()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:787 +0x4e7
  k8s.io/client-go/tools/cache.(*processorListener).pop-fm()
      <autogenerated>:1 +0x39
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Goroutine 162 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:665 +0xf1
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Goroutine 139 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:665 +0xf1
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73
==================
==================
WARNING: DATA RACE
Read at 0x00c0003bb740 by goroutine 162:
  k8s.io/utils/buffer.(*RingGrowing).WriteOne()
      /src/vendor/k8s.io/utils/buffer/ring_growing.go:70 +0x38e
  k8s.io/client-go/tools/cache.(*processorListener).pop()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:800 +0x228
  k8s.io/client-go/tools/cache.(*processorListener).pop-fm()
      <autogenerated>:1 +0x39
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Previous write at 0x00c0003bb740 by goroutine 139:
  k8s.io/utils/buffer.(*RingGrowing).ReadOne()
      /src/vendor/k8s.io/utils/buffer/ring_growing.go:48 +0x4c7
  k8s.io/client-go/tools/cache.(*processorListener).pop()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:787 +0x4e7
  k8s.io/client-go/tools/cache.(*processorListener).pop-fm()
      <autogenerated>:1 +0x39
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Goroutine 162 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:665 +0xf1
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Goroutine 139 (running) created at:
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71 +0xdc
  k8s.io/client-go/tools/cache.(*sharedProcessor).run.func1()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:665 +0xf1
  k8s.io/client-go/tools/cache.(*sharedProcessor).run()
      /src/vendor/k8s.io/client-go/tools/cache/shared_informer.go:668 +0x51
  k8s.io/client-go/tools/cache.(*sharedProcessor).run-fm()
      <autogenerated>:1 +0x44
  k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:56 +0x3e
  k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
      /src/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:73 +0x73

Additional context: the race lies deep inside k8s.io/client-go.
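Reading the two "created at" stacks in the first report, both racing goroutines enter sharedProcessor.run via sharedIndexInformer.Run: one Run is launched by informers.(*sharedInformerFactory).Start, the other directly by gcmanager's startPodInformer. A minimal sketch of that double-start pattern (sketch only, with illustrative names, not the actual spiderpool code):

```go
// Sketch only: reproduces the suspected double-start of one informer.
package sketch

import (
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
)

func startTwice(client kubernetes.Interface, stopCh <-chan struct{}) {
	factory := informers.NewSharedInformerFactory(client, 30*time.Second)
	podInformer := factory.Core().V1().Pods().Informer()

	// First start: a direct Run call, like the one in
	// pkg/gcmanager/pod_informer.go (goroutine 152's creation site).
	go podInformer.Run(stopCh)

	// Second start: factory.Start also launches Run for every informer
	// it handed out, including podInformer (goroutine 157's creation
	// site). The factory has no record of the direct Run above, so it
	// starts the informer again; two concurrent Run calls drive
	// sharedProcessor.run twice, matching the write/write race reported.
	factory.Start(stopCh)
}
```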

Icarus9913 · Aug 04 '22, 11:08

Related client-go issue: https://github.com/kubernetes/client-go/issues/1143
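Until that upstream fix lands, the usual workaround is to make sure each informer's Run is invoked exactly once, e.g. by letting the factory be the single starter. A hedged sketch under that assumption (names are illustrative, not spiderpool's actual API):

```go
// Sketch only: start the informer exactly once, via the factory.
package sketch

import (
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
)

func startOnce(client kubernetes.Interface, stopCh <-chan struct{}) cache.SharedIndexInformer {
	factory := informers.NewSharedInformerFactory(client, 30*time.Second)
	podInformer := factory.Core().V1().Pods().Informer()

	// Register event handlers before starting the informer.
	podInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) { /* handle pod add */ },
	})

	// factory.Start runs every informer it created in its own goroutine
	// and tracks what it has started, so repeated Start calls won't
	// re-run an informer. Dropping the direct go podInformer.Run(...)
	// leaves this as the single entry point.
	factory.Start(stopCh)

	// Wait for the initial cache sync before using the informer's store.
	cache.WaitForCacheSync(stopCh, podInformer.HasSynced)
	return podInformer
}
```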

Icarus9913 · Aug 04 '22, 11:08