
Possible data race in the logger?

Open · zhengshi1998 opened this issue on Nov 29, 2024 · 8 comments

I'm using the v1.9.7 (2024-11-11) release.

After building go2rtc with the race detector enabled and running it (go build -race && ./go2rtc), several data race reports are printed.

17:38:13.843 INF go2rtc platform=linux/amd64 revision=mod.aa12ae0 version=1.9.7
17:38:13.843 INF config path=/home/shizheng/go-benchmarks/go2rtc-1.9.7/go2rtc.yaml
17:38:13.844 ERR [rtsp] listen error=listen tcp :8554: bind: address already in use
17:38:13.844 INF [webrtc] listen addr=:8555/tcp
17:38:13.846 ERR [api] listen error=listen tcp :1984: bind: address already in use
==================
WARNING: DATA RACE
Read at 0x00c000002180 by goroutine 7:
  github.com/AlexxIT/go2rtc/internal/app.(*circularBuffer).Write()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/app/log.go:120 +0x92
  github.com/rs/zerolog.LevelWriterAdapter.WriteLevel()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/writer.go:27 +0x6f
  github.com/rs/zerolog.(*LevelWriterAdapter).WriteLevel()
      <autogenerated>:1 +0x1f
  github.com/rs/zerolog.multiLevelWriter.WriteLevel()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/writer.go:98 +0xe9
  github.com/rs/zerolog.(*multiLevelWriter).WriteLevel()
      <autogenerated>:1 +0x84
  github.com/rs/zerolog.(*Event).write()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:80 +0x2b2
  github.com/rs/zerolog.(*Event).msg()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:151 +0x439
  github.com/rs/zerolog.(*Event).Msg()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:110 +0xba
  github.com/AlexxIT/go2rtc/internal/api.listen()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/api/api.go:89 +0x91
  github.com/AlexxIT/go2rtc/internal/api.Init.gowrap1()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/api/api.go:72 +0x4d

Previous write at 0x00c000002180 by main goroutine:
  github.com/AlexxIT/go2rtc/internal/app.(*circularBuffer).Write()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/app/log.go:142 +0x525
  github.com/rs/zerolog.LevelWriterAdapter.WriteLevel()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/writer.go:27 +0x6f
  github.com/rs/zerolog.(*LevelWriterAdapter).WriteLevel()
      <autogenerated>:1 +0x1f
  github.com/rs/zerolog.multiLevelWriter.WriteLevel()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/writer.go:98 +0xe9
  github.com/rs/zerolog.(*multiLevelWriter).WriteLevel()
      <autogenerated>:1 +0x84
  github.com/rs/zerolog.(*Event).write()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:80 +0x2b2
  github.com/rs/zerolog.(*Event).msg()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:151 +0x439
  github.com/rs/zerolog.(*Event).Msg()
      /home/shizheng/golang/pkg/mod/github.com/rs/[email protected]/event.go:110 +0x5da
  github.com/AlexxIT/go2rtc/internal/webrtc.Init()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/webrtc/webrtc.go:56 +0x58e
  main.main()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/main.go:53 +0x78

Goroutine 7 (running) created at:
  github.com/AlexxIT/go2rtc/internal/api.Init()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/internal/api/api.go:72 +0x8d1
  main.main()
      /home/shizheng/go-benchmarks/go2rtc-1.9.7/main.go:45 +0x64
==================

This looks like it is caused by the logger in go2rtc/internal/app/log.go. The func (b *circularBuffer) Write(p []byte) (n int, err error) method is not protected by a mutex, so during the initialization phase, when multiple goroutines write to the buffer concurrently, data races can occur.

zhengshi1998, Nov 29 '24
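
To make the report concrete, the flagged pattern can be reduced to a small, self-contained sketch. The code below is hypothetical (it is not the go2rtc implementation); it only shows a Write method that mutates shared fields without synchronization and is called from two goroutines, the same shape as the api and webrtc init paths writing to the logger in the trace above. Built with go build -race, it produces the same kind of WARNING: DATA RACE report.

package main

import "sync"

// unsafeLog is a stand-in for a log sink whose Write method touches
// shared state without any synchronization.
type unsafeLog struct {
	n   int    // bytes written so far, updated without a lock
	buf []byte // shared backing buffer
}

func (l *unsafeLog) Write(p []byte) (int, error) {
	l.buf = append(l.buf, p...) // racy write to the shared slice header
	l.n += len(p)               // racy read-modify-write of the counter
	return len(p), nil
}

func main() {
	l := &unsafeLog{}
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 1000; j++ {
				l.Write([]byte("listen addr=:8555/tcp\n")) // concurrent, unguarded writers
			}
		}()
	}
	wg.Wait()
}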

The automated tooling does see a possible error here. As the person who wrote this code, I don't see any possible real errors here. A mutex would add unnecessary stalling; they should not be used in a function such as a logger.

AlexxIT, Nov 29 '24

Is it possible that in go2rtc/internal/app/log.go, MemoryLog is initialized as a logging buffer and then used by multiple goroutines without synchronization?

zhengshi1998, Nov 29 '24

Of course this function is called simultaneously from different goroutines. In very rare cases this can result in slightly garbled log lines, but in my opinion it will never lead to a panic or anything really scary. This is intentional behavior; logging should not slow down the program.

I don't like the race warnings either. But I don't know how to get around them without using mutexes.

AlexxIT, Nov 29 '24
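
For reference, the conventional way to remove these reports is to hold a sync.Mutex for the entire check/advance/append sequence in Write. The sketch below is illustrative only: the field names follow the snippet quoted later in this thread, but it is not the actual go2rtc code, and chunkSize is an assumed value.

package logbuf

import "sync"

const chunkSize = 1 << 16 // assumed chunk size, for illustration only

// circularBuffer mirrors the structure discussed in this thread,
// with a mutex serializing all writers.
type circularBuffer struct {
	mu     sync.Mutex
	w      int      // index of the chunk currently being written
	chunks [][]byte // fixed-size ring of chunks
}

func (b *circularBuffer) Write(p []byte) (n int, err error) {
	// The whole read-modify-write sequence runs under one lock, so b.w
	// can never be observed (or left) past cap(b.chunks).
	b.mu.Lock()
	defer b.mu.Unlock()

	if len(b.chunks[b.w])+len(p) > chunkSize {
		b.w++
		if b.w == cap(b.chunks) {
			b.w = 0 // wrap around to the first chunk
		}
		b.chunks[b.w] = b.chunks[b.w][:0] // reuse the old chunk's storage
	}
	b.chunks[b.w] = append(b.chunks[b.w], p...)
	return len(p), nil
}

An uncontended sync.Mutex lock/unlock typically costs on the order of tens of nanoseconds, which is small compared with formatting and serializing a log event, so the stalling concern mostly applies to heavily contended hot paths.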

I see, thanks for your reply. Should I treat it as a "benign race" (a race that does not affect program behavior)?

zhengshi1998, Nov 29 '24

Try analyzing the code to see whether it can lead to significant errors. In my opinion, no: https://github.com/AlexxIT/go2rtc/blob/d7cdc8b3b07f6fbff7daae2736377de98444b962/internal/app/log.go#L116-L144

AlexxIT, Nov 29 '24

Could this lead to an out-of-range error, though?

Suppose that at some moment b.w == cap(b.chunks) - 1 and two goroutines call the Write method concurrently.

Both goroutines pass the check if len(b.chunks[b.w])+n > chunkSize at line 120. At line 122, goroutine 1 executes b.w++; before goroutine 1 reaches the check b.w == cap(b.chunks), goroutine 2 also executes b.w++. Now b.w == cap(b.chunks) + 1, so both goroutine 1 and goroutine 2 fail the b.w == cap(b.chunks) check. b.w is therefore never reset to 0 and remains cap(b.chunks) + 1.

Later, at line 142, the access to b.chunks[b.w] indexes past the end of the slice, which in Go is a runtime panic (index out of range).

Does this make sense?

zhengshi1998, Nov 29 '24
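
The interleaving above can be replayed sequentially to make the outcome deterministic. The snippet below is a hypothetical walkthrough, not go2rtc code: it borrows the field names from the quoted snippet, performs both increments before either wrap-around check, and shows that the next chunk access panics.

package main

import "fmt"

func main() {
	chunks := make([][]byte, 4) // cap(chunks) == 4
	w := cap(chunks) - 1        // suppose the write index sits at the last chunk

	// Goroutine 1 passes the size check and increments w.
	w++ // w == 4
	// Goroutine 2 interleaves here and also increments w,
	// before goroutine 1 reaches its wrap-around check.
	w++ // w == 5

	// Both goroutines now evaluate the wrap-around check;
	// neither resets w, because w != cap(chunks).
	if w == cap(chunks) {
		w = 0
	}
	if w == cap(chunks) {
		w = 0
	}

	fmt.Println("w =", w, "len(chunks) =", len(chunks)) // w = 5 len(chunks) = 4
	_ = chunks[w]                                       // panics: index out of range [5] with length 4
}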

Looks like you're right.

AlexxIT, Nov 29 '24

Thank you for your response!

zhengshi1998, Nov 29 '24

https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.9

AlexxIT, Mar 10 '25