go-mysql-server
Intermittent DATA RACE error when creating MySQL Servers
We use the in-memory implementation provided by this library during our tests, and recently we've seen intermittent data race failures. Since they only happen occasionally they are hard to debug, and because the entire stack trace is inside this library, we thought you might be able to help.
WARNING: DATA RACE
Write at 0x00c00066a00f by goroutine 77:
github.com/dolthub/vitess/go/mysql.writeUint32()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/encoding.go:111 +0x2d9
github.com/dolthub/vitess/go/mysql.(*Conn).writeHandshakeV10()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/server.go:540 +0x23f
github.com/dolthub/vitess/go/mysql.(*Listener).handle()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/server.go:299 +0x3a8
Previous read at 0x00c00066a00f by goroutine 66:
runtime.slicebytetostring()
/opt/hostedtoolcache/go/1.16.5/x64/src/runtime/string.go:80 +0x0
github.com/dolthub/go-mysql-server/sql.stringType.Convert()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/sql/stringtype.go:254 +0x873
github.com/dolthub/go-mysql-server/sql.(*stringType).Convert()
<autogenerated>:1 +0xaa
github.com/dolthub/go-mysql-server/server.bindingsToExprs()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/handler.go:197 +0x4d4
github.com/dolthub/go-mysql-server/server.(*Handler).doQuery()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/handler.go:292 +0x1cfa
github.com/dolthub/go-mysql-server/server.(*Handler).errorWrappedDoQuery()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/handler.go:477 +0x71
github.com/dolthub/go-mysql-server/server.(*Handler).ComStmtExecute()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/handler.go:99 +0xaf
github.com/dolthub/vitess/go/mysql.(*Conn).handleNextCommand()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/conn.go:1060 +0x3418
github.com/dolthub/vitess/go/mysql.(*Listener).handle()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/server.go:472 +0x181a
Goroutine 77 (running) created at:
github.com/dolthub/vitess/go/mysql.(*Listener).Accept()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/server.go:266 +0x146
github.com/dolthub/go-mysql-server/server.(*Server).Start()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/server.go:118 +0x4f
github.com/verticalscope/fl-core/pkg/test.NewMysqlServer.func2()
/home/runner/work/fl-core/fl-core/pkg/test/mysql.go:100 +0x50
Goroutine 66 (running) created at:
github.com/dolthub/vitess/go/mysql.(*Listener).Accept()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/go/mysql/server.go:266 +0x146
github.com/dolthub/go-mysql-server/server.(*Server).Start()
/home/runner/go/pkg/mod/github.com/dolthub/[email protected]/server/server.go:118 +0x4f
github.com/verticalscope/fl-core/pkg/test.NewMysqlServer.func2()
/home/runner/work/fl-core/fl-core/pkg/test/mysql.go:100 +0x50
These errors appear while running our tests, which use the standard library's testing package and assertions from the testify library.
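Roughly, the failing tests look like the sketch below. This is illustrative only, not our actual code: startTestMysqlServer is a hypothetical stand-in for the NewMysqlServer helper visible in the trace, which runs server.Start() in a goroutine, and the schema names are placeholders.

package test

import (
	"database/sql"
	"sync"
	"testing"

	_ "github.com/go-sql-driver/mysql"
	"github.com/stretchr/testify/assert"
)

// Illustrative sketch only. startTestMysqlServer is a hypothetical helper
// that builds an in-memory engine, starts server.Start() in a goroutine,
// creates a table named t, and returns the listen address.
func TestConcurrentPreparedStatements(t *testing.T) {
	addr := startTestMysqlServer(t)

	var wg sync.WaitGroup
	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			db, err := sql.Open("mysql", "root@tcp("+addr+")/testdb")
			if !assert.NoError(t, err) {
				return
			}
			defer db.Close()
			// Placeholders make the driver use a server-side prepared
			// statement, so the server handles this via ComStmtExecute
			// with a string bind variable, the code path in the trace above.
			_, err = db.Exec("INSERT INTO t (name) VALUES (?)", "some value")
			assert.NoError(t, err)
		}()
	}
	wg.Wait()
}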
@Echelons Can you give some more details on the types of tests you are running? Any code that you can link to?
@VinaiRachakonda The code is in a private GitHub repo and is therefore unavailable.
At first glance it would seem that the server is reusing the memory allocated to handle bindvars in another request. We'll see if we can reproduce this on our end.
From a brief read through the vitess code that controls reading and writing packets, it looks like this code is just wrong. It uses a buffer pool, and immediately after reading the values of the bind vars from the wire, it recycles the byte buffer, returning it to the pool to be reused by other connections:
case ComStmtExecute:
	queryStart := time.Now()
	// parseComStmtExecute fills in bind variables whose byte slices still
	// point into `data`, the pooled read buffer.
	stmtID, _, err := c.parseComStmtExecute(c.PrepareData, data)
	// The buffer is recycled here, before the query executes, so another
	// connection can overwrite those bytes while they are still being read.
	c.recycleReadPacket()
I'm not sure yet where the reuse of the buffer is coming from, though. Still looking into it.
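If that is what's happening, one possible mitigation, sketched below under the assumption that the parsed bind variables are vitess *querypb.BindVariable values whose Value slices still alias the pooled read buffer, would be to deep-copy the values before the packet is recycled. This is a sketch of the idea, not a tested patch:

import querypb "github.com/dolthub/vitess/go/vt/proto/query"

// copyBindVarValues detaches each bind variable's Value from the pooled
// connection read buffer by copying it into freshly allocated memory.
// Assumption: bindVars is the map produced by parseComStmtExecute.
func copyBindVarValues(bindVars map[string]*querypb.BindVariable) {
	for _, bv := range bindVars {
		if bv == nil || len(bv.Value) == 0 {
			continue
		}
		// Appending to a nil slice allocates fresh backing storage.
		bv.Value = append([]byte(nil), bv.Value...)
	}
}

Called just before c.recycleReadPacket() in the ComStmtExecute case above, this would keep later reuse of the buffer from racing with the executing query, at the cost of an extra copy per bind variable.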
Apologies for the lack of response; we ended up going with another solution.
Sorry to hear you couldn't get it to work!
Keeping this open since the bug appears to be real.
Ran into this issue as well. Do you have any plans to address it, @zachmu?
Have also hit this
We believe this is fixed on main; the race was caused by inappropriate reuse of MySQL connection buffer slices. Please open a new issue if you run into it again.
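For anyone who lands here from a similar race report: the flagged pattern reduces to a string conversion reading a byte slice in one goroutine while another goroutine overwrites the same backing array after the buffer is handed out again. A standalone illustration (not go-mysql-server code) that go run -race should flag:

package main

import "fmt"

func main() {
	// Stands in for the pooled connection read buffer.
	buf := make([]byte, 16)
	copy(buf, "bind variable")

	done := make(chan string)
	go func() {
		// Reader: like stringType.Convert calling slicebytetostring on a
		// bind variable that still aliases the connection buffer.
		done <- string(buf[:13])
	}()

	// Writer: like another connection's handshake being written into the
	// recycled buffer while the first query is still running.
	copy(buf, "handshake bytes!")

	fmt.Println(<-done)
}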