
Postgres COPY statements hoard connections forever

Open · oka-tan opened this issue 3 years ago · 2 comments

Calling COPY with QueryContext seems to hold the connection forever without ever releasing it back into the pool.

Minimal example:

package main

import (
	"context"
	"database/sql"
	"fmt"
	"log"
	"time"

	"github.com/uptrace/bun"
	"github.com/uptrace/bun/dialect/pgdialect"
	"github.com/uptrace/bun/driver/pgdriver"
)

type Example struct {
	bun.BaseModel `bun:"table:example"`

	Foo string `bun:"foo,pk"`
}

func main() {
	// Give the Postgres container from the attached docker-compose.yml time to come up.
	time.Sleep(5 * time.Second)

	dsn := "postgres://postgres:postgres@postgres:5432/postgres?sslmode=disable"
	sqldb := sql.OpenDB(pgdriver.NewConnector(pgdriver.WithDSN(dsn)))
	// A single connection makes the leak visible immediately.
	sqldb.SetMaxOpenConns(1)

	db := bun.NewDB(sqldb, pgdialect.New())

	ctx := context.Background()

	err := db.ResetModel(ctx, &Example{})

	if err != nil {
		log.Fatal(err)
	}

	examples := []Example{
		{
			Foo: "foo",
		},
		{
			Foo: "bar",
		},
		{
			Foo: "baz",
		},
	}

	_, err = db.NewInsert().
		Model(&examples).
		Exec(ctx)

	if err != nil {
		log.Fatal(err)
	}

	for i := 0; i < 1000; i++ {
		log.Printf("Generating csv number %d\n", i)

		_, err := db.QueryContext(ctx, fmt.Sprintf("COPY example TO '/csv/%d.csv'", i))

		if err != nil {
			log.Fatal(err)
		}
	}
}

Example output:

2022/08/02 15:52:28 Generating csv number 0
2022/08/02 15:52:28 Generating csv number 1

and it gets stuck there. Bumping SetMaxOpenConns to 2 makes it hang after the second query, 3 after the third, and so on; if the value is set high enough it instead crashes with a "too many open connections" error, hence the conclusion in the title.
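
As an aside, the pool state can be inspected with database/sql's Stats(). This is a minimal helper of my own (not part of the attached example); dropping it into the loop above and calling it after each COPY shows the in-use count climbing and never dropping:

// logPoolStats prints database/sql's connection-pool counters. It only
// reads sql.DBStats, so it can be called anywhere the *sql.DB is in scope.
func logPoolStats(sqldb *sql.DB) {
	stats := sqldb.Stats()
	log.Printf("open=%d in-use=%d idle=%d waits=%d",
		stats.OpenConnections, stats.InUse, stats.Idle, stats.WaitCount)
}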

Attempted solutions:

  • Waiting 5 seconds after calling QueryContext to see if the connection eventually goes back to the pool: doesn't work.
  • Adding ";" to the end of the query: doesn't work.
  • Running all the COPY commands inside a transaction and then rolling it back: works (a sketch of this workaround follows below).
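
For reference, a rough sketch of that last workaround, using the plain database/sql handle (sqldb) from the example above; the same pattern should work through bun's own transaction API. Error handling is kept minimal:

// Sketch of the transaction workaround: every COPY runs on the single
// connection pinned by the transaction, and Rollback hands that
// connection back to the pool at the end.
tx, err := sqldb.BeginTx(ctx, nil)
if err != nil {
	log.Fatal(err)
}

for i := 0; i < 1000; i++ {
	// The rows are deliberately left unclosed here, mirroring the
	// original code; per the report above, the rollback still frees
	// the connection.
	if _, err := tx.QueryContext(ctx, fmt.Sprintf("COPY example TO '/csv/%d.csv'", i)); err != nil {
		log.Fatal(err)
	}
}

if err := tx.Rollback(); err != nil {
	log.Fatal(err)
}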

Attached as bun-example.tar.gz is a complete example with a go.mod, go.sum, Dockerfile and docker-compose.yml that scaffolds everything. The only extra step needed is creating the csv folder and granting postgres permission to write to it.

EDIT: It's a normal resource-leak problem, so it's probably not that hard to solve for someone familiar with the codebase, but if it turns out to be complicated I'll take a shot at it this weekend.

oka-tan · Aug 02 '22 13:08

You need to close the rows when you're done; Close() releases the connection back to the pool:

rows, err := db.QueryContext(ctx, fmt.Sprintf("COPY example TO '/csv/%d.csv'", i))
if err != nil {
	log.Fatal(err)
}
defer rows.Close()
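
Applied to the loop in the original example, that looks roughly like this. Inside a loop it's better to close the rows explicitly at the end of each iteration rather than defer, since deferred calls only run when the surrounding function returns:

for i := 0; i < 1000; i++ {
	log.Printf("Generating csv number %d\n", i)

	rows, err := db.QueryContext(ctx, fmt.Sprintf("COPY example TO '/csv/%d.csv'", i))
	if err != nil {
		log.Fatal(err)
	}

	// Closing the result releases the connection back to the pool
	// before the next iteration starts.
	if err := rows.Close(); err != nil {
		log.Fatal(err)
	}
}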

johandk89 · Sep 05 '22 13:09

Oh, I see, that's interesting.

Why do I not need to close the rows when calling something like db.QueryContext(ctx, "SELECT 1")?

oka-tan · Sep 05 '22 16:09