Postgres arrays fail to scan
First off, coming from database/sql, I am really enjoying the benefits of using sqlx. Thank you to the team and contributors for all the work you've put into this library.
I've especially enjoyed the fact that this library can mostly be dropped in and used side by side with existing database/sql code. That being said, it seems I've run into a bit of a snag: Postgres arrays won't scan properly into my structs.
Using this as an example struct definition closely representing my scenario:

type Person struct {
	Name      string   `db:"name"`
	Languages []string `db:"languages"`
}
Using this as an example table definition closely representing my scenario:
CREATE TABLE "person" (
	"name" text NOT NULL,
	"languages" text[] NOT NULL
);
Using this example Select call closely representing my scenario:

people := []Person{}
if err := tx.Select(&people, `SELECT * FROM "person"`); err != nil {
	return err
}
I receive this error:
sql: Scan error on column index 9, name "languages": unsupported Scan, storing driver.Value type []uint8 into type *[]string
Changing the struct definition to use pq.StringArray does solve the issue:

type Person struct {
	Name      string         `db:"name"`
	Languages pq.StringArray `db:"languages"`
}

but it forces the rest of my code to depend on pq, which I don't find ideal.
Normally in database/sql-land, you would just wrap the struct field with pq.Array(...) at scan time, and the helper function would be responsible for converting to and from the type. But with Select scanning directly into structs, I have no way of using that helper. Or do I?
I'm not sure if this is a bug or by design. If it's by design, is there any advice you could offer to make this easier to manage?
Any advice would be appreciated. Thanks in advance.
Actually, I may have posted this too early despite feeling like I had done enough homework. I may have overestimated the severity of having to rely on a type exported by pq, but I'll leave this issue open to interpretation.
What if you implemented Scan for a custom type? I did something similar in order to scan JSONB arrays. For example:
type LanguageSlice []string
func parseJSONToModel(src interface{}, dest interface{}) error {
	var data []byte
	switch v := src.(type) {
	case []byte:
		data = v
	case string:
		data = []byte(v)
	case nil:
		// A NULL column leaves dest untouched.
		return nil
	default:
		return fmt.Errorf("unsupported Scan source type %T", src)
	}
	return json.Unmarshal(data, dest)
}

func (c *LanguageSlice) Scan(src interface{}) error {
	return parseJSONToModel(src, c)
}
Thanks for opening this, @syphoxy. I just stumbled upon the same issue, and pq.StringArray fixed it :)
Is there a plan to add array types to sqlx?
👍🏼 for adding array types.