Custom ColumnTypeScanType
Is there a way to change the default scan type for a given PostgreSQL type? For example, I have a custom type for SQL date columns that implements the Scan and Value methods, but when I run a dynamic query and pass map[string]any as the gorm Scan target, it still uses time.Time{} for date types. Also, for the database numeric type it uses float64, which overflows for some values.
Wrt numeric types: Go doesn't have a built-in decimal type, and float64 is the largest floating-point type. If you want to use numeric, try shopspring/decimal. There are examples.
The other thing seems to be related to gorm. I don't think pgx should start to support gorm or any other ORM.
You're right, I can use decimal for numeric types and implement Scan and Value methods for my custom type. But again, my query is dynamic and I don't know the output shape, so I have to pass map[string]any as the scan target. In this case this method in pgx gets called:
```go
func (r *Rows) ColumnTypeScanType(index int) reflect.Type {
	fd := r.rows.FieldDescriptions()[index]

	switch fd.DataTypeOID {
	case pgtype.Float8OID:
		return reflect.TypeOf(float64(0))
	case pgtype.Float4OID:
		return reflect.TypeOf(float32(0))
	case pgtype.Int8OID:
		return reflect.TypeOf(int64(0))
	case pgtype.Int4OID:
		return reflect.TypeOf(int32(0))
	case pgtype.Int2OID:
		return reflect.TypeOf(int16(0))
	case pgtype.BoolOID:
		return reflect.TypeOf(false)
	case pgtype.NumericOID:
		return reflect.TypeOf(float64(0))
	case pgtype.DateOID, pgtype.TimestampOID, pgtype.TimestamptzOID:
		return reflect.TypeOf(time.Time{})
	case pgtype.ByteaOID:
		return reflect.TypeOf([]byte(nil))
	default:
		return reflect.TypeOf("")
	}
}
```
but this is not the behaviour I want when casting database types to Go types.
In the native pgx interface you can entirely replace the decoding logic for any given type by registering a new Codec for that PostgreSQL OID. But I don't think you can change this in the stdlib adapter; AFAIK, the database/sql interface doesn't allow it.
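For reference, a rough sketch of what that registration looks like with the native interface, assuming pgx v5 and its separate shopspring integration module (github.com/jackc/pgx-shopspring-decimal); this requires a live connection, so treat it as an outline rather than a runnable snippet:

```go
package main

import (
	"context"
	"os"

	pgxdecimal "github.com/jackc/pgx-shopspring-decimal"
	"github.com/jackc/pgx/v5"
)

func main() {
	conn, err := pgx.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer conn.Close(context.Background())

	// Swaps in a codec so numeric columns scan into decimal.Decimal
	// instead of a float64 that can overflow.
	pgxdecimal.Register(conn.TypeMap())

	// The generic form for any OID is conn.TypeMap().RegisterType(...)
	// with a pgtype.Type carrying your own Codec implementation.
}
```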