AsStringMax() / AsAnsiStringMax()
Darren Oster on Stack Overflow had this brilliant suggestion: https://stackoverflow.com/a/2496809/1040437
Basically, use `.AsString(Int32.MaxValue)`. Pity there's no `.AsMaxString()` method, but I guess it's easy enough to put in...
The only difference is that I think they should be called `AsStringMax` and `AsAnsiStringMax`, so they're easier to discover via IntelliSense/ReSharper.
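A rough sketch of what the aliases could look like as extension methods. The interface below is a simplified stand-in so the example is self-contained; the real builder interface in FluentMigrator (`IColumnTypeSyntax<TNext>` in `FluentMigrator.Builders`) has many more members:

```csharp
using System;

// Simplified stand-in for FluentMigrator's column-type builder interface,
// just enough to show the shape of the proposed aliases.
public interface IColumnTypeSyntax<TNext>
{
    TNext AsString(int size);
    TNext AsAnsiString(int size);
}

public static class MaxStringExtensions
{
    // Proposed alias: sits next to AsString() in IntelliSense, so it is easy to discover.
    public static TNext AsStringMax<TNext>(this IColumnTypeSyntax<TNext> column)
        => column.AsString(int.MaxValue);

    // Non-Unicode counterpart of the alias above.
    public static TNext AsAnsiStringMax<TNext>(this IColumnTypeSyntax<TNext> column)
        => column.AsAnsiString(int.MaxValue);
}

// Tiny demo implementation that just echoes the requested size,
// used to show that the alias forwards int.MaxValue.
public class EchoColumn : IColumnTypeSyntax<string>
{
    public string AsString(int size) => $"AsString({size})";
    public string AsAnsiString(int size) => $"AsAnsiString({size})";
}
```

Against the real builders this would read roughly as `Create.Table("t").WithColumn("c").AsStringMax()`.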
For SQLite, see https://www.sqlite.org/faq.html#:~:text=(9)%20What%20is%20the%20maximum,all%20500%2Dmillion%20characters%20intact.
(9) What is the maximum size of a VARCHAR in SQLite?
SQLite does not enforce the length of a VARCHAR. You can declare a VARCHAR(10) and SQLite will be happy to store a 500-million character string there. And it will keep all 500-million characters intact. Your content is never truncated. SQLite understands the column type of "VARCHAR(N)" to be the same as "TEXT", regardless of the value of N.
For Firebird, see: https://firebirdsql.org/manual/migration-mssql-data-types.html#:~:text=Firebird%20can%20hold%20up%20to%2032%2C765,In%206.5%2C%20maximum%20was%20255.
| MSSQL Ver | Data Type | Firebird | MSSQL definition and comments |
|---|---|---|---|
| 6.5 | varchar | VARCHAR | Variable-length non-Unicode data with a maximum of 8,000 characters. Firebird can hold up to 32,765 characters. In 6.5, maximum was 255. |
For IBM DB2, searching docs is difficult. I found this tutorial: https://www.db2tutorial.com/db2-basics/db2-varchar/
Db2 VARCHAR type is used to store variable-length character strings. To define a variable-length character string column, you use the following syntax: `column_name VARCHAR(n)`. In this syntax, n is a positive integer that represents the maximum length, in bytes, that the column can store. n must be greater than zero and less than 32,740.
If you need to store a string whose length is longer than this, you should use the VARBINARY(n) data type instead.
For Oracle, docs are very clear. See: https://docs.oracle.com/en/database/oracle/oracle-database/19/refrn/datatype-limits.html#GUID-963C79C9-9303-49FE-8F2D-C8AAF04D3095
But it depends on knowing how the user configured MAX_STRING_SIZE.
This could be done via the command `show parameter max_string_size`. Not sure if the Oracle driver for .NET supports sending a `show` command? The following Oracle query may also be possible:

```sql
SELECT name, value
FROM v$parameter
WHERE name = 'max_string_size'
```
@PhenX any thoughts?
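If the parameter can be read, the choice it drives is small. A sketch of that decision (the helper and its name are hypothetical; the 4000/32767-byte limits are the documented STANDARD/EXTENDED values of `MAX_STRING_SIZE`):

```csharp
using System;

public static class OracleMaxString
{
    // Pick the widest VARCHAR2 the server accepts, based on the value read
    // from v$parameter for 'max_string_size' ("STANDARD" or "EXTENDED").
    public static string MaxVarchar2(string maxStringSize) =>
        string.Equals(maxStringSize, "EXTENDED", StringComparison.OrdinalIgnoreCase)
            ? "VARCHAR2(32767)"  // MAX_STRING_SIZE = EXTENDED
            : "VARCHAR2(4000)";  // MAX_STRING_SIZE = STANDARD (the default)
}
```

Anything beyond that limit would still have to fall back to `CLOB`.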
For MYSQL, see: https://dev.mysql.com/doc/refman/8.0/en/char.html#:~:text=Values%20in%20VARCHAR%20columns%20are,and%20the%20character%20set%20used.
Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 65,535. The effective maximum length of a VARCHAR is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used.
For Redshift, see: https://docs.aws.amazon.com/redshift/latest/dg/r_Character_types.html#r_Character_types-storage-and-ranges
The `CREATE TABLE` syntax supports the `MAX` keyword for character data types. For example: `create table test(col1 varchar(max));` The `MAX` setting defines the width of the column as 4096 bytes for `CHAR` or 65535 bytes for `VARCHAR`.
For Postgres, 😒
See: https://www.postgresql.org/docs/9.2/datatype-character.html
In any case, the longest possible character string that can be stored is about 1 GB. (The maximum value that will be allowed for n in the data type declaration is less than that. It wouldn't be useful to change this because with multibyte character encodings the number of characters and bytes can be quite different. If you desire to store long strings with no specific upper limit, use `text` or `character varying` without a length specifier, rather than making up an arbitrary length limit.)
My thoughts on this issue were: make `AsStringMax()` an alias for `AsString(int.MaxValue)` (or vice versa), which would be translated to a CLOB-style type for the different databases. In other words, the Postgres and MySQL processors would emit `text`, while the SQL Server processor would emit `varchar(max)` (which has a 2 GB limit, since SQL Server 2005, IIRC).
It's not an optimal solution, but better than nothing.
| Database | CLOB type |
|---|---|
| Firebird | BLOB SUB_TYPE 1 |
| MySQL | TEXT/NTEXT |
| SQL Server 2005+ | VARCHAR(MAX)/NVARCHAR(MAX) |
| Postgres | TEXT |
| Redshift | VARCHAR(65535) |
| Oracle | CLOB? |
| SQLite | TEXT |
EDIT: Added table
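Expressed as code, the table boils down to a per-database lookup. A sketch (names are hypothetical, not FluentMigrator's actual type-map API; the Oracle entry is tentative, as noted above):

```csharp
using System;
using System.Collections.Generic;

public static class ClobTypeMap
{
    // Per-database type that AsStringMax() could translate to
    // (mirrors the table above).
    private static readonly Dictionary<string, string> Map =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["Firebird"]  = "BLOB SUB_TYPE 1",
            ["MySql"]     = "TEXT",
            ["SqlServer"] = "NVARCHAR(MAX)",  // VARCHAR(MAX) for the ANSI variant
            ["Postgres"]  = "TEXT",
            ["Redshift"]  = "VARCHAR(65535)",
            ["Oracle"]    = "CLOB",           // tentative
            ["SQLite"]    = "TEXT",
        };

    public static string For(string database) =>
        Map.TryGetValue(database, out var type)
            ? type
            : throw new NotSupportedException($"No max-string mapping for '{database}'.");
}
```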
> For Oracle, docs are very clear. See: https://docs.oracle.com/en/database/oracle/oracle-database/19/refrn/datatype-limits.html#GUID-963C79C9-9303-49FE-8F2D-C8AAF04D3095
> But it depends on knowing how the user configured `MAX_STRING_SIZE`. This could be done via the command `show parameter max_string_size`. Not sure if the Oracle driver for .NET supports sending a `show` command? The following Oracle query may also be possible: `SELECT name, value FROM v$parameter WHERE name = 'max_string_size'`
> @PhenX any thoughts?
I'm not sure what to think about this. I recently struggled with this, and having a migration result in different SQL depending on server configuration feels weird to me. But I don't know how to handle this without having an option in FluentMigrator itself, telling if strings are limited to 2000/4000 or 32767.
Also, for many DBMSs there is a large difference between "char" strings and "text" strings: the latter can be much larger, but come with other limitations. I think it may be important to handle those differently, e.g. AsTextMax -> blob, text, etc. and AsStringMax -> char, varchar. What do you think?
> But I don't know how to handle this without having an option in FluentMigrator itself, telling if strings are limited to 2000/4000 or 32767. -- @PhenX
It sounds like you feel #1063 is a prerequisite to doing this feature correctly, and that for Oracle, if users don't like the lowest limit of 2000, they can override the OracleTypeMap instance and specify a new StringMax value.
> I'm not sure what to think about this. I recently struggled about this
@PhenX When you say "this", what struggle did you have, in particular? Just making sure I understand.