Inserting into a CLOB column fills the entire CLOB with x'00'
Hi
When inserting into a CLOB column, the CLOB is padded to its full size with x'00'.
When inserting a row using bindParameters and targeting a table with a CLOB column, x'00' bytes are added to fill the entire CLOB. If my CLOB is defined with a size of 1MB, select LENGTH(MYCLOB) from MYTABLE returns 1MB, not the size of my actual data.
If I use a 2D array, bindParameters( [ ['mytext', db.IN, db.SQL_ALL_TYPES] ] ) fills the full CLOB with x'00', so select LENGTH(MYCLOB) from MYTABLE returns 1MB.
If I use a 1D array, bindParameters( ['mytext'] ) does not fill the CLOB column with x'00', so select LENGTH(MYCLOB) from MYTABLE returns only the length of the data I actually inserted.
In the dbstmt.cc code, when using a 2D array for parameters, a CLOB must use bind indicator 0 or 1, since there is no dedicated CLOB indicator.
At line ~2601:
if (bindIndicator == 0 || bindIndicator == 1)
{ //Parameter is string or clob
    std::string string = value.ToString().Utf8Value();
    size_t str_length = string.length();
    const char *cString = string.c_str();
    param[i].valueType = SQL_C_CHAR;
    // CLI does not honor the buffer size parameter or the output size in the indicator
    // Ensure the buffer size is at least the size of parameter or the size of the string
    param[i].buf = (char *)calloc(std::max(static_cast<size_t>(param[i].paramSize), str_length) + 1, sizeof(char));
    // Set the indicator to be SQL_NTS
    // this helps in edge cases like empty string where indicator is set 0 (which CLI doesn't like for some reason)
    param[i].ind = SQL_NTS;
    if (param[i].io != SQL_PARAM_OUTPUT) {
        // for SQL_PARAM_INPUT and SQL_PARAM_INPUT_OUTPUT
        // copy the string to the buffer
        strcpy((char*)param[i].buf, cString);
    }
}
I see that param[i].ind = SQL_NTS; could be part of the problem. I think param[i].ind should be set to str_length for CLOBs, as is done for BLOBs.
When inserting a row using bindParameters and targeting a table with a CLOB column, x'00' bytes are added to fill the entire CLOB.
Are you adding the x'00' to fill the CLOB, or are you saying that nodejs-idb-connector is?
nodejs-idb-connector/CLI is adding the x'00' to fill the entire CLOB, since it works correctly when I use 1D array parameters but not with 2D array parameters.
I see that param[i].ind = SQL_NTS; could be part of the problem. I think param[i].ind should be set to str_length for CLOBs, as is done for BLOBs.
std::string string = value.ToString().Utf8Value();
const char *cString = string.c_str();
param[i].buf = (char *)calloc(std::max(static_cast<size_t>(param[i].paramSize), str_length) + 1, sizeof(char));
if (param[i].io != SQL_PARAM_OUTPUT) {
    strcpy((char*)param[i].buf, cString);
}
In the code you referenced, the data coming in from Node.js is converted to a std::string, accessed as a C-style null-terminated string, and then copied to an internal buffer that is initialized to all nulls. The copy is also a C-style string copy, and we set the indicator to tell CLI that the data is null-terminated. Any single one of these steps would interfere with inserting null bytes, but taken together there is essentially no way this could cause the data to be padded with null bytes.
Though I could see changing this so that if a user put embedded null bytes in their JS string, they would be inserted; today such data would be truncated incorrectly, which is the opposite of your problem.
It must be something else within the idb-connector code that is causing your issue, or else the problem is within CLI itself.
The problem is that I end up with a lot of x'00' values even though I only inserted a small string. If my CLOB is 10 MB, 10 MB is always written.
I think the key is that with 1D array params, the CLOB is written at the correct size, with no nulls appended at the end.
For 1D array params, execution ends up in this case:
case SQL_CLOB:
{
    param[i].valueType = SQL_C_CHAR;
    param[i].buf = (char *)calloc(param[i].paramSize + 1, sizeof(char));
    if(value.IsString())
    {
        std::string string = value.ToString().Utf8Value();
        const char *cString = string.c_str();
        if(strlen(cString) > 0)
        {
            strcpy((char *)param[i].buf, cString);
            param[i].ind = strlen(cString);
            break;
        }
    }
    // value is '' -> output param
    param[i].ind = param[i].paramSize;
}
The only difference I see is that param[i].ind is set to a value other than SQL_NTS.
Must be this code then:
// value is '' -> output param
param[i].ind = param[i].paramSize;
Since the code within the if(value.IsString()) block does everything using null-terminated strings.
Are you passing in something like null for the value?
I tested with the same JavaScript string and got different results (it works well with 1D array params but not with 2D array params).
Can you provide example JS code which can recreate the problem?
const db = require('idb-connector');
wDbconn = new db.dbconn();
...
dbStmt = new db.dbstmt(wDbconn);
var sql = "INSERT INTO MYTABLE (MYCLOB) VALUES ( ? ) WITH NONE\n";
var params = [
  ["MyTest", db.IN, db.SQL_ALL_TYPES]
];
dbStmt.prepare(sql, (error) => {
  if (error) {
    throw error;
  }
  dbStmt.bindParameters(params, (error) => {
    if (error) {
      throw error;
    }
    dbStmt.execute((out, error) => {
      if (error) {
        throw error;
      }
      const result = dbStmt.commit();
    });
  });
});
In this case, 'select LENGTH(MYCLOB) from MYTABLE' returns 1MB.
var params = [ ["MyTest", db.IN, db.SQL_ALL_TYPES] ];
Hello @abamara 👋🏿
Why are you using db.SQL_ALL_TYPES as the type?
There should be a db.CLOB that can be used with a 2D array to indicate that the type is a CLOB.
Hi,
If I use db.SQL_CLOB, I get this error: BIND INDICATOR FOR PARAMETER 4 IS INVALID
db.SQL_ALL_TYPES is equivalent to 0. Is db.CLOB also defined as 0?
@abmusse I can't recall how the idb-connector bind code works, but if the type passed in by the user is passed as the target type on SQLBindParameter, then this will not work properly for SQL_CLOB, as that requires binding as a LOB struct, IIRC.
db.SQL_ALL_TYPES is equivalent to 0. Is db.CLOB also defined as 0?
Using db.SQL_CLOB wouldn't work as you've seen.
Yes, SQL_ALL_TYPES and the CLOB alias are equivalent; both are 0, so they go down the same code path. I was unsure what SQL_ALL_TYPES was defined as, but it looks like it's the same as CLOB.
@kadler
In the 2D array case, the user passes in 0 to indicate "I want to bind a CLOB", and it looks like the indicator is set to SQL_NTS, as @abamara mentioned earlier.
This will later be used in the SQLBindParameter call.
https://github.com/IBM/nodejs-idb-connector/blob/6db9a2da0d1deb6e06d9b9e4b668ba1aa6124aad/src/db2ia/dbstmt.cc#L2601-L2619
In the case of the 1D Array we handle CLOBs differently:
https://github.com/IBM/nodejs-idb-connector/blob/6db9a2da0d1deb6e06d9b9e4b668ba1aa6124aad/src/db2ia/dbstmt.cc#L2747-L2763
We set the indicator to the parameter size that was retrieved from the SQLDescribeParam call.
So perhaps using SQL_NTS is causing the weird behavior?
Well, I see that we don't set ind in the case where strlen is <= 0, but that shouldn't affect this case: with "MyTest" it should be set to its length (6). I think we need to try to recreate this and debug.