node-mysql2
cache and procedure
Dear developers, I am using the mysql2 lib with a pool connection in ExpressJS. When I make two requests (changing the request parameters), I sometimes get old (cached) data when reloading the page. Is this somehow connected with the LRU cache? It seems impossible to disable it through the connection parameters, or is it a bug?
The documentation says: // If you execute same statement again, it will be picked from a LRU cache // which will save query preparation time and give better performance. But I use a procedure with pool.query(). I patched my CALL procedure to add a Math.random() parameter, but it had no effect.
My source code:
const mysql = require('mysql2');

const mysqlpool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  database: 'base_test',
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  multipleStatements: false,
  //caching: true
});

module.exports.getQuery = function (str, brand, resout) {
  let rnd = Math.random();
  let sqlProcedure = 'CALL alfaquery_v2(?,?);';
  mysqlpool.query(sqlProcedure, [str, rnd], function (err, rows) {
    if (err) {
      console.log(err);
      resout.send([]);
      return; // avoid sending a second response after an error
    }
    console.log(rows[0]);
    resout.send(rows[0]);
  });
};
Only the statement handle is cached; the data is always fresh, whatever results the DB server sends you. I don't think this is related to the driver caching something.
Do you actually see console.log(rows[0]); printing results that do not match the input str? Try printing the parameter next to the results: console.log(str, rows[0]);
The same thing is happening to me as well. Right after an update, I sometimes get old data. I tried refreshing the page multiple times and the same thing keeps happening. This only happens with createPool; using createConnection, everything works fine.
This is my config file:
import '../env/index';
import mysql from 'mysql2/promise';

const dbConnectionInfo = {
  host: process.env.MARIADB_HOST,
  port: process.env.MARIADB_PORT,
  user: process.env.MARIADB_USER,
  password: process.env.MARIADB_PASS,
  connectionLimit: process.env.MARIADB_POOLSIZE,
  database: process.env.MARIADB_NAME,
  dateStrings: 'date',
};

const db = mysql.createPool(dbConnectionInfo);
export default db;
This is how I use it whenever I want to make a connection to the DB:
import db from './mysql.js';

const someFunc = async (input) => {
  const [rows] = await db.query('update User set userId = ?', [input]);
  return rows;
};
@outranker you might have a race condition, i.e. your read request completes before the update request.
When you use a single connection, all queries are performed sequentially. With a pool (if the connection limit is > 1) queries can run in parallel.
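The race can be reproduced without a database at all. A minimal sketch (all names here are illustrative, not from the thread): two operations fired back to back can complete out of order when they don't share a single connection, so a SELECT issued right after an UPDATE can still observe the old row.

```javascript
// Stand-in for a pooled query: with connectionLimit > 1, each query may run
// on a different connection, so completion order depends only on latency.
function fakeQuery(sql, latencyMs) {
  return new Promise((resolve) => setTimeout(() => resolve(sql), latencyMs));
}

async function demoRace() {
  const order = [];
  // The UPDATE is slow (50 ms); the SELECT fired right after is fast (10 ms).
  await Promise.all([
    fakeQuery('UPDATE ...', 50).then((sql) => order.push(sql)),
    fakeQuery('SELECT ...', 10).then((sql) => order.push(sql)),
  ]);
  return order; // the SELECT finishes first, i.e. before the UPDATE landed
}

demoRace().then((order) => console.log(order));
```

On a single connection the two queries would be serialized on the wire, which is why the problem disappears with createConnection.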
@sidorares I thought so in the beginning, but multiple page refreshes (3 or 4 times) after an update, spanning 10 to 20 seconds, either return the old record or the new one.
The connection limit is set to 30 and only my colleague and I are making requests to the DB, so exhausting the pool connections is not the issue, I believe.
I solved the issue. The culprit was the transactions! I think there is a bug with transactions. I replaced all of the transactions with regular queries and now everything is working fine.
I hit the same case today. Disabling transactions does the job, but it's not an option for me.
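If disabling transactions is not an option, one plausible cause of this symptom is a pooled connection being released while its transaction is still open: the uncommitted change, or the stale snapshot of a still-running REPEATABLE READ transaction, is then seen by the next borrower of that connection. A hedged sketch of a helper that always commits or rolls back before releasing, assuming a mysql2/promise pool (the name withTransaction and the work callback are illustrative, not from this thread):

```javascript
// Runs `work` inside a transaction on ONE pooled connection and guarantees
// commit/rollback plus release, so no connection returns to the pool with a
// transaction still open.
async function withTransaction(pool, work) {
  const conn = await pool.getConnection();
  try {
    await conn.beginTransaction();
    const result = await work(conn); // all queries must go through `conn`, not `pool`
    await conn.commit();
    return result;
  } catch (err) {
    await conn.rollback();
    throw err;
  } finally {
    conn.release(); // always hand the connection back
  }
}
```

Usage would look like `await withTransaction(db, (conn) => conn.query('update User set userId = ?', [id]))`. The key point is that every query of the transaction runs on the same connection and the transaction is closed before release; mixing `pool.query()` calls into an open transaction silently runs them on other connections, outside the transaction.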