Fix memory leak from unmanaged tfjs resources
When using webshap with tfjs-node, the least squares utility function leaks memory because it never explicitly releases the TensorFlow tensors it creates. Wrapping the TensorFlow-related logic in tf.tidy() lets tfjs automatically dispose of any intermediate tensors that are not used outside of the callback (the final result tensor still needs to be released manually by the caller).
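For illustration, the pattern looks roughly like this; the function below is a simplified stand-in for the leaking utility, not the actual webshap code:

const tf = require('@tensorflow/tfjs-node');

// Simplified stand-in for the leaking utility: every intermediate tensor
// created inside tf.tidy() is disposed automatically when the callback
// returns, and only the returned tensor survives.
function weightedProduct(xMat, yVec) {
  return tf.tidy(() => {
    const xT = tf.transpose(xMat);   // intermediate, auto-disposed
    const xTx = tf.matMul(xT, xMat); // intermediate, auto-disposed
    return tf.matMul(xTx, tf.matMul(xT, yVec)); // returned to the caller
  });
}

// The caller is still responsible for releasing the final result.
const x = tf.randomNormal([5, 3]);
const y = tf.randomNormal([5, 1]);
const result = weightedProduct(x, y);
result.data().then(() => {
  tf.dispose([x, y, result]);
  console.log(tf.memory().numTensors); // should be 0 once everything is disposed
});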
Below is a simple Node snippet that demonstrates the leak:
const { KernelSHAP } = require('webshap');
const tf = require('@tensorflow/tfjs-node'); // Importing tfjs-node is sufficient to force usage of node backend in webshap

(async () => {
  const inputTensor = new Array(10);
  for (let i = 0; i < inputTensor.length; i++) {
    inputTensor[i] = Math.random();
  }

  const iterations = 1000;
  for (let i = 0; i < iterations; i++) {
    const explainer = new KernelSHAP(
      async (inputData) => {
        return inputData.map(() => [0]);
      },
      [new Array(inputTensor.length).fill(0)],
      0.2022
    );
    await explainer.explainOneInstance(inputTensor);

    process.stdout.clearLine(0);
    process.stdout.cursorTo(0);
    process.stdout.write(
      `Computed SHAP value ${i}/${iterations}`,
    );
  }

  console.log('');
  console.log('Final memory usage:', tf.memory());
})();
Running the above with the current implementation shows memory usage of over 4 GB after 1000 iterations (also corroborated by Activity Monitor on macOS):
Final memory usage: {
  unreliable: true,
  numTensors: 11000,
  numDataBuffers: 11000,
  numBytes: 4366704000
}
With the change in this PR included, the result becomes:
Final memory usage: { unreliable: true, numTensors: 0, numDataBuffers: 0, numBytes: 0 }
Based on the TensorFlow.js documentation, I believe this issue would also affect the WebGL backend, but I haven't tested it.
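For anyone who wants to verify, the same repro could in principle be run against the WebGL backend in a browser build (using the plain @tensorflow/tfjs package instead of tfjs-node); a rough, untested sketch:

import * as tf from '@tensorflow/tfjs';

(async () => {
  // Switch to the WebGL backend before running the KernelSHAP loop above.
  await tf.setBackend('webgl');
  await tf.ready();
  console.log('Active backend:', tf.getBackend()); // expect 'webgl'
  // ...run the same explainOneInstance loop and inspect tf.memory() as before.
})();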