ml5-library
[devops] Refactor ml5 library & support ml5 in node.js
Dear ml5 community,
I'm submitting a new issue. Please see the details below.
→ Step 1: Describe the issue 📝
Did you find a bug? Want to suggest an idea for a feature?
- enhancement
- Related to: https://github.com/ml5js/ml5-library/issues/570
- cc/ @oveddan
This is a stretch goal, but I'd like to suggest doing the following:
- Refactor the ml5 library into a more module-based system:
  - Refactor ml5-library into sub-modules, like what we see in tfjs-models or turfjs. This will allow people to get the entire ml5 library or just the features of interest. (A rough sketch of what sub-module imports could look like follows this list.)
- Support ml5 in a node.js environment:
  - In the refactoring and restructuring of the library, I think we could gain a lot from supporting ml5 in the node.js environment. @oveddan has shown some really amazing examples -- e.g. https://github.com/oveddan/ml-text -- of how easy tfjs-node makes it to use tfjs in node, so it would be really wonderful to bring the same ml5 friendliness to that environment as well. This will take some more planning and a better understanding of how all these pieces fit together, but I think it would complete the "package" of ml5 in a really nice way.
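For illustration only, here is a rough sketch of what sub-module imports might look like after a split. The `@ml5/*` package names and the promise-based API are assumptions, not anything that exists today:

```js
// Hypothetical packages after a sub-module split -- these names do not exist yet.
// Pull in only the model you need:
import imageClassifier from '@ml5/image-classifier';

// ...or keep an all-in-one bundle for the browser / CDN use case:
import ml5 from 'ml5';

async function run(img) {
  // Same friendly ml5-style API, just delivered as a smaller package.
  const classifier = await imageClassifier('MobileNet');
  const results = await classifier.classify(img);
  console.log(results);
}
```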
This is ambitious, but I think it would be great to try to chip away at this over the next few months, with the goal of having something up and running by June/July.
Any suggestions for structure, devops best practices, etc would be awesome! Thanks!
Possible ways forward
- change everything to require tfjs-core rather than tfjs
- separate out the dependencies on the browser `window`, e.g. fetch and p5 functions
- centralize the fetch functions to handle the model fetching etc. (see the sketch after this list)
- allow ml5 at the very least to be imported to a nodejs environment, and then fix broken models incrementally.
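As a rough sketch of the "centralize the fetch functions" idea -- the helper name and the node-fetch fallback are assumptions, not existing ml5 code:

```js
// Sketch of a single model-fetching helper that never touches `window`
// directly, so the same code can run in the browser and in node.js.
// The node-fetch fallback is an assumption; any fetch polyfill would do.
async function fetchModelJson(url) {
  const fetchFn =
    typeof fetch !== 'undefined'
      ? fetch
      : (await import('node-fetch')).default;

  const response = await fetchFn(url);
  if (!response.ok) {
    throw new Error(`Failed to fetch model from ${url}: ${response.status}`);
  }
  return response.json();
}
```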
Suggestions for how to do this, inspired by the structure of tfjs and tfjs-models:
Version 1:
- Change all models to require `tfjs-core` and `tfjs-converter` for loading converted models, instead of `tfjs`. See the example in body-pix. This can probably be done in its own PR. (A sketch of what a model's imports could look like follows this list.)
- Move all of the models into `ml5-models`. Create a package.json in here that can be published as `@ml5/models`, and a rollup.config.js.
- Move the web bundle into a folder `ml5-web` - this would include the p5 things. Create a package.json and a rollup.config.js in this folder, and this can be published as the main package that gets downloaded from the CDN. tfjs-core gets bundled with this.
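To illustrate the first point, a model file could depend on tfjs-core and tfjs-converter only, roughly like this (the URL and function names are placeholders, not real ml5 code):

```js
// Sketch: load a converted model with tfjs-core + tfjs-converter instead of
// depending on the full @tensorflow/tfjs bundle (similar to what body-pix does).
import * as tf from '@tensorflow/tfjs-core';
import { loadGraphModel } from '@tensorflow/tfjs-converter';

// Placeholder URL -- each model would point at its own hosted model.json.
const MODEL_URL = 'https://example.com/some-model/model.json';

export async function load() {
  const model = await loadGraphModel(MODEL_URL);
  return model;
}

export function predict(model, input) {
  // tf here is tfjs-core only; the ops still work the same.
  return tf.tidy(() => model.predict(input));
}
```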
Version 2 (totally optional, and to be discussed):
- A package.json and publishable npm package for each model folder. Each model would be published under `@ml5/models/{model-name}` (see the import sketch below).
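Hypothetically, usage would then look something like this; neither package name exists today, both are purely illustrative:

```js
// Version 2 sketch: each model ships as its own npm package.
import posenet from '@ml5/models/posenet';
// ...while the browser bundle could still re-export everything:
// import ml5 from 'ml5';
```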
A nice example of how to switch backends (tfjs browser / tfjs-node (cpu) / tfjs-node-gpu (cuda)) is in tfjs-examples/lstm-text-generation:
```js
// From tfjs-examples/lstm-text-generation: parseArgs() is the example's own
// CLI argument parser; passing --gpu selects the CUDA backend.
const args = parseArgs();
if (args.gpu) {
  console.log('Using GPU');
  require('@tensorflow/tfjs-node-gpu');
} else {
  console.log('Using CPU');
  require('@tensorflow/tfjs-node');
}
```
What's great is that the example works in the browser, and if you're running in node.js you just require the proper tfjs backend and it will switch to that.
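Building on that, ml5 itself could in principle detect the environment and load the right backend automatically. The following is only a sketch of that idea (the `useGpu` option is an assumption -- ml5 would need to decide how to expose such a switch):

```js
// Sketch: pick a tfjs backend based on the runtime environment.
function loadBackend({ useGpu = false } = {}) {
  const isNode =
    typeof process !== 'undefined' &&
    process.versions != null &&
    process.versions.node != null;

  if (isNode) {
    // Native bindings in node.js; the GPU package needs CUDA installed.
    return useGpu
      ? require('@tensorflow/tfjs-node-gpu')
      : require('@tensorflow/tfjs-node');
  }
  // In the browser the bundled @tensorflow/tfjs (WebGL backend) is used.
  return require('@tensorflow/tfjs');
}
```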
Is this actively being worked on?
That would be awesome!
I think this would also be pretty cool. Is there any active work being done on this?
Does this work or not? Will I be wasting my time trying to get this to function?
Hello, any news on this? Thanks