karma-babel-preprocessor
Adding in caching
Would you be interested in receiving a local file-storage caching mechanism in this project? I implemented it to improve independent Karma startup performance and was able to cut processing time in half.
It is based on the node-localcache project, storing an MD5 content hash of the originalPath file alongside the transformed output JS.
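For illustration, the approach is roughly the following. This is a minimal sketch reusing the LocalCache getItem/setItem calls that also appear in the patch further down the thread; the transformWithCache helper name is made up.

var crypto = require('crypto');
var LocalCache = require('node-localcache');

// Persistent cache file keyed by each file's original path.
var cache = new LocalCache('.cache/babel_caches.json');

// Hypothetical helper: reuse the cached output when the file's MD5 still
// matches, otherwise transpile and store the fresh result.
function transformWithCache(originalPath, content, transpile) {
  var md5 = crypto.createHash('md5').update(content).digest('hex');
  var entry = cache.getItem(originalPath);
  if (entry && entry.md5 === md5) {
    return entry.js;
  }
  var js = transpile(content);
  cache.setItem(originalPath, { md5: md5, js: js });
  return js;
}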
Hi @sb8244, I'm having a hard time with the start-up times and something along the lines of what you suggested would improve things tremendously. Would you like to publish your work in a fork of this project?
Is this available? I'm trying to add Karma to a very large project and experiencing very slow startup, on the order of a couple of minutes. With the log level set to debug, it looks like the majority of the time is spent in preprocessor.babel processing each file.
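For reference, Karma's log level can be raised to debug in karma.conf.js to surface the per-file preprocessor activity in the output; a minimal sketch:

// karma.conf.js
module.exports = function (config) {
  config.set({
    // Print debug-level messages, including preprocessor.babel's per-file logging.
    logLevel: config.LOG_DEBUG
  });
};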
I think the caching system should be in karma itself so that it's also available to other preprocessors.
I will put together a small code example today. Not sure if I'll be able to PR it that quickly, but maybe I can.
I don't think it should be in karma, personally.
Hi all, here is a patch that implements this.
I'll see about putting a PR together for this! I've been using this for a year without much problem.
@@ -1,7 +1,15 @@
'use strict';
+// Modified to include caching of the babel files
+
var babel = require('babel-core');
+var LocalCache = require('node-localcache');
+var fs = require('fs');
+var crypto = require('crypto');
+
+var cache = new LocalCache('.cache/babel_caches.json');
+
// @param args {Object} - Config object of custom preprocessor.
// @param config {Object} - Config object of babelPreprocessor.
// @param logger {Object} - Karma's logger.
@@ -17,8 +25,23 @@
var options = createOptions(args, config, helper, file);
file.path = options.filename || file.path;
- var processed = babel.transform(content, options).code;
- done(null, processed);
+ var path = file.originalPath;
+ var cacheEntry = cache.getItem(path);
+ var md5Hash = "";
+
+ fs.readFile(path, (err, data) => {
+ md5Hash = crypto.createHash('md5').update(data || "").digest('hex');
+
+ if (err || !cacheEntry || cacheEntry.md5 !== md5Hash) {
+ log.debug('Processing "%s" - Cache miss.', path);
+ var processed = babel.transform(content, options).code;
+ cache.setItem(path, { md5: md5Hash, js: processed });
+ done(null, processed);
+ } else {
+ log.debug('Processing "%s" - Cache hit.', path);
+ done(null, cacheEntry.js);
+ }
+ });
} catch (e) {
log.error('%s\n at %s', e.message, file.originalPath);
done(e, null);
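For context, the patch above only changes the preprocessor's internal process step, so the usual karma-babel-preprocessor wiring in karma.conf.js should stay the same, roughly:

// karma.conf.js
module.exports = function (config) {
  config.set({
    preprocessors: {
      'src/**/*.js': ['babel'],
      'test/**/*.spec.js': ['babel']
    },
    babelPreprocessor: {
      options: {
        presets: ['es2015'],
        sourceMap: 'inline'
      }
    }
  });
};

The on-disk cache is written to .cache/babel_caches.json, so that path is a reasonable candidate for .gitignore.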
Any news on this?
Extracted PR #34 for this. @johannesjo
I think this would be a fantastic feature to add. I'm running tests in parallel on several very slow machines, and I've sped up my workflow by running as many steps as possible on a single fast machine and then distributing the resulting artifacts to the slow test devices. If I could run Babel's transformations in advance as well, I could cut ~10 minutes of test startup time on each slow device.
@MattiasBuelens, are you a maintainer? Can you or someone else in charge comment on this proposal and/or suggest alternative approaches?
I revived #34 as #77, with some improvements. I simplified it to remove the extra file read, and I made the cache path configurable, with the default being not to cache at all (see the configuration sketch below).
In case this gets no traction from the maintainers, I published @joeyparrish/karma-babel-preprocessor on NPM for those who want to use it right away.
This has allowed me to cut 60 seconds of latency off the startup of tests on a large project.
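For anyone wiring up the configurable cache path, it might look roughly like this in karma.conf.js; the cachePath option name here is a hypothetical placeholder, so check PR #77 or the published package for the actual option:

// karma.conf.js (cachePath is a hypothetical option name)
module.exports = function (config) {
  config.set({
    preprocessors: { 'src/**/*.js': ['babel'] },
    babelPreprocessor: {
      options: { presets: ['es2015'], sourceMap: 'inline' },
      cachePath: '.cache/babel_caches.json' // omit to keep the default of no caching
    }
  });
};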