CoreNLP

Reducing the time for loading Stanford CoreNLP dependency parser model in an Android Studio project

Open ftoom235 opened this issue 7 years ago • 2 comments

I am developing an Android application that uses the Stanford CoreNLP pipeline with the annotators "tokenize, ssplit, pos, lemma" and the Stanford DependencyParser. I have managed to use them successfully in my app, but the parser takes too long to load its model. To be more specific, it takes too long to load the english_SD.gz model in these lines of code:

import edu.stanford.nlp.parser.nndep.DependencyParser;

String modelPath = "edu/stanford/nlp/models/parser/nndep/english_SD.gz";
DependencyParser parser = DependencyParser.loadFromModelFile(modelPath);

Loading takes about 2 to 3 minutes, which is unacceptable. Is there any way to reduce this time or otherwise work around the issue, without switching to a client-server architecture or changing the type of model I am using in the meantime?
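For completeness, the pipeline itself is created along these lines (a simplified sketch with just the annotators listed above):

import java.util.Properties;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

// The pos annotator also loads a tagger model when the pipeline is constructed
Properties props = new Properties();
props.setProperty("annotators", "tokenize, ssplit, pos, lemma");
StanfordCoreNLP pipeline = new StanfordCoreNLP(props);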

ftoom235 commented Feb 26 '18 22:02

Hi, it would be more useful if you could state your exact requirements; then we can decide on the type of model and annotators to use.
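For example, if you only need tokenization and sentence splitting, a pipeline like the sketch below loads no large model files at all; which annotators you can drop depends on your requirements.

import java.util.Properties;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

// tokenize and ssplit are rule-based and load no serialized models,
// so this pipeline starts up quickly
Properties props = new Properties();
props.setProperty("annotators", "tokenize, ssplit");
StanfordCoreNLP pipeline = new StanfordCoreNLP(props);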

Thanks

kagov commented Mar 06 '18 13:03

Same problem here. The NER classifier also takes a long time to load:

W/System.err: Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... Background concurrent copying GC freed 1216083(44MB) AllocSpace objects, 4(1576KB)
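For reference, that message corresponds to loading the 3-class English NER model, which can also be loaded directly (a sketch using the standard CRFClassifier API, with the same model path as in the log above):

import edu.stanford.nlp.ie.crf.CRFClassifier;
import edu.stanford.nlp.ling.CoreLabel;

// Deserializing the CRF model is the slow step reported in the log line above
CRFClassifier<CoreLabel> ner = CRFClassifier.getClassifierNoExceptions(
        "edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz");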

luckynarang commented May 03 '19 07:05