Kumar

Results: 7 issues by Kumar

[main] INFO epic.parser.models.ParserTrainer$ - Training Parser...
Exception in thread "main" java.lang.NullPointerException
    at scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:192)
    at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:192)
    at scala.collection.IndexedSeqLike$class.iterator(IndexedSeqLike.scala:90)
    at scala.collection.mutable.ArrayOps$ofRef.iterator(ArrayOps.scala:186)
    at epic.trees.Treebank$$anon$2.treesFromSection(Treebank.scala:125)
    at epic.trees.Treebank$$anonfun$treesFromSections$1.apply(Treebank.scala:67)
    at epic.trees.Treebank$$anonfun$treesFromSections$1.apply(Treebank.scala:67)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
    at ...

Hello, using trained models with the neural CRF parser is not described anywhere. How are they meant to be used?

Exception in thread "main" java.lang.NullPointerException
    at org.mapdb.Volume$ByteBufferVol.getLong(Volume.java:300)
    at org.mapdb.StoreDirect.checkHeaders(StoreDirect.java:112)
    at org.mapdb.StoreDirect.<init>(StoreDirect.java:100)
    at org.mapdb.StoreWAL.<init>(StoreWAL.java:46)
    at org.mapdb.DBMaker.makeEngine(DBMaker.java:582)
    at org.mapdb.DBMaker.make(DBMaker.java:556)
    at epic.util.CacheBroker$ActualCache.db$lzycompute(Cache.scala:50)
    at epic.util.CacheBroker$ActualCache.db(Cache.scala:49)
    at epic.util.CacheBroker.db(Cache.scala:38)
    at epic.util.CacheBroker$CacheMap.liftedTree1$1(Cache.scala:103)
    at epic.util.CacheBroker$CacheMap.theMap(Cache.scala:100)
    at epic.util.CacheBroker$CacheMap.getOrElseUpdate(Cache.scala:151)
    at ...

WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
[main] INFO epic.framework.ModelObjective - Inference took: 2.876s
[main] INFO epic.parser.models.NeuralParserTrainer$ - Validating...
[ForkJoinPool-1-worker-13] INFO epic.parser.ParseEval$ - Sentences parsed 100/100 (0.224s elapsed.)
[main] INFO ...

There are no examples and no description.

It stopped functioning when I started experimenting with the following data: q a m q X A f a v I n × d u m f i n ×...

The process terminated when training on 1,000,000 tokens, with the following error message:

terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)