SidWeng
@alexandrnikitin I've sent PR #40 to fix this issue; please review it.
@alexandrnikitin I cannot figure out why `java.lang.NoClassDefFoundError: sbt/inc/Analysis` happens. Any advice?
@alexandrnikitin Could you post the JVM crash log or the steps to reproduce?
I tried to reproduce it, and it looks like something is wrong with `unsafe.getLong()` in `UnsafeBitArray.set()`, but I still cannot find the root cause (I guess the index is too big and causes...
I added some [debug logging in UnsafeBitArray.scala](https://gist.github.com/SidWeng/0512793ba6e77df420a401a9f18945ad) and ran `sbt "project tests" "testOnly *UnsafeBitArraysSpec"`. The following is the debug output just before the crash (from `unsafe.getLong(offset)` in `UnsafeBitArray.set()`): numberOfBits: 2147483647 indices: 33554432 ptr: 4805672960...
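My guess above (the index being too big) would match a classic 32-bit overflow in offset arithmetic. A minimal Python sketch simulating JVM `Int` wraparound with the `numberOfBits` value from the log; the `to_int32` helper and the offset formulas are assumptions for illustration, not the library's actual code:

```python
def to_int32(x):
    """Simulate JVM 32-bit Int wraparound."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

number_of_bits = 2147483647  # Int.MaxValue, as in the debug log
index = number_of_bits - 1   # highest valid bit index

# Safe: byte offset of the containing 64-bit word, computed in wide arithmetic.
word_offset = (index >> 6) * 8
print(word_offset)  # 268435448, well inside a ~256 MB allocation

# Hypothetical buggy variant: a byte-granular offset computed in 32-bit Int
# arithmetic wraps negative, so a read like unsafe.getLong(ptr + offset)
# would land far outside the allocation and crash the JVM.
byte_offset = to_int32(index * 8)
print(byte_offset)  # -16 after Int overflow
```

If the real code widens to `Long` before multiplying, this particular failure mode is ruled out, and the crash would have to come from somewhere else.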
I solved it with:

```python
class RulesPolyModelBase(RulesModelBaseMixin, PolymorphicModelBase):
    pass

class MyModel(RulesModelMixin, PolymorphicModel, metaclass=RulesPolyModelBase):
    ...
```
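For context, here is a minimal standalone sketch of the metaclass-conflict pattern that fix resolves. The `type`-based stand-ins below are assumptions for illustration; the real base classes come from django-rules and django-polymorphic:

```python
# Hypothetical stand-ins for the two libraries' metaclasses.
class RulesModelBaseMixin(type):
    pass

class PolymorphicModelBase(type):
    pass

class PolymorphicModel(metaclass=PolymorphicModelBase):
    pass

# Declaring a metaclass that is unrelated to the base class's metaclass
# raises: TypeError: metaclass conflict ...
try:
    class Broken(PolymorphicModel, metaclass=RulesModelBaseMixin):
        pass
except TypeError as e:
    print("conflict:", e)

# The fix: a combined metaclass deriving from both, so Python's
# "most derived metaclass" check passes.
class RulesPolyModelBase(RulesModelBaseMixin, PolymorphicModelBase):
    pass

class Works(PolymorphicModel, metaclass=RulesPolyModelBase):
    pass
```

Because `RulesPolyModelBase` subclasses both metaclasses, it satisfies Python's rule that a class's metaclass must be a (non-strict) subclass of the metaclasses of all its bases.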
@revans2 Of course, and thanks for your explanation.

> In order to support these we would have to do a lot of JVM byte code analysis at the plan level...
@revans2 Sorry for the late reply. Here's an example from our scenario: we have a DataFrame of `Record`, and what we want to do is group the `Record`s by region, then do...
Same issue here, but it complains about `.create()`.
@maziyarpanahi I found the root cause, but I'm guessing it is not a bug. Please take a look: https://github.com/JohnSnowLabs/spark-nlp/discussions/14362#discussioncomment-10344195