2efper
I recently downloaded a project, and it can't import this plugin by any means. sbt reports a `not found` error for https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.tpolecat/tut-core_2.12/0.6.13/ivys/ivy.xml, but I found the correct address...
Command executed:

```
/root/Global-Encoding/RELEASE-1.5.5/ROUGE-1.5.5.pl -e /root/Global-Encoding/RELEASE-1.5.5/data -c 95 -2 -1 -U -r 1000 -n 4 -w 1.2 -a -m /tmp/tmphr15dtvf/rouge_conf.xml
```

Exception:

```
Illegal division by zero at /root/Global-Encoding/RELEASE-1.5.5/ROUGE-1.5.5.pl line 2450.
```

Return code: 255. Although there is [this issue](https://github.com/lancopku/Global-Encoding/issues/4) #4...
```
Failed fingerprinting
Traceback (most recent call last):
  File "E:\dejavu\dejavu\__init__.py", line 77, in fingerprint_directory
    song_name, hashes, file_hash = iterator.next()
  File "C:\Python27\lib\multiprocessing\pool.py", line 668, in next
    raise value
WindowsError: [Error 2] Failed...
```
> info : (config): Using zig executable C:\Program Files (x86)\zig\zig_0.10.1
> error: (translate_c): Failed to execute zig translate-c process, error: error.AccessDenied

I gave full permissions to this folder, but nothing changed. And...
```
org.apache.spark.SparkException: Task not serializable
	at org.apache.spark.util.StarryClosureCleaner$.ensureSerializable(StarryClosureCleaner.scala:46) ~[classes/:2.3.1]
	at org.apache.spark.util.StarryClosureCleaner$.clean(StarryClosureCleaner.scala:40) ~[classes/:2.3.1]
	at org.apache.spark.util.StarryClosureCleaner$.clean(StarryClosureCleaner.scala:23) ~[classes/:2.3.1]
	at com.github.passionke.starry.StarrySparkContext.clean(StarrySparkContext.scala:9) ~[classes/:na]
	at org.apache.spark.rdd.HadoopRDD.<init>(HadoopRDD.scala:105) ~[spark-core_2.11-2.3.1.jar:2.3.1]
	at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1031) ~[spark-core_2.11-2.3.1.jar:2.3.1]
	at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1021) ~[spark-core_2.11-2.3.1.jar:2.3.1]
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.11-2.3.1.jar:2.3.1]
	at ...
```
https://github.com/passionke/starry/blob/cd183960a6a2b0bcf32170cb45e1664724e4449d/src/main/scala/org/apache/spark/sql/execution/StarryJoinLocalStrategy.scala#L76
```
if (crc.calcCRC(block) != readCRC(crc)) { throw new InvalidBlockException(); }
```

I think something is wrong with the algorithm; it always throws `InvalidBlockException`. Could anybody fix it?
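When *every* block fails the check, the two sides usually disagree on the CRC parameters (polynomial, initial value, or byte order of the stored checksum) rather than the data being corrupt. A minimal Python sketch of the same comparison, using the stdlib CRC-32 as a stand-in for `calcCRC`/`readCRC` (whose real implementations are assumptions here):

```python
import zlib

def calc_crc(block: bytes) -> int:
    # Stand-in for crc.calcCRC(block): standard CRC-32 over the payload.
    return zlib.crc32(block) & 0xFFFFFFFF

def read_crc(trailer: bytes) -> int:
    # Stand-in for readCRC(crc): the stored checksum, assumed big-endian here.
    # If the writer stored it little-endian, every comparison would fail,
    # which is exactly the "always throws" symptom.
    return int.from_bytes(trailer, "big")

block = b"example payload"
stored = calc_crc(block).to_bytes(4, "big")

# Same parameters on both sides: the check passes and no exception is raised.
assert calc_crc(block) == read_crc(stored)
```

Comparing the raw bytes of the stored checksum against the computed value in both byte orders is a quick way to tell a parameter mismatch from genuine corruption.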
```python
class JsonProvider(object):
    def __init__(self, rpc_addr, proxies=None):
        ...
        self.proxies = proxies
        ...

    def json_rpc(self, method, params, timeout=2):
        ...
        r = requests.post(self.rpc_addr(), json=j, timeout=timeout, proxies=self.proxies)
        ...
```

Example:

```python
proxies = {'http': 'socks5://localhost:1080', 'https': 'socks5://localhost:1080'}
provider = JsonProvider(node_url, proxies)
```