Robert Feldt
Has there been any discussion of implementing DREAM or its variant MCMC algorithms as samplers for Turing? Specifically, this paper includes simple Matlab implementations of DREAM and...
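For context, the core move in the DREAM family is a differential-evolution proposal: a chain jumps by a scaled difference of two other chains' states, x* = x_i + γ(x_a − x_b) + ε. A minimal pure-Python sketch of DE-MC (the ancestor of DREAM), not Turing's sampler interface; the γ value, noise scale, and target are illustrative:

```python
import math
import random

def demc_step(chains, log_post, gamma=1.68, eps=1e-4):
    """One DE-MC sweep: for each chain i, propose
    x* = x_i + gamma * (x_a - x_b) + noise using two other chains a, b."""
    n = len(chains)
    for i in range(n):
        a, b = random.sample([j for j in range(n) if j != i], 2)
        proposal = chains[i] + gamma * (chains[a] - chains[b]) + random.gauss(0.0, eps)
        # Standard Metropolis accept/reject on the log-posterior.
        if math.log(random.random()) < log_post(proposal) - log_post(chains[i]):
            chains[i] = proposal
    return chains

random.seed(1)
log_post = lambda x: -0.5 * x * x            # standard normal target
chains = [random.uniform(-3.0, 3.0) for _ in range(6)]
samples = []
for sweep in range(3000):
    chains = demc_step(chains, log_post)
    if sweep >= 500:                          # discard burn-in
        samples.extend(chains)
mean = sum(samples) / len(samples)
```

DREAM proper adds multiple difference pairs, crossover, and outlier-chain handling on top of this basic update.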
Thanks for this great package. I plan to implement some online entropy estimation algorithms. I was planning to do this as a separate package but then realized it might fit in this one....
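As a baseline for what "online entropy estimation" means here: a sketch of the plug-in (maximum-likelihood) estimator maintained incrementally over a discrete stream, in plain Python (class and method names are hypothetical, not this package's API):

```python
import math
from collections import Counter

class StreamingEntropy:
    """Plug-in entropy estimate H = -sum p log2 p, updated one symbol at a time."""
    def __init__(self):
        self.counts = Counter()
        self.n = 0

    def update(self, symbol):
        self.counts[symbol] += 1
        self.n += 1

    def entropy(self):
        # Empirical probabilities p = count / n over symbols seen so far.
        return -sum((c / self.n) * math.log2(c / self.n)
                    for c in self.counts.values())

est = StreamingEntropy()
for s in "ABAB" * 100:
    est.update(s)
h = est.entropy()  # uniform over {A, B}, so 1 bit
```

The plug-in estimator is biased low for small n; bias-corrected variants (Miller–Madow, etc.) adjust the same running counts.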
Hi, I'm interested in a Julia implementation of Domingos's VFDT, aka "Hoeffding Trees"; see, for example: http://weka.sourceforge.net/doc.dev/weka/classifiers/trees/HoeffdingTree.html This is a streaming algorithm for learning decision trees and might be very...
Wow, this is a very welcome package and I really look forward to using it. You mention that in the future you plan to make this usable in an online...
Has there been a recent change in the API? I tried the tutorial code https://nbviewer.jupyter.org/github/theogf/AugmentedGaussianProcesses.jl/blob/master/examples/Regression%20-%20Gaussian.ipynb on Julia 1.3 and get the error: ``` ERROR: MethodError: no method matching SVGP(::Array{Float64,2}, ::Array{Float64,1},...
I can load all the bert models but none of the scibert ones: ```julia julia> bert_model, wordpiece, tokenizer = pretrain"bert-uncased_L-12_H-768_A-12" [ Info: loading pretrain bert model: uncased_L-12_H-768_A-12.tfbson ... julia> bert_model,...
Thanks for a useful package. I just got a large Excel file where the data has been split into separate sheets, so only the first sheet has column headings. Is...
I have a situation where I need to train models with lots of missing information, i.e. the training matrix is very sparse (on the order of more than 90% of...
I would like to get the length of the compressed stream so far, without closing the stream or affecting continued compression. I understand most Codecs might not support...
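I can't speak to the Codec interface itself, but for illustration of the underlying mechanism: in Python's zlib the same effect is obtained by counting emitted bytes and using a sync flush, which forces out everything buffered so far while leaving the stream open for further input:

```python
import zlib

comp = zlib.compressobj()
pieces = []
for chunk in (b"hello " * 1000, b"world " * 1000):
    pieces.append(comp.compress(chunk))
# Z_SYNC_FLUSH emits all buffered output but does NOT end the stream.
pieces.append(comp.flush(zlib.Z_SYNC_FLUSH))
size_so_far = sum(len(p) for p in pieces)   # compressed length up to now
pieces.append(comp.compress(b"more data"))  # compression continues normally
pieces.append(comp.flush())                 # Z_FINISH finally ends the stream
```

Whether a given codec can do this depends on the format supporting a sync/partial flush; that is presumably why not every Codec could offer it.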
Thanks for this package; very useful. Would it make sense to include simple multi-word distance metrics like MOWE (mean/median of word embeddings) etc. in this package, or is that already...
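For concreteness, MOWE just averages the embeddings of a phrase's words and compares the resulting vectors, e.g. by cosine distance. A tiny Python sketch with hypothetical toy embeddings (the `emb` table and function names are made up for illustration):

```python
import math

def mean_vector(words, emb):
    """Mean-of-word-embeddings (MOWE) representation of a phrase."""
    vecs = [emb[w] for w in words if w in emb]
    dim = len(next(iter(emb.values())))
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

# Toy 2-D embeddings, purely illustrative.
emb = {"fast": [1.0, 0.0], "quick": [0.9, 0.1], "slow": [0.0, 1.0]}
d_far = cosine_distance(mean_vector(["fast", "quick"], emb),
                        mean_vector(["slow"], emb))
d_near = cosine_distance(mean_vector(["fast"], emb),
                         mean_vector(["quick"], emb))
```

The median variant replaces the per-dimension mean with a per-dimension median, which is more robust to outlier words.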