Anthony Blaom, PhD
I can't see anything wrong with LogLoss:

```julia
using MLJ
model = ConstantRegressor()
data = make_regression()
evaluate(model, data...; measure = LogLoss())
# Evaluating over 6 folds: 100%[=========================] Time: 0:00:02
#...
```
No, not public API.
For a little more context, I'm getting the error below when I change `julia` to `@example MNIST` [here](https://github.com/FluxML/MLJFlux.jl/blob/07e01b7f41bc3dd870dd08c7c97114cc45f91f5e/docs/src/extended_examples/MNIST/notebook.md?plain=1#L328).

````julia
┌ Warning: failed to run `@example` block in src/extended_examples/MNIST/notebook.md:328-331
│ ```@example...
````
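For anyone following along, the change in question is just the fence label. Documenter executes `@example` blocks when the docs are built, and blocks sharing the same label (here `MNIST`) run in one shared sandbox module, so state carries between them. A minimal sketch (the `using` line is a placeholder, not the actual MNIST code):

````markdown
```@example MNIST
# executed by Documenter at docs-build time; all `@example MNIST` blocks
# on the page share one sandbox module
using MLJFlux
```
````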
> @ablaom do you have any suggestion? how to strip away reference to CUDA data after machine is trained?

Uh. You could try:

```julia
mach = restore!(serializable(mach))
```

Methods exported...
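For anyone landing here later, a rough sketch of the intended workflow, as I understand it from the MLJ serialization docs (the use of `Serialization` and the file name are placeholders):

```julia
using MLJ            # `serializable` and `restore!` are re-exported from MLJBase
using Serialization

# assuming `mach` is an already-trained machine holding references to CUDA data:
smach = serializable(mach)     # copy with training data and caches stripped
serialize("mach.jls", smach)   # now safe to write to disk

mach2 = deserialize("mach.jls")
restore!(mach2)                # make the machine usable for prediction again
# predict(mach2, Xnew)
```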
> https://github.com/JuliaAI/MLJBase.jl/issues/750 it doesn't fully remove reference to data somehow? But looks like it helps quite a lot at least

I'm not 100% sure, but I doubt the cited issue...
Sorry, this is a little off topic, but if you use [EvoTrees.jl](https://github.com/Evovest/EvoTrees.jl), the pure Julia gradient tree boosting implementation, then I expect these sorts of issues either do not occur,...
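By way of illustration, a minimal sketch of loading it through MLJ (the hyperparameter value and data here are placeholders):

```julia
using MLJ

# load the EvoTrees.jl regressor from the MLJ model registry
EvoTreeRegressor = @load EvoTreeRegressor pkg=EvoTrees verbosity=0

model = EvoTreeRegressor(nrounds=100)
X, y = make_regression()
mach = machine(model, X, y) |> fit!
predict(mach, X)
```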
Thanks @EssamWisam for this.
Thanks for this. Looking at this again, I'm a little confused as to why it's even necessary. According to [this line](), the embedding layer weights are initialised using `glorot_uniform`, which...
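For example (a quick check of my own, not from the linked code), Flux's default initialiser already produces `Float32` weights:

```julia
using Flux
W = Flux.glorot_uniform(4, 3)  # default Flux initialiser
eltype(W)                      # Float32
```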
In your embedding, you first convert a level, such as `"male"` or `"female"`, to a float representation of an integer index, such as `1.0`, right? And then this is input to the embedding layer...
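To make sure we're talking about the same pattern, here's a minimal sketch using Flux's built-in `Embedding` layer; the MLJFlux internals may differ:

```julia
using Flux
emb = Flux.Embedding(2 => 4)  # 2 levels ("male"/"female") -> 4-dim vectors
idx = [1, 2, 1]               # integer-coded levels
emb(idx)                      # 4×3 Float32 matrix of embeddings
```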
> It's quite easy to make them Float32 upon generation

Yes, this is my preference. Can you do it here, so I don't have to find the relevant code? I...
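For the record, a generic illustration of what I mean by "upon generation" (not the actual code in question):

```julia
# generate Float32 arrays directly rather than converting afterwards
w64 = rand(3, 4)           # Float64 by default
w32 = Float32.(w64)        # conversion works, but allocates a second copy
w = rand(Float32, 3, 4)    # better: Float32 from the start
eltype(w)                  # Float32
```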