alerem18
I can save model parameters using BSON in a hook, but I don't know how to load the model back and test it on the env (without training, only testing).
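What I have in mind is roughly the following; a minimal sketch, assuming BSON.jl for the file, a Q-network-style `model`, and the usual RLBase-style env calls (`reset!`, `is_terminated`, `state`, `reward`, stepping via `env(action)`). The file name and the greedy action selection are just my assumptions:

```julia
using BSON, Flux

# inside a hook (or right after training): save the trained model
BSON.@save "model.bson" model

# later, in a fresh session: load it back
BSON.@load "model.bson" model

# run one episode with no training updates, only forward passes
function evaluate(model, env)
    reset!(env)
    total_reward = 0.0
    while !is_terminated(env)
        action = argmax(model(state(env)))  # assuming a Q-network: pick the greedy action
        env(action)                         # step the environment
        total_reward += reward(env)
    end
    return total_reward
end

@show evaluate(model, env)
```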
`[ Info: Precompiling GameZero [9da27670-f782-11e9-1da1-f53579315bfe] ERROR: LoadError: InitError: could not load library "C:\Users\Administrator\.julia\artifacts\fa03230282478d82c2295049d440075426a17202\bin\SDL2_image.dll"`
New Bug
I tried the code twice and it threw an error both times; it also sometimes gets stuck (when I'm not drawing objects, it stays stuck for a long time), and it's much slower...
```julia
using Flux
using MLDatasets: MNIST, CIFAR10, CIFAR100
using Flux: logitcrossentropy, setup, Adam, train!
using Flux.OneHotArrays: onehotbatch, onecold
using Statistics: mean
using Flux.MLUtils: DataLoader
using ProgressBars: tqdm, set_postfix
using Flux.Zygote: ...
```
I tried to implement an RNN model to classify the MNIST dataset, but I get an accuracy of around 40-50% even after running it for more than 20 epochs, while in PyTorch,...
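For reference, this is roughly how I structure the forward pass; a minimal sketch assuming the stateful Flux RNN API (`Flux.reset!` plus one call per timestep), where the layer sizes and the column-by-column reading of the image are just illustrative:

```julia
using Flux

# read each 28×28 image column by column: 28 timesteps of 28 features
rnn  = Flux.RNN(28 => 64)
head = Flux.Dense(64 => 10)

# x has shape (28, 28, batch) = (features, seq_len, batch)
function model(x)
    Flux.reset!(rnn)                 # clear the hidden state before every batch
    h = rnn(x[:, 1, :])
    for t in 2:size(x, 2)
        h = rnn(x[:, t, :])          # feed one timestep at a time
    end
    return head(h)                   # classify from the last hidden state
end
```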
### Motivation and description
Let's say we have an array of shape (embedding_size, seq_len, batch_size); our padding mask will have a shape of (seq_len, batch_size), which can't be used in...
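As a workaround today I reshape the mask by hand so it broadcasts over the embedding dimension; a minimal sketch (zeroing out the padded positions is just one example use, and the sizes are made up):

```julia
embedding_size, seq_len, batch_size = 8, 5, 2
x    = randn(Float32, embedding_size, seq_len, batch_size)
mask = rand(Bool, seq_len, batch_size)             # true = real token, false = padding

# insert a singleton dimension so the mask broadcasts over the embedding axis
mask3    = reshape(mask, 1, seq_len, batch_size)   # (1, seq_len, batch_size)
x_masked = x .* mask3                              # padded positions become zero
```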
MNIST classification using convolutional networks. Link: https://flax.readthedocs.io/en/latest/quick_start.html The same approach in PyTorch is significantly faster (almost 5 times faster); the Flax version takes 50 seconds per epoch!
I can't run this example on the JAX or PyTorch backend; it only works on the TensorFlow backend: https://keras.io/examples/nlp/neural_machine_translation_with_keras_nlp/ Also, inference is significantly slower than a similar implementation in PyTorch, like 8 times...
As Postgres supports arrays, you should add types like string[] or integer[]; using those types currently throws an error in the up migration.
It would be a good feature to support SOCKS proxies, as I think you currently support only HTTP/S proxies.