Transformers.jl
Julia version dependency?
I'm having a hard time getting everything working on Julia 1.5. Is there a Julia version dependency?
Or are you close to merging the gsoc2020 branch, which seems to update to the CUDA.jl package? It's kinda choking on CuArrays.
Hi,
Is there a Julia version dependency?
Looks like I forgot to add a Julia version compat entry in Project.toml, but I'm also working on Julia v1.5. It should work with Julia versions v1.3 and higher.
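For anyone following along, the compat entry is a one-line addition to Project.toml. A minimal sketch of what it could look like (illustrative, not the package's actual file):

```toml
[compat]
julia = "1.3"   # semver bound: allows v1.3.0 up to (but not including) v2.0.0
```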
Or are you close to merging the gsoc2020 branch
Definitely. GSoC 2020 ends this week, so it will be merged soon.
The problem is that https://github.com/chengchingwen/Transformers.jl/blob/master/Project.toml#L33-L37 bounds the CUDA packages and Adapt to < 2, which prevents any recent Flux version from being used with this package. I opened a PR adding CompatHelper so this library can get some maintenance help, but this should probably be brought into the same fold as the other ML core infrastructure so that this doesn't happen in the future (and so you can get some help!)
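For readers unfamiliar with CompatHelper.jl: it periodically checks a package's [compat] bounds against newly released dependency versions and opens PRs to bump them. The scheduled CI job essentially boils down to a two-line script; this is a sketch, not necessarily the exact workflow in the PR:

```julia
# Typically run from a scheduled CI job (e.g. a nightly GitHub Actions run):
using CompatHelper
CompatHelper.main()   # opens PRs bumping [compat] entries for new dependency releases
```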
The problem is that https://github.com/chengchingwen/Transformers.jl/blob/master/Project.toml#L33-L37 bounds the CUDA packages and Adapt to < 2, which prevents any recent Flux version from being used with this package.
The gsoc2020 branch is using the new CUDA.jl package and Adapt v2.0. I guess once the branch is merged, this problem will be fixed.
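For anyone hitting the CuArrays errors mentioned above, the user-facing change is mostly a package rename. A minimal before/after sketch, assuming a CUDA-capable setup (array contents are arbitrary):

```julia
# Old GPU stack (current master):
#   using CuArrays
#   x = cu(rand(Float32, 4, 4))   # CuArrays.CuArray

# New unified stack (gsoc2020 branch):
using CUDA
x = cu(rand(Float32, 4, 4))       # CUDA.CuArray
y = x * x                         # operations dispatch through CUDA.jl
```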
but this should probably be brought into the same fold as the other ML core infrastructure so that this doesn't happen in the future (and so you can get some help!)
I'm not sure I get what you mean here. Could you elaborate more on this?
Hi @bhalonen,
Could you try the latest version to see if everything works fine?
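A quick way to check once the release is tagged; a sketch using the standard Pkg API:

```julia
using Pkg
Pkg.update("Transformers")   # pull the latest registered version
Pkg.status("Transformers")   # confirm which version the resolver picked
using Transformers           # basic load test
```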
I'm guessing that Chris means making the repo part of some Julia ML organization on github. @ChrisRackauckas , is that what you had in mind?
(I'm just chiming in here because getting this package running smoothly would advance my own selfish interests :P)
... getting this package running smoothly ...
Any issues?
Sorry, I didn't mean to suggest I had issues with the package. I was just saying that it's in my own interest to have this package flourish, since I would like to be able to use Transformers in Julia.
Chris probably means making this part of the FluxML or JuliaML organization. Both have a lot of Julia core contributors as members. FluxML is probably more relevant, since this package builds on top of it, while JuliaML has more general-purpose packages.
@chengchingwen if you're interested, why don't you drop a message in the ml-ecosystem-coordination Zulip stream?