
Julia version dependency?

Open bhalonen opened this issue 4 years ago • 10 comments

I'm having a hard time getting everything working on julia 1.5. Is there a julia version dependency?

bhalonen avatar Aug 14 '20 15:08 bhalonen

Or are you close to merging the gsoc2020 branch, which seems to update to the CUDA.jl package? It's kind of choking on CuArrays.

bhalonen avatar Aug 14 '20 15:08 bhalonen

Hi,

Is there a julia version dependency?

Looks like I forgot to add a Julia version compat entry in Project.toml, but I'm also working on Julia v1.5. It should work with Julia versions v1.3 and higher.
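For reference, a missing Julia version bound is declared in the `[compat]` section of Project.toml. This is a sketch of the kind of entry meant here (the exact bound is an assumption based on the comment above, not the actual file contents):

```toml
[compat]
julia = "1.3"  # allows Julia v1.3 and any later v1.x release
```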

Or are you close to merging the gsoc2020 branch

Definitely. GSoC 2020 ends this week, so it will be merged soon.

chengchingwen avatar Aug 18 '20 23:08 chengchingwen

The problem is that https://github.com/chengchingwen/Transformers.jl/blob/master/Project.toml#L33-L37 bounds the CUDA packages and pins Adapt < 2, which prevents any recent Flux version from being used with this package. I added a PR for CompatHelper so this library can get some maintenance help, but this should probably be brought into the same fold as the other ML core infrastructure so that this doesn't happen in the future (and so you can get some help!)
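As an illustration only (not the actual contents of the linked Project.toml lines), tight compat bounds of the kind described would look roughly like this, and relaxing them is what lets the resolver pick a recent Flux:

```toml
# Hypothetical sketch: pinning the old GPU stack and Adapt < 2
# forces the resolver to hold Flux back to an older release.
[compat]
Adapt = "1"      # excludes Adapt v2, which newer Flux requires
CuArrays = "2"   # old GPU package, superseded by CUDA.jl

# After migrating to CUDA.jl, the bounds can be relaxed, e.g.:
# Adapt = "2"
# CUDA = "1"
```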

ChrisRackauckas avatar Aug 23 '20 23:08 ChrisRackauckas

The problem is that https://github.com/chengchingwen/Transformers.jl/blob/master/Project.toml#L33-L37 bounds the CUDA packages and pins Adapt < 2, which prevents any recent Flux version from being used with this package.

The gsoc2020 branch is using the new CUDA.jl package and Adapt v2.0. I guess once the branch is merged, this problem will be fixed.
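As a rough sketch of what the CuArrays → CUDA.jl migration looks like from the user side (package names only, not the branch's actual diff; assumes a machine with a working CUDA GPU):

```julia
# Old GPU stack (pre-gsoc2020 branch):
# using CuArrays
# x = CuArrays.cu(rand(Float32, 4, 4))

# New stack (CUDA.jl + Adapt v2):
using CUDA

if CUDA.functional()              # only touch the GPU when one is available
    x = cu(rand(Float32, 4, 4))   # cu moves the array to GPU memory
    y = x * x                     # matrix multiply runs on the device
end
```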

but this should probably be brought into the same fold as the other ML core infrastructure so that this doesn't happen in the future (and so you can get some help!)

I'm not sure I get what you mean here. Could you elaborate more on this?

chengchingwen avatar Aug 24 '20 04:08 chengchingwen

Hi @bhalonen,

Could you try the latest version and see if everything works fine?

chengchingwen avatar Aug 31 '20 07:08 chengchingwen

The problem is that https://github.com/chengchingwen/Transformers.jl/blob/master/Project.toml#L33-L37 bounds the CUDA packages and pins Adapt < 2, which prevents any recent Flux version from being used with this package.

The gsoc2020 branch is using the new CUDA.jl package and Adapt v2.0. I guess once the branch is merged, this problem will be fixed.

but this should probably be brought into the same fold as the other ML core infrastructure so that this doesn't happen in the future (and so you can get some help!)

I'm not sure I get what you mean here. Could you elaborate more on this?

I'm guessing that Chris means making the repo part of some Julia ML organization on github. @ChrisRackauckas , is that what you had in mind?

(I'm just chiming in here because getting this package running smoothly would advance my own selfish interests :P)

ym-han avatar Mar 26 '21 18:03 ym-han

... getting this package running smoothly ...

Any issues?

chengchingwen avatar Mar 26 '21 19:03 chengchingwen

... getting this package running smoothly ...

Any issues?

Sorry, I didn't mean to suggest I had issues with the package. I was just saying that it's in my own interest to have this package flourish, since I would like to be able to use Transformers in Julia.

ym-han avatar Mar 26 '21 21:03 ym-han

Chris probably means making this part of the FluxML or JuliaML organization. Both have a lot of Julia core contributors as members. FluxML is probably more relevant, since you build on top of it, while JuliaML has more general-purpose packages.

Oblynx avatar Jul 19 '21 15:07 Oblynx

@chengchingwen if you're interested, why don't you drop a message in the ml-ecosystem-coordination zulip stream?

Oblynx avatar Jul 19 '21 15:07 Oblynx