
Is there a plan to support Windows?

Open achalpandeyy opened this issue 2 years ago • 66 comments

I have noticed that the README states Linux as the only compatible platform. https://github.com/openai/triton#compatibility

Some people in the past have managed to compile on Windows https://github.com/openai/triton/issues/871 (there is even a really old PR for Windows support https://github.com/openai/triton/pull/24). But going by the README, I suppose something changed and Triton doesn't support Windows anymore? I haven't tried to compile it myself yet.

I'm interested in the development of this repository but my main OS is Windows. I'm aware that I can probably use WSL2 but still I would prefer to run it on Windows natively. So my question is: is there a plan to officially support Windows? If so, I can help.

achalpandeyy avatar May 09 '23 04:05 achalpandeyy

I really want this as well! It would be so useful for so many things!

OPPEYRADY avatar May 09 '23 22:05 OPPEYRADY

I would also be really grateful if it happens.

Zodiac505 avatar May 10 '23 05:05 Zodiac505

This is a pretty frequent request. Let me see what we can do about it.

ptillet avatar May 11 '23 06:05 ptillet

+1

liuyunrui123 avatar May 11 '23 08:05 liuyunrui123

With torch.compile relying heavily on Triton, it seems lots of Hugging Face users are also interested in this :-)

patrickvonplaten avatar May 16 '23 12:05 patrickvonplaten

We have a number of interested parties optimizing inference times for Invoke AI on Windows. We're currently evaluating alternatives, but as @patrickvonplaten noted above, torch.compile is the most straightforward option, and it requires Triton.

hipsterusername avatar May 16 '23 13:05 hipsterusername

+1

I get "RuntimeError: Windows not yet supported for torch.compile" with CUDA 12.1 and PyTorch 2.1.0. It seems Triton is the main reason, since it is not available on Windows. How can we get a Windows version?
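Until a native Windows build exists, one common workaround is to guard the call site and fall back to eager mode. A minimal sketch (the helper names here are hypothetical, not part of Triton or PyTorch):

```python
import importlib.util
import platform


def triton_available() -> bool:
    """Return True if a Triton package is importable in this environment."""
    # Triton ships Linux wheels only, so on native Windows this is
    # expected to be False unless a third-party build is installed.
    return importlib.util.find_spec("triton") is not None


def can_try_torch_compile() -> bool:
    """Heuristic guard: torch.compile's default Inductor backend needs
    Triton for GPU kernels and raises on native Windows."""
    return platform.system() != "Windows" and triton_available()


print(can_try_torch_compile())
```

With such a guard, code can call `torch.compile(model)` when the check passes and simply use the uncompiled model otherwise (e.g. under WSL2 vs. native Windows).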

Li-Yanzhi avatar May 28 '23 10:05 Li-Yanzhi

+1

countzero avatar Jun 03 '23 09:06 countzero

+1, many Python packages support Windows, and I hope this one will as well.

Pythonpa avatar Jun 04 '23 03:06 Pythonpa

+1

domef avatar Jun 06 '23 08:06 domef

+1

speedystream avatar Jun 15 '23 05:06 speedystream

+1

jyizheng avatar Jun 24 '23 00:06 jyizheng

+1

Bigfield77 avatar Jun 25 '23 05:06 Bigfield77

@ptillet is there anything we could do to help you implement this? With PyTorch 2.x becoming more and more dependent on Triton, this feature request will only become more important, I think.

Can we help you here in any way?

patrickvonplaten avatar Jul 04 '23 16:07 patrickvonplaten

Please add support for Windows.

I hate seeing the "Triton is not available on Windows" message.

FurkanGozukara avatar Jul 04 '23 16:07 FurkanGozukara

The way to help here is probably to just submit a PR that adds Windows support :) though we won't have CI for it any time soon.

ptillet avatar Jul 04 '23 19:07 ptillet

The issues / solutions found so far (also somewhat related to #1560):

  • Fixing the URL issue `ValueError: unknown url type: ''`: it seems LLVM_SYSPATH is not picked up from the system environment. I added it, but it still didn't work properly for me. The workaround was to set the variable manually in setup.py: `os.environ['LLVM_SYSPATH'] = 'path/to/llvm_build'`

  • Another issue was with the target / build type. I couldn't get the MSYS / Ninja generator working, so I'm just using my default, Visual Studio 17 2022. I had to force the get_build_type function to return RelWithDebInfo.

  • The next issue I got was that MLIRGPUOps (and the other two files in Conversion) doesn't exist in the build. Since I'm using LLVM 17 built from master (version 17 is also used on Linux), it seems it was renamed to MLIRGPUDialect.

  • Another issue: I couldn't build with VS + clang (I got an error with a -f flag), so I had to stay with MSVC. I then got an error about the /Werror value being set incorrectly, and had to change the configuration to just set(CMAKE_CXX_FLAGS "/std:c++17").

  • Currently stuck because `'C:\Users\potato\Desktop\llvm-project\build\RelWithDebInfo\bin\mlir-tblgen.exe' is not recognized as an internal or external command, operable program or batch file.` It seems there is some issue with it not being built.
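For reference, the first two workarounds above might look roughly like this inside setup.py. This is only a sketch of the local patches described: the path is a placeholder, and the real get_build_type in Triton's setup.py inspects build flags rather than returning a constant.

```python
import os

# Placeholder path, not a real location; point this at your local LLVM build.
LLVM_BUILD_DIR = r"path/to/llvm_build"

# Workaround 1: setup.py did not pick up LLVM_SYSPATH from the system
# environment on this machine, so pin it programmatically before the
# download/build logic runs.
os.environ["LLVM_SYSPATH"] = LLVM_BUILD_DIR


# Workaround 2: instead of auto-detecting the CMake build type, hard-code
# it for the "Visual Studio 17 2022" generator, mirroring the forced
# get_build_type change described above.
def get_build_type() -> str:
    return "RelWithDebInfo"


print(os.environ["LLVM_SYSPATH"], get_build_type())
```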

bartekleon avatar Aug 28 '23 11:08 bartekleon

+1

gilberto-BE avatar Sep 12 '23 12:09 gilberto-BE

+1

DarkAlchy avatar Sep 13 '23 17:09 DarkAlchy

Is there any fork for this?

There is this repo, but I don't know if it works: https://github.com/PrashantSaikia/Triton-for-Windows

FurkanGozukara avatar Sep 19 '23 19:09 FurkanGozukara

+1

skirdey avatar Oct 01 '23 06:10 skirdey

+1

Pevernow avatar Oct 01 '23 14:10 Pevernow

+1

ezra-ch avatar Oct 02 '23 18:10 ezra-ch

+1

FurkanGozukara avatar Oct 02 '23 20:10 FurkanGozukara

+1

mush42 avatar Oct 05 '23 21:10 mush42

+1

DheerajMadda avatar Oct 06 '23 06:10 DheerajMadda

I'm also trying to get llvm-17.0.0-c5dede880d17 compiled for Windows with GitHub Actions here: https://github.com/andreigh/triton-llvm-windows

andreigh avatar Oct 07 '23 12:10 andreigh

I'm also trying to get llvm-17.0.0-c5dede880d17 compiled for Windows with GitHub Actions here: https://github.com/andreigh/triton-llvm-windows

You don't have a release yet; will you make one? I would like to install and test it.

If I merge your pull request locally, how can I install it on Windows? What command?

Assume I cloned the repo and merged your pull request; then what?

FurkanGozukara avatar Oct 07 '23 20:10 FurkanGozukara

* Currently stuck because `'C:\Users\potato\Desktop\llvm-project\build\RelWithDebInfo\bin\mlir-tblgen.exe' is not recognized as an internal or external command, operable program or batch file.` It seems there is some issue with it [not being built](https://github.com/llvm/llvm-project/issues/64150)

This seems to be fixed in b1115f8c? I can build it without problems. Now I can build Triton, but not any backend. There is some gcc-only code that I have no idea how to modify for MSVC.

you74674 avatar Oct 31 '23 13:10 you74674

+1

CHEROAD avatar Dec 12 '23 08:12 CHEROAD