MinGW: CLBlast "variable length arrays are not supported", Unicode issues in common.c

Open asctime opened this issue 1 year ago • 7 comments

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • [x] I carefully followed the README.md.
  • [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.

Sorry, I am not sure how much of what I have to ask fits into your template. First, thank you for this project; many people are extremely excited about it, and I think several from the MinGW community are looking at this (MinGW is just the gcc/clang compiler on Windows).

So, to my issue. I compiled llama.cpp #5ea4339 on Windows (MinGW64 gcc) against OpenCL headers 20200327. clinfo reports:

Number of platforms   2
Platform Name         NVIDIA CUDA
Platform Vendor       NVIDIA Corporation
Platform Version      OpenCL 3.0 CUDA 11.6.134
...

I then get a crash in main.exe:

Initializing CLBlast (First Run)...
Attempting to use: Platform=0, Device=0 (If invalid, program will crash)
Using Platform: NVIDIA CUDA Device: NVIDIA GeForce GTX 1650 Ti
:1:133: error: variable length arrays are not supported in OpenCL

I was hoping an experienced user could help me understand what "variable length arrays are not supported in OpenCL" implies for a MinGW build of llama.cpp. Is it something that could theoretically be patched around?
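For reference, my understanding is that the message refers to a construct like the one below inside a kernel (an illustrative example only, not the actual ggml OpenCL code):

```c
// Illustrative OpenCL C kernel, not taken from llama.cpp.
// OpenCL C has no variable length arrays, so an array whose size is only
// known at run time makes the driver's kernel compiler emit this error.
__kernel void copy_scaled(__global const float *src,
                          __global float *dst,
                          const int n) {
    float tmp[n];                 // VLA: rejected by the OpenCL compiler
    for (int i = 0; i < n; ++i) {
        tmp[i] = src[i];
    }
    for (int i = 0; i < n; ++i) {
        dst[i] = 2.0f * tmp[i];
    }
}
```

The usual workaround is a fixed, compile-time array size, for example injected with a -D define in the build options passed to clBuildProgram.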

Also, while I have anyone's ear, I thought it was worth mentioning that I am working on a MinGW patch. It is mainly focused on the project's existing, conflicting use of ASCII and Unicode (wchar) on Windows in the examples. I am sure it all compiles on MSVCRT with Unicode and ASCII mixed, but gcc has a conniption. Basically, to support Unicode wide characters in MinGW you need to compile with -DUNICODE, link with -municode, and use an entry point named wmain instead of main. Everything has to be completely consistent (a rough sketch is below). I understand that most devs don't anticipate gcc on Windows, but it's actually a thing now and growing. I am still testing my patches; I just thought I should point it out.
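To make that concrete, here is a minimal sketch of what a consistent MinGW Unicode build looks like (my assumption of the setup, not an excerpt from the patch):

```c
// Build with something like: gcc -DUNICODE -D_UNICODE -municode example.c
#include <stdio.h>
#include <wchar.h>

// -municode makes the CRT startup call wmain instead of main,
// so the arguments arrive as wide (UTF-16) strings.
int wmain(int argc, wchar_t *argv[]) {
    for (int i = 0; i < argc; ++i) {
        wprintf(L"arg %d: %ls\n", i, argv[i]);
    }
    return 0;
}
```

Mixing a plain main with wide-character Win32 calls (or the other way around) is exactly where gcc starts to complain.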

asctime avatar May 24 '23 01:05 asctime

Try a newer commit; the OpenCL code has been almost completely rewritten by now.

Basically, to support Unicode wide characters in MinGW you need to compile with -DUNICODE, link with -municode, and use an entry point named wmain instead of main.

Why is it necessary to use Unicode wide characters? The LLaMA tokenizer itself works only with UTF-8.

SlyEcho avatar May 24 '23 08:05 SlyEcho

Thanks I will have a go with the new code.

Re: UTF-8, OK, but there seem to be issues there. I'm still looking at it, but my experience is that anything beyond the native code pages really needs wmain to render correctly in MinGW. It wasn't an issue in alpaca.cpp as long as I used the Windows 10 console, which only had to interpret the color codes and terminal commands. Which reminds me: common.c actually excludes legacy console support? A dumb-terminal fallback for what is essentially a simple chat might be more what I am after, but that is a different topic (a rough sketch of the idea is below).
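To illustrate the kind of fallback I mean (a sketch based on my own assumptions, not what common.c currently does):

```c
#include <windows.h>
#include <stdio.h>

int main(void) {
    // UTF-8 output so multi-byte text renders instead of mojibake.
    SetConsoleOutputCP(CP_UTF8);

    // Try to enable VT escape sequences for colors; on a legacy console
    // this fails, and a plain "dumb terminal" mode is the graceful fallback.
    HANDLE out  = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD  mode = 0;
    if (GetConsoleMode(out, &mode) &&
        SetConsoleMode(out, mode | ENABLE_VIRTUAL_TERMINAL_PROCESSING)) {
        printf("\x1b[32mcolor output enabled\x1b[0m\n");
    } else {
        printf("color output disabled, plain text only\n");
    }
    return 0;
}
```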

asctime avatar May 24 '23 12:05 asctime

I don't know how well this works, but Unicode support may be needed for reading non-ANSI file names?
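Something along these lines is what I have in mind (a rough sketch, not existing llama.cpp code): convert the UTF-8 path to UTF-16 and open it with the wide CRT function, so file names outside the ANSI code page still work.

```c
#include <windows.h>
#include <stdio.h>

// Hypothetical helper: open a file whose name is given as UTF-8.
static FILE * fopen_utf8(const char * path_utf8, const wchar_t * mode) {
    wchar_t wpath[MAX_PATH];
    int n = MultiByteToWideChar(CP_UTF8, 0, path_utf8, -1, wpath, MAX_PATH);
    if (n == 0) {
        return NULL; // conversion failed or path too long
    }
    return _wfopen(wpath, mode);
}

int main(void) {
    FILE * f = fopen_utf8("model-\xC3\xA9.bin", L"rb"); // non-ANSI file name
    if (f) {
        fclose(f);
    }
    return 0;
}
```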

SlyEcho avatar May 24 '23 12:05 SlyEcho

I had a couple more hours free this week, so I tested the latest commit as opposed to the latest release, as suggested. I can report from my experience that the OpenCL support now seems to work correctly, at least in the basic sense appropriate to this issue. I realize that the console issue is completely separate, and I will re-open it once I have a reasonable patch to propose. Thank you. Good to close from my side.

asctime avatar May 31 '23 20:05 asctime

OpenCL is going to have more updates, you can test them as well: https://github.com/ggerganov/llama.cpp/pull/1653

SlyEcho avatar May 31 '23 20:05 SlyEcho

Great!! Thank you, I will do my best.

asctime avatar May 31 '23 22:05 asctime

Sorry, I had another project to finish. https://github.com/ggerganov/llama.cpp/pull/1897/commits/5dc35d3b59b8d76a7438036166c786a1b1675b3f

I'll get to the rest now.

Update: @SlyEcho I tested today's code and, broadly speaking, all is still well with CLBlast under MinGW64 once https://github.com/ggerganov/llama.cpp/pull/1897 is applied.

I am happy to keep testing but all good for now.

asctime avatar Jun 16 '23 16:06 asctime

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar Apr 09 '24 01:04 github-actions[bot]