
install (build) fails on linux & mac

konsumer opened this issue 2 years ago • 21 comments

I am on Pop!OS 20.04 (very similar to Ubuntu 20.04).

I have the latest raylib installed, and pkg-config --cflags --libs raylib returns this (correctly):

-I/usr/local/include -L/usr/local/lib -lraylib

I can build C raylib programs fine using cmake or make.

I am using node v16.0.0.

It looks like it may be a version mismatch or something, as there are no errors about missing headers, just a ton of undefined symbols.

I notice you have a raylib @ e25e380 submodule, which is on the 3.5 branch, while the current release is 4.0, so it could just be that it needs to be updated.

Here is the log when I run npm i raylib 2> log.txt

The prebuilt linux release works fine (screenshot from 2021-11-19 attached).

But I will need to build this for pi and mac (my dev laptop is a mac), and I had similar issues there.

konsumer avatar Nov 20 '21 01:11 konsumer

Here is a similar issue on mac (11.6), with node v16.8.0.

konsumer avatar Nov 20 '21 12:11 konsumer

Thinking more about this problem, I realize that it might help to just isolate the issue and get it building for my desktop platform as an initial step, in a docker container. I could wrap the build process and see if I can produce a build for my computer (which does work with the pre-built release, currently), and then maybe expand that to cross-build for pi, and two-thirds of my use case would be covered. The last part would be figuring out how to cross-build for mac in docker, which might be easier to troubleshoot once I have a working build for the other two. It would be ideal if I could just npm install it and have it work on all three, but I'm not quite sure what exactly is breaking, now.

konsumer avatar Nov 20 '21 23:11 konsumer

Been considering replacing this approach entirely with Emscripten and wasm. You'll see the work done in the emscripten branch.

RobLoach avatar Nov 21 '21 00:11 RobLoach

Would that still run natively? Part of what I like about this lib-wrapper is that I can run js on a pizero natively, without a browser or X, and it runs ok. I like that I could eventually use it in a web-view with wasm, too (for non-pizeros), but my primary focus, personally, is getting it to run well on the pizero, the way raylib does in C.

konsumer avatar Nov 21 '21 02:11 konsumer

Also, as a side note, it does appear that the main issue is that it was trying to build against 4.0, but the bindings were made for 3.5. I will try to slowly work through it and update, but I'm not super-fast at C, so it might take a bit. In the meantime, I can offer my mac to build a mac release for 3.5, and I can also build for windows (in a virtual machine) if you like.

konsumer avatar Nov 21 '21 02:11 konsumer

If we find a good package that will compile and package for raspberry pi, yeah. I tried handling the native build with an Electron alternative. I won't overwrite it until there's a proven solution.

RobLoach avatar Nov 21 '21 02:11 RobLoach

The npm install seems to fail generally when you have another version of raylib installed in your system includes somewhere? I'm not really a C/C++ user, but my understanding is that it has to do with the CMakeLists, which looks for an existing install before trying to pull in and use the vendored 3.5 copy? On Arch I couldn't have raylib installed via the system package manager if I wanted to install the npm package.

On my own I've gotten raylib 4.0 running with node-addon-api, and for that I'm just using a prebuilt libraylib.a that I point to in the gyp file. I don't know personally if these .a files are platform/OS-specific (if so, I believe you could conditionally load different libraries based on platform in the gyp file), but I see that as a better alternative to having to build raylib yourself whenever you install the npm package, assuming the raylib license permits it.

I checked out the emscripten branch - it seems like using raylib from wasm assumes you are in a browser context, so I couldn't create a window running it from node. I think from wasm it won't interface with GLFW properly outside the browser? For me personally, the purpose of using node-raylib is to have a JS graphics API that wouldn't require bundling the application with Electron to release. It's probably possible to write the package in a way that detects whether or not it's a node runtime and exports either the wasm or the node-addon, or to just separate into two packages and leave that detail to the users.
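
A rough sketch of what that detection could look like, just to illustrate the idea (the file paths are placeholders, not anything that exists in either branch):

// hypothetical package entry point: use the native addon under node,
// fall back to the wasm build everywhere else. both require() paths
// below are placeholders.
const isNode = typeof process !== 'undefined'
  && process.versions != null
  && process.versions.node != null

module.exports = isNode
  ? require('./build/Release/node-raylib.node') // native NAPI addon
  : require('./dist/raylib-wasm.js')            // emscripten output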

twuky avatar Nov 21 '21 20:11 twuky

I see that as a better alternative to having to build raylib yourself whenever you install the npm package, assuming the raylib license permits it.

Yeh, a lot of node native libs (serial, for example) have a bunch of prebuilt DLLs for every target/platform that they try to use first. It seems like it could be a path: set up CI to build a bunch of pre-built native libs, and on failure, download the original raylib 3.5 and build it. I actually looked into getting CI to do this, and it seems doable, but I didn't quite get it working. I think it would have nice side-effects: if the node-user has a supported platform/os, it would install much faster, and not require cmake/raylib/etc to be installed. It seems like when people set this up, they always forget (or can't support) M1 macs and pis (32 & 64 bit), so we'd still have to get it actually building to support anything that can't be generated in CI. Maybe the install script could download raylib 3.5 if needed (instead of using the submodule), and it might be ok for most people, even if they do have to build.
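
To illustrate, the install script could look roughly like this (only cmake-js is a real tool here; scripts/download-prebuilt.js and the output path are placeholders for whatever we'd actually write):

// install.js (sketch): try to fetch a prebuilt addon for this platform/arch,
// otherwise build from source with cmake-js.
const { execSync } = require('child_process')
const { existsSync } = require('fs')

const run = cmd => execSync(cmd, { stdio: 'inherit' })

try {
  run('node scripts/download-prebuilt.js') // placeholder fetch logic
} catch (err) {
  console.warn(`no prebuilt binary for ${process.platform}/${process.arch}, building from source`)
  run('npx cmake-js compile')
}

if (!existsSync('build/Release/node-raylib.node')) { // assumed output path
  throw new Error('node-raylib native addon was not produced')
}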

The npm install seems to fail generally when you have another version of raylib installed in your system includes somewhere?

Yep, that seems to be my general problem on both OSes. I have 4.0.0 installed system-wide on mac & linux, and it works in C (and in other wrappers that use 4.0). I found I could disable the check for 3.5.0, which would force it to use the vendored copy, and that actually got it building on mac & linux. So really, I think the issue could be described in a couple of ways, depending on the path forward: "use vendored 3.5.0 raylib" or "update this lib to use 4.0".

The first one can be resolved by turning this:

find_package(raylib 3.5.0)
if (NOT raylib_FOUND)
  set(BUILD_EXAMPLES OFF CACHE BOOL "" FORCE) # don't build the supplied examples
  set(BUILD_GAMES    OFF CACHE BOOL "" FORCE) # or games
  add_subdirectory(vendor/raylib)
endif()

into this:

set(BUILD_EXAMPLES OFF CACHE BOOL "" FORCE) # don't build the supplied examples
set(BUILD_GAMES    OFF CACHE BOOL "" FORCE) # or games
add_subdirectory(vendor/raylib)

which will get it building on linux & mac, in a recursively-cloned copy of this repo. It seems like cmake's version detection isn't working, but that change fixes (well, ignores) the problem.

I'm not really a C/C++ user

Same here! I can get through it generally, if I have to, and I have a couple of projects written in it, but it's definitely not my strongest language, which leads me to problems with migrating, I think. I don't see an easy way to migrate, as there are very big changes. I spent this whole weekend working on it and made basically no progress.

Initially, I looked at quickjs (and its raylib wrapper), thinking it might do the trick and be a bit easier to work with because it's so simple & made for embedding, but for my target (pizero) I think the up-front size cost of compiled quickjs code vs the performance boost from nodejs (benchmarks are often like 5-10X) might not be a worthwhile trade, plus I'd have to implement all my own libs (http, etc) in quickjs, which means more C things, which all together made me want to explore this lib more. One thing I discovered in that process is this, though. It's an auto-generated JSON (and other formats) file that describes the entire interface, for making automatic wrappers. I honestly feel like I might be more productive using that to auto-generate a C node-wrapper in javascript than trying to pull the API-migration out, 1-by-1 in C, from errors.
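
Just to sketch the idea (not working code - the exact JSON field names would need to be checked against the parser output; I'm assuming it exposes each function's name, return type, and params):

// generate.js (sketch): read raylib's parser output and emit one stub per
// function. the emitted C++ here is only a placeholder body.
const { readFileSync, writeFileSync } = require('fs')

const api = JSON.parse(readFileSync('raylib_api.json', 'utf8'))

const stubs = api.functions.map(fn => {
  const args = (fn.params || [])
    .map((p, i) => `  // arg ${i}: ${p.type} ${p.name}`)
    .join('\n')
  return [
    `// ${fn.returnType} ${fn.name}(...)`,
    `Napi::Value Bind${fn.name}(const Napi::CallbackInfo& info) {`,
    args,
    `  // TODO: convert args, call ${fn.name}, wrap the result`,
    `}`,
    ''
  ].join('\n')
})

writeFileSync('bindings-generated.cpp', stubs.join('\n'))
console.log(`generated ${stubs.length} wrapper stubs`)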

If this seems like a good direction, I can work on this next, and submit a PR.

I also made a repo that is just a bunch of implementations of roughly the same scene, in a bunch of languages/libs, just to be able to compare them in terms of pizero performance/ease of use/fun to code in/etc. It has a couple of raylib things in it, so I kind of use it as my litmus test for a working setup. It might be helpful to you, not sure.

So to summarize, in terms of actionable items, I think these are the paths forward:

  • Update to 4.0 manually. As I said, this path may be beyond me, or just take a really long time for me to figure out, but it seems possible, maybe with help from someone more familiar with the raylib migration & C in general.
  • Be ok with 3.5.0 and use it for building. Either require it up-front (be clear about the 3.5 dep in the docs, etc), or include the raylib source in the node module, or download it in an install script.
  • Auto-generate the bindings in JS, using node, for 4.0 (or maybe even both versions), and use that with the system-installed version.

For all 3, maybe set up CI to handle the 80% case (the 3 major OSes on 64-bit intel chips), and download the correct raylib version & build for the other 20%.

I really like the last solution, but it might take a bit of work up-front. I imagine that if we have mostly auto-generated bindings and can bind to both versions (3.5 and 4), pre-building for a bunch of arches/platforms would be a super-slick setup, and for most people it would install incredibly fast. I also think it's ok to just say "it has 3.5 embedded in it" and fully commit to that (instead of trying to use system raylib).

konsumer avatar Nov 21 '21 22:11 konsumer

I honestly feel like I might be more productive using that to auto-generate a C node-wrapper in javascript than trying to pull the API-migration out, 1-by-1 in C, from errors.

I've actually done this! It's how my current test of a 4.0 binding works - I wrote a script that reads from the JSON and builds a cpp file with the relevant function/struct bindings. I haven't made a repo for it yet though; it needs a little cleanup, and I wanted to be clear about whether it was alright to just bundle the prebuilt lib in the repo. I got a lot of pointers from checking out how values are converted in this repo, and from a few of the wrapped functions, like the ones for updating shader uniforms.

Because I'm generating the cpp rather than using adapter functions like this library, all my wrapper functions just assume argument types rather than using generics. I think this seems to improve the performance of the bindings too? I don't know what the performance difference between 3.5 and 4.0 is in pure C, but with my JS bindings a simple bunnymark can render around twice as many textures per frame compared to node-raylib 3.5.
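
For reference, the "assume the argument types" idea boils down to a lookup table along these lines in the generator (an illustrative sketch, not the actual code):

// sketch of a direct NAPI conversion table keyed by raylib type name.
// each entry returns the C++ expression to paste into the generated wrapper,
// so there is no generic adapter layer left at runtime.
const toNative = {
  'int':          arg => `${arg}.As<Napi::Number>().Int32Value()`,
  'float':        arg => `${arg}.As<Napi::Number>().FloatValue()`,
  'bool':         arg => `${arg}.As<Napi::Boolean>().Value()`,
  'const char *': arg => `${arg}.As<Napi::String>().Utf8Value()`, // still needs .c_str() at the call site
}

console.log(toNative['float']('info[2]'))
// -> info[2].As<Napi::Number>().FloatValue()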

There are a few things I'm not entirely sure work - and I had to disable a few of the functions that deal with arrays. And not all of the raylib API is covered by the file, so it may take some extra work to get any of the other libraries like rlgl, raymath, gui, etc. integrated.

You can also check #96 where I do the same thing to generate TS definitions for it too. I've updated it a bit since that issue for my 4.0 binding test though.

Edit: this being said - unless CI could be integrated to create/manage the .node addon files, installing from npm would still need to compile the bindings themselves (even if raylib is precompiled). Whatever my gypfile is configured with works pretty cleanly on linux, but it isn't compiling on windows (and I don't have a mac to test with).

twuky avatar Nov 22 '21 01:11 twuky

@twuky very cool! this seems like a good direction to me.

There are a few things I'm not entirely sure work - and I had to disable a few of the functions that deal with arrays. And not all of the raylib API is covered by the file, so it may take some extra work to get any of the other libraries like rlgl, raymath, gui, etc. integrated.

Do you have a running list of missing/incomplete stuff? Unit tests might be helpful for unclear areas, if nothing else, just to make sure it's all working. (edit: ah yes, I see this, and that seems doable to fix. I would love to help.)

A few notes:

  • This seems right. I have seen a few places where char was used as an int param, and char* is a string.
  • Does this work now (other than the missing stuff)? If so, maybe we should work on forking it back into node-raylib. I started doing this manually, but as I said it got a bit overwhelming, and I'm not really happy with the results. If @RobLoach is cool with moving to a generated setup, it could be a nice direction for future raylib support.

unless CI could be integrated to create/manage the .node addon files, installing from npm would still need to compile the bindings themselves (even if raylib is precompiled).

Agreed. Even with CI all set up, M1 & pi (32 & 64 bit) builds will not be possible without some cross-compiling, so in general it needs to be able to compile raylib & the bindings on npm install if there isn't a prebuilt for the arch/platform. I propose we try to set up 4.0 bindings and precompile in CI for everything we can, and then make a little preinstall script that downloads raylib (precompiled or source) and builds it if needed.

Here is the start of similar work for quickjs. As I said, I like node better for my application, but it may have ideas we can use or whatever.

konsumer avatar Nov 22 '21 03:11 konsumer

For reference in the conversation I've invited you both to a repo with my test of 4.0 bindings.

I think that most of the unimplemented functions (nearly all the autogenerated functions compile, but I haven't actually tested very many) have to do with arrays or other pointer-based operations. It seems like about 90% of the library can be covered just by converting types back and forth based on name. But since raylib only uses pointer arrays, you have to manually calculate the array length, so you can't just convert 1-for-1 with string substitution like in my bindings. So I think most of the functions that have anything to do with pointers probably need to be hardcoded, like the wrapper functions. Ideally a single source of truth can be maintained in the code, so that when things are generated we aren't duplicating functions (or creating typescript defs for functions that aren't bound, etc.).
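
To make the array problem concrete, this is the kind of case a name-based generator can't handle on its own (LoadImageColors is a real raylib function returning a bare Color* with no length; raw here is a placeholder for the low-level addon):

// raylib: Color *LoadImageColors(Image image);
// the C side returns a bare pointer, so the element count has to be derived
// from the Image itself - a generic name-based wrapper can't know that.
function loadImageColors (raw, image) {
  const colorsPtr = raw.LoadImageColors(image) // pointer, no length attached
  const count = image.width * image.height     // length computed by hand
  return { colorsPtr, count }
}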

A lot of things in the 3.5 bindings that use pointers are converted to and from int64_t and I have no clue how that works. Images, for example, have the data field, which when sent to JS just looks like a number, but it still seemed to work when I passed one into LoadTextureFromImage.
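
For example, something like this (assuming the current API surface - GenImageColor and LoadTextureFromImage are plain raylib functions, and the exact JS type of the data field may vary):

// sketch: Image.data comes back to JS as an opaque handle, but it
// round-trips back into C fine when the Image is passed along.
const r = require('raylib')

r.InitWindow(100, 100, 'pointer round-trip') // textures need a GL context
const img = r.GenImageColor(64, 64, r.RED)
console.log(typeof img.data, img.data)       // opaque handle, not usable from JS
const tex = r.LoadTextureFromImage(img)
console.log('texture id:', tex.id)
r.UnloadTexture(tex)
r.CloseWindow()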

Testing is definitely something I'm interested in setting up, probably after I know it can properly build on win/mac too. It may be hard to test specific functions in isolation. A lot of the raylib examples honestly would serve decently as "end to end" tests, if you were to automate some of the user input? Or tests could be set up based on the separation of categories on the cheatsheet. I haven't actually worked with JS testing, so it would be cool to set up.

Does this work now (other than the missing stuff)?

Right now it works on my linux machine (arch-based) - you should just need to run npm run test once to build the node addon, and then there are two other test files that you can run with node test/x.js. I very quickly tried cloning and running npm install on windows - it seems like it's missing links to std or something. It may have something to do with the visual studio C libraries? It is probably worth looking into using cmake-js like the current version instead of node-gyp.

twuky avatar Nov 22 '21 04:11 twuky

For reference in the conversation I've invited you both to a repo with my test of 4.0 bindings.

Yep, I accepted, thanks!

So I think most of the functions that have anything to do with pointers probably need to be hardcoded, like the wrapper functions.

That seems workable, as you said, with a list of edge-cases. I wonder if it can't be even more automated. Maybe they would even take a PR to add a little more info to the JSON, like "this takes an array param", or we could just maintain our own list.

A lot of things in the 3.5 bindings that use pointers are converted to and from int64_t and I have no clue how that works.

I think these come through as BigInt, but that may also be too wide a type for practical use; maybe number/int32 would be fine.

Testing is definitely something I'm interested in setting up, probably after I know it can properly build on win/mac too.

Agreed, and it will also help validate the build on other platforms (it's easier to spin up a VM and run a battery of tests). I'm happy to provide troubleshooting on an ubuntu-based distro, test on an intel mac, and do some windows testing if you don't have a windows machine, so we can see how it's all going.

A lot of the raylib examples honestly would serve decently as "end to end" tests, if you were to automate some of the user input? Or tests could be set up based on the separation of categories on the cheatsheet. I haven't actually worked with JS testing so it would be cool to set up.

Yeh, it could be tricky to test a lot of it. I think some tests can be as simple as making a new class (like from one of the structs), then testing its members for the right values. We'll still probably need some function-tests, though. Generally, I try to avoid native requirements (like graphics) so I can run tests in headless or limited environments (like docker), but really I think just having examples would be fine as a first step; if we could say "looks like that stuff works", it would definitely be better than no tests at all. Examples are also nice for end-users to get a feel for how to actually use it.
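
Something like this could even run headless, assuming structs come back as plain objects (Vector3Scale is a raymath function, so the sketch falls back to plain JS if it isn't bound):

// test/struct.js (sketch): check that a struct keeps its members when it
// crosses the binding. no window/GL context needed.
const assert = require('assert')
const r = require('raylib')

const v = { x: 1, y: 2, z: 3 }               // a Vector3 as a plain object
const doubled = r.Vector3Scale
  ? r.Vector3Scale(v, 2)                     // raymath helper, if it's bound
  : { x: v.x * 2, y: v.y * 2, z: v.z * 2 }   // fallback so the test still runs

assert.strictEqual(doubled.x, 2)
assert.strictEqual(doubled.y, 4)
assert.strictEqual(doubled.z, 6)
console.log('Vector3 round-trip ok')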

It is probably worth looking into using cmake-js like the current version instead of node-gyp.

Yes. I am a very big fan of cmake over node-gyp.

So how is this? We fork, add your generator stuff so we are generating most of the C, but everything else stays essentially the same (cmake, etc). Then we can both work on porting tests (starting with missing or unclear features) and trying to hammer out the last missing cases. Once it all feels fairly complete, we PR back. I have a fork here I can add you to, or we can use another.

konsumer avatar Nov 22 '21 05:11 konsumer

One idea is to not use a submodule, and let cmake download it if it's needed. I started working on a raylib4 branch, and I do this:

# the version check doesn't seem to pick the correct version
#find_package(raylib 3.5 QUIET EXACT)
if (NOT raylib_FOUND)
  include(FetchContent)
  FetchContent_Declare(
    raylib
    GIT_REPOSITORY https://github.com/raysan5/raylib.git
    GIT_TAG 3.5.0
  )
  FetchContent_GetProperties(raylib)
  if (NOT raylib_POPULATED)
    set(FETCHCONTENT_QUIET NO)
    FetchContent_Populate(raylib)
    set(BUILD_EXAMPLES OFF CACHE BOOL "" FORCE)
    add_subdirectory(${raylib_SOURCE_DIR} ${raylib_BINARY_DIR})
  endif()
endif()

It allows me to remove the submodule, and the builds work on mac & linux (even if another version of raylib is installed). So this essentially fixes my original problem, and I like it because it only downloads at build time (so if we have a setup that tries to download pre-built releases, it's going to have a nice fallback that doesn't bloat the npm package). I'm going to use this idea as a base for the raylib 4 generation fork, but it could be used in 3.5 to fix things, as well.

konsumer avatar Nov 22 '21 09:11 konsumer

I started working on generation here. It's not finished, but I think it's a good start. I ended up just trying to generate code in the same style as this repo's. The actual generator is here. I won't have a lot of time to work on it, for a little while, but I am happy to add you guys to the repo, if you have time to work on it.

konsumer avatar Nov 22 '21 13:11 konsumer

This is looking great, all! Happy to merge whenever you feel fit. Will be able to test this later on today too.

I checked out the emscripten branch - it seems like using raylib from wasm assumes you are in a browser context, so I couldn't create a window running it from node. I think from wasm it won't interface with GLFW properly outside the browser?

Correct. For running in a desktop context, it uses an Electron alternative called Neutralino. Still trying to figure out the best design for it. I'll split the emscripten/wasm work into a new repository so it's not distracting from keeping the Node.js plugin up and running. Thanks!

RobLoach avatar Nov 22 '21 16:11 RobLoach

Checking out this branch. Should we move the discussion on updating to 4.0 to its own issue?

twuky avatar Nov 22 '21 16:11 twuky

So I brought over the CMakeLists from your branch to the C generator that I had written, and with cmake-js my repo now compiles and runs on both my linux and windows partitions. If someone had a mac to test, that would be really interesting: https://github.com/twuky/raylib-4.0

The reason I have that branch separate is that it's a pretty different approach from the current one. I'm just creating a really large single file that explicitly wraps each raylib function, attempting to convert NAPI types based on the JSON. It doesn't do any param checking/validation, though.

I do think that the C++ code in the current branch looks a lot cleaner (compared to an 8000-line generated file), but I'm wondering if the use of generic types with the AddDefine introduces performance overhead. raylib isn't an API where we pass large amounts of data into a single function; rather, we pass small amounts of data into many functions, so anything extra done inside each function wrapper could add up. I'm willing to bring things over to whichever repo if you think the approach I took is sound and can get it to build.

I think, in the meantime, adjusting the current 3.5 release's CMakeLists with your edits would fix the install issues a lot of folks are having.

twuky avatar Nov 22 '21 17:11 twuky

Still trying to figure out the best design for it. I'll split the emscripten/wasm work into a new repository so it's not distracting from keeping the Node.js plugin up and running. Thanks!

Sounds good. Like I said, I am really into the native part, but I can definitely see the charm of also running in the browser with the same code, and even on my project I will probably set that up (after native builds are working) for desktop computers, so people can try it out without installing anything.

This is looking great, all! Happy to merge whenever you feel fit. Will be able to test this later on today too.

Maybe we could do it in 2 parts? I think the original problem can be solved by forcing the use of the submodule in cmake or, in my opinion the better way, by downloading at build time in cmake. Either is a small change that works for 3.5 now, without any code generation. I could make a separate PR for that.

The other part is generating from that JSON, and although I think I got pretty far, it's still going to need some tweaking, for sure. It does not currently build. I have a very dumb system where I drop the pointer operator (*) and just ignore anything that doesn't auto-generate correctly (Matrix, for example, is added by hand). I think the pointer-dropping is not the right way, and I get build errors because of it, so we need to work around that, but otherwise it seems pretty close, with mostly auto-generated code.
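
The pointer-dropping part is literally just a string transform along these lines (illustrative, not the actual generator code), which is why it falls over on real pointer types:

// strip '*' from the type name and hope what's left maps to a known type
const normalizeType = t => t.replace(/\s*\*\s*/g, ' ').trim()

console.log(normalizeType('Matrix *'))     // 'Matrix'     -> handled by hand anyway
console.log(normalizeType('const char *')) // 'const char' -> clearly lossy
console.log(normalizeType('float *'))      // 'float'      -> loses the out-param meaning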

So I brought over the CMakeLists from your branch to the C generator that I had written, and with cmake-js my repo now compiles and runs on both my linux and windows partitions. If someone had a mac to test, that would be really interesting.

I'd be happy to build for mac tonight.

The reason I have that branch separate is that it's a pretty different approach from the current one. I'm just creating a really large single file that explicitly wraps each raylib function, attempting to convert NAPI types based on the JSON. It doesn't do any param checking/validation, though.

Yep, I hear that, and it's very similar to how I did it with quickjs (which is still unfinished). For me, it seems a bit easier to troubleshoot in 3 files, and the code it generates has some other style differences (like setting up all the type adapters first, then wrapping the original raylib function directly). I don't think your way is bad, but this feels like more idiomatic napi to me, and it follows the original style more closely, which may not matter at all. I wonder if the performance changes are just 3.5 -> 4.0; under the hood there are some really significant improvements that might account for it, but I really have no idea. I also wrapped all the basic types but didn't actually implement them, so you will see Tofloat outputs 0.0. I do like this style of wrapping all the types, so it just converts when you go back & forth, though. It seems to lend itself to neater, easier-to-follow code, but as you noted, it could be slower.

Maybe once we get it building for 4, we can compare them, and if it sucks, adjust the generators to use code that is more in your style. I think the generation code itself is really simple, so it might be a good base to start with, but I'm not attached to it at all, if other solutions will work better.

I think, in the meantime, adjusting the current 3.5 release's CMakeLists with your edits would fix the install issues a lot of folks are having.

Yep, agreed. As I said above, it's a low-change thing that would fix the build for everyone, and even make the repo/npm-package easier to work with (no submodule) as an initial step.

konsumer avatar Nov 23 '21 02:11 konsumer

Checking out this branch. Should we move the discussion on updating to 4.0 to its own issue?

Oh yeah, forgot to agree. Yes, this should probably be another issue. I made one (#100).

konsumer avatar Nov 23 '21 02:11 konsumer

Is this still borked?

RobLoach avatar May 11 '22 20:05 RobLoach

I think this can be closed. I am using it on mac now, and when I set up the build system I tested it on linux (x86_64 and pi).

konsumer avatar May 11 '22 20:05 konsumer