ArnoldC
Variable type is documented as a signed 16-bit integer in the wiki but actually appears to be a signed 32-bit integer
Hi, nice project!
I wrote an implementation of the Fibonacci sequence in ArnoldC and found that numbers above 2^15 were represented correctly, but numbers above 2^31 wrapped around into negative values. So the integer type appears to be a signed 32-bit integer rather than a signed 16-bit one.
Links:
- Fibonacci sequence output with numbers printed: https://github.com/jamesrr39/fibonacci-arnoldc-docker/blob/48134ecdf16d4404e52442203ddca53cec900383/README.md
- Wiki page stating "The only variable type in ArnoldC is a 16bit signed integer": https://github.com/lhartikk/ArnoldC/wiki/ArnoldC
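For what it's worth, the observed wraparound is consistent with the behavior of a Java `int` (ArnoldC compiles to JVM bytecode, so mapping its numeric type onto a 32-bit `int` seems plausible, though I haven't checked the compiler source). A minimal Java sketch reproducing the same overflow pattern with Fibonacci numbers:

```java
public class FibWrap {
    public static void main(String[] args) {
        // Iterate the Fibonacci sequence using 32-bit int arithmetic,
        // which wraps around (two's complement) on overflow.
        int a = 0, b = 1; // fib(0), fib(1)
        for (int i = 2; i <= 47; i++) {
            int next = a + b; // wraps once the true value exceeds 2^31 - 1
            a = b;
            b = next;
        }
        System.out.println(a); // fib(46) = 1836311903, still fits in an int
        System.out.println(b); // fib(47) overflows and prints -1323752223
    }
}
```

fib(46) = 1836311903 is the largest Fibonacci number below 2^31 - 1, and fib(47) = 2971215073 exceeds it, wrapping to a negative value exactly as seen in the ArnoldC output. A genuine 16-bit type would instead have wrapped at fib(24) = 46368 > 2^15 - 1.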