mlx
MLX: An array framework for Apple silicon
[ONNX](https://onnx.ai/) is an open standard for machine learning interoperability that allows developers to use different backends to execute their models. The backend implementation is relatively straightforward: we would need to...
PyTorch layers usually provide a custom `reset_parameters` function, but that is missing in MLX.
Running the snippet

```python
import mlx.core as mx
a = mx.array([1.])
b = mx.array([2.])
mx.array([a, b])
```

produces the error

```
----> [4] mx.array([a,b])
ValueError: Invalid type in array initialization.
```

This...
I just did a git pull and got 0.0.6.dev20231223+f91f450. Then I did:

```shell
env CMAKE_BUILD_PARALLEL_LEVEL="" pip install -e .
pip install ".[testing]"
python -m unittest discover python/tests
```

And I got failures:...
Ran into an issue with the `sum` function when using int64. The following runs OK:

```python
import mlx.core as mx
y = mx.ones((70), dtype=mx.int32)
mx.sum(y)
```

The following crashes every...
## Proposed changes Implement recurrent cells and layers (Elman RNN, GRU, LSTM) in Python. Ultimately, it would probably be more efficient to implement them in Metal for parallelization (esp. for multi-layer...
I was reading through the matmul kernels, and I noticed the beginning of the `vecmat` kernel looked like this:

```cpp
static METAL_FUNC void run(
    const device T* mat,
    const device...
```
Dear MLX Contributors, I hope this message finds you well. I am reaching out to discuss a potential enhancement to the MLX framework that could significantly improve its efficiency, particularly...
I wonder how this compares to, for example, llama.cpp in terms of performance in the same settings?
Currently, all modules such as `nn.Linear` create weights and biases in the default float32 dtype. So if I am not mistaken, to use other dtypes one would first init the model...