Exo Project Status
There have been almost no commits or comments on the tickets in a while; can you give us an update on the project status, please?
Thanks in advance.
Lead dev's Github profile doesn't look hopeful. Hope they're ok.
I agree with your concern, @KodeMunkie. Has anyone heard from Alex Cheema (last active May 10th)?
The significance of Exo lies in its potential to redefine AI through three pillars:
- Decentralization: No single entity controls compute power or models.
- Democratization: Access is not gatekept by wealth or corporate approval.
- Privacy: Data remains local, dismantling surveillance capitalism’s infrastructure.
If Exo Labs (and similar tools) succeed, they could disrupt the AI oligopoly; this is Big Tech's worst nightmare. Recall how they had a go at DeepSeek; this was day 7.
Just sharing my perspective.
Joshua Njovu
@KodeMunkie please see: https://github.com/exo-explore/exo/pull/825. Contributions are welcome.
No Exo updates on the official EXO Labs website/blog either?
- https://exolabs.net
- https://blog.exolabs.net

Not sure if anyone heard more via Discord or social media?
- https://discord.gg/EUnjGpsmWw
- https://x.com/exolabs
- https://t.me/+Kh-KqHTzFYg3MGNk
@AlexCheema looks to still be posting new EXO Labs related posts on LinkedIn as well as on his personal Twitter (X) account:
- https://www.linkedin.com/company/exolabsai/
- https://www.linkedin.com/posts/exolabsai_last-week-at-iclr-2025-in-singapore-our-activity-7325267026963292161-oW04
- https://www.linkedin.com/posts/alex-cheema_last-week-at-iclr-2025-in-singapore-our-activity-7325318752936370176-ThWe
- https://openreview.net/forum?id=stFPf3gzq1
- https://www.linkedin.com/in/alex-cheema/
- https://x.com/alexocheema
- https://x.com/alexocheema/status/1935792086978379903
PS: To paraphrase Alex Cheema's old post on the EXO Labs account on Twitter (X): I really hope that the future is still open-source.
- https://x.com/exolabs
> @AlexCheema looks to still be posting new EXO Labs related posts on LinkedIn:
Could be automated. Yeah, I'm hoping they're OK and this project isn't abandoned; it has a lot of potential. If so, maybe @smithcoin would pick it up with a fork.
I think the consensus is this is over.
@smithcoin Looks like yours is the main fork now, this is Alex's profile recently:
Shame this project died off; it had too much potential.
Two Scenarios
- Big industry players would not allow it to move forward.
- It went private with this version as the last open source.
The best we can do is to keep it alive. There is a new repository.
Seems they are rebuilding the whole project: source
> exo v2 pending… completely rebuilt from the ground up using event sourcing design principles; scalable, reliable and future-proof. looking forward to showing everyone what we’ve been working on!!
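For anyone unfamiliar with the event sourcing pattern the announcement mentions, here is a minimal illustrative sketch in Python. Nothing below is from the exo v2 codebase; the event names and the cluster-membership example are invented purely to show the idea.

```python
from dataclasses import dataclass

# Event sourcing in a nutshell: state is never mutated directly.
# Every change is recorded as an immutable event, and the current
# state is a projection derived by replaying the event log.

@dataclass(frozen=True)
class Event:
    kind: str     # hypothetical kinds: "node_joined", "node_left"
    node_id: str

def apply(state: set, event: Event) -> set:
    """Pure function: fold one event into the derived state."""
    if event.kind == "node_joined":
        return state | {event.node_id}
    if event.kind == "node_left":
        return state - {event.node_id}
    return state  # unknown events are ignored

def project(log: list) -> set:
    """Rebuild the current state by replaying the whole log."""
    state = set()
    for event in log:
        state = apply(state, event)
    return state

# The log is the source of truth; state is recomputed from it.
log = [
    Event("node_joined", "mac-mini-1"),
    Event("node_joined", "mac-mini-2"),
    Event("node_left", "mac-mini-1"),
]

print(sorted(project(log)))  # → ['mac-mini-2']
```

Because the log is append-only, the same history can be replayed into new projections later, which is presumably the "scalable, reliable and future-proof" angle the tweet is alluding to.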
That makes me wonder whether the new project will still be open-source under the same GPL-3.0 license. Suspicions were that they went into stealth mode because of mergers and acquisitions or something, and if so, the risk is that they want to change the license.
- https://blueoakcouncil.org/mergers-and-acquisitions
Another common reason why companies go into stealth mode is right before they announce an IPO (Initial Public Offering).
- https://en.wikipedia.org/wiki/Initial_public_offering
Anyway, hoping that @AlexCheema or @MattBeton might comment on that here as well.
PS: I can see from @MattBeton's GitHub profile that he is still actively working in the repo for EXO Gym:
- https://github.com/exo-explore/gym
I still get the feeling that they used the community to work on their commercial product for free. It's happening more and more out there: get free bug testing and code contributions, then, once it's up and running, vanish and bring it in-house.
After months of hearing nothing about my PR #139 being closed, and given how things are now, I have also started a hard fork, focused on using just PyTorch. It's a solo dev project, but I'm doing more updates as I can.
https://github.com/shamantechnology/xotorch
It seems this and the digital mint fork here are the ongoing efforts around EXO open source.
https://github.com/digitalmint/exo
@risingsunomi, just making sure you know there are two forks; maybe you want to collab with digitalmint going forward.
I'm happy to jump in too, just been figuring out current project status this weekend.
Sounds good, happy to see so much effort towards this project and the whole idea of distributed inference and learning. I wouldn't mind helping other efforts like the @smithcoin project and yours. Right now I am focusing on trying to rewrite for PyTorch and make inference faster, but the support I have been getting from this community has been fantastic. I will be building out more and am open to any PRs.
Appreciate you all, thank you
My opinion, having fought with exo/tinygrad to get both AMD GPUs and Alchemist to "work" with it, is that the project is a technological dead end, which is the reason its founders abandoned it as soon as they got a golden ticket from Nvidia for some free gear (and maybe jobs at a FAANG corp).
Shortly before the end, the focus largely shifted towards MLX-based workloads. This work, while supporting many more models on Apple Silicon, was something like half as performant as native distributed MLX.
On Linux, support was always limited to some old Llama 3 models, because tinygrad (the library that does inference on Linux) seems to only support them as a PoC, from what I see. Recent changes to tinygrad even broke the ability to do distributed inference with exo at all.
So as I said before, efforts are likely better spent elsewhere rather than on an interesting project whose creators chose to ghost.
The future of distributed inference on commodity cards is likely with distributed-ollama / distributed-llama.cpp.
@deftdawg -- thanks for the summary and advice, greatly appreciated.
If you would please consider updating the project readme (or asking someone who can to), and/or the discord server, a lot of people would appreciate that.
It's quite a waste of time for a lot of people to find the project and readme, go to the Discord, and then have to dig into the GitHub issues and read the voluminous contents of #819 and #825 to discover the project is now a dead end.
On one hand, super appreciate that you're bothering to reply at all. On the other, just a few minutes from someone with write access to the repo and/or admin access to the discord would save the folks still discovering the project, and hoping to contribute, a lot of time wasted trying to understand project status.
@deftdawg thank you for a more detailed explanation as to where you were coming from.
I deleted my fork as there did not appear to be any interest in continuing the project. If there is interest I would be open to starting a community fork.
Oof, I had exo on my list of projects to explore for distributed AI model work. I started working with it this weekend, then quickly ran into a few bugs here, a few bugs there... and found this issue.
Glad I found it after like 30 minutes and not 3 hours!
Sad to see the project die off though; it seemed like it would've been a bit easier to get started with than other options.
I'm really sad that project got abandoned.
Classic VC rug pull? A so-called venture capitalist rug pull happens when developers of a project deceive investors and disappear with their money. https://www.bitpanda.com/academy/en/lessons/rug-pull-definition-meaning-examples/
I agree this project has been shut down.
I am reluctant to close this issue, both so that others can easily find it under open tickets and to give Alex a final chance to respond (I pinged them on Discord about this ticket).
I started down the path of using exo with two M4 Mac minis and it looked pretty promising, but it needs some work. I have since abandoned exo but haven't found a replacement. What are others using?
https://github.com/gpustack/gpustack
My comment was censored.
Thanks @KodeMunkie I'll check that out.
Give my exo fork a try. It's focused on using PyTorch and still needs MoE support, which I am working on and hope to have up next week. I'm trying to fix the issues left in v1, but I hope to expand this further with support for different LLMs and quantization.
https://github.com/shamantechnology/xotorch
I will. Thanks for your work on this. Does your fork support MLX?
It should through PyTorch, but let me know of any issues.
I'm sorry, I am just getting back to working on my project. xotorch via PyTorch uses MPS, which right now does not work. I am going to create another repo for just MLX distributed inference, and later training, using the xotorch base code. It will be at xomlx, but I'm just starting it tonight.