
Will exo support public internet access and service provider payment systems in the future?

Open dengbuqi opened this issue 4 months ago • 4 comments

This isn't really an issue; it's an idea from my brainstorming. I am a Chinese student pursuing a PhD in Korea.

About 6 months ago, I had an idea similar to this repo: building something that could split a model and distribute the parts across low-performance devices so that together they form a usable cluster.

However, since it already works on a local network, why not go further and expose it to the Internet, so that your neighbors, or even everyone, can use it? A few days ago I had a bold idea and wrote a rough version of it in my blog (link). I also wrote a related paper a year ago: DFF: Distributed Forward-Forward Algorithm for Large-Scale Model in Low-Performance Devices.

I believe that we can build an AI model computing-power sharing market platform based on a system similar to exo.

On this platform, anyone could share their model (or part of a model, or a workflow) as service nodes. Others could use these nodes by paying a fee to the node owners. I think this would greatly encourage people to share their computing power.

Therefore, to build this AI computing-power sharing market platform, I think we need to address the following points.

  1. Build client software like exo that can split, connect, and share models, so that even non-professionals can easily share their computing power (see the partitioning sketch after this list).
  2. Build a NAT traversal service that helps everyone expose their nodes to the public internet. By relying on the platform's public IP as a rendezvous point, anyone could publish their node services on the internet (see the hole-punching sketch after this list).
  3. Build a payment system: the most effective way to encourage people to adopt something new is to make it free or to let them earn from it. A payment system integrated with the platform is therefore necessary, so that node providers can earn income and users of the node services get a low-cost large-model experience (see the metering sketch after this list).
  4. Security: all of the above functions must be safe and reliable, including data security, payment security, and anonymization, and these concerns need constant attention while building the entire platform.
  5. Model training: in addition to inference, I think we can also try to bring model training into this idea; corresponding research has already begun, such as A Preliminary Report on DisTrO.
  6. Furthermore, most current model architectures are still deep, largely sequential stacks of layers, which are not friendly to decentralized deployment. Research on highly parallelizable, ultra-flat models, such as Making Models Shallow Again: Jointly Learning to Reduce Non-Linearity and Depth for Latency-Efficient Private Inference, is also necessary.
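
To make point 1 a bit more concrete, here is a minimal Python sketch of splitting a layer-stacked model into contiguous shards sized in proportion to each device's contributed memory. This is not exo's actual partitioning code; the class names and the memory-proportional heuristic are my own illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    memory_gb: float  # memory the owner is willing to contribute

@dataclass
class Shard:
    device: str
    start_layer: int  # inclusive
    end_layer: int    # exclusive

def partition_layers(num_layers: int, devices: list[Device]) -> list[Shard]:
    """Assign contiguous layer ranges to devices, proportional to their memory."""
    total_mem = sum(d.memory_gb for d in devices)
    shards, start = [], 0
    for i, d in enumerate(devices):
        if i == len(devices) - 1:
            end = num_layers  # last device takes the remainder
        else:
            end = start + round(num_layers * d.memory_gb / total_mem)
        shards.append(Shard(d.name, start, min(end, num_layers)))
        start = shards[-1].end_layer
    return shards

if __name__ == "__main__":
    devices = [Device("laptop", 8), Device("phone", 4), Device("old-pc", 12)]
    for s in partition_layers(num_layers=32, devices=devices):
        print(s)
```

For point 2, the rough client-side flow of UDP hole punching through a rendezvous server run by the platform could look like the sketch below. The server address, the JSON message format, and the assumption that both peers sit behind NATs that allow punching are all illustrative; a relay fallback would still be needed for symmetric NATs.

```python
import json
import socket

# Hypothetical rendezvous server operated by the platform (illustrative address).
RENDEZVOUS = ("rendezvous.example.org", 7000)

def punch(node_id: str, peer_id: str) -> socket.socket:
    """Register with the rendezvous server, learn the peer's public endpoint,
    then send packets both ways to open the NAT mappings."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(5.0)

    # 1. Register: the server records the public (IP, port) it observes for us.
    sock.sendto(json.dumps({"register": node_id, "peer": peer_id}).encode(), RENDEZVOUS)

    # 2. Once both sides have registered, the server replies with the
    #    peer's observed public endpoint.
    data, _ = sock.recvfrom(1024)
    reply = json.loads(data)
    peer_addr = (reply["ip"], reply["port"])

    # 3. Both peers send to each other's public endpoint; the first packets
    #    create outbound NAT mappings that let the later ones through.
    for _ in range(5):
        sock.sendto(b"punch", peer_addr)
    return sock  # ready for model-shard traffic (or fall back to a relay)
```

For point 3, a very simple picture of metering: each node reports usage records (tokens processed per request) and the platform settles them against a posted per-token price at the end of a billing period. The record fields and price units here are illustrative only; real settlement would also need signatures and fraud checks, which belong under point 4.

```python
from dataclasses import dataclass

@dataclass
class UsageRecord:
    node_id: str     # provider that served the shard
    request_id: str
    tokens: int      # tokens processed on this node

def settle(records: list[UsageRecord], price_per_1k_tokens: float) -> dict[str, float]:
    """Aggregate what the platform owes each node owner for a billing period."""
    owed: dict[str, float] = {}
    for r in records:
        owed[r.node_id] = owed.get(r.node_id, 0.0) + r.tokens / 1000 * price_per_1k_tokens
    return owed

if __name__ == "__main__":
    records = [
        UsageRecord("laptop", "req-1", 1200),
        UsageRecord("phone", "req-1", 800),
        UsageRecord("laptop", "req-2", 2400),
    ]
    print(settle(records, price_per_1k_tokens=0.002))
```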

I am very interested in decentralized AI and the corresponding foundational model research. If anyone else is interested, let's connect and discuss it together. [email: dengqikang0 at gmail dot com]

dengbuqi · Oct 07 '24 05:10