
huggingface hub integration

Open adamamer20 opened this issue 1 year ago • 6 comments

I have seen that https://gradio.app/ is used in the UIs for Hugging Face. @wang-boyu have you looked into it, since it is listed as one of the possible frameworks to use in the GSoC wiki? See also https://github.com/projectmesa/mesa/discussions/1276.

Originally posted by @rht in https://github.com/projectmesa/mesa/discussions/1622#discussioncomment-6062194

adamamer20 avatar May 19 '24 00:05 adamamer20

I was wondering if you were considering integrating the library with Hugging Face. The free tier supports easy uploading and downloading of datasets, and this could also be applied to models. More details here: https://huggingface.co/docs/hub/en/models-adding-libraries#register-your-libraries-supported-tasks-on-the-hub

adamamer20 avatar May 19 '24 00:05 adamamer20

That's a good idea. We can integrate the RL models (more relevant to AI) near the end of the summer once @harshmahesheka has written several models.

rht avatar May 20 '24 00:05 rht

I think this is a good idea for the future and for the community. Many ABMs can be seen as graphs that change over time, and Graph Neural Networks, which are very popular in deep learning now, work well with these graph structures. So, I believe there will be more connections between AI and ABM in the future.
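To make the graph view concrete, here is a toy sketch (pure Python, all names illustrative, not Mesa API) of extracting a per-step interaction graph from an agent population — the kind of sequence of graph snapshots a GNN could consume:

```python
import random

random.seed(1)

# Toy agents on a ring of 10 cells; each step, agents move and we record
# which pairs are adjacent, yielding one graph snapshot per model step.
positions = {agent_id: random.randrange(10) for agent_id in range(5)}

def step(positions):
    """Move each agent one cell left or right on the ring."""
    return {a: (p + random.choice([-1, 1])) % 10 for a, p in positions.items()}

def interaction_graph(positions):
    """Edge between agents within ring-distance 1: one graph snapshot."""
    edges = set()
    agents = list(positions)
    for i, a in enumerate(agents):
        for b in agents[i + 1:]:
            d = abs(positions[a] - positions[b])
            if min(d, 10 - d) <= 1:
                edges.add((a, b))
    return edges

snapshots = []
for _ in range(3):
    positions = step(positions)
    snapshots.append(interaction_graph(positions))

print(len(snapshots))  # one interaction graph per step: a dynamic graph
```

Each snapshot is just an edge set over the same node set, i.e. exactly the time-varying graph structure that temporal GNNs operate on.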

adamamer20 avatar May 20 '24 16:05 adamamer20

A bit late to the reply. Been sick.

IIRC, the usual intersection between AI and ABM, beyond the similarity in mathematical structure, is surrogate modelling: you approximate an expensive ABM by having a NN simulate the time evolution of the model. It's similar to AlphaFold vs simulating the actual protein folding. I have seen this implemented in Julia: https://github.com/SciML/Surrogates.jl.
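A minimal illustration of the surrogate idea, using a cheap closed-form stand-in for an "expensive" ABM and a least-squares polynomial fit in place of a neural network (everything here is illustrative, not Surrogates.jl or Mesa API):

```python
import numpy as np

def expensive_abm(beta: float) -> float:
    """Stand-in for a costly ABM run: final epidemic size as a function
    of transmission rate `beta` (illustrative closed form, not a real ABM)."""
    return 1.0 - np.exp(-3.0 * beta)

# 1. Run the expensive model at a handful of design points.
train_beta = np.linspace(0.0, 1.0, 8)
train_out = np.array([expensive_abm(b) for b in train_beta])

# 2. Fit a cheap surrogate (degree-5 polynomial via least squares).
surrogate = np.poly1d(np.polyfit(train_beta, train_out, deg=5))

# 3. Query the surrogate at new parameter values instead of re-running the model.
test_beta = 0.37
approx = surrogate(test_beta)
exact = expensive_abm(test_beta)
print(abs(approx - exact) < 1e-2)  # the surrogate tracks the model closely
```

In practice the design points come from actual ABM runs (the expensive part, done once), and the fitted surrogate is then used for fast parameter sweeps, calibration, or sensitivity analysis.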

rht avatar May 25 '24 08:05 rht

@adamamer20 what's your view on this issue now?

EwoutH avatar Sep 03 '24 15:09 EwoutH

> @adamamer20 what's your view on this issue now?

It's worthwhile, primarily for publishing mesa-rl models. For now, the main purpose would be to introduce mesa to the AI community. In the future, it could also be valuable for publishing multi-agent models with millions of agents, where storing trained parameters / tensors is necessary. The absence of a capacity limit could also be extremely useful for creating fully reproducible mesa models (including the datasets used, which are usually hosted elsewhere).

adamamer20 avatar Sep 05 '24 04:09 adamamer20