
Massively Parallel Deep Reinforcement Learning. 🔥

Results: 156 ElegantRL issues

There is no way to run the examples and demo projects on Google Colab! None of them works correctly, and each one fails with a series of errors.

bug

We want to keep `elegantrl_helloworld` easy to install, run, and read. - **Easy to install**: only `torch` and `gym` need to be installed. - **Easy to run**: there are only four core files, `agent.py, net.py, run.py, config.py`. - **Easy to read**: the total line count stays under 1,000 lines. ...

discussion
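To make those goals concrete, here is a minimal sketch of what a helloworld run could look like; the imported names (`Config`, `AgentPPO`, `train_agent`) and the `Config` arguments are assumptions inferred from the four file names above, not a confirmed API.

```python
# Hypothetical sketch of the elegantrl_helloworld workflow described above.
# Dependencies are limited to `torch` and `gym`; the imported names and the
# Config arguments are illustrative guesses based on the four core files.
import gym

from elegantrl_helloworld.config import Config    # config.py: hyperparameters
from elegantrl_helloworld.agent import AgentPPO   # agent.py: agent classes (built on net.py)
from elegantrl_helloworld.run import train_agent  # run.py: the training loop

args = Config(agent_class=AgentPPO, env_class=gym.make,
              env_args={"id": "Pendulum-v1"})     # illustrative environment
train_agent(args)
```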

Would it be possible to get an example of training MAPPO in a sample multi-agent (MA) environment?

bug

`Arguments` no longer has a `horizon_len` member (https://github.com/AI4Finance-Foundation/ElegantRL/blob/68e11747176928d6b225f90395c1e2c97b8dfdb6/elegantrl/train/config.py#L30), yet it is still referenced in https://github.com/AI4Finance-Foundation/ElegantRL/blob/4ad1367a4da5c92d77862cab936978655a323dff/elegantrl/train/run.py#L73

bug
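For the `horizon_len` issue above, a workaround sketch until `config.py` and `run.py` agree again: attach the attribute to the `Arguments` instance before training. Only the attribute name comes from the issue; the constructor call and the default value are assumptions.

```python
# Workaround sketch: run.py (linked above) still reads args.horizon_len,
# so attach the attribute explicitly before training. The constructor call
# and the default value are illustrative, not taken from the repository.
from elegantrl.train.config import Arguments
from elegantrl.agents import AgentPPO

env_args = {"env_name": "Pendulum-v1", "state_dim": 3,
            "action_dim": 1, "if_discrete": False}   # illustrative env description
args = Arguments(agent_class=AgentPPO, env_args=env_args)

if not hasattr(args, "horizon_len"):
    args.horizon_len = 2048  # illustrative default; pick a value that fits your environment
```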

Searching for the function `build_env` in ElegantRL gives: > 21 code results in [AI4Finance-Foundation/ElegantRL](https://github.com/AI4Finance-Foundation/ElegantRL) The import `from elegantrl.envs.Gym import build_env` could be changed to `from elegantrl.train.config import build_env`. The function `build_env`...

refactoring
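The `build_env` change suggested above is an import-path swap, which would then need to be applied at each of the code results found by the search:

```python
# Old location (per the issue, this path is stale):
# from elegantrl.envs.Gym import build_env
# Suggested replacement:
from elegantrl.train.config import build_env
```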

There is an error when I try to execute the train function in Demo_MultiCrypto_Trading.ipynb: AttributeError: 'AgentPPO' object has no attribute 'traj_list' ![image](https://user-images.githubusercontent.com/74358636/176986404-418420e3-162f-4609-a04f-dec5007d496e.png) Please kindly advise how I should fix this...

bug
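For the `traj_list` error above, a quick check like the following can confirm the API mismatch before running the notebook's training cell; the attribute name comes from the traceback, while the import path is an assumption about the installed ElegantRL version.

```python
import inspect
from elegantrl.agents import AgentPPO

# If "traj_list" never appears in AgentPPO's class hierarchy, the installed
# agent does not define it and the notebook targets an older ElegantRL API.
sources = "".join(inspect.getsource(cls) for cls in AgentPPO.__mro__ if cls is not object)
print("traj_list found in AgentPPO hierarchy:", "traj_list" in sources)
```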

Offline RL has recently become a very hot research topic, and it seems to be redefining the paradigm of RL (e.g., the Decision Transformer). May I ask whether ElegantRL will support offline...

Suggestion

I find that some files are missing from the multi-agent (MA) algorithms; I wonder whether a demo for the MA algorithms could be provided.

bug

First, thanks for your work, which has helped me a lot. These days I have been trying to use MADDPG, but I have met many obstacles, so if you could provide a demo for multi-agent training, it...

help_wanted

If you have any suggestions about `ElegantRL Helloworld`, you can discuss them here, and **we will keep an eye on this issue**. ElegantRL's code, especially the Helloworld, really needs a...

Suggestion