batch_rl
Can a custom env be added to the current framework?
Hi, I am new to this repo and to the offline RL setting. I assume it should be possible, but I would still like to hear suggestions from the pros. More importantly, if it is possible to add a new Gym env, how should the offline data be prepared?
Thank you very much!
@psc might be able to give more details, but Dopamine supports OpenAI Gym environments too, and there is data-logging code in this repo (look in the baselines directory) that can be used with any env supported by Dopamine.
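In case it helps, here is a minimal sketch of plugging a custom Gym environment into Dopamine. The environment itself (`MyCustomEnv`, with a toy 4-dim observation and random-walk dynamics) is purely a made-up placeholder; only Gym's registration API and Dopamine's `gym_lib.create_gym_environment` helper come from the libraries themselves.

```python
import gym
import numpy as np
from gym import spaces
from gym.envs.registration import register

from dopamine.discrete_domains import gym_lib


class MyCustomEnv(gym.Env):
  """Toy placeholder env: 4-dim continuous observation, 2 discrete actions."""

  def __init__(self):
    self.observation_space = spaces.Box(
        low=-1.0, high=1.0, shape=(4,), dtype=np.float32)
    self.action_space = spaces.Discrete(2)
    self._state = np.zeros(4, dtype=np.float32)
    self._steps = 0

  def reset(self):
    self._state = np.zeros(4, dtype=np.float32)
    self._steps = 0
    return self._state

  def step(self, action):
    # Placeholder dynamics: bounded random walk, reward equal to the action.
    noise = np.random.uniform(-0.1, 0.1, size=4)
    self._state = np.clip(self._state + noise, -1.0, 1.0).astype(np.float32)
    reward = float(action)
    self._steps += 1
    done = self._steps >= 200
    return self._state, reward, done, {}


# Register the env under a name Gym can resolve.
register(id='MyCustomEnv-v0', entry_point=MyCustomEnv)

# Dopamine appends the version itself and wraps the env in GymPreprocessing.
env = gym_lib.create_gym_environment('MyCustomEnv', version='v0')
```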
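And for preparing the offline data, here is a rough sketch of one way to log transitions with Dopamine's out-of-graph replay buffer, again assuming the placeholder env above and a uniformly random behaviour policy. As far as I can tell, the saved `*.gz` checkpoints match the format the fixed-replay agents in this repo read back, but treat the path, capacity, and step counts as illustrative.

```python
import os

import numpy as np
from dopamine.replay_memory import circular_replay_buffer

checkpoint_dir = '/tmp/my_custom_env_logs'   # hypothetical output location
os.makedirs(checkpoint_dir, exist_ok=True)

replay_buffer = circular_replay_buffer.OutOfGraphReplayBuffer(
    observation_shape=(4,),       # matches MyCustomEnv's observation space
    stack_size=1,                 # no frame stacking for vector observations
    replay_capacity=100000,
    batch_size=32,
    observation_dtype=np.float32)

observation = env.reset()
for _ in range(10000):
  action = env.action_space.sample()          # placeholder behaviour policy
  next_observation, reward, done, _ = env.step(action)
  replay_buffer.add(observation, action, reward, done)
  observation = env.reset() if done else next_observation

# Dumps the buffer contents as gzipped numpy arrays
# ($store$_observation_ckpt.0.gz, $store$_action_ckpt.0.gz, ...).
replay_buffer.save(checkpoint_dir, iteration_number=0)
```

The logging agents under the baselines directory do essentially this during training, so for a real run it is probably cleaner to adapt them rather than roll your own loop.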