DeepRL-PPO-tutorial
This repository contains tutorial material for the session "Doing DeepRL with PPO" at GDG DevFest 2017 Seoul.
I am using Python 2, and when I run `python ppo.py` this error comes up: `File "ppo.py", line 198` — `*step_losses, _ = tf.get_default_session().run([self.ent, self.vf_loss, self.pol_loss, self.update_op], feed_dict={self.obs_place: traj["ob"][cur:cur+self.batch_size], ...`
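The likely cause is that extended iterable unpacking (`*step_losses, _ = ...`) is Python 3 syntax (PEP 3132); under Python 2 it raises a SyntaxError at exactly that line. A minimal sketch of a Python 2 compatible rewrite, using a plain list in place of the value actually returned by `tf.get_default_session().run(...)` (the values below are illustrative, not from the repo):

```python
# Stand-in for the list that session.run() returns for
# [self.ent, self.vf_loss, self.pol_loss, self.update_op].
results = [0.01, 0.5, 0.3, None]  # illustrative values only

# Python 3 only (SyntaxError under Python 2):
#   *step_losses, _ = results

# Python 2 compatible equivalent: slice off the last element.
step_losses, _ = results[:-1], results[-1]
print(step_losses)  # [0.01, 0.5, 0.3]
```

Alternatively, running the tutorial under Python 3 avoids the change entirely.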
In `cmake -DBUILD_SHARED_LIBS=ON -DUSE_DOUBLE_PRECISION=1 _DCMAKE_INSTALL_PREFIX:PATH=$ROBOSCHOOL_PATH/roboschool/cpp-household/bullet_local_install -DBUILD_CPU_DEMOS=OFF -DBUILD_BULLET2_DEMOS=OFF -DBUILD_EXTRAS=OFF -DBUILD_UNIT_TESTS=OFF -DBUILD_CLSOCKET=OFF -DBUILD_ENET=OFF -DBUILD_OPENGL3_DEMOS=OFF ..`, it seems `_DCMAKE_INSTALL_PREFIX` should be corrected to `-DCMAKE_INSTALL_PREFIX`.
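For reference, the full command with the corrected flag would read (a config fragment only; all other options unchanged from the original):

```shell
# Note the leading "-D" on CMAKE_INSTALL_PREFIX, which defines the cache
# variable; "_D" would be passed through as a stray positional argument.
cmake -DBUILD_SHARED_LIBS=ON -DUSE_DOUBLE_PRECISION=1 \
  -DCMAKE_INSTALL_PREFIX:PATH=$ROBOSCHOOL_PATH/roboschool/cpp-household/bullet_local_install \
  -DBUILD_CPU_DEMOS=OFF -DBUILD_BULLET2_DEMOS=OFF -DBUILD_EXTRAS=OFF \
  -DBUILD_UNIT_TESTS=OFF -DBUILD_CLSOCKET=OFF -DBUILD_ENET=OFF \
  -DBUILD_OPENGL3_DEMOS=OFF ..
```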