safemutations
Why is this line required? https://github.com/uber-research/safemutations/blob/40e5fd03a244f89bf157d4bedf79201e706aedc1/maze_domain.py#L344 The line below already appends every visited `state`: https://github.com/uber-research/safemutations/blob/40e5fd03a244f89bf157d4bedf79201e706aedc1/maze_domain.py#L357 Because of this, the initial `state` is appended to `state_buffer` twice.
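A minimal sketch of the duplicate-append pattern described above (function and variable names here are hypothetical, not the actual `maze_domain.py` code): the buffer is seeded with the initial state before a loop that also appends the current state on every step, so the initial state lands in the buffer twice.

```python
def rollout(initial_state, transitions):
    """Illustrative rollout that seeds the buffer and appends each step."""
    state_buffer = [initial_state]  # analogous to the append at maze_domain.py#L344
    state = initial_state
    for step in transitions:
        state_buffer.append(state)  # analogous to #L357: current state every iteration
        state = step(state)
    return state_buffer

buf = rollout(0, [lambda s: s + 1, lambda s: s + 1])
# → [0, 0, 1]: the initial state 0 appears twice
```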
Should I just replace `states` with `list(states)` to resolve the issue?
```
Iteration, fitness, theta0, theta1
0, -2.294979, -0.883752, -0.712093
Traceback (most recent call last):
  File "sm_simple.py", line 405, in...
```
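If the traceback stems from `states` being a one-shot iterator rather than a list (a common Python 2 to 3 issue, since `map()` now returns an iterator — this is an assumption about the error, which the truncated traceback does not confirm), then wrapping it in `list(...)` materializes it once so it can be indexed and iterated repeatedly:

```python
# In Python 3, map() yields a single-use iterator: a second pass sees nothing.
states = map(float, [1, 2, 3])
first_pass = [s for s in states]   # consumes the iterator -> [1.0, 2.0, 3.0]
second_pass = [s for s in states]  # already exhausted -> []

# Materializing with list(...) gives a reusable sequence.
states = list(map(float, [1, 2, 3]))
assert [s for s in states] == [1.0, 2.0, 3.0]
assert [s for s in states] == [1.0, 2.0, 3.0]  # second pass still works
```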
Note that torch.autograd.backward() calculates the sum of gradients across all states (at least in 0.4.1: https://pytorch.org/docs/stable/autograd.html?highlight=backward#torch.autograd.backward). SM-G-SUM feeds backward() grad outputs of 1 and then uses the returned gradients unaltered (i.e...
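A small sketch of the summing behavior mentioned above (the toy tensors are illustrative, not taken from safemutations): passing grad outputs of 1 to `torch.autograd.backward()` for multiple outputs accumulates the sum of each output's gradient into `.grad`.

```python
import torch

theta = torch.tensor(1.0, requires_grad=True)
# Two outputs with gradients 2 and 3 w.r.t. theta.
outputs = torch.stack([theta * 2.0, theta * 3.0])

# Feeding ones as grad outputs, as SM-G-SUM does, sums the per-output gradients.
torch.autograd.backward(outputs, torch.ones_like(outputs))

print(theta.grad)  # tensor(5.): 2 + 3, the SUM over outputs, not per-output gradients
```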
Hi, I am not able to run any version of the simple.py script, no matter which arguments I change. It shows the following: python sm_simple.py --domain easy...