
Bug Of Agent init

Open LoveSophia opened this issue 3 years ago • 2 comments

In AgentBase.py, AgentBase.__init__():

class AgentBase:
    def __init__(self, net_dim: int, state_dim: int, action_dim: int, gpu_id: int = 0, args: Arguments = None):
        ...
        '''network'''
        act_class = getattr(self, "act_class", None)
        cri_class = getattr(self, "cri_class", None)
        self.act = act_class(net_dim, self.num_layer, state_dim, action_dim).to(self.device)
        self.cri = cri_class(net_dim, self.num_layer, state_dim, action_dim).to(self.device) \
            if cri_class else self.act
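For context, the getattr lookup above expects each agent subclass to set act_class / cri_class as class attributes, which the base __init__ then instantiates. A framework-free sketch of that pattern (ActorStub and AgentStub are hypothetical names, not part of ElegantRL):

```python
class ActorStub:
    """Stand-in for a network class; records the args it was built with."""
    def __init__(self, net_dim, num_layer, state_dim, action_dim):
        self.shape = (net_dim, num_layer, state_dim, action_dim)

class AgentBase:
    def __init__(self, net_dim, state_dim, action_dim, num_layer=2):
        # Look up the network class set by the concrete subclass.
        act_class = getattr(self, "act_class", None)
        self.act = act_class(net_dim, num_layer, state_dim, action_dim)

class AgentStub(AgentBase):
    act_class = ActorStub  # each subclass picks its own network class

agent = AgentStub(256, 8, 4)
print(agent.act.shape)  # (256, 2, 8, 4)
```

This is why the bug surfaces only at agent construction time: the base class blindly forwards its argument list to whatever class the subclass registered.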

The code passes num_layer when initializing the act network, but in agent/net.py most network classes' __init__ methods do not accept a num_layer parameter, e.g. QNetTwinDuel:

class QNetTwinDuel(nn.Module):  # D3QN: Dueling Double DQN

    def __init__(self, mid_dim, state_dim, action_dim):
        super().__init__()
        self.net_state = nn.Sequential(
            nn.Linear(state_dim, mid_dim),
            nn.ReLU(),
            nn.Linear(mid_dim, mid_dim),
            nn.ReLU(),
        )

This signature mismatch raises a TypeError when the agent is initialized.
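A minimal, torch-free reproduction of the mismatch, plus one possible fix sketch (threading num_layer through the network constructor). The QNetTwinDuelFixed class and build_mlp_dims helper are hypothetical illustrations, not the actual change made in the linked PR:

```python
class QNetTwinDuel:
    """Stand-in with the same 3-argument signature as the real network."""
    def __init__(self, mid_dim, state_dim, action_dim):
        self.dims = (mid_dim, state_dim, action_dim)

# AgentBase passes FOUR positional args: (net_dim, num_layer, state_dim, action_dim)
try:
    QNetTwinDuel(256, 2, 8, 4)
except TypeError as err:
    print("agent init fails:", err)

def build_mlp_dims(mid_dim, num_layer, state_dim):
    """Layer sizes [state_dim, mid_dim, ..., mid_dim] with num_layer hidden layers."""
    return [state_dim] + [mid_dim] * num_layer

class QNetTwinDuelFixed:
    """Sketch of a network whose constructor accepts num_layer."""
    def __init__(self, mid_dim, num_layer, state_dim, action_dim):
        self.layer_dims = build_mlp_dims(mid_dim, num_layer, state_dim)

net = QNetTwinDuelFixed(256, 2, 8, 4)
print(net.layer_dims)  # [8, 256, 256]
```

Either every network class accepts num_layer, or AgentBase stops forwarding it; the codebase just has to pick one convention consistently.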

LoveSophia — Sep 29 '22 10:09

Thanks for reporting it. ElegantRL has been updated frequently recently.

zhumingpassional — Nov 11 '22 00:11

The following pull request fixes this bug: "Fix bug for vec env and agentbase init" https://github.com/AI4Finance-Foundation/ElegantRL/pull/248

Yonv1943 — Jan 09 '23 02:01