
AttributeError: 'gMLP' object has no attribute 'get_X_preds'

Open · makinno opened this issue on Oct 03, 2023 · 0 comments

Context: I am using the Flower framework along with a custom gMLP model to perform federated learning. When I try to use the get_X_preds method during evaluation, I run into an error: the gMLP model object does not appear to have a get_X_preds method.
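
As far as I can tell, get_X_preds is a method that tsai adds to the fastai Learner (via tsai.inference), not to the model class itself, so a bare gMLP instance would not have it. For reference, this is roughly the usage pattern I expected to work; the data below is synthetic and only meant as a sketch:

import numpy as np
from tsai.all import *  # gMLP, get_ts_dls, ts_learner, TSRegression, TSStandardize, mae, rmse

# Synthetic data: 100 samples, 2 variables, 50 time steps (placeholder shapes)
X = np.random.randn(100, 2, 50).astype(np.float32)
y = np.random.randn(100).astype(np.float32)

splits = RandomSplitter(valid_pct=0.2, seed=42)(range_of(X))
dls = get_ts_dls(X, y, splits=splits, tfms=[None, TSRegression()],
                 batch_tfms=[TSStandardize(by_sample=True, by_var=True)], bs=64)

model = gMLP(2, 1, seq_len=50)
learn = ts_learner(dls, model, metrics=[mae, rmse])
learn.fit_one_cycle(1, 1e-3)

# get_X_preds lives on the Learner, not on the model
probas, targets, preds = learn.get_X_preds(X, y, with_decoded=True)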

Error Message: When attempting to run federated learning with the FlowerClient class below, I get:

AttributeError: 'gMLP' object has no attribute 'get_X_preds'

Steps to Reproduce: To reproduce the issue, follow these steps (a minimal standalone repro is also shown after the list):

  1. Create a FlowerClient class similar to the one mentioned below:
import torch
import flwr as fl
from tsai.all import *  # gMLP, get_ts_dls, ts_learner, TSStandardize, RandomSplitter, range_of, mae, rmse

# device, seq_length and the train/test arrays are defined elsewhere in my script
model = gMLP(2, 1, seq_len=seq_length).to(device)
class FlowerClient(fl.client.NumPyClient):
    def __init__(self, model, data):
        self.model = model
        self.X_train, self.y_train, self.X_test, self.y_test = data
        splits = RandomSplitter(valid_pct=0.2, seed=42)(range_of(self.X_train))
        tfms = [None, [TSStandardize(by_sample=True, by_var=True)]]
        batch_tfms = [TSStandardize(by_sample=True, by_var=True)]
        train_dls = get_ts_dls(self.X_train, self.y_train, splits=splits, tfms=tfms, batch_tfms=batch_tfms, bs=64)
        self.learn = ts_learner(train_dls, self.model, metrics=[mae, rmse])

    def get_parameters(self, config):
        parameters = [p.detach().cpu().numpy() for p in self.model.parameters()]
        return parameters

    def fit(self, parameters, config):
        # Load the parameters received from the server into the local model
        with torch.no_grad():
            for i, (param, param_np) in enumerate(zip(self.model.parameters(), parameters)):
                param.copy_(torch.Tensor(param_np))
        
        self.learn.fit_one_cycle(2, 1e-3)       
        updated_parameters = self.get_parameters(self.model)
        return updated_parameters, len(self.X_train), {}

    def evaluate(self, parameters, config):
        # Load the parameters received from the server into the local model
        with torch.no_grad():
            for i, (param, param_np) in enumerate(zip(self.model.parameters(), parameters)):
                param.copy_(torch.Tensor(param_np).to(device))
                
        # This is the call that triggers the AttributeError
        probas, targets, preds = self.learn.get_X_preds(self.X_test, self.y_test, with_decoded=True)
        probas = probas.to(device)
        targets = targets.to(device)
        preds = preds.to(device)
        mae_score = mae(preds, targets)
        rmse_score = rmse(preds, targets)
        return {"rmse": float(rmse_score), "mae": float(mae_score)}
  2. Use a gMLP model as the self.model in the FlowerClient.
  3. Execute federated learning rounds with Flower using this client.
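
For completeness, the same error message can be reproduced outside Flower by calling get_X_preds directly on the model object (shapes are placeholders):

import numpy as np
from tsai.all import gMLP

seq_length = 50  # placeholder
model = gMLP(2, 1, seq_len=seq_length)
X_test = np.random.randn(10, 2, seq_length).astype(np.float32)

# Raises: AttributeError: 'gMLP' object has no attribute 'get_X_preds'
model.get_X_preds(X_test)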

Expected Behavior: I expected the federated learning process to work correctly with the gMLP model and FlowerClient, including the evaluation step that uses the get_X_preds method.

Actual Behavior: Instead, I encountered the AttributeError mentioned above, indicating that the gMLP model does not have a get_X_preds method.
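
A quick check I ran seems consistent with get_X_preds being defined on the Learner rather than on the model (sketch; my understanding is that importing tsai is what patches the method onto fastai's Learner):

from tsai.all import gMLP  # importing tsai applies the get_X_preds patch to fastai's Learner
from fastai.learner import Learner

model = gMLP(2, 1, seq_len=50)
print(hasattr(model, "get_X_preds"))    # False -> matches the AttributeError
print(hasattr(Learner, "get_X_preds"))  # True, as far as I can tell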

Additional Information:

  - Python version: 3.8
  - PyTorch version: 2.0.1
  - Flower version: 1.5.0
  - tsai version: 0.3.7

Please let me know if there are any specific details or additional information that should be included in the issue. Thank you for your assistance in resolving this problem.

makinno · Oct 03 '23 14:10