GPyTorch integration - variational GP

Open · dnguyen1196 opened this issue on Feb 04, 2021 · 2 comments

Example usage can be found in scripts/gp/gp_playground_gpytorch.ipynb

    # `beta`, `n_epochs`, and `data` (an iterable of (graph, measurement) batches)
    # are defined earlier in the notebook.
    import torch
    import pinot

    net_variational_gp = pinot.Net(
        pinot.representation.Sequential(
            pinot.representation.dgl_legacy.gn(kwargs={"allow_zero_in_degree": True}),
            [64, 'relu', 64, 'relu', 64, 'relu'],
        ),
        output_regressor_class=pinot.regressors.VariationalGP,
        num_inducing_points=150,
        num_data=902,
        beta=beta,
    )

    # Smaller learning rate for the GP head, weight decay on the representation.
    lr = 1e-4
    optimizer = torch.optim.Adam([
        {'params': net_variational_gp.representation.parameters(), 'weight_decay': 1e-4},
        {'params': net_variational_gp.output_regressor.parameters(), 'lr': lr * 0.1},
    ], lr=lr)

    for n in range(n_epochs):
        total_loss = 0.
        for g, y in data:
            optimizer.zero_grad()
            loss = net_variational_gp.loss(g, y.flatten())
            loss.backward()
            optimizer.step()
            total_loss += loss.item()
dnguyen1196 commented on Feb 04, 2021

guys, can we get some movement here?

karalets commented on Apr 01, 2021

Yeah, Duc and I have been discussing progress on it and benchmarking.

miretchin commented on Apr 02, 2021