spacecutter
Under OrdinalModule
Hi, could you help me understand the piece of code below? Why do we subtract adjacent elements to form link_mat and then concatenate? Isn't cutpoints - X on its own sufficient?
sigmoids = torch.sigmoid(cutpoints - X)
link_mat = sigmoids[:, 1:] - sigmoids[:, :-1]
link_mat = torch.cat((
        sigmoids[:, [0]],
        link_mat,
        (1 - sigmoids[:, [-1]])
    ),
    dim=1
)
- Also, when does the AscensionCallback get called: at the start or end of every batch, or every epoch?
The link_mat comes from the middle line of this equation (from my blog post on this): the sigmoids at adjacent cutpoints have to be subtracted from each other.
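For reference, here is that equation, reconstructed to match the code above (sigma is the logistic sigmoid, c_k are the K - 1 learned cutpoints, and f(X) is the scalar output of the network):

\begin{aligned}
P(y = 0 \mid X)   &= \sigma(c_0 - f(X)) \\
P(y = k \mid X)   &= \sigma(c_k - f(X)) - \sigma(c_{k-1} - f(X)), \qquad 0 < k < K - 1 \\
P(y = K-1 \mid X) &= 1 - \sigma(c_{K-2} - f(X))
\end{aligned}

link_mat before the torch.cat is exactly the middle line; the concatenation tacks on the first and last cases, which is why cutpoints - X alone is not enough.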
The AscensionCallback gets called at the end of every batch.
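This is not spacecutter's actual source, just a minimal runnable sketch (with made-up values) of what the forward pass computes and of the ascending-cutpoint constraint the callback enforces after each batch:

import torch

# Illustrative values: three ordered cutpoints -> four classes,
# and a batch of two scalar model outputs (shape (batch, 1)).
cutpoints = torch.tensor([-1.0, 0.0, 1.5])
X = torch.tensor([[0.3], [2.0]])

sigmoids = torch.sigmoid(cutpoints - X)        # P(y <= k) per cutpoint, shape (2, 3)
link_mat = sigmoids[:, 1:] - sigmoids[:, :-1]  # middle classes: P(y = k)
link_mat = torch.cat((
        sigmoids[:, [0]],          # first class: P(y = 0)
        link_mat,
        (1 - sigmoids[:, [-1]])    # last class: P(y = K-1)
    ),
    dim=1
)
print(link_mat.sum(dim=1))  # tensor([1., 1.]) -- each row is a full distribution

# Sketch of the callback's effect at the end of each batch:
# nudge every cutpoint up so the sequence stays strictly ascending.
margin = 1e-3
with torch.no_grad():
    for i in range(1, len(cutpoints)):
        if cutpoints[i] < cutpoints[i - 1] + margin:
            cutpoints[i] = cutpoints[i - 1] + margin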
Not sure why, but my loss keeps oscillating around a few values, so the model doesn't converge.
self.link = LogisticCumulativeLink(6, init_cutpoints='ordered')

def forward(self, x):
    x = self.enc(x)
    x = self.head(x)  # its output dim is 1
    x = self.link(x)
    return x

loss = CumulativeLinkLoss()
Sorry, I can't really help debug -- it could have something to do with the library or be something else. Perhaps you can try training as a regular regression model to see if you can fit that. If you can, then maybe it's something to do with ordinal regression, specifically.
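If it helps, a minimal sketch of that sanity check (enc and head are the names from the snippet above; the class name and everything else are illustrative): drop the link and the ordinal loss, and regress the raw head output directly against the label treated as a number.

import torch.nn as nn

class RegressionDebugModel(nn.Module):
    """Hypothetical debugging variant: same backbone, no cumulative link."""
    def __init__(self, enc, head):
        super().__init__()
        self.enc = enc
        self.head = head  # still emits one scalar per example

    def forward(self, x):
        return self.head(self.enc(x))

loss = nn.MSELoss()  # targets: the ordinal labels cast to float

If this converges while the ordinal setup does not, the problem likely sits in the link/loss/cutpoint configuration rather than in the backbone.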