spacecutter

Under OrdinalModule

Open jaideep11061982 opened this issue 4 years ago • 3 comments

Hi, could you help me understand the piece of code below? Why do we subtract adjacent elements to form link_mat and then concatenate them? Isn't cutpoints - X sufficient on its own?

    sigmoids = torch.sigmoid(cutpoints - X)
    link_mat = sigmoids[:, 1:] - sigmoids[:, :-1]
    link_mat = torch.cat((
            sigmoids[:, [0]],
            link_mat,
            (1 - sigmoids[:, [-1]])
        ),
        dim=1
    )

Also: when does this AscensionCallback get called? At the start of every batch or epoch, or at the end?

jaideep11061982 avatar Jun 16 '20 09:06 jaideep11061982

The link_mat comes from the middle line of this equation (from my blog post on this)

    P(y = 0)     = σ(c_0 − X)
    P(y = k)     = σ(c_k − X) − σ(c_{k−1} − X),   0 < k < K − 1
    P(y = K − 1) = 1 − σ(c_{K−2} − X)

Adjacent cutpoints have to be subtracted from each other.

The AscensionCallback gets called at the end of every batch.
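To see the subtraction concretely, here is a minimal sketch (not the library's exact code) for a single sample with three cutpoints, i.e. four ordinal classes. The cutpoint values and latent score are made up for illustration; the point is that the differences of adjacent cumulative sigmoids, plus the two boundary terms, form a valid probability distribution over the classes.

```python
import torch

# Assumed example values: 3 ordered cutpoints -> 4 ordinal classes,
# and one latent score X produced by the model head.
cutpoints = torch.tensor([-1.0, 0.0, 1.5])
X = torch.tensor([[0.3]])

sigmoids = torch.sigmoid(cutpoints - X)        # cumulative probs P(y <= k)
link_mat = sigmoids[:, 1:] - sigmoids[:, :-1]  # P(y = k) = P(y <= k) - P(y <= k-1)
probs = torch.cat((
    sigmoids[:, [0]],       # P(y = 0) = P(y <= 0)
    link_mat,               # middle classes from adjacent differences
    1 - sigmoids[:, [-1]],  # last class: 1 - P(y <= K-2)
), dim=1)

print(probs)              # per-class probabilities
print(probs.sum(dim=1))   # each row sums to 1
```

Without the subtraction, the raw sigmoids would be cumulative probabilities P(y <= k), not per-class probabilities, and they would not sum to 1.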

EthanRosenthal avatar Jun 16 '20 12:06 EthanRosenthal

Not sure why, but my loss keeps oscillating around a few values, so the model doesn't converge.

    self.link = LogisticCumulativeLink1(6, init_cutpoints='ordered')

    def forward(self, x):
        x = self.enc(x)
        x = self.head(x)  # its output dim is 1
        x = self.link(x)
        return x

    loss = CumulativeLinkLoss()

jaideep11061982 avatar Jun 17 '20 06:06 jaideep11061982

Sorry I can't really help debug -- it could have something to do with the library or be something else. Perhaps you can try training as a regular regression model to see if you can fit that. If you can, then maybe it's something to do with ordinal regression, specifically.
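The regression sanity check could look something like the sketch below. The backbone here is a hypothetical stand-in for your `enc` + `head` stack (the real encoder and data are not shown in the thread); the idea is just to drop the ordinal link and loss, fit the same architecture with MSE on the ordinal labels treated as continuous, and confirm the loss actually goes down.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in for enc + head: any backbone ending in a
# single continuous output. Shapes/sizes here are made up.
backbone = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(backbone.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Fake data: ordinal labels in {0..5} treated as continuous targets.
x = torch.randn(64, 10)
y = torch.randint(0, 6, (64, 1)).float()

losses = []
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(backbone(x), y)
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(losses[0], losses[-1])  # loss should decrease if the backbone can fit
```

If plain regression converges but the ordinal version doesn't, that narrows the problem to the link/loss setup (e.g. cutpoint initialization or learning rate) rather than the encoder or the data.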

EthanRosenthal avatar Jun 18 '20 14:06 EthanRosenthal