
Doubt about the integral-calculating function

Open caojiangxia opened this issue 4 years ago • 2 comments

```python
def compute_integral_unbiased(model, data, time, non_pad_mask, type_mask):
    """ Log-likelihood of non-events, using Monte Carlo integration. """

    num_samples = 100

    diff_time = (time[:, 1:] - time[:, :-1]) * non_pad_mask[:, 1:]
    temp_time = diff_time.unsqueeze(2) * \
                torch.rand([*diff_time.size(), num_samples], device=data.device)
    temp_time /= (time[:, :-1] + 1).unsqueeze(2)

    temp_hid = model.linear(data)[:, 1:, :]
    temp_hid = torch.sum(temp_hid * type_mask[:, 1:, :], dim=2, keepdim=True)

    all_lambda = F.softplus(temp_hid + model.alpha * temp_time, threshold=10)
    all_lambda = torch.sum(all_lambda, dim=2) / num_samples

    unbiased_integral = all_lambda * diff_time
    return unbiased_integral
```
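For context, the estimator in this function approximates the integral of the intensity over each inter-event interval as (interval length) × (mean intensity at uniformly sampled points). A minimal standalone sketch of that Monte Carlo estimator, detached from the model (the `mc_integral` helper is hypothetical, not the repository's code):

```python
import torch

def mc_integral(fn, t_start, t_end, num_samples=100_000):
    """Unbiased Monte Carlo estimate of the integral of fn over [t_start, t_end].

    Samples points uniformly in the interval and scales the mean of fn by the
    interval length, mirroring the (interval length * mean intensity) form of
    compute_integral_unbiased.
    """
    torch.manual_seed(0)  # fixed seed so the sketch is reproducible
    t = t_start + (t_end - t_start) * torch.rand(num_samples)
    return (t_end - t_start) * fn(t).mean()

# Sanity check on a known integral: the integral of 2t over [0, 1] is exactly 1.
estimate = mc_integral(lambda t: 2 * t, 0.0, 1.0)
```

With enough samples the estimate concentrates around the true value, which is the same property the model relies on when averaging over `num_samples` random time points per interval.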

As I understand it, the line

`temp_hid = torch.sum(temp_hid * type_mask[:, 1:, :], dim=2, keepdim=True)`

should be

`temp_hid = torch.sum(temp_hid, dim=2, keepdim=True)`

i.e. the type mask should be removed here.

caojiangxia avatar Nov 17 '20 07:11 caojiangxia

There is no error in this part: the length of `temp_hid = torch.sum(temp_hid, dim=2, keepdim=True)` would not match `diff_time`. Maybe you could check it again.

DavidZhang88 avatar Nov 21 '20 13:11 DavidZhang88

Thanks for your reply! I still have doubts about the integral function T_T. As I understand it, when we minimize the non-event term `LAMBDA = \int \sum_k lambda_k(t) dt`, the type mask should be removed, since the integral is over the total intensity of all event types. The type mask is only needed in the event log-likelihood part (i.e. maximizing `lambda_k` of the observed type at each event time t).

caojiangxia avatar Nov 22 '20 13:11 caojiangxia
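For what it's worth, the distinction being debated above can be sketched with hypothetical tensor shapes: the non-event (integral) term should use the total intensity summed over all event types, while the event term selects only the intensity of the observed type. A minimal sketch with made-up shapes, not the repository's code:

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: batch=2, (seq_len - 1)=3, num_types=4.
B, L, K = 2, 3, 4
torch.manual_seed(0)

# Per-type pre-activation scores, analogous to model.linear(data)[:, 1:, :].
hidden = torch.randn(B, L, K)
# One-hot mask of the observed event type at each position.
type_mask = F.one_hot(torch.randint(K, (B, L)), num_classes=K).float()

# Non-event term: integral of the TOTAL intensity sum_k lambda_k,
# so every type contributes and no mask is applied.
all_types = F.softplus(hidden).sum(dim=2)                  # shape (B, L)

# Event term: only the intensity of the observed type at each event.
observed = (F.softplus(hidden) * type_mask).sum(dim=2)     # shape (B, L)
```

Both reductions produce the same `(B, L)` shape, so the question is not about shapes but about which intensities belong in the integral: since softplus is strictly positive, summing over all types always yields a value at least as large as the single masked type.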