intro_dgm
Why stochastic_euler?
Hi everyone, thank you for the fantastic work. I have a question regarding the sampling process. Since flow matching learns the vector field of an ODE, sampling (integrating that ODE) should be deterministic given the initial sample x_0. Could you clarify why noise is injected in this case? Does this alter the distribution being sampled?
def sample(self, batch_size=64):
    # Euler method
    # sample x_0 from the base distribution first
    x_t = self.sample_base(torch.empty(batch_size, self.D))
    # then go step-by-step to x_1 (data)
    ts = torch.linspace(0., 1., self.T)
    delta_t = ts[1] - ts[0]
    for t in ts[1:]:
        t_embedding = self.time_embedding(torch.Tensor([t]))
        x_t = x_t + self.vnet(x_t + t_embedding) * delta_t
        # Stochastic Euler method
        if self.stochastic_euler:
            x_t = x_t + torch.randn_like(x_t) * delta_t
    x_final = torch.tanh(x_t)
    return x_final
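For reference, below is a minimal standalone sketch of the two update rules being compared: plain Euler integration of a learned vector field (deterministic given x_0) versus the noisy variant from the snippet above. ToyVectorField and euler_sample are illustrative stand-ins I made up for this question, not the repository's API, and the small network is only a placeholder for self.vnet with its time embedding.

import torch
import torch.nn as nn

# Stand-in for the learned vector field v_theta(x, t); illustrative only.
class ToyVectorField(nn.Module):
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        # condition on time by appending it as an extra input column
        t_col = t.expand(x.shape[0], 1)
        return self.net(torch.cat([x, t_col], dim=-1))

@torch.no_grad()
def euler_sample(vnet, batch_size=64, dim=2, T=100, stochastic=False):
    # x_0 from a standard Gaussian base distribution
    x_t = torch.randn(batch_size, dim)
    ts = torch.linspace(0., 1., T)
    delta_t = ts[1] - ts[0]
    for t in ts[1:]:
        # deterministic Euler step: x <- x + v(x, t) * dt
        x_t = x_t + vnet(x_t, t.view(1, 1)) * delta_t
        if stochastic:
            # the variant asked about above: extra Gaussian noise each step,
            # scaled by dt (an Euler-Maruyama SDE step would usually use sqrt(dt))
            x_t = x_t + torch.randn_like(x_t) * delta_t
    return x_t

vnet = ToyVectorField()
x_det = euler_sample(vnet, stochastic=False)  # ODE-style, deterministic given x_0
x_sto = euler_sample(vnet, stochastic=True)   # noisy variant from the snippet
print(x_det.shape, x_sto.shape)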