Dany Yatim
Hello, yes it is: log loss is used for classification problems, MSE for regression ones (as we are reconstructing time series here). However, both are related (proportional): if...
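To make the link concrete, here is a small numpy sketch (illustrative numbers, not from the repo) showing that, under a fixed-variance Gaussian model, the negative log-likelihood is just an affine function of the MSE, so minimizing one minimizes the other:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)                      # "true" series
x_hat = x + rng.normal(scale=0.1, size=100)   # reconstruction

mse = np.mean((x - x_hat) ** 2)

# Gaussian negative log-likelihood with unit variance, averaged over points
nll = np.mean(0.5 * (x - x_hat) ** 2 + 0.5 * np.log(2 * np.pi))

# nll = 0.5 * mse + constant, i.e. the two losses differ only by scale/offset
print(np.isclose(nll, 0.5 * mse + 0.5 * np.log(2 * np.pi)))
```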
Oh I see. Your understanding of the output's sigma is right. Actually, they are (instead of an MSE) directly estimating the log-likelihood of Xt based on the log-probability of a...
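As a hedged sketch of that idea (the values of mu and sigma here are hypothetical, standing in for what the decoder would output): score each point Xt by its log-probability under the predicted Gaussian, so a low score flags an anomaly:

```python
import numpy as np

def gaussian_log_prob(x, mu, sigma):
    # log N(x | mu, sigma^2), written out explicitly
    return -0.5 * np.log(2 * np.pi) - np.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2

x_t = 1.2                # observed point
mu, sigma = 1.0, 0.5     # hypothetical decoder outputs for this timestep
score = gaussian_log_prob(x_t, mu, sigma)
# the lower the score, the less likely the point under the model -> more anomalous
```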
Why not :)
Hi, yes, of course. The purpose of autoencoders is to compress the data, then rebuild it. Along this process, the main properties of the data are kept (in order to...
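A minimal numpy illustration of that compress-then-rebuild idea (using PCA, which is what a linear autoencoder converges to, rather than the repo's LSTM model): data living near a 2-D subspace is compressed to 2 numbers per sample, then rebuilt with almost no loss:

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 samples in 5-D that really live on a 2-D subspace, plus small noise
latent = rng.normal(size=(200, 2))
W = rng.normal(size=(2, 5))
X = latent @ W + 0.01 * rng.normal(size=(200, 5))

# Linear "autoencoder" via SVD: encode to 2-D, then decode back
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                        # "encoder" weights
Z = Xc @ components.T                      # compressed representation
X_rec = Z @ components + X.mean(axis=0)    # reconstruction

err = np.mean((X - X_rec) ** 2)
# the main structure is kept, so the reconstruction error stays tiny
```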
Hi @Mechov, you could query the latent space with the encoder's "reduce" function. Depending on the latent dimension you choose, you must use a dimensionality-reduction algorithm (t-SNE,...
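A rough sketch of that workflow (here `encoder_reduce` is a hypothetical stand-in for the repo's `reduce`, and a numpy PCA projection stands in for t-SNE, which would be the drop-in alternative from scikit-learn):

```python
import numpy as np

# Hypothetical stand-in for the encoder's reduce(): maps windows to a 10-D latent space
def encoder_reduce(X, rng):
    return rng.normal(size=(len(X), 10))

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 30))     # 50 time-series windows of length 30
Z = encoder_reduce(X, rng)        # latent codes, shape (50, 10)

# Reduce the 10-D latent codes to 2-D for plotting (PCA here; t-SNE is used the same way)
Zc = Z - Z.mean(axis=0)
U, S, Vt = np.linalg.svd(Zc, full_matrices=False)
Z2 = Zc @ Vt[:2].T                # shape (50, 2), ready for a scatter plot
```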
Hi, intermediate_dim is actually the dimension of the LSTM layer (of shape (batch_size, timesteps, intermediate_dim)). z is the latent space, and 3 is its dimension. You can freely change these...
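To spell out those dimensions, here is a shapes-only sketch (the names follow this comment; the arrays are just placeholders, not the actual model):

```python
import numpy as np

batch_size, timesteps = 32, 20
input_dim, intermediate_dim, latent_dim = 1, 64, 3   # latent_dim is the "3" above

# Shapes flowing through the encoder:
x = np.zeros((batch_size, timesteps, input_dim))   # input windows
h = np.zeros((batch_size, intermediate_dim))       # last LSTM hidden state
z = np.zeros((batch_size, latent_dim))             # sampled latent vector z
```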
Hi, a good start/use case could be this dataset: https://www.drivendata.org/competitions/52/anomaly-detection-electricity
Hello, thanks for your interest! Unfortunately, I did not have the time to update this repo, and you are right, it has been a while... but your request might...
Hi, thanks for your interest. I used a power consumption dataset from this competition: https://github.com/drivendataorg/power-laws-anomalies. I don't know if it is still available...