Boris Mocialov
> I don't think that is correct. If you look through any of the .ply files in this folder: [https://github.com/mocialov/pc_registration/tree/main/data/pcd](https://github.com/mocialov/pc_registration/tree/main/data/pcd) you will see that the point clouds are in mm...
> I guess millimeters are still metric, just one thousandth of a meter. Either way, it does not change the fact that colmap's scale is not in meters or...
> Without spending much time on it, it looks like you are missing the inversion of the translation, i.e. `-R^T * translation`, where R is the rotation from images.txt. Otherwise it looks like...
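The inversion mentioned above can be sketched as follows. This is a minimal, hypothetical example (the function names are mine, not from the repository), assuming the usual COLMAP convention that images.txt stores a world-to-camera quaternion `(qw, qx, qy, qz)` and translation `t`, so the camera center in world coordinates is `C = -R^T t`:

```python
import numpy as np

def qvec_to_rotmat(qw, qx, qy, qz):
    # Convert a COLMAP quaternion (w, x, y, z) to a 3x3 rotation matrix.
    return np.array([
        [1 - 2*qy**2 - 2*qz**2, 2*qx*qy - 2*qz*qw,     2*qx*qz + 2*qy*qw],
        [2*qx*qy + 2*qz*qw,     1 - 2*qx**2 - 2*qz**2, 2*qy*qz - 2*qx*qw],
        [2*qx*qz - 2*qy*qw,     2*qy*qz + 2*qx*qw,     1 - 2*qx**2 - 2*qy**2],
    ])

def camera_center(qvec, tvec):
    # images.txt gives the world-to-camera pose, so the camera center
    # in world coordinates is C = -R^T @ t (the inversion discussed above).
    R = qvec_to_rotmat(*qvec)
    return -R.T @ np.asarray(tvec, dtype=float)

# Identity rotation: the center is simply the negated translation.
print(camera_center((1.0, 0.0, 0.0, 0.0), (1.0, 2.0, 3.0)))  # → [-1. -2. -3.]
```

Using `t` directly as a position, without this inversion, places cameras at the wrong locations whenever the rotation is non-trivial.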
Every PyTorch implementation of the attention model I have come across uses an encoder-decoder architecture and adds the attention mechanism to the decoder. Is it possible to add attention to your implementation?
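For reference, "adding attention to the decoder" usually means scoring each encoder output against the current decoder hidden state and feeding the resulting weighted sum (the context vector) into the decoder step. A minimal numpy sketch of Bahdanau-style additive attention, with illustrative names and shapes of my own choosing:

```python
import numpy as np

def additive_attention(dec_hidden, enc_outputs, Wq, Wk, v):
    # Bahdanau-style additive attention:
    #   dec_hidden: (H,) current decoder state
    #   enc_outputs: (T, H) all encoder states
    #   Wq, Wk: (H, A) projections into the attention space; v: (A,)
    scores = np.tanh(dec_hidden @ Wq + enc_outputs @ Wk) @ v   # (T,) one score per step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                   # softmax over encoder steps
    context = weights @ enc_outputs                            # (H,) weighted sum of states
    return context, weights

rng = np.random.default_rng(0)
H, A, T = 8, 4, 5
ctx, w = additive_attention(rng.normal(size=H), rng.normal(size=(T, H)),
                            rng.normal(size=(H, A)), rng.normal(size=(H, A)),
                            rng.normal(size=A))
print(ctx.shape, w.sum())  # context has shape (8,), weights sum to 1
```

In a PyTorch decoder, the same computation would run at every decoding step, with the context vector concatenated to the decoder input or hidden state.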
@Ziyueyork I have tried `train_finetune.py` with `slm: model: 'openai/whisper-large-v3'`, but I get `Whisper expects the mel input features to be of length 3000, but found 20000. Make sure to pad...
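The error above comes from Whisper's fixed 30-second input window: its feature extractor produces mel spectrograms of exactly 3000 frames, while the input here is 20000 frames long. One possible workaround (a sketch of my own, not part of the training script) is to split the mel features into fixed-length windows before they reach the model, zero-padding the last one:

```python
import numpy as np

WHISPER_FRAMES = 3000  # Whisper's fixed 30-second input window

def chunk_mel(mel, n_frames=WHISPER_FRAMES):
    # Split an (n_mels, T) mel spectrogram into windows of exactly
    # n_frames, zero-padding the final window to full length.
    n_mels, total = mel.shape
    chunks = []
    for start in range(0, total, n_frames):
        chunk = mel[:, start:start + n_frames]
        if chunk.shape[1] < n_frames:
            pad = np.zeros((n_mels, n_frames - chunk.shape[1]), dtype=mel.dtype)
            chunk = np.concatenate([chunk, pad], axis=1)
        chunks.append(chunk)
    return chunks

mel = np.random.randn(128, 20000).astype(np.float32)  # the shape from the error
chunks = chunk_mel(mel)
print(len(chunks), chunks[0].shape)  # 7 windows of (128, 3000)
```

Whether chunking is appropriate for this fine-tuning setup depends on how the script pairs audio with targets; it only resolves the shape mismatch, not any alignment between chunks and labels.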