RoyAmoyal
@ethnhe And do you have any suggestions on how to create a new dataset for multiple objects together, like the YCB dataset?
What is the solution? I encounter the same problem when using the following code:

```python
import os

from mmagic.apis import MMagicInferencer
from mmengine import mkdir_or_exist

# Create a MMagicInferencer instance and...
```
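For context, here is a minimal sketch of how `MMagicInferencer` is typically driven, modelled on the usage shown in the MMagic README; the model name, prompt, and output path are placeholders rather than the ones from the truncated snippet above:

```python
from mmengine import mkdir_or_exist

from mmagic.apis import MMagicInferencer

# Placeholder output location (not taken from the original comment).
result_out_dir = 'output/sd_demo.png'
mkdir_or_exist('output')

# Build an inferencer for a registered MMagic model; 'stable_diffusion'
# is just an example model name from the MMagic README.
sd_inferencer = MMagicInferencer(model_name='stable_diffusion')

# For text-to-image models the prompt is passed via the `text` argument.
sd_inferencer.infer(text='A photo of a red sports car',
                    result_out_dir=result_out_dir)
```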
> > Is there any way to extract the exact distance (in meters) of any pixel in the image?
>
> Metric depth models (like [ZoeDepth](https://github.com/isl-org/ZoeDepth)) attempt to do this. With...
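For a rough sense of what that looks like in practice, here is a minimal sketch following the usage shown in the ZoeDepth README; the image path and the `ZoeD_N` model variant are placeholders, and the predicted per-pixel depths are estimates rather than exact measurements:

```python
import torch
from PIL import Image

# Load a pretrained ZoeDepth model via torch.hub (ZoeD_N is one of the
# released variants; ZoeD_K and ZoeD_NK also exist).
zoe = torch.hub.load("isl-org/ZoeDepth", "ZoeD_N", pretrained=True)
zoe = zoe.to("cuda" if torch.cuda.is_available() else "cpu")

# "example.jpg" is a placeholder image path.
image = Image.open("example.jpg").convert("RGB")

# infer_pil returns a NumPy array of predicted metric depth in meters,
# one value per pixel.
depth = zoe.infer_pil(image)
print(depth.shape, depth[100, 200])  # estimated distance (m) at row 100, col 200
```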
@szymonkulpinski @laxnpander @SamuelDudley Hey, I am trying to work on a similar project. If I understood you correctly, you use the Monocular version of ORB-SLAM2 and initialize the scale of...
> @RoyAmoyal I don't quite understand what you mean. LiDAR as in 1D-LiDAR? Height information is not sufficient to use RGBD; for RGBD you'd need dense scene information. There are LiDARs...
> You can see all the real-time results in WandB by just adding `--vis viewer+wandb` to `ns-train nerfacto` (before the `--data` and `--output-dir` flags). That's the way to visualize some of...
> > Check [xaldyz/dataflow-orbslam#2](https://github.com/xaldyz/dataflow-orbslam/pull/2)
> > Works on jetson nx/opencv4.
>
> This doesn't appear to have CUDA support. Here's a fork of ORB_SLAM2_CUDA with OpenCV4:
>
> https://github.com/dusty-nv/ORB_SLAM2_CUDA...
> Hi @ArtlyStyles, when you start to see the environment, the first frame alone cannot tell you anything meaningful. You miss the scale, you miss the relative motion, and you cannot estimate...
> you need another sensor to evaluate the scale. I have successfully initialized absolutely scaled orbslam maps using my robot's odometers.
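As a rough illustration of the odometry-based scale initialization described above (not the commenter's actual code, which was never shared), one common approach is to estimate a single scale factor by comparing distances travelled in the unscaled monocular trajectory against the metric odometry over the same interval; the function and variable names below are hypothetical:

```python
import numpy as np

def estimate_scale(slam_positions, odom_positions):
    """Estimate the metric scale of a monocular SLAM trajectory.

    Hypothetical helper: both inputs are (N, 3) arrays of positions sampled
    at the same timestamps, slam_positions in the SLAM frame (arbitrary
    scale) and odom_positions in meters from the robot's odometry.
    """
    slam_positions = np.asarray(slam_positions, dtype=float)
    odom_positions = np.asarray(odom_positions, dtype=float)

    # Per-step travelled distances in each trajectory.
    slam_steps = np.linalg.norm(np.diff(slam_positions, axis=0), axis=1)
    odom_steps = np.linalg.norm(np.diff(odom_positions, axis=0), axis=1)

    # Ignore near-stationary steps, where the ratio is ill-conditioned.
    valid = slam_steps > 1e-6
    if not np.any(valid):
        raise ValueError("trajectory has no usable motion")

    # Median ratio is a robust estimate of meters per SLAM unit.
    return float(np.median(odom_steps[valid] / slam_steps[valid]))

# Usage sketch: multiply SLAM poses / map points by this factor to get meters.
# scale = estimate_scale(slam_xyz, odom_xyz)
# metric_map_points = scale * slam_map_points
```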
Can you please share the code? I am trying to do something similar and I want to check that too. Thanks!