
Android app with external depth camera

Open SaurabhMishra2003 opened this issue 5 months ago • 5 comments

Hi community,

Is there any way I can create an Android app using an external depth camera like the Intel RealSense? I have managed to build and run this rtabmap repo for Android, but how can I use an external camera with it?

Your guidance would be of great help. @matlabbe

Thank you. Saurabh

SaurabhMishra2003 avatar Jul 27 '25 10:07 SaurabhMishra2003

Cross-ref with

  • https://github.com/introlab/rtabmap/issues/463
  • https://github.com/introlab/rtabmap/issues/575
  • https://github.com/introlab/rtabmap/issues/1426

matlabbe avatar Aug 03 '25 20:08 matlabbe

@matlabbe The threads shared above are very open-ended, and there is no guide on how to run the Android app with an Intel RealSense. Could you please point out the steps for using an Intel RealSense depth camera with the Android app here to run the RTAB-Map pipeline? I would really appreciate your help.

SaurabhMishra2003 avatar Sep 19 '25 12:09 SaurabhMishra2003

These posts are open-ended because we never officially integrated that feature into the Android app. I never made it work myself, actually; the furthest I got was verifying that my RealSense did indeed work on my phone with the RealSense demo app. I didn't try integrating the RealSense Android SDK into the rtabmap code.

matlabbe avatar Sep 21 '25 18:09 matlabbe

@matlabbe Just in theory, what would you suggest my steps should be if I want to use this pipeline with an Intel RealSense? How should I pass the frames (RGB and depth) to the current pipeline to get the reconstruction done?

SaurabhMishra2003 avatar Oct 06 '25 14:10 SaurabhMishra2003

Well, in theory, if the RealSense camera can be used from C++ code exactly like the CameraRealsense2 driver, one could change this line: https://github.com/introlab/rtabmap/blob/99471819d5a892660e8f1e8cd8ce8d32f62e0a33/app/android/jni/RTABMapApp.cpp#L1054 to:

- sensorCaptureThread_ = new rtabmap::SensorCaptureThread(camera_); 
+ sensorCaptureThread_ = new rtabmap::SensorCaptureThread(new CameraRealsense2()); 
+ odomThread_ = new OdometryThread(Odometry::create());

+ UEventsManager::addHandler(odomThread_);
+ UEventsManager::createPipe(sensorCaptureThread_, odomThread_, "SensorEvent");
+ UEventsManager::createPipe(sensorCaptureThread_, this, "SensorEvent");
+ odomThread_->start();

If ARCore can be used at the same time, maybe this could work to use odometry from ARCore instead of starting an OdometryThread:

sensorCaptureThread_ = new rtabmap::SensorCaptureThread(new CameraRealsense2(), camera_, arcoreToRealsenseExtrinsics); 

Note that more things would break and need to be updated from there. In particular, the first-person rendering view would obviously break (no camera rendering texture), and the odometry pose sent for rendering may need to be converted to the OpenGL coordinate frame.
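Putting the pieces above together, here is a rough sketch of what the modified startup block in RTABMapApp.cpp could look like. This is untested on Android; whether librealsense2 builds for the target ABI, whether CameraRealsense2 is compiled into the Android app, and the exact header paths are all assumptions:

```cpp
// Hypothetical sketch only -- assumes CameraRealsense2 is available in the
// Android build of rtabmap; header paths may differ between versions.
#include <rtabmap/core/camera/CameraRealsense2.h>
#include <rtabmap/core/SensorCaptureThread.h>
#include <rtabmap/core/OdometryThread.h>
#include <rtabmap/core/Odometry.h>
#include <rtabmap/utilite/UEventsManager.h>

// Inside RTABMapApp, replacing the line that creates sensorCaptureThread_:
// capture frames from the RealSense instead of the phone's ARCore camera.
sensorCaptureThread_ =
    new rtabmap::SensorCaptureThread(new rtabmap::CameraRealsense2());

// Without ARCore poses, odometry must be computed from the RealSense frames,
// so start an OdometryThread and pipe sensor events through it.
odomThread_ = new rtabmap::OdometryThread(rtabmap::Odometry::create());
UEventsManager::addHandler(odomThread_);
UEventsManager::createPipe(sensorCaptureThread_, odomThread_, "SensorEvent");
UEventsManager::createPipe(sensorCaptureThread_, this, "SensorEvent");
odomThread_->start();
```

The event pipes mirror how the desktop rtabmap examples wire a camera thread to an odometry thread: the capture thread posts SensorEvents, the odometry thread consumes them and posts odometry events downstream.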

matlabbe avatar Oct 11 '25 16:10 matlabbe