OpenNI2
Listener attached with VideoStream::addNewFrameListener() captures both streams
When I execute the code below, I receive only depth frames in my callback. If I move the addNewFrameListener() call up to the line marked // MOVE HERE, I get both color and depth frames.
openni::Status status = openni::OpenNI::initialize();
if (status != openni::STATUS_OK) throw NIException(status, "OpenNI::initialize()");
status = Device.open(openni::ANY_DEVICE);
if (status != openni::STATUS_OK) throw NIException(status, "Device::open(openni::ANY_DEVICE)");
if (Device.getSensorInfo(openni::SENSOR_COLOR) != nullptr)
{
    status = Video.create(Device, openni::SENSOR_COLOR);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::create(Device, openni::SENSOR_COLOR)");
    status = Video.start();
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::start()");
}
// MOVE HERE
if (Device.getSensorInfo(openni::SENSOR_DEPTH) != nullptr)
{
    status = Video.create(Device, openni::SENSOR_DEPTH);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::create(Device, openni::SENSOR_DEPTH)");
    status = Video.start();
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::start()");
}
status = Video.addNewFrameListener(this);
if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::addNewFrameListener(this)");
Right now you are using the same Video object for both color and depth.
I think you should use separate VideoStream objects for color and depth.
Good point. Now I have this code:
openni::Status status = openni::OpenNI::initialize();
if (status != openni::STATUS_OK) throw NIException(status, "OpenNI::initialize()");
status = Device.open(openni::ANY_DEVICE);
if (status != openni::STATUS_OK) throw NIException(status, "Device::open(openni::ANY_DEVICE)");
if (Device.getSensorInfo(openni::SENSOR_COLOR) != nullptr)
{
    status = Color.create(Device, openni::SENSOR_COLOR);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::create(Device, openni::SENSOR_COLOR)");
    status = Color.start();
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::start()");
    status = Color.addNewFrameListener(this);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::addNewFrameListener(this)");
}
if (Device.getSensorInfo(openni::SENSOR_DEPTH) != nullptr)
{
    status = Depth.create(Device, openni::SENSOR_DEPTH);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::create(Device, openni::SENSOR_DEPTH)");
    status = Depth.start();
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::start()");
    status = Depth.addNewFrameListener(this);
    if (status != openni::STATUS_OK) throw NIException(status, "VideoStream::addNewFrameListener(this)");
}
The second call to addNewFrameListener() returns 1. If I comment it out, I still get both color and depth frames. Could it be that frame listeners are shared across streams?
It looks like you are trying to use a single listener (this?) for both depth and color.
I think you should use a separate listener for each VideoStream.
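To make that concrete, here is a minimal sketch of what I mean, assuming Color and Depth are the two VideoStream objects from your snippet (the StreamListener class and its name parameter are only illustrative):

#include <OpenNI.h>
#include <cstdio>

// One listener instance per stream; each instance only sees frames
// from the stream it was registered on.
class StreamListener : public openni::VideoStream::NewFrameListener
{
public:
    explicit StreamListener(const char* name) : Name(name) {}

    void onNewFrame(openni::VideoStream& stream) override
    {
        openni::VideoFrameRef frame;
        if (stream.readFrame(&frame) == openni::STATUS_OK)
            std::printf("%s frame %d (%dx%d)\n", Name,
                        frame.getFrameIndex(), frame.getWidth(), frame.getHeight());
    }

private:
    const char* Name;
};

// Registration, with Color and Depth created and started as in your code:
StreamListener ColorListener("color");
StreamListener DepthListener("depth");
// ...
Color.addNewFrameListener(&ColorListener);
Depth.addNewFrameListener(&DepthListener);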
I guess that makes sense. But I still wonder why I get depth frames even if I don't call Depth.addNewFrameListener().
VideoStream::addNewFrameListener() also segfaults when you pass it nullptr.
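A minimal reproduction sketch, assuming Depth is any created and started VideoStream:

// Reportedly crashes inside OpenNI2 instead of returning an error status:
Depth.addNewFrameListener(nullptr);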