
Add input_tensor input type

kSkip opened this issue 1 year ago • 3 comments

These changes add a new input type that enables feeding networks with batches of data that already reside in device memory.

For context, I am developing a robot simulator for batch reinforcement learning. A deep Q-network receives inputs generated from an OpenGL pipeline that renders a camera view representation of the world to a texture. This texture can be read using CUDA graphics interoperability. The experiences then accumulate in device memory. To avoid the round trip from host to device, I added this input layer.

I think the functionality could be useful beyond my application.

kSkip avatar Apr 26 '24 16:04 kSkip

Thanks for the PR.

Why not just call .forward() with the tensor you already have on hand though? Instead of calling operator() you can call forward() and give it the device tensor directly.

davisking avatar Apr 27 '24 13:04 davisking

As I understand it, .forward() only supports feeding a single, already assembled tensor. I could concatenate the tensors from replay memory beforehand, but isn't that the responsibility of the network's input layer?

There is a larger issue with training, though. The dnn_trainer expects the input layer to implement to_tensor() for its input_type. Therefore, without this input class, I cannot train the model as below

// "replay" is a container of tensors that were read from device memory
auto batch = sample(replay.begin(), replay.end(), batch_size, rng);
trainer.train_one_step(batch.begin(), batch.end(), target_values.begin());

I would have to develop my own trainer.

For inference, the same situation applies. I will be streaming video frames from a camera connected to a Jetson system. nvarguscamerasrc is used to capture the frames, and they reside in device memory.

Let me know if I am missing something, and thanks!

kSkip avatar Apr 28 '24 15:04 kSkip

Ah, didn't realize you wanted to train with it. Yeah this is cool, makes sense :D

Can you add a short unit test to check that it works and then I'll merge it?

davisking avatar May 04 '24 14:05 davisking

Awesome. I added a unit test. Let me know what you think.

kSkip avatar May 06 '24 17:05 kSkip

Nice, thanks for the PR :)

davisking avatar May 12 '24 00:05 davisking

Of course. Btw, dlib is great!

kSkip avatar May 12 '24 16:05 kSkip