Use ByteBuffers for all image types
TensorFlow can use ByteBuffers directly when creating tensors, while data from other buffer types is copied.
From the documentation of the Tensor class:
create (long[] shape, FloatBuffer data):
Create a Float Tensor with data from the given buffer.
Creates a Tensor with the given shape by copying elements from the buffer (starting from its current position) into the tensor. For example, if shape = {2,3} (which represents a 2x3 matrix) then the buffer must have 6 elements remaining, which will be consumed by this method.
create (Class<T> type, long[] shape, ByteBuffer data):
Create a Tensor of any type with data from the given buffer.
Creates a Tensor with the provided shape of any type where the tensor's data has been encoded into data as per the specification of the TensorFlow C API.
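To illustrate the second overload, here is a hedged sketch of how float data could be encoded into a direct, native-order ByteBuffer, which is the raw layout the C API works with for primitive types. Only java.nio is used; the Tensor.create call itself is left as a comment, since it requires the TensorFlow Java library, and the class name is mine:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class ByteBufferEncoding {
    public static void main(String[] args) {
        // A 2x3 float "image" flattened to 6 values.
        float[] pixels = {1f, 2f, 3f, 4f, 5f, 6f};

        // Direct buffer in native byte order: off-heap memory with a
        // stable address that native code can read without copying.
        ByteBuffer data = ByteBuffer
            .allocateDirect(pixels.length * Float.BYTES)
            .order(ByteOrder.nativeOrder());
        FloatBuffer view = data.asFloatBuffer();
        view.put(pixels);

        // Hypothetical call (needs the TensorFlow Java library on the
        // classpath; shape {2,3} matches the 6 floats written above):
        // Tensor<Float> t = Tensor.create(Float.class, new long[]{2, 3}, data);

        System.out.println(view.get(0) + " " + view.get(5)); // 1.0 6.0
    }
}
```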
Has this issue been resolved? I noticed the following lines of code in Tensors.java, which to my Java-uneducated eyes appear to fix the issue.
public static Tensor<Float> tensorFloat(
    final RandomAccessibleInterval<FloatType> image)
{
    final float[] value = floatArray(image);
    FloatBuffer buffer = FloatBuffer.wrap(value);
    return Tensor.create(shape(image), buffer);
}
No, the issue hasn't been solved. The code you showed creates a java.nio.FloatBuffer and calls the first method quoted in the issue description.
That method copies the content of the buffer into the tensor (according to the documentation).
If the image data were written into a java.nio.ByteBuffer instead, the second method could be called. Its documentation does not state that the content is copied, so I think it would use the data directly from wherever it already sits in memory.
I am not entirely sure about the implementations of these methods; my assumptions are based on the documentation, which is not very clear on this point. I would be happy if someone could look into the implementation and confirm this.
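A small java.nio sketch of why the distinction matters (class name is mine; this only demonstrates standard buffer semantics, not TensorFlow's actual internals): FloatBuffer.wrap shares memory with the array, but that memory is on the Java heap, so native code cannot safely hold a pointer to it and a copy is unavoidable. A direct ByteBuffer lives off-heap at a stable address, which is what would allow zero-copy use.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferSharing {
    public static void main(String[] args) {
        // wrap() does not copy: the buffer is a view over the array...
        float[] values = {1f, 2f, 3f};
        FloatBuffer wrapped = FloatBuffer.wrap(values);
        values[0] = 42f;
        System.out.println(wrapped.get(0)); // 42.0 -- memory is shared

        // ...but it is heap-backed, not direct, so the garbage collector
        // may move it; native code must copy its contents to use them.
        System.out.println(wrapped.isDirect()); // false

        // A direct buffer is allocated off-heap at a fixed native address,
        // which is what could let native code read the data in place.
        ByteBuffer direct = ByteBuffer.allocateDirect(3 * Float.BYTES)
            .order(ByteOrder.nativeOrder());
        System.out.println(direct.isDirect()); // true
    }
}
```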