TensorFlow.NET
[Question]: load image from memory (byte[] decoded with jpeg)
Description
Hello
I'm trying to load an image from memory (an array of bytes) and decode it as JPEG, so I created a tensor from the byte[] and passed it to decode_jpeg, but it returns a type error:
Tensorflow.InvalidArgumentError : 'Input 'contents' passed uint8 expected string while building NodeDef 'DecodeJpeg' using Op<name=DecodeJpeg; signature=contents:string -> image:uint8; attr=channels:int,default=0; attr=ratio:int,default=1; attr=fancy_upscaling:bool,default=true; attr=try_recover_truncated:bool,default=false; attr=acceptable_fraction:float,default=1; attr=dct_method:string,default="">'
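For reference, the kind of call that produces this error would look roughly like the sketch below. This is a hypothetical reconstruction, not the original code: the file path is a placeholder, and it assumes that tf.constant turns a byte[] into a uint8 tensor, which matches the dtype named in the message.

```csharp
// Hypothetical sketch (assumes: using System.IO; and using static Tensorflow.Binding;).
byte[] data = File.ReadAllBytes("photo.jpg");  // placeholder path
var t = tf.constant(data);                     // assumption: materializes as a 1-D uint8 tensor
var img = tf.image.decode_jpeg(t);             // DecodeJpeg expects a scalar string tensor,
                                               // so this raises the InvalidArgumentError above
```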
Thanks
Alternatives
No response
Hi, the following code is an example of loading a JPEG image (from a file), and it works well in the unit tests. Could you please give a minimal example that reproduces your error? From the error message I would guess that you passed a tensor with uint8 as its dtype to tf.image.decode_jpeg, but I'm not sure.
```csharp
var contents = tf.io.read_file(imgPath);
var img = tf.image.decode_image(contents);
```
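One quick way to check that guess is to print the dtype of the tensor handed to the decode op. tf.io.read_file yields a scalar string tensor, which is what the DecodeJpeg op expects for its contents input. A minimal sketch, assuming imgPath points at a JPEG on disk and the usual `using static Tensorflow.Binding;` is in place:

```csharp
var contents = tf.io.read_file(imgPath);    // scalar string tensor holding the encoded bytes
Console.WriteLine(contents.dtype);          // expected: TF_STRING, not TF_UINT8
var img = tf.image.decode_image(contents);  // uint8 image tensor [height, width, channels]
Console.WriteLine(img.dtype);
```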
Hi, I modified the ReadTensorFromImageFile function so that I could load the image from a byte[]: I created a tensor and passed it to decode_jpeg. I no longer get the previous error, but it generates another error. The code is as follows:
```csharp
private NDArray ReadTensorFromMemory(byte[] data,
    int input_height = 299,
    int input_width = 299,
    int input_mean = 0,
    int input_std = 255)
{
    var graph = tf.Graph().as_default();

    string dataa = Encoding.UTF8.GetString(data, 0, data.Length);
    var dataT = tf.convert_to_tensor(dataa);
    var image_reader = tf.image.decode_jpeg(dataT, channels: 3);
    var caster = tf.cast(image_reader, tf.float32);
    var dims_expander = tf.expand_dims(caster, 0);
    var resize = tf.constant(new int[] { input_height, input_width });
    var bilinear = tf.image.resize_bilinear(dims_expander, resize);
    var sub = tf.subtract(bilinear, new float[] { input_mean });
    var normalized = tf.divide(sub, new float[] { input_std });

    using (var sess = tf.Session(graph))
        return sess.run(normalized);
}
```
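One thing worth noting about the snippet above: Encoding.UTF8.GetString is lossy for arbitrary binary data, because byte sequences that are not valid UTF-8 are replaced with U+FFFD. The small standalone sketch below (plain .NET, no TensorFlow) shows that the round trip does not preserve the bytes, so what reaches decode_jpeg may no longer be a valid JPEG stream.

```csharp
using System;
using System.Text;

// Typical first bytes of a JPEG stream (SOI marker followed by an APP0 marker).
byte[] original = { 0xFF, 0xD8, 0xFF, 0xE0 };
string asText = Encoding.UTF8.GetString(original);      // 0xFF is never valid UTF-8, so it is replaced
byte[] roundTripped = Encoding.UTF8.GetBytes(asText);
Console.WriteLine($"{original.Length} bytes in, {roundTripped.Length} bytes out"); // the counts differ
```

If the encoded bytes need to reach DecodeJpeg unchanged, they have to end up in a scalar tf.string tensor without going through a .NET text encoding; how best to construct such a tensor depends on the TensorFlow.NET version.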
Hi, it seems that TensorFlow cannot recognize the image format. Does the data in your code contain the JPEG header information?
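One cheap way to answer that question, before the data ever reaches TensorFlow, is to look at the first bytes of the array (here `data` is the byte[] passed into ReadTensorFromMemory): a valid JPEG stream starts with the SOI marker FF D8 and normally ends with the EOI marker FF D9. A small standalone check:

```csharp
bool looksLikeJpeg = data != null && data.Length > 2
                     && data[0] == 0xFF && data[1] == 0xD8;
Console.WriteLine($"starts with JPEG SOI marker (FF D8): {looksLikeJpeg}");
```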
Hi, I have a JPEG image. I used a function to convert it to a byte[] and then passed it to decode_jpeg for processing, but I always get this error. The conversion function is the following:
```csharp
public byte[] GetBytes(Image image)
{
    MemoryStream stream = new MemoryStream();
    image.Save(stream, ImageFormat.Jpeg);
    return stream.ToArray();
}
```
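A brief usage sketch tying this helper to the ReadTensorFromMemory function posted earlier; the file path is a placeholder, and System.Drawing (Image, ImageFormat) is assumed to be available on the target platform:

```csharp
using System.Drawing;

byte[] jpegBytes;
using (var image = Image.FromFile("photo.jpg"))  // placeholder path
{
    jpegBytes = GetBytes(image);                 // re-encodes the picture as an in-memory JPEG
}
var tensor = ReadTensorFromMemory(jpegBytes);    // the function shown earlier in this thread
```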
Thanks
I'm not sure whether the byte data is incorrect or whether a certain implementation in TF.NET is incorrect. Did you try this in TensorFlow for Python?
Hi, I want to load a byte[] too. Is there any progress on this issue? Thanks.