
can't run minimizer without feeding inputs even though I have output and label vectors.

Corallus-Caninus opened this issue 2 years ago · 2 comments

I am trying to backprop/minimize my network without feeding the input vector, since I already have the output and label vectors.

This is the relevant code I'm trying to get working:

...
        //backprop
        let mut run_args = SessionRunArgs::new();
        run_args.add_target(&self.minimize);

        let error_squared_fetch = run_args.request_fetch(&self.Error, 0);
        // set output feed manually
        //TODO: runtime says we need to feed input
        run_args.add_feed(&self.Output_op, 0, &output);
        run_args.add_feed(&self.Label, 0, &labels);
        self.session.run(&mut run_args)?;

        let res: Tensor<f32> = run_args.fetch(error_squared_fetch)?;
...

where Output_op and Label are my output and label operations respectively, and output and labels are my output and label tensors. self.minimize is either a GradientDescent or an Adadelta optimizer. The Error operation is defined as a function of output and label exclusively. The network is very similar to the xor example in this repository and comes from my NormNet repo (it's very messy and early-stage, so beware).

Based on my understanding of backprop, this should be possible. Is this feature missing, or did I make a mistake? Please let me know how I can clarify this further.
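As a sketch of why the minimize target may still require the input feed: for even a single linear unit, the chain rule puts the input itself inside the weight gradient, so knowing only the output and the label is not enough to compute the update. The following plain-Rust toy (no TensorFlow, hypothetical helper names) illustrates this with y = w * x and squared error E = (y - label)^2:

```rust
// Toy illustration: dE/dw for a single linear unit y = w * x with
// squared error E = (y - label)^2. By the chain rule,
//   dE/dw = 2 * (y - label) * x
// so the gradient contains the input x, not just y and the label.
fn grad_w(x: f32, w: f32, label: f32) -> f32 {
    let y = w * x; // forward pass
    2.0 * (y - label) * x // weight gradient depends on x
}

fn main() {
    // Two runs with the SAME output (y = 6.0) and the SAME label,
    // but different inputs, yield different gradients:
    let g1 = grad_w(2.0, 3.0, 0.0); // 2 * 6.0 * 2.0 = 24.0
    let g2 = grad_w(3.0, 2.0, 0.0); // 2 * 6.0 * 3.0 = 36.0
    println!("{} {}", g1, g2);
}
```

If this reasoning carries over to the graph, the gradient ops behind self.minimize read the forward-pass activations (which depend on the input placeholder), which would explain why the runtime insists on a value for 'input' even when Output_op and Label are fed.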

stdout log from runtime:

thread 'tests::test_evaluate' panicked at 'called `Result::unwrap()` on an `Err` value: {inner:0x2147c7c9480, InvalidArgument: You must feed a value for placeholder tensor 'input' with dtype float and shape [1,2] [[{{node input}}]]}', src\lib.rs:1197:94
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test tests::test_evaluate ... FAILED

Corallus-Caninus avatar Mar 02 '22 06:03 Corallus-Caninus

I am unable to "backfeed" the Output_op. It seems to be overridden by the input placeholder, since the network forward-propagates anyway after I feed Output_op. I believe I have to rework my graph, but this seems like it should be possible. Please advise on the order of operations in the graph (pun unintended). From what I have gathered, feeds are set in the graph but overridden if mutated during session.run() calls.

Corallus-Caninus avatar Mar 07 '22 23:03 Corallus-Caninus

This may just be me not knowing how to set up the TensorFlow graph, and not something specific to tensorflow/rust.

Corallus-Caninus avatar Apr 26 '22 01:04 Corallus-Caninus