devicehive-video-analysis
Using this sample with GPU support
Hi, and thanks for this repo.
I got it to work out of the box with the TensorFlow CPU 1.4.0 pinned in requirements.txt, but it does not work with any other CPU version of TensorFlow, nor with 1.4.0-gpu or any other GPU version. I tried hard over the last couple of days (even reinstalled Ubuntu more than 10 times, breaking things and starting over), but so far only TF 1.4.0 CPU works. Other examples I use run flawlessly on my GPU, a GTX 1080.
Any idea why, and how to fix this excellent example code to work on the GPU, ideally with a newer version of TensorFlow?
10X --{-@
This is what I get when I use TF 1.4.0-gpu:

```
$ python eval.py
Traceback (most recent call last):
  File "eval.py", line 20, in <module>
Failed to load the native TensorFlow runtime.
```
@sa7ina This error is because you don't have the CUDA 8.0 libraries installed.
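For anyone unsure which CUDA runtime their TensorFlow wheel can actually see (TF 1.4 GPU wheels link against CUDA 8.0, TF 1.5+ against CUDA 9.0), here is a small sketch of a sanity check. It assumes Linux shared-library naming (`libcudart.so.X.Y`); the version list is illustrative:

```python
import ctypes

def find_cuda_runtime(versions=("8.0", "9.0", "9.1")):
    """Try to load the CUDA runtime shared library for each version.

    Returns the first version string whose libcudart loads, or None if
    no CUDA runtime is visible to the dynamic linker.
    """
    for version in versions:
        try:
            ctypes.CDLL("libcudart.so.%s" % version)
            return version
        except OSError:
            continue
    return None

print(find_cuda_runtime())
```

If this prints `None` while a GPU build of TensorFlow is installed, you will hit the "Failed to load the native TensorFlow runtime" error above.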
@igor-panteleev I tried using tensorflow-gpu 1.5 since it uses CUDA 9.0 libraries but I get the following error. Can you take a look?
```
(va3) anand@Alienware15:~/objd/devicehive-video-analysis$ python eval.py
Traceback (most recent call last):
  File "/home/anand/.virtualenvs/va3/lib/python3.6/site-packages/absl/flags/_flag.py", line 166, in _parse
    return self.parser.parse(argument)
  File "/home/anand/.virtualenvs/va3/lib/python3.6/site-packages/absl/flags/_argument_parser.py", line 114, in parse
    type(argument)))
TypeError: flag value must be a string, found "<class 'int'>"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "eval.py", line 123, in <module>
```
@saket424: I had the same problem. I fixed it by changing `tf.flags.DEFINE_string('video', 0, 'Path to the video file.')` (where the 0 is an int) to `tf.flags.DEFINE_string('video', '0', 'Path to the video file.')` (where the '0' is a string).
And for webcam mode I changed
`cam = cv2.VideoCapture(Video)`
to
`cam = cv2.VideoCapture(0)`
It runs for me under Windows 10 (64-bit) with TensorFlow 1.8 GPU and Python 3.5.5 at a constant 8 FPS. I hope it helps you.
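If you want both cases to work without hardcoding the webcam, one option is to convert the flag value at use time, since absl/TF string flags must default to a string but `cv2.VideoCapture` needs an int for webcam indices. A sketch (`video_source` is a hypothetical helper, not part of the repo):

```python
def video_source(flag_value):
    """Interpret the --video flag value: a purely numeric string like
    '0' is treated as a webcam index (int), anything else as a path."""
    return int(flag_value) if flag_value.isdigit() else flag_value

# In eval.py this would then look roughly like (names assumed):
#   tf.flags.DEFINE_string('video', '0', 'Path to the video file.')
#   cam = cv2.VideoCapture(video_source(FLAGS.video))

print(video_source('0'))         # 0 (int -> default webcam)
print(video_source('clip.mp4'))  # clip.mp4 (str -> file path)
```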
@MartinWeniger, thanks for the tips. With those two changes to eval.py that you outlined, I got it working with tensorflow-gpu 1.8 and I am getting 15 FPS on an NVIDIA GeForce GTX 1080 Ti.