Deep-Learning-Experiments
How long does it take to run 10K iterations?
How long does it take to run 10K iterations on a 2.7 GHz Intel Core i5 MacBook?
I was only at 150 iterations after several hours. Is there a problem with the code or my system, or do I need a GPU?
Did you get your answer?
You have to understand that deep neural network architectures demand an enormous amount of parallelism. Running these workloads on a CPU is futile and unproductive. Even a simple model that trains in 30 minutes on a GPU usually takes a day or so on a CPU.
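If you want to see that gap for yourself, here is a rough sketch (assuming TensorFlow 1.x; the `/gpu:0` timing will fail if no CUDA GPU is visible) that times the same large matrix multiply on the CPU and on the GPU:

```python
# Rough sketch, TensorFlow 1.x assumed: time a large matmul on CPU vs GPU.
import time
import tensorflow as tf

def time_matmul(device, n=4000, runs=5):
    with tf.device(device):
        a = tf.random_normal([n, n])
        b = tf.random_normal([n, n])
        c = tf.matmul(a, b)
    with tf.Session() as sess:
        sess.run(c)                       # warm-up run
        start = time.time()
        for _ in range(runs):
            sess.run(c)
        return (time.time() - start) / runs

print("CPU:", time_matmul("/cpu:0"), "s per matmul")
print("GPU:", time_matmul("/gpu:0"), "s per matmul")  # raises if no GPU is available
```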
On an Nvidia GeForce GTX 1060 GPU with 6 GB of memory, I can get to 1,000 iterations in about 2 minutes. Once you do get a GPU, make sure you install tensorflow-gpu; otherwise it will still use the CPU.
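A quick way to confirm that tensorflow-gpu was picked up and training is not silently falling back to the CPU (minimal sketch, assumes TensorFlow 1.x):

```python
# Check that TensorFlow can actually see and use a GPU (TF 1.x assumed).
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.test.is_gpu_available())    # True if a CUDA GPU can be used
print(tf.test.gpu_device_name())     # e.g. '/device:GPU:0', empty string if none
print([d.name for d in device_lib.list_local_devices()])  # all visible devices
```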
Is it the Nvidia GeForce GTX 1060 that costs $299? https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1060/ Please provide more details, or a link, if you can. Thanks.
Yeah, with 6 GB of RAM. It's listed as sold out at the moment; Amazon shows only a used option. I am not sure if they still make it.
Also, if it's helpful, I am using an AMD Ryzen 7 2700X 8-core processor. While running the GAN training, I am at about 56% GPU usage and 10% CPU usage, with about 8 GB of RAM in use.
Total Time: 52 minutes
I would prefer to be able to run 10,000 iterations in about 2-5 minutes. Are there any processors out there that can do this in 2019? Please update. Thanks.
I have used Google Colab. It did well.
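For anyone trying Colab: select Runtime > Change runtime type > Hardware accelerator: GPU, then confirm the notebook actually got one (minimal sketch, assumes the TensorFlow 1.x version preinstalled on Colab at the time):

```python
# Confirm Colab allocated a GPU to this runtime (TF 1.x assumed).
import tensorflow as tf

device = tf.test.gpu_device_name()
print(device if device else "No GPU allocated - still running on the CPU")
# Running !nvidia-smi in a cell also shows which GPU model was assigned.
```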
"It did well" is too vague!
Please provide exact details. How many minutes did it take to run 10,000 iterations, and on what GPU?
Scientific and empirical evidence only.
We will test all reported results to verify their authenticity! We believe in proven facts!
Sorry.
It was a couple of months ago, so I do not remember the exact details. I think it took about an hour or less to finish 10,000 iterations. I do not have a GPU, so Google Colab is my only option. Sorry I could not provide exact details.
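If you re-run it, a small timing wrapper like the sketch below makes it easy to report exact numbers. Note that `train_step()` here is only a placeholder for one iteration of the actual GAN training loop in the notebook:

```python
# Minimal timing sketch for reporting exact numbers over 10,000 iterations.
import time

def train_step():
    # Placeholder: replace with one iteration of the notebook's GAN training
    # loop (e.g. one discriminator update plus one generator update).
    time.sleep(0.001)

iterations = 10000
start = time.time()
for i in range(iterations):
    train_step()
    if (i + 1) % 1000 == 0:
        elapsed = time.time() - start
        print("%d iterations in %.1f s (%.3f s/iter)" % (i + 1, elapsed, elapsed / (i + 1)))
print("Total: %.1f minutes" % ((time.time() - start) / 60))
```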
Hi daryabiparva, please kindly run the experiment again so we all know. Google Colab is free to use, so there is no need to apologize when you can use something free and help others.
Best regards, thanks.