
Static LibTorch mobile build for Windows

Open janakg opened this issue 4 years ago • 5 comments

@peterjc123 Would it be possible to build a static LibTorch library for Windows? Right now we are facing a couple of issues:

  1. The final lib file is too big: it is around 740 MB, while the corresponding builds on Mac/Ubuntu are around 64 MB, even though we switched off USE_MKL.
  2. We get a linking error while loading a TorchScript model. https://github.com/pytorch/pytorch/issues/14367

It would be helpful if you have any suggestions for getting a usable static library.

janakg avatar May 26 '20 15:05 janakg

@janakg

  1. The final lib file is too big: it is around 740 MB, while the corresponding builds on Mac/Ubuntu are around 64 MB, even though we switched off USE_MKL.

The available options are:

  1. Use the CMake configuration MinSizeRel (maybe you are already using it).
  2. Use clang on Windows. This significantly lowers the size because it tries to do more inlining, but it has some issues if you raise exceptions at runtime; please refer to https://github.com/pytorch/pytorch/pull/35145. A rough configure sketch follows this list.
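
As a rough illustration (a sketch only, not the exact build_mobile.sh invocation; BUILD_SHARED_LIBS, USE_MKL and BUILD_PYTHON are the usual CMake cache options, and the source path is illustrative), a MinSizeRel configure could look like:

    # Sketch: static MinSizeRel configure with MKL disabled
    cmake -DCMAKE_BUILD_TYPE=MinSizeRel \
          -DBUILD_SHARED_LIBS=OFF \
          -DUSE_MKL=OFF \
          -DBUILD_PYTHON=OFF \
          path/to/pytorch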

2. We get a linking error while loading a TorchScript model. pytorch/pytorch#14367

From the latest comment, it seems that you'll need to use /WHOLEARCHIVE in the linker args.
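
For example (a sketch, assuming you configure your application with CMake and that torch_cpu.lib is one of the static archives being dropped; adjust the library names to your build):

    # Sketch: force the linker to keep every object file from the static torch archive
    cmake -DCMAKE_EXE_LINKER_FLAGS="/WHOLEARCHIVE:torch_cpu.lib" path/to/your/app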

BTW, I'm just curious about your use case. What do you use the LibTorch mobile build on Windows for?

peterjc123 avatar May 26 '20 15:05 peterjc123

Thanks, @peterjc123, for the response. We'll try these.

We use client-side inference for semantic text analysis, and we package it with our desktop application. We currently use OpenCV DNN for image models and it works perfectly, but for text and a few advanced use cases DNN doesn't support some of the network layers, so we are trying to bring LibTorch into our system.

We have built a static LibTorch Mobile library for Linux and Mac and it works. Windows is something new for us and also a bit tricky.

janakg avatar May 27 '20 02:05 janakg

@peterjc123 We have slightly modified pytorch/scripts/build_mobile.sh to run on Windows.

The torch_cpu.lib size for Release is 711 MB, and MinSizeRel reduced it to 645 MB, but building with clang has not reduced it further. Are we missing something here?

Our setup: C and CXX compiler --> clang-cl; generator --> Visual Studio 16 2019. We are currently not using Ninja. Do you have any suggestions for us?
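
Roughly, that configuration corresponds to something like this (a sketch only, assuming the standard CMake generator and toolset switches; paths are illustrative):

    # Sketch: VS 2019 generator with the clang-cl toolset (no Ninja)
    cmake -G "Visual Studio 16 2019" -T ClangCL path/to/pytorch
    cmake --build . --config MinSizeRel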

santhiya-v avatar May 29 '20 12:05 santhiya-v

As for Clang builds, you could try http://blog.llvm.org/2018/11/30-faster-windows-builds-with-clang-cl_14.html and make sure you use lld-link over link.exe.
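
Roughly, such a configuration might look like this (a sketch only; CMAKE_LINKER and the Ninja generator are standard CMake knobs, and the path is illustrative):

    # Sketch: Ninja + clang-cl + lld-link, along the lines of the blog post above
    cmake -G Ninja \
          -DCMAKE_C_COMPILER=clang-cl \
          -DCMAKE_CXX_COMPILER=clang-cl \
          -DCMAKE_LINKER=lld-link \
          -DCMAKE_BUILD_TYPE=MinSizeRel \
          path/to/pytorch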

peterjc123 avatar May 29 '20 14:05 peterjc123

Also, could you please check that you have turned off all the debug-info flags like /Z7, /Zi, or /DEBUG:FULL in the build?
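
One quick way to check (a sketch, assuming a bash-like shell such as Git Bash and a build directory containing CMakeCache.txt; the flag override shown is just the stock MSVC MinSizeRel defaults, without any debug-info switch):

    # Sketch: look for debug-info switches in the cached compiler/linker flags
    grep -iE "/Z7|/Zi|/DEBUG:FULL" CMakeCache.txt

    # Sketch: if needed, pin the MinSizeRel flags explicitly (no /Z7, no /Zi)
    cmake -DCMAKE_CXX_FLAGS_MINSIZEREL="/MD /O1 /Ob1 /DNDEBUG" path/to/pytorch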

peterjc123 avatar May 29 '20 14:05 peterjc123