LLamaSharp
dotnet publish issue - multiple publish output files with same relative path
dotnet publish /src/Tools/Chat/Chat.csproj -c Release -o /out -p:SelfContained=true -p:PublishReadyToRun=true
C:\Program Files\dotnet\sdk\8.0.100\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.ConflictResolution.targets(112,5): error NETSDK1152: Found multiple publish output files with the same relative path: C:\Users\kk\.nuget\packages\llamasharp.backend.cpu\0.8.1\runtimes\linux-x64\native\avx\libllama.so, C:\Users\jrlh\.nuget\packages\llamasharp.backend.cpu\0.8.1\runtimes\linux-x64\native\avx2\libllama.so, C:\Users\kk\.nuget\packages\llamasharp.backend.cpu\0.8.1\runtimes\linux-x64\native\avx512\libllama.so, C:\Users\kk\.nuget\packages\llamasharp.backend.cpu\0.8.1\runtimes\linux-x64\native\libllama.so.
Same here for Windows x64 publishing
It appears that the issue here is that publish tries to place all of the various binaries (e.g. avx\libllama.so, avx2\libllama.so, avx512\libllama.so, etc.) into the root folder. Since they all have the same file name and are being placed in the same directory, this causes a conflict. They should be placed into separate folders; I'm not sure why publish isn't respecting that.
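For reference, the backend package lays the binaries out roughly like this (taken from the paths in the error above), and publish flattens the per-instruction-set subfolders into a single directory:

runtimes/linux-x64/native/libllama.so
runtimes/linux-x64/native/avx/libllama.so
runtimes/linux-x64/native/avx2/libllama.so
runtimes/linux-x64/native/avx512/libllama.so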
Publishing LLama.Examples from the main project does work, so this must be an issue with the way the NuGet packages are set up. Unfortunately I don't know much about that at all. Is this something you'd be interested in investigating further, @JohnGalt1717?
Perhaps related to this? https://github.com/SciSharp/LLamaSharp/blob/master/LLama/runtimes/build/LLamaSharpBackend.props
Looks like it tells it to copy all the files to the same directory.
I would agree; however, I don't have experience with .props files, so I wouldn't feel comfortable changing this.
There is an ignore-duplicates flag in the csproj that might be of use, but from what I can tell it needs to be set at the package level (it doesn't work in any of my projects).
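(The flag being referred to is presumably MSBuild's ErrorOnDuplicatePublishOutputFiles property; setting it to false in a PropertyGroup suppresses the NETSDK1152 error, though on its own it only hides the collision rather than fixing the folder layout.)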
Yeah to be honest I don't have any experience with the .props file either.
There is an ignore-duplicates flag in the csproj that might be of use
I don't think that would work. There are a load of different binaries (noavx, AVX, AVX2, AVX512, etc.) which all need to be in different folders; the bug here seems to be that they're all copied to the same place.
Ignoring duplicates would just make your build unreliable, as one of those binaries would effectively be selected at random!
Ignore duplicates will cause avx\libllama.so, avx2\libllama.so, and avx512\libllama.so to overwrite each other.
I just encountered this issue again myself and dug into it. I think #561 should fix it.
Unfortunately that fix didn't work :(
I'm using the following workaround to publish the native library files from the backend packages at their relative paths. This will let you publish for now.
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Suppress NETSDK1152 so the duplicate libllama files don't fail the publish. -->
    <ErrorOnDuplicatePublishOutputFiles>false</ErrorOnDuplicatePublishOutputFiles>
    <!-- Restore packages into the project's bin folder so the runtimes globs below have a predictable path. -->
    <RestorePackagesPath>bin\$(Configuration)\.nuget\packages</RestorePackagesPath>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="LLamaSharp" Version="0.12.0" />
    <PackageReference Include="LLamaSharp.Backend.Cpu" Version="0.12.0" />
    <PackageReference Include="LLamaSharp.Backend.Cuda11" Version="0.12.0" />
    <PackageReference Include="LLamaSharp.Backend.Cuda12" Version="0.12.0" />
    <PackageReference Include="LLamaSharp.Backend.OpenCL" Version="0.12.0" />
  </ItemGroup>

  <!-- Collect the runtimes trees of each backend package, keeping their folder structure. -->
  <ItemGroup>
    <LlamaSharpBackendCpu Include="$(RestorePackagesPath)\llamasharp.backend.cpu\0.12.0\runtimes\**\*.*" />
    <LlamaSharpBackendCuda11 Include="$(RestorePackagesPath)\llamasharp.backend.cuda11\0.12.0\runtimes\**\*.*" />
    <LlamaSharpBackendCuda12 Include="$(RestorePackagesPath)\llamasharp.backend.cuda12\0.12.0\runtimes\**\*.*" />
    <LlamaSharpBackendOpenCL Include="$(RestorePackagesPath)\llamasharp.backend.opencl\0.12.0\runtimes\**\*.*" />
  </ItemGroup>

  <!-- After publish, drop the flattened binaries from the output root and re-copy the runtimes tree with its subfolders intact. -->
  <Target Name="CopyRuntimesFolderOnPublish" AfterTargets="Publish">
    <Delete Files="$(PublishDir)llama.dll" />
    <Delete Files="$(PublishDir)llava_shared.dll" />
    <Copy SourceFiles="@(LlamaSharpBackendCpu)" DestinationFolder="$(PublishDir)\runtimes\%(RecursiveDir)" />
    <Copy SourceFiles="@(LlamaSharpBackendCuda11)" DestinationFolder="$(PublishDir)\runtimes\%(RecursiveDir)" />
    <Copy SourceFiles="@(LlamaSharpBackendCuda12)" DestinationFolder="$(PublishDir)\runtimes\%(RecursiveDir)" />
    <Copy SourceFiles="@(LlamaSharpBackendOpenCL)" DestinationFolder="$(PublishDir)\runtimes\%(RecursiveDir)" />
  </Target>

</Project>
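With that project file, a publish such as dotnet publish -c Release -r win-x64 -o out (the runtime identifier and output folder here are just examples) should leave the backend binaries under out\runtimes\&lt;rid&gt;\native\... instead of failing with NETSDK1152. Note that the hard-coded 0.12.0 segments in the glob paths have to be kept in sync with the package versions you reference.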
Also struggling with this issue. It seems to occur for any x64 publish target, though I haven't extensively tested the available target runtimes.
Apple MacBook Air M2
I'm also having this issue. I'm trying to embed it into Godot. When using the workaround above, it publishes successfully and creates the app bundle. However, when running Llama it gives me this error:
System.TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception. ---> LLama.Exceptions.RuntimeError: The native library cannot be correctly loaded. It could be one of the following reasons:
- No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
- You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
- One of the dependency of the native library is missed. Please use ldd on linux, dumpbin on windows and otool to check if all the dependency of the native library is satisfied. Generally you could find the libraries under your output folder.
- Try to compile llama.cpp yourself to generate a libllama library, then use LLama.Native.NativeLibraryConfig.WithLibrary to specify it at the very beginning of your code. For more information about compilation, please refer to LLamaSharp repo on github.
Meaning that it's somehow not installed when exporting the project.
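If the workaround gets the files into the bundle but they still aren't found at runtime, one thing worth trying is the last suggestion from that error message: pointing LLamaSharp at the native library explicitly before any other call. A minimal sketch (the path is hypothetical, and the exact WithLibrary overload differs between LLamaSharp versions, so check the NativeLibraryConfig API of the version you have installed):

using LLama.Native;

// Hypothetical path: point this at wherever the exported Godot app actually
// contains the backend's native llama library (e.g. inside the .app bundle).
var llamaLib = "runtimes/osx-arm64/native/libllama.dylib";

// Must run before any other LLamaSharp call, as the error message advises.
// Assumption: this overload takes a single path to the llama native library;
// some versions take separate llama/llava paths instead.
NativeLibraryConfig.Instance.WithLibrary(llamaLib);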