### Prerequisites

- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new and useful enhancement to share.
### Feature Description

### What I have done
I have succeeded in building the project on my Windows laptop by following "Build on Android" and running it with Android Debug Bridge.

I also succeeded in building the project natively on Windows by running:

```
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release
```

in cmd, and the examples ran on my Windows laptop with the Nvidia GPU recognized.
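For context, the adb deployment mentioned above was along these lines (the binary name, model file, and on-device directory are examples from my setup and may differ for others):

```shell
# Push the Android build output and a model to the device, then run it.
# /data/local/tmp is the usual writable, executable location for adb testing;
# the binary and model names here are examples, not fixed requirements.
adb push build/bin/llama-cli /data/local/tmp/
adb push models/model.gguf /data/local/tmp/
adb shell chmod +x /data/local/tmp/llama-cli
adb shell /data/local/tmp/llama-cli -m /data/local/tmp/model.gguf -p "Hello"
```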
### Problem I meet

Then I tried building on Windows cmd (the log below shows the Android NDK toolchain being used, with Vulkan enabled). The project configured successfully.

Log:

```
-- The C compiler identification is Clang 18.0.1
-- The CXX compiler identification is Clang 18.0.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Users/Xinyu/AppData/Local/Android/Sdk/ndk/27.0.12077973/toolchains/llvm/prebuilt/windows-x86_64/bin/clang.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Users/Xinyu/AppData/Local/Android/Sdk/ndk/27.0.12077973/toolchains/llvm/prebuilt/windows-x86_64/bin/clang++.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Users/Xinyu/Git/cmd/git.exe (found version "2.45.2.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- Found OpenMP_C: -fopenmp=libomp
-- Found OpenMP_CXX: -fopenmp=libomp
-- Found OpenMP: TRUE
-- OpenMP found
-- Using llamafile
-- Found Vulkan: C:/VulkanSDK/1.3.283.0/Lib/vulkan-1.lib (found version "1.3.283") found components: glslc glslangValidator
-- Vulkan found
-- ccache found, compilation results will be cached. Disable with GGML_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- x86 detected
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Configuring done (6.0s)
-- Generating done (0.3s)
-- Build files have been written to: C:/Users/Xinyu/llama-adb-vulkan/llama.cpp/build
```
However, when I ran `ninja`, an error occurred:
![image](https://github.com/user-attachments/assets/0cae0ba1-3f25-4984-96fa-44790c70c9a8)
![image](https://github.com/user-attachments/assets/cb950ab7-a9bd-4939-9442-d02fdf697b44)
### What I expect
I hope to build binaries with Vulkan support that can run on my Android device.
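The kind of configure step I am attempting looks roughly like this. The NDK path, ABI, and API level are placeholders from my setup, and I am not certain this combination is fully supported with the Vulkan backend, so treat it as a sketch rather than a known-good recipe:

```shell
# Cross-compile llama.cpp for Android (arm64) with the Vulkan backend.
# CMAKE_TOOLCHAIN_FILE / ANDROID_ABI / ANDROID_PLATFORM are standard
# Android NDK CMake options; the NDK path below is specific to my machine.
cmake -B build-android -G Ninja -DCMAKE_TOOLCHAIN_FILE="C:/Users/Xinyu/AppData/Local/Android/Sdk/ndk/27.0.12077973/build/cmake/android.toolchain.cmake" -DANDROID_ABI=arm64-v8a -DANDROID_PLATFORM=android-28 -DGGML_VULKAN=ON

# Then build (this is the step where the errors above appear):
cmake --build build-android --config Release
```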
### Motivation
Some Android devices support Vulkan for rendering, and llama.cpp supports Vulkan for inference acceleration as well. If the project could be built with Vulkan support for Android devices, deploying LLMs locally on edge devices would become much more practical.
### Possible Implementation
No response