
Upgrading gcc to Fix llama-cpp-python Build Errors

source link: https://finisky.github.io/build-llama-cpp-python-error-solution/


2023-05-11 · Linux

Installing text-generation-webui:

~/text-generation-webui$ pip install -r requirements.txt

This hit the following error:

...
Building wheels for collected packages: llama-cpp-python, peft
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [131 lines of output]

...

      -- Configuring done (0.3s)
      -- Generating done (0.0s)
      -- Build files have been written to: /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55/_skbuild/linux-x86_64-3.9/cmake-build
      [1/2] Generating /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55/vendor/llama.cpp/libllama.so
      FAILED: /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55/vendor/llama.cpp/libllama.so
      cd /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55/vendor/llama.cpp && make libllama.so
      I llama.cpp build info:
      I UNAME_S:  Linux
      I UNAME_P:  x86_64
      I UNAME_M:  x86_64
      I CFLAGS:   -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native
      I CXXFLAGS: -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
      I LDFLAGS:
      I CC:       cc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
      I CXX:      g++ (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0

      g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native -c llama.cpp -o llama.o
      llama.cpp: In function ‘size_t llama_set_state_data(llama_context*, const uint8_t*)’:
      llama.cpp:2610:36: warning: cast from type ‘const uint8_t* {aka const unsigned char*}’ to type ‘void*’ casts away qualifiers [-Wcast-qual]
                   kin3d->data = (void *) in;
                                          ^~
      llama.cpp:2614:36: warning: cast from type ‘const uint8_t* {aka const unsigned char*}’ to type ‘void*’ casts away qualifiers [-Wcast-qual]
                   vin3d->data = (void *) in;
                                          ^~
      cc  -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native   -c ggml.c -o ggml.o
      ggml.c: In function ‘ggml_vec_dot_q4_2_q8_0’:
      ggml.c:3253:40: warning: implicit declaration of function ‘_mm256_set_m128’; did you mean ‘_mm256_set_epi8’? [-Wimplicit-function-declaration]
               const __m256 d = _mm256_mul_ps(_mm256_set_m128(d1, d0), _mm256_broadcast_ss(&y[i].d));
                                              ^~~~~~~~~~~~~~~
                                              _mm256_set_epi8
      ggml.c:3253:40: error: incompatible type for argument 1 of ‘_mm256_mul_ps’
      In file included from /usr/lib/gcc/x86_64-linux-gnu/7/include/immintrin.h:41:0,
                       from ggml.c:189:
      /usr/lib/gcc/x86_64-linux-gnu/7/include/avxintrin.h:318:1: note: expected ‘__m256 {aka __vector(8) float}’ but argument is of type ‘int’
       _mm256_mul_ps (__m256 __A, __m256 __B)
       ^~~~~~~~~~~~~
      ggml.c:3257:22: warning: implicit declaration of function ‘_mm256_set_m128i’; did you mean ‘_mm256_set_epi8’? [-Wimplicit-function-declaration]
               __m256i bx = _mm256_set_m128i(bx1, bx0);
                            ^~~~~~~~~~~~~~~~
                            _mm256_set_epi8
      ggml.c:3257:22: error: incompatible types when initializing type ‘__m256i {aka __vector(4) long long int}’ using type ‘int’
      Makefile:185: recipe for target 'ggml.o' failed
      make: *** [ggml.o] Error 1
      ninja: build stopped: subcommand failed.
      Traceback (most recent call last):
        File "/tmp/pip-build-env-cm_vdfny/overlay/lib/python3.9/site-packages/skbuild/setuptools_wrap.py", line 674, in setup
          cmkr.make(make_args, install_target=cmake_install_target, env=env)
        File "/tmp/pip-build-env-cm_vdfny/overlay/lib/python3.9/site-packages/skbuild/cmaker.py", line 696, in make
          self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
        File "/tmp/pip-build-env-cm_vdfny/overlay/lib/python3.9/site-packages/skbuild/cmaker.py", line 741, in make_impl
          raise SKBuildError(msg)

      An error occurred while building with CMake.
        Command:
          /tmp/pip-build-env-cm_vdfny/overlay/lib/python3.9/site-packages/cmake/data/bin/cmake --build . --target install --config Release --
        Install target:
          install
        Source directory:
          /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55
        Working directory:
          /tmp/pip-install-rzg2ayje/llama-cpp-python_ae749d81f3f74840a75abef221510a55/_skbuild/linux-x86_64-3.9/cmake-build
      Please check the install target is valid and see CMake's output for more information.

      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
  Building wheel for peft (pyproject.toml) ... done
  Created wheel for peft: filename=peft-0.4.0.dev0-py3-none-any.whl size=56306 sha256=aeadc3499eecab0d78df3bb60230221046062dd49030b88f0f36d154a1b01fcd
  Stored in directory: /tmp/pip-ephem-wheel-cache-x1u8mjti/wheels/45/06/33/0048c03714539b315d99beef9ea6b6dad0fd5750105e221583
Successfully built peft
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

It turns out the gcc/g++ version is too old. Ubuntu 18.04's default compiler is cc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0, and gcc 11 cannot be installed directly through apt.
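The failing intrinsics `_mm256_set_m128` and `_mm256_set_m128i` are simply absent from gcc 7's AVX headers, so a quick sanity check before attempting the build is to parse the compiler's major version. A minimal sketch, assuming the default `gcc` is on `PATH`:

```shell
# Parse the major version of the default gcc; the AVX helper intrinsics
# llama.cpp uses (_mm256_set_m128/_mm256_set_m128i) are missing in gcc 7,
# so anything below 8 is expected to fail this build.
GCC_MAJOR=$(gcc -dumpversion | cut -d. -f1)
if [ "$GCC_MAJOR" -lt 8 ]; then
    echo "gcc major version $GCC_MAJOR is too old for llama.cpp's AVX code path"
fi
```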

Installing gcc/g++ 11 on Ubuntu 18.04

First, add the PPA:

$ add-apt-repository ppa:ubuntu-toolchain-r/test

If downloads from the upstream source http://ppa.launchpad.net/ are slow, edit /etc/apt/sources.list.d/ubuntu-toolchain-r-ubuntu-test-bionic.list to use the USTC mirror (https):

deb https://launchpad.proxy.ustclug.org/ubuntu-toolchain-r/test/ubuntu bionic main
#deb http://ppa.launchpad.net/ubuntu-toolchain-r/test/ubuntu bionic main
# deb-src http://ppa.launchpad.net/ubuntu-toolchain-r/test/ubuntu bionic main
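Rather than editing the file by hand, the URL swap can be scripted. A minimal sketch, assuming the list file already exists at the standard path:

```shell
# Point the PPA list file at the USTC mirror instead of ppa.launchpad.net
LIST=/etc/apt/sources.list.d/ubuntu-toolchain-r-ubuntu-test-bionic.list
sed -i 's|http://ppa.launchpad.net|https://launchpad.proxy.ustclug.org|' "$LIST"
```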

Unfortunately, apt update then ran into another problem:

$ apt update
Hit:1 http://mirrors.cloud.aliyuncs.com/ubuntu bionic InRelease
Hit:2 http://mirrors.cloud.aliyuncs.com/ubuntu bionic-updates InRelease
Hit:3 http://mirrors.cloud.aliyuncs.com/ubuntu bionic-backports InRelease
Hit:4 http://mirrors.cloud.aliyuncs.com/ubuntu bionic-security InRelease
Hit:5 https://packagecloud.io/github/git-lfs/ubuntu bionic InRelease
Err:6 http://launchpad.proxy.ustclug.org/ubuntu-toolchain-r/test/ubuntu bionic InRelease
  Connection failed [IP: 103.88.47.230 80]
Reading package lists... Done
Building dependency tree
Reading state information... Done
267 packages can be upgraded. Run 'apt list --upgradable' to see them.
W: Failed to fetch http://launchpad.proxy.ustclug.org/ubuntu-toolchain-r/test/ubuntu/dists/bionic/InRelease  Connection failed [IP: 103.88.47.230 80]
W: Some index files failed to download. They have been ignored, or old ones used instead.

Certificate verification was failing, so install the following package:

$ apt install ca-certificates
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages will be upgraded:
  ca-certificates
1 upgraded, 0 newly installed, 0 to remove and 266 not upgraded.
Need to get 140 kB of archives.
After this operation, 15.4 kB disk space will be freed.
Get:1 http://mirrors.cloud.aliyuncs.com/ubuntu bionic-updates/main amd64 ca-certificates all 20211016ubuntu0.18.04.1 [140 kB]
Fetched 140 kB in 0s (1,482 kB/s)
Preconfiguring packages ...
(Reading database ... 157674 files and directories currently installed.)
Preparing to unpack .../ca-certificates_20211016ubuntu0.18.04.1_all.deb ...
Unpacking ca-certificates (20211016ubuntu0.18.04.1) over (20210119~18.04.1) ...
Setting up ca-certificates (20211016ubuntu0.18.04.1) ...
Updating certificates in /etc/ssl/certs...
rehash: warning: skipping ca-certificates.crt,it does not contain exactly one certificate or CRL
7 added, 12 removed; done.
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
Processing triggers for ca-certificates (20211016ubuntu0.18.04.1) ...
Updating certificates in /etc/ssl/certs...
0 added, 0 removed; done.
Running hooks in /etc/ca-certificates/update.d...

done.
done.

With that fixed, gcc 11 installs cleanly:

$ apt install gcc-11 g++-11

Checking the gcc version shows it has not changed yet:

$ gcc --version
gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
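This is expected: apt installs the compilers under versioned names (`/usr/bin/gcc-11`), while the unversioned `gcc` still resolves to gcc 7 until the alternatives are updated. A quick way to see both, assuming the install above succeeded:

```shell
# The versioned binary is present, but the plain `gcc` name still maps to 7.x
gcc-11 --version | head -n1
ls -l "$(command -v gcc)" /usr/bin/gcc-11
```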

Use update-alternatives to switch the default gcc/g++:

$ sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-7 1
$ sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-11 10

$ sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-7 1
$ sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-11 10
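Since priority 10 beats priority 1, automatic mode now selects gcc-11. On Debian/Ubuntu the unversioned name resolves through the `/etc/alternatives` symlink, which can be inspected directly; a quick check:

```shell
# Confirm which compiler binary the unversioned `gcc` now resolves to
readlink -f "$(command -v gcc)"
# Switch interactively at any time (e.g. back to gcc-7) with:
# sudo update-alternatives --config gcc
```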

Check the gcc version again:

$ gcc --version
gcc (Ubuntu 11.1.0-1ubuntu1~18.04.1) 11.1.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Now re-run the pip install for text-generation-webui, and llama-cpp-python compiles successfully.
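If the earlier failed attempt left cached build artifacts behind, forcing a clean rebuild avoids picking them up. A sketch, assuming you are in the text-generation-webui checkout:

```shell
# Rebuild llama-cpp-python from source with the new toolchain;
# --no-cache-dir keeps pip from reusing artifacts from the failed attempt
pip install --no-cache-dir --force-reinstall llama-cpp-python
# then finish installing the remaining requirements
pip install -r requirements.txt
```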

