
Installation issue: tokenizers version #62

Open
rbgreenway opened this issue Sep 3, 2024 · 2 comments

rbgreenway commented Sep 3, 2024

My system:
Ubuntu 22.04
4 x 4090
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce RTX 4090 Off | 00000000:18:00.0 Off | Off |
| 0% 45C P8 31W / 450W | 11MiB / 24564MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
| 1 NVIDIA GeForce RTX 4090 Off | 00000000:51:00.0 On | Off |
| 0% 44C P8 32W / 450W | 52MiB / 24564MiB | 5% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
| 2 NVIDIA GeForce RTX 4090 Off | 00000000:8A:00.0 Off | Off |
| 0% 40C P8 26W / 450W | 628MiB / 24564MiB | 8% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
| 3 NVIDIA GeForce RTX 4090 Off | 00000000:C3:00.0 Off | Off |
| 0% 43C P8 35W / 450W | 11MiB / 24564MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+

Following the build-and-run instructions with docker compose, I get this module-versioning error during docker compose build:

280.3 Traceback (most recent call last):
280.3   File "<string>", line 1, in <module>
280.3   File "/usr/local/lib/python3.10/dist-packages/transformers/__init__.py", line 26, in <module>
280.3     from . import dependency_versions_check
280.3   File "/usr/local/lib/python3.10/dist-packages/transformers/dependency_versions_check.py", line 57, in <module>
280.3     require_version_core(deps[pkg])
280.3   File "/usr/local/lib/python3.10/dist-packages/transformers/utils/versions.py", line 117, in require_version_core
280.3     return require_version(requirement, hint)
280.3   File "/usr/local/lib/python3.10/dist-packages/transformers/utils/versions.py", line 111, in require_version
280.3     _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
280.3   File "/usr/local/lib/python3.10/dist-packages/transformers/utils/versions.py", line 44, in _compare_versions
280.3     raise ImportError(
280.3 ImportError: tokenizers>=0.19,<0.20 is required for a normal functioning of this module, but found tokenizers==0.20.0.
280.3 Try: `pip install transformers -U` or `pip install -e '.[dev]'` if you're working with git main
------
failed to solve: process "/bin/sh -c ./setup-whisperfusion.sh" did not complete successfully: exit code: 1

So I edited setup-whisperfusion.sh from:

pip install -U huggingface_hub tokenizers

to:

pip install -U huggingface_hub tokenizers==0.19.0
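For reference, the constraint in the traceback is tokenizers>=0.19,<0.20, which any 0.19.x release satisfies, so pinning to the range (e.g. pip install -U huggingface_hub 'tokenizers>=0.19,<0.20') would also work and picks up newer patch releases. A minimal sketch of the comparison being enforced (illustrative only; the helper name is made up, and transformers actually uses packaging.version internally):

```python
# Illustrative sketch of the version gate that raised the ImportError above.
# satisfies_tokenizers_pin is a hypothetical helper, not transformers' real code.

def satisfies_tokenizers_pin(ver: str) -> bool:
    """Return True if `ver` falls in the required range >=0.19,<0.20."""
    major, minor = (int(part) for part in ver.split(".")[:2])
    # For 0.x versions, ">=0.19,<0.20" is exactly the 0.19.* series.
    return (major, minor) == (0, 19)

print(satisfies_tokenizers_pin("0.19.0"))  # True: the pinned version is accepted
print(satisfies_tokenizers_pin("0.20.0"))  # False: what pip resolved, hence the error
```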

Now getting the following error after running docker compose up:

dssadmin@gpu022:~/WhisperFusion$ docker compose up

[+] Running 8/8
 ✔ nginx Pulled                                                                                                                                                               6.6s 
   ✔ e4fff0779e6d Pull complete                                                                                                                                               4.5s 
   ✔ 2a0cb278fd9f Pull complete                                                                                                                                               5.7s 
   ✔ 7045d6c32ae2 Pull complete                                                                                                                                               5.8s 
   ✔ 03de31afb035 Pull complete                                                                                                                                               5.8s 
   ✔ 0f17be8dcff2 Pull complete                                                                                                                                               5.8s 
   ✔ 14b7e5e8f394 Pull complete                                                                                                                                               5.8s 
   ✔ 23fa5a7b99a6 Pull complete                                                                                                                                               5.8s 
[+] Running 3/2
 ✔ Network whisperfusion_default            Created                                                                                                                           0.1s 
 ✔ Container whisperfusion-whisperfusion-1  Created                                                                                                                           0.4s 
 ✔ Container whisperfusion-nginx-1          Created                                                                                                                           0.1s 
Attaching to nginx-1, whisperfusion-1
whisperfusion-1  | Running build-models.sh...
whisperfusion-1  | whisper_small_en directory does not exist or is empty. Running build-whisper.sh...
whisperfusion-1  | Downloading PyTorch weights for small.en model
nginx-1          | Starting web service with Nginx
whisperfusion-1  | Building Whisper TensorRT Engine...
whisperfusion-1  | [TensorRT-LLM] TensorRT-LLM version: 0.10.0
whisperfusion-1  | [09/03/2024-16:41:34] [TRT-LLM] [I] plugin_arg is None, setting it as float16 automatically.
whisperfusion-1  | [09/03/2024-16:41:34] [TRT-LLM] [I] plugin_arg is None, setting it as float16 automatically.
whisperfusion-1  | [09/03/2024-16:41:34] [TRT-LLM] [I] plugin_arg is None, setting it as float16 automatically.
whisperfusion-1  | [09/03/2024-16:41:34] [TRT] [W] Unable to determine GPU memory usage: forward compatibility was attempted on non supported HW
whisperfusion-1  | [09/03/2024-16:41:34] [TRT] [W] Unable to determine GPU memory usage: forward compatibility was attempted on non supported HW
whisperfusion-1  | [09/03/2024-16:41:34] [TRT] [I] [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 576, GPU 0 (MiB)
whisperfusion-1  | terminate called without an active exception
whisperfusion-1  | [e4b7b7abb3d0:00018] *** Process received signal ***
whisperfusion-1  | [e4b7b7abb3d0:00018] Signal: Aborted (6)
whisperfusion-1  | [e4b7b7abb3d0:00018] Signal code:  (-6)
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 0] /usr/lib/x86_64-linux-gnu/libc.so.6(+0x42520)[0x7f0f29b0f520]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 1] /usr/lib/x86_64-linux-gnu/libc.so.6(pthread_kill+0x12c)[0x7f0f29b639fc]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 2] /usr/lib/x86_64-linux-gnu/libc.so.6(raise+0x16)[0x7f0f29b0f476]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 3] /usr/lib/x86_64-linux-gnu/libc.so.6(abort+0xd3)[0x7f0f29af57f3]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 4] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2b9e)[0x7f0f28676b9e]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 5] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0xae20c)[0x7f0f2868220c]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 6] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0xae277)[0x7f0f28682277]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 7] /usr/local/lib/python3.10/dist-packages/tensorrt_libs/libnvinfer.so.10(+0x4d1d47)[0x7f0e2d21cd47]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 8] /usr/local/lib/python3.10/dist-packages/tensorrt_bindings/tensorrt.so(+0x87b58)[0x7f0d83c87b58]
whisperfusion-1  | [e4b7b7abb3d0:00018] [ 9] /usr/local/lib/python3.10/dist-packages/tensorrt_bindings/tensorrt.so(+0x46123)[0x7f0d83c46123]
whisperfusion-1  | [e4b7b7abb3d0:00018] [10] python3(+0x15adae)[0x563154f02dae]
whisperfusion-1  | [e4b7b7abb3d0:00018] [11] python3(_PyObject_MakeTpCall+0x25b)[0x563154ef952b]
whisperfusion-1  | [e4b7b7abb3d0:00018] [12] python3(+0x169680)[0x563154f11680]
whisperfusion-1  | [e4b7b7abb3d0:00018] [13] python3(+0x165dc7)[0x563154f0ddc7]
whisperfusion-1  | [e4b7b7abb3d0:00018] [14] python3(+0x1518db)[0x563154ef98db]
whisperfusion-1  | [e4b7b7abb3d0:00018] [15] /usr/local/lib/python3.10/dist-packages/tensorrt_bindings/tensorrt.so(+0x41b73)[0x7f0d83c41b73]
whisperfusion-1  | [e4b7b7abb3d0:00018] [16] python3(_PyObject_MakeTpCall+0x25b)[0x563154ef952b]
whisperfusion-1  | [e4b7b7abb3d0:00018] [17] python3(_PyEval_EvalFrameDefault+0x6f0b)[0x563154ef216b]
whisperfusion-1  | [e4b7b7abb3d0:00018] [18] python3(_PyObject_FastCallDictTstate+0xc4)[0x563154ef86c4]
whisperfusion-1  | [e4b7b7abb3d0:00018] [19] python3(+0x1657a4)[0x563154f0d7a4]
whisperfusion-1  | [e4b7b7abb3d0:00018] [20] python3(_PyObject_MakeTpCall+0x1fc)[0x563154ef94cc]
whisperfusion-1  | [e4b7b7abb3d0:00018] [21] python3(_PyEval_EvalFrameDefault+0x68bc)[0x563154ef1b1c]
whisperfusion-1  | [e4b7b7abb3d0:00018] [22] python3(_PyFunction_Vectorcall+0x7c)[0x563154f036ac]
whisperfusion-1  | [e4b7b7abb3d0:00018] [23] python3(_PyEval_EvalFrameDefault+0x6d5)[0x563154eeb935]
whisperfusion-1  | [e4b7b7abb3d0:00018] [24] python3(_PyFunction_Vectorcall+0x7c)[0x563154f036ac]
whisperfusion-1  | [e4b7b7abb3d0:00018] [25] python3(_PyEval_EvalFrameDefault+0x6d5)[0x563154eeb935]
whisperfusion-1  | [e4b7b7abb3d0:00018] [26] python3(+0x140096)[0x563154ee8096]
whisperfusion-1  | [e4b7b7abb3d0:00018] [27] python3(PyEval_EvalCode+0x86)[0x563154fddf66]
whisperfusion-1  | [e4b7b7abb3d0:00018] [28] python3(+0x260e98)[0x563155008e98]
whisperfusion-1  | [e4b7b7abb3d0:00018] [29] python3(+0x25a79b)[0x56315500279b]
whisperfusion-1  | [e4b7b7abb3d0:00018] *** End of error message ***

Not sure what to do from here.

makaveli10 self-assigned this Sep 10, 2024

goranskular commented

Same here.


goranskular commented Sep 12, 2024

OK, on the host machine I have CUDA 12.2, not 12.4, so I changed the base tag in the Dockerfile to:
'ARG BASE_TAG=12.2.0-runtime-ubuntu22.04'
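In Dockerfile terms (assuming the image's base tag is controlled by an ARG named BASE_TAG, as the quoted line suggests), the change amounts to:

```
# Match the container's CUDA runtime to the host driver (CUDA 12.2 per nvidia-smi above),
# since the "forward compatibility was attempted on non supported HW" warnings point at
# a container CUDA version newer than the host driver supports.
ARG BASE_TAG=12.2.0-runtime-ubuntu22.04
```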
