If you try to install flash-attention without roughly matching your CUDA and PyTorch versions, you run into CUDA issues like these:
ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory
ImportError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by ~.conda/envs/qwen_env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so)
ImportError: ~.conda/envs/qwen_env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c105ErrorC2ENS_14SourceLocationESs
Errors like the above keep popping up.
So go to https://github.com/Dao-AILab/flash-attention/releases, check roughly which CUDA and Torch versions each prebuilt wheel targets, and then install.
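The "match the wheel tags to your environment" step above can be sketched as a small helper. This is my own illustrative code, not part of flash-attention: `expected_wheel_tags` and `wheel_matches` are hypothetical names, and the assumption that a wheel built for a later CUDA 12.x minor version (cu122) loads fine on a cu121 torch build is based on what worked in this post, not on any official guarantee.

```python
import re

def expected_wheel_tags(torch_version: str, cuda_version: str, python_version: str):
    """Return the (cuda, torch, python) tags to look for in a flash-attention
    release wheel filename, e.g. ('cu121', 'torch2.3', 'cp310')."""
    t_major, t_minor = torch_version.split(".")[:2]
    torch_tag = f"torch{t_major}.{t_minor}"
    cu_tag = "cu" + cuda_version.replace(".", "")[:3]   # '12.1' -> 'cu121'
    p_major, p_minor = python_version.split(".")[:2]
    py_tag = f"cp{p_major}{p_minor}"                    # '3.10' -> 'cp310'
    return cu_tag, torch_tag, py_tag

def wheel_matches(wheel_name: str, torch_version: str,
                  cuda_version: str, python_version: str) -> bool:
    """Rough check of whether a release wheel fits this environment.
    Only the CUDA *major* version is compared, on the assumption (observed
    in this post) that e.g. a cu122 wheel loads on a cu121 torch build."""
    cu_tag, torch_tag, py_tag = expected_wheel_tags(
        torch_version, cuda_version, python_version)
    cu_major = cu_tag[:4]  # 'cu121' -> 'cu12'
    return (torch_tag in wheel_name
            and py_tag in wheel_name
            and cu_major in wheel_name)
```

For example, `wheel_matches("flash_attn-2.5.8+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl", "2.3.0", "12.1", "3.10")` comes out `True` for the environment used below.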
# remove any mismatched installs first
!pip uninstall -y flash-attn torch torchvision torchaudio
# PyTorch 2.3.0 built against CUDA 12.1
!pip install torch==2.3.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
# prebuilt flash-attn wheel: v2.5.8, cu122, torch2.3, cxx11abi=FALSE, Python 3.10
!pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
torch 2.3.0
CUDA 12.1
flash-attention 2.5.8 (cu122 / torch2.3)
With this combination, everything worked fine...
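After installing, it's worth confirming the stack actually imports before running a real job. A minimal sketch, assuming you run it inside the environment set up above; `check_stack` and `report` are hypothetical helper names of mine, while `torch.__version__`, `torch.version.cuda`, and `torch.cuda.is_available()` are the real torch attributes the errors above relate to.

```python
import importlib.util

def check_stack(modules=("torch", "flash_attn")):
    """Map each module name to whether it is importable in this environment."""
    return {m: importlib.util.find_spec(m) is not None for m in modules}

def report():
    """Print torch build info when torch is present (guarded so this sketch
    also runs in an environment where torch is missing)."""
    status = check_stack()
    print(status)
    if status["torch"]:
        import torch
        print("torch", torch.__version__, "built for CUDA", torch.version.cuda)
        print("GPU visible:", torch.cuda.is_available())
```

If `flash_attn` shows up as importable but then dies with an `undefined symbol` or `GLIBC` error like the ones at the top, the wheel's torch/CUDA/ABI tags don't match the installed torch build, and swapping the wheel (not torch) is usually the cheaper fix.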