
Flash Attention 2 installation issues

Na_ai 2025. 10. 14. 00:11

Roughly speaking, if you try to install flash-attn without matching your CUDA and PyTorch versions, CUDA-related errors like these show up:

ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory
ImportError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by ~.conda/envs/qwen_env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so)
ImportError: ~.conda/envs/qwen_env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c105ErrorC2ENS_14SourceLocationESs

All three are symptoms of the same thing: the flash-attn binary was built against a different torch, CUDA toolkit, or C++ toolchain than the one actually installed.

So go to https://github.com/Dao-AILab/flash-attention/releases, check roughly which CUDA and Torch versions the prebuilt wheels target, and install the one that matches.
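Before picking a wheel, it helps to print the exact values the wheel filename encodes. A minimal check, assuming torch itself still imports (these are all standard torch/stdlib calls):

# Print the fields needed to choose a matching flash-attn wheel.
import platform
import torch

print("torch:", torch.__version__)                    # e.g. 2.3.0+cu121 -> torch2.3
print("cuda:", torch.version.cuda)                    # e.g. 12.1
print("python:", platform.python_version())           # e.g. 3.10.14 -> cp310
print("cxx11 abi:", torch.compiled_with_cxx11_abi())  # -> cxx11abiTRUE / cxx11abiFALSE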

# Remove any mismatched installs first
!pip uninstall -y flash-attn torch torchvision torchaudio
# Reinstall torch 2.3.0 built against CUDA 12.1
!pip install torch==2.3.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
# Install the prebuilt flash-attn wheel matching torch 2.3 / Python 3.10
!pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
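For reference, every field in that wheel filename has to line up with your environment: cu122 (CUDA toolkit it was built with), torch2.3, cxx11abiFALSE (torch's C++ ABI setting), cp310 (CPython 3.10), linux_x86_64. A mismatch in any of them tends to produce exactly the kinds of ImportError shown above.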

My environment:

torch 2.3.0
CUDA 12.1

I went with the flash-attention 2.5.8 / cu122 / torch2.3 wheel on this setup and it worked fine... (The cu122 build runs against torch's cu121 packages, presumably thanks to CUDA 12.x minor-version compatibility.)
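Once it installs, a quick smoke test confirms the binary actually loads and runs; a minimal sketch assuming a CUDA GPU is available (flash-attn only accepts fp16/bf16 tensors on GPU):

# Smoke test: import flash-attn and run one attention call.
import torch
from flash_attn import flash_attn_func

# flash_attn_func expects (batch, seqlen, nheads, headdim) fp16/bf16 CUDA tensors.
q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([1, 128, 8, 64]) -> the install works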
