Installation
This guide covers all the ways you can install and set up Axolotl for your environment.
1 Requirements
- NVIDIA GPU (Ampere architecture or newer for bf16 and Flash Attention) or AMD GPU
- Python ≥3.11
- PyTorch ≥2.6.0
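As a quick sanity check that a machine meets these requirements, the commands below (assuming python and, on NVIDIA systems, nvidia-smi are on your PATH) report the relevant versions; the compute_cap query needs a reasonably recent NVIDIA driver:
# Python version (needs >= 3.11)
python --version
# GPU name and compute capability (Ampere is 8.0 or higher)
nvidia-smi --query-gpu=name,compute_cap --format=csv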
2 Installation Methods
Please make sure PyTorch is installed before installing Axolotl in your local environment.
Follow the instructions at: https://pytorch.org/get-started/locally/
For Blackwell GPUs, please use PyTorch 2.7.0 and CUDA 12.8.
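For example, a Blackwell machine might pull the CUDA 12.8 wheels like this; treat it as a sketch and confirm the exact command for your platform on the PyTorch site:
# Example only: PyTorch 2.7.0 built against CUDA 12.8
pip install torch==2.7.0 --index-url https://download.pytorch.org/whl/cu128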
2.1 uv Installation (Recommended)
# Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Add Axolotl to a project (recommended)
uv init my-project && cd my-project
uv add axolotl
uv pip install flash-attn --no-build-isolation
source .venv/bin/activate
For a quick one-off install without creating a project:
uv pip install axolotl
uv pip install flash-attn --no-build-isolation
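Either way, you can check that the command-line entry point landed on your PATH; this assumes a recent release that ships the axolotl CLI:
axolotl --help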
2.2 pip Installation
pip install --no-build-isolation axolotl[deepspeed]
pip install --no-build-isolation flash-attn
We use --no-build-isolation so that the build can detect the already-installed PyTorch version (if any) rather than clobbering it, and so that we resolve the correct versions of dependencies that are specific to that PyTorch version or to other installed co-dependencies. Flash Attention is installed separately so it can be built against the environment configured by the previous step.
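To confirm that your existing PyTorch build was left untouched, a quick check (not part of the official steps) is to print the version before and after installing:
python -c "import torch; print(torch.__version__, torch.version.cuda)"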
2.3 Advanced uv Installation
uv is a fast, reliable Python package installer and resolver built in Rust. It offers significant performance improvements over pip and provides better dependency resolution, making it an excellent choice for complex environments.
Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env
Choose the CUDA version to use with PyTorch, e.g. cu124, cu126, or cu128, then create the venv and activate it:
export UV_TORCH_BACKEND=cu126
uv venv --no-project --relocatable
source .venv/bin/activate
Install PyTorch (2.6.0 recommended):
uv pip install torch==2.6.0
uv pip install awscli pydantic
Install Axolotl from PyPI:
uv pip install --no-build-isolation axolotl[deepspeed]
# optionally install with vLLM if you're using torch==2.6.0 and want to train w/ GRPO
# uv pip install --no-build-isolation axolotl[deepspeed,vllm]
uv pip install flash-attn --no-build-isolation
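On Blackwell hardware, the same flow would swap in the cu128 backend and PyTorch 2.7.0, per the note in Section 2. This is a sketch and assumes your uv version supports the cu128 torch backend:
export UV_TORCH_BACKEND=cu128
uv venv --no-project --relocatable
source .venv/bin/activate
uv pip install torch==2.7.0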
2.4 Edge/Development Build
For the latest features between releases:
2.4.1 Using uv (recommended)
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
curl -LsSf https://astral.sh/uv/install.sh | sh # If not already installed
uv sync
uv pip install flash-attn --no-build-isolation
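If you also want the deepspeed extra in the uv-managed environment (the same extra the pip command below installs), uv can include it when syncing; this assumes the extra resolves cleanly under uv:
uv sync --extra deepspeed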
2.4.2 Using pip
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
pip install --no-build-isolation -e '.[deepspeed]'
pip install --no-build-isolation flash-attn
2.5 Docker
docker run --gpus '"all"' --rm -it axolotlai/axolotl:main-latest
For development with Docker:
docker compose up -d
docker run --privileged --gpus '"all"' --shm-size 10g --rm -it \
--name axolotl --ipc=host \
--ulimit memlock=-1 --ulimit stack=67108864 \
--mount type=bind,src="${PWD}",target=/workspace/axolotl \
-v ${HOME}/.cache/huggingface:/root/.cache/huggingface \
axolotlai/axolotl:main-latest
For Blackwell GPUs, please use axolotlai/axolotl:main-py3.11-cu128-2.7.0 or the cloud variant axolotlai/axolotl-cloud:main-py3.11-cu128-2.7.0.
Please refer to the Docker documentation for more information on the different Docker images that are available.
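As a quick smoke test of a containerized install, recent images ship the axolotl CLI, which can fetch the bundled example configs and start a run. The config path below is just one of the shipped examples at the time of writing, so substitute whichever example suits you:
# Inside the container: fetch example configs and launch a small run
axolotl fetch examples
axolotl train examples/llama-3/lora-1b.yml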
3 Cloud Environments
3.1 Cloud GPU Providers
For providers supporting Docker:
- Use axolotlai/axolotl-cloud:main-latest
- Available on:
3.2 Google Colab
4 Platform-Specific Instructions
4.1 macOS
uv pip install --no-build-isolation -e '.'
See Section 6 for Mac-specific issues.
4.2 Windows
We recommend using WSL2 (Windows Subsystem for Linux) or Docker.
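If you go the WSL2 route, an elevated PowerShell on Windows 10/11 can set it up with Microsoft's standard command (the distribution name here is just one common choice):
wsl --install -d Ubuntu-22.04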
5 Environment Managers
5.1 Conda/Pip venv
- Install Python ≥3.11
- Install PyTorch: https://pytorch.org/get-started/locally/
- Install Axolotl:
# Option A: add Axolotl to the environment
uv add axolotl
uv pip install flash-attn --no-build-isolation
# Option B: quick install
uv pip install axolotl
uv pip install flash-attn --no-build-isolation
(Optional) Login to Hugging Face:
huggingface-cli login
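If you prefer conda for the first step, a minimal sketch (the environment name is arbitrary) looks like:
# Create and activate a Python 3.11 environment with conda
conda create -n axolotl python=3.11
conda activate axolotl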
6 Troubleshooting
If you encounter installation issues, see our FAQ and Debugging Guide.