Getting Started
This document walks you through how to set up and run models using tt-forge. The following topics are covered:
- Configuring Hardware
- Setting up the Docker Container
- Installing Dependencies
- Creating a Virtual Environment
- Installing a Wheel
- Running a Demo
NOTE: If you encounter issues, please request assistance on the tt-forge-fe Issues page.
NOTE: If you plan to do development work in the tt-forge repo, you need to build from source. To do that, please see the [build instructions for tt-forge-fe](https://github.com/tenstorrent/tt-forge-fe/blob/main/docs/src/build.md).
Configuring Hardware
Configure your hardware with tt-installer:
TT_SKIP_INSTALL_PODMAN=0 TT_SKIP_INSTALL_METALIUM_CONTAINER=0 /bin/bash -c "$(curl -fsSL https://github.com/tenstorrent/tt-installer/releases/latest/download/install.sh)"
NOTE: This walkthrough assumes that you use the [Quick Installation](https://docs.tenstorrent.com/getting-started/README.html#quick-installation) instructions for setup. After you run this script, make sure you activate the virtual environment it sets up:
source ~/.tenstorrent-venv/bin/activate
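A quick way to confirm that the installer's virtual environment is active is to check the VIRTUAL_ENV variable, which the activate script exports (a small sketch; the path in the message is the default tt-installer location from the command above):

```shell
# VIRTUAL_ENV is exported by a venv's activate script and is unset otherwise.
if [ -n "${VIRTUAL_ENV:-}" ]; then
  venv_status="active: ${VIRTUAL_ENV}"
else
  venv_status="inactive - run: source ~/.tenstorrent-venv/bin/activate"
fi
echo "venv ${venv_status}"
```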
Setting up the Docker Container
The simplest way to run models is to use the Docker image. Two images are available:
- Base Image: This image includes all the necessary dependencies.
- ghcr.io/tenstorrent/tt-forge-fe/tt-forge-fe-base-ird-ubuntu-22-04
- Prebuilt Environment Image: This image contains all necessary dependencies and a prebuilt environment.
- ghcr.io/tenstorrent/tt-forge-fe/tt-forge-fe-ird-ubuntu-22-04
To install, do the following:
- Install Docker if you do not already have it:
sudo apt update
sudo apt install docker.io -y
sudo systemctl start docker
sudo systemctl enable docker
- Test that docker is installed:
docker --version
- Add your user to the docker group:
sudo usermod -aG docker $USER
newgrp docker
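You can check that the group change took effect with a small sketch (the group name docker is the one created by the package install above; membership only applies to new sessions, which is why newgrp is needed):

```shell
# Report whether the current user's effective groups include `docker`.
if id -nG | grep -qw docker; then
  docker_group="yes"
else
  docker_group="no (log out and back in, or run: newgrp docker)"
fi
echo "in docker group: ${docker_group}"
```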
- Run the docker container:
sudo docker run \
--rm \
-it \
--privileged \
--device /dev/tenstorrent/0 \
-v /dev/hugepages-1G:/dev/hugepages-1G \
--mount type=bind,source=/sys/devices/system/node,target=/sys/devices/system/node \
ghcr.io/tenstorrent/tt-forge-fe/tt-forge-fe-ird-ubuntu-22-04
- If you want to check that the container is running, open a new terminal on the same machine and run the following:
docker ps
Creating a Virtual Environment
It is recommended that you create a virtual environment for the wheel you want to work with, because wheels from different repos may have conflicting dependencies.
Create a virtual environment (the environment name in the command below is just an example; you can use any name):
python3 -m venv forge-venv
source forge-venv/bin/activate
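As a sanity check, after activation python should resolve inside the new environment. A sketch using a throwaway environment (/tmp/forge-venv-demo is just an example path):

```shell
# Create and activate a throwaway venv, then confirm sys.prefix points at it.
python3 -m venv /tmp/forge-venv-demo
. /tmp/forge-venv-demo/bin/activate
venv_prefix=$(python -c "import sys; print(sys.prefix)")
echo "venv prefix: ${venv_prefix}"
deactivate
rm -rf /tmp/forge-venv-demo
```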
Installing a Wheel
This section walks you through downloading and installing a wheel. If you only want to run models, you can install the wheel anywhere you like. If you want to do development work, you must clone the repo you want, navigate into it, and then set up the wheel there.
- Make sure you are in an active virtual environment.
NOTE: If you plan to do development work, before continuing with these instructions, clone the repo you plan to use, then navigate into the repo. If you are just running models, this step is not necessary.
- Download the wheel(s) you want to use from the Tenstorrent Nightly Releases page.
NOTE: The link for each wheel you install is structured as
https://github.com/tenstorrent/tt-forge/releases/download/nightly-0.1.0.dev{number from top of release page}/forge-0.1.0.dev{number from top of release page}-cp310-cp310-linux_x86_64.whl
For this walkthrough, tt-forge-fe is used. You need to install two wheels for setup:
pip install https://github.com/tenstorrent/tt-forge/releases/download/nightly-0.1.0.dev20250514060212/forge-0.1.0.dev20250514060212-cp310-cp310-linux_x86_64.whl
pip install https://github.com/tenstorrent/tt-forge/releases/download/nightly-0.1.0.dev20250509060216/tvm-0.1.0.dev20250509060216-cp310-cp310-linux_x86_64.whl
NOTE: The commands above are examples; for the latest install links, go to the Tenstorrent Nightly Releases page. If you plan to work with wheels from different repositories, make a separate environment for each one, because some wheels have conflicting dependencies.
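The nightly URL pattern from the note above can be sketched in Python; the dev number here is the one from the example forge command and will differ for newer releases:

```python
# Build a nightly wheel URL from a release dev number (example value).
dev = "20250514060212"
base = "https://github.com/tenstorrent/tt-forge/releases/download"
url = f"{base}/nightly-0.1.0.dev{dev}/forge-0.1.0.dev{dev}-cp310-cp310-linux_x86_64.whl"
print(url)
```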
- Clone the tt-forge-fe repo:
git clone https://github.com/tenstorrent/tt-forge-fe.git
Run First Example Case
To confirm that your environment is properly set up, run a sanity test for the element-wise add operation:
pytest forge/test/mlir/operators/eltwise_binary/test_eltwise_binary.py::test_add
In a few seconds, you should get confirmation that the test passed. Once that's done, you can run one of the model tests as well:
pytest forge/test/mlir/llama/tests/test_llama_prefil.py::test_llama_prefil_on_device_decode_on_cpu
Running Models
You can try one of the models in the tt-forge repo. For a list of models that work with tt-forge-fe, navigate to the Demos folder in the tt-forge repo. Follow the Getting Started instructions there.
Where to Go Next
Now that you have set up tt-forge-fe, you can compile and run your own models.
For a quick start, here is an example of how to run your own model. Note the introduction of the forge.compile call:
import torch
import forge
from transformers import ResNetForImageClassification

def resnet():
    # Load image, pre-process, etc.
    ...

    # Load model (e.g. from HuggingFace)
    framework_model = ResNetForImageClassification.from_pretrained("microsoft/resnet-50")

    # Compile the model using Forge
    compiled_model = forge.compile(framework_model, input_image)

    # Run compiled model
    logits = compiled_model(input_image)

    ...
    # Post-process output, return results, etc.