<ul>
  <li>Deploy DeepSeek distilled models with SGLang</li>
  <li>Deploy Llama 3.1 with vLLM</li>
  <li>Deploy Llama 4 with TGI</li>
  <li>Deploy a DeepSeek distilled model with NIM</li>
  <li>Deploy DeepSeek R1 and its distilled version with TensorRT-LLM</li>
</ul>
<p>
Fine-tune Llama 4 on a custom dataset using Axolotl.
</p>
</a>
<a href="/examples/fine-tuning/trl"
class="feature-cell">
<h3>
TRL
</h3>
<p>
Fine-tune Llama 3.1 8B on a custom dataset using TRL.
</p>
</a>
<a href="/examples/accelerators/amd"
class="feature-cell sky">
<h3>
AMD
</h3>
<p>
Deploy and fine-tune LLMs on AMD
</p>
</a>
<a href="/examples/accelerators/intel"
class="feature-cell sky">
<h3>
Intel Gaudi
</h3>
<p>
Deploy and fine-tune LLMs on Intel Gaudi
</p>
</a>
<a href="/examples/accelerators/tpu"
class="feature-cell sky">
<h3>
TPU
</h3>
<p>
Deploy and fine-tune LLMs on TPU
</p>
</a>
<a href="/examples/llms/deepseek"
class="feature-cell sky">
<h3>
DeepSeek
</h3>
<p>
Deploy and train DeepSeek models
</p>
</a>
<a href="/examples/llms/llama"
class="feature-cell sky">
<h3>
Llama
</h3>
<p>
Deploy Llama 4 models
</p>
</a>
<a href="/examples/misc/docker-compose"
class="feature-cell sky">
<h3>
Docker Compose
</h3>
<p>
Use Docker and Docker Compose inside runs
</p>
</a>
<a href="/examples/misc/nccl-tests"
class="feature-cell sky">
<h3>
NCCL tests
</h3>
<p>
Run multi-node NCCL tests with MPI
</p>
</a>
<a href="/examples/misc/a3mega-clusters"
class="feature-cell sky">
<h3>
A3 Mega
</h3>
<p>
Set up GCP A3 Mega clusters with optimized networking
</p>
</a>
<a href="/examples/misc/a3high-clusters"
class="feature-cell sky">
<h3>
A3 High
</h3>
<p>
Set up GCP A3 High clusters with optimized networking
</p>
</a>