Overview

This guide walks you through setting up and running an LLMule provider node. By running a node, you contribute computing power to the network and earn tokens.

Prerequisites

Setup Options

Option 1: Ollama Setup

  1. Install Ollama:
# macOS/Linux
curl https://ollama.ai/install.sh | sh

# Windows
# Download from https://ollama.ai/download

  2. Pull supported models:
# Tiny tier
ollama pull tinyllama

# Small tier
ollama pull mistral:latest

# Medium tier
ollama pull phi-4:latest

  3. Verify the Ollama installation:
ollama list
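
After pulling the models above, you can confirm that each tier is present by checking the `ollama list` output. The script below is a minimal sketch: it checks a captured listing against the three model names used in this guide (the sample listing text and its sizes are illustrative, not real output); with Ollama installed, replace the here-doc with `listing=$(ollama list)`.

```shell
#!/bin/sh
# Sketch: verify the supported models appear in `ollama list` output.
# The captured listing below is illustrative; with Ollama installed,
# use: listing=$(ollama list)
listing=$(cat <<'EOF'
NAME              ID      SIZE    MODIFIED
tinyllama:latest  abc123  637 MB  2 days ago
mistral:latest    def456  4.1 GB  2 days ago
EOF
)

for model in tinyllama mistral phi-4; do
  if printf '%s\n' "$listing" | grep -q "^$model"; then
    echo "$model: available"
  else
    echo "$model: missing (run: ollama pull $model)"
  fi
done
```

With the sample listing, this reports tinyllama and mistral as available and phi-4 as missing.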

Option 2: LM Studio Setup

  1. Download LM Studio:
  2. Configure LM Studio:
  3. Load Models:
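
Once LM Studio's local server is enabled, it exposes an OpenAI-compatible API, by default at http://localhost:1234/v1 (the port is an assumption; check LM Studio's server settings if you changed it). A quick reachability check:

```shell
#!/bin/sh
# Sketch: check that LM Studio's local server is reachable.
# Port 1234 is LM Studio's default; adjust if configured differently.
base_url="http://localhost:1234/v1"

if curl -s --max-time 2 "$base_url/models" >/dev/null 2>&1; then
  echo "LM Studio server reachable at $base_url"
else
  echo "server not reachable - enable the local server in LM Studio"
fi
```

A successful response to `GET $base_url/models` lists the currently loaded models.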

LLMule Client Setup