# Easy Ollama Plugin and Samples for Unity
This guide explains how to set up Ollama on Windows for Unity Editor / Standalone builds.
There are two common ways to prepare models for Ollama:

- **Method A: `ollama pull`** (recommended)
- **Method B: Direct GGUF placement** (advanced/custom)
| Scenario | Recommended | Reason |
|---|---|---|
| Typical use | Method A | Simple and reliable |
| Custom unsupported model | Method B | Bring your own GGUF |
| Fine-tuned parameter control | Method B | Use Modelfile |
Expected directory layout:

```
Assets/StreamingAssets/.EasyLocalLLM/Ollama/
├── ollama.exe
├── lib/
└── models/
```
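Before starting the server, it can help to verify that the layout above is in place. A minimal sketch in Python (the plugin itself is C#; the `layout_ok` helper name is hypothetical, the entry names come from the tree above):

```python
from pathlib import Path

# Entries expected under the Ollama root, per the layout above.
REQUIRED = ["ollama.exe", "lib", "models"]

def layout_ok(ollama_root: str) -> list[str]:
    """Return the names of required entries missing under the Ollama root."""
    root = Path(ollama_root)
    return [name for name in REQUIRED if not (root / name).exists()]

missing = layout_ok("Assets/StreamingAssets/.EasyLocalLLM/Ollama")
if missing:
    print("Missing entries:", ", ".join(missing))
```

Running this from the project root prints any missing entries before you attempt `ollama.exe serve`.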
## Method A: `ollama pull` (recommended)

Point `OLLAMA_MODELS` at the project's `StreamingAssets` folder, then start the server:

```powershell
$env:OLLAMA_MODELS="<ProjectPath>\Assets\StreamingAssets\.EasyLocalLLM\Ollama\models"
mkdir $env:OLLAMA_MODELS
cd "<ProjectPath>\Assets\StreamingAssets\.EasyLocalLLM\Ollama"
.\ollama.exe serve
```

With the server running, pull a model from a second terminal:

```powershell
cd "<ProjectPath>\Assets\StreamingAssets\.EasyLocalLLM\Ollama"
.\ollama.exe pull mistral
```
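To confirm the pull succeeded, the server's `GET /api/tags` endpoint lists locally installed models. A small Python sketch (illustrative only; the helper names are hypothetical, and Ollama tags pulled models like `mistral:latest`, so we match on the base name):

```python
import json
from urllib.request import urlopen

def parse_installed_models(tags_json: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in tags_json.get("models", [])]

def model_installed(tags_json: dict, name: str) -> bool:
    # Match "mistral" against both "mistral" and "mistral:latest".
    return any(n == name or n.startswith(name + ":")
               for n in parse_installed_models(tags_json))

# Usage against a running server (uncomment once `ollama.exe serve` is up):
# with urlopen("http://localhost:11434/api/tags") as resp:
#     print(model_installed(json.load(resp), "mistral"))
```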
After the pull completes, `blobs/` and `manifests/` are created under `StreamingAssets/.EasyLocalLLM/Ollama/models/`.

## Method B: Direct GGUF placement (advanced/custom)

Place your GGUF file in its own folder under `models/`. Example:

```
Assets/StreamingAssets/.EasyLocalLLM/Ollama/models/mistral/
└── mistral-7b-instruct-v0.1.Q4_K_M.gguf
```
Create a `Modelfile` next to the GGUF file:

```
FROM ./your-model-name.Q4_K_M.gguf
PARAMETER temperature 0.7
PARAMETER top_k 40
PARAMETER top_p 0.9
```
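If you script model setup, the `Modelfile` above is just plain text and can be generated. A minimal Python sketch (the `build_modelfile` helper is hypothetical, not part of the plugin; the `FROM`/`PARAMETER` syntax matches the example above):

```python
def build_modelfile(gguf_filename: str, params: dict) -> str:
    """Render a minimal Modelfile with a relative FROM path and PARAMETER lines."""
    lines = [f"FROM ./{gguf_filename}"]  # relative path, as Ollama expects here
    for key, value in params.items():
        lines.append(f"PARAMETER {key} {value}")
    return "\n".join(lines) + "\n"

text = build_modelfile("your-model-name.Q4_K_M.gguf",
                       {"temperature": 0.7, "top_k": 40, "top_p": 0.9})
print(text)
```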
Then register the model with Ollama:

```powershell
$env:OLLAMA_MODELS="<ProjectPath>\Assets\StreamingAssets\.EasyLocalLLM\Ollama\models"
cd "<ProjectPath>\Assets\StreamingAssets\.EasyLocalLLM\Ollama\models\mistral"
..\..\ollama.exe create mistral -f ./Modelfile
..\..\ollama.exe list
```
In Unity, configure and start the server from C#:

```csharp
var config = new OllamaConfig
{
    ServerUrl = "http://localhost:11434",
    ExecutablePath = Application.streamingAssetsPath + "/.EasyLocalLLM/Ollama/ollama.exe",
    ModelsDirectory = Application.streamingAssetsPath + "/.EasyLocalLLM/Ollama/models",
    DefaultModelName = "mistral",
    AutoStartServer = true,
    DebugMode = true
};

OllamaServerManager.Initialize(config);
var client = LLMClientFactory.CreateOllamaClient(config);
```
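Under the hood, a client like the one above talks to the server over HTTP. For reference, a non-streaming completion request against Ollama's `POST /api/generate` endpoint looks roughly like this Python sketch (the `build_generate_request` helper is hypothetical; the endpoint and payload fields are from Ollama's REST API, and the URL matches `ServerUrl` above):

```python
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://localhost:11434"  # matches ServerUrl in the config above

def build_generate_request(model: str, prompt: str) -> Request:
    """Build a non-streaming POST to Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return Request(OLLAMA_URL + "/api/generate", data=body,
                   headers={"Content-Type": "application/json"})

# Usage against a running server (uncomment once `ollama.exe serve` is up):
# with urlopen(build_generate_request("mistral", "Hello!")) as resp:
#     print(json.load(resp)["response"])
```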
## Troubleshooting

- `ollama.exe serve` fails: check for a port conflict with `netstat -an | findstr :11434`
- `Modelfile` reports "not found": ensure `FROM ./...gguf` uses a relative path
- `MaxConcurrentSessions = 1`
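The port-conflict check can also be done programmatically before auto-starting the server. A small, portable Python sketch (the `port_in_use` helper is hypothetical; 11434 is Ollama's default port, as in the config above):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on a successful connection, i.e. port occupied.
        return s.connect_ex((host, port)) == 0

print("11434 in use:", port_in_use(11434))
```

If this reports the port in use before you launch `ollama.exe serve`, stop the conflicting process or configure a different port.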