Local LLM Installation using Ollama


Step 1: Install Ollama

The first step of the Ollama setup process is to download Ollama from the official website. Navigate to the Download page on the official Ollama website, then select your operating system (Windows, macOS, Linux).

macOS
Linux
Windows
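Once the installer for your platform finishes, it is worth confirming that the ollama binary is actually reachable from your shell. Here is a minimal Python sketch (the find_ollama helper name is my own, not part of Ollama):

```python
import shutil

# Post-install sanity check: is the `ollama` executable on PATH?
# Returns the full path to the binary, or None if it isn't found.
def find_ollama():
    return shutil.which("ollama")

path = find_ollama()
if path:
    print(f"Ollama found at: {path}")
else:
    print("Ollama not found on PATH; re-run the installer or restart your shell.")
```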


Scripts to Execute

start-ollama-server.bat

This script first stops any existing Ollama server instances, then sets the OLLAMA_ORIGINS environment variable to enable browser extension support, and finally spins up a fresh Ollama session.

@echo off
echo Stopping any existing Ollama server...
taskkill /F /IM ollama.exe >nul 2>&1

echo.
echo Starting Ollama server with extension support...

set OLLAMA_ORIGINS=moz-extension://*,chrome-extension://*,safari-web-extension://*
start cmd /k ollama serve

If you would prefer to allow browser extensions beyond the three included in the script above, you can edit the script in seconds using the code snippet attached below. The snippet adds syntax for two additional origin patterns, which can accommodate uncommon cases such as Electron apps or custom home-brew browser extensions.

set OLLAMA_ORIGINS=moz-extension://*,chrome-extension://*,safari-web-extension://*,electron-extension://*,my-custom-extension://*
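If you launch the server from another program rather than from a .bat file, the same configuration can be applied programmatically. A sketch in Python, assuming only that Ollama reads the OLLAMA_ORIGINS environment variable as shown above (the last two origin patterns are illustrative placeholders, just as in the snippet):

```python
import os

# Build the OLLAMA_ORIGINS value from a list, mirroring the snippet above.
origins = [
    "moz-extension://*",
    "chrome-extension://*",
    "safari-web-extension://*",
    "electron-extension://*",   # illustrative, as in the snippet above
    "my-custom-extension://*",  # illustrative placeholder
]
env = os.environ.copy()
env["OLLAMA_ORIGINS"] = ",".join(origins)
# subprocess.Popen(["ollama", "serve"], env=env)  # launch with the extended origins
print(env["OLLAMA_ORIGINS"])
```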

Common Errors

Port 11434 is the default port used by the locally hosted Ollama API server. When you run ollama run llama3 or access http://localhost:11434, you are talking to Ollama's backend service. If another process is using port 11434, or if a previous Ollama session didn't shut down properly, it can block new sessions from starting up cleanly.
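Before digging into specific errors, you can check whether anything is currently listening on the default port. A minimal Python sketch (port_in_use is my own helper name):

```python
import socket

# Check whether something is already listening on a given port.
# 11434 is Ollama's documented default.
def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. something is listening on that port.
        return s.connect_ex((host, port)) == 0

print("Port 11434 in use:", port_in_use(11434))
```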


Below are some of the common errors and behaviors I have experienced when running Ollama locally on my machine; these errors are specifically related to port 11434.

  • error: listen tcp 127.0.0.1:11434 bind: address already in use
  • error: failed to connect to Ollama server at localhost:11434
  • error: (behavior): Ollama binds to port 11434, then exits/crashes
  • error: (behavior): No response from http://localhost:11434
  • error: (behavior): Extensive hang time when running models
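To distinguish "the port is blocked" from "the server is up but hanging," you can probe the server directly, since a running Ollama instance answers plain HTTP on http://localhost:11434. A minimal Python sketch (probe_ollama is my own helper name):

```python
import urllib.error
import urllib.request

# Probe the local Ollama server; returns the response body string,
# or None if nothing answers on the port.
def probe_ollama(url: str = "http://localhost:11434", timeout: float = 2.0):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError):
        return None

body = probe_ollama()
print("Server responded:" if body else "No response on 11434.", body or "")
```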

fix-and-start-ollama.bat

This script should be run when your Ollama session isn't starting correctly, typically due to port conflicts on port 11434. If your Ollama session is experiencing any of the errors mentioned above, run this script to resolve them before any other troubleshooting steps.

@echo off
echo Finding process using port 11434...

FOR /F "tokens=5" %%P IN ('netstat -aon ^| findstr :11434 ^| findstr LISTENING') DO (
    echo Terminating process using port 11434: PID %%P
    taskkill /PID %%P /F >nul 2>&1
)

echo.
echo Setting OLLAMA_ORIGINS environment variable...
set OLLAMA_ORIGINS=moz-extension://*,chrome-extension://*,safari-web-extension://*

echo.
echo Starting Ollama server...
start cmd /k ollama serve
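For reference, the FOR loop above extracts the fifth whitespace-separated token (the PID) from each LISTENING line of netstat -aon output. The same parsing step can be sketched in Python if you want to script this check yourself (the function name and sample line below are illustrative):

```python
def pids_listening_on(netstat_output: str, port: int) -> list[str]:
    """Extract PIDs from Windows `netstat -aon` output for a given port.
    Mirrors the batch FOR loop: take token 5 of each LISTENING line."""
    pids = []
    for line in netstat_output.splitlines():
        parts = line.split()
        # Expected shape: Proto  Local Address  Foreign Address  State  PID
        if len(parts) == 5 and parts[3] == "LISTENING" and parts[1].endswith(f":{port}"):
            pids.append(parts[4])
    return pids

sample = "  TCP    127.0.0.1:11434    0.0.0.0:0    LISTENING    4321"
print(pids_listening_on(sample, 11434))  # ['4321']
```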

Extended Troubleshooting

If you are still having issues (and this applies any time you are running Ollama locally, not just during initial setup and installation), run the script I have attached below.
It is ideal for those times when you just want to throw your PC out the window and start with a clean slate. The script serves a variety of purposes:

  1. Diagnostic Tool: runs system checks and installation verification processes (i.e., proper PATH setup validation)
  2. Logging: documents diagnostic findings in logs exported as .log and .html files
  3. Error Summary: breaks down the core issues preventing Ollama from functioning properly
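As a rough sketch of points 1 and 2 (the check set, file name, and helper name below are illustrative assumptions, not the real script's internals), a diagnostic pass might look like this in Python:

```python
import datetime
import shutil
from pathlib import Path

# Illustrative sketch: run a couple of checks and log the findings.
def run_diagnostics(log_path: str = "ollama-diagnostics.log") -> Path:
    checks = {
        "ollama on PATH": shutil.which("ollama") is not None,
    }
    lines = [f"Ollama diagnostics - {datetime.datetime.now().isoformat()}"]
    lines += [f"[{'OK' if ok else 'FAIL'}] {name}" for name, ok in checks.items()]
    path = Path(log_path)
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return path

log = run_diagnostics()
print(log.read_text())
```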

Resources

Communities