Ollama on Linux

Automatic install

Install Ollama by running this one-liner:

curl -fsSL https://ollama.com/install.sh | sh
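
If the script completes successfully, the ollama CLI should be on your PATH. A quick sanity check:

ollama --version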

Manual install

Download the ollama binary

sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama

Adding Ollama as a startup service (recommended)

Create a user for Ollama:

# System account with no login shell, home directory at /usr/share/ollama
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama

Create a service file in /etc/systemd/system/ollama.service:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
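
Before loading the unit, you can optionally sanity-check its syntax with systemd's own verifier (a generic systemd tool, nothing Ollama-specific):

systemd-analyze verify /etc/systemd/system/ollama.service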

To expose the local model server to your LAN, create the following drop-in configuration file on Linux, set the OLLAMA_HOST environment variable to the address Ollama should listen on, and then restart the Ollama service:

sudo mkdir -p /etc/systemd/system/ollama.service.d/
sudo tee /etc/systemd/system/ollama.service.d/environment.conf <<'EOF'
[Service]
Environment=OLLAMA_HOST=0.0.0.0:11434
EOF
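
After writing the drop-in, reload systemd and restart the service; you can then reach the API from another machine on the LAN. Here <server-ip> is a placeholder for this host's address:

sudo systemctl daemon-reload
sudo systemctl restart ollama
# From another machine on the LAN, list the installed models:
curl http://<server-ip>:11434/api/tags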

Next, reload systemd and enable the service so Ollama starts at boot:

sudo systemctl daemon-reload
sudo systemctl enable ollama

Install CUDA drivers (optional – for Nvidia GPUs)

Download and install CUDA from:

https://developer.nvidia.com/cuda-downloads

Verify that the drivers are installed by running the following command, which should print details about your GPU:

nvidia-smi

Install ROCm (optional – for Radeon GPUs)

Download and install ROCm from:

https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html

Make sure to install ROCm v6.
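
Once installed, you can confirm the GPU is visible to the ROCm stack with rocminfo, which ships with ROCm:

rocminfo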

Start Ollama

Start Ollama using systemd:

sudo systemctl start ollama
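
To confirm the service came up, check its status:

sudo systemctl status ollama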

Update

Update ollama by running the install script again:

curl -fsSL https://ollama.com/install.sh | sh

Or by downloading the ollama binary:

sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
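
If Ollama is running as a systemd service, restart it so the new binary takes effect:

sudo systemctl restart ollama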

Viewing logs

To view logs of Ollama running as a startup service, run:

journalctl -e -u ollama
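
To follow the log output live, add the -f flag:

journalctl -u ollama -f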

Uninstall

Remove the ollama service:

sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

Remove the ollama binary from your bin directory:

sudo rm $(which ollama)

Remove the downloaded models and Ollama service user and group:

sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama

Running llama3

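Pull the model (on first run) and chat with it interactively:

ollama run llama3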

Or via the command line:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'

With "stream": false, the API returns the whole generation as a single JSON object rather than a stream of chunks.
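
To extract just the generated text, you can pipe the reply through jq (assuming it is installed); the non-streaming response carries the text in its response field:

curl -s http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}' | jq -r '.response'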

Where downloaded models are stored

macOS: ~/.ollama/models

Linux: /usr/share/ollama/.ollama/models

Windows: C:\Users\<username>\.ollama\models
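
To see how much disk space the downloaded models occupy (Linux path shown; substitute the path for your platform):

sudo du -sh /usr/share/ollama/.ollama/models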

To change where models are stored, set the OLLAMA_MODELS environment variable in the drop-in file to the desired path, then restart the Ollama service:

# Note: tee overwrites the file, so keep any existing Environment= lines
# (e.g. OLLAMA_HOST from above) in the same drop-in.
sudo tee /etc/systemd/system/ollama.service.d/environment.conf <<'EOF'
[Service]
Environment=OLLAMA_MODELS=<path>/OLLAMA_MODELS
EOF
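
Then reload systemd and restart the service so the new path takes effect (the ollama service user needs write access to the new directory):

sudo systemctl daemon-reload
sudo systemctl restart ollama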