Ollama
Set up the EC2 environment
Start from a basic EC2 install (Amazon Linux 2023 on a GPU instance).
Install the NVIDIA driver and CUDA toolkit
wget https://developer.download.nvidia.com/compute/cuda/12.9.0/local_installers/cuda-repo-amzn2023-12-9-local-12.9.0_575.51.03-1.x86_64.rpm
sudo rpm -i cuda-repo-amzn2023-12-9-local-12.9.0_575.51.03-1.x86_64.rpm
sudo dnf clean all
sudo dnf -y install cuda-toolkit-12-9
echo "export PATH=/usr/local/cuda-12.9/bin:$PATH" >> ~/.zshrc
. ~/.zshrc
nvcc -V
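If CUDA libraries are needed at runtime, the library path can be exported the same way (a minimal addition; the path assumes the default CUDA 12.9 install location):
echo 'export LD_LIBRARY_PATH=/usr/local/cuda-12.9/lib64:$LD_LIBRARY_PATH' >> ~/.zshrc
. ~/.zshrc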
sudo dnf -y install kernel-devel-$(uname -r) kernel-headers-$(uname -r) kernel-modules-extra-$(uname -r)
sudo dnf -y module install nvidia-driver:latest-dkms
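A reboot may be needed before the new kernel module loads; afterwards, verify the GPU is visible to the driver:
nvidia-smi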
Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
After the install script finishes, check its output for the NVIDIA GPU installed message.
- If it does not appear, re-run the install script once the NVIDIA driver is loaded.
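To double-check the service after installation (the install script registers a systemd unit named ollama), something like:
ollama -v
sudo systemctl status ollama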
Move the Ollama model store to /data (e.g. a larger data volume) and symlink it back:
sudo su
mv /usr/share/ollama/.ollama/ /data/.ollama/
ln -s /data/.ollama /usr/share/ollama/.ollama
ls -al /usr/share/ollama
exit
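If the installer created a dedicated ollama user (the script's default), the moved directory should remain owned by it; a hedged follow-up:
sudo chown -R ollama:ollama /data/.ollama
sudo systemctl restart ollama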
Pull models
ollama pull llama3.3
ollama pull gemma3
ollama pull phi4
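Confirm the pulls completed and check local model sizes:
ollama list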
Open Ollama to the network
sudo systemctl stop ollama
sudo vi /etc/systemd/system/ollama.service
- Edit the Environment entry in the [Service] section:
Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl daemon-reload
sudo systemctl start ollama
- Confirm the port is open; look for TCP *:11434 (LISTEN):
sudo lsof -i :11434
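From another machine, the API should then be reachable (assuming the EC2 security group allows inbound TCP 11434; <ec2-public-ip> is a placeholder for the instance's address):
curl http://<ec2-public-ip>:11434/api/tags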
Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
uv init example -p 3.11
cd example
uv venv
source .venv/bin/activate
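A quick check that the venv picked up Python 3.11:
python -V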
Install open-webui
uv pip install open-webui
Run open-webui
open-webui serve
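By default open-webui serves on port 8080, so the UI is at http://<ec2-public-ip>:8080 (again, assuming the security group allows it). When open-webui and Ollama run on the same host it should find Ollama at localhost:11434; if not, the backend URL can be set explicitly via open-webui's OLLAMA_BASE_URL setting:
OLLAMA_BASE_URL=http://127.0.0.1:11434 open-webui serve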