
---
title: "Running Local LLMs with Ollama"
date: 2026-04-02
draft: false
tags: ["ollama", "llm", "tools", "linux"]
---
Ollama lets you run large language models locally with a single command.

## Quick start

```sh
ollama run llama3
```
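Under the hood, `ollama run` talks to a local server on port 11434, and you can hit the same HTTP API yourself. A minimal sketch, assuming the server is running and `llama3` is pulled (`generate` and `collect_response` are my own helper names; the server streams newline-delimited JSON chunks, each carrying a `response` fragment):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def collect_response(ndjson_lines):
    """Concatenate the 'response' fields of Ollama's streaming NDJSON chunks."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # final chunk signals the end of the stream
            break
    return "".join(parts)


def generate(prompt, model="llama3"):
    """Send a prompt to a locally running Ollama server and return the full reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": True}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return collect_response(resp)  # iterating the response yields NDJSON lines
```

No SDK required: the standard library is enough for simple scripting against the API.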

## Why local?

- No API keys or rate limits.
- Data stays on your machine.
- Works offline.
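Because everything runs locally, you can also derive your own variants: a Modelfile bakes a base model, sampling parameters, and a system prompt into a new named model. A minimal sketch (the name `terse-linux` and the prompt are illustrative):

```
# Modelfile — derive a custom model from llama3
FROM llama3
PARAMETER temperature 0.3
SYSTEM """You are a terse Linux assistant. Answer in one or two sentences."""
```

Build it with `ollama create terse-linux -f Modelfile`, then use it like any other model: `ollama run terse-linux`.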

## Models I use

- `llama3` — general purpose
- `codellama` — code generation
- `nomic-embed-text` — embeddings for vector search
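For the vector-search case, Ollama exposes an embeddings endpoint alongside generation. A sketch of fetching vectors with `nomic-embed-text` and ranking them by cosine similarity (`embed` and `cosine` are my own helper names; assumes a running local server with the model pulled):

```python
import json
import math
import urllib.request

EMBED_URL = "http://localhost:11434/api/embeddings"  # Ollama's default local endpoint


def embed(text, model="nomic-embed-text"):
    """Fetch an embedding vector from a locally running Ollama server."""
    payload = json.dumps({"model": model, "prompt": text}).encode()
    req = urllib.request.Request(
        EMBED_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]


def cosine(a, b):
    """Cosine similarity — the usual ranking metric for vector search."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

To search, embed each document once, store the vectors, then embed the query and rank documents by `cosine(query_vec, doc_vec)`.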