author    Nate Sesti <sestinj@gmail.com>    2023-08-01 01:34:05 -0700
committer Nate Sesti <sestinj@gmail.com>    2023-08-01 01:34:05 -0700
commit    932fadb0187d939f5f746ab7118528e04bf64bec (patch)
tree      bb304b03c89e1c5ea356d0f521755ceb45155cfd /docs
parent    5dcdcd81da2050825212e216bf5e7e69678d8c6e (diff)
docs: :memo: ollama customization docs
Diffstat (limited to 'docs')
-rw-r--r--  docs/docs/customization.md  |  4 ++++
1 file changed, 4 insertions(+), 0 deletions(-)
diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 60764527..4226b4d3 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -53,6 +53,10 @@ config = ContinueConfig(
Continue will automatically prompt you for your Anthropic API key, which must have access to Claude 2. You can request early access [here](https://www.anthropic.com/earlyaccess).
+### Run Llama-2 locally with Ollama
+
+[Ollama](https://ollama.ai/) is a Mac application that makes it easy to run open-source models, including Llama-2, locally. Download the app from the website; it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Configure Continue by importing `from continuedev.libs.llm.ollama import Ollama` and setting `default=Ollama(model="llama-2")`.
+
### Local models with ggml
See our [5 minute quickstart](https://github.com/continuedev/ggml-server-example) to run any model locally with ggml. While these models don't yet perform as well, they are free, entirely private, and run offline.
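
For context, here is a minimal sketch of how the Ollama setting added in this commit might fit into a Continue config file. Only the `Ollama` import and `default=Ollama(model="llama-2")` come from the added docs text; the `ContinueConfig` import path and the surrounding structure are assumptions inferred from the `config = ContinueConfig(` context in the hunk header, not something this diff confirms.

```python
# A minimal sketch, not a verified configuration. Only the Ollama import and
# the default=Ollama(model="llama-2") setting come from the docs text above;
# the ContinueConfig import path and the enclosing structure are assumptions
# based on the `config = ContinueConfig(` context shown in the hunk header.
from continuedev.core.config import ContinueConfig
from continuedev.libs.llm.ollama import Ollama

config = ContinueConfig(
    # Point Continue's default model at a locally running Ollama instance.
    default=Ollama(model="llama-2"),
)
```

Note that Ollama serves models over a local HTTP API (by default on `http://localhost:11434`), so the app must be running before Continue can reach it.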