author     Nate Sesti <sestinj@gmail.com>  2023-08-01 09:13:55 -0700
committer  Nate Sesti <sestinj@gmail.com>  2023-08-01 09:13:55 -0700
commit     2447182803877ac2d117d8353f652d62cc63d352 (patch)
tree       65a17003182970e7197ae0aee6c1961d31ccc7fb /docs
parent     932fadb0187d939f5f746ab7118528e04bf64bec (diff)
docs: :memo: make ollama docs more clear
Diffstat (limited to 'docs')
-rw-r--r--  docs/docs/customization.md  13
1 file changed, 12 insertions(+), 1 deletion(-)
diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 4226b4d3..22fcbb3d 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -55,7 +55,18 @@ Continue will automatically prompt you for your Anthropic API key, which must ha
### Run Llama-2 locally with Ollama
-[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Configure Continue by importing `from continuedev.libs.llm.ollama import Ollama` and setting `default=Ollama(model="llama-2")`.
+[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Continue can then be configured to use the `Ollama` LLM class:
+
+```python
+from continuedev.libs.llm.ollama import Ollama
+
+config = ContinueConfig(
+ ...
+ models=Models(
+ default=Ollama(model="llama2")
+ )
+)
+```
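Under the hood, an `Ollama`-backed config like the one above talks to a locally running Ollama server over HTTP. As a rough sketch of what such a request looks like, the snippet below builds the JSON body for Ollama's generate endpoint; the endpoint path and field names are assumptions based on Ollama's documented local API, and `build_ollama_request` is a hypothetical helper, not part of Continue.

```python
import json

# Assumed default local endpoint for the Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(model: str, prompt: str) -> str:
    """Serialize a minimal generate-request body for the Ollama API.

    Hypothetical helper for illustration; Continue's Ollama class
    handles this internally.
    """
    payload = {"model": model, "prompt": prompt}
    return json.dumps(payload)

body = build_ollama_request("llama2", "Explain this function.")
print(body)
```

Sending `body` as a POST to `OLLAMA_URL` (with the app running) would stream back completions from the local Llama-2 model.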
### Local models with ggml