From 2447182803877ac2d117d8353f652d62cc63d352 Mon Sep 17 00:00:00 2001
From: Nate Sesti
Date: Tue, 1 Aug 2023 09:13:55 -0700
Subject: docs: :memo: make ollama docs more clear

---
 docs/docs/customization.md | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

(limited to 'docs')

diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 4226b4d3..22fcbb3d 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -55,7 +55,18 @@ Continue will automatically prompt you for your Anthropic API key, which must ha
 
 ### Run Llama-2 locally with Ollama
 
-[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Configure Continue by importing `from continuedev.libs.llm.ollama import Ollama` and setting `default=Ollama(model="llama-2")`.
+[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Continue can then be configured to use the `Ollama` LLM class:
+
+```python
+from continuedev.libs.llm.ollama import Ollama
+
+config = ContinueConfig(
+    ...
+    models=Models(
+        default=Ollama(model="llama2")
+    )
+)
+```
 
 ### Local models with ggml
 
-- 
cgit v1.2.3-70-g09d2