From 932fadb0187d939f5f746ab7118528e04bf64bec Mon Sep 17 00:00:00 2001
From: Nate Sesti
Date: Tue, 1 Aug 2023 01:34:05 -0700
Subject: docs: :memo: ollama customization docs

---
 docs/docs/customization.md | 4 ++++
 1 file changed, 4 insertions(+)

(limited to 'docs')

diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 60764527..4226b4d3 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -53,6 +53,10 @@ config = ContinueConfig(
 
 Continue will automatically prompt you for your Anthropic API key, which must have access to Claude 2. You can request early access [here](https://www.anthropic.com/earlyaccess).
 
+### Run Llama-2 locally with Ollama
+
+[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Configure Continue by importing `from continuedev.libs.llm.ollama import Ollama` and setting `default=Ollama(model="llama-2")`.
+
 ### Local models with ggml
 
 See our [5 minute quickstart](https://github.com/continuedev/ggml-server-example) to run any model locally with ggml. While these models don't yet perform as well, they are free, entirely private, and run offline.
-- 
cgit v1.2.3-70-g09d2
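The configuration step the patch documents can be sketched as a Continue config file. This is a hypothetical sketch: the patch only confirms the `Ollama` import path and the `default=Ollama(model="llama-2")` setting, while the `ContinueConfig` import path and surrounding structure are assumed from the `config = ContinueConfig(` context line in the diff.

```python
# Hypothetical sketch of a Continue config file (e.g. ~/.continue/config.py).
# Confirmed by the patch: the Ollama import and default=Ollama(model="llama-2").
# Assumed: the ContinueConfig import path and any other fields.
from continuedev.core.config import ContinueConfig
from continuedev.libs.llm.ollama import Ollama

config = ContinueConfig(
    # Route completions to a locally running Ollama server serving Llama-2.
    default=Ollama(model="llama-2"),
)
```

Ollama serves models over a local HTTP endpoint, so this keeps all inference on-device; the app must be running before Continue can connect.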