From 6c67b381724515306ce85bdf70ac8d854597eb54 Mon Sep 17 00:00:00 2001
From: Nate Sesti
Date: Thu, 24 Aug 2023 09:51:53 -0700
Subject: codellama docs

---
 docs/docs/walkthroughs/codellama.md | 49 +++++++++++++++++++++++++++++++++++++
 1 file changed, 49 insertions(+)
 create mode 100644 docs/docs/walkthroughs/codellama.md

diff --git a/docs/docs/walkthroughs/codellama.md b/docs/docs/walkthroughs/codellama.md
new file mode 100644
index 00000000..19cfa226
--- /dev/null
+++ b/docs/docs/walkthroughs/codellama.md
@@ -0,0 +1,49 @@
+# Using Code Llama with Continue
+
+With Continue, you can use Code Llama as a drop-in replacement for GPT-4, either locally with Ollama or GGML, or in the cloud through Replicate.
+
+For more general information on customizing Continue, read [our customization docs](../customization.md).
+
+## Ollama
+
+1. Download Ollama [here](https://ollama.ai/) (the installer should walk you through the rest of these steps)
+2. Open a terminal and run `ollama pull codellama:7b-q4_0`\*
+3. Start the model with `ollama run codellama:7b-q4_0`
+4. Change your Continue config file to look like this:
+
+```python
+from continuedev.src.continuedev.core.models import Models
+from continuedev.src.continuedev.libs.llm.ollama import Ollama
+
+config = ContinueConfig(
+    ...
+    models=Models(
+        default=Ollama(model="codellama:7b-q4_0")
+    )
+)
+```
+
+5. Reload the VS Code window for the changes to take effect
+
+\*Only the 7b model is available right now. The others will be ready later today or tomorrow.
+
+## Replicate
+
+1. Get your Replicate API key [here](https://replicate.ai/)
+2. Change your Continue config file to look like this:
+
+```python
+from continuedev.src.continuedev.core.models import Models
+from continuedev.src.continuedev.libs.llm.replicate import ReplicateLLM
+
+config = ContinueConfig(
+    ...
+    models=Models(
+        default=ReplicateLLM(
+            model="",
+            api_key="")
+    )
+)
+```
+
+3. Reload the VS Code window for the changes to take effect
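
Once the Ollama step above has a model running, a quick smoke test can confirm the local server responds before wiring it into Continue. This is a sketch and not part of Continue or this patch: it assumes Ollama's default port 11434 and its `/api/generate` endpoint, and the helper names here are made up for illustration.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: port 11434, /api/generate)
OLLAMA_URL = "http://localhost:11434/api/generate"


def make_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single, non-streaming generation."""
    return {"model": model, "prompt": prompt, "stream": False}


def check_ollama(model: str = "codellama:7b-q4_0") -> str:
    """Send a one-off prompt to the local Ollama server and return its reply.

    Requires `ollama run codellama:7b-q4_0` (or equivalent) to be active.
    """
    payload = json.dumps(make_generate_request(model, "Say hello")).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

If `check_ollama()` returns text, the server is reachable and the same model tag can be used in the `Ollama(model=...)` config entry.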