 docs/docs/walkthroughs/codellama.md | 15 +++++++++++++++
 extension/package-lock.json         |  4 ++--
 extension/package.json              |  2 +-
 3 files changed, 18 insertions(+), 3 deletions(-)
diff --git a/docs/docs/walkthroughs/codellama.md b/docs/docs/walkthroughs/codellama.md
index 68e99948..ff5b9cf3 100644
--- a/docs/docs/walkthroughs/codellama.md
+++ b/docs/docs/walkthroughs/codellama.md
@@ -67,3 +67,18 @@ config = ContinueConfig(
```
3. Reload the VS Code window for changes to take effect
+
+## FastChat API
+1. Set up the FastChat API (https://github.com/lm-sys/FastChat) to use one of the Code Llama models on Hugging Face (e.g. codellama/CodeLlama-7b-Instruct-hf).
+2. Start the OpenAI-compatible API server (see https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md).
+3. Change your Continue config file to look like this:
+
+```python
+config = ContinueConfig(
+    ...
+    models=Models(default=OpenAI(
+        model="CodeLlama-7b-Instruct-hf",
+        openai_server_info={'api_base': 'http://localhost:8000/v1'})),
+)
+```
+4. Reload the VS Code window for changes to take effect.
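A quick way to sanity-check the steps the patch documents is to send a request to the FastChat endpoint yourself. The sketch below is not part of the patch: `build_chat_request` is a hypothetical helper, and it assumes the server from step 2 is listening at `http://localhost:8000/v1`.

```python
import json
import urllib.request

# Hypothetical helper: builds the JSON body for an OpenAI-style
# chat-completion request to the FastChat server configured above.
def build_chat_request(prompt, model="CodeLlama-7b-Instruct-hf"):
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = json.dumps(build_chat_request("Write hello world in Python")).encode()

# To smoke-test the running server from step 2, uncomment:
# req = urllib.request.Request(
#     "http://localhost:8000/v1/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

If the uncommented request returns a JSON response with a `choices` field, the `api_base` in the Continue config above points at a working server.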
diff --git a/extension/package-lock.json b/extension/package-lock.json
index 12487021..ed22f779 100644
--- a/extension/package-lock.json
+++ b/extension/package-lock.json
@@ -1,12 +1,12 @@
{
"name": "continue",
- "version": "0.0.354",
+ "version": "0.0.355",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "continue",
- "version": "0.0.354",
+ "version": "0.0.355",
"license": "Apache-2.0",
"dependencies": {
"@electron/rebuild": "^3.2.10",
diff --git a/extension/package.json b/extension/package.json
index 16d9bce5..13bf7127 100644
--- a/extension/package.json
+++ b/extension/package.json
@@ -1,7 +1,7 @@
{
"name": "continue",
"icon": "media/terminal-continue.png",
- "version": "0.0.354",
+ "version": "0.0.355",
"repository": {
"type": "git",
"url": "https://github.com/continuedev/continue"