author	Nate Sesti <sestinj@gmail.com>	2023-07-27 14:07:43 -0700
committer	Nate Sesti <sestinj@gmail.com>	2023-07-27 14:07:43 -0700
commit	64f4750e5accdb619a447c9c36688454d9f9ec68 (patch)
tree	3d54f432d7bed3689eef55ed8a106b540fc7a526
parent	d3b4103cd2f639fc072b8a3269d7730478c8bb1c (diff)
parent	91c1299cf0ba01a1f9d3694cfa88cdf6fc804f45 (diff)
Merge branch 'main' of https://github.com/continuedev/continue
-rw-r--r--	CONTRIBUTING.md	8
-rw-r--r--	docs/docs/customization.md	6
2 files changed, 10 insertions, 4 deletions
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f7166411..a958777f 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -64,11 +64,11 @@ If editing the VS Code extension (`/extension` directory) or GUI (`/extension/re
### Writing Steps
-A Step can be used as a custom slash command, or called otherwise in a `Policy`. See the [steps README](./continuedev/src/continuedev/steps/README.md) to learn how to write a Step.
+A Step can be used as a custom slash command, or called otherwise in a `Policy`. See the [steps README](./continuedev/src/continuedev/plugins/steps/README.md) to learn how to write a Step.
### Writing Context Providers
-A `ContextProvider` is a Continue plugin that lets you type '@' to quickly select documents as context for the language model. The simplest way to create a `ContextProvider` is to implement the `provide_context_items` method. You can find a great example of this in [GitHubIssuesContextProvider](./continuedev/src/continuedev/libs/context_providers/github_issues.py), which allows you to search GitHub Issues in a repo.
+A `ContextProvider` is a Continue plugin that lets you type '@' to quickly select documents as context for the language model. The simplest way to create a `ContextProvider` is to implement the `provide_context_items` method. You can find a great example of this in [GitHubIssuesContextProvider](./continuedev/src/continuedev/plugins/context_providers/github.py), which allows you to search GitHub Issues in a repo.
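The shape of a provider can be sketched with a toy stand-in. Everything below is hypothetical and does not reproduce Continue's actual classes: the `ContextItem` dataclass, the base-class stub, and `TodoCommentsProvider` are invented for illustration; only the `provide_context_items` method name comes from the text above.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ContextItem:
    """A document offered as context to the LLM (illustrative stand-in)."""
    title: str
    content: str


class ContextProvider:
    """Minimal stand-in for Continue's ContextProvider base class."""
    title: str = "base"

    def provide_context_items(self) -> List[ContextItem]:
        raise NotImplementedError


class TodoCommentsProvider(ContextProvider):
    """Toy provider: surfaces TODO comments from a source string,
    so typing '@todos' could offer them as context items."""
    title = "todos"

    def __init__(self, source: str):
        self.source = source

    def provide_context_items(self) -> List[ContextItem]:
        # One ContextItem per line containing a TODO marker.
        return [
            ContextItem(title=f"TODO line {i}", content=line.strip())
            for i, line in enumerate(self.source.splitlines(), start=1)
            if "TODO" in line
        ]
```

A real provider would fetch from an external source (as the GitHub Issues example does) rather than scan a string, but the contract is the same: return a list of items the user can pick from.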
## 📐 Continue Architecture
@@ -126,11 +126,11 @@ Everything in Continue is a "Step". The `Step` class defines 2 methods:
2. `async def describe(self, models: Models) -> Coroutine[str, None, None]` - After each Step is run, this method is called to asynchronously generate a summary title for the step. A `Models` object is passed so that you have access to LLMs to summarize for you.
-Steps are designed to be composable, so that you can easily build new Steps by combining existing ones. And because they are Pydantic models, they can instantly be used as tools usable by an LLM, for example with OpenAI's function-calling functionality (see [ChatWithFunctions](./continuedev/src/continuedev/steps/chat.py) for an example of this).
+Steps are designed to be composable, so that you can easily build new Steps by combining existing ones. And because they are Pydantic models, they can instantly be used as tools usable by an LLM, for example with OpenAI's function-calling functionality (see [ChatWithFunctions](./continuedev/src/continuedev/plugins/steps/chat.py) for an example of this).
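The composability claim can be sketched with toy stand-ins. The `Step` base class below is a simplified assumption (the real `run` signature, `sdk` object, and Pydantic base differ), and `AppendStep`/`SequenceStep` are invented for illustration:

```python
import asyncio


class Step:
    """Minimal stand-in for Continue's Step class (illustrative only)."""

    async def run(self, sdk) -> None:
        raise NotImplementedError

    async def describe(self, models) -> str:
        # Default summary title; real Steps may ask an LLM via `models`.
        return self.__class__.__name__


class AppendStep(Step):
    """Toy Step that appends text to a shared buffer on the sdk."""

    def __init__(self, text: str):
        self.text = text

    async def run(self, sdk) -> None:
        sdk.buffer.append(self.text)


class SequenceStep(Step):
    """Composition: a Step built by running child Steps in order."""

    def __init__(self, steps):
        self.steps = steps

    async def run(self, sdk) -> None:
        for step in self.steps:
            await step.run(sdk)
```

The point is structural: because `SequenceStep` is itself a `Step`, composites nest arbitrarily and can be run, described, or exposed as tools like any primitive Step.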
Some of the most commonly used Steps are:
-- [`SimpleChatStep`](./continuedev/src/continuedev/steps/chat.py) - This is the default Step that is run when the user enters natural language input. It takes the user's input and runs it through the default LLM, then displays the result in the GUI.
+- [`SimpleChatStep`](./continuedev/src/continuedev/plugins/steps/chat.py) - This is the default Step that is run when the user enters natural language input. It takes the user's input and runs it through the default LLM, then displays the result in the GUI.
- [`EditHighlightedCodeStep`](./continuedev/src/continuedev/steps/core/core.py) - This is the Step run when a user highlights code, enters natural language, and presses CMD/CTRL+ENTER, or uses the slash command '/edit'. It opens a side-by-side diff editor, where updated code is streamed to fulfil the user's request.
diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 46dd0b0d..c768c97d 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -25,6 +25,12 @@ If you have access, simply set `default_model` to the model you would like to us
See our [5 minute quickstart](https://github.com/continuedev/ggml-server-example) to run any model locally with ggml. While these models don't yet perform as well, they are free, entirely private, and run offline.
+### Self-hosting an open-source model
+
+If you want to self-host on Colab, RunPod, Replicate, HuggingFace, Haven, or another hosting provider, you will need to wire up a new LLM class. It only needs to implement 3 methods: `stream_complete`, `complete`, and `stream_chat`; you can see examples in `continuedev/src/continuedev/libs/llm`.
+
+If the provider happens to expose exactly the same API interface as OpenAI, the `GGML` class will work for you out of the box after you change the endpoint at the top of the file.
+
### Azure OpenAI Service
If you'd like to use OpenAI models but are concerned about privacy, you can use the Azure OpenAI service, which is GDPR and HIPAA compliant. After applying for access [here](https://azure.microsoft.com/en-us/products/ai-services/openai-service), you will typically hear back within only a few days. Once you have access, set `default_model` to "gpt-4", and then set the `azure_openai_info` property in the `ContinueConfig` like so: