author Ty Dunn <ty@tydunn.com> 2023-08-16 09:15:01 -0700
committer GitHub <noreply@github.com> 2023-08-16 09:15:01 -0700
commit 1881be67acf279363bb2a6fbb6a69957436c576c (patch)
tree 6e5d36c8c60cd06baa931b5c579cdcea241ee68c /docs
parent ba9c046171323c3e56ec6ba4160ba7b9457c832f (diff)
Adding Together API docs
Diffstat (limited to 'docs')
-rw-r--r-- docs/docs/customization.md | 20
1 file changed, 20 insertions, 0 deletions
diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 2d1cb1a0..f7244da8 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -7,6 +7,8 @@ Continue can be deeply customized by editing the `ContinueConfig` object in `~/.
In `config.py`, you'll find the `models` property:
```python
+from continuedev.src.continuedev.core.sdk import Models
+
config = ContinueConfig(
...
models=Models(
@@ -96,6 +98,24 @@ config = ContinueConfig(
)
```
+### Together
+
+The Together API is a cloud platform for running large AI models. You can sign up [here](https://api.together.xyz/signup), copy your API key from the initial welcome screen, and then hit the play button on any model from the [Together Models list](https://docs.together.ai/docs/models-inference) to start it. Then change the config file to look like this:
+
+```python
+from continuedev.src.continuedev.libs.llm.together import TogetherLLM
+
+config = ContinueConfig(
+ ...
+ models=Models(
+ default=TogetherLLM(
+ api_key="<API_KEY>",
+ model="togethercomputer/llama-2-13b-chat"
+ )
+ )
+)
+```
+
### Replicate (beta)
Replicate is a great option for newly released language models or models that you've deployed through their platform. Sign up for an account [here](https://replicate.ai/), copy your API key, and then select any model from the [Replicate Streaming List](https://replicate.com/collections/streaming-language-models). Change the config file to look like this:
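The Replicate config example itself is cut off in this excerpt. Below is a minimal sketch, not the exact snippet from the docs: it assumes `ReplicateLLM` is importable from `continuedev.src.continuedev.libs.llm.replicate` and accepts the same `api_key` and `model` arguments as `TogetherLLM` above; the model string is a placeholder for any entry from the Replicate Streaming List.

```python
# Sketch only: assumes ReplicateLLM mirrors TogetherLLM's constructor arguments.
from continuedev.src.continuedev.core.sdk import Models
from continuedev.src.continuedev.libs.llm.replicate import ReplicateLLM

config = ContinueConfig(
    ...
    models=Models(
        default=ReplicateLLM(
            api_key="<MY_REPLICATE_API_KEY>",
            # Placeholder: use any model name from the Replicate Streaming List
            model="<MODEL_NAME_FROM_REPLICATE>"
        )
    )
)
```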