Diffstat (limited to 'docs')
-rw-r--r--  docs/docs/customization.md    | 54
-rw-r--r--  docs/docs/getting-started.md  |  4
-rw-r--r--  docs/docs/troubleshooting.md  | 21
3 files changed, 49 insertions(+), 30 deletions(-)
diff --git a/docs/docs/customization.md b/docs/docs/customization.md
index 8b3a87d6..8fe57fdf 100644
--- a/docs/docs/customization.md
+++ b/docs/docs/customization.md
@@ -41,7 +41,7 @@ These classes support any models available through the OpenAI API, assuming your
Import the `AnthropicLLM` LLM class and set it as the default model:
```python
-from continuedev.libs.llm.anthropic import AnthropicLLM
+from continuedev.src.continuedev.libs.llm.anthropic import AnthropicLLM
config = ContinueConfig(
...
@@ -58,7 +58,7 @@ Continue will automatically prompt you for your Anthropic API key, which must ha
[Ollama](https://ollama.ai/) is a Mac application that makes it easy to locally run open-source models, including Llama-2. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/jmorganca/ollama). Continue can then be configured to use the `Ollama` LLM class:
```python
-from continuedev.libs.llm.ollama import Ollama
+from continuedev.src.continuedev.libs.llm.ollama import Ollama
config = ContinueConfig(
...
@@ -72,7 +72,18 @@ config = ContinueConfig(
See our [5 minute quickstart](https://github.com/continuedev/ggml-server-example) to run any model locally with ggml. While these models don't yet perform as well, they are free, entirely private, and run offline.
-Once the model is running on localhost:8000, import the `GGML` LLM class from `continuedev.libs.llm.ggml` and set `default=GGML(max_context_length=2048)`.
+Once the model is running on localhost:8000, change `~/.continue/config.py` to look like this:
+
+```python
+from continuedev.src.continuedev.libs.llm.ggml import GGML
+
+config = ContinueConfig(
+ ...
+ models=Models(
+ default=GGML(max_context_length=2048, server_url="http://localhost:8000")
+ )
+)
+```
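Before pointing Continue at the ggml server, it can help to confirm that something is actually listening on `localhost:8000`. A minimal sketch using only the standard library (the `server_is_up` helper is illustrative, not part of Continue):

```python
import urllib.request
import urllib.error


def server_is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server responds at `url`, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server answered, even if with an error status code.
        return True
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, double-check that the ggml server from the quickstart is still running before editing `config.py`.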
### Self-hosting an open-source model
@@ -85,7 +96,7 @@ If by chance the provider has the exact same API interface as OpenAI, the `GGML`
If you'd like to use OpenAI models but are concerned about privacy, you can use the Azure OpenAI service, which is GDPR and HIPAA compliant. After applying for access [here](https://azure.microsoft.com/en-us/products/ai-services/openai-service), you will typically hear back within only a few days. Once you have access, instantiate the model like so:
```python
-from continuedev.libs.llm.openai import OpenAI, OpenAIServerInfo
+from continuedev.src.continuedev.libs.llm.openai import OpenAI, OpenAIServerInfo
config = ContinueConfig(
...
@@ -110,7 +121,7 @@ You can write your own system message, a set of instructions that will always be
System messages can also reference files. For example, if there is a markdown file (e.g. at `/Users/nate/Documents/docs/reference.md`) you'd like the LLM to know about, you can reference it with [Mustache](http://mustache.github.io/mustache.5.html) templating like this: "Please reference this documentation: {{ Users/nate/Documents/docs/reference.md }}". As of now, you must use an absolute path.
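Continue handles this with Mustache templating; to illustrate the idea, here is a simplified stand-in (not Continue's actual code) that substitutes each `{{ path }}` reference with the contents of that file:

```python
import re
from pathlib import Path


def render_file_refs(system_message: str) -> str:
    """Replace each {{ path }} reference with the contents of that file."""
    def replace(match: re.Match) -> str:
        # The docs show paths without a leading slash inside the braces,
        # but require an absolute path, so re-add the slash here.
        path = "/" + match.group(1).strip().lstrip("/")
        return Path(path).read_text()

    return re.sub(r"\{\{\s*([^}]+?)\s*\}\}", replace, system_message)
```

For example, `render_file_refs("Please reference this documentation: {{ Users/nate/Documents/docs/reference.md }}")` would inline the contents of that file into the system message.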
-## Custom Commands
+## Custom Commands with Natural Language Prompts
You can add custom slash commands by adding a `CustomCommand` object to the `custom_commands` property. Each `CustomCommand` has
@@ -141,6 +152,39 @@ config = ContinueConfig(
)
```
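Assuming the `CustomCommand` fields are `name`, `description`, and `prompt` (check the version you have installed), a natural-language command might look like this sketch:

```python
config = ContinueConfig(
    ...
    custom_commands=[
        CustomCommand(
            name="test",
            description="Write unit tests for the highlighted code",
            prompt="Write a comprehensive set of unit tests for the selected code, covering edge cases.",
        )
    ]
)
```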
+## Custom Slash Commands
+
+If you want to go a step further than writing custom commands with natural language, you can use a `SlashCommand` to run an arbitrary Python function, with access to the Continue SDK. To do this, create a subclass of `Step` that implements the `run` method; this is the code that will run when you call the command. For example, here is a step that generates a commit message:
+
+```python
+import subprocess
+
+class CommitMessageStep(Step):
+ async def run(self, sdk: ContinueSDK):
+
+ # Get the root directory of the workspace
+ dir = sdk.ide.workspace_directory
+
+ # Run git diff in that directory
+ diff = subprocess.check_output(
+ ["git", "diff"], cwd=dir).decode("utf-8")
+
+ # Ask the LLM to write a commit message,
+ # and set it as the description of this step
+ self.description = await sdk.models.default.complete(
+ f"{diff}\n\nWrite a short, specific (less than 50 chars) commit message about the above changes:")
+
+config = ContinueConfig(
+ ...
+ slash_commands=[
+ ...
+ SlashCommand(
+ name="commit",
+ description="Generate a commit message for the current changes",
+ step=CommitMessageStep,
+ )
+ ]
+)
+```
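Large diffs can blow past the model's context window. One way to guard against that (a sketch; `build_commit_prompt` is a hypothetical helper, not part of the Continue SDK) is to truncate the diff before building the prompt:

```python
def build_commit_prompt(diff: str, max_chars: int = 4000) -> str:
    """Build the LLM prompt, truncating oversized diffs from the front
    so the most recent hunks are kept."""
    if len(diff) > max_chars:
        diff = "...(truncated)...\n" + diff[-max_chars:]
    return (
        f"{diff}\n\n"
        "Write a short, specific (less than 50 chars) "
        "commit message about the above changes:"
    )
```

Inside the step above, you would pass the `git diff` output through this helper before calling `sdk.models.default.complete`.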
+
## Temperature
Set `temperature` to any value between 0 and 1. Higher values will make the LLM more creative, while lower values will make it more predictable. The default is 0.5.
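Since values outside of [0, 1] are not meaningful here, a config could defensively clamp user input before passing it along (an illustrative helper, not part of Continue):

```python
def clamp_temperature(value: float) -> float:
    """Clamp a temperature setting into the valid [0, 1] range."""
    return max(0.0, min(1.0, value))
```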
diff --git a/docs/docs/getting-started.md b/docs/docs/getting-started.md
index 9f70e2de..36555d9a 100644
--- a/docs/docs/getting-started.md
+++ b/docs/docs/getting-started.md
@@ -1,9 +1,5 @@
# Getting started
-:::note
-Continue requires that you have Python 3.8 or greater. If you do not, please [install](https://python.org) it
-:::
-
1. Click `Install` on the **[Continue extension in the Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue)**
2. This will open the Continue extension page in VS Code, where you will need to click `Install` again
diff --git a/docs/docs/troubleshooting.md b/docs/docs/troubleshooting.md
index fcf58336..fc157d9e 100644
--- a/docs/docs/troubleshooting.md
+++ b/docs/docs/troubleshooting.md
@@ -2,23 +2,6 @@
The Continue VS Code extension is currently in beta. It will attempt to start the Continue Python server locally for you, but sometimes this will fail, causing the "Starting Continue server..." not to disappear, or other hangups. While we are working on fixes to all of these problems, there are a few things you can do to temporarily troubleshoot:
-## For Windows Users
-
-In order to activate the Continue virtual environment, you must enable running scripts in PowerShell. In this case, the following error will appear in the console:
-
-> A Python virtual enviroment cannot be activated because running scripts is disabled for this user. In order to use Continue, please enable signed scripts to run with this command in PowerShell: `Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser`, reload VS Code, and then try again.
-
-Please open PowerShell, run the command (`Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser`), and reload VS Code.
-
-## For Linux Users
-
-Linux does not come pre-installed with `python3-venv`, which is needed for Continue to donwload dependencies and run the local server. If you see a related error, you can solve by:
-
-1. Checking your Python version with `python3 --version`
-2. Update `apt-get` with `apt-get update`
-3. Install the correct version of `python3-venv` by running `apt-get install python3.X-venv` (replacing `3.X` with your Python version. For example, if you saw "Python 3.11.4", this should be "3.11")
-4. Reload VS Code
-
## Reload VS Code
Open the command palette with cmd+shift+p, then type "Reload Window" and select it. This will give Continue another chance to start the server.
@@ -54,10 +37,6 @@ If your Continue server is not setting up, try checking the console logs:
3. Select `Console`
4. Read the console logs
-## Python requirement
-
-Continue requires that you have Python 3.8 or greater. You can check what version you have by running either `python3 --version` or `python --version` from a terminal. If the version is not 3.8.x or higher, please [install](https://python.org) it
-
## Still having trouble?
Create a GitHub issue [here](https://github.com/continuedev/continue/issues/new?assignees=&labels=bug&projects=&template=bug-report-%F0%9F%90%9B.md&title=), leaving the details of your problem, and we'll be able to more quickly help you out.