Age | Commit message | Author |
2023-09-02 | fix: :bug: remove empty grammar from llama_cpp_args | Nate Sesti |
2023-09-02 | fix: :bug: fix timeout type | Nate Sesti |
2023-09-02 | feat: :sparkles: set session timeout on GGML requests | Nate Sesti |
2023-09-02 | fix: :bug: fix usages of LLM.complete | Nate Sesti |
2023-09-02 | fix: :bug: llamacpp fix indexing max_tokens | Nate Sesti |
2023-09-02 | fix: :loud_sound: better logging for ggml completion endpoint | Nate Sesti |
2023-09-02 | fix: error printing bug leading to uncaught err | Nate Sesti |
2023-09-02 | docs: :memo: update docs for openaiserverinfo | Nate Sesti |
2023-09-02 | fix: :bug: number of bug fixes | Nate Sesti |
2023-09-02 | fix: :bug: don't fail on disconnected websocket | Nate Sesti |
2023-09-01 | fix: :bug: avoid removing disallowed file windows | Nate Sesti |
2023-09-01 | refactor: :recycle: refactoring LLM to avoid repetition | Nate Sesti |
2023-09-01 | feat: :sparkles: improved model dropdown | Nate Sesti |
2023-09-01 | feat: :sparkles: allow changing the summary prompt | Nate Sesti |
2023-09-01 | Windows meilisearch (#441) | Nate Sesti |
2023-08-31 | feat: :sparkles: change proxy url for openai class | Nate Sesti |
2023-08-31 | fix: :bug: fix model changing bug | Nate Sesti |
2023-08-31 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti |
2023-08-30 | html unescape | Nate Sesti |
2023-08-30 | don't url decode ollama | Nate Sesti |
2023-08-30 | fix: :art: many small improvements | Nate Sesti |
2023-08-29 | feat: :sparkles: huggingface tgi LLM class | Nate Sesti |
2023-08-29 | fix: 🐛 typo in core.py (#429) | Ikko Eltociear Ashimine |
2023-08-29 | fix: :bug: fix 2 model config bugs | Nate Sesti |
2023-08-29 | docs: :memo: Better documentation about Meilisearch Windows support | Nate Sesti |
2023-08-29 | feat: :mute: complete removal of telemetry when allow_anonymous_telemetry false | Nate Sesti |
2023-08-28 | feat: :sparkles: @terminal context provider | Nate Sesti |
2023-08-28 | feat: :sparkles: text-gen-webui, cleaning config and welcome | Nate Sesti |
2023-08-28 | fix: :bug: fix telemetry bug | Nate Sesti |
2023-08-27 | fix: :bug: streaming url_decode for Ollama | Nate Sesti |
2023-08-27 | refactor: :zap: use code llama / llama2 prompt for TogetherLLM | Nate Sesti |
2023-08-27 | fix: :bug: patch for occasional 0 choices from older Azure versions | Nate Sesti |
2023-08-27 | fix: :bug: fix togetherAI model json parsing | Nate Sesti |
2023-08-27 | feat: :art: custom prompt templates per model | Nate Sesti |
2023-08-27 | fix: :bug: default to counting chars if tiktoken blocked | Nate Sesti |
2023-08-27 | fix: :bug: urldecode ollama responses, make edit faster | Nate Sesti |
2023-08-27 | feat: :sparkles: LlamaCpp LLM subclass | Nate Sesti |
2023-08-26 | fix: :bug: correctly generate uris for remote | Nate Sesti |
2023-08-26 | fix: :bug: fix ssh /edit by checking for file through vscode fs | Nate Sesti |
2023-08-26 | feat: :sparkles: select model from dropdown | Nate Sesti |
2023-08-25 | small fixes, mostly ggml updates | Nate Sesti |
2023-08-25 | migration logic for filter by workspace | Nate Sesti |
2023-08-25 | feat: :sparkles: filter history by workspace | Nate Sesti |
2023-08-25 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti |
2023-08-25 | fix: :bug: ssh compatibility by reading from vscode.workspace.fs | Nate Sesti |
2023-08-25 | fix(google): remove unnecessary parameter (#394) | Thomas Ngo Trung |
2023-08-24 | don't clear context with /clear | Nate Sesti |
2023-08-24 | fix: :bug: replace hardcoded path for config file | Nate Sesti |
2023-08-23 | fix: :bug: fix when multiple cursor ranges are selected | Nate Sesti |
2023-08-23 | fix: :bug: bug where old server doesn't get updated | Nate Sesti |