path: root/continuedev/src
Age        | Commit message | Author
2023-09-02 | fix: :loud_sound: better logging for ggml completion endpoint | Nate Sesti
2023-09-02 | fix: error printing bug leading to uncaught err | Nate Sesti
2023-09-02 | docs: :memo: update docs for openaiserverifno | Nate Sesti
2023-09-02 | fix: :bug: number of bug fixes | Nate Sesti
2023-09-02 | fix: :bug: don't fail on disconnected websocket | Nate Sesti
2023-09-01 | fix: :bug: avoid removing disallowed file windows | Nate Sesti
2023-09-01 | refactor: :recycle: refactoring LLM to avoid repetition | Nate Sesti
2023-09-01 | feat: :sparkles: improved model dropdown | Nate Sesti
2023-09-01 | feat: :sparkles: allow changing the summary prompt | Nate Sesti
2023-09-01 | Windows meilisearch (#441) | Nate Sesti
2023-08-31 | feat: :sparkles: change proxy url for openai class | Nate Sesti
2023-08-31 | fix: :bug: fix model changing bug | Nate Sesti
2023-08-31 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti
2023-08-30 | html unescape | Nate Sesti
2023-08-30 | don't url decode ollama | Nate Sesti
2023-08-30 | fix: :art: many small improvements | Nate Sesti
2023-08-29 | feat: :sparkles: huggingface tgi LLM class | Nate Sesti
2023-08-29 | fix: 🐛 typo in core.py (#429) | Ikko Eltociear Ashimine
2023-08-29 | fix: :bug: fix 2 model config bugs | Nate Sesti
2023-08-29 | docs: :memo: Better documentation about Meilisearch Windows support | Nate Sesti
2023-08-29 | feat: :mute: complete removal of telemetry when allow_anonymous_telemetry false | Nate Sesti
2023-08-28 | feat: :sparkles: @terminal context provider | Nate Sesti
2023-08-28 | feat: :sparkles: text-gen-webui, cleaning config and welcome | Nate Sesti
2023-08-28 | fix: :bug: fix telemetry bug | Nate Sesti
2023-08-27 | fix: :bug: streaming url_decode for Ollama | Nate Sesti
2023-08-27 | refactor: :zap: use code llama / llama2 prompt for TogetherLLM | Nate Sesti
2023-08-27 | fix: :bug: patch for occasional 0 choices from older azure versions | Nate Sesti
2023-08-27 | fix: :bug: fix togetherAI model json parsing | Nate Sesti
2023-08-27 | feat: :art: custom prompt templates per model | Nate Sesti
2023-08-27 | fix: :bug: default to counting chars if tiktoken blocked | Nate Sesti
2023-08-27 | fix: :bug: urldecode ollama responses, make edit faster | Nate Sesti
2023-08-27 | feat: :sparkles: LlamaCpp LLM subclass | Nate Sesti
2023-08-26 | fix: :bug: correctly generate uris for remote | Nate Sesti
2023-08-26 | fix: :bug: fix ssh /edit by checking for file through vscode fs | Nate Sesti
2023-08-26 | feat: :sparkles: select model from dropdown | Nate Sesti
2023-08-25 | small fixes, mostly ggml updates | Nate Sesti
2023-08-25 | migration logic for filter by workspace | Nate Sesti
2023-08-25 | feat: :sparkles: filter history by workspace | Nate Sesti
2023-08-25 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti
2023-08-25 | fix: :bug: ssh compatibility by reading from vscode.workspace.fs | Nate Sesti
2023-08-25 | fix(google): remove unnecessary parameter (#394) | Thomas Ngo Trung
2023-08-24 | don't clear context with /clear | Nate Sesti
2023-08-24 | fix: :bug: replace hardcoded path for config file | Nate Sesti
2023-08-23 | fix: :bug: fix when multiple cursor ranges are selected | Nate Sesti
2023-08-23 | fix: :bug: bug where old server doesn't get updated | Nate Sesti
2023-08-23 | fix: :lipstick: don't display entirety of large tracebacks | Nate Sesti
2023-08-23 | fix: :bug: fix meilisearch empty body content-type bug | Nate Sesti
2023-08-23 | Testing gh workflow (#401) | Nate Sesti
2023-08-22 | fix: :bug: correction to ContinueConfig serialization model | Nate Sesti
2023-08-22 | fix: :bug: fix serialization bug for context_providers | Nate Sesti