path: root/continuedev/src/continuedev/libs
Date       | Commit message                                                                   | Author
2023-09-02 | fix: :bug: fix timeout type                                                      | Nate Sesti
2023-09-02 | feat: :sparkles: set session timeout on GGML requests                            | Nate Sesti
2023-09-02 | fix: :bug: llamacpp fix indexing max_tokens                                      | Nate Sesti
2023-09-02 | fix: :loud_sound: better logging for ggml completion endpoint                    | Nate Sesti
2023-09-02 | fix: :bug: number of bug fixes                                                   | Nate Sesti
2023-09-01 | refactor: :recycle: refactoring LLM to avoid repetition                          | Nate Sesti
2023-09-01 | feat: :sparkles: improved model dropdown                                         | Nate Sesti
2023-09-01 | Windows meilisearch (#441)                                                       | Nate Sesti
2023-08-31 | feat: :sparkles: change proxy url for openai class                               | Nate Sesti
2023-08-30 | don't url decode ollama                                                          | Nate Sesti
2023-08-30 | fix: :art: many small improvements                                               | Nate Sesti
2023-08-29 | feat: :sparkles: huggingface tgi LLM class                                       | Nate Sesti
2023-08-29 | fix: :bug: fix 2 model config bugs                                               | Nate Sesti
2023-08-29 | feat: :mute: complete removal of telemetry when allow_anonymous_telemetry false  | Nate Sesti
2023-08-28 | feat: :sparkles: @terminal context provider                                      | Nate Sesti
2023-08-28 | feat: :sparkles: text-gen-webui, cleaning config and welcome                     | Nate Sesti
2023-08-28 | fix: :bug: fix telemetry bug                                                     | Nate Sesti
2023-08-27 | fix: :bug: streaming url_decode for Ollama                                       | Nate Sesti
2023-08-27 | refactor: :zap: use code llama / llama2 prompt for TogetherLLM                   | Nate Sesti
2023-08-27 | fix: :bug: patch for ocassional 0 choices from older azure versions              | Nate Sesti
2023-08-27 | fix: :bug: fix togetherAI model json parsing                                     | Nate Sesti
2023-08-27 | feat: :art: custom prompt templates per model                                    | Nate Sesti
2023-08-27 | fix: :bug: default to counting chars if tiktoken blocked                         | Nate Sesti
2023-08-27 | fix: :bug: urldecode ollama responses, make edit faster                          | Nate Sesti
2023-08-27 | feat: :sparkles: LlamaCpp LLM subclass                                           | Nate Sesti
2023-08-26 | fix: :bug: fix ssh /edit by checking for file through vscode fs                  | Nate Sesti
2023-08-26 | feat: :sparkles: select model from dropdown                                      | Nate Sesti
2023-08-25 | small fixes, mostly ggml updates                                                 | Nate Sesti
2023-08-24 | fix: :bug: replace hardcoded path for config file                                | Nate Sesti
2023-08-23 | fix: :bug: bug where old server doesn't get updated                              | Nate Sesti
2023-08-22 | Config UI (#399)                                                                 | Nate Sesti
2023-08-21 | fix: :bug: fixing bugs with ggml                                                 | Nate Sesti
2023-08-20 | feat: :sparkles: saved context groups                                            | Nate Sesti
2023-08-20 | fix: :bug: fix replicate to work with models requiring prompt input              | Nate Sesti
2023-08-19 | fix: :bug: fix ggml bug                                                          | Nate Sesti
2023-08-19 | don't ignore unused import for tiktoken                                          | Nate Sesti
2023-08-19 | fix: :bug: make sure server_version.txt exists                                   | Nate Sesti
2023-08-18 | style: :art: autoformat with black on all python files                           | Nate Sesti
2023-08-16 | fix together.py for llama-70b                                                    | Nate Sesti
2023-08-14 | fix: :bug: MAX_TOKENS_FOR_MODEL bug fix, more testing                            | Nate Sesti
2023-08-13 | feat: :white_check_mark: update test and add model telemetry                     | Nate Sesti
2023-08-13 | fix: :ambulance: hotfix and package.json seo experiment                          | Nate Sesti
2023-08-13 | fix: :bug: fix for Azure OpenAI model names                                      | Nate Sesti
2023-08-11 | Merge pull request #371 from bra1nDump/fix-url-context [skip ci]                 | Nate Sesti
2023-08-11 | feat: :recycle: load preset_urls at load_index                                   | Nate Sesti
2023-08-10 | Draft fixing broken default configuration and introducing default urls that y... | Kirill Dubovitskiy
2023-08-09 | feat: :sparkles: testing in ci, final test of                                    | Nate Sesti
2023-08-09 | feat: :sparkles: support for Together.ai models                                  | Nate Sesti
2023-08-09 | feat: :sparkles: add urlcontextprovider back to default config                   | Nate Sesti
2023-08-08 | feat: :sparkles: testing improved prompting for stablecode                       | Nate Sesti