path: root/continuedev
Age        | Commit message | Author
2023-09-09 | ci: :green_heart: debugging test | Nate Sesti
2023-09-09 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti
2023-09-09 | feat: :sparkles: make follow-up edits | Nate Sesti
2023-09-09 | feat: :lipstick: query input indicator for ctx provs | Nate Sesti
2023-09-08 | feat: :lipstick: nested context provider dropdown | Nate Sesti
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti
2023-09-07 | style: :children_crossing: remove -h flag conflicting with help | Nate Sesti
2023-09-07 | adding support for Hugging Face Inference Endpoints (#460) | Ty Dunn
    * stream complete sketch
    * correct structure but issues
    * refactor: :art: clean up hf_inference_api.py
    * fix: :bug: quick fix in hf_infrerence_api.py
    * feat: :memo: update documentation code for hf_inference_api
    * hf docs
    * now working
    Co-authored-by: Nate Sesti <sestinj@gmail.com>
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | fix: :bug: templating fix for queued LLM | Nate Sesti
2023-09-07 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-07 | refactor: :art: template_messages for GGML | Nate Sesti
2023-09-06 | adding some basic unit tests (#456) | Ty Dunn
    * basic unit tests
    * addressing comments
2023-09-06 | feat: :lipstick: handful of UI improvements | Nate Sesti
2023-09-06 | feat: :adhesive_bandage: QueuedLLM for simultaneous reqs (LM Studio) | Nate Sesti
2023-09-06 | feat: :globe_with_meridians: alpaca chat template | Nate Sesti
2023-09-06 | fix: :adhesive_bandage: allow GGML to use api.openai.com | Nate Sesti
2023-09-06 | fix: :bug: separately load ctx provs, fix filetree | Nate Sesti
2023-09-06 | Merge branch 'main' of https://github.com/continuedev/continue | Nate Sesti
2023-09-06 | fix: :bug: fixes for a few context_providers | Nate Sesti
2023-09-06 | restart_meilisearch | sestinj
2023-09-06 | ci: 🏷 Update PyPI version [skip ci] | GitHub Action
2023-09-06 | chore: :bookmark: update pypi version manually | Nate Sesti
2023-09-06 | feat: :sparkles: run continue immediately from pypi pkg | Nate Sesti
2023-09-06 | fix: :construction: working on fixing lsp | Nate Sesti
2023-09-05 | Development Data Logging (#455) | Nate Sesti
    * feat: :tada: playing around with dlt for data loading
    * feat: :loud_sound: log development data
    * feat: :loud_sound: log tokens generated by model
    * fix: :safety_vest: try/except around dev_data_logger.capture
2023-09-05 | feat: :sparkles: /cmd slash command | Nate Sesti
2023-09-05 | feat: :sparkles: support browser-based IDEs with createMemoryRouter | Nate Sesti
2023-09-04 | fix: :bug: fix context length bug for /edit | Nate Sesti
2023-09-04 | fix: :bug: traceback fixes, remove replicate from hiddenimports | Nate Sesti
2023-09-04 | Integrate LSP for debugging (#450) | Nate Sesti
    * headless IDE subclass
    * finish headless_ide methods
    * feat: :sparkles: headless mode running with config flag
    * work on debugging
    * python lsp support
    * more lsp+debugging work
    * refactor: :safety_vest: safely load LSP
    * test: :white_check_mark: testing steps in headless mode
    * refactor: :clown_face: cleanup subprocesses
    * fix: :bug: handle data: [DONE] from Together
2023-09-03 | fix: :bug: allow None for timeout | Nate Sesti
2023-09-03 | refactor: :construction: Initial, not tested, refactor of LLM (#448) | Nate Sesti
    * refactor: :construction: Initial, not tested, refactor of LLM
    * refactor: :construction: replace usages of _complete with complete
    * fix: :bug: fixes after refactor
    * refactor: :recycle: template raw completions in chat format
    * test: :white_check_mark: simplified edit prompt and UNIT TESTS!
    * ci: :green_heart: unit tests in ci
    * fix: :bug: fixes for unit tests in ci
    * fix: :bug: start uvicorn in tests without poetry
    * fix: :closed_lock_with_key: add secrets to main.yaml
    * feat: :adhesive_bandage: timeout for all LLM classes
    * ci: :green_heart: prepare main.yaml for main branch
2023-09-02 | fix: :bug: remove empty grammar from llama_cpp_args | Nate Sesti
2023-09-02 | fix: :bug: fix timeout type | Nate Sesti
2023-09-02 | feat: :sparkles: set session timeout on GGML requests | Nate Sesti
2023-09-02 | fix: :bug: fix usages of LLM.complete | Nate Sesti
2023-09-02 | fix: :bug: llamacpp fix indexing max_tokens | Nate Sesti
2023-09-02 | fix: :loud_sound: better logging for ggml completion endpoint | Nate Sesti
2023-09-02 | fix: error printing bug leading to uncaught err | Nate Sesti
2023-09-02 | docs: :memo: update docs for openaiserverifno | Nate Sesti
2023-09-02 | fix: :bug: number of bug fixes | Nate Sesti
2023-09-02 | fix: :bug: don't fail on disconnected websocket | Nate Sesti
2023-09-01 | fix: :bug: avoid removing disallowed file windows | Nate Sesti
2023-09-01 | refactor: :recycle: refactoring LLM to avoid repetition | Nate Sesti
2023-09-01 | feat: :sparkles: improved model dropdown | Nate Sesti