0.5.0-7
Pre-release
Changes
- fix: cortex engines init API with an empty body should initialize default options @louis-jan (#962)
- fix: change default cortex host @marknguyen1302 (#961)
- feat: watch model and engine updates for proper data retrieval @louis-jan (#960)
- fix: resolve issue when re-downloading an aborted model @marknguyen1302 (#959)
- Add launchpad uninstaller prerm and postrm @hiento09 (#957)
- feat: support Martian and Nvidia engines @marknguyen1302 (#956)
- chore: delete local file when abort download @marknguyen1302 (#954)
- fix: terminate system should kill cortex process @louis-jan (#955)
- fix: start and run commands do not output the last error logs @louis-jan (#951)
- fix: remove hardcoded stream option @marknguyen1302 (#952)
- chore: persist context length and ngl from GGUF file @louis-jan (#947)
- chore: show errors from remote engines; lint fixes @marknguyen1302 (#949)
- Fix OpenAI API pipeline @hiento09 (#948)
- chore: destroy dangling processes on uninstall @louis-jan (#945)
- feat: support OpenRouter and Cohere engines @marknguyen1302 (#946)
- feat: support local model pull @louis-jan (#944)
- chore: handle failed download @marknguyen1302 (#943)
- chore: specify engine version to pull @louis-jan (#942)
- Bump cortex.llamacpp for Llama 3.1 @Van-QA (#941)