Releases: janhq/cortex.llamacpp
0.1.12
What's Changed
- feat: add more info on getting models by @vansangpfiev in #66
- Update llama.cpp submodule to latest release b3029 by @jan-service-account in #65
- fix: use mistral model for submodule e2e by @vansangpfiev in #64
- feat: add python by @vansangpfiev in #61
- feat: separate CI for llama.cpp submodule update by @vansangpfiev in #58
- Add codesign by @hiento09 in #59
Contributors
@hiento09, @jan-service-account and @vansangpfiev
Full Changelog: v0.1.11...v0.1.12
0.1.11
What's Changed
- feat: replace llama_token_eos by llama_token_is_eog by @vansangpfiev in #57
- Update llama.cpp submodule to latest release b3003 by @jan-service-account in #56
- Update llama.cpp submodule to latest release b3012 by @jan-service-account in #62
- feat: change n_batch default to 2048 by @vansangpfiev in #63
Full Changelog: v0.1.10...v0.1.11
0.1.10
What's Changed
- Update llama.cpp submodule to latest release b2985 by @jan-service-account in #53
- fix: small lock improvement by @vansangpfiev in #19
New Contributors
- @jan-service-account made their first contribution in #53
Full Changelog: v0.1.9...v0.1.10
0.1.9
What's Changed
- Chore fix auto sync by @hiento09 in #49
- feat: get running models by @vansangpfiev in #34
- fix: flash attention param typo by @vansangpfiev in #50
- chore: pr sync remote weekdays by @vansangpfiev in #51
- fix: return false if error during loading model by @vansangpfiev in #52
Full Changelog: v0.1.8...v0.1.9
0.1.8
What's Changed
- chore: bump llama.cpp to b2961 by @vansangpfiev in #45
Full Changelog: v0.1.7...v0.1.8
0.1.7
What's Changed
- Add CI auto create PR to update submodule by @hiento09 in #38
- feat: support Flash Attention by @vansangpfiev in #32
- Add trigger CI after create PR by @hiento09 in #43
- Replace deprecated steps github action by @hiento09 in #46
Full Changelog: v0.1.6...v0.1.7
0.1.6
0.1.5
What's Changed
- fix: decode if has_images by @vansangpfiev in #28
- fix: handle error if decode failed by @vansangpfiev in #29
Full Changelog: v0.1.4...v0.1.5
0.1.4
What's Changed
- chore: bump llama.cpp to b2894 by @vansangpfiev in #26
- fix: timeout 60 mins for CI by @vansangpfiev in #27
Full Changelog: v0.1.3...v0.1.4
0.1.3
What's Changed
- fix: print logic by @vansangpfiev in #25
- feat: update README.md by @vansangpfiev in #23
Full Changelog: v0.1.2...v0.1.3