Releases · janhq/cortex.onnx
0.1.7
0.1.6
0.1.5
What's Changed
- chore: use stable onnx runtime version by @vansangpfiev in #16
- feat: Model converter CI by @hiro-v in #15
- fix: use nightly release build for onnxruntime by @vansangpfiev in #19
Full Changelog: v0.1.4...v0.1.5
0.1.4
What's Changed
- chore: bump onnxruntime to nightly 1.19.0-dev-20240621 by @vansangpfiev in #14
Full Changelog: v0.1.3...v0.1.4
0.1.3
What's Changed
- chore: bump onnxruntime-genai to f36396f and fix build errors by @vansangpfiev in #13
Full Changelog: v0.1.2...v0.1.3
0.1.2
What's Changed
- feat: log generated tokens per second by @vansangpfiev in #11
- feat: add max_history_chat parameter by @vansangpfiev in #12
Full Changelog: v0.1.1...v0.1.2
0.1.1
What's Changed
- feat: update README.md by @vansangpfiev in #6
- fix: disable genai tests and benchmark by @vansangpfiev in #7
- feat: use async queue for chat completion by @vansangpfiev in #8
- feat: support pre_prompt by @vansangpfiev in #9
Full Changelog: v0.1.0...v0.1.1
0.1.0
What's Changed
- fix: non-stream inference bugfix in #4 (co-authored by sangjanai <[email protected]>)