v0.1.35
What's Changed
- Update llama.cpp submodule to latest release b3849 by @jan-service-account in #252
- fix: proper way to extract base64 by @vansangpfiev in #235
- fix: support chat completion content array type by @vansangpfiev in #253 (see the request sketch after this list)
- fix: PR sync workflow by @vansangpfiev in #254
- fix: nightly workflow by @vansangpfiev in #255
- fix: warm-up by @vansangpfiev in #259
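For context on #253 and #235: in the OpenAI-style chat completion schema, a message's `content` may be an array of typed parts (for example a text part plus an image part carrying a base64 data URL) instead of a plain string. Below is a minimal sketch of such a request body; the model name and the exact part types handled in this release are illustrative assumptions, not details taken from these PRs.

```python
import base64
import json

# Placeholder bytes standing in for a real image file (hypothetical input).
image_b64 = base64.b64encode(b"\x89PNG\r\n\x1a\nplaceholder").decode("ascii")

# OpenAI-style message whose "content" is an array of typed parts
# rather than a plain string -- the shape the content-array fix targets.
payload = {
    "model": "llama3",  # hypothetical model name
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    # Base64 data URL, the kind of content the base64
                    # extraction fix (#235) presumably parses.
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
}

print(json.dumps(payload, indent=2))
```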
Full Changelog: v0.1.34...v0.1.35