v1.0.0
🎉 Highlights 🎉
- LLMs are now accessed through `litellm`, meaning OntoGPT may now be used with a large collection of API endpoints and local or alternative model providers. See the full list with the `ontogpt list-models` command.
- Local, open models may be downloaded and used through the `ollama` package (a brief usage sketch follows this list).
- Numerous bugfixes
- Documentation updates
- Updates for the webapp
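As a minimal sketch of the new workflow, assuming the `ollama` server and CLI are installed locally; the model name (`llama3`), template (`drug`), and input file below are illustrative only:

```bash
# List every model name OntoGPT now recognizes through litellm
ontogpt list-models

# Download an open model for local use (model name is an example)
ollama pull llama3

# Run an extraction against the local model; the -m value follows
# litellm's provider/model naming convention
ontogpt extract -t drug -i input.txt -m ollama/llama3
```

Which names actually work depends on your configured API keys and local ollama setup, so check the output of `ontogpt list-models` first.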
If something seems broken, please let us know! Open an issue here: https://github.com/monarch-initiative/ontogpt/issues
What's Changed
- Fixing failure to parse markdown, JSON formatting, and numbered lists by @caufieldjh in #394
- Add template for summarizing LinkML data validation reports by @caufieldjh in #400
- Improvements for the alzrd template by @caufieldjh in #399
- Further updates to the ADRD extraction template by @caufieldjh in #403
- Restructure Alz extraction template; split into two variants by @caufieldjh in #405
- Additional updates to the Alzheimers extraction template by @caufieldjh in #406
- Fix issue with early termination of multilanguage analysis runs by @caufieldjh in #408
- Add support for more models by @caufieldjh in #373
- Tests for new LLMClient by @caufieldjh in #410
- General cleanup: docs, code optimization, and project structure by @caufieldjh in #411
- Add option to truncate input_text in outputs by @caufieldjh in #413
- Eliminate hardcoded gpt4-turbo from multilingual function by @leokim-l in #415
- Misc changes for 1.0.0rc2 by @caufieldjh in #416
- Prep for v1.0.0 release by @caufieldjh in #418
New Contributors
Full Changelog: v0.3.15...v1.0.0