LLM Reasoners v1.0.0
Initial release of LLM Reasoners
LLM Reasoners is a library that enables LLMs to conduct complex, multi-step reasoning with advanced reasoning algorithms. It formulates multi-step reasoning as planning and searches for the optimal reasoning chain, balancing exploration and exploitation through the ideas of a "World Model" and a "Reward".
Core Features
Cutting-Edge Reasoning Algorithms
We offer the most up-to-date search algorithms for reasoning with LLMs, such as:
- Reasoning-via-Planning, MCTS (Hao et al., 2023)
- StructChem (Ouyang et al., 2023)
- Chain-of-Thoughts (Wei et al., 2022)
- Least-to-most prompting (Zhou et al., 2022)
- Tree-of-Thoughts, BFS (Yao et al., 2023)
- Tree-of-Thoughts, DFS (Yao et al., 2023)
- Guided Decoding, Beam Search (Xie et al., 2023)
- GRACE Decoding, Greedy Decoding (Khalifa et al., 2023)
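As a sketch of how the planning-as-search formulation above looks in code, the example below composes a world model, a reward-bearing search configuration, and a search algorithm. The module layout, class names, and method names are assumptions for illustration, not an authoritative API reference.

```python
# Illustrative sketch only; module layout and names are assumptions based on the
# planning-as-search design described above, not a verbatim API reference.
from reasoners import WorldModel, SearchConfig, Reasoner  # assumed module layout
from reasoners.algorithm import MCTS                      # assumed module layout

class MyWorldModel(WorldModel):
    """Hypothetical world model: defines states and how actions update them."""
    def init_state(self):
        ...  # return the initial reasoning state for a question

    def step(self, state, action):
        ...  # return the next state after applying a reasoning step

    def is_terminal(self, state):
        ...  # return True when the reasoning chain should stop

class MySearchConfig(SearchConfig):
    """Hypothetical search config: proposes actions and scores them (the 'Reward')."""
    def get_actions(self, state):
        ...  # sample candidate reasoning steps from the LLM

    def reward(self, state, action, **kwargs):
        ...  # return a scalar reward that guides exploration vs. exploitation

# Compose the pieces and search for the optimal reasoning chain with MCTS.
reasoner = Reasoner(world_model=MyWorldModel(),
                    search_config=MySearchConfig(),
                    search_algo=MCTS())
result = reasoner("An example multi-step reasoning question")
```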
Intuitive Visualization and Interpretation
Our library provides a visualization tool to help users understand the reasoning process. Even for complex reasoning algorithms like Monte-Carlo Tree Search, users can easily diagnose and understand the process with one line of Python code. See an example in the tutorial notebook.
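Continuing from the sketch above, where `result` is the output of a reasoner run, a call like the following is the kind of one-line visualization meant here; the `visualize` entry point and its argument are assumptions for illustration.

```python
# A minimal sketch, assuming a `visualize` helper in `reasoners.visualization`;
# the exact entry point may differ.
from reasoners.visualization import visualize

# `result` is the object returned by a reasoner run, e.g. an MCTS search tree.
visualize(result)  # renders an interactive view of the explored reasoning tree
```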
Compatibility with popular LLM libraries
Our framework is compatible with popular LLM frameworks, e.g., Hugging Face transformers and the OpenAI/Google/Anthropic APIs. Specifically, we have integrated LLaMA-1/2/3 with the option of using fairscale, llama.cpp, ExLlama, or Hugging Face backends for different needs, e.g., fastest inference speed, minimal hardware requirements, etc.
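As an illustration of switching backends, the sketch below assumes wrapper classes under a `reasoners.lm` module; the class names and constructor arguments are assumptions, not the library's confirmed API.

```python
# Illustrative only: backend wrapper names and constructor arguments below are
# assumptions about the `reasoners.lm` module, not a verbatim API reference.
from reasoners.lm import HFModel  # Hugging Face transformers backend (assumed name)
# from reasoners.lm import LlamaCppModel  # llama.cpp backend, minimal hardware (assumed name)
# from reasoners.lm import ExLlamaModel   # ExLlama backend, fast GPU inference (assumed name)

# Swap the wrapper to change backends; the reasoning code stays the same.
llm = HFModel("meta-llama/Llama-2-7b-hf",   # model path (assumed argument)
              "meta-llama/Llama-2-7b-hf")   # tokenizer path (assumed argument)
```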