# amallo

A simple GUI, written in Flutter, for querying a local Ollama API server for inference and managing large language models.

- Add models from Ollama servers
- Create local models from a Modelfile with template, parameter, adapter, and license options
- Copy/delete installed models
- View Modelfile information, including the system prompt template and model parameters
- Save/delete conversations
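Local model creation builds on Ollama's Modelfile format, which supports the template, parameter, adapter, and license options listed above. A minimal sketch, with a placeholder base model, parameter values, system prompt, and license text:

```
# Hypothetical Modelfile — base model and values are placeholders
FROM llama2
PARAMETER temperature 0.8
PARAMETER num_ctx 4096
SYSTEM "You are a helpful assistant."
TEMPLATE """{{ .System }}

{{ .Prompt }}"""
LICENSE """MIT"""
```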

*(screenshot)*

## Getting Started

1. Download and install Ollama.
2. Install amallo, or
3. Build it from source:

```shell
git clone git@github.com:mthongvanh/amallo.git
cd amallo
flutter pub get
flutter run -d macos
```
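With the Ollama server running, inference requests go over its HTTP API. A minimal sketch in Python of the kind of request a client like amallo might issue, assuming Ollama's default port (11434) and a placeholder model name and prompt:

```python
import json
from urllib import request

# Build the JSON payload for Ollama's /api/generate endpoint.
# Model name and prompt below are placeholders.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

body = json.dumps(payload).encode("utf-8")
req = request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default address
    data=body,
    headers={"Content-Type": "application/json"},
)

# Uncomment to send the request against a running Ollama server:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
print(body.decode("utf-8"))
```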

## Built with

- Flutter 3.13.2
- Dart 3.1.0

Tested on macOS Sonoma (14.0) with Xcode 14.3.1.
