
Groq Added #141

Open

wants to merge 8 commits into base: main
Conversation

Piyu-Pika
Contributor

Groq Integration Update

Overview

We have integrated Groq as a new model provider in our application, expanding our suite of AI model options. This integration allows users to leverage Groq's high-performance inference capabilities alongside existing providers like Anthropic, OpenAI, and others.

Added Features

New Configuration Options

  • Added Groq API configuration in the settings:
    • API Key configuration
    • Model selection with options:
      • mixtral-8x7b-32768
      • llama2-70b-4096
      • gemma-7b-it
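The model options above could be declared as a readonly tuple, in the style of the snippet quoted later in this thread. This is only a sketch; `groqModels`, `GroqModel`, and `isGroqModel` are illustrative names, not the PR's actual identifiers.

```typescript
// The three Groq model IDs added in this PR, as a readonly tuple.
const groqModels = [
  "mixtral-8x7b-32768",
  "llama2-70b-4096",
  "gemma-7b-it",
] as const;

// A union type derived from the tuple: "mixtral-8x7b-32768" | ...
type GroqModel = (typeof groqModels)[number];

// Type guard for validating a user-selected model string from settings.
function isGroqModel(value: string): value is GroqModel {
  return (groqModels as readonly string[]).includes(value);
}
```

Deriving the union type from the tuple keeps the settings UI and the type checker in sync when new models are added.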

Implementation Details

  • Full streaming support for real-time responses
  • Proper error handling for API issues
  • Integrated with the existing message handling system
  • Support for max token configuration
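Since Groq exposes an OpenAI-compatible streaming API (server-sent events with incremental deltas), the streaming support described above roughly amounts to parsing `data:` lines from the response stream. The sketch below shows one way to parse a single SSE line; `parseStreamLine` and `StreamDelta` are illustrative names, not the PR's actual code.

```typescript
type StreamDelta = { done: boolean; text: string };

// Parse one line of an OpenAI-style SSE stream into an incremental delta.
function parseStreamLine(line: string): StreamDelta | null {
  // Ignore blank lines and SSE comments (e.g. ": keep-alive").
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length).trim();
  // The stream terminates with a literal "[DONE]" sentinel.
  if (payload === "[DONE]") return { done: true, text: "" };
  try {
    const json = JSON.parse(payload);
    // Each chunk carries an incremental content delta in choices[0].delta.
    return { done: false, text: json.choices?.[0]?.delta?.content ?? "" };
  } catch {
    // Malformed chunk: skip it rather than aborting the stream.
    return null;
  }
}
```

Returning `null` for unparseable lines is one way to get the "proper error handling" noted above: a single bad chunk degrades gracefully instead of killing the whole response.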

Benefits of Groq

1. Superior Speed

  • Groq offers significantly faster inference times compared to other providers
  • Capable of processing responses up to 100x faster than traditional GPU-based systems

2. Consistent Performance

  • Delivers reliable and stable response times
  • Reduced latency variations in high-load scenarios

3. Cost-Effective

  • Competitive pricing model
  • Efficient token usage leading to potential cost savings

4. Model Variety

  • Access to popular open-source models optimized for Groq's architecture
  • Includes both large and efficient model variants

5. Easy Integration

  • Compatible with existing message formats
  • Seamless integration with the current configuration system

Usage

To use Groq in your application:

  1. Obtain a Groq API key
  2. Configure the API key in settings
  3. Select Groq as your provider
  4. Choose your preferred model

@Piyu-Pika
Contributor Author

Piyu-Pika commented Nov 6, 2024

@andrewpareles I don't know why it's showing that many changes.

@Piyu-Pika
Contributor Author

@andrewpareles Please review it.

@andrewpareles
Contributor

Can you share where you got these models? Is this an extensive list?

[
"mixtral-8x7b-32768",
"llama2-70b-4096",
"gemma-7b-it"
] as const

@Piyu-Pika
Contributor Author

@andrewpareles There are many models available in Groq. I currently added only 3, but there are more; you can check them at

https://console.groq.com/docs/models
