A standalone RedM development API system that exposes popular AI (LLM) services to developers. Supported providers: ChatGPT, Google Gemini, Mistral AI, and Horde AI.

This is an experimental project; use it at your own discretion.
- Download this repo
- Copy and paste the `bcc-ai` folder to `resources/bcc-ai`
- Add `ensure bcc-ai` to your `server.cfg` file (ABOVE any scripts that use it; see the load-order check sketched below)
- Now you are ready to get coding!
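Because the export only exists once `bcc-ai` has started, a consuming server script can optionally verify the load order before requesting a handle. This is a minimal sketch, not part of bcc-ai itself; it only assumes the standard `GetResourceState` native and the `Initiate` export shown in the examples below, and the provider/key values are placeholders.

```lua
-- Hedged sketch: confirm bcc-ai is running before asking it for an AI handle.
CreateThread(function()
    if GetResourceState('bcc-ai') ~= 'started' then
        print('^1bcc-ai is not started. Make sure "ensure bcc-ai" comes before this resource in server.cfg^7')
        return
    end

    -- Safe to grab a handle now (provider and key are placeholders).
    local AI = exports['bcc-ai'].Initiate('gpt', 'YOUR_KEY_HERE')
end)
```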
ChatGPT example:

```lua
local AI = exports['bcc-ai'].Initiate('gpt', 'YOUR_KEY_HERE')

local test = AI.generateText({
    prompt = "Hello what is your name?",
    model = "gpt-3.5-turbo",
    max_tokens = 40,
    temperature = 1
})

print(test[1].text)
```
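As a usage illustration, the handle returned by `Initiate` can be driven from a server command. This sketch is hypothetical: the `/askai` command name and printing the reply to the server console are assumptions for illustration, not part of bcc-ai; only `Initiate` and `generateText` come from the example above.

```lua
-- Hedged sketch: expose the ChatGPT call above through a made-up /askai command.
local AI = exports['bcc-ai'].Initiate('gpt', 'YOUR_KEY_HERE')

RegisterCommand('askai', function(source, args)
    local question = table.concat(args, ' ')
    if question == '' then return end

    local result = AI.generateText({
        prompt = question,
        model = "gpt-3.5-turbo",
        max_tokens = 40,
        temperature = 1
    })

    -- Printed server-side; forwarding the reply to the player depends on
    -- whichever chat/notification resource your server uses.
    print(('AI reply for %s: %s'):format(GetPlayerName(source) or 'console', result[1].text))
end, false)
```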
Google Gemini example:

```lua
local AI = exports['bcc-ai'].Initiate('gemini', 'YOUR_KEY_HERE')

local prompt = [[List a few popular cookie recipes using this JSON schema:
Recipe = {'recipeName': string}
Return: Array<Recipe>]]

local test = AI.generateText({
    prompt = prompt,
    model = "gemini-1.5-flash"
})

print(test[1].recipeName)
```
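Since the example indexes `test[1].recipeName`, the return value appears to be the decoded array the prompt asks for. Assuming that shape holds (a sketch, not a guaranteed return format), the full list can be walked with a plain loop, continuing the example above:

```lua
-- Assumes generateText returned the decoded Array<Recipe> requested in the prompt.
for i, recipe in ipairs(test) do
    print(('Recipe %d: %s'):format(i, recipe.recipeName))
end
```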
Mistral AI example:

```lua
local AI = exports['bcc-ai'].Initiate('mistral', 'YOUR_KEY_HERE')

local test = AI.generateText({
    model = "mistral-large-latest",
    messages = {
        { role = "user", content = "Who is the most renowned French painter?" }
    }
})

print(test.choices[1].message.content)
```
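The `messages` table mirrors Mistral's chat-completion format, so a multi-turn exchange can be built by appending each reply to the history before the next call. This is a sketch under that assumption, continuing the example above; bcc-ai may or may not keep any conversation state of its own, and the follow-up question is made up for illustration.

```lua
-- Hedged sketch: carry conversation history across calls by reusing the
-- same messages table (assumes the payload is passed through to Mistral).
local history = {
    { role = "user", content = "Who is the most renowned French painter?" }
}

local first = AI.generateText({ model = "mistral-large-latest", messages = history })
table.insert(history, first.choices[1].message)

table.insert(history, { role = "user", content = "List three of his best-known paintings." })
local second = AI.generateText({ model = "mistral-large-latest", messages = history })

print(second.choices[1].message.content)
```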
Horde AI example:

```lua
local AI = exports['bcc-ai'].Initiate()

local test = AI.generateText('Hello what is your name?')

print(test[1].text)
-- Example response: Why so nervous? I'm not here to harm you." Another step closer. "My name is Sma, by the way. And you are?"
```
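Horde requests are served by volunteer workers, so responses can be slow or occasionally fail. A defensive wrapper like the following is only a sketch; it assumes a failed request surfaces as a nil or empty result, which may differ from how bcc-ai actually reports errors.

```lua
-- Hedged sketch: tolerate slow or failed Horde responses.
local AI = exports['bcc-ai'].Initiate()

local result = AI.generateText('Hello what is your name?')
if result and result[1] and result[1].text then
    print(result[1].text)
else
    print('No response from the horde; it may be busy. Try again shortly.')
end
```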
The more you use Horde AI without contributing by joining the horde, the lower your requests sit in the priority processing queue. This is, however, what keeps it completely free. If you wish to help support this free-to-use system, join the horde and host an LLM; doing so will raise your queue priority.