
Consider adding sliders for temperature and top_p alongside mood and persisting with custom prompt #17

Open
jmckenzie-dev opened this issue Apr 17, 2023 · 1 comment


@jmckenzie-dev

The OpenAI API has a couple of params that can be tuned based on what you're looking for. For code lookups and analysis, a tighter constraint on temperature (i.e. don't make things up 😉) would probably be preferable to the default, which is apparently 1 (go be creative). On the flip side, creative writing, brainstorming, marketing and sales material, etc. are all really well served by a high temperature.

This is a pretty straightforward and interesting explanation of how temperature works in an LLM.

I think it'd be a big boost to the utility of this project to add sliders for both temperature and top_p:

API reference for temperature

What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

We generally recommend altering this or top_p but not both.

API reference for top_p

An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.

We generally recommend altering this or temperature but not both.
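
For reference, a minimal sketch of how the two values from the sliders might be passed in a request. This assumes the official openai Node SDK and a placeholder model name; the actual client and call site in this project may differ.

import OpenAI from "openai";

// Reads OPENAI_API_KEY from the environment; sketch only, not this project's wiring.
const openai = new OpenAI();

async function analyzeCode(prompt: string) {
  return openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.2, // low temperature: more focused/deterministic, good for code analysis
    top_p: 1,         // left at its default, per the "alter one or the other" guidance
  });
}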

A couple of UX thoughts here:

  1. Might be worth making the two options mutually exclusive, so that adjusting one resets the other, given the guidance not to use them together (sketched below).
  2. There may be value in tying the last-used temperature / top_p / mood combination (i.e. prompt addendum) to a given custom prompt, so each prompt + level of creativity can be persisted across multiple uses and specific use cases tuned that way.
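
A rough sketch of point 1, with hypothetical state and function names (none of these exist in the project; the idea is just that moving one slider snaps the other back to its API default):

// Hypothetical shape for the two sampling controls; names are illustrative only.
interface SamplingState {
  temperature: number; // 0–2, API default 1
  topP: number;        // 0–1, API default 1
}

const SAMPLING_DEFAULTS: SamplingState = { temperature: 1, topP: 1 };

// Tuning one control resets the other to its default, so only one of the two
// is ever moved away from its default at a time.
function setTemperature(value: number): SamplingState {
  return { ...SAMPLING_DEFAULTS, temperature: value };
}

function setTopP(value: number): SamplingState {
  return { ...SAMPLING_DEFAULTS, topP: value };
}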

Right now it looks like the mood may be decoupled from the character string in ChatOptionsState:

/* --- STATE --- */
export interface ChatOptionsState {
  selectedCharacter: string;
  chatMood: number;
  // ...
}
From a cursory search (plus direction from ChatGPT 🎉), it looks like stringifying objects to JSON in React isn't too painful, so hopefully this wouldn't be too disruptive a change (from a string to a slightly more complex string + floats kind of shape), assuming it makes sense.
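
For example, a sketch of what that persistence could look like, assuming localStorage and illustrative field names (the extended fields and helpers below are hypothetical, not the project's actual state or store):

// Illustrative extension of the options state plus naive JSON persistence.
interface PersistedChatOptions {
  selectedCharacter: string;
  chatMood: number;
  temperature: number;
  topP: number;
  customPrompt: string;
}

function saveOptions(key: string, options: PersistedChatOptions): void {
  localStorage.setItem(key, JSON.stringify(options));
}

function loadOptions(key: string): PersistedChatOptions | null {
  const raw = localStorage.getItem(key);
  return raw ? (JSON.parse(raw) as PersistedChatOptions) : null;
}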

Famous last words. 😀

@silopolis

As for additional params and UI, mimicking https://platform.openai.com/playground would surely be nice!

I also love the idea of being able to create, say, "profiles" bundling all settings (character, prompt, mood, ...) into predefined user configs. That would surely be handy!
