All notable changes to Chainlit will be documented in this file.
The format is based on Keep a Changelog.
Nothing is unreleased!
- Llama index callback handler should now correctly nest the intermediary steps
- Toggling the `hide_cot` parameter in the UI should correctly hide the `took n steps` buttons
- The `running` loading button should only be displayed once when `hide_cot` is true and a message is being streamed
- `on_logout` hook, allowing cookies to be cleared when a user logs out
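A minimal sketch of how this hook could be used, assuming it receives the FastAPI `Request` and `Response` objects; the cookie name is illustrative:

```python
import chainlit as cl
from fastapi import Request, Response


@cl.on_logout
def on_logout(request: Request, response: Response):
    # Illustrative cookie name; delete whatever session cookies your app sets.
    response.delete_cookie("my_session_cookie")
```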
- Chainlit apps won't crash anymore if the data layer is not reachable
- File upload now works when switching chat profiles
- Avatars with an image no longer have a background color
- If `hide_cot` is set to `true`, the UI will never get the intermediary steps (but they will still be persisted)
- Fixed a bug preventing past chats from being opened
- Scroll down button
- If `hide_cot` is set to `true`, a `running` loader is displayed by default under the last message when a task is running.
- Avatars are now always displayed
- Chat history sidebar has been revamped
- Stop task button has been moved to the input bar
- If `hide_cot` is set to `true`, the UI will never get the intermediary steps (but they will still be persisted)
- Elements are now working when authenticated
- First interaction is correctly set when resuming a chat
- The copy button is hidden if `disable_feedback` is `true`
- Copy button under messages
- OAuth samesite cookie policy is now configurable through the `CHAINLIT_COOKIE_SAMESITE` env var
- Relax Python version requirements
- If `hide_cot` is configured to `true`, steps will never be sent to the UI, but will still be persisted.
- Message buttons are now positioned below
- `cl.Step`
- File upload uses HTTP instead of WS and no longer has a size limitation
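A sketch of `cl.Step` used as an async context manager; the step name and values are illustrative:

```python
import chainlit as cl


@cl.on_message
async def main(message: cl.Message):
    # Wrap intermediate work in a step so it shows up as a nested entry in the UI.
    async with cl.Step(name="my_tool") as step:
        step.input = message.content
        step.output = "intermediate result"
    await cl.Message(content="Done").send()
```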
- `cl.AppUser` becomes `cl.User`
- `Prompt` has been split into `ChatGeneration` and `CompletionGeneration`
- `Action` now displays a toaster in the UI while running
- Support for custom HTML in message content is now an opt in feature in the config
- Uvicorn `ws_per_message_deflate` config param is now configurable, like `UVICORN_WS_PER_MESSAGE_DEFLATE=false`
- Latex support is no longer enabled by default and is now a feature in the config
- Fixed LCEL memory message order in the prompt playground
- Fixed a key error when using the file watcher (-w)
- Fixed several user experience issues with `on_chat_resume`
- `on_chat_end` is now always called when a chat ends
- Switching chat profiles correctly clears previous AskMessages
- `on_chat_resume` now works properly with non-JSON-serializable objects
- `LangchainCallbackHandler` no longer sends tokens to the wrong user under high concurrency
- Langchain cache should work when `cache` is set to `true` in `config.toml`
- Markdown links special characters are no longer encoded
- Collapsed messages no longer make the chat scroll
- Stringified Python objects are now displayed in a Python code block
- Latex support (only supporting $$ notation)
- Go back button on element page
- Code blocks should no longer flicker or display `[object object]`
- Now properly displaying empty messages with inlined elements
- Fixed `Too many values to unpack` error in langchain callback
- Langchain final streamed answer is now annotable with human feedback
- AzureOpenAI should now work properly in the Prompt Playground
- Code blocks display has been enhanced
- Replaced aiohttp with httpx
- Prompt Playground has been updated to work with the new openai release (v1), including tools
- Auth0 oauth provider has a new configurable env variable `OAUTH_AUTH0_ORIGINAL_DOMAIN`
- `cl.on_chat_resume` decorator to enable users to continue a conversation
- Support for OpenAI functions in the Prompt Playground
- Ability to add/remove messages in the Prompt Playground
- Plotly element to display interactive charts
- Langchain intermediate steps display are now much more readable
- Chat history loading latency has been enhanced
- UTF-8 characters are now correctly displayed in json code blocks
- Select widget `items` attribute is now working properly
- Chat profiles widget is no longer scrolling horizontally
- Support for Langchain Expression Language. https://docs.chainlit.io/integrations/langchain
- UI rendering optimization to guarantee high framerate
- Chainlit Cloud latency optimization
- Speech recognition to type messages. https://docs.chainlit.io/backend/config/features
- Descope OAuth provider
- `LangchainCallbackHandler` is now displaying inputs and outputs of intermediate steps
- AskUserMessage now works properly with data persistence
- You can now use a custom Okta authorization server for authentication
- `ChatProfile` allows configuring different agents that the user can freely choose
- Multi modal support at the input bar level. Enabled by `features.multi_modal` in the config
- `cl.AskUserAction` allows blocking code execution until the user clicks an action
- Displaying the readme when the chat is empty is now configurable through `ui.show_readme_as_default` in the config
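As a sketch, chat profiles can be declared like this; the profile names and descriptions are illustrative:

```python
import chainlit as cl


@cl.set_chat_profiles
async def chat_profiles():
    # Each profile appears in the UI so the user can pick which agent to chat with.
    return [
        cl.ChatProfile(
            name="Fast agent",
            markdown_description="A quicker, lower-cost assistant.",
        ),
        cl.ChatProfile(
            name="Thorough agent",
            markdown_description="A slower but more capable assistant.",
        ),
    ]
```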
- `cl.on_message` no longer takes a string as parameter but rather a `cl.Message`
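A minimal example of the new signature:

```python
import chainlit as cl


@cl.on_message
async def main(message: cl.Message):
    # The handler now receives a cl.Message instead of a plain string.
    await cl.Message(content=f"You said: {message.content}").send()
```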
- Chat history is now correctly displayed on mobile
- Azure AD OAuth authentication should now correctly display the user profile picture
- `@cl.on_file_upload` is replaced by true multi modal support at the input bar level
- Logo is displayed in the UI header (works with custom logo)
- Azure AD single tenant is now supported
- `collapsed` attribute on the `Action` class
- Latency improvements when data persistence is enabled
- Chat history has been entirely reworked
- Chat messages redesign
- `config.ui.base_url` becomes the `CHAINLIT_URL` env variable
- File watcher (-w) is now working with nested module imports
- Unsupported character during OAuth authentication
- Pydantic v2 support
- Okta auth provider
- Auth0 auth provider
- Prompt playground support for mix of template/formatted prompts
- `@cl.on_chat_end` decorator
- Textual comments to user feedback
- Langchain errors are now correctly indented
- Langchain nested chains prompts are now correctly displayed
- Langchain error TypeError: 'NoneType' object is not a mapping.
- Actions are now displayed on mobile
- Custom logo is now working as intended
- Authentication is now unopinionated:
  - `@cl.password_auth_callback` for login/password auth
  - `@cl.oauth_callback` for OAuth auth
  - `@cl.header_auth_callback` for header auth
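A minimal sketch of the password callback, shown with the current `cl.User` class (releases of this era used `cl.AppUser`); the hard-coded credentials are illustrative only:

```python
from typing import Optional

import chainlit as cl


@cl.password_auth_callback
def auth_callback(username: str, password: str) -> Optional[cl.User]:
    # Illustrative check; validate against your own user store in practice.
    if (username, password) == ("admin", "admin"):
        return cl.User(identifier="admin")
    return None
```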
- Data persistence is now enabled through the `CHAINLIT_API_KEY` env variable
- `@cl.auth_client_factory` (see new authentication)
- `@cl.db_client_factory` (see new data persistence)
- `disable_human_feedback` parameter on `cl.Message`
- Configurable logo
- Configurable favicon
- Custom CSS injection
- GCP Vertex AI LLM provider
- Long message collapsing feature flag
- Enable Prompt Playground feature flag
- History page filters now work properly
- History page does not show empty conversations anymore
- Langchain callback handler Message errors
- `@cl.on_file_upload` to enable spontaneous file uploads
- `LangchainGenericProvider` to add any Langchain LLM in the Prompt Playground
- `cl.Message` content now supports dict (previously only supported string)
- Long messages are now collapsed by default
- Deadlock in the Llama Index callback handler
- Langchain MessagesPlaceholder and FunctionMessage are now correctly supported
- Complete rework of the Prompt playground. Now supports custom LLMs, templates, variables and more
- Enhanced Langchain final answer streaming
- `remove_actions` method on the `Message` class
- Button to clear message history
- Chainlit CLI performance issue
- Llama Index v0.8+ callback handler. Now supports messages prompts
- Tasklist display, persistence and `.remove()`
- Custom headers growing infinitely large
- Action callback can now handle multiple actions
- Langflow integration load_flow_from_json
- Video and audio elements on Safari
- Make the chat experience configurable with Chat Settings
- Authenticate users based on custom headers with the Custom Auth client
- Author rename now works with all kinds of messages
- Create message error with chainlit cloud (chenjuneking)
- Security improvements
- Haystack callback handler
- Theme customizability
- Allow multiple browser tabs to connect to one Chainlit app
- Sidebar blocking the send button on mobile
- Factories, run and post process decorators are removed.
- langchain_rename becomes author_rename and works globally
- Message.update signature changed
Migration guide available here.
- Langchain final answer streaming
- Redesign of chainlit input elements
- Possibility to add custom endpoints to the fast api server
- New File Element
- Copy button in code blocks
- Persist session between websocket reconnection
- The UI is now more mobile friendly
- Avatar element Path parameter
- Increased web socket message max size to 100 mb
- Duplicated conversations in the history tab
- Add the video element
- Fix the inline element flashing when scrolling the page, due to unneeded re-rendering
- Fix the orange flash effect on messages
- Task list element
- Audio element
- All elements can use the `.remove()` method to remove themselves from the UI
- Can now use cloud auth with any data persistence mode (like local)
- Microsoft auth
- Files in app dir are now properly served (typical use case is displaying an image in the readme)
- Add missing attribute `size` to Pyplot element
- AskUserMessage.remove() now works properly
- Avatar element cannot be referenced in messages anymore
- New data persistence modes `local` and `custom` are available on top of the pre-existing `cloud` one. Learn more here.
- Performance improvements and bug fixes on run_sync and asyncify
- File watcher now reloads the app when the config is updated
- `cl.cache` to avoid wasting time reloading expensive resources every time the app reloads
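A sketch of `cl.cache` guarding an expensive load; the cached resource is a placeholder:

```python
import chainlit as cl


@cl.cache
def load_expensive_resource():
    # Computed once and reused across file-watcher reloads instead of being rebuilt each time.
    return {"model": "placeholder"}
```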
- Bug introduced by 0.4.0 preventing private apps from running
- Long line content breaking the sidebar with Text elements
- File watcher preventing keyboard interrupts of the chainlit process
- Updated socket io to fix a security issue
- Bug preventing config settings to be the default values for the settings in the UI
- Pyplot chart element
- Config option `default_expand_messages` to enable the expand messages setting in the UI by default (breaking change)
- Scoped elements sharing names are now correctly displayed
- Clickable Element refs are now correctly displayed, even if another ref that is a substring of it exists
- Moving from sync to async runtime (breaking change):
- Support async implementation (eg openai, langchain)
- Performance improvements
- Removed patching of different libraries
- Elements:
- Merged LocalImage and RemoteImage to Image (breaking change)
- New Avatar element to display avatars in messages
- AskFileMessage now supports multi file uploads (small breaking change)
- New settings interface including a new "Expand all" messages setting
- The element sidebar is resizable
- Secure origin issues when running on HTTP
- Updated the callback handler to langchain 0.0.198 latest changes
- Filewatcher issues
- Blank screen issues
- Port option in the CLI does not fail anymore because of os import
- Pdf element reloading issue
- CI is more stable
- `AskFileMessage`'s accept parameter can now take a Dict to allow more fine-grained rules. More info: https://react-dropzone.org/#!/Accepting%20specific%20file%20types
- The PDF viewer element helps you display local or remote PDF files (documentation).
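A sketch of the Dict form of `accept`, shown with the async API of later releases; the MIME type and size limit are illustrative:

```python
import chainlit as cl


@cl.on_chat_start
async def start():
    # Map MIME types to allowed extensions, react-dropzone style.
    files = await cl.AskFileMessage(
        content="Please upload a PDF",
        accept={"application/pdf": [".pdf"]},
        max_size_mb=20,
    ).send()
    if files:
        await cl.Message(content=f"Received {files[0].name}").send()
```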
- When running the tests, the chainlit cli is installed in editable mode to run faster.
- URL preview for social media share
- `max_http_buffer_size` is now set to 100mb, fixing the `max_size_mb` parameter of `AskFileMessage`
- Enhanced security
- Global element display
- Display elements with display `page` based on their ids instead of their names
- Rework of the Message, AskUserMessage and AskFileMessage APIs:
  - `cl.send_message(...)` becomes `cl.Message(...).send()`
  - `cl.send_ask_user(...)` becomes `cl.AskUserMessage(...).send()`
  - `cl.send_ask_file(...)` becomes `cl.AskFileMessage(...).send()`
- Added `update` and `remove` methods to the `cl.Message` class
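A sketch of the reworked API, shown with the async calls of later releases:

```python
import chainlit as cl


@cl.on_message
async def main(message: cl.Message):
    # Build the message object, then send it.
    msg = cl.Message(content="Working on it...")
    await msg.send()

    # Edit it in place, or remove it from the UI entirely.
    msg.content = "Done!"
    await msg.update()
    # await msg.remove()
```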
- Blank screen for windows users (Chainlit#3)
- Header navigation for mobile (Chainlit#12)
- Starting to log changes in CHANGELOG.md
- Port and hostname are now configurable through the `CHAINLIT_HOST` and `CHAINLIT_PORT` env variables. You can also use `--host` and `--port` when running `chainlit run ...`.
- A label attribute to Actions to facilitate localization.
- Clicks on inlined `RemoteImage` now open the image in a NEW tab.