It’s similar to other AI text2SQL bots, but it supplies the LLM with additional context (a rough sketch of how this can be gathered follows the list):
- Table names in your SQL database
- Schema details for each table
- Sample rows from each table
- Supports various SQL databases (tested on MSSQL)
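As a minimal sketch (not necessarily the app's exact code), this context can be collected with LangChain's `SQLDatabase` wrapper; the connection string below is a placeholder:

```python
# Sketch: gather table names, schema details, and sample rows
# with LangChain's SQLDatabase wrapper. Connection string is a placeholder.
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri(
    "mssql+pyodbc://user:password@server/db?driver=ODBC+Driver+17+for+SQL+Server",
    sample_rows_in_table_info=3,  # include 3 sample rows per table in the schema dump
)

print(db.dialect)                   # e.g. "mssql"
print(db.get_usable_table_names())  # table names shown to the LLM
print(db.get_table_info())          # CREATE TABLE statements plus sample rows
```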
```python
# .\.venv\Lib\site-packages\langchain\chains\sql_database\prompt.py
SQL_PROMPTS = {
    "crate": CRATEDB_PROMPT,
    "duckdb": DUCKDB_PROMPT,
    "googlesql": GOOGLESQL_PROMPT,
    "mssql": MSSQL_PROMPT,
    "mysql": MYSQL_PROMPT,
    "mariadb": MARIADB_PROMPT,
    "oracle": ORACLE_PROMPT,
    "postgresql": POSTGRES_PROMPT,
    "sqlite": SQLITE_PROMPT,
    "clickhouse": CLICKHOUSE_PROMPT,
    "prestodb": PRESTODB_PROMPT,
}
```
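The matching dialect-specific prompt can be looked up from that mapping via the database's dialect. A sketch, reusing the `db` object from the snippet above (`PROMPT` is the generic fallback defined in the same file):

```python
# Sketch: pick the dialect-specific prompt, falling back to the generic one.
from langchain.chains.sql_database.prompt import PROMPT, SQL_PROMPTS

prompt = SQL_PROMPTS.get(db.dialect, PROMPT)  # "mssql" -> MSSQL_PROMPT
```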
This added context makes the generated SQL more accurate, and the app includes the query results in its answer as well.
FYI: in the example below, you can see the previous steps. Clicking on them shows the process the app followed to generate the SQL query, including the final query that produces the answer.
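One way such a step-by-step view could be built (a sketch only; `llm`, `db`, and `user_question` are assumed to exist, and the real app may structure this differently) is a Streamlit expander around the generated query:

```python
# Sketch: generate the SQL, show it in a collapsible "step", then run it.
import streamlit as st
from langchain.chains import create_sql_query_chain

chain = create_sql_query_chain(llm, db)          # llm and db defined elsewhere
sql = chain.invoke({"question": user_question})  # question from st.chat_input

with st.expander("Generated SQL query"):         # click to reveal this step
    st.code(sql, language="sql")

st.write(db.run(sql))                            # display the query results
```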
python -m venv .venv
.\.venv\Scripts\activate
pip install -r requirements.txt
DB_CONNECTION_STRING="mssql+pyodbc://<username>:<password>@<server-name>/<db-name>?trusted_connection=yes&driver=ODBC+Driver+17+for+SQL+Server"
FIREWORKS_API_KEY='xxx'
You can get the FIREWORKS_API_KEY here.
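A sketch of how these two values might be wired together (assuming python-dotenv and the `langchain-fireworks` package; the model name is just an example):

```python
# Sketch: load .env and build the database wrapper and the Fireworks LLM.
import os
from dotenv import load_dotenv
from langchain_community.utilities import SQLDatabase
from langchain_fireworks import ChatFireworks

load_dotenv()  # reads DB_CONNECTION_STRING and FIREWORKS_API_KEY from .env

db = SQLDatabase.from_uri(os.environ["DB_CONNECTION_STRING"])
llm = ChatFireworks(model="accounts/fireworks/models/llama-v3p1-70b-instruct")  # example model name
```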
There are many LLM services to choose from, such as AzureOpenAI, OpenAI, Cohere, Ollama, Groq, Fireworks, GoogleGenerativeAI, and more.
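One way to switch between these providers without changing the app code (a sketch using LangChain's `init_chat_model` helper; the model names are examples only):

```python
# Sketch: provider-agnostic chat model construction.
from langchain.chat_models import init_chat_model

llm = init_chat_model("gpt-4o-mini", model_provider="openai")
# llm = init_chat_model("command-r", model_provider="cohere")
# llm = init_chat_model("llama3.1", model_provider="ollama")
```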
Feel free to open a pull request, or reach out to me if you are interested.
streamlit run app.py
- Message history (sketched below)
- Dynamic LLMs with dynamic model names
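A minimal sketch of the message-history pattern with Streamlit's session state (the real app may structure this differently; the SQL chain call is elided):

```python
# Sketch: keep and replay chat history across reruns with st.session_state.
import streamlit as st

if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:  # replay earlier turns
    st.chat_message(msg["role"]).write(msg["content"])

if question := st.chat_input("Ask about your database"):
    st.chat_message("user").write(question)
    st.session_state.messages.append({"role": "user", "content": question})
    answer = "..."  # run the SQL chain here to get the real answer
    st.chat_message("assistant").write(answer)
    st.session_state.messages.append({"role": "assistant", "content": answer})
```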