This repository has been archived by the owner on Nov 13, 2024. It is now read-only.
Is this your first time submitting a feature request?
I have searched the existing issues, and I could not find an existing issue for this feature
I am requesting a straightforward extension of existing functionality
Describe the feature
Allow filtering Pinecone search results by metadata for each call made to the API. A metadata parameter would be added to the endpoint: .../?metadata=string
Describe alternatives you've considered
No response
Who will this benefit?
No response
Are you interested in contributing this feature?
No response
Anything else?
No response
Hi @DataVigi, you did not clarify which endpoint you would like to add this functionality to, but I am assuming it is either the /chat/completions or the /context/query route. If it is the /chat/completions route, adding a metadata parameter would break compatibility with existing OpenAI-compatible clients, and we want to stay fully compatible to ease the transition for our users. If it is the /context/query endpoint, you can already attach a metadata filter to each query you send. That said, there is also a way to set a global_metadata_filter via the config (context_engine/params/global_metadata_filter).
In any case, I would be glad to learn more about your use case so we can examine whether it is solvable with the existing functionality.
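To illustrate the per-query option mentioned above, here is a minimal sketch of a /context/query request body carrying a Pinecone-style metadata filter. The filter syntax (`$eq`, `$gte`, etc.) is documented Pinecone behavior, but the surrounding field names (`queries`, `metadata_filter`, `max_tokens`) are assumptions for illustration; check the service's API reference for the real schema.

```python
import json

# Pinecone-style metadata filter (documented operators: $eq, $ne, $in, $gte, ...).
metadata_filter = {
    "source": {"$eq": "docs"},
    "year": {"$gte": 2023},
}

# Hypothetical request body for the /context/query endpoint; field names
# here are illustrative assumptions, not the confirmed Canopy schema.
request_body = {
    "queries": [
        {
            "text": "How do I configure the index?",
            "metadata_filter": metadata_filter,
        }
    ],
    "max_tokens": 512,
}

print(json.dumps(request_body, indent=2))
```

The same filter dictionary could also be set once for all queries via the config path mentioned above (context_engine/params/global_metadata_filter), rather than attached per request.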
@DataVigi As @izellevy mentioned, in chat mode you can set a global metadata filter for all queries using the Canopy config. That's on top of the /context/query API, which supports passing a filter with every individual query.
If you want the ChatBot to dynamically "decide" on a metadata filter according to the chat conversation contents - either by LLM reasoning or by pre-defined logic - this is the responsibility of the QueryGenerator.
You can either modify the prompt of the query generator in the config file, or derive a new QueryGenerator class and add your logic there.
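The "derive a new QueryGenerator and add your logic there" idea can be sketched as follows. This is a self-contained illustration: the stand-in `QueryGenerator` base and `Query` dataclass below only mirror the concept, and every name (including the `metadata_filter` attribute and the keyword-matching logic) is a hypothetical assumption, not the real Canopy API.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Query:
    """Stand-in for a retrieval query that can carry a metadata filter."""
    text: str
    metadata_filter: Optional[dict] = None  # Pinecone-style, e.g. {"source": {"$eq": "docs"}}

class QueryGenerator:
    """Stand-in for the library's abstract query generator base class."""
    def generate(self, messages: List[dict]) -> List[Query]:
        raise NotImplementedError

class FilteringQueryGenerator(QueryGenerator):
    """Derives a metadata filter from the conversation using pre-defined logic."""

    def generate(self, messages: List[dict]) -> List[Query]:
        # Take the most recent user message as the retrieval query.
        last_user_msg = next(
            m["content"] for m in reversed(messages) if m["role"] == "user"
        )
        # Pre-defined logic: restrict retrieval to release notes when the user
        # asks about versions. An LLM call could make this decision instead.
        metadata_filter = None
        if "version" in last_user_msg.lower():
            metadata_filter = {"doc_type": {"$eq": "release_notes"}}
        return [Query(text=last_user_msg, metadata_filter=metadata_filter)]

queries = FilteringQueryGenerator().generate(
    [{"role": "user", "content": "What changed in version 2.0?"}]
)
```

In the real library, the subclass would override the actual base class's method signature instead of this stand-in, and the chat engine would be pointed at it via the config.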
Hey @igiloh-pinecone, what would making a custom query generator entail exactly? I assume it means creating a class based on the QueryGenerator class, but how could that then be configured so that, for example, the ChatEngine uses it? I can't seem to find any information on how that sort of plugin system would work... Happy to contribute to the docs!