LLM implementation ideas #3667
We should split this up into individual issues so that we can prioritize them separately
Add a reverse explanation of a query: the input is a Cypher query, the output is a natural-language description of it, given the graph model. #4003
If the LLM generates an invalid query, improve it by sending the failing query together with the error back to the LLM so it can correct it, instead of just generating a new query; still retry with a fresh query if no results are returned. #4002
Add a procedure for RAG: you pass the user question plus a graph pattern (paths) and the relevant attributes, and it builds a prompt to answer the user question using the data on those paths, executes it with the LLM provider, and returns the answer. #4005
Add self-explanation to the model: include the verbal schema description in the flow. #4000
Top-k parameter in a single call, to retrieve k results. #4001
Given a (set of) queries, return the schema plus an explanation of the subgraph. #4004
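The error-feedback idea in #4002 could be sketched as a small retry loop. This is only an illustration of the control flow, not a proposed API: `generate` and `execute` are hypothetical callables standing in for the LLM provider and the query runner.

```python
# Sketch of the error-feedback loop from #4002 (hypothetical helper names;
# `generate` and `execute` are assumed callables, not a real API).

def repair_query(generate, execute, question, max_attempts=3):
    """Ask the model for a query; on failure, feed the failing query and
    the error back so the model can repair it instead of starting over."""
    query = generate(question)
    for _ in range(max_attempts):
        try:
            rows = execute(query)
        except Exception as err:
            # Send the failing query *and* the error back for repair.
            prompt = (f"{question}\n"
                      f"The query below failed with: {err}\n"
                      f"Fix it:\n{query}")
            query = generate(prompt)
            continue
        if rows:               # success: non-empty result set
            return query, rows
        # The query ran but returned nothing: retry with a fresh query.
        query = generate(question)
    return query, []
```

The key distinction from plain retrying is the two branches: a syntactically broken query is sent back with its error for targeted repair, while a valid-but-empty query triggers regeneration from scratch.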
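For the RAG procedure in #4005, the prompt-construction step could look roughly like the sketch below. All names here are hypothetical, and the path representation (a list of node dicts) is an assumption; the eventual procedure signature would depend on how paths are returned.

```python
# Minimal sketch of the prompt-building step of the RAG procedure in
# #4005 (hypothetical names; the real procedure API may differ).

def build_rag_prompt(question, paths, attributes):
    """Render the matched paths and their relevant attributes into a
    grounding context, then ask the model to answer from that context."""
    lines = []
    for path in paths:
        # Format each node as Label(attr=value, ...) and join hops.
        hops = " -> ".join(
            "{}({})".format(
                node["label"],
                ", ".join(f"{k}={node[k]}" for k in attributes if k in node),
            )
            for node in path
        )
        lines.append(hops)
    context = "\n".join(lines)
    return (
        "Answer the question using only the graph data below.\n"
        f"Graph data:\n{context}\n"
        f"Question: {question}\nAnswer:"
    )
```

The procedure as described would then send this prompt to the configured LLM provider and return the model's answer to the caller.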