Problem Description
If questions & answers, FAQs, or documentation are made available in local storage or IndexedDB, they can be used for retrieval-augmented generation (RAG).
Solution Description
Currently the chat responses are based only on the data provided during pre-training. That data may be outdated or insufficient. To overcome this, RAG can be considered.
Content can either be stored locally or fetched from search engines or specified sites using tool calling.
Content can be stored locally in:
- local storage, if only a limited number of curated question-answer pairs exists
- IndexedDB, if the curated question-answer pairs are too numerous to fit into local storage
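The split above can be sketched as a simple size check. This is a minimal illustration, not part of any existing API: the `chooseStore` helper and the ~5 MB budget are assumptions (5 MB is the typical per-origin localStorage quota; IndexedDB quotas are much larger and browser-dependent).

```typescript
// Sketch: pick a storage backend for curated Q&A pairs based on payload size.
// chooseStore and the 5 MB budget are illustrative assumptions.
type QAPair = { question: string; answer: string };

const LOCAL_STORAGE_BUDGET_BYTES = 5 * 1024 * 1024;

function chooseStore(pairs: QAPair[]): "localStorage" | "indexeddb" {
  const payload = JSON.stringify(pairs);
  // Rough estimate: localStorage stores strings as UTF-16, ~2 bytes per code unit.
  const approxBytes = payload.length * 2;
  return approxBytes < LOCAL_STORAGE_BUDGET_BYTES ? "localStorage" : "indexeddb";
}
```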
Voy, a WASM-based vector DB, can be used to store the content after embedding.
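At its core, what a vector store like Voy provides is k-nearest-neighbour search over embeddings. The sketch below is not Voy's actual API; it only illustrates the underlying cosine-similarity lookup, with embeddings represented as plain number arrays (in practice they would come from an embedding model, e.g. one run via Transformers.js).

```typescript
// Illustrative k-NN search over embedded documents (what a vector DB does
// under the hood). EmbeddedDoc, cosine, and topK are assumed names, not Voy's API.
type EmbeddedDoc = { id: string; text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(docs: EmbeddedDoc[], query: number[], k: number): EmbeddedDoc[] {
  // Sort a copy by descending similarity to the query and keep the best k.
  return [...docs]
    .sort((x, y) => cosine(y.embedding, query) - cosine(x.embedding, query))
    .slice(0, k);
}
```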
The content can either be uploaded from local files or synced via a REST API (configurable via settings or a button next to the prompts).
Retrieval can be done using elasticlunr.js for plain-text search, and Transformers.js for semantic search over embeddings.
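Whichever retriever is used, the retrieved passages then have to be assembled into the prompt sent to the model. A minimal sketch of that final RAG step follows; the `buildPrompt` helper and its template are illustrative assumptions, not an existing API.

```typescript
// Sketch: assemble a RAG prompt from retrieved passages.
// buildPrompt and the template wording are illustrative assumptions.
function buildPrompt(question: string, passages: string[]): string {
  // Number each passage so the model can cite its sources.
  const context = passages.map((p, i) => `[${i + 1}] ${p}`).join("\n");
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```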
Alternatives Considered
Something similar has been done in https://github.com/jacoblee93/fully-local-pdf-chatbot, but it does not allow:
Additional Context
No response