Description:
Currently, system prompts are scattered throughout the codebase and are only available in English. This setup makes it difficult to manage and internationalize prompts, especially when scaling to new languages or integrating multiple adapters. We propose centralizing the system prompts into a single module for easier management and internationalization.
Objective:
Centralize the system prompt management into a dedicated class.
Enable internationalization (i18n) for system prompts to support multiple languages.
Affected Adapters:
azure-openai
mistral
claude
titan
llama
Proposed Solution:
Create a new module, system_prompts.py, to store and manage all system prompts.
Refactor the following methods in /lib/model-interfaces/langchain/functions/request-handler/adapters/base/base.py:
get_prompt
get_condense_question_prompt
get_qa_prompt
These methods will fetch the system prompts from the new system_prompts.py module, centralizing and simplifying prompt management.
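As a rough illustration, system_prompts.py could map prompt names to language-specific strings and expose a single lookup function. This is only a minimal sketch: the prompt texts, the Language enum, and the get_system_prompt helper are placeholders for discussion, not the project's actual prompts or API.

```python
# system_prompts.py (hypothetical sketch)
from enum import Enum


class Language(str, Enum):
    ENGLISH = "en"
    FRENCH = "fr"


# Prompts keyed by prompt name, then by language.
SYSTEM_PROMPTS = {
    "conversation": {
        Language.ENGLISH: (
            "The following is a friendly conversation between a human and an AI. "
            "If the AI does not know the answer, it truthfully says it does not know."
        ),
        Language.FRENCH: (
            "Ce qui suit est une conversation amicale entre un humain et une IA. "
            "Si l'IA ne connaît pas la réponse, elle dit honnêtement qu'elle ne sait pas."
        ),
    },
    "condense_question": {
        Language.ENGLISH: (
            "Given the conversation and a follow-up question, rephrase the follow-up "
            "question to be a standalone question."
        ),
        Language.FRENCH: (
            "À partir de la conversation et d'une question de suivi, reformulez la "
            "question de suivi en une question autonome."
        ),
    },
    "qa": {
        Language.ENGLISH: (
            "Use the following pieces of context to answer the question at the end. "
            "If you don't know the answer, just say that you don't know."
        ),
        Language.FRENCH: (
            "Utilisez les éléments de contexte suivants pour répondre à la question. "
            "Si vous ne connaissez pas la réponse, dites simplement que vous ne savez pas."
        ),
    },
}


def get_system_prompt(name: str, language: Language = Language.ENGLISH) -> str:
    """Return the prompt `name` in `language`, falling back to English."""
    prompts = SYSTEM_PROMPTS[name]
    return prompts.get(language, prompts[Language.ENGLISH])
```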
Steps for Implementation:
Create a new module system_prompts.py to store all system prompts.
Refactor the aforementioned methods to pull prompts from the system_prompts.py module (see the sketch after this list).
Add support for multiple languages by enabling the selection of language-specific prompts in system_prompts.py.
Update all adapters to use the new prompt management system.
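For example, the prompt methods in the base adapter could be refactored along these lines. This is a hedged sketch: the import path adapters.shared.system_prompts, the get_language hook, the ModelAdapter class shown only partially, and the exact template strings are assumptions for illustration, not the repository's actual code.

```python
# base.py (hypothetical refactor sketch)
from langchain.prompts import PromptTemplate

from adapters.shared.system_prompts import Language, get_system_prompt  # assumed path


class ModelAdapter:  # base adapter, shown partially
    def get_prompt(self):
        # Fetch the centralized conversation prompt instead of an inline string.
        system = get_system_prompt("conversation", self.get_language())
        template = f"{system}\n\nCurrent conversation:\n{{chat_history}}\n\nQuestion: {{input}}"
        return PromptTemplate(
            input_variables=["input", "chat_history"], template=template
        )

    def get_condense_question_prompt(self):
        system = get_system_prompt("condense_question", self.get_language())
        template = (
            f"{system}\n\nChat History:\n{{chat_history}}\n"
            f"Follow Up Input: {{question}}\nStandalone question:"
        )
        return PromptTemplate(
            input_variables=["chat_history", "question"], template=template
        )

    def get_qa_prompt(self):
        system = get_system_prompt("qa", self.get_language())
        template = f"{system}\n\n{{context}}\n\nQuestion: {{question}}\nHelpful Answer:"
        return PromptTemplate(
            input_variables=["context", "question"], template=template
        )

    def get_language(self) -> Language:
        # Assumed hook: adapters or the request configuration could supply the target language.
        return getattr(self, "language", Language.ENGLISH)
```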
Expected Outcome:
The system prompts will be centralized in system_prompts.py.
Adapters will use a unified way to fetch prompts.
The system will be capable of handling prompts in multiple languages, improving flexibility and scalability.
Environment Information:
Affected adapters: azure-openai, mistral, claude, titan, llama in /lib/model-interfaces/langchain/functions/request-handler/adapters/
Additional Information:
This change will streamline the process of adding new languages for system prompts.
It will also reduce code duplication and improve maintainability across adapters.
This clearly lays out the problem, the objective, and the proposed solution for the project's contributors.
Hello @charles-marion
I've just noticed the change. I'm going to migrate to the latest commit for my work.
I will also take this change into account for #572