
Centralize and Internationalize System Prompts Across Adapter #571

Open
michel-heon opened this issue Sep 18, 2024 · 3 comments

Comments

@michel-heon (Contributor)

Description:

Currently, system prompts are scattered throughout the codebase and are only available in English. This makes the prompts difficult to manage and internationalize, especially when scaling to new languages or integrating multiple adapters. We propose centralizing the system prompts into a single module for easier management and to enable internationalization.


Objectives:

  • Centralize the system prompt management into a dedicated class.
  • Enable internationalization (i18n) for system prompts to support multiple languages.

Affected Adapters:

  • azure-openai
  • mistral
  • claude
  • titan
  • llama

Proposed Solution:

  • Create a new module, system_prompts.py, to store and manage all system prompts.
  • Refactor the following methods in /lib/model-interfaces/langchain/functions/request-handler/adapters/base/base.py:
    • get_prompt
    • get_condense_question_prompt
    • get_qa_prompt

These methods will fetch the system prompts from the new system_prompts.py module, centralizing and simplifying prompt management.
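As a rough illustration of what the proposed module might look like, here is a minimal sketch. All names (the SystemPrompts class, the get method, the prompt keys, and the prompt texts themselves) are hypothetical and not the project's actual API:

```python
# system_prompts.py -- hypothetical sketch of a centralized prompt store.
# Prompts are keyed first by prompt name, then by ISO language code.

class SystemPrompts:
    """Stores system prompts keyed by prompt name and language."""

    _PROMPTS = {
        "qa": {
            "en": "Use the following context to answer the question.",
            "fr": "Utilisez le contexte suivant pour répondre à la question.",
        },
        "condense_question": {
            "en": "Rephrase the follow-up question as a standalone question.",
            "fr": "Reformulez la question de suivi en une question autonome.",
        },
    }

    @classmethod
    def get(cls, name: str, language: str = "en") -> str:
        """Return prompt `name` in `language`, falling back to English."""
        translations = cls._PROMPTS[name]
        return translations.get(language, translations["en"])
```

With this shape, adding a language is a data change (one new dictionary entry per prompt) rather than a code change in each adapter, and an unknown language degrades gracefully to English.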


Steps for Implementation:

  1. Create a new module system_prompts.py to store all system prompts.
  2. Refactor the aforementioned methods to pull prompts from the system_prompts.py module.
  3. Add support for multiple languages by enabling the selection of language-specific prompts in system_prompts.py.
  4. Update all adapters to use the new prompt management system.
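Step 2 above could be sketched as follows. This is not the actual code in adapters/base/base.py; the prompt table, the `language` parameter, and the template texts are illustrative assumptions about how a refactored get_qa_prompt might select a language-specific template:

```python
# Hypothetical refactor: get_qa_prompt pulls its template from a central
# per-language table instead of a hard-coded English string.

QA_PROMPTS = {
    "en": (
        "Use the following pieces of context to answer the question.\n\n"
        "{context}\n\nQuestion: {question}"
    ),
    "fr": (
        "Utilisez les éléments de contexte suivants pour répondre à la "
        "question.\n\n{context}\n\nQuestion : {question}"
    ),
}

def get_qa_prompt(language: str = "en") -> str:
    """Return the QA prompt template for `language`, defaulting to English."""
    return QA_PROMPTS.get(language, QA_PROMPTS["en"])

# An adapter would then format the template with its retrieved context:
prompt = get_qa_prompt("fr").format(context="...", question="Qui es-tu ?")
```

Because every adapter calls the same function, the per-adapter code no longer needs to know which languages exist.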

Expected Outcome:

  • The system prompts will be centralized in system_prompts.py.
  • Adapters will use a unified way to fetch prompts.
  • The system will be capable of handling prompts in multiple languages, improving flexibility and scalability.

Environment Information:

  • Affected adapters: azure-openai, mistral, claude, titan, llama in /lib/model-interfaces/langchain/functions/request-handler/adapters/

Additional Information:

  • This change will streamline the process of adding new languages for system prompts.
  • It will also reduce code duplication and improve maintainability across adapters.

This clearly explains the problem, the objective, and the proposed solution for the project's contributors.

@charles-marion (Collaborator) commented Sep 19, 2024

Hi @michel-heon ,
Thank you for the details.

I recently merged the following change, which reduces prompt duplication for Bedrock by using a common solution across models. It might help your use case in the short term:
#569
https://github.com/aws-samples/aws-genai-llm-chatbot/blob/main/lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/base.py#L49

Regarding updating the prompts: another issue, #334, also suggests a way to modify the prompts, but using a registry.

@michel-heon (Contributor, Author)

Hello @charles-marion,
I've just noticed the change; I'm going to migrate my work to the latest commit.
I will also take this change into account for #572.

@michel-heon (Contributor, Author)

Hi @charles-marion,
See PR #576.
