
Exposing a model ID or similar #3

Open · domenic opened this issue Jul 1, 2024 · 1 comment
Labels: enhancement (New feature or request)

Comments

@domenic (Collaborator) commented Jul 1, 2024

As discussed in the explainer's Goals section, it might be useful to allow web developers to know some identifier for the language model in use, separate from the browser version. This would allow them to allowlist or blocklist specific models to maintain a desired level of quality, or restrict certain use cases to a specific model.
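For concreteness, here is a rough sketch of the kind of surface this could take and how the allowlist use case would look. Every name below is hypothetical, not a proposal for specific spellings:

```js
// Hypothetical sketch only: neither `modelId` nor these session names are
// part of the API; they just illustrate the allowlist/blocklist use case.
const session = await window.ai.createTextSession();

// Imagine the session (or a capabilities object) exposed an identifier
// that is independent of the browser version.
const id = session.modelId; // e.g. "example-model-v2" (made up)

const ALLOWLIST = ["example-model-v2", "example-model-v3"];
if (!ALLOWLIST.includes(id)) {
  // Quality bar not met: fall back to a server-side model, or hide the feature.
}
```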

This would probably not have significant privacy implications, since we already expose prompt(), and sufficiently detailed prompting should be able to distinguish between the possible models.
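To illustrate that point, a site could already probe the model with a fixed prompt and match the completion against answers previously collected from known models. A rough sketch, again with placeholder API names:

```js
// Sketch of prompt()-based model fingerprinting (hypothetical API names).
// Different models answer a fixed probe prompt in characteristically
// different ways, so the response text itself acts as an identifier.
async function probeModel() {
  const session = await window.ai.createTextSession();
  const reply = await session.prompt("Complete exactly: 'The capital of France is'");
  // Compare against responses previously observed from known models (made-up values).
  const signatures = { "Paris.": "model A (guess)", "Paris": "model B (guess)" };
  return signatures[reply.trim()] ?? "unknown";
}
```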

However, I worry a bit about the interop issues. Adding such an API does make it much easier to write code that only works in one browser, or with one model.

domenic added the enhancement label on Jul 1, 2024
@MiguelsPizza commented

For the sake of adoption, it might make the most sense to have the window.ai provider be as high-level as possible and abstract away model selection. I don't think users will be super jazzed about multiple models being added to their browser because different websites request different models.

Providing a few model-agnostic APIs like the current prompt and prompt-streaming APIs, plus maybe feature-extraction and classification APIs, makes the most sense to me.

This leaves room for a browser extension to override and match the window.ai API so users can opt in to different local or cloud-provided models via the extension popup. From the API consumer's perspective things remain constant, and the complexity of supporting different models is offloaded to the extension provider.
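As a rough sketch of that pattern (all names, shapes, and endpoints below are assumptions, not the actual API), an extension could inject a script into the page's main world that shadows window.ai with a compatible object and routes prompts to whatever backend the user picked in the popup:

```js
// Main-world script injected by the extension (e.g. via a content script
// registered with world: "MAIN"). It replaces window.ai with a shim that
// keeps the same assumed surface but forwards prompts to a user-chosen
// backend, here an illustrative local HTTP endpoint.
const shim = {
  async createTextSession() {
    return {
      async prompt(text) {
        const res = await fetch("http://localhost:8080/generate", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ prompt: text }),
        });
        const { completion } = await res.json(); // response shape is made up
        return completion;
      },
    };
  },
};
Object.defineProperty(window, "ai", { value: shim, configurable: true });
```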
