Bug : error code : 403 {detail : stream object has no attribute usage} #1301

Open
2 of 3 tasks
mustangs0786 opened this issue Oct 21, 2024 · 1 comment
Labels
triage Default label assignment, indicates new issue needs reviewed by a maintainer

Comments

mustangs0786 commented Oct 21, 2024

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the issue

Hi, after creating the parquet files, when I run a query I get this error back:

raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: Error code: 403 - {'detail': "'Stream' object has no attribute 'usage'"}

As I understand it, my GPT instance does not support streaming responses, which is why I am seeing this error.
To solve this, is there any setting in GraphRAG so that it expects the complete response at the end rather than waiting for a streamed answer? Thanks.
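A minimal stand-in sketch of what the error message points at (these are illustrative classes, not the real `openai` client types): a complete response object carries a `usage` field, whereas a streamed response is just an iterator of chunks with no `usage` attribute of its own, so code that reads `response.usage` fails on streams.

```python
# Stand-in classes, assumed for illustration only (not real openai types):
# a completed response carries `usage`, a stream yields chunks and does not.
from dataclasses import dataclass


@dataclass
class Usage:
    total_tokens: int


@dataclass
class CompleteResponse:
    text: str
    usage: Usage  # present on a full, non-streamed response


class StreamResponse:
    """Yields chunks; deliberately has no `usage` attribute."""

    def __iter__(self):
        yield "partial "
        yield "answer"


complete = CompleteResponse(text="answer", usage=Usage(total_tokens=12))

stream = StreamResponse()
has_usage = hasattr(stream, "usage")
print(has_usage)  # False: reading stream.usage would raise AttributeError
```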

Steps to reproduce

No response

GraphRAG Config Used

# Paste your config here

Logs and screenshots

No response

Additional Information

  • GraphRAG Version:
  • Operating System:
  • Python Version:
  • Related Issues:
@mustangs0786 mustangs0786 added the triage Default label assignment, indicates new issue needs reviewed by a maintainer label Oct 21, 2024
mustangs0786 changed the title from [Issue]: <title> to Bug : error code : 403 {detial : stream object no attribute usage} Oct 21, 2024
@mustangs0786 (Author) commented

Hi team, after digging for some time I found this section of code in the file

Path: "graphrag/query/structured_search/local_search/search.py"

response = await self.llm.generate(
    messages=search_messages,
    streaming=True,
    callbacks=self.callbacks,
    **self.llm_params,
)

By setting streaming=False, the issue got resolved.

The same goes for the global search file.

I request the team to expose this variable as a user input by adding it to the settings.yaml file.
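A hedged sketch of the change requested above: read a streaming flag from configuration and pass it through to the generate call instead of hard-coding `streaming=True`. The `SearchConfig` and `StubLLM` classes here are illustrative stand-ins, not real GraphRAG types, and a `streaming` key in settings.yaml is an assumption, not an existing setting.

```python
# Sketch: make the streaming flag configurable rather than hard-coded.
# SearchConfig and StubLLM are hypothetical stand-ins, not GraphRAG classes.
import asyncio
from dataclasses import dataclass


@dataclass
class SearchConfig:
    # Hypothetical setting; would map to a `streaming` key in settings.yaml.
    streaming: bool = True


class StubLLM:
    async def generate(self, messages, streaming, **kwargs):
        # A real client would stream chunks when streaming=True; this stub
        # just reports which mode was requested.
        mode = "streamed" if streaming else "complete"
        return f"{mode} response to: {messages[-1]}"


async def run_search(llm, config, search_messages, **llm_params):
    # Pass the configured flag instead of the hard-coded streaming=True.
    return await llm.generate(
        messages=search_messages,
        streaming=config.streaming,
        **llm_params,
    )


result = asyncio.run(
    run_search(StubLLM(), SearchConfig(streaming=False), ["hello"])
)
print(result)  # complete response to: hello
```

With `streaming=False` in the config, the search path would request a complete response up front, which is what endpoints that reject streams need.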

thanks
