
[Bug]: DeepSeek Coder V2 Not Working #541

Open
cliffordh opened this issue Jun 20, 2024 · 5 comments
Labels: bug (Something isn't working), investigating

Before Reporting

  • I have checked FAQ, and there is no solution to my issue
  • I have searched the existing issues, and there is no existing issue for my issue

What happened?

Using Ollama to run deepseek-coder-v2:

ollama run deepseek-coder-v2

I've set up a model in CopilotForXcode, but when I try to chat with it I get "The data couldn’t be read because it is missing."

How to reproduce the bug.

Install deepseek-coder-v2 and configure a chat model to talk to Ollama.
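
As a sanity check, the model can also be exercised directly against Ollama's chat API (this sketch assumes the default port 11434); if this request already fails, the problem is on the Ollama/model side rather than in CopilotForXcode:

curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-coder-v2",
  "messages": [{"role": "user", "content": "Write a Swift function that reverses a string."}],
  "stream": false
}'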

Relevant log output

n/a

macOS version

15

Xcode version

15

Copilot for Xcode version

0.33.4

cliffordh added the bug label on Jun 20, 2024
intitni (Owner) commented Jun 20, 2024

This model doesn't support messages with the system role. I will see what I can do about it later.
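
One way to test that hypothesis (again assuming the default port 11434) is to send a chat request that includes a system message and see whether that alone triggers the error:

curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-coder-v2",
  "messages": [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "hello"}
  ],
  "stream": false
}'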

intitni (Owner) commented Jun 20, 2024

It seems that it's not about the system message. I only know that it's complaining about the prompt template but I don't know why yet.

Here is what I saw from ollama:

ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported, falling back to chatml. This may cause the model to output suboptimal responses | tid="0x1f9f70c00" timestamp=1718876553

Here is what the stream response contains:

{\"error\":\"an unknown error was encountered while running the model \"}

intitni (Owner) commented Jun 20, 2024

The maintainer of Ollama said that the "template not yet supported" error is irrelevant.
╮(╯▽╰)╭

I did some experiments but couldn't tell what was wrong. Let's see if Ollama 0.1.45 changes anything.

intitni (Owner) commented Jun 20, 2024

I found that it fails whenever a message is long enough. If that's the case, there is nothing I can do about it. Copilot for Xcode includes part of the code from the editor in the prompt, which is why you see the error.

[Screenshot 2024-06-20 at 21:46:29]
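
A rough way to check the length hypothesis outside the app is to send a deliberately long prompt straight to Ollama (sketch only; the repetition count is arbitrary and jq is assumed to be available for JSON escaping):

# build a long, code-like prompt; 500 repetitions is an arbitrary choice
PROMPT=$(printf 'func example%d() {}\n' $(seq 1 500))
# wrap it in a chat request and send it to the local Ollama server
jq -n --arg m deepseek-coder-v2 --arg p "$PROMPT" \
  '{model: $m, messages: [{role: "user", content: $p}], stream: false}' \
  | curl http://localhost:11434/api/chat -d @-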

cliffordh (Author) commented

Ok, I will try to adjust the scope and message history I am sending, but I believe this model may be unusable as is.
