
InvokeAI package install failed #939

Closed
leeumane opened this issue Oct 8, 2024 · 10 comments · Fixed by #942
Labels
bug Something isn't working

Comments

@leeumane

leeumane commented Oct 8, 2024

Package

InvokeAI

When did the issue occur?

Installing the Package

What GPU / hardware type are you using?

RTX 3060 12GB

What happened?

I had Stability Matrix installed and updated to the latest version.
I saw that an update for InvokeAI (5.1.0) was available and that it added GGUF FLUX support, so I decided to install and test it.
After updating the InvokeAI package I couldn't launch it; it gave an error that it couldn't find the GGUF folders.
I uninstalled the package and tried installing it again; again, I couldn't launch it.
I removed Stability Matrix from my computer and reinstalled it altogether.
InvokeAI couldn't be installed at all, giving an error that the package installation failed.
I installed ComfyUI and it launches fine.
Still can't install InvokeAI, though.

Console output

No response

Version

2.12.1

What Operating System are you using?

Windows

@leeumane leeumane added the bug Something isn't working label Oct 8, 2024
@fbauer-kunbus

Same here ... the installed version 5.0.2 crashes with `ModuleNotFoundError: No module named 'gguf'`, and the update to 5.1.0 fails after churning for a while, with errors I can't even locate in the huge log output it produces. Invoke is dead for me now, while all other packages in S.M. run without problems.

@leeumane
Author

leeumane commented Oct 8, 2024

"Same here ... the installed version 5.0.2 crashes with `ModuleNotFoundError: No module named 'gguf'`, and the update to 5.1.0 fails after churning for a while, with errors I can't even locate in the huge log output it produces. Invoke is dead for me now, while all other packages in S.M. run without problems."

So far I've found that InvokeAI 5.1.0 installs fine, but it doesn't run because of the gguf error.
All versions from InvokeAI 5.1.0rc5 down to 5.1.0rc2 won't install at all, probably because of the GGUF support.
The most recent InvokeAI version that installs properly and runs with Stability Matrix is 5.1.0rc1.
If you need InvokeAI asap, just remove the installed package, then check the "Packages" folder in Stability Matrix to see if the package left anything behind. If it did, remove it (probably just a folder named "InvokeAI").
Then go to Add Package in SM, and when choosing InvokeAI, use the Branches -> Releases selector and select Release v5.1.0rc1.
Works fine for me, but it would be great if I could get the 5.1.0 version :)
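
For the missing gguf module specifically, a minimal manual sketch (the folder layout is an assumption based on a default Stability Matrix install and may differ on your machine) is to install the dependency straight into the package's own virtual environment:

```
REM Sketch only: adjust the path to wherever Stability Matrix placed InvokeAI
REM (typically somewhere under Data\Packages\InvokeAI).
cd \path\to\StabilityMatrix\Data\Packages\InvokeAI

REM Install the missing gguf dependency into the package's own venv,
REM leaving the system Python untouched.
venv\Scripts\python.exe -m pip install gguf
```

If the launch then fails on a different missing module, the install was probably interrupted earlier, and a clean reinstall (or the rc1 rollback above) is the safer route.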

@fbauer-kunbus

"Then go to +AddPackage in SM, and when choosing InvokeAI, use the Branches->Releases selector and select Release v5.1.0rc1 Works fine for me but it would be great if I could get the 5.1.0 version :)"

Isn't Stability Matrix supposed to allow parallel, isolated installations of the same package?

@leeumane
Author

leeumane commented Oct 8, 2024

"Isn't Stability Matrix supposed to allow parallel, isolated installations of the same package?"

For me, once I install a package it's no longer available for installation, only to launch or update.

@mohnjiles
Contributor

The "Add Package" button remains available at the bottom of the Packages page for you to install any number of packages, duplicate or not.

As of v2.12.0, you can also use Change Version directly from the package card's 3-dots menu to switch to rc1 without a complete reinstall while we work on a fix.

@fbauer-kunbus

fbauer-kunbus commented Oct 9, 2024

Is there a time horizon when we can expect this to be fixed? I'm just starting with InvokeAI and can easily wait a couple more days - but wouldn't want to wait more than a week ... So I am still undecided whether starting with RC1 (which seems to be the only one that is running at the moment) is the best option here.

@leeumane
Author

leeumane commented Oct 9, 2024

"Is there a time horizon when we can expect this to be fixed? I'm just starting with InvokeAI and can easily wait a couple more days - but wouldn't want to wait more than a week ... So I am still undecided whether starting with RC1 (which seems to be the only one that is running at the moment) is the best option here."

They're quite fast with updates, but let's not give them any pressure. :)
Nobody is obliged to do anything, really, especially within a given timeframe.

@fbauer-kunbus

"They're quite fast with updates, but let's not give them any pressure. :) Nobody is obliged to do anything, really, especially within a given timeframe."

As I said, no immediate pressure here, but I also need to plan when I can dive deep into this.
And having been a developer myself for almost four decades, I know that being asked about a time frame is nothing extraordinary ... especially for a showstopper like this error, where we don't have any transparency into how it could happen in the first place.
We are not talking about a minor functional glitch that could easily have stayed unnoticed during testing ... :-)

@mohnjiles
Contributor

Definitely won't be a week; we have the fix implemented and are just doing some final testing on various platforms. We hope to get it out later today if all looks good.

As for the "why": Invoke 5.1.x updated to torch 2.4.1 / CUDA 12.4, while our install script was still pointing it at the old CUDA 12.1 index. There is no pre-built xformers wheel compatible with both torch 2.4.1 and CUDA 12.1, so pip tries to build xformers on your local PC. Most folks probably don't have the proper build tools for that (and we don't install them for you because it's typically not necessary), hence these failures.

The fix is just to update our install to use the new CUDA 12.4 index. If you already build SM from source, feel free to DIY in the meantime 😄
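
For anyone building SM from source in the meantime, a rough sketch of the kind of change involved (not the actual install script; the package list and pins here are illustrative) is pointing pip at the cu124 wheel index so prebuilt torch 2.4.1 and xformers wheels get picked up instead of triggering a local xformers build:

```
REM Sketch only, not Stability Matrix's real install script. The point is switching
REM the PyTorch wheel index from cu121 to cu124 so that a prebuilt xformers wheel
REM matching torch 2.4.1 is found instead of being compiled locally.
venv\Scripts\python.exe -m pip install torch==2.4.1 torchvision xformers ^
    --index-url https://download.pytorch.org/whl/cu124
```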

@mohnjiles mohnjiles mentioned this issue Oct 10, 2024
@mohnjiles mohnjiles linked a pull request Oct 10, 2024 that will close this issue
@fbauer-kunbus

And people have complained in the past when PHP caused incompatibilities every FIVE YEARS or so :-)
I know why I don't like this Jenga-tower infrastructure that all this AI stuff is currently based on.
Thanks for your effort...!
