Unsupported type byte size: ComplexFloat #109
Got the same error on a Mac Mini M2.

+1, MBP M1 Pro.

Same error on a MacBook M2 Max.

Same error on an MBP M1 Pro.
https://github.com/Plachtaa/VALL-E-X/pull/102/files
But the same error persists. Does the running code need any modifications?
If I comment out the MPS support part, I get a "no hardware supported" error instead.
It works on CPU now. Is it still not possible today to use MPS in VALL-E?
I hit the same error. Here is the output from the command-line terminal. I have also attached the runtime environment information for reference. Here is the test1.py code that I ran:

Got the same error. +1
Just found a way around; it works for me now. Force the device to CPU:

```python
device = torch.device("cpu")
if torch.cuda.is_available():
    device = torch.device("cuda", 0)
# if torch.backends.mps.is_available():
#     device = torch.device("mps")
```
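The workaround above can be wrapped in a small helper so the rest of a script stays device-agnostic. This is a sketch, not VALL-E-X code: `pick_device` is a hypothetical name, and it deliberately never returns `"mps"` because of the `Unsupported type byte size: ComplexFloat` error this issue reports.

```python
def pick_device() -> str:
    """Choose a torch device string, skipping MPS on purpose.

    VALL-E-X uses complex tensors (ComplexFloat), which the MPS
    backend currently rejects, so we only consider CUDA and CPU.
    """
    try:
        import torch
    except ImportError:
        # torch not installed: CPU is the only safe answer
        return "cpu"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"  # MPS intentionally not considered


print(pick_device())
```

A caller would then do `device = torch.device(pick_device())` once and pass `device` everywhere, instead of editing the device check in several places.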
@KodeurKubik yep, but not all the relevant code gets commented out that way. Please check this comment and PR: #109 (comment)
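Since the MPS check appears in more than one file, one way to find every spot that may need commenting out is a grep over the checkout (a sketch; run from the repository root, and expect some false positives from unrelated matches):

```shell
# List every Python line in the repo that mentions "mps",
# e.g. torch.backends.mps.is_available() or torch.device("mps")
grep -rn "mps" --include="*.py" .
```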
System: Apple (Mac Pro M2), using 12 CPU cores for computing. When I use "Infer from prompt", I get this error.