Use MPS backend on Apple Silicon devices if it's available. #108
Conversation
This is required to make the VAE work on MPS.
That'd be amazing; I'm really struggling with the fp64 dtype error and was hoping someone would find a fix.
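For context, the fp64 dtype error typically comes from the MPS backend not supporting float64 tensors. A minimal sketch of the usual workaround (the helper name `move_to_device` is hypothetical, not from this PR) is to downcast before transferring:

```python
import torch

def move_to_device(t: torch.Tensor, device: torch.device) -> torch.Tensor:
    # MPS does not support float64; downcast to float32 before the transfer
    # to avoid the "Cannot convert a MPS Tensor to float64 dtype" error.
    if device.type == "mps" and t.dtype == torch.float64:
        t = t.to(torch.float32)
    return t.to(device)
```

On CUDA or CPU the helper is a no-op apart from the usual `.to(device)` move, so it is safe to call unconditionally.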
Sorry, I just clicked merge and then undid it. I will instead add an entry to the README.md so that Mac users can easily find this pull request.
Thank you for your hard work! Greatly appreciated!
@john2stai I assume you're trying to use kijai/ComfyUI-PyramidFlowWrapper.
See #113 instead, which has a small fix on top of this Pull Request. Since this Pull Request was merged accidentally and then reverted on
main, I can't update it anymore.

Problems
Inference runs faster using the MPS backend on Apple Silicon devices, but it is not enabled by default and requires some modification to the code, which currently only considers CUDA availability.
Solution
Use MPS backend if it's available.
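The change described above amounts to a device-selection fallback. A minimal sketch, assuming PyTorch (the function name `pick_device` is illustrative, not the actual code in this PR):

```python
import torch

def pick_device() -> torch.device:
    # Prefer CUDA when present, then Apple's MPS backend, then CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
```

Models and tensors are then moved with `model.to(device)` instead of hard-coding `.cuda()`, which is what makes the code CUDA-only today.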
NOTE: This patch does not take training into account at all; it covers inference only. I tried to make training work as well as it does on CUDA, but that would require, for example, dependency updates that may not be preferred, so I don't expect this Pull Request to be mergeable into
main. I'm posting it here anyway because I think it's worth having for those who want to try inference easily on, for example, their MacBook Pro.