Hi, many thanks for your blog post and for these scripts and resources.
I'm able to get around 2x speed on the large-v2 model with an AMD RX 6800 (Ubuntu 22.04, PyTorch with the ROCm 5.2 backend).
However, this hardware is not mine, and I'm considering upgrading my current Mac to a Mac Mini M1 or Mac Studio. That's why I'd be very interested to know whether you are able to run Whisper with the newly released MPS backend of PyTorch.
Are you able to run the large model?
How fast is it (hopefully faster than your CPU-based tests)?
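For anyone experimenting with this, a minimal sketch of device selection before loading Whisper might look like the following. The helper function and the `audio.mp3` filename are illustrative assumptions, not from the original post; note that some Whisper operations may not be implemented on MPS yet, so a CPU fallback is included:

```python
# Sketch: pick the best available PyTorch device, preferring CUDA/ROCm,
# then MPS on Apple Silicon, then CPU. Selection is kept as a pure
# function so the logic can be checked without torch installed.

def pick_device(cuda_ok: bool, mps_ok: bool) -> str:
    """Return a torch device string given backend availability flags."""
    if cuda_ok:
        return "cuda"  # e.g. the ROCm-backed RX 6800 mentioned above
    if mps_ok:
        return "mps"   # Apple Silicon (M1 / Mac Studio)
    return "cpu"       # fallback when no accelerator is usable

# Hypothetical usage (requires torch and openai-whisper installed):
#
#   import torch, whisper
#   device = pick_device(torch.cuda.is_available(),
#                        torch.backends.mps.is_available())
#   model = whisper.load_model("large-v2", device=device)
#   result = model.transcribe("audio.mp3")
#   print(result["text"])
```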
Thanks