Ah, I don't know why I didn't think to check whether openai/whisper could be set to CPU only.
I suppose for Windows users with low GPU VRAM but enough system RAM, this will do fine, but I'd prefer Linux users use WhisperCPP, as the models are just as good but use less RAM.
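For reference, `whisper.load_model` in the openai-whisper package accepts a `device` argument, so forcing CPU inference is a one-line change. A minimal sketch (the `pick_device` helper is my own name, not part of the library):

```python
# Sketch of forcing CPU-only inference for openai/whisper.
# Assumes the openai-whisper package's load_model(name, device=...) signature.

def pick_device(force_cpu: bool, cuda_available: bool) -> str:
    """Return the device string to pass to whisper.load_model."""
    return "cpu" if force_cpu or not cuda_available else "cuda"

# Usage (requires `pip install openai-whisper`):
#   import whisper
#   model = whisper.load_model("large-v2", device=pick_device(True, False))
#   print(model.transcribe("audio.wav")["text"])
```

Running a large model on CPU trades speed for the ability to use system RAM instead of GPU VRAM, which matches the low-VRAM use case discussed above.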
For users wishing to run a larger Whisper model but lacking VRAM, run it on CPU.
8ed09f9b87
into master
Pull request successfully merged and closed
Branch vram can now be deleted.