Catch OOM and run whisper on cpu automatically. #117
Reference: mrq/ai-voice-cloning#117
For users who want to run a larger Whisper model but lack the VRAM for it, this catches the CUDA out-of-memory error and falls back to running it on the CPU.
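The fallback pattern described above could be sketched roughly like this. Note this is a minimal illustration, not the PR's actual code: `load_with_cpu_fallback` and `fake_load_model` are hypothetical names, and the stub loader stands in for `whisper.load_model` (which does accept a `device` argument) so the example runs without a GPU. It relies on PyTorch reporting VRAM exhaustion as a `RuntimeError` whose message contains "out of memory".

```python
def load_with_cpu_fallback(load_model, name):
    """Try loading the model on the GPU first; on a CUDA
    out-of-memory error, retry on the CPU instead."""
    try:
        return load_model(name, device="cuda")
    except RuntimeError as err:
        # PyTorch raises RuntimeError with "out of memory" in the
        # message when VRAM runs out; anything else is a real error
        # and should propagate.
        if "out of memory" not in str(err).lower():
            raise
        return load_model(name, device="cpu")

# Hypothetical stub standing in for whisper.load_model, simulating a
# card with too little VRAM for the "large" model.
def fake_load_model(name, device):
    if device == "cuda" and name == "large":
        raise RuntimeError("CUDA out of memory. Tried to allocate ...")
    return (name, device)
```

With this stub, requesting "large" falls back to the CPU while smaller models stay on the GPU.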
Ah, I don't know why I didn't think to check if there was a way to set it to CPU only for openai/whisper.
I suppose for Windows users with low GPU VRAM but enough system RAM, this will do fine, but I'd prefer Linux users use whisper.cpp, as the models are just as good but use less RAM.