gradio.exceptions.Error: 'torch.cat(): expected a non-empty list of Tensors' #21
Reference: mrq/ai-voice-cloning#21
You're gonna hate me...
Setting Samples in Gradio to any number divides the reported step count by 12, 16, etc.?
Setting a low enough number will cause an error:
That's from it automatically deducing your batch size. On DirectML, there doesn't seem to be an (easy) way to get VRAM capacity, but it's exposed with ROCm, so it's using larger batch sizes.
I suppose I technically never set it to a low enough number, since 16 is the lowest I'll go, and the batch-size deduction from the original tortoise-tts goes up to a batch size of 16, I think.
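As a rough illustration of the deduction described above (a hypothetical sketch, not the actual tortoise-tts code; the function name and VRAM tiers are assumptions, only the cap at 16 and the batch-size-1 fallback come from the thread):

```python
from typing import Optional

def deduce_batch_size(vram_gb: Optional[float]) -> int:
    """Pick an autoregressive batch size from VRAM capacity in GB.

    Hypothetical sketch: returns 1 when VRAM can't be queried
    (e.g. under DirectML), matching the fallback described in the
    thread; the intermediate tiers are made up for illustration.
    """
    if vram_gb is None:
        return 1   # DirectML: no (easy) way to read VRAM capacity
    if vram_gb >= 16:
        return 16  # the deduction tops out at 16 per the thread
    if vram_gb >= 8:
        return 8
    if vram_gb >= 6:
        return 4
    return 1
```

This is why ROCm users, whose VRAM capacity is exposed, silently end up with larger batch sizes than DirectML users.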
Can you provide a full stack trace? I'm pretty sure I won't need one, since I'll just clamp the batch size down to the sample size.

Actually, I should be able to replicate it.

I see. When I was on Windows using the Generate tab, it would always show the exact number of samples I set (if I set 100 under Samples, it would show 1/100 steps, etc.).
This is with Samples set to 9:
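To illustrate the failure mode and the clamp mentioned above (a minimal sketch with made-up helper names, not the repo's actual code): splitting the samples into fixed-size batches by floor division yields an empty list when samples < batch size, and `torch.cat()` on an empty list raises exactly this error.

```python
def batch_sizes_buggy(num_samples: int, batch_size: int) -> list:
    # Floor division drops everything when num_samples < batch_size:
    # 9 samples at batch size 16 -> [], so torch.cat() later receives
    # an empty list of tensors and raises.
    return [batch_size] * (num_samples // batch_size)

def batch_sizes_clamped(num_samples: int, batch_size: int) -> list:
    # The clamp proposed in the thread: never batch more than we sample.
    batch_size = min(batch_size, num_samples)
    return [batch_size] * (num_samples // batch_size)
```

With Samples set to 9 and a deduced batch size of 16, the buggy version produces no batches at all, while the clamped version produces one batch of 9.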
Should be fixed in commit
9e64dad785
. I can't do any tests for it on my 2060, since I can't increase its batch size all that much for higher numbers.

This was the output after updating.
Oops, a slight oversight. Should be fixed in commit
fefc7aba03
.

I'm either really thick or I'm missing something: when I read Samples in the UI, am I actually reading batches of 16? So when I set 256 samples, is that 16 batches of 16 samples in the progress field on the right-hand side of Gradio?
'Cause, like I said a little earlier, when I was on Windows and set, say, 100 samples, the right-hand side of Gradio would show "1/100 steps" while generating.
It is working again however, thank you.
The Sample slider in the Generate tab will only affect how many (autoregressive) samples you want.
The progress bar on the right will report [Current Batch / Total Batches], where Total Batches = Autoregressive Samples / Batch Size (or exchange "batch" with "step" or "iteration").

Yep. If you do 128 samples, it's 8 batches of 16 samples, while 96 samples will give you 6 batches of 16 samples.
I wouldn't worry much about it (especially in terms of finetuning/training), just know more samples will give better output, while higher batch sizes will make things faster, at the cost of VRAM.
Yep. With DirectML it'll default to a batch size of 1, since there's no way to gauge VRAM size, and VRAM is the only real limitation on increasing the batch size.
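The arithmetic above can be sketched in a few lines (illustrative only, not the project's code):

```python
import math

def total_batches(samples: int, batch_size: int) -> int:
    """Total Batches = Autoregressive Samples / Batch Size,
    rounded up when the samples don't divide evenly."""
    return math.ceil(samples / batch_size)

# 128 samples at batch size 16 -> 8 batches; 96 -> 6 batches.
# Under DirectML's fallback batch size of 1, 100 samples means
# 100 batches, which is why the progress bar read "1/100 steps"
# on Windows.
```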