Using TPUs in Google Colab? #365

Open
opened 2023-09-02 03:42:13 +00:00 by helloitsme · 2 comments

Is anyone using TPUs in Google Colab? If so, how can you do so? I'm assuming there might be different libraries that have to be installed, since simply selecting TPU runtime doesn't work. It would significantly speed up training and inference time. Thanks

The last time I used a TPU, it was on Kaggle. Search for how to set up a TPU on Kaggle. Then you'll have to modify this repo's code to use PyTorch's XLA device so it runs on the TPU. A better option is the PyTorch Lightning library, which handles TPU support automatically.
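A minimal sketch of what "use the XLA device from PyTorch" means, assuming the `torch_xla` package is installed on the TPU runtime (it is not bundled with this repo; the model and tensors below are placeholders, not the repo's actual code):

```python
import torch

# Pick the XLA (TPU) device if torch_xla is available; otherwise fall back
# to CPU so the same script still runs locally. On a Colab/Kaggle TPU
# runtime, torch_xla must be installed separately.
try:
    import torch_xla.core.xla_model as xm
    device = xm.xla_device()
except ImportError:
    device = torch.device("cpu")

# Moving the model and its inputs to `device` is the core change you'd
# make throughout the repo's training/inference code.
model = torch.nn.Linear(10, 2).to(device)
out = model(torch.randn(4, 10).to(device))
print(out.shape)  # (4, 2)
```

With PyTorch Lightning this device plumbing goes away: you pass `accelerator="tpu"` to the `Trainer` and it handles XLA placement for you.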

By the way, the TPU memory footprint is notoriously large. I trained EfficientNet on a P100 (16 GB) successfully, but on a TPU I had to significantly reduce the batch size.

Reference: mrq/ai-voice-cloning#365