
Use a GPU to speed up neural net training in R

Data Science Asked by Blake Lucey on January 17, 2021

I'm currently training a neural net model in R and want to use a GPU to speed up the process. I've looked into this, but it appears the option is unavailable to Mac users, as Apple no longer ships NVIDIA GPUs.

Can anyone tell me whether this is the case, and if not, how I can go about using a GPU?

One Answer

If you're able to convert the code into Python, you could use the Google Colab environment or Kaggle kernels. These online platforms provide free GPUs that you can use.

Kaggle kernels also support R directly.
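For instance, here is a minimal sketch for an R notebook, assuming Kaggle's R image with the {tensorflow} package installed and the GPU accelerator enabled, to confirm that TensorFlow can see the GPU before you start training:

```r
# Minimal sketch: check whether a GPU is visible to TensorFlow from R.
# Assumes the {tensorflow} R package is installed (as on Kaggle's R
# notebook image) and that a GPU accelerator is attached to the session.
library(tensorflow)

# List the GPU devices TensorFlow can see; an empty list means CPU-only.
gpus <- tf$config$list_physical_devices("GPU")
print(gpus)

# If a GPU is listed, Keras/TensorFlow training code needs no changes:
# operations are placed on the GPU automatically when one is available.
```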

Correct answer by Anoop A Nair on January 17, 2021

