Data Science Asked by Blake Lucey on January 17, 2021
I’m currently training a neural net model in R and want to use a GPU to speed up this process. I’ve looked into this, but it appears to be unavailable to Mac users since Apple no longer ships NVIDIA GPUs.
Can anyone tell me if this is the case, and if not, how I can go about utilizing a GPU?
If you're able to convert the code into Python, then you could use the Google Colab environment or Kaggle kernels. These online platforms provide free GPUs that you can utilize (see the sketch below for a quick check that the GPU is actually available).
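If you go the Python route, a minimal sanity check like the following (a sketch assuming TensorFlow, which is preinstalled on both Colab and Kaggle GPU runtimes) confirms that the notebook can actually see the GPU before you start training:

```python
import tensorflow as tf

# List the GPU devices visible to TensorFlow.
# On a Colab/Kaggle runtime with a GPU accelerator enabled,
# this should print at least one device; otherwise the list is empty.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs available:", gpus)
```

If the list is empty, switch the runtime/accelerator setting to GPU in the notebook environment and rerun the check.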
Kaggle kernels also support R directly, so you may not need to convert your code at all.
Correct answer by Anoop A Nair on January 17, 2021