5

I am new to deep learning. I am running a MacBook Pro on OS X Yosemite (upgraded from Snow Leopard). I don't have a CUDA-enabled GPU, and running the code on the CPU is extremely slow. I heard that I can rent instances on AWS, but it seems that they don't support Mac OS.

My question is: to continue with deep learning, do I need to purchase a graphics card? Or is there another solution? I don't want to spend too much on this...

Blaszard
Lilianna
  • Can you add some additional information to help clarify your question? Is the algorithm you are running written by you in CUDA or are you using a package written by someone else? Does that package or code only run on OSX or are you wondering about connectivity between AWS and OSX? You can certainly connect a Mac to AWS and run your code/package on AWS in a Linux or Windows environment. Further, OSX is built on a Unix environment, so most small code packages that run in OSX will run in Linux. – AN6U5 Jul 31 '15 at 16:38
  • Thank you AN6U5! I want to use Lasagne to train my neural network. And yes, I wrote the code myself, and will try a convolutional NN later. I want to use AWS instances to accelerate the computation. My doubt came from the AWS website, which says that new users get 750 hours/month of Linux/Windows micro-instance usage for free; Mac OS is not included. – Lilianna Aug 02 '15 at 06:47

3 Answers

4

I would recommend familiarizing yourself with AWS spot instances. It's the most practical solution I can think of for your problem, and it works with your computer too. So no, you don't have to buy an Nvidia card, but as of today you will want to use one, since almost all the popular solutions rely on them.
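
If you would rather script the spot request than click through the web console, a minimal sketch with boto3 might look like the following; the region, bid price, AMI ID, and key pair name are placeholders, not values from this answer.

```python
# Hedged sketch: requesting a GPU spot instance with boto3.
# The region, bid price, AMI ID, and key pair name below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

response = ec2.request_spot_instances(
    SpotPrice="0.10",        # maximum hourly bid in USD (placeholder)
    InstanceCount=1,
    Type="one-time",
    LaunchSpecification={
        "ImageId": "ami-xxxxxxxx",     # hypothetical AMI with CUDA/Theano preinstalled
        "InstanceType": "g2.2xlarge",  # AWS GPU instance type available in 2015
        "KeyName": "my-key-pair",      # hypothetical EC2 key pair for SSH access
    },
)

request_id = response["SpotInstanceRequests"][0]["SpotInstanceRequestId"]
print("Spot request submitted:", request_id)
```

Once the request is fulfilled, you SSH into the instance from your Mac as you would with any other Linux server.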

Emre
2

AWS GPU instances are an option if you want to do CUDA development. If you don't want to leverage the cloud, you can look into the Nvidia Jetson TK1 development kits. They are about 200 dollars (as of July 2015) and have 192 CUDA cores, a quad-core ARM processor, and 2 GB of RAM.

Alternatively, the same amount of money could buy you a 640-CUDA-core GeForce GTX 750 Ti, or perhaps a 1024-CUDA-core GTX 960. A GT 720 with 1 GB of RAM and 192 cores could be had for about 45 dollars (as of July 2015).

You don't have to use an Nvidia GPU for deep learning, though a GPU will increase the speed dramatically. Note, however, that common deep learning toolkits have very little support for non-Nvidia GPUs.
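
Since the question mentions Lasagne, which runs on top of Theano, a quick way to see whether your setup is actually using a CUDA GPU is the standard benchmark from the Theano documentation; the sketch below is adapted from that test and assumes Theano is installed.

```python
# Hedged sketch, adapted from the standard Theano GPU test: times a simple
# elementwise op and reports whether it ran on the GPU or fell back to the CPU.
# Run with, e.g.:  THEANO_FLAGS=device=gpu,floatX=float32 python check_gpu.py
import time
import numpy
from theano import function, config, shared, tensor

vlen = 10 * 30 * 768   # arbitrary vector length for the benchmark
iters = 1000

rng = numpy.random.RandomState(22)
x = shared(numpy.asarray(rng.rand(vlen), config.floatX))
f = function([], tensor.exp(x))

t0 = time.time()
for _ in range(iters):
    r = f()
print("Looping %d times took %f seconds" % (iters, time.time() - t0))

# If any op in the compiled graph is a plain CPU Elemwise, the computation
# was not moved to the GPU.
if numpy.any([isinstance(node.op, tensor.Elemwise)
              for node in f.maker.fgraph.toposort()]):
    print("Used the CPU")
else:
    print("Used the GPU")
```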

Steve Kallestad
-1

AWS does "support" Mac OS. You can use any SSH client from your Mac to access the GPU instance.