
Deep Learning on a Budget: $450 eGPU vs Google Colab

Source: https://towardsdatascience.com/deep-learning-on-a-budget-450-egpu-vs-google-colab-494f9a2ff0db

Colab is phenomenal for beginning deep learning, but how does it stack up against an eGPU + Ultrabook?


Photo by Nana Dua on Unsplash

Deep learning is expensive. A GPU is a given for even the simplest of tasks. For people who want the best on-demand processing power, a new computer costs upwards of $1,500, and renting that power from a cloud computing service, when heavily utilized, can easily run over $100 each month. That is fine for businesses, but for the average individual, it adds up.

Because of this, 2 months ago, I made my first major purchase to give myself reasonable computing power. I already owned an older XPS 15 with a small GPU (a GTX 960M that just was not cutting it), so I bought a Razer Core + NVIDIA GTX 1080, and I immediately had around 4x the processing power I had before, for under $450 (I bought both used). It was such a great transformation that I wrote a Medium article detailing the whole process and results (you can read that here).

The article was very well received, but multiple readers and colleagues of mine asked the same question: “How does this stack up to Colab?” To all of them, I answered that I had heard of people liking it, but I had never seriously considered it. After all, how good could a free GPU be?

Before I get into the results, I’ll give a little background on Colab itself. Colab is a product developed by Google Research that gives anyone the ability to execute Python code. It runs completely in the browser and uses Google Drive as its primary file store, which makes notebooks easily shareable and keeps everything in the cloud.
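For reference, hooking a notebook up to Google Drive takes only a couple of lines; a minimal sketch (the results folder name is just illustrative):

```python
# Mount Google Drive so files written by the notebook persist between sessions
from google.colab import drive

drive.mount('/content/drive')

# Anything saved under the mounted path survives runtime resets
results_path = '/content/drive/MyDrive/benchmarks'  # illustrative path
```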

The catch is that compute resources are not guaranteed, meaning you’re essentially getting whatever Google is not currently using. Colab states that most of the time this means the assigned GPU will be one of NVIDIA’s K80s, T4s, P4s, or P100s. The other catch is that if you stop using Colab for a while, it will reassign those resources to somebody else, and in my tests this was not very forgiving: even in the time it took me to take my dog out to the bathroom, I was hitting timeouts. This gets annoying if you let something run, go do something else, and do not have the results saved by the time the session is reclaimed. The final drawback is the 12-hour limit on runtime. Most people should not worry about that, but for large projects where tuning may take a while, it could be troublesome.
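Since the assigned card can change every session, it is worth checking what you were actually given before kicking off anything long. A quick sketch of one way to do that (the output shown in the comment is illustrative):

```python
# Ask the driver which GPU this Colab session was handed
import subprocess

gpu_info = subprocess.run(
    ['nvidia-smi', '--query-gpu=name,memory.total', '--format=csv,noheader'],
    capture_output=True, text=True,
)
print(gpu_info.stdout)  # e.g. "Tesla K80, 11441 MiB"

# TensorFlow can also confirm a GPU is visible to the runtime
import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))
```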

Now that I have explained what Colab is, I can explain how I tested it. As with my eGPU in the previous article, I used AI Benchmark, an incredibly simple yet powerful Python library that uses TensorFlow to run 42 tests across 19 sections, producing a good generalized AI score for a GPU. I initialized a fresh runtime 15 times and ran AI Benchmark each time, recording the results as well as the assigned GPU. I got the K80 10 times, the P100 4 times, and the T4 once (I never got the P4, but it should perform slightly worse than the T4). The speed results are charted below, right after a quick look at the benchmarking code.
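Each of the 15 runs boils down to just a few lines; a minimal sketch using the ai-benchmark package from PyPI (the ai_score attribute is how the library reports its aggregate result, per its documentation):

```python
# pip install ai-benchmark
from ai_benchmark import AIBenchmark

# Runs the full suite of inference and training tests on the visible GPU
benchmark = AIBenchmark()
results = benchmark.run()

# The aggregate score is what is plotted in the charts below
print(results.ai_score)
```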


Colab Results

As you can see, there is a large amount of variance in the results: the fastest GPU performed almost 4x faster than the slowest one Colab can assign. Still, for free GPUs, they are all phenomenal, and I was blown away that any of these are handed out at no cost. They all provide 12+ gigabytes of memory, plenty for most projects, and the P100, when it is assigned, is close in speed to a brand new RTX 2070. Even the K80, the slowest and most common, performs only slightly worse than a GTX 1050 Ti, which is still a $150 GPU.

How does this compare to my eGPU? Honestly, Colab stacks up very well. Below is the same chart with my GTX 1080 setup plotted alongside!


Colab vs GTX 1080 eGPU

In the median case, Colab assigns a K80, and the GTX 1080 is around double its speed, which does not reflect particularly well on Colab. On occasion, though, a P100 is assigned, and the P100 is an absolute killer GPU (again, for FREE). Because the P100 is so strong, the average case looks much closer: on average, my eGPU performed only around 15% better.


eGPU vs Colab Average

So if you are looking to get into machine learning (or to upgrade your current setup), what should you do? My personal recommendation: if you only want to dabble in machine learning (at least to start), are taking a first machine learning course, or will not need computing resources consistently, Colab is an absolute no-brainer. The compute it provides for free is tremendous.

However, if you feel limited by what Colab offers, whether you need faster speeds, guaranteed resources at all times, longer training runs, or a non-Python language, or you plan to use your GPU for more than just machine learning, an eGPU (or dedicated GPU) will probably work out better for you!

