Deep learning applications don't necessarily need a GPU (graphics card), but a GPU can speed up compute-intensive operations many times over. For image and video analysis, a GPU is practically essential.
You can always rent GPU-enabled servers on AWS, but the cost adds up quickly. You can check the P2 instances on AWS (Instance Types).
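To weigh renting against buying, a quick break-even calculation helps. A minimal sketch with illustrative numbers (roughly $0.90/hour for a rented GPU instance and $700 for a mid-range card are assumptions here, not current prices; check AWS pricing yourself):

```python
def break_even_hours(card_price_usd: float, cloud_rate_usd_per_hour: float) -> float:
    """Hours of rented GPU time that would cost as much as buying the card outright."""
    return card_price_usd / cloud_rate_usd_per_hour

# Assumed, illustrative numbers -- not current AWS pricing.
hours = break_even_hours(card_price_usd=700.0, cloud_rate_usd_per_hour=0.90)
print(f"Break-even after ~{hours:.0f} hours of rented GPU time")
```

If you expect to train for more hours than the break-even point, owning the card is usually the cheaper option.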
The pictures below explain the differences between a CPU and a GPU.
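Before configuring anything, you can check whether a machine already exposes an NVIDIA GPU. A minimal sketch using only the standard library; note it only detects the `nvidia-smi` driver utility on the PATH, not whether a deep learning framework can actually use the card:

```python
import shutil

def has_nvidia_gpu() -> bool:
    """Rough check: is the NVIDIA driver utility (nvidia-smi) on the PATH?"""
    return shutil.which("nvidia-smi") is not None

print("NVIDIA GPU driver detected" if has_nvidia_gpu() else "No NVIDIA driver found")
```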
Once you have decided to configure your own desktop/laptop with a GPU, there are a few things you should keep in mind:
1. Compatibility of motherboard and graphics card:
Check your motherboard's compatibility first. A good description can be found here:
2. An SMPS (power supply) that can support the GPU
You will need an SMPS that can meet the power requirement of your GPU. Check the GPU's specifications for its exact power requirement.
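A rough way to size the SMPS is to add the GPU's rated TDP to the rest of the system's draw and leave some headroom. A sketch with assumed numbers (the 250 W GPU TDP and ~150 W for CPU, drives, and fans are illustrative values, not measurements of any specific build):

```python
def recommended_psu_watts(gpu_tdp_w: float, rest_of_system_w: float,
                          headroom: float = 0.3) -> float:
    """Total system draw plus a safety margin (30% by default) for peaks and aging."""
    return (gpu_tdp_w + rest_of_system_w) * (1 + headroom)

# Illustrative: a 250 W GPU (Titan X class) plus ~150 W for the rest of the system.
print(f"Suggested PSU rating: ~{recommended_psu_watts(250, 150):.0f} W")
```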
3. Which GPU to pick
GPUs are available at all price points. Though AMD also makes GPUs and Intel offers the Xeon Phi accelerator, NVIDIA should be the first choice. NVIDIA's standard libraries made it easy to build the first deep learning libraries on CUDA, the community around NVIDIA GPUs is large, and you will surely find the forum support useful.
A really good post that can help with selection, and also includes performance comparisons:
Best GPU overall: Titan X Pascal
Cheapest card with no troubles: GTX 1060
I work with data sets > 250GB: Regular GTX Titan X or Titan X Pascal
I started deep learning and I am serious about it: Start with a GTX 1060. Depending on which area you choose next (startup, Kaggle, research, applied deep learning), sell your GTX 1060 and buy something more appropriate.
(Source: Tim Dettmers)
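The recommendations above can be captured as a small lookup table. This is purely a summary of the list; the scenario labels are my own, and the card models reflect the post's 2016-era advice:

```python
# Hypothetical scenario labels summarizing the recommendations above.
GPU_PICKS = {
    "best overall": "Titan X Pascal",
    "cheapest with no troubles": "GTX 1060",
    "datasets over 250GB": "GTX Titan X or Titan X Pascal",
    "serious beginner": "GTX 1060 (upgrade later based on your direction)",
}

def pick_gpu(scenario: str) -> str:
    """Return the recommended card for a scenario, or a fallback message."""
    return GPU_PICKS.get(scenario, "unknown scenario -- see the list above")

print(pick_gpu("serious beginner"))
```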