What is a GPU? (GPUs in AI)

If you’ve spent time in the Blockchain industry, you may have heard of GPU mining. What you may not know is exactly what a GPU is. It’s also been reported over and over in the popular press that the Blockchain space is depleting the world’s supply of GPUs. For the purposes of this piece, however, we’ll stick to a different side of the GPU story: its lesser-known connection to the Artificial Intelligence industry.

A GPU’s Utility in AI

So, what is a GPU?

To put it plainly, a GPU is a Graphics Processing Unit.

Inside your computer, almost everything gets done by this chip and the Central Processing Unit working together. The easiest way to think about the two is that the Central Processing Unit executes the instructions that make applications run, while the Graphics Processing Unit performs large batches of mathematical calculations very quickly, many of them at the same time. A GPU’s work can therefore involve rendering images quickly and efficiently from these calculations. In this context, rendering images means running the calculations needed to make a series of pictures or graphics appear on a screen as realistically as they are supposed to.

In the case of AI, the process of using GPUs plays out in a somewhat more involved way. One of the most prevalent use cases for a GPU in AI is to accelerate AI projects, chiefly through what is known as GPU-accelerated computing, at least according to leaders in the space like NVIDIA, the company credited with inventing the GPU. The essence of GPU-accelerated computing is that it greatly expands what we can do with our computers by offloading the heaviest calculations from the CPU to the GPU. Specific examples include scientific research with what NVIDIA calls “molecular-scale” experiments, meaning work that goes down to the size of a molecule to analyze certain processes in detail. If GPU-accelerated computing allows us to run experiments at this scale, then parallels can be drawn between it and Quantum Computing in terms of raw processing ability.
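To make the idea of offloading heavy math to a GPU concrete, here is a minimal sketch, assuming the optional CuPy library is installed alongside NumPy and that an NVIDIA GPU is available. The library choice and array sizes are illustrative assumptions, not anything prescribed by NVIDIA’s material.

```python
import numpy as np

try:
    import cupy as cp  # GPU-backed, NumPy-like arrays (assumes CUDA + CuPy are installed)
    gpu_available = True
except ImportError:
    gpu_available = False

# A large matrix multiplication: the kind of repetitive math GPUs are built for.
a = np.random.rand(2000, 2000).astype(np.float32)
b = np.random.rand(2000, 2000).astype(np.float32)

# CPU version: NumPy runs the multiplication on the Central Processing Unit.
cpu_result = a @ b

if gpu_available:
    # GPU version: copy the data to GPU memory, multiply there, copy the result back.
    a_gpu = cp.asarray(a)
    b_gpu = cp.asarray(b)
    gpu_result = cp.asnumpy(a_gpu @ b_gpu)
    # The answers agree; only where the arithmetic ran has changed.
    print("Results match:", np.allclose(cpu_result, gpu_result, rtol=1e-3))
```

The point of the sketch is simply that the same calculation can be handed to whichever processor is better suited to it, which is the core idea behind GPU-accelerated computing.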

More specifically with regard to AI projects, NVIDIA reports that when Google’s Brain project learned to recognize cats and people in YouTube videos, it used roughly 2,000 CPUs to do so; NVIDIA’s account adds that a comparable result was later achieved with only a small number of GPU-accelerated servers.

With this in mind, the advantage of GPU-accelerated computing is clear. More powerful and efficient GPUs that take the pressure off of CPUs mean AI projects will be able to do more with fewer resources.
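As a small illustration of how an AI project hands this kind of work to a GPU today, here is a minimal sketch using the PyTorch library; the toy model, layer sizes, and data are illustrative assumptions rather than anything drawn from the sources cited here.

```python
import torch
import torch.nn as nn

# Use the GPU when one is available; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny supervised-learning model: 100 input features -> 10 classes.
model = nn.Sequential(
    nn.Linear(100, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
).to(device)  # moves the model's parameters into GPU memory when a GPU exists

# A fake training batch, placed on the same device as the model.
inputs = torch.randn(32, 100, device=device)
labels = torch.randint(0, 10, (32,), device=device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# One training step; the heavy matrix math runs on whichever device was chosen.
optimizer.zero_grad()
loss = criterion(model(inputs), labels)
loss.backward()
optimizer.step()
print("Ran one training step on:", device)
```

The same script runs on a CPU, just more slowly, which is exactly the “do more with fewer resources” trade-off described above.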

Looking down the road…

This is, of course, all contingent upon advances in GPUs keeping pace with every other relevant advance in computing. If anticipated technologies like Quantum Computing outpace GPU-accelerated computing, then solutions that don’t yet exist could make GPUs obsolete. To put it simply, GPU-accelerated computing will also need an answer to the rise of Quantum Computing before the latter snowballs into a global standard for certain industries.

According to certain computer scientists, the chief benefit of a Quantum Computer lies in being able to perform many calculations and operations at once and then combine them efficiently in order to analyze the results quickly.

Another way to conceptualize GPU-accelerated computing is through the idea of parallel computation. In practice, this means that accelerated GPUs pair well with programs designed to perform many operations at once without a drop in performance. On the flip side, GPU-accelerated computing has the limitation of not being able to speed up programs whose instructions must execute in a strict order, one after another (a short code sketch below illustrates the difference). Effectively, this could mean that where there’s no room for such workarounds, there’s no room for GPU-accelerated computing.

Even so, with reports that GPU-accelerated computing dramatically increases the efficiency of Supervised Learning within Deep Learning, it’s hard to ignore the utility of this practice in AI. Knowing how Quantum Computing compares to GPU-accelerated computing, it also seems clear that a singular focus on one or the other could leave industry progress stagnant. All in all, the argument here is that the Artificial Intelligence industry as a whole needs to analyze every significant technological advance that could affect it and act accordingly.

This could play out in a few simple ways. First, the industry could form a central governance body, or even a DAO if it aims to value Decentralization, in the future. That body would have the responsibility of preparing the industry for possible future issues. In doing so, it could run through scenario analyses, which are models that help a business prepare for almost any future risk.
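To illustrate the parallel-versus-serial distinction mentioned above, here is a minimal sketch in Python using NumPy; the specific operations and array sizes are illustrative assumptions, not drawn from the sources.

```python
import numpy as np

data = np.random.rand(1_000_000)

# Parallel-friendly work: every element is squared independently of the others,
# so the operations can, in principle, all happen at the same time. This is the
# shape of work that GPU-accelerated computing speeds up.
squared = data ** 2

# Strictly ordered work: each step depends on the result of the previous step,
# so the operations cannot simply be spread across thousands of processors.
# This is the kind of program GPU acceleration cannot do much for.
running_total = 0.0
for value in data:
    running_total = running_total * 0.5 + value  # depends on the previous total

print(squared[:3], running_total)
```

The first operation maps naturally onto many GPU cores working at once; the second carries a dependency from one loop iteration to the next, which is exactly the “strict order” limitation described above.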

As for the scenario analysis process itself, it begins with a meeting to define the biggest problems you see for your business or industry in the future, followed by several further steps aimed at understanding those problems and acting on that understanding. It’s important to note that during this process, a business or an industry governing body needs to decide, as reasonably as possible, which problems are most urgent and which are not. According to popular conceptions of scenario analysis, this is a largely subjective exercise: whatever the business decides are the highest-priority problems are the highest-priority problems. Following this prioritization, a process begins that resembles, in one key way, the practice in Design and Marketing of developing user stories. The organization is advised to develop scenario stories, but not typically around every scenario it has listed; the usual advice is to stick to the highest- and middle-priority options, ideally two at most. If an AI firm or governing body were to do this, it could begin to develop at least partial solutions to the highest-priority future issues related to the development of the technology and thereby avoid over-dependence on any one form of computing.

Admittedly, all of this is highly theoretical, but the gap between GPU-accelerated computing and Quantum Computing brought to mind the possible utility of such a framework, so it seemed important to spell it out. Much of what the AI industry is built on is still theory rather than proven use cases, even with stories like the system that beat a Go master and Alexa acting as your personal shopper. With that in mind, it’s not off-base to apply strategic theory to a highly technical industry’s possible future roadblocks.

Approaches akin to scenario analysis have been applied successfully across the global business market. For a more specific example, look at how bigger firms like Unilever, GE, and IBM have survived and thrived in a world that appears to be leaning towards startups and decentralized organizations. For an AI-specific example, consider that most AI firms are startups, and even when they’re not, they’re structured as separate businesses; the Fortune article on the most successful AI firms, listed below, helps to illustrate this. What this amounts to is that AI is arguably the fastest-changing industry in the world, and the businesses already operating in it know it. Scenario analysis, along with other strategic frameworks, can only serve the industry’s interest in thriving in the face of that change.

References:

Artificial Intelligence and GPUs: https://www.netformation.com/industry-perspectives/in-the-era-of-artificial-intelligence-gpus-are-the-new-cpus/

Floating Point Arithmetic: https://en.wikipedia.org/wiki/Floating-point_arithmetic

GPU Computing versus Quantum Computing: http://www.tomshardware.co.uk/answers/id-2423603/thing-quantum-computing-gpu-computing.html

Most Successful AI Firms: http://fortune.com/2018/01/08/artificial-intelligence-ai-companies-invest-startups/

New York Times Report on GPUs: https://www.nytimes.com/2018/05/08/technology/gpu-chip-shortage.html

NVIDIA: CPUs versus GPUs: https://blogs.nvidia.com/blog/2009/12/16/whats-the-difference-between-a-cpu-and-a-gpu/

NVIDIA: Accelerating AI with GPUs: https://blogs.nvidia.com/blog/2016/01/12/accelerating-ai-artificial-intelligence-gpus/

Quantum Computing vs. GPUs: https://physics.stackexchange.com/questions/161338/would-quantum-computers-be-better-at-everything-a-gpu-normally-does

Quora Discussion: Why GPUs over CPUs: https://www.quora.com/Why-are-GPUs-preferred-for-AI-computing-instead-of-CPUs

Rendering Images: https://en.wikipedia.org/wiki/Rendering_(computer_graphics)

Scenario Analysis: https://www.mindtools.com/pages/article/newSTR_98.htm

What is a Central Processing Unit?: https://en.wikipedia.org/wiki/Central_processing_unit

What is a Graphics Processing Unit?: https://en.wikipedia.org/wiki/Graphics_processing_unit

What is GPU Accelerated Computing?: https://www.techopedia.com/definition/32876/gpu-accelerated-computing

What is Parallel Computing?: https://en.wikipedia.org/wiki/Parallel_computing

What is a User Story?: https://www.mountaingoatsoftware.com/agile/user-stories
