GPUs are one of the most important components of a PC, and you absolutely need one if you want to run graphics-intensive tasks on your computer. The GPU is also among the most exciting components to shop for; if there were a tier list of the most exciting PC components, GPUs would easily be S-tier. Most people understand the GPU as the part of a computer that runs games really well, but there's much more to it than that.
What is a GPU?
One of the core components of a PC
First, let's clarify what exactly a Graphics Processing Unit (GPU) is. It's essentially a processor that uses many individually weak cores called shaders to render 3D graphics. GPUs can either be integrated into a CPU (as seen in Intel's mainstream CPUs and AMD's APUs), soldered onto a motherboard in devices like laptops, or come as part of a complete device you can plug into your PC, known as a graphics card. Although we often use the words GPU and graphics card interchangeably, the GPU is just the processor inside the card.
The key distinction between CPUs and GPUs is that the latter are usually specialized for 3D tasks (playing video games or rendering, for example) and excel at processing workloads in parallel. With parallel processing, all of the GPU's cores tackle the same task simultaneously, something CPUs struggle with. CPUs are optimized for serial or sequential processing, which usually means only one or a few cores can work on a single task at a time. That specialization, however, means GPUs aren't suitable for many of the things CPUs are used for.
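To make that distinction concrete, here's a minimal sketch in Python using PyTorch (our choice of library, not something the article prescribes). It applies the same operation to millions of values at once, exactly the kind of uniform, data-parallel work a GPU's shaders are built for, and it falls back to the CPU if no CUDA-capable GPU is present.

```python
import torch

# Pick the GPU if one is available; otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Ten million values: one large, uniform, data-parallel workload.
x = torch.rand(10_000_000, device=device)

# The same operation is applied to every element across the GPU's many cores
# at once; a CPU would work through the array with only a handful of cores.
y = x * x + 1.0

print(f"Computed {y.numel():,} results on {device}")
```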
The GPU is most famous for being the backbone of the graphics card, a device that's practically a computer unto itself, as it comes with a processor, its own circuit board, dedicated memory called VRAM, and a cooler. Not only that, but graphics cards can easily plug into PCs via a single PCIe slot, or even externally through a Thunderbolt port using the USB Type-C connector, which lets graphics cards connect to laptops, too. There's really no other component like the graphics card in terms of performance potential and ease of upgrading.
What companies make GPUs?
And where it all began
The origin of the GPU is tied up in the history of video games, which has been the traditional use case for GPUs. It's hard to say when the first GPU was created, but the term was apparently first coined by Sony for its original PlayStation console in 1994, which used a Toshiba chip. Five years later, Nvidia claimed its GeForce 256 was the first-ever GPU on the grounds that it could perform the transform and lighting calculations that were previously done on the CPU. Arguably, though, Nvidia didn't create the first GPU, and the GeForce 256 wasn't even the first graphics card for desktops.
Because the earliest graphics processors were simple, there were tons of companies making their own GPUs for consoles, desktops, and other computers. But as GPUs grew more complex and harder to make, companies began dropping out of the market. You no longer saw general electronics companies like Toshiba and Sony making their own GPUs, and many companies that did nothing but make GPUs, like 3dfx Interactive, went bust, too. By the early 2000s, the wheat had been separated from the chaff, and only two major players were left in graphics: Nvidia and ATI, the latter of which was acquired by AMD in 2006.
For over two decades, Nvidia and ATI/AMD were the only companies capable of making high-performance graphics for consoles and PCs. The integrated graphics scene, by contrast, has been more diverse, both because there are strong incentives to make integrated GPUs (like not having to pay Nvidia or AMD for theirs) and because integrated graphics are far simpler to design. However, Intel entered the high-performance GPU scene in 2022, transforming the duopoly into a triopoly.
What are GPUs used for?
More than just a component for gaming
Gaming was always the principal use for graphics cards, but that has become less and less true in recent years as the demand for broader computing power continues to increase. As it turns out, the cores in GPUs are useful for more than just graphics, giving rise to data center and AI-optimized chips called general-purpose GPUs (GPGPUs). Demand has gotten so high that Nvidia's data center GPUs now make more money for the company than its gaming ones, and AMD and Intel are also looking to cash in on the demand for data center and AI GPUs. Eventually, other companies may enter the race, too, as this new market materializes.
In the case of AI workloads, GPUs have become incredibly important for tasks like image generation and the training of large language models (LLMs). That's because their ability to process in parallel is a perfect fit for the large matrix operations that dominate AI computation.
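As a rough illustration (again assuming PyTorch and an optional CUDA-capable card, neither of which the article requires), the snippet below times the same large matrix multiplication, the core operation inside neural network layers, on the CPU and then on the GPU; on most systems with a discrete GPU, the GPU finishes far sooner.

```python
import time
import torch

def time_matmul(device: str, size: int = 4096) -> float:
    """Multiply two size x size random matrices on the given device, return seconds taken."""
    a = torch.rand(size, size, device=device)
    b = torch.rand(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup work has finished before timing
    start = time.perf_counter()
    c = a @ b                     # the matrix multiplication itself
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the GPU to actually finish
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f} s")
```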
Nvidia vs. AMD vs. Intel GPUs
Which is best for gaming?
"Which brand makes the best GPUs?" — this question gets thrown around a lot, especially now when there is more than one big player on the market. That's right, it was only Nvidia and AMD who were making GPUs until recently, but now we also have Intel in the mix, which entered the fray not too long ago with its Arc GPUs. While no brand is completely better than the other, each has its strengths and weaknesses.
Nvidia has always positioned itself as the option for those who want the best gaming GPU with the most performance and the most enticing features. While Nvidia's graphics cards are usually good, they are also often expensive, at least relative to those from AMD and now Intel. Additionally, while Nvidia's exclusive features are cool on paper, the actual experience might not live up to the hype. For example, DLSS is a really nice feature for Nvidia cards, but it isn't in many games and has quite a few drawbacks.
AMD, by contrast, hasn't always competed with Nvidia head-on. It has historically offered better-value GPUs, often letting Nvidia's top-end cards go unchallenged, while Nvidia, in turn, often leaves the budget GPU market to AMD, and that's especially true today. Additionally, AMD usually lags behind Nvidia when it comes to features, but it has a decent track record of catching up and sometimes even exceeding its competition.
Intel has only been in the gaming GPU market for about a year at the time of writing, so it hasn't really established itself yet, but so far it has competed on value much as AMD usually does. The biggest struggle for Intel has been its drivers, the software that plays a crucial role in how well a GPU performs in games. Since its first-generation Arc Alchemist GPUs launched in 2022, Intel has been hard at work optimizing its drivers, and today Intel's Arc A750 is a popular choice for low-end and midrange PCs.
Closing thoughts
It's pretty clear where each company excels. Nvidia is the top choice for those who have a big budget and want both great performance and cutting-edge features. AMD is the brand for most other people; although its top-end cards usually don't quite match Nvidia's in performance and features, it definitely has the budget space locked up. Intel, at the moment, offers a good midrange GPU, but that's about it. We've yet to hear what Intel has in store for 2024, but we hope it's something better than its previous showcase.
In general, the GPU market is set for a shake-up over the next few years, with Nvidia moving even further into the high-end segment and Intel entering the fray. We'll have to see how things continue to shape up.