Bitter GPU Rivalry

When we decide to head to the local Big Box Electronics Store for a new PC, we begin to learn a whole new vocabulary. Terms such as graphics card, central processing unit, random access memory, and hard disk abound. Then we hear of Nvidia and AMD… Bitter GPU Rivalry indeed.

Here is how it normally goes:

We talk to the salesperson, and they talk a good game about why brand A is better than brand B even if it is slightly more expensive. Then we stand there, hoping not to appear as clueless as we feel, because then we would be easy pickings, wouldn’t we? Finally, we leave with some idea of what we bought or agreed to payments on, and that is the end of it.


I can play games on it too?!

Later in the year we decide to buy Grand Theft Auto V for PC because… well, why not? $60 later, we find out that the PC we’re still paying for is woefully underpowered and that we need to buy a gaming GPU.

That’s where we’re left with two major choices for performance cards: AMD (formerly ATI) and Nvidia. Both claim to have the best cards, both have developed a cult-like following over the years, and both rally behind polarizing colors: red and green.

The one-upmanship between these two companies dates back to April 2000, when the smug heads of ATI declared the Radeon DDR “the most powerful graphics processor ever designed for desktop PCs.” Only twenty-four hours later, Nvidia made a similar announcement for its GeForce 2 GTS, which included a derivative of ATI’s Pixel Tapestry Architecture, though Nvidia called it the Nvidia Shading Rasterizer. Within a year, the PC graphics card market had silently bowed to an unspoken duopoly between discrete card makers Nvidia and ATI, with Intel concentrating on integrated graphics chipsets instead.

This reciprocal formula would become the norm for years to come.


ATI/AMD

  • ATI introduces Mach8™ chip and board products: first ATI products to process graphics independently of the CPU. (1991)
  • ATI introduces Mach64™: first ATI graphics boards to accelerate motion video. (1994)
  • ATI is first graphics company to ship Mac-compatible graphics boards. (1995)
  • ATI releases industry’s first 3D graphics chip, first combination graphics and TV tuner card, and first chip to display computer graphics on a television. (1996)
  • ATI is first graphics company to release products supporting Accelerated Graphics Port, the new industry standard. (1997)
  • ATI Radeon™ graphics technology debuts: leading product for high-end gaming and 3D workstations. (2000)
  • ATI launches ATI Radeon™ 9700 Pro: world’s first DirectX 9 graphics processor. (2002)
  • ATI introduces first 110nm GPUs (ATI Radeon™ X800 XL). (2004)
  • ATI GPU is featured in Microsoft Xbox 360, revolutionizing high-definition gaming. (2005)
  • CrossFire™ multi-GPU gaming platform debuts. (2006)
  • AMD unveils ATI Eyefinity multi-display technology, a revolutionary feature in the ATI Radeon™ family of graphics processors that gives PCs the ability to seamlessly connect up to six ultra high definition displays for a stunning new perspective on their PC experience. (2009)
  • AMD introduces ATI Radeon™ HD 5970, the fastest graphics card in the world to date, designed to support the most demanding PC games at ultra-high resolutions and ensure superior performance in the latest DirectX 11 games. (2009)
  • AMD launches the fastest discrete graphics card in the world, the AMD Radeon™ HD 6990. (2011)
  • AMD launches the AMD Fusion Family of APUs – which consist of both a CPU and powerful GPU on a single die – marking perhaps the greatest advancement in processing since the introduction of the x86 architecture more than 40 years ago. As of the second quarter, AMD shipped more than 12 million APUs. (2011)
  • Launched the world’s fastest and most versatile graphics card in the AMD Radeon™ HD 7970 GHz Edition, delivering world-class gameplay at the highest resolutions. (2012)
  • AMD’s technology is featured inside every major next generation gaming console and home entertainment system: Microsoft’s Xbox One, Sony’s PS4™, and Nintendo’s Wii U. (2013)
  • AMD introduced the AMD Radeon™ R9 295X2, the world’s fastest and most powerful graphics card, powered by two AMD Radeon™ R9 Series GPUs on a single card. (2014)
  • Acer, BenQ and LG Electronics began offering displays supporting AMD FreeSync™ technology, designed to enable fluid gaming and video playback at virtually any framerate. (2015)
  • AMD introduced the industry’s first graphics chip to combine high-bandwidth memory (HBM) and die-stacking technology in a single package with its new flagship AMD Radeon™ R9 Fury X GPU, which delivers 60 percent more memory bandwidth and 3x the performance-per-watt of previous generation GDDR memory. (2015)

Featured on AMD’s website under “Our History.” 


Nvidia

  • Releases the NV1 in September 1995, making use of quadratic texture mapping and memory upgradable from 2 MB to 4 MB. (1995)
  • The Riva 128 is released with polygon-based 3D mapping. (1997)
  • Riva TNT arrived as a fast 3D card with plenty of memory for 1998, and built-in 2D capabilities. (1998)
  • The NV5, otherwise known as the TNT2, made its appearance. (1999)
  • Late in 1999, Nvidia announced the GeForce 256. This was the first card to use what Nvidia called a “GPU,” but its major advance was really consumer hardware support for T&L (transform and lighting). It was also the first card to use DDR SDRAM. (1999)
  • GeForce 2 GTS was introduced using a 180 nm fab process. It was faster than the GeForce 256. (2000)
  • GeForce 2 MX was the first Nvidia card that could manage more than one display. It was also the company’s first foray into lower-end GPUs for budget-conscious gamers. (2000)
  • GeForce 3 makes its debut; the first card to be DirectX 8 compatible, it supported programmable pixel shaders. (2001)
  • The Microsoft Xbox uses the NV2A, an intermediate chip between the GeForce 3 and GeForce 4. It supported DirectX 8.1. (2001)
  • The successor to the GeForce 3, released in February 2002, was called the GeForce 4 Ti. Its architecture was similar to that of the NV20 (GeForce 3), but the NV25 was significantly faster due to its 150 nm process. (2002)
  • The FX 5800 received a weak reception due to its high noise levels and low performance. (2003)
  • The improved FX 5900 held its own against competing cards thanks to a newer 256-bit memory bus and better vertex processing power. (2003)
  • The GeForce 6800 was developed to be extremely efficient and more powerful than the FX 5900. It also supported Nvidia’s new SLI multi-GPU standard. (2004)
  • GeForce 8800 series cards were released with support for DirectX 10, with later models built on a 65 nm process. (2006)

Featured on Tom’s Hardware, “13 Years of Nvidia Graphics Cards,” Dandumont (2008).

  • Sony selects Nvidia to develop the graphics processor for the PlayStation 3. (2005)
  • The CUDA architecture is unveiled, allowing GPUs to be used for general-purpose computing by taking advantage of their parallel processing capabilities. (2006)
  • Forbes names Nvidia Company of the Year. The company achieves $1 billion in revenue for its first quarter. The Tesla GPU is released, bringing processing power previously available only in supercomputers. (2007)
  • The Tegra mobile processor is launched. The GeForce 9400M GPU is adopted by Apple for its MacBook products. (2008)
  • Tegra processors begin working closely with Android. 3D Vision, the world’s first high-definition 3D stereo solution for the home, is launched at CES. (2009)
  • The Academy Award nominees for Best Visual Effects in both 2010 and 2011 relied on Nvidia GPUs. (2010/2011)
  • The Kepler-based GeForce GTX 600 series is introduced, offering the world’s fastest gaming performance. (2012)
  • GeForce GTX Titan unleashed. (2013)
  • Maxwell architecture introduced powering GeForce GTX advancements. (2014)
  • GeForce GTX Titan X unleashed. (2015)

Featured on Nvidia’s website under “Nvidia History.”



Comparisons

While this all seems a bit technical, the timelines show that each company started with its own innovations, which the other would in turn rework and brand under a different name; a good example is the dual-GPU solutions, CrossFire from AMD and SLI from Nvidia. This pattern continues to the present day, and the enthusiast GPU market is now shared by the two behemoths.

As expected, plenty happened alongside this competition: AMD’s acquisition of ATI, and the influx of former 3dfx employees into Nvidia. In both cases, the newcomers carried years of experience into their new companies, helping drive the development of faster, more efficient graphics cards.

Through the subsequent years, AMD and Nvidia diversified their product portfolios with applications in general computing, mobile computing, and laptop graphics solutions. Although both are best known for their graphics capabilities, Nvidia does major work in the mobile field with its Tegra processors, while AMD caters to performance enthusiasts with its FX line of CPUs.


But what does this all mean really?

We can deduce several things:

  1. Both companies have other products beyond GPUs.
  2. Both companies are known for their competition in the GPU arena.
  3. Both companies make up the lion’s share of GPU cards currently on the consumer market for PC gamers.
  4. While each company claims specific technologies as its own and insists it was first to implement them, both seem to use each other’s products as examples and refine them to suit their own purposes.
  5. Numbers, figures, and odd naming systems do little to help us understand what any of it means in real-life applications.

Rank my Computer

Coming back down to earth, we will subject our current desktop setup to the “Rank my Computer” website. 

Images: Rank my Computer results (System Requirements Lab)

According to the website, our setup, an AMD FX-8350 @ 4.7 GHz and an AMD Radeon R9 290X, ranks in the 97th and 99th percentiles. Perhaps we went a little overboard with our choices, but it’s a rig that’ll run Assassin’s Creed IV: Black Flag at 4K resolution flawlessly.

These images can only illustrate the numbers, and the numbers show that Nvidia and ATI/AMD both concentrate on the enthusiast market, with Intel sticking to integrated solutions. What does this mean for gameplay? For that we have to look at another site that ranks GPUs based on benchmarks.


Benchmarks (PassMark, 3DMark)

Image: PassMark GPU benchmark rankings. Copyright © 2016 PassMark® Software

In this graph we see that the top performers are mostly Nvidia cards, with the Radeon R9 Fury X coming in at #9.

However, according to 3DMark user submissions, the AMD Radeon R9 295X2 comes in first.

Images: 3DMark user benchmark results. © 2016 Futuremark Corporation

While both of these graphs illustrate the back-and-forth dominance of Nvidia and ATI/AMD, they don’t answer the question of real-life performance. For that we need real game framerates.


Framerates and Resolutions

The reason benchmark numbers don’t tell the whole story is that, while they look good on paper, real-life gameplay involves more than just the GPU. The CPU works to deliver raw information to the GPU for processing and display. If the CPU is relatively slow, say a dual-core Pentium 4 @ 3.2 GHz, the system isn’t going to perform nearly as well as one built around a 6-core Intel i7 @ 3.0 GHz. The former has the higher clock speed, but it cannot push as much information to the GPU as the latter, so the i7 is going to do a whole lot better with a high-end GPU than the Pentium 4.
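
To make the bottleneck idea concrete, here is a minimal sketch in Python. The frame rates are made-up, purely illustrative figures, not benchmarks of any real hardware; the point is simply that the system is limited by whichever chip finishes its share of each frame last.

```python
# Toy bottleneck model: the framerate you actually see is capped by whichever
# component (CPU or GPU) takes longer to prepare each frame.
# All numbers are illustrative assumptions, not measured results.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames per second the whole system can deliver."""
    return min(cpu_fps, gpu_fps)

# Hypothetical pairings of the same high-end GPU with two different CPUs.
pairings = {
    "dual-core Pentium 4 + high-end GPU": (45, 120),   # CPU-bound
    "6-core Core i7 + high-end GPU": (200, 120),       # GPU-bound
}

for name, (cpu_fps, gpu_fps) in pairings.items():
    print(f"{name}: ~{effective_fps(cpu_fps, gpu_fps)} fps")
```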

Another aspect of gameplay is the display settings. Anti-aliasing is a major performance hit: it smooths out the jagged edges of polygons in the game environment by taking additional samples per pixel and blending them so the image appears smoother. This requires a substantial amount of processing power.
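
To get a feel for that cost, here is a rough sketch of how multisample anti-aliasing (MSAA) scales the per-frame workload. It assumes a 1920x1080 framebuffer and ignores real-world optimizations such as sample compression; the figures only show the trend.

```python
# Rough illustration of MSAA sample counts per frame.
# This is a simplification: real GPUs compress samples and share shading work,
# so the actual cost grows more slowly than the raw sample count suggests.

WIDTH, HEIGHT = 1920, 1080  # assumed 1080p framebuffer

for msaa in (1, 2, 4, 8):
    samples = WIDTH * HEIGHT * msaa
    print(f"{msaa}x MSAA: {samples:,} coverage samples per frame")
```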

Another performance hit is the resolution. A game displayed at 720p is going to run faster than the same game displayed at 1440p, simply because there are far fewer pixels to render at 720p. The higher the resolution, the more taxing it is on both your CPU and GPU.
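
The raw pixel counts make the point. Here is a quick back-of-the-envelope comparison in Python, using the standard dimensions of common gaming resolutions:

```python
# Pixel counts for common gaming resolutions, relative to 720p.
# 1440p pushes four times as many pixels per frame as 720p, and 4K nine times.

resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 720p)")
```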

Since the results for each individual GPU rest on many factors within each game (e.g., AA, MSAA, tessellation, ambient occlusion, and texture quality), we can only advise that you look up your card’s results for the specific game you want to play.

Two examples from GamersNexus.net appear here:

Images: GamersNexus game benchmark charts. Copyright © 2016 GamersNexus

What we can tell from the graphs is that resolution and, to a large extent, quality settings can make or break your gaming experience. Each GPU also performs differently from game to game, as some titles are optimized for Nvidia while others are optimized for ATI/AMD.

Coming back to our original predicament, we have a few options. We can return the game and get a console, or we can go out and upgrade the components in our PC. It’s really a matter of personal choice.


Desktop or Other?

The desktop PC remains our all-in-one solution for the variety of needs our current technological environment demands. While we can buy a nice tablet for touchscreen applications, we still need our PCs for word processing, imaging, video editing, and serious gaming. It may be cheaper to invest in a high-end tablet for on-the-go connectivity and somewhat similar productivity, a console for gaming, and a DSLR camera, but it absolutely behooves you to look into future-proofing your desktop PC with a higher-end processor and graphics card to take care of all of the above (a laptop can cover the mobile connectivity, and it has a keyboard!).

What now?

From what we have discussed, we can draw a few conclusions. Both Nvidia and ATI/AMD have been around for a long time and are quite willing to keep trading shots over GPU supremacy. Both claim to have the best or fastest card, only for the other to surpass it the following year. Both hold a large market share, and neither seems to be going anywhere soon.


Elias Stevens is a freelance journalist, personal chef, and tech enthusiast.
