Graphics Cards: What Do the Numbers Mean?

Bigger is always better, right?  When it comes to your graphics card, this isn’t always the case.  Both AMD and NVIDIA use easily marketable combinations of letters and numbers to identify their GPUs, but this does not mean that the numbers are easy to understand.  This article will explain the subtle nuances in the names of performance graphics cards.

Generally, larger numbers denote better performance.  Within a generation of graphics cards this is easy to understand.  It is obvious that the GTX 780 will perform better than the GTX 760, and that the R9 280X is faster than the R9 270.  But how does the GTX 680 compare to the GTX 760?  And the HD 6950 to the HD 7790?  At first glance, the relative performance of such pairs might not be easily discernible.

NVIDIA

NVIDIA’s card names look like this:

GT 610, GT 620, GT 630, GT 640, GTX 650, GTX 650 Ti, GTX 650 Ti Boost, GTX 660, GTX 660 Ti. All of these are part of the 600-series, since they start with 6. The 500-series cards would look quite similar, but start with 5: GT 520, GTX 560, GTX 560 Ti, and so on.

The first thing to notice is that some are “GT” cards, and some are “GTX”. The “GT” label is for lower-end cards, which are not meant to handle powerful games, but are more suited for office work and light graphics. GTX cards are more powerful, and aimed generally at gaming-oriented machines.

The first number (i.e. the 7 in GTX 760) denotes the generation of cards it belongs to, and can be used as a very general performance indicator.  The 700-series cards are one generation newer than 600-series cards, two generations newer than 500-series cards, and so on. (Update: with the release of the 900-series cards, NVIDIA decided to skip naming any generation as the 800-series. So, you’re not crazy, there are no 800-series desktop cards from NVIDIA.)

The second number in the name (i.e. the 6 in GTX 760) indicates the performance level of the card.  In this case, the “60” means it is classified  as a mid-range graphics card.  There are several levels of performance within each family of cards:  700-735 cards are considered low-end or mainstream cards, and will not perform well in demanding 3D applications.  Cards in the 740-765 range are performance cards and are well-suited for comfortable frame rates at 1080p resolution.  Enthusiast-class cards are the 770 cards and beyond, meant for high FPS and multiple monitors.

The “Ti” cards are a bit more powerful than the non-Ti cards with the same number. So a GTX 750 Ti is more powerful than a GTX 750. NVIDIA sometimes has a “Boost” edition as well: There is a GTX 650, and GTX 650 Ti, and a GTX 650 Ti Boost. The “Boost” version is more powerful than the non-“Boost” card with the same number.
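
To make the pattern concrete, here is a minimal Python sketch that splits a GeForce name like “GTX 760” or “GTX 650 Ti Boost” into the pieces described above. The function name and field labels are my own illustration, not anything NVIDIA publishes:

    import re

    def parse_nvidia_name(name):
        # Split a GeForce name like "GTX 760" or "GTX 650 Ti Boost" into its parts.
        # The field labels below are illustrative, not official NVIDIA terminology.
        match = re.match(r"(GTX?)\s+(\d)(\d\d)\s*(.*)", name.strip())
        if not match:
            raise ValueError("Unrecognized card name: " + name)
        prefix, generation, tier, suffix = match.groups()
        return {
            "prefix": prefix,               # GT = low-end, GTX = gaming-oriented
            "generation": int(generation),  # 6 = 600-series, 7 = 700-series, ...
            "tier": int(tier),              # higher = faster within a generation
            "suffix": suffix or None,       # "Ti" or "Ti Boost" = faster variant
        }

    print(parse_nvidia_name("GTX 760"))
    # {'prefix': 'GTX', 'generation': 7, 'tier': 60, 'suffix': None}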

So, is the GTX 680 faster than the GTX 760?  The answer is yes, but only slightly.  The 680 is a previous-generation card and has more raw performance than the 760 (faster clock rates, more shaders, texture mapping units, and SMX units); however, this performance comes at the cost of higher thermal design power and power consumption compared to the newer GTX 760.  If you wanted to overclock the 760, you could end up with a card roughly on par with the GTX 680 for significantly less money.

AMD

Until their latest generation of cards, AMD used a naming scheme similar to the one NVIDIA uses.  They were named HD XXXX, where the first X represented the generation, and the next three described the relative performance of the card within that generation. So the 5000-series were named HD 5770, HD 5850, HD 5870. The 6000-series had HD 6850, HD 6870, HD 6950, HD 6970. For those old cards, the comparison was easy: as long as the cards were from the same series (5000, 6000 or 7000), the higher-numbered card was more powerful.

AMD’s new cards, on the other hand, are a whole different world.  With their latest generation of cards, AMD adopted a new naming scheme.  They follow the RN NNN(X) convention, where the first N represents the overall performance level, and the next three Ns indicate varying degrees of power within that range of cards.  An X at the end of the number indicates higher clock speeds, or a more powerful version of the card (R9 280 vs R9 280X).

At the time of writing, AMD’s new cards are R5 2XX, R7 2XX, or R9 2XX. The R5/R7/R9 are meant to help distinguish the target market, while the “2” means that all of these cards are from AMD’s 200-series. AMD’s next series will likely be called the 300-series, and will probably have card names that look like R5 3XX, R7 3XX, and R9 3XX.

R5-series cards are meant to be entry-level cards, not intended for gaming.  They range from the R5 210 to the R5 235X.  The mid-level R7 cards span quite a distance, from the R7 240 (which is a relatively weak card) to the R7 265, which is a decent budget graphics card.  The R9 cards can be subdivided into two categories.  The R9 270 through R9 280X cards are high-end units, capable of good FPS at 1080p resolution on medium to high settings.  The R9 290 cards and beyond are meant for enthusiasts, with large amounts of memory and plenty of raw horsepower.
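
For comparison, here is a similar minimal Python sketch for the R-series scheme described above; again, the function name and field labels are purely illustrative:

    import re

    def parse_amd_name(name):
        # Split an R-series name like "R9 280X" into its parts.
        # The field labels below are illustrative, not official AMD terminology.
        match = re.match(r"R(\d)\s+(\d)(\d\d)(X?)", name.strip())
        if not match:
            raise ValueError("Unrecognized card name: " + name)
        segment, generation, level, x_variant = match.groups()
        return {
            "segment": "R" + segment,       # R5 = entry-level, R7 = mid, R9 = high-end
            "generation": int(generation),  # 2 = 200-series
            "level": int(level),            # higher = faster within the series
            "x_variant": x_variant == "X",  # X = higher-clocked version of the card
        }

    print(parse_amd_name("R9 280X"))
    # {'segment': 'R9', 'generation': 2, 'level': 80, 'x_variant': True}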

The newest AMD cards aren’t easily comparable to previous generations, which followed the HD XXXX naming scheme similar to the one NVIDIA currently uses.  Comparing cards across manufacturers also requires a closer look at the raw specs to get a good idea of performance differences. Look for reviews from reputable reviewers to see how the card you intend to purchase compares to the competition.

Logical Increments is looking for bloggers who enjoy writing about PC hardware and gaming. More information here.

  • Johnny

    Thank you so much for clarifying this. I was so lost and no one seemed to explain it well

    • Matthew Zehner

      We’re glad you found it helpful!

  • Marvin Allen

    GOOD GOD! THANK YOU!!! I’m diving into my very first build and am discovering that I am almost completely computer illiterate! Shopping around and researching GPUs has been a massive headache due to my complete ignorance. This is monumental!

    • Matthew Zehner

      We’re so glad you like it!

  • Andrew Ding

    Great guide! There’s one bit that confused me though. In the 3rd-4th paragraph under nVidia, it says that the 700 cards are a generation older than the 600 cards. Shouldn’t it be one generation newer or one generation younger instead?

    Also, out of curiosity, how long is a generation for graphics cards? I’m guessing one year, but it’d be great if someone could verify that. Thanks again for the awesome guide!

    • Matthew Zehner

      Thanks for pointing that out! I believe you are correct.

      It is typically about every year to year and a half that a new line of GPUs is released. For example, the Nvidia 700 line launched around May 2013 and the 900 line around September of last year.

  • dwayne

    What about all the other numbers? (GB, RAM, and so on)

    • Matthew Zehner

      GB typically refers to the size of the memory. For example, 8GB of RAM has double the capacity of 4GB of RAM. Other numbers denote different series of RAM/CPUs and the architecture therein.

      Maybe we will address this in an article or something.

  • Matt

    Thanks a BILLION for this article – I should’ve looked up this info years ago, because I’m an idiot and I’ve always bought my GPUs based on the generational number (always wound up with an X50, which sucks for high-end gaming). Now I know to stick to X60+ when I can afford it!

    • Matthew Zehner

      We’re so glad you found the article to be helpful!

  • irritated

    what about geforce and other non-gtx cards please help

    • Matthew Zehner

      The naming of Nvidia-produced GeForce cards is covered. These are the ones that start with “GT” or “GTX”. Non-Nvidia cards (AMD cards) are covered right below that in the article.

      I believe all the information you are looking for should be inside the article!

      • brandonsimon ly

        I really hope you get around to making an article about that, and I really hope it will be as helpful as this one. Thanks

        • Matthew Zehner

          Thanks, and no problem!

          • brandonsimon ly

            Whoops, when I wrote that I hoped you would get around to doing the article, I meant to post that in the comment below this one (dwayne’s comment). xD

          • Matthew Zehner

            It’s all good! Thanks for letting me know.

  • Theihackz

    Thanks a ton, guys : ) I never knew how to tell good cards and bad cards apart : )

    • Matthew Zehner

      We’re so glad that you found it helpful!

  • Nunutrxh

    Thank you so much!
    Now I can finally tell the difference between a good card and a bad card!

    • Matthew Zehner

      Yay! We’re glad that we could help.

  • Philip399

    Do you have something similar for Motherboards and CPUs?

    • Matthew Zehner

      We do not have an article specifically written about numbers and what they mean, with regards to those components. We will consider writing such a thing, though!

  • chase
    • Matthew Zehner

      The details are a bit sparse on the second two, but the first one (by iBuyPower) has the best processor and a solid GPU (though I don’t know the specific models of the others), so it is a good choice.

  • Maverik00

    I currently have an Nvidia GTX 960M. Is that considered a good graphics card? Because in games I’m rarely getting my FPS above 60.

    • Matthew Zehner

      It’s a solid mid to high tier GPU, for sure. In which games are you not getting 60 FPS and at what resolution?

      • Maverik00

        The game where I see this the most is Skyrim. My frame rate in that game usually maxes out around 45-50, and its minimum is usually 10-15 at some points (either immediately after a loading screen or sometimes when it’s rendering background details). I’m pretty certain the resolution is the one native to my monitor, but I’ll have to check as I don’t have access to that computer right now.

        • Matthew Zehner

          What is the resolution? What settings are you running the game on, and are you using mods?

          • Maverik00

            The resolution is 1920×1080, and I haven’t installed mods yet, but I’m planning to later. (The settings are on what the game recommended, which is either high or medium.)

          • Matthew Zehner

            If you want to improve performance, I would recommend turning down AA, shadows, and the like, since these tend to be quite demanding.

          • Ryan Thomas

            With Skyrim, I would not blame your GPU for the FPS drops, as there are lots of little buggy things in that game that slow things down. Luckily, there are fixes for some of that in the modding community. There’s the Unofficial Skyrim Patch, which is one of the most popular Skyrim mods of all time, and one that helped my game performance quite a bit was the Possessive Corpses mod. My current game is many hours in, so the save games were getting clogged up with lots of junk, and it got to the point where I could barely play the game within cities. When I installed that mod, it sped things up a surprising amount, so the FPS drops aren’t nearly as bad and the game is playable again.

            Those mods are both available on the Skyrim Nexus. I’d recommend getting started with modding right away; fixing those bugs with the patch mods will allow your GPU to handle the extra load of the fun/pretty mods. I run a heavily modded Skyrim game (well over 100 mods with lots of extra lands, NPCs, high-res textures, etc.) on my laptop, which has a 6-year-old 330M card, at 1080p medium settings. Your 960 should be able to handle Skyrim no problem, even with tons of mods and AA, as long as the bugs are sorted out and you take care to manage your load order with LOOT.

  • Small Dick Rick

    Hey, I’m having some trouble deciding on a video card for the PC I’m building. I’m planning on getting an i7-4790K and a 1440p monitor eventually, but I’m not sure if I should get an R9 390 or a GTX 970 (and eventually SLI/Crossfire), or if I should just save up money and get a 980 or 980 Ti or something. Thanks for reading, hope you can get back to me soon.

    • Matthew Zehner

      So what games do you want to play, and do you want 60 FPS at maximum settings? This will help me decide which components to recommend.

      • Small Dick Rick

        I’d like to play the latest games such as GTA V, The Division, The Witcher 3, ARK: Survival Evolved, etc. at 1440p and 60 FPS at high/ultra settings. I know two R9 390s would handle those well, but then again two 970s would also… EDIT: if I got a 390 and then eventually another 390, I would probably need a new PSU and a lot better cooling…

        • Matthew Zehner

          If you are doing Nvidia SLI, I would recommend it with two 980s or Tis over the 970s in order to get the best performance. However, either option should perform quite well, and it largely comes down to which card/manufacturer you prefer.

    • Winterzbite

      Hi Rick.

      Generally, you are better off spending more money on a single card rather than planning for two cards. Games are not always compatible with SLI/Crossfire, meaning that sometimes you will be running only one card despite spending money on two or more. There is less support for multiple cards, and occasionally when there is a game-breaking driver bug, multi-card users will not get an update for weeks or more. Heat can be an issue, and can easily raise the temperature of the cards to dangerous levels if you do not set up proper cooling. I would also add that when you buy two cards, you do not get double the performance. It’s actually usually a 15-40% boost over having a single card, so your second card really isn’t delivering as much performance for your dollar. I would also suggest googling micro-stutter, because having multiple cards can create a stutter that is so irritating that you cannot play games.

      There is a good chance you will run into at least one or more of the issues I listed, and I’m sure there are more I’m not thinking of. Honestly, save your money and buy the best single card you can afford. I personally run a GTX 780 6GB at 1440p, and every game runs at ultra. The occasional game will run at about 45 FPS, but for the most part every game is on max at 60 FPS. Save your money and get a 980, or even a 980 Ti if that’s what you want, but don’t buy dual cards.

  • Nikola Dzh

    “The 700-series cards are currently the newest from nVidia, and are one generation older than 600-series cards, two generations older than 500 series cards.” Is this written correctly? I cannot understand how the newest cards can be older than the 600 series? Great article by the way 🙂

    • Matthew Zehner

      At the time of writing, the 700s were the newest, but currently the 900-series cards are the newest from Nvidia. The 700-series is the second newest, the 600s the third, and so on. Hopefully that makes sense!

      • Rywen Erendani

        And now we have the 10-series

        • Matthew Zehner

          Yep!

  • James Hoy

    I’m late to the comment party, but this article was extremely valuable. Thank you so much.

    • Matthew Zehner

      We’re glad you found it helpful!