Rumours of an Intel entry into the graphics card market have been circulating for years, and many people had eagerly anticipated the arrival of a third player to the discrete GPU space.
Well, it happened. A few days ago, Intel released the Arc A750 and Arc A770 cards. Let us take a look at the good and the bad of this launch.
The two new GPUs from Intel are:
- Arc A750 for $290
- Arc A770 for $350
With these prices, it is clear that Intel is aiming to take a slice of the upper midrange market, so we are hoping for stronger competition (and lower prices) in that segment down the road.
The cards perform well in 1080p/1440p in a selection of games, matching or beating the competition. These are not integrated or mobile GPUs (Intel’s forte!) slapped into a discrete form-factor, but really solid desktop cards.
All cards lose some performance at higher resolutions, but a closer look at the benchmarks reveals that Intel's relative drop when moving up to 4K is smaller than AMD's or nVidia's. Of course, it has to be noted that upper midrange cards are not meant for 4K gaming. So, perhaps Intel will release bigger cards in the future with even better 4K performance? We can only hope.
The two new cards have above-average power consumption overall, and very high idle power consumption compared to the competition. A somewhat higher power draw is not too big of an issue though, especially since the temperatures seem fine.
At launch, there are no custom cards from board partners like Gigabyte or ASUS. Cards with custom coolers (heatsink/fan assemblies) typically run cooler and quieter, and have much wider availability.
And speaking of availability: Intel did actually launch a graphics card earlier this year, the A380, which was initially limited to China. Back then, Intel promised to “expand globally during the summer”. We are well past “during the summer” at this point, and A380 availability in the USA at the time of writing is a single card on Newegg. The new Arc A770 and Arc A750 will hopefully have much better availability than that… but we will definitely not count on it.
There are two major issues with this launch:
1. Big performance loss in older games
The cards perform well in 1080p/1440p in a selection of games. That selection is DX12/Vulkan titles, and a cursory glance shows that modern games built on DX12/Vulkan make up roughly 5% of all games. In older titles, these new cards suffer a serious performance hit, underperforming the competition. A non-representative (but hilarious) example is Counter-Strike: Global Offensive, where the $350 A770 delivers less than half the performance of the $250 RX 6600.
Let us stress that the above example is extreme. For typical games, the performance loss is roughly 25% vs the competition.
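To put the value gap in concrete terms, here is a quick price-per-performance sketch. The frame rates below are hypothetical placeholders chosen only to reflect the "less than half the performance" claim; they are not measured benchmark numbers.

```python
# Illustrative price/performance comparison for the CS:GO example above.
# The fps values are HYPOTHETICAL placeholders, not measured benchmarks;
# they are picked only to match "less than half the performance".
cards = {
    "Arc A770": {"price": 350, "fps": 120},  # hypothetical
    "RX 6600": {"price": 250, "fps": 260},   # hypothetical
}

for name, card in cards.items():
    value = card["fps"] / card["price"]
    print(f"{name}: {value:.2f} fps per dollar")
```

Under these placeholder numbers, the cheaper card delivers roughly three times the frames per dollar, which is the shape of the problem regardless of the exact figures.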
2. Big performance loss on older systems without resizable BAR
Discussing the details and benefits of resizable BAR is beyond the scope of this post, but nVidia explains it quite well. What we need to note is that resizable BAR is only available on recently built systems: Intel 10th Generation (or newer), or AMD Zen 3 (or newer). Since both of those launched in 2020, if your PC is older and does not support resizable BAR, the A750/A770 will suffer a serious performance loss. How much? Around 25%!
Taking the above two points together: The Intel A770/A750 cards are only really worth it if you have a very modern (less than 2-year-old) system, and you exclusively play the most modern 5% of games. Thus, the segment of gamers that would get their full money’s worth from these cards is very small!
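If we take the two ~25% figures above at face value and assume they compound multiplicatively (an assumption for illustration, not a measured result), a quick back-of-the-envelope calculation shows the worst case for a buyer with an older system playing older games:

```python
# Rough sketch of how the two penalties from this post could stack.
# ASSUMPTION: the two ~25% losses compound multiplicatively; this is an
# illustration, not a measured benchmark result.
older_game_loss = 0.25  # ~25% loss in pre-DX12/Vulkan titles
no_rebar_loss = 0.25    # ~25% loss without resizable BAR

remaining = (1 - older_game_loss) * (1 - no_rebar_loss)
combined_loss = 1 - remaining
print(f"Remaining performance: {remaining:.0%}")  # 56%
print(f"Combined loss: {combined_loss:.0%}")      # 44%
```

In other words, the unlucky buyer could be looking at not just a quarter but closer to half the card's potential performance gone.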
Intel’s first global discrete graphics card launch is a mixed bag. On one hand, the cards can perform decently under some circumstances. On the other hand, in the remaining 95% of circumstances these cards lose at least a quarter of their performance, making them an illogical purchase.
With enough time, I believe that these cards will get updates that make them perform better in older games. With enough time, resizable BAR will go from “a feature on the most recent systems” to “a default feature”. With enough time, Intel may be able to improve availability and operating temperatures by partnering with established graphics card manufacturers. Once enough time has passed and such improvements have been made, we can add Intel’s cards to our main chart.
For now though, we will not be adding either of these cards to our recommendations.
On a personal note: Strong competition is healthy for tech development, and it is in the customer’s favour to have multiple good options. As a result, I earnestly hope that Intel fixes the issues rather than forfeiting, and comes out stronger in the future.