Can Intel Arc shake up the depressingly bleak state of graphics cards?
The graphics card duopoly ended this week. For the first time in decades, we’re staring at a true three-way battle for gaming supremacy. Yes, Intel makes discrete graphics cards now. It couldn’t have come at a better time, as desktop gaming feels increasingly expensive and painful, and the unorthodox launch of Intel’s Arc graphics drives home that Chipzilla is plotting its ascent in very strategic ways.
Hopefully this fight rejuvenates the consumer graphics card space with a much-needed breath of fresh air. Things were looking bleak even before scalpers and chip shortages descended on us all.
In search of a hero
For years, enthusiasts flocked to Nvidia’s GeForce graphics cards, and for good reason: They knocked the socks off AMD’s rival Radeon offerings, which struggled to match the power efficiency, raw performance, and software prowess of the Green Team. There’s a reason GeForce cards dominate the Steam hardware survey and enjoy such great brand recognition among PC gamers. Nvidia consistently delivers great products, aside from a few duds here and there.
There’s a downside to consistent domination, though. Nvidia keeps pushing the price of graphics cards upwards. Yes, the RTX 20-series GPUs dragged PC gaming kicking and screaming into the ray-traced, AI-upscaled era, but they didn’t deliver tangibly faster performance in traditional games (a.k.a. the vast majority of them). The $700 RTX 2080 slung rays just fine, but in traditional games it slung frames at roughly the same rate as the prior-gen GTX 1080 Ti, for roughly the same price. Performance stagnated, and sales stalled. Nvidia could afford to rest on its laurels in raw frame rates because its GeForce offerings were that far ahead of their Radeon rivals, which suffered after rolling snake eyes on a big bet on expensive HBM memory technology.
AMD didn’t sit idly by though. Its RDNA architecture, found in the Radeon RX 5000-series, greatly improved AMD’s energy efficiency, which in turn helped spur greater performance. The Radeon RX 5700 stood strong as our top pick for a 1440p graphics card during its tenure. But AMD truly started shining with the RDNA 2 architecture found inside its new Radeon RX 6000-series graphics cards. For the first time in a long time, Radeon offers tangible, worthwhile alternatives to the GeForce lineup up and down the entire product stack. Even the vaunted GeForce RTX 3090 found itself threatened by the Radeon RX 6900 XT’s incredible gaming prowess. Competition is thriving yet again.
But here’s the thing: That two-way competition isn’t benefitting mainstream consumers yet. Sure, more games are supporting ray tracing and image upscaling now, and both Nvidia and AMD have been stepping up in terms of software features and hardware integration. That’s great. But prices keep going up, even before a GPU shortage and rampant scalping made this a lost GPU generation of sorts.
Nvidia struck first this generation with its RTX 30-series cards. These GPUs offered improved performance over their stagnant RTX 20-series predecessors, but also came with higher price tags. The GTX 1070 launched at $380; the RTX 3070 launched at $500. You see similar trends up and down the entire GeForce stack. The GTX 1050 launched at $109; the RTX 3050 at $250 (and even that was unrealistically low posturing). And so on.
AMD built its reputation by being an affordable alternative to Nvidia’s offerings. When the GTX 1060 launched for $250, the Radeon RX 580 offered similar performance for $200, for example. But now, with AMD firing on all cylinders with both its Ryzen CPUs and Radeon GPUs, it has turned its attention to higher profit margins. Rather than slap ultra-competitive prices on most of its ultra-competitive Radeon RX 6000 GPUs, it instead hovered roughly around the higher MSRPs Nvidia established first with its RTX 30-series chips. The $480 Radeon RX 6700 XT was only $20 cheaper than its RTX 3070 rival, for example, and performed closer to the $400 RTX 3060 Ti.
It also came in at $80 more expensive than the Radeon RX 5700 XT that came before it. Both of those packed 40 total compute units; before that, the Radeon RX 580 and its 36 compute units debuted at $200 or $240, depending on your memory configuration.
Part of the price creep can be attributed to the increased costs of manufacturing today’s exceptionally complex graphics cards. (The GTX 1060 and RX 580 didn’t need to worry about ray tracing or tensor cores, for example.) Part of it is due to inflation. There are other factors too, including the current pandemic crunch—but it’s hard to deny that Nvidia and AMD’s thirst for higher profit margins plays heavily into it as well. When today’s $200 graphics cards aren’t any faster than 2016’s $200 graphics cards, it’s hard to feel good about being a PC gamer on a budget.
Enter Intel’s Xe HPG architecture, and the Arc graphics cards Intel has crafted from it.
An Arc against rising tides?
It’s still too early to tell if Intel’s Arc gambit will prove successful. Desktop Arc graphics cards aren’t expected until the second quarter, and Xe HPG launched today in its most modest form—as mainstream discrete GPUs for thin, portable laptops, targeting Medium to High graphics settings in games at 1080p. Arc is no RTX 3090 Ti rival, at least not yet.
But these humble beginnings play cleverly to Intel’s strengths. Laptops vastly outship desktops, and DIY is a mere rounding error in comparison. Launching Xe HPG in Arc 3 laptops starting at $899 lets Intel flex its Evo platform and sidestep the ongoing supply woes circling desktop GPUs. Better yet, it lets Chipzilla drive home Arc features that are only available when paired with Intel’s Core processors.
Rather than wade into a brawl with Nvidia and AMD in desktop gaming cards that it might not be able to win, Intel used Arc’s laptop debut to focus on “Deep Link” integration features that bring the power of Intel CPUs and their exceptional media engines to bear on content creation tasks as well. These Arc-powered Evo laptops will be able to dynamically shift power between the CPU and GPU, encode videos at a ferocious clip, accelerate AI tasks (like enhancing old pictures), automatically capture in-game highlights, speed up frame rates with XeSS upscaling, and yes, tap into QuickSync. Arc is also the first GPU to support hardware-accelerated AV1 encoding, in a world where AV1 adoption is rising at a torrid pace.
These modest Arc notebooks could—could—be the ultimate content creation laptops if applications support those features. And they likely will.
For all the attention around Nvidia and AMD’s discrete graphics cards, the integrated graphics in Intel’s processors are the most popular graphics chips in the world, and because of that, many of the world’s most popular applications support Intel features. Intel has more software developers than AMD has employees, period. Handbrake, Adobe Premiere Pro, and others already support some aspects of Deep Link.
What does all this have to do with desktop prices? Nothing. Yet.
We still need to see how well Xe HPG translates to the desktop when discrete Arc graphics cards launch later this year. If it’s a dud, it’s a dud. But this clever, unorthodox launch shows that Intel is looking to battle from positions of strength and is being very shrewd about how it approaches discrete graphics. And with Intel/Nvidia being the most popular combination in laptops with discrete graphics, simply launching Xe with Intel-exclusive features is already a Will Smith-esque shot at Nvidia’s dominance in the mobile space.
Arc may not have launched straight into a desktop brawl, but Intel is indeed looking for a fight—just a fight on ground of its own choosing. Even if Xe HPG doesn’t compete with the upper echelon of Nvidia and AMD’s graphics cards in this first iteration of Arc, having Intel in the fray with rolled-up sleeves and a deep war chest means the stage is now set for even more intense competition. And with how bleak PC gaming has been for everyone but deep-pocketed enthusiasts over the last few years, a new player entering the game couldn’t have come at a better time…if Intel comes out swinging to win market share, rather than striving to maximize its own profits.
Fingers crossed.