Intel and Nvidia deny rumors that they have teamed up against AMD


Screenshot: Nvidia

If you've browsed the AMD subreddit, the Linus Tech Tips forums, or elsewhere over the past few months, you may have encountered a conspiracy theory that Intel and Nvidia entered into a secret agreement to keep higher-end GPUs out of AMD Ryzen 4000 series laptops. If you look at the slate of AMD laptops released last year, you could almost believe it. The Asus ROG Zephyrus G14, Lenovo Legion 5, and others all came with an AMD processor, but nothing higher than an RTX 2060. Conspiracy theories are provocative, but this one seems to be nothing more than a product of the Intel/AMD/Nvidia wars. It doesn't help that unfounded claims from blogs and news sites around the world continue to push the same narrative. All it takes is a little digging to see that this is not a juicy scandal, just a complicated web of how CPUs and GPUs work together.


In April 2020, Frank Azor, AMD's Chief Architect of Gaming Solutions and Marketing, responded to a Twitter user's question about the lack of high-end GPUs in AMD laptops by saying, "You'll have to ask your favorite OEMs and PC builders that question." That was around the time the conspiracy theory began to take shape, but Azor had a point. Laptop configurations are determined by OEMs, not chip makers. And those configurations are usually driven by cost, but they also have to make sense: an underpowered CPU with an overpowered GPU is not a good combination, and that is the category the Ryzen 9 4900HS and below fell into.

Azor even sat down with The Full Nerd in May 2020 to address the issue again and talk specifically about OEMs' confidence in Ryzen processors. "I think Ryzen 4000 exceeded everyone's expectations, probably even our own. It was difficult to imagine a world where we had the fastest mobile processor," said Azor. "I think if you were planning your notebook portfolio as an OEM and you had not yet come to that realization (and remember, all the planning for these notebooks was done last year), you leaned a little lighter on AMD."

In essence, OEMs' confidence that AMD had a screaming-fast processor just wasn't there. So why would they pair a high-end GPU with a mobile processor they believed would be inferior? The middle ground, the "meat of the market" as Azor put it, was laptops with RTX 2060s and lower. Yet even with this reasonable explanation, the rumor mill kept churning, scouring the processors' specifications for answers.

Gizmodo reached out to Intel and Nvidia about these rumors, which both companies strongly denied. An Nvidia spokesperson told Gizmodo: "The claim is not true. OEMs decide on their system configurations and choose the GPU and then the CPU to pair with it. We support both Intel and AMD CPUs in our product stack."

An Intel spokesperson echoed the sentiment: "These allegations are false and no such agreement exists. Intel is committed to conducting business with uncompromising integrity and professionalism."

Nvidia's and Intel's firm denials certainly suggest that this theory holds little to no water, yet I don't think you need their denials to see that the theory is bunk. The fact is that the Ryzen 4000 series was never a strong contender for high-end mobile gaming.


There are three aspects of AMD's Ryzen 4000 series that likely made OEMs decide against pairing it with a higher-end graphics card: PCIe limitations, CPU cache, and the most obvious one, single-core performance.

Gaming relies more on single-core performance than multi-core performance, and Intel generally has better single-core performance. That is true both historically and when comparing Intel's 10th-generation chips to AMD's Ryzen 4000 series. Heck, the 10th-gen Core i9-10900K's gaming benchmarks are on par with AMD's newer Ryzen 9 5950X when both are paired with an RTX 3080.
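To see why single-core speed matters so much, consider Amdahl's law: a game loop that is mostly serial gains little from extra parallel throughput. Here is a minimal sketch; the 40% parallel fraction is an assumed figure for illustration, not a measured property of any game.

```python
# Amdahl's law: if only a fraction p of each frame's work can run in
# parallel, the overall speedup from s-times more parallel throughput is
# 1 / ((1 - p) + p / s). The 40% parallel fraction below is an assumed
# figure for illustration, not a measured property of any game.

def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup given parallel fraction p and parallel speedup s."""
    return 1.0 / ((1.0 - p) + p / s)

# Even 8x more parallel muscle barely helps a mostly serial game loop:
print(f"{amdahl_speedup(p=0.4, s=8):.2f}x")  # ~1.54x
# ...whereas a 20% faster single core speeds up the whole frame by 1.2x.
```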

In our previous laptop tests, AMD's Ryzen 9 4900HS in Asus's ROG Zephyrus G14 had poorer single-core performance than the Intel Core i7-10875H in MSI's Creator 15. The Core i7-10875H is not at the top of Intel's 10th-gen mobile line, but the Ryzen 9 4900HS is at the top of AMD's. Yet with nearly the same GPU (RTX 2060 Max-Q in the G14, RTX 2060 in the Creator 15), the Intel system still came out 1-3 fps ahead (1080p, ultra settings). If you paired a more powerful GPU with the Ryzen 9 4900HS, the CPU's single-core performance would likely have bottlenecked the game.
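A toy model makes the bottleneck argument concrete: the frame rate you actually see is capped by whichever side, CPU or GPU, takes longer per frame. The fps figures below are made-up illustrations, not benchmark results.

```python
# Toy bottleneck model: the frame rate you see is set by whichever stage,
# CPU or GPU, takes longer to finish its share of each frame.
# All fps figures below are made-up illustrations, not benchmark data.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower stage sets the pace for the whole pipeline."""
    return min(cpu_fps, gpu_fps)

# Upgrading the GPU helps until the CPU becomes the limiter...
print(effective_fps(cpu_fps=110, gpu_fps=100))  # 100: GPU-bound
print(effective_fps(cpu_fps=110, gpu_fps=150))  # 110: now CPU-bound
# ...after which a faster GPU buys you nothing:
print(effective_fps(cpu_fps=110, gpu_fps=200))  # still 110
```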

That results in less-than-stellar performance compared to Intel's offerings, especially when combined with the wimpy L3 CPU cache in the Ryzen 4000 series: just 8MB of L3, half of what Intel's chips carry. More requests miss the smaller cache and go all the way out to main memory, so the average time it takes to access data is longer than on the Intel processor.
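The standard way to quantify that is average memory access time (AMAT): hit latency plus miss rate times miss penalty. A minimal sketch follows; the latencies and miss rates are assumed round numbers for illustration, not measurements of these chips.

```python
# Back-of-the-envelope average memory access time (AMAT):
#   AMAT = L3 hit time + L3 miss rate * main-memory penalty
# The latencies and miss rates below are assumed round numbers for
# illustration, not measurements of any specific CPU.

L3_HIT_NS = 10.0        # assumed L3 hit latency
DRAM_PENALTY_NS = 80.0  # assumed extra cost of going out to DRAM

def amat(l3_miss_rate: float) -> float:
    """Average memory access time in nanoseconds."""
    return L3_HIT_NS + l3_miss_rate * DRAM_PENALTY_NS

# A smaller L3 misses more often on the same working set. Suppose
# halving the cache bumps the miss rate from 10% to 15%:
print(f"16MB L3 (10% miss rate): {amat(0.10):.1f} ns")  # 18.0 ns
print(f" 8MB L3 (15% miss rate): {amat(0.15):.1f} ns")  # 22.0 ns
```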

The PCIe limitations of the Ryzen 4000 series could also have contributed to OEMs' reluctance to adopt it, though this idea is a bit shakier. It originated from a blog post on igor'sLAB: because Ryzen 4000 CPUs have only eight PCIe 3.0 lanes dedicated to a discrete GPU, pairing one with anything higher than an RTX 2060 could cause a bottleneck. Each PCIe device needs a certain number of lanes to run at full capacity, and both Nvidia's and AMD's GPUs are built for 16. Because Intel's 10th-generation processors supported 16 lanes to the GPU, they made a better fit for the RTX 2070 and higher GPUs in last year's gaming laptops.
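Some quick arithmetic puts the lane counts in perspective. PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, which works out to roughly 1 GB/s of usable bandwidth per lane, so x8 versus x16 is about 7.9 GB/s versus 15.8 GB/s each way. These are theoretical maxima that games rarely saturate, which hints at why halving the lanes usually costs so little.

```python
# Theoretical one-direction PCIe bandwidth per link width.
# PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with
# 128b/130b encoding, so usable data rate = GT/s * (128/130) / 8 bytes.

GT_PER_SEC = {"3.0": 8.0, "4.0": 16.0}  # transfer rate per lane
ENCODING = 128 / 130                     # 128b/130b line-coding efficiency

def bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s (ignores protocol overhead)."""
    return GT_PER_SEC[gen] * ENCODING / 8 * lanes

print(f"PCIe 3.0 x8:  {bandwidth_gb_s('3.0', 8):.1f} GB/s")   # ~7.9
print(f"PCIe 3.0 x16: {bandwidth_gb_s('3.0', 16):.1f} GB/s")  # ~15.8
print(f"PCIe 4.0 x16: {bandwidth_gb_s('4.0', 16):.1f} GB/s")  # ~31.5
```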

However, many people on Reddit and other online forums pointed out that the performance hit from pairing a Ryzen 4000 CPU with an RTX 2070 or higher GPU would be very small, if noticeable at all, so the explanation didn't add up for them. (More fuel for the conspiracy theory.) Of course, I had to test it all myself to see if there really is a drop in performance going from 16 lanes to eight.

My own tests showed that 16 lanes do indeed offer better performance on higher-end GPUs, but the difference can be pretty insignificant. Granted, I used a much more powerful processor than the Ryzen 9 4900HS, so it could handle an RTX 2060 and above no matter how many PCIe lanes were available.

My test machine was configured with an Intel Core i9-10900K, an Asus ROG Maximus XII Extreme motherboard, 16GB (8GB x 2) of G.Skill Trident Z Royal DDR4-3600 RAM, a Samsung 970 Evo 500GB M.2 PCIe SSD, a Seasonic 1000W PSU, and a Corsair H150i Pro RGB 360mm AIO for cooling.

Game performance barely changed after I switched the PCIe configuration from 16 lanes to eight, but the difference was noticeable in synthetic benchmarks. Comparing an RTX 2060 to an RTX 2070 Super (the closest GPU I had to an RTX 2070), I ran benchmarks in Geekbench 5, 3DMark, PCMark 10, Shadow of the Tomb Raider, and Metro Exodus, some of which are part of our regular test suite.

Frame rates increased by a maximum of 4 fps, with the most noticeable difference in Shadow of the Tomb Raider at 1080p. That supports what many have said about gaming performance not being significantly affected by halving the number of PCIe lanes to the GPU, at least until you get to something as powerful as an RTX 2080 Ti.

The synthetic benchmark scores didn't change much from eight lanes to 16 with the RTX 2060, but the difference was more pronounced with the RTX 2070 Super, indicating a measurable gap that could matter in other applications. The RTX 2070 Super's Geekbench score jumped by 3,000 when all 16 lanes were made available to the GPU. Time Spy delivered results in line with the gaming benchmarks, and oddly enough, the RTX 2060 got a bigger boost in the PCMark test than the 2070 Super did.

Of course, synthetic benchmarks are not a measure of real-world performance, and PCIe bandwidth is hardly the most important thing that will slow down your system. But with many reviewers using these numbers to paint a picture of a laptop or desktop, AMD 4000 series processors paired with anything higher than an RTX 2060 would have looked worse than they should. For higher-end, performance-driven GPUs, every extra point and every extra frame matters, especially with so many OEMs competing for a spot on your desk.

All of which suggests that, yes, OEMs will favor the "better" CPU, even if the better CPU is only slightly better. The scarcity of AMD 4000 series processors paired with higher-end Nvidia graphics may partly stem from OEMs underestimating how many consumers wanted that type of laptop configuration last year, but it more likely comes down to the 4000 series' smaller L3 cache and slower single-core speeds. Sure, the RTX 2070 and higher can run fine on PCIe x8, but if the CPU doesn't have the juice to keep up with the GPU, none of that matters.


There is one last point that unravels this theory: if Intel and Nvidia teamed up to exclude AMD, why are more OEMs wholeheartedly embracing the AMD/Nvidia combination this time around? Many of their AMD Ryzen 5000 series-powered laptops have an RTX 3070 or 3080; the latest AMD Ryzen mobile processors have 20MB of combined L3 and L2 cache and support up to 24 PCIe Gen4 lanes (16 of them dedicated to a discrete GPU), exactly what they need to pair nicely with anything above a midrange card.

Companies do get caught in all sorts of shady activities that pad their bottom lines while harming consumers, and those activities affect the choices we make every time we walk into a Best Buy with money burning in our pockets. But no, Intel and Nvidia are probably not to blame for OEMs' slow adoption of AMD CPUs. AMD has spent the past few years rebuilding its reputation, creating processors that can truly compete with Intel in the mobile space and support the powerful GPUs Nvidia makes for laptops.

The Ryzen 4000 series was very good, but not quite ready to compete in the areas that matter most to gamers and laptop OEMs. The Ryzen 5000 series, if OEM adoption is any indication, is going to be a whole other beast, and it will probably show up in all the big gaming laptops the 4000 series didn't. Nvidia and Intel have nothing to do with it.
