Intel’s new GPU and CPU: a match made in heaven?

How did we get here?

It started when I thought it would be a good idea to buy an Intel Arc GPU to test. I picked up the A750 for a reasonable price (which is about to drop), and went to work. Testing had its issues and there were some bumps in the road, but I got results that were helpful. I then turned my attention to building and testing a rig with both an Intel CPU and GPU in it, and that wasn’t so easy.

There were issues recognizing the GPU and an NVMe drive, but the primary problem presented itself as something else entirely. After realizing a BIOS update was necessary, I was back on track, though nowhere near as excited. Links to the BLOG and VIDEO, in case you are interested. It was some work, but it did work.

I wasn’t able to complete the benchmarks because I ran out of time, but I needed to post the video. So, I went back to work on the same rig after posting, and ran into more issues. I already knew that some games were not going to run properly, so I left them off the list. I also knew that one game takes forever to benchmark, so I left it off as well. Bad choice. That benchmark is running as I write this.

Issues

One of my biggest issues was that the game with the Vulkan graphics API wouldn’t let me select it, forcing me to stick with DX11, a known shortcoming for Arc. In fact, I had problems with another DX11 title, Borderlands 3. It’s a game I have thought about dropping, but now I may actually have to.

I was testing on DX11 and DX12, and it just quit. Period. It won’t run: it shows the splash screen, then shuts down completely and won’t start back up. I have a similar problem trying to run this game on my editing rig, but Borderlands 3 now defaults to DX12 and I have several games with that setting. It’s great to benchmark, but it may be time to retire it.

Other issues included World War Z, which caps the framerate and shows horrible artifacting, and Horizon Zero Dawn, which puts out the same framerates no matter the actual resolution. The latter had already confused me during my tests with the Ryzen CPU, so I was expecting it, but it shows Intel still has some work to do on its drivers.

What’s next for the Arc GPU?

I’m not done with this one yet, though I decided that CoolBlue (my Intel test rig) needs some changes. Instead of just changing a fan and getting an Intel NVMe drive to bring this closer to a true DeepCool/Intel build, I ordered a new platform. For the first time in over ten years, I am probably going to make my main rig an Intel build.

I will be changing the GPU, mind you, but I will test this A750 with a 12700KF in a brand-new B760 motherboard. The choice wasn’t made lightly. I was going to go with AM5, but recent problems with motherboards overvolting those CPUs had me turn to something already proven for my new build. But that’s another story…

Intel Arc Graphics. Outstanding, Good, or Trash?

How did we get here?

Intel dominates part of the central processing arena, but what about graphics? For several years there have been two major players in the GPU space, NVIDIA and AMD. They’ve had decades to refine the craft and both have strong products. NVIDIA has dominated until recent years, but what about a third? Can someone else come in and compete with two established juggernauts?

Enter the Arc line of GPUs. It’s Intel’s first foray into the dedicated graphics card market since its AGP-based i740 in 1998. They have had integrated graphics built into motherboard chipsets and processors for decades, but steered clear of discrete cards. That is, until recently, with Xe graphics.

Okay, but Intel?

Having a small YouTube channel gives me an opportunity to test many different PC parts. Most are mainstream, some a little odd, but which category does the Arc GPU fit into? I didn’t have much interest in that question until I found out Intel looks set to continue making GPUs. At the time of writing, there are many unconfirmed rumors that Intel will indeed continue for at least two more generations.

Intel has three discrete cards, the A380, A750, and A770, all three reasonably priced. I chose the A750 because it has fair competition in the same price range: NVIDIA has the RTX 3050 and RTX 3060, while AMD has the RX 6600 and RX 6600 XT, and both also have older products. With an RTX 3060 and an RX 6600 XT on hand, I had test material. If only it were that easy.

First impressions of the card were favorable. It’s sleek, appears to be well built, and is beautiful to look at. Other things about it aren’t as compelling. The drivers are cumbersome, though I understand they’re much improved, and the need for Resizable BAR (a PCIe feature that lets the CPU address the GPU’s full memory) required a BIOS update. It also required digging through outside source material to configure the BIOS before I could reinstall the graphics drivers. Setup wasn’t painless.
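If you ever want to double-check that the Resizable BAR setting actually took effect after a BIOS change like this, here is a rough, hypothetical way to do it from a Linux live USB by reading the GPU’s BAR sizes out of lspci. This isn’t part of my setup and isn’t the only way to check (on Windows, tools like GPU-Z or the driver’s own control panel will report it); it’s just a sketch.

```python
# Sketch: guess whether Resizable BAR is active by looking at the GPU's
# PCI memory BAR sizes in `lspci -vv` output (Linux only; lspci must be installed).
# With ReBAR off, the largest memory BAR is typically only 256 MB; with it on,
# you should see a BAR close to the card's full VRAM (8 GB on an A750).
import re
import subprocess

def gpu_bar_sizes_mb():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    sizes, in_gpu = [], False
    for line in out.splitlines():
        if line and not line[0].isspace():
            # Device header line, e.g. "03:00.0 VGA compatible controller: ..."
            in_gpu = "VGA compatible controller" in line or "3D controller" in line
        elif in_gpu:
            m = re.search(r"Region \d+: Memory at .*\[size=(\d+)([MG])\]", line)
            if m:
                sizes.append(int(m.group(1)) * (1024 if m.group(2) == "G" else 1))
    return sizes

if __name__ == "__main__":
    sizes = gpu_bar_sizes_mb()
    print("GPU memory BAR sizes (MB):", sizes)
    print("Resizable BAR looks enabled" if sizes and max(sizes) > 256
          else "Resizable BAR looks disabled")
```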

Finally

It took several hours, but I was finally able to start testing. With the proper drivers installed, I spent some time repeating the same exercises that gave me issues on the first attempt, and I was pleased to find things working. No locked screens, no blackouts, and no error message saying Resizable BAR wasn’t enabled. It was finally time to test games.

The Arc performed well in DirectX 12 titles, but I was not prepared for the nose dive with the DirectX 11 API. The card stumbled out of the gate on just the second game, Borderlands 3. That game typically runs better on the older API than the new one, but usually within the margin of error. Results didn’t get above 77 frames per second in DX11, where the other cards approached 200. I knew it struggled, but not that much.

So, is the Intel Arc GPU junk?

Not at all, but it does still need some work. The performance in DirectX 12 matches the other cards, and switching Borderlands 3 to DX12 saw it beating the 3060. I need another example of a Vulkan title, but what I’ve seen so far is promising. Temperatures stay in the low-to-mid 70s Celsius, and the clock speed holds at 2,400 MHz. Menus and apps are responsive with no delays or hang-ups. Overall performance is good, with no artifacting or glitches. It also looks nice, with smooth lines and a tasteful white LED.

AMD and NVIDIA options straddle the Intel cards in both price and performance, and although Intel has no high-end offering, it wasn’t long ago that AMD only targeted the midrange market. The A750 retails at $250, placing it firmly in the mid-tier of GPUs. It’s a solid card, and I look forward to testing others.

The video can be found here
