Buying a better graphics card for the same price.

How did we get here?

Recently, I found a deal on an older Xeon combo and thought it would be good to do a theme build. The build works great; I used a GTX 1660, available for about a hundred bucks used. It’s a solid graphics card at a solid price. New, however, is a different story.

This card runs around 200 USD new and is widely available, but it’s not the only one at that price. Several cards are in that range, and better ones at that. One of those is Intel’s Arc A750. I opted for the 1660 because it’s better than AMD’s offering, the 2048sp version of the RX 580. Granted, the 580 is sold as a new card, but it’s more of a refurb than a brand-new option. We’ll touch on that soon.

To test these I used my editing rig with a 12th Gen i7, the newest CPU I have. Matched with 32 GB of DDR4 memory and a list of games, I ran some benchmarks. The 1660 is about four years old and the A750 about a year, but again, prices are the same for the new cards and both still have driver support. The refurb 580s may be sold as new, but driver support ended a few months ago. For that matter, the 1660 is no longer in production, but it’s still available.

Well, what were the results?

Not surprisingly, the A750 performed better; the margin, however, surprised me. I ran two DX11 titles and a handful of DX12 games, and the difference was noticeable. The DX11 titles were reasonably close, but in some cases the DX12 titles ran at twice the framerate. That the DX11 results were even close shows just how much Intel continues to improve driver support for its new GPUs.

The Arc series of graphics cards keeps getting better, and the price makes it a great value. AMD cards like the RX 580, RX 5700, and even RX 6600 have all continued to drop at a reasonable rate, but not so with the green team. The GTX 1660, essentially an RTX 2060 without ray tracing, just isn’t worth the money new.

I bought the Arc A750 for more than it sells for now, but AMD’s RX 6600 offers similar performance to the Arc card, and both are better than the 1660 at the same price. Nvidia’s comparable card, the RTX 3060, runs almost a hundred USD more brand new. The green team still leads in ray tracing and the NVENC encoder is outstanding, but the Arc has the AV1 encoder, and that has been a huge leap forward.

So, why use the older graphics card?

That part is simple. It’s still a solid card. The used price is great, and it’s perfect for the build it’s in. Paired with the Xeon, from Intel’s fourth-gen chips, the CPU-to-GPU balance is almost perfect. Everything in that build seems to complement everything else, and it ends up being a great gaming PC at 1080p.

For that matter, the A750 ends up being a great complement to the i7 12700 in the editing rig. I didn’t think that would be the case when I put it in there. The driver effort from Intel has been that good.

The YouTube video on the comparison is here, while the video on the build itself is here. There wasn’t a blog post on the Xeon/1660 pairing, but there is a blog about matching older and newer hardware in a ‘broken’ PC here.

Can Arc GPUs play the new Starfield? Sort of.

How did we get here?

The A750, being a new architecture, is bound to have issues, but you would think some things shouldn’t happen. Intel released the Arc GPUs about a year ago and declared that they were concentrating on the most up-to-date technologies first. This meant games running DX12 and the new AV1 encoder, among other things. They admitted older APIs would take time to ‘fill in,’ and to many of us, that was okay.

I saw it as an opportunity to do some testing and make content, so the discovery process worked well for me. Newer titles run well, and as I recently discovered, the AV1 encoder is amazing. Older titles running DirectX 11 or 9 had some issues, but that was expected. No one felt they needed to call Intel out on that, because they told us what to expect. They reported, and we confirmed. Everything checked out fine.

Intel has released several driver updates over the year to improve those older titles. Improvements have shown up in older games, especially the more popular ones, and even the XeSS upscaler is solid. Some games like Borderlands 3 still suffer in DX11, though, and straight-up crash in DX12. Others that offer a choice of DX11 or Vulkan are abysmal. World War Z, for example, is horrible in DX11 and can’t access the option to run in Vulkan. Still, for the price, the Arc GPUs are a great buy.

There’s always a but.

There is this time, as well.

Starfield is a game Bethesda has been working on for several years, and it was known about when the Arc GPUs launched. Being a new title, it should be right in the new GPUs’ wheelhouse. No. The pre-release wouldn’t launch at all. For that matter, as of a week after launch, it wouldn’t start on some systems, including mine.

Try as I might, I can’t think of a good excuse for this. Nor can I pin down who should be more at fault. Intel knew about the game and should have been ready for this release, but Bethesda has made comments that even newer hardware won’t be enough to run the game. I get not being able to run it well, but not being able to even start the game?

This was supposed to be one of the most anticipated games from Bethesda in a decade, and you tell us things like the Arc GPUs don’t even meet minimum specs. The card is a year old and compares to cards that are above your minimum specs. That doesn’t wash. Intel isn’t in the clear here either, though: this game wasn’t a secret, and it wasn’t some sort of indie project. It was a major release and should have been anticipated.

Does it work on Arc or not?

Luckily, Intel continues to work on drivers, and we can play this game on our Arc GPUs. It’s not perfect, by any means. There are issues, like the game not staying fullscreen when I change the resolution, or a weird blue shading that appears when I change settings, but I can work around them. I shouldn’t have to, but it’s not the end of the world. It does take away from the immersion, though.

The graphics in the game are beautiful, and the storyline begins to take shape a few hours into playing. Without some persistence, though, owners of Intel GPUs might never reach that point. I could understand if many simply walked away from it, especially at that price point. Right now, it’s an average experience, and if you are used to playing other games in the genre, like No Man’s Sky or Star Citizen, you may end up passing on it altogether. I will tell you, though, that I was happy to get it working on the A750; I just don’t know how much of it I’ll be playing.

Link to the video is here


Rendering videos with the new editing rig

How did we get here?

Having completed most of the work on CoolBlue, the new editing rig, it was time to start testing. Theoretically it should be better, but that still needed proving. This would primarily be a test of the CPU, because the version of DaVinci Resolve I’m using doesn’t support Nvidia’s NVENC encoder. I’ve been using it like this for a couple of years, but there’s a wrinkle later.

I devised the rather unscientific procedure of three test runs each on identical projects. I was able to do this because the files are now kept on the server and available to every computer on my network. It would be three passes each with the AMD Ryzen 7 5800X and the Intel i7 12700KF. I would render a twelve-minute video, alternating between the two systems and allowing a cooldown between each run.
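For anyone repeating this, the bookkeeping is simple enough to script. Below is a minimal sketch, not the tool I actually used: render times get noted after each pass as “m:ss” strings, then converted to seconds and averaged per system. Only the times quoted later in this write-up are filled in; the helper names and the run lists are otherwise placeholders.

```python
# Minimal sketch of the bookkeeping, not the tool actually used.
# Render times are jotted down after each Resolve pass as "m:ss" strings,
# then converted to seconds and averaged per system.

def to_seconds(t: str) -> int:
    """Convert an 'm:ss' string such as '4:34' into whole seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

def average_seconds(times: list[str]) -> float:
    """Average a list of 'm:ss' render times, in seconds."""
    return sum(to_seconds(t) for t in times) / len(times)

# Only the times quoted later in this post are filled in; the rest of each
# list is left for whoever reruns the test.
runs = {
    "Ryzen 7 5800X":   ["4:34"],          # first pass
    "Core i7-12700KF": ["4:54", "3:19"],  # first and third passes
}

for system, times in runs.items():
    print(f"{system}: {average_seconds(times):.0f} s average over {len(times)} run(s)")
```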

Both systems are similar, with everything handling the media nearly identical. Both use an Intel NVMe drive, with other assets stored on a WD Black 7200 RPM drive. Each saves to the server I recently built on the home network. The GPUs aren’t a factor (yet), and the DDR4 memory is 32 GB at 3600 MHz. The only difference is the platform.

The testing

The first run was almost what I expected, except the AMD was twenty seconds faster. I started to question my idea of swapping rigs at this point, but there was more to test. The 5800X clocked in at 4 minutes and 34 seconds, and the Intel render took twenty seconds longer. The second run saw them reversed, with the 12700 turning in the faster time. The averages over the two runs were almost identical; I could have gone with either system at this point.

That changed with the third run. The AMD rig turned in a time similar to its other runs, but the Intel system rendered the video in three minutes, nineteen seconds. It shaved a full minute off anything the AMD offered. Both of these systems are previous-generation platforms with the same generation of memory and the same storage solution. It would be a no-brainer to use the new build, but there was more.

I mentioned that the version of Resolve I’m using doesn’t take advantage of the RTX 3060 Ti in the current editing rig, but it does allow for the AV1 encoder on the Arc A750. The same A750 that is in the new build. Might this be the test that convinces me to switch?

AV1 in the new editing rig

One advantage of the Arc graphics cards is the new AV1 encoder, which is also now included on AMD’s new 7000-series graphics cards. After a small misstep, I was able to configure Resolve properly, and I ran the render test, which blistered the previous efforts.
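Resolve handles the encoder selection in its render settings, but the same hardware path can be exercised outside of Resolve. Below is a hedged illustration, not what I ran: it calls ffmpeg’s QSV AV1 encoder, which needs an ffmpeg build with Intel QSV (oneVPL) support, and the filenames and bitrate are placeholders.

```python
# Illustration only: driving the Arc's hardware AV1 encoder through ffmpeg's
# QSV path instead of Resolve. Requires an ffmpeg build with Intel QSV/oneVPL
# support; the filenames and bitrate below are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "timeline_export.mov",  # hypothetical source clip
    "-c:v", "av1_qsv",            # Intel QSV AV1 hardware encoder
    "-b:v", "10M",                # placeholder target bitrate
    "-c:a", "copy",               # pass the audio through untouched
    "output_av1.mp4",
], check=True)
```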

The new run took one minute and fifty-two seconds. My fastest run was now cut almost in half! Half! I was convinced. I used to start rendering, then leave the room to grab something to drink or take a bio break. Now, I will barely have time to find which image will be my thumbnail for the video. This will save hours over a year’s time and pay huge dividends as I move forward.

For good measure and peace of mind, I rendered the support video for this story using the same method, and the eleven-minute, seven-second video finished in two minutes even. It’s a huge improvement, and it makes me excited to use this editing rig going forward. It does raise the question of whether I should completely swap PCs or keep two systems for two different functions, but that’s a question for later.
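To put numbers on “almost half” and “two minutes even,” here is the quick arithmetic, using only the times quoted above.

```python
# Sanity-check the speedups using only the times quoted above.
cpu_best = 3 * 60 + 19   # fastest CPU-only render: 3:19  -> 199 s
av1_run  = 1 * 60 + 52   # first AV1 hardware render: 1:52 -> 112 s
print(f"AV1 render took {av1_run / cpu_best:.0%} of the CPU-only time")
# roughly 56%, i.e. about 44% faster -- "almost half" holds up

video_len  = 11 * 60 + 7  # second test video: 11:07 of footage -> 667 s
render_len = 2 * 60       # rendered in 2:00 -> 120 s
print(f"Second render finished about {video_len / render_len:.1f}x faster than real time")
# about 5.6x real time
```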

The video for the testing and results is here.


New drivers, but the same results

How did we get here?

A few months back I bought Intel’s Arc A750 graphics card to test. The day-one drivers, Intel admitted, were lacking support for some older games. Newer games use DirectX 12 to interface with Windows, while older ones use DX11 or 9. Some games also use the Vulkan API, but we’ll get to that in a minute. The Arc GPUs were optimized for DX12 and Vulkan, with plans to catch up as they went. If they went.

Newer titles perform very well, and the A750 can be picked up right now for under 250 USD. It’s a great deal for the money, and they have updated a wide variety of games. There are, however, some issues. Some games are good for benchmarks because of how they stress components. They may not always be popular, but they have a purpose.

One popular title for testing is Borderlands 3. It stresses the graphics card and often makes the best cards run hot. It also runs on DX11 AND DX12, except it doesn’t; not with the Intel drivers. Okay, DX11 technically works, but DX12 crashes and won’t restart without going into the game’s config file. I don’t think that should be the default experience.
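For anyone stuck in that crash loop, the workaround is flipping the API back to DX11 in that config file before relaunching. The sketch below assumes the default Windows documents path and a PreferredGraphicsAPI key in GameUserSettings.ini; both are from memory rather than anything confirmed here, so check your own install before running it.

```python
# Hedged sketch: force Borderlands 3 back to DX11 by editing its config file.
# The path and the PreferredGraphicsAPI key are assumptions from memory --
# verify them in your own GameUserSettings.ini before trusting this.
from pathlib import Path

cfg = (Path.home() / "Documents" / "My Games" / "Borderlands 3"
       / "Saved" / "Config" / "WindowsNoEditor" / "GameUserSettings.ini")

text = cfg.read_text(encoding="utf-8")
cfg.write_text(text.replace("PreferredGraphicsAPI=DX12",
                            "PreferredGraphicsAPI=DX11"),
               encoding="utf-8")
print("Borderlands 3 set back to DX11 (if the DX12 key was present).")
```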

While using DX11, performance is less than stellar. The game measures from the mid-70s in frames per second to around 100, whether the render resolution is set to 1440p or the more standard 1080p. Generally, 1080p should be a 10 to 20% improvement over the higher resolution. It’s not; if anything it’s worse, AND it’s worse with the new driver. Only a few frames, and within the margin of error, but worse.

Other issues

World War Z is another game with a different problem. Why would I mention a game with limited popularity? Because it’s a dual-API game, with a choice between DX11 and Vulkan. Well, it’s supposed to support both, but Vulkan isn’t currently an option. I haven’t looked at the config file yet, but from the menu, it’s a no.

Still, that’s not the real problem. The real problem is that the DX11 drivers are all but broken. Any combination of texture settings and resolution nets you 60 fps or less. GPU and CPU usage are both extremely low, and memory usage is through the roof. Tearing, frame drops, and missing textures are all common, making it almost painful to watch. WWZ isn’t that popular, so it probably won’t be fixed for quite a while. It is a valuable benchmark, however. Or was.

They didn’t list either of these games as improved, but I had hoped.

Will there be better drivers?

Most assuredly. Intel has done a fantastic job of releasing updates every few weeks, with a larger one about every quarter. Each time there is an improvement, and we have to remember that they haven’t been producing graphics cards for two-plus decades like the other two have. For a freshman effort, it’s outstanding and priced extremely well.

The card is under 250 USD and compares well to cards that cost at least one and a half times as much. It’s a worthy opponent, and it’s a beautiful, sleek-looking card that complements almost any system. The AV1 encoder will be of great use for rendering video content, and it is power efficient as well. The issues with some older titles are an inconvenience for those of us doing benchmarks, but the card performs well in newer titles. It’s actually a solid card.

I was skeptical when it came out, but figured the price was low enough that I could afford the risk if it turned out to be junk. I’m happy to say it’s not junk. It’s actually a great card, and a great buy. With the constant improvements to the Arc drivers, I’m excited to see Battlemage when it releases, and I never imagined myself saying that. If the current example holds true, the next gen will also be affordable, putting pressure on Nvidia and AMD.

This card will stay in my build, which will soon be my editing rig. In turn, I will use my other PC for testing, but I think the Arc will make a decent gaming and streaming option, so it gets a try. If it doesn’t cut it, I’ll just swap it out for the RTX 3060 Ti I’m using now.

The video on this is located here
