Can Arc GPUs play the new Starfield? Sort of.

How did we get here

The A750, being a new architecture, is bound to have issues, but you would think some things shouldn’t happen. Intel released the Arc GPUs about a year ago and declared that they were concentrating on the most up-to-date processes first. That meant games running DX12 and the new AV1 encoder, among other things. They admitted older APIs and processes would take time to ‘fill in’, and to many of us, that was okay.

I saw it as an opportunity to do some testing and make content, so the discovery process worked well for me. Newer titles run well, and as recently discovered, the AV1 encoder is amazing. Older titles running DirectX 11 or 9 had some issues, but that was expected. No one felt the need to call Intel out on that because they told us what to expect. They reported, and we confirmed. Everything checked out fine.

Intel has released several driver updates over the year to improve those older titles. Improvements have been seen in older games, especially the more popular ones, and even the XeSS upscaler is solid. Some games, like Borderlands 3, still suffer in DX11, though, and straight-up crash in DX12. Others that have the option of DX11 or Vulkan are abysmal. World War Z, for example, is horrible in DX11 and cannot access the option to run in Vulkan. Still, for the price, the Arc GPUs are a great buy.

There’s always a but

There is this time, as well.

Starfield is a game that Bethesda has been working on for several years, and it was known about when the Arc GPUs launched. Being a new title, it should be in the new GPUs’ wheelhouse. No. The pre-release wouldn’t launch at all. For that matter, as of a week after launch, it wouldn’t start on some systems, including mine.

Try as I might, I can’t think of a good excuse for this. Furthermore, I can’t pin down who should be more at fault. Intel knew about the game and should have been ready for this release, but Bethesda has made comments that even newer hardware won’t be enough to run the game. I get not being able to run it well, but to even start the game?

This was supposed to be one of the most anticipated games to come from Bethesda in a decade, and you tell us things like the Arc GPUs don’t even meet minimum specs. The card is a year old and compares to cards that are above your minimum specs. That doesn’t wash. Intel isn’t in the clear here either, though. This game wasn’t a secret, and it wasn’t some sort of indie project. It was a major release and should have been anticipated.

Does it work on Arc or not?

Luckily, Intel continues to work on drivers, and we can play this game on our Arc GPUs. It’s not perfect, by any means. There are issues, like losing full screen after changing the resolution, or weird blue shading that appears when I change settings. I can play around with it, but I shouldn’t have to. It’s not the end of the world, but it does take away from the immersion.

The graphics in the game are beautiful, and the storyline begins to take shape a few hours into playing. Without some persistence, though, owners of Intel GPUs might never reach that point. I could understand if many simply walked away from it, especially at that price point. At this point, it’s an average experience, and if you are used to playing other games in the genre, like No Man’s Sky or Star Citizen, you may end up passing on it altogether. I will tell you, though, that I was happy to get it working on the A750. I just don’t know how much of it I’ll be playing.

Link to the video is here

Back to the blog page

Cooling the CoolBlue build. It should be very easy, right?

How did we get here?

This started easy enough: add an All-in-One cooler to CoolBlue. Yeah, not so much. Oh, everything leading up to the change went well, but it wasn’t until getting our hands dirty that things took a turn. Let’s start from the beginning.

I had a Corsair H100i CPU cooler not being used, but it wouldn’t fit as it was, so I went shopping. I had to find a retrofit kit, which in this case was four slightly shorter standoff posts. That was easy enough; they are widely available and not unreasonably priced. Not unreasonable, that is, until you factor in shipping.

The Corsair site had these for 7.99 USD. Great, let’s order them. 9.99 USD for shipping. What? They cost more to ship than the item itself? You’ve lost your pea-pickin’ mind! There were several sellers on eBay, however, so that’s where we ordered. Four days later, things looked ready to go.

I chose to reuse the DeepCool fans already in the case, which was fine, but as it turns out, the case has a flaw. The fans that mount to the top of the case can only mount in the center. Many case manufacturers add additional grooves so things can be pulled closer to the glass panel and not have an issue with the height of cooling blocks on the motherboard. This case lacks that. My next option was to mount the radiator behind the front panel.

A Different Problem

Technically it fits, but it looks awkward. The front has three fans mounted and now the top two reach into the case further, making things look sloppy. There is also the matter of the CPU cooling block having a lot of extra cables causing clutter. Still, I could look past that if it kept things nice and cool. It didn’t.

Okay, it did keep things cool, but only as much as the tower cooler I took out. Not a good trade. It looked bad from the front, there were more cables to manage in the back, and now the temps were only slightly better. By slightly better, I mean margin-of-error better. I was able to get slightly better temps by removing the front panel, but that’s not ideal. So, you’re thinking, a horrible fail.

Yes, and No. It was a failure, but I learned something. What not to do.

What’s next for CoolBlue

The very next thing to do is remove the All-in-One cooler from the build. Aesthetically, it’s not pleasing and it’s also not effective.

Next, I will look at the big brother of the current AG400 CPU cooler, the AG620. It is rated for a higher total power draw, and although that’s more than this CPU needs, it pushes more air directly out of the case. It may only mean a few degrees cooler, but to be honest, temps weren’t bad; I was just looking to improve them so I wouldn’t have any throttling when I edit.

This also may mean a different case. CoolBlue gets its name because it is an Intel CPU/GPU build with DeepCool parts (including the case), but I did put a Cooler Master power supply in. I may be able to do the same with the case and still keep the theme. I may even be able to find a blue case. If I find one with better airflow, that can solve all of the problems at one time, and that would be cool. No pun intended. After that, it will be the process of swapping SSDs and loading the programs I use most often, and then it may be ready to become my new everyday driver. We’ll see.

Link to the YouTube video


CoolBlue Gets a Huge Upgrade – So Much Better!

How did we get here?

It started simple enough, with only wanting to build a PC with two brands. CoolBlue was a combination of DeepCool parts and Intel, but it was incomplete. I wasn’t using an available Intel NVMe drive, and the fans were not all addressable RGB. Sometimes, I don’t leave well enough alone. The link to that blog is here.

While looking for said parts, I found a price drop on a 12700KF from Intel. It was a good deal, but I was only interested if I found an equally good deal on a motherboard. Damn. I found one. That meant if it turned out better than my editing rig, I had work to do. Swapping platforms is a huge deal.

There was only one way to find out if the combination would be better, so I got to work. The parts came in, and this time the build went much quicker than the last effort. With everything installed, including the new NVMe and fans, I started testing. So far, the tests have only included the Arc A750, Intel’s graphics card, but the CPU-intensive tests told the story.

The testing

On a free benchmark render tool called Cinebench R23, I saw what amounted to a 50% uplift in performance over my current setup. This test renders a complex image using only the central processor, with no help from the graphics card. One and a half times the performance out of the 12700KF was far more than I could have planned. It was time to run the game benchmarks.
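For anyone who wants to sanity-check that math, the uplift is just the new score divided by the old one. A quick sketch in Python; the 14,500 here is a made-up stand-in for the old rig’s score, chosen only to illustrate the arithmetic alongside the 22,000 R23 score from this build:

```python
def percent_uplift(old_score: float, new_score: float) -> float:
    """Performance uplift of new_score over old_score, in percent."""
    return (new_score / old_score - 1.0) * 100.0

# 22,000 is the R23 score from this build; 14,500 is a hypothetical
# old-rig score used only to show how the percentage works out.
print(f"{percent_uplift(14500, 22000):.0f}% uplift")  # prints "52% uplift"
```

So a score roughly one and a half times the old one is the same thing as a ~50% uplift.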

Not having run benchmarks on the Ryzen 7 5800X combination with the Arc A750, I had to rely on uplift from the Intel 11400 tested previously. The change was not only noticeable, but I was also getting passing results from the A750 in tests that had previously failed. It just ran better with the new CPU. The A750 wasn’t good enough to be my daily driver, but it was obvious the new CPU was.

I used a DeepCool AG400 tower cooler, which will handle most activity well, and with four PC120 fans, temperatures were very stable during most testing. It did throttle during CPU stress tests, though. Yes, the same R23 run that measured a 22,000 score did so with a red CPU light on. It was impressive all the same.

This is a great processor, and better than what I’m using. I now had a conundrum.

Is CoolBlue my new everyday rig?

My setup with I Am Number Four is very solid and stable. It’s been my workhorse for the channel and my growing social media effort. Changing would require me to swap a lot of drives and programs over. I was hesitant. I’m still hesitant.

I will need to find which drives to move and what to do with I Am Number Four, but CoolBlue will indeed become my editing, streaming, and gaming rig. You don’t realize how comfortable you are with something until you go to change it. I still don’t use the gaming setup in the other room, so I will be breaking that PC down as well. I didn’t see that coming, but I guess I’m crossing a threshold of sorts.

This is a very capable PC and will be a great next step in handling everything I need to throw at it. And, for the first time in a very, very long time, my ‘go to’ rig will be an Intel. I will, of course, change out the graphics card and the power supply, but the new one will be a Cooler Master, so the name will stay the same. Paired with an RTX 3060 Ti, there won’t be much it can’t handle. Welcome to the family, CoolBlue.

The YouTube video can be found here.


Intel’s new GPU and CPU: a match made in heaven?

How did we get here?

It started when I thought it would be a good idea to buy an Intel Arc GPU to test. I picked up the A750 for a reasonable price (which is about to drop) and went to work. Testing did have its issues and there were some bumps in the road, but I got results that were helpful. I then turned my attention to building and testing a rig with both an Intel CPU and GPU in it, and that wasn’t so easy.

There were issues recognizing the GPU and an NVMe drive, but the primary problem masqueraded as something altogether different. After realizing a BIOS update was necessary, I was back on track, but nowhere near as excited. Links to the BLOG and VIDEO, in case you are interested. It was some work, but it did work.

I wasn’t able to complete the benchmarks because of time constraints, but I needed to post the video. So, I went back to work on the same rig after posting and ran into more issues. I was already aware that some games were not going to run properly, so I left them off the list. I was also aware that one game takes forever to benchmark, so I left it off as well. Bad choice. Benchmarking on that one is going on as I write this.


One of my biggest issues was that the game with the Vulkan graphics API won’t let me select that protocol, forcing me to stick with DX11, a known shortcoming. In fact, I had problems with another DX11 title, Borderlands 3. It’s a game I have thought about dropping, but now I may actually have to.

I was testing in DX11 and DX12, and it just quit. Period. Won’t run. It shows the splash screen, then shuts down completely and won’t start up. I have a similar problem trying to run this game on my editing rig, but there Borderlands 3 now defaults to DX12, and I have several other games with that setting. It’s great to benchmark, but it may be time to retire it.

Other issues included World War Z, which caps the framerate and shows horrible artifacting, and Horizon Zero Dawn, which produces the same framerate no matter the actual resolution. The latter of the two was confusing during my tests with the Ryzen CPU, so I was expecting it, but it shows Intel still has some work to do on their drivers.

What’s next for the Arc GPU?

I’m not done with this one yet, though I decided that CoolBlue (the Intel test rig) needs some changes. Instead of just changing a fan and getting an Intel NVMe to bring this closer to a true DeepCool/Intel build, I ordered a new platform. For the first time in over ten years, I am probably going to make my main rig an Intel.

I will be changing the GPU, mind you, but I will test this A750 with a 12700KF in a brand-new B760 motherboard. The choice wasn’t made lightly. I was going to go with AM5, but recent problems with motherboards overvolting those CPUs had me turn to something already proven for my new build. But that’s another story…

The fun in assembling a great Intel Test Rig

How did we get here?

The first part of building a solid Intel test rig is, of course, the platform. The i5-11400F was my choice for the CPU. It has six cores and twelve threads and is a great midrange processor. I had recently purchased an Arc A750 graphics card, so all I needed was the supporting material and cooling. I have a DeepCool case and recently picked up a set of DeepCool RGB fans, so a plan was forming. The power supply and storage were also ready, so I gathered everything I needed and was off to a great start.

Assembly wasn’t difficult. The case is great to work in, and the fans fit perfectly. There was already an NVMe drive installed, and except for an errant motherboard screw, things fit well together. The second step was going well. That’s where the fun stopped, though. The motherboard I chose had a BIOS from before the Arc A750’s release, and it wasn’t until a fair bit of troubleshooting that I discovered this.

The NVMe drive also didn’t show up as a boot drive, even though it had a fresh copy of Windows 10 Pro. Troubleshooting one problem at a time is fair, but this masqueraded as a DRAM problem and a CPU issue instead of a GPU one. One complete disassembly and reassembly later, I thought of the BIOS. Achievement unlocked.

I still had no drive showing up, so I tried to repair Windows. I don’t know why there is a repair utility in the Media Creation Tool. It never works, EVER. Time to reinstall instead. Nope, the NVMe already had a master boot record. Seriously? The BIOS says it doesn’t (even after the update), and the Windows utility says it does. This isn’t going great anymore.

Day Two

With a cooler head, I came back, grabbed an SSD with no data, and started over. To my amusement, everything worked. I then started with my normal installs, only to realize that if I were going to do this right, I would need an Intel NVMe. Stop installing stuff, Paul. And, while you’re at it, order another DeepCool fan to balance out the RGB. Done. Time to edit what I had and post it on YouTube.

Not everything was done yet, but there was enough material for the build video. The video was up, so I went to dinner. I then realized that I had forgotten the thumbnail. Oh, I made one; I just forgot to use it. The frustration finally got me. It would be hours until I could fix it, but it couldn’t be helped. It would wait.

Finishing the Intel Test Rig

All I had to do was wait for the new drive and the extra fan and I could start benchmarks. What? A notice about my order? Delayed? What the……?

To be continued. In the meantime, check out the Arc A750 GPU post.

Intel Arc Graphics. Outstanding, Good, or Trash?

How did we get here?

Intel dominates part of the central processing arena, but what about graphics? For several years there have been two major players in the GPU space, Nvidia and AMD. They’ve had decades to refine the craft, and both have strong products. Nvidia has dominated until recent years, but what about a third? Can someone else come in and compete with two established juggernauts?

Enter the Arc line of GPUs. It’s Intel’s first foray into the market in a dedicated PCI Express slot, and the company’s first dedicated card since its AGP card in 1998. Intel has had integrated graphics built into motherboard chipsets and processors for decades, but steered clear of discrete cards. That is, until lately, with Xe graphics.

Okay, but Intel?

Having a small YouTube channel gives me an opportunity to test many different PC parts. Most are mainstream, some a little odd, but which category does the Arc GPU fit into? I didn’t have an interest in that question until I found out Intel looks to continue making GPUs. At the time of writing, there are many unconfirmed rumors that Intel will indeed continue for at least two more generations.

Intel has three discrete cards, the A380, A750, and A770, all three reasonably priced. I chose the A750 because it has fair competition in the same price range. Nvidia has the RTX 3050 and RTX 3060, while AMD has the RX 6600 and RX 6600 XT; both also have older products. With an RTX 3060 and an RX 6600 XT on hand, I had test material. If only it were that easy.

First impressions of the card were favorable. It’s sleek, appears to be well built, and is beautiful to look at. Other things about it aren’t as compelling. The drivers are cumbersome, though I understand they’re much improved, and the need for Resizable BAR (a PCIe feature that lets the CPU address the GPU’s full memory, rather than a small 256MB window) required a BIOS update. It also required digging through outside source material to configure the BIOS before I could reinstall the graphics drivers. Setup wasn’t painless.
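As a side note, Resizable BAR can be verified from the operating system as well as the BIOS; on a Linux box, `lspci -vvv` reports the capability and the current BAR size. Here is a small, illustrative Python sketch that scans that kind of output; the sample text is a hand-written approximation of `lspci` output, not captured from my card:

```python
SAMPLE_LSPCI = """\
03:00.0 VGA compatible controller: Intel Corporation DG2 [Arc A750]
        Capabilities: [420 v1] Physical Resizable BAR
                BAR 2: current size: 8GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB
"""

def rebar_active(lspci_text: str) -> bool:
    """Heuristic: ReBAR is in effect when the capability is listed and a
    BAR reports a GB-scale current size instead of the legacy 256MB window."""
    if "Resizable BAR" not in lspci_text:
        return False
    for line in lspci_text.splitlines():
        line = line.strip()
        if line.startswith("BAR") and "current size:" in line:
            size = line.split("current size:")[1].split(",")[0].strip()
            if size.endswith("GB"):
                return True
    return False

print(rebar_active(SAMPLE_LSPCI))  # prints "True" for this sample
```

On Windows, the same information shows up in the driver’s control panel, but a text check like this is handy when scripting across several test rigs.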


It took several hours, but I was finally able to start testing. With the proper drivers installed, I spent some time repeating the same exercises that gave me issues on the first attempt, and I was pleased to find things finally working. No locked screens, no blackouts, and no error message saying Resizable BAR wasn’t enabled. It was finally time to test games.

The Arc performed well in DirectX 12 titles, but I was not prepared for the nosedive with the DirectX 11 API. The card stumbled out of the gate on just the second game, Borderlands 3. On the other cards it typically runs better on the older API than the new one, but always within margin of error. Results didn’t get above 77 frames per second in DX11, where the other cards approached 200. I knew it struggled, but not that much.
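To put that gap in rough numbers, here is a tiny calculation using the figures above; the 200 fps is a rounded approximation for the competing cards, not an exact measurement:

```python
arc_dx11_fps = 77   # A750 in Borderlands 3 under DX11
rival_fps = 200     # rounded figure for the competing cards

share = arc_dx11_fps / rival_fps
print(f"A750 at {share:.1%} of the rivals' DX11 framerate")
# prints "A750 at 38.5% of the rivals' DX11 framerate"
```

Running at well under half the competition’s framerate in the same title is what makes the DX11 path feel broken rather than merely slow.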

So, is the Intel Arc GPU junk?

Not at all, but it does still need some work. The performance in DirectX 12 matches the other cards, and switching Borderlands 3 to DX12 saw it beating the 3060. I need another example of a Vulkan title, but what I’ve seen so far is promising. Temperatures stay in the low to mid 70s, and the clock speed holds 2,400 MHz. Menus and apps are responsive, with no delays or hang-ups. Overall performance is good, with no artifacting or glitches. It also looks nice, with smooth lines and a tasteful white LED.

AMD and Nvidia options straddle the Intels in both price and performance, and although Intel has no high-end offering, it wasn’t long ago that AMD only targeted the midrange market. The A750 retails at $250, placing it firmly in the mid tier of GPUs. It’s a solid card, and I look forward to testing others.

The video can be found here


The most I can get out of the Xeon?

How did we get here?

Some may remember me testing the Xeon ‘replacement’ for the CPU in my OptiPlex, which turned out not to be a replacement at all. Long story short, the graphics card was the issue, and I needed to do something else. I opted to throw the Xeon in another case with a different motherboard and a better GPU. That was better, but I thought there was still more I could get out of it. But how much?

To find out, I borrowed the RX 6600 XT from my gaming rig in the living room. It won’t stay in there, obviously, but I know it’s capable, and I have some tests with similar CPUs. It’s the most powerful GPU I can get to easily, so I put it in, expecting the pink case to turn into an Easy-Bake Oven. I was wrong. Oh, so wrong.

Performance was better than expected, and temps were perfect. The CPU and GPU both stayed in the mid to high sixties at the high end; something I never saw coming. The pink case has horrible airflow and hates anything more powerful than a Casio watch. For some reason, though, this combination loved it. I’m sure it will be better in a different case, but for testing, I’ll take it.

The important thing, though, was finding where the CPU bottleneck would appear, and comparing tests to the other processors. I say test comparisons because in real-world scenarios, paired with a mid-budget GPU playing games at 1080p, there is no difference. Admittedly, with the better GPU I also added more RAM, but the games never used it, so it wasn’t a factor. This thing is solid, and it will probably be the ‘go to’ PC for playing games in the bedroom, if I bother to set it up.

Xeon? For real

Getting one of these and testing it against some of the newer four-core, eight-thread CPUs was a great choice. It’s much better than trying to hunt down, and pay twice as much for, an i7 4770, and it makes me want to see how the larger-socket E5 Xeons do. That’s another experiment, though. This one goes down as a success, and I couldn’t be happier with the results. Talk about a surprise. Great performance, a cheap price, and low temps. There isn’t much more you can ask for.

One of these paired with an RX 580 is a perfect budget gamer, and 1080p is no problem with anything you want to throw at it. Now it’s time to get my 6600 XT out and maybe try a GTX 1660. Hmm.
